Using Technology in Scientific Research

Explore top LinkedIn content from expert professionals.

  • View profile for Will Ahmed

    Founder & CEO at WHOOP®

    107,204 followers

NOVEL DIGITAL BIOMARKER FOR UNDERSTANDING MENSTRUAL CYCLES I’m excited to share groundbreaking WHOOP research published in Nature’s Digital Medicine journal earlier this week. This work is a big leap forward in how we understand and monitor female physiology. It was previously well understood that several vital signs (most notably temperature, but also heart rate and heart rate variability) fluctuate with the phase of your menstrual cycle, but it was never before understood whether the extent of those fluctuations was significant and, if so, what it might mean. We set up a study with WHOOP data, harnessing insights from 11,500 women who opted into our research program. By analyzing over 45,000 menstrual cycles, our team was the first to describe the significance of the amplitude of those fluctuations, and in doing so, to develop a novel metric that may change the role wearables play in female reproductive health. Amplitude was observed to be suppressed in individuals with characteristics of reduced fertility, such as higher BMI and older age. This completely non-invasive marker could one day be used to identify irregularities in reproductive health earlier, reducing time to diagnosis. This study was made possible by our always-on data and global scale. It utilized over 1 million days of WHOOP data, enabling insights that aren’t feasible without a 24/7 wearable. Only 3% of medical research is focused on women. WHOOP is committed to investing equally in women’s and men’s health research. We’ll continue to share women’s research discoveries like this and more. We use this research to inform and guide changes to our product. Thank you to our incredible research team, led by Emily Capodilupo, for their dedication and innovation. Check out the full study here: https://lnkd.in/eXSqC4-5 #WHOOP #wearabletech #research #WomensHealth #innovation
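The published paper's method isn't reproduced here, but the core idea of an "amplitude" metric for a cyclic vital sign can be sketched: fit one sinusoid per cycle to a mean-removed daily series and take the magnitude of the fitted component. All values and names below are illustrative, not WHOOP's actual algorithm.

```python
import math

# Illustrative sketch (not the study's published method): estimate the
# amplitude of a cyclic vital-sign signal by projecting the mean-removed
# series onto one full sine/cosine period (a single Fourier component).
def cycle_amplitude(values):
    n = len(values)
    mean = sum(values) / n
    a = sum((v - mean) * math.cos(2 * math.pi * i / n)
            for i, v in enumerate(values)) * 2 / n
    b = sum((v - mean) * math.sin(2 * math.pi * i / n)
            for i, v in enumerate(values)) * 2 / n
    return math.hypot(a, b)

# Toy data: daily resting heart rate over a 28-day cycle, baseline
# 60 bpm with a 3 bpm cyclic swing.
rhr = [60 + 3 * math.sin(2 * math.pi * d / 28) for d in range(28)]
amp = cycle_amplitude(rhr)
```

A suppressed amplitude in this sketch would simply show up as a smaller `amp` for the same cycle length.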

  • View profile for Shyamal Patel

    Science @ Oura

    3,822 followers

An important new study published in the peer-reviewed journal Gastroenterology has revealed the potential of wearable devices to predict flares of inflammatory bowel disease (IBD) up to seven weeks in advance. Researchers at Mount Sinai found that physiological data collected from devices like the Oura Ring, Fitbit, and Apple Watch can identify subtle changes in heart rate variability, heart rate, oxygenation, and activity patterns that precede flare-ups. Notably, Oura Ring proved particularly valuable in capturing these physiological changes. The ring's ability to continuously monitor heart rate variability, a key indicator of autonomic nervous system activity, allowed researchers to detect subtle shifts that often precede inflammation. This breakthrough could transform how IBD is managed. By providing early warning signs, wearable devices can empower patients and doctors to proactively adjust treatment plans, potentially reducing the severity and duration of flares. This non-invasive, continuous monitoring offers a significant advantage over traditional methods like blood tests and colonoscopies, which can be inconvenient and only provide a snapshot of disease activity at a given moment in time. Beyond IBD, the study's findings pave the way for personalized management of other chronic conditions, where wearable data is integrated with artificial intelligence algorithms to predict flares and manage disease on an individual basis. This could revolutionize the lives of the more than one hundred million people living with chronic diseases in the US alone, offering a new level of control and improved quality of life. Oura’s capabilities as a powerful public research tool are no secret to those of us who have spent time with the product, and I’m thrilled to see others harnessing the power of Oura Ring to improve patient outcomes. #WithOura https://lnkd.in/ghUJ7zwV
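To make the "subtle shifts that precede inflammation" idea concrete, here is a minimal, purely illustrative sketch (not the Mount Sinai study's actual model): flag days where heart rate variability drops well below a rolling personal baseline. The window, threshold, and toy data are all assumptions.

```python
import statistics

# Illustrative only: flag days whose HRV falls more than `threshold`
# standard deviations below the mean of the previous `window` days.
def flag_hrv_dips(hrv_ms, window=7, threshold=2.0):
    flags = []
    for i in range(window, len(hrv_ms)):
        baseline = hrv_ms[i - window:i]
        mean = statistics.mean(baseline)
        sd = statistics.stdev(baseline)
        z = (hrv_ms[i] - mean) / sd if sd > 0 else 0.0
        flags.append(z < -threshold)  # unusually low HRV on day i
    return flags

# Toy data: stable HRV around 60 ms, then a sharp dip on the last day.
hrv = [60, 62, 59, 61, 60, 63, 61, 60, 62, 45]
flags = flag_hrv_dips(hrv)
```

A real system would combine several signals (HR, oxygenation, activity) and a learned model rather than a fixed z-score cutoff, but the personal-baseline comparison is the same basic mechanism.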

  • View profile for Dean Lee

    I help biologists learn computational skills with hands-on projects

    41,467 followers

Good practices for sharing your biological data: - Separate the metadata (could just be a TSV file) for the samples/cells from the actual numeric measurements. That way others don’t have to download your entire dataset and read it into memory just to see the metadata. A quick glance at the metadata should be enough for others to see what exactly is in a dataset and whether it is relevant to their question. - Don't share the metadata only as a table in a figure, locked away in a PDF. To transfer that information, some other human has to read-and-type the information over from the PDF. It's a highly error-prone process. - Don’t share your giant dataset as an R object. R just doesn’t handle large datasets like that very well, and now everyone who wants to learn from your data is forced to read that large dataset into R. Instead, share it in a language-agnostic, compressed format. - Don’t make your data available only upon request. Just make it available. - Interactive data visualizations are great! But also make sure to include a download button for the data you are using for the visualization. Usually, people who explore your data will also want a copy of that data offline to do more in-depth analysis. Taking care of these basics multiplies the impact of your work because more people will actually be able to learn from your data. You’ll get more citations and more street cred in your niche. If there ever was a time to squeeze more use out of existing scientific data, now would be it. #compbio #computationalbiology #bioinformatics
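As a minimal sketch of the first and third points, Python's standard library alone can write a human-readable metadata TSV alongside a compressed, language-agnostic measurements file. The file names, column names, and toy values below are made up for illustration; HDF5 or Parquet are common alternatives to gzipped CSV for larger matrices.

```python
import csv
import gzip

# Hypothetical per-sample metadata and numeric measurements.
metadata = [
    {"sample_id": "S1", "tissue": "liver", "condition": "control"},
    {"sample_id": "S2", "tissue": "liver", "condition": "treated"},
]
measurements = {
    "S1": [0.12, 3.4, 7.8],
    "S2": [0.15, 2.9, 8.1],
}

# 1. Metadata as a small plain-text TSV: quick to inspect without
#    loading the full dataset into memory.
with open("metadata.tsv", "w", newline="") as fh:
    writer = csv.DictWriter(
        fh, fieldnames=["sample_id", "tissue", "condition"], delimiter="\t"
    )
    writer.writeheader()
    writer.writerows(metadata)

# 2. Measurements in a language-agnostic, compressed format
#    (gzipped CSV here) that R, Python, Julia, etc. can all read.
with gzip.open("measurements.csv.gz", "wt", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["sample_id", "gene_1", "gene_2", "gene_3"])
    for sample_id, values in measurements.items():
        writer.writerow([sample_id, *values])
```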

  • View profile for Yossi Matias

    Vice President, Google. Head of Google Research.

    44,030 followers

Today, we release a preprint describing a new AI system built with Gemini, designed to help scientists write empirical software. Unlike conventional software, empirical software is optimized to maximize a predefined quality score. Our system can hypothesize new methods, implement them as code, and validate performance by iterating through thousands of code variants. AI-powered empirical software has the potential to accelerate scientific discovery. Here is how it works (also on the visual graphic): ➡️ The system takes a "scorable task" as input, which includes a problem description, a scoring metric, and data for training and evaluation. ➡️ It generates research ideas, and an LLM implements these ideas as executable code in a sandbox. ➡️ Using a tree search algorithm, it creates a tree of software candidates to iteratively improve the quality score. ➡️ This process allows for exhaustive solution searches at an unprecedented scale, identifying high-quality solutions quickly. We rigorously tested our system on six challenging and diverse benchmarks and demonstrated its effectiveness. The outputs of our system are verifiable, interpretable, and reproducible. The top solutions to each benchmark problem are openly available. We look forward to taking this research through full peer review. This new ability for AI systems to devise and implement novel solutions highlights AI’s capacity to help accelerate scientific innovation and discovery. The role of AI is evolving from a lab assistant to a collaborator that can transform the speed and scale of research. Read the blog: https://lnkd.in/dPCZCCHS and the preprint: https://lnkd.in/dQqfq8yg
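The scorable-task loop described above can be sketched in miniature. This is not Google's system: the "LLM proposing code variants" step is stood in for by random parameter perturbations, and the scoring metric, beam width, and toy task are all assumptions, but the shape — score candidates, expand the best-scoring branches, keep the overall best — is the same.

```python
import random

def score(params, data):
    # Toy scoring metric: negative squared error of a linear fit y = a*x + b.
    return -sum((params[0] * x + params[1] - y) ** 2 for x, y in data)

def propose_variants(params, n=8, step=0.5):
    # Stand-in for LLM-generated code variants: perturb the parameters.
    return [
        (params[0] + random.uniform(-step, step),
         params[1] + random.uniform(-step, step))
        for _ in range(n)
    ]

def tree_search(data, root=(0.0, 0.0), rounds=50, beam=3):
    frontier = [root]
    best = root
    for _ in range(rounds):
        # Expand every frontier node, then keep only the best branches.
        candidates = [v for p in frontier for v in propose_variants(p)]
        candidates.sort(key=lambda p: score(p, data), reverse=True)
        frontier = candidates[:beam]
        if score(frontier[0], data) > score(best, data):
            best = frontier[0]
    return best

random.seed(0)
data = [(x, 2.0 * x + 1.0) for x in range(5)]  # hidden target: y = 2x + 1
best = tree_search(data)
```

In the real system each node is a full program executed in a sandbox, and the score comes from the task's evaluation data rather than a closed-form error.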

  • View profile for Matt Hatami

    Graduate Researcher | HydroClimate Extremes | Climate Risk Intelligence

    5,857 followers

In academia, we often publish groundbreaking research that remains confined to journals—what if a few extra steps could amplify its impact and visibility? It's very common to generate valuable datasets, maps, and models and to publish our findings in peer-reviewed journals. However, these contributions often remain within the academic community. By taking additional steps—such as creating interactive visualizations and sharing them publicly—we can significantly increase the reach and impact of our research. This realization led me to develop two interactive tools based on the study "Integrated Socio-environmental Vulnerability Assessment of Coastal Hazards Using Data-driven and Multi-criteria Analysis Approaches" by my colleague Ahad Hasan Tanim, published in Nature Scientific Reports. Coastal Vulnerability Index StoryMap: https://lnkd.in/dTCrmgrq An interactive narrative that visualizes the study's findings, allowing users to explore various vulnerability categories across the region. Coastal Vulnerability Dashboard: https://lnkd.in/dJ7p24zA A dynamic dashboard that provides in-depth analysis and visualization of the coastal vulnerability data, facilitating informed decision-making. These projects were initially a way for me to apply and reinforce the skills I acquired from an ESRI course earlier this year. However, they also serve a deeper purpose: to enhance the visibility and impact of our academic work. Research indicates that sharing data and visualizations can lead to higher citation rates and broader dissemination of findings. Moreover, open access to research outputs fosters greater transparency and collaboration, accelerating scientific progress. I hope these tools inspire fellow researchers to consider how we can make our work more accessible and impactful.
A few extra steps can transform our research from a published paper into a resource that benefits a wider audience. #CoastalResilience #OpenScience #DataVisualization #GIS #AcademicImpact #ClimateChange #PublicEngagement #visualization #dataViz #GISvisualization #vulnerabilityMap #coastalVulnerability #interactiveMap #ModernGIS

  • View profile for Silvia Pineda-Munoz, PhD

    Founder, Climate Ages | Paleontologist, Ecologist, & Science Storyteller | Naturally Caffeinated and Optimistic | Did you see my YouTube show?

    5,751 followers

    If you’ve ever felt like shouting “JUST READ MY PAPER!”...  This is for you. Here’s the uncomfortable truth: If you don’t communicate your research clearly, consistently, and strategically, most people will never hear about it. Not funders. Not policymakers. Not journalists. Not even your peers. Yes, those who study the same things and never invite you as a coauthor. But when scientists hear “go viral” or “personal brand,” they picture loud, self-promotional nonsense that feels icky. That’s not what this is. This is about being visible for the right reasons and making your work easier to fund, share, and scale. Here are 10 ways to do just that (without sounding like a sales pitch): 1. Tell the story behind the data: what inspired the question? 2. Show the real-world impact: why does it matter, and to whom? 3. Use simple metaphors to explain complex findings. 4. Talk about the process: the failures, surprises, pivots. 5. Share quotes from collaborators or community members. 6. Repurpose talks or papers into short LinkedIn posts. 7. Make one powerful visual that captures your key point. 8. Answer the question: “So what?” in plain language. 9. Engage with others in your space, don’t just broadcast. 10. Build a narrative thread that connects your work to your purpose. This isn’t fluff. It’s a strategy. And it works not just for algorithms, but for funders, reporters, and decision-makers who need clarity more than data. – Want your research to spark action, not just citations? ✅ Follow for more science & purpose reflections 📬 Subscribe to Outreach Lab (link under my name) ☕ Book a free clarity call; I'd love to hear your story. Bridge your Science with the World. It’s Ready to Listen.

  • View profile for Sumaiya Iqbal

    Senior Group Lead, Principal Investigator | Iqbal Lab (Bioinformatics & Machine Learning), Ladders to Cures (L2C) Scientific Accelerator, Broad Institute of MIT and Harvard

    3,166 followers

📜 Published today by Springer Nature Group in Nature Methods. Genomics 2 Proteins portal: a resource and discovery tool for linking genetic screening outputs to protein sequences and structures. ➡ https://lnkd.in/ebbvAXaY ➡ The G2P portal (https://lnkd.in/gAFErcdt) is an open-source tool for proteome-wide linking of human genetic variants to protein sequences and structures and hypothesizing the structure-function relationship between natural/synthetic variations and their molecular phenotypes. ➡ An elaborated thread on the amount of integrated #variant, #structural and #functional data, and case studies on analyzing #geneticvariants in #raredisease genes and #syntheticvariants from #baseediting screens are available here: https://lnkd.in/eb6-SZDD (and of course, in the paper ➡ link above!) ➡ The #resource and interactive #method on the G2P portal are developed for a broad community of researchers: #MolecularBiologist, #Bioinformatician, #VariantAnalysts, #TherapeuticScientists, and #MachineLearners, to name a few! Check out the tool if you find it useful, and let us know if you need training and collaborations! 🎯 Finally, I couldn't be more delighted and grateful for the terrific effort by the G2P portal team: Jordan Safer, Seulki Kwon, Duyen Nguyen, colleagues and collaborators: Arthur J. Campbell, David Hoksza, Alan Rubin (and others from Atlas of Variant Effects Alliance), Alex Burgin, thousands of users of the portal, and our funding support, the Broad Institute of MIT and Harvard SPARC award and David R. Liu, Merkin Institute of Transformative Technologies in Healthcare at the Broad Institute of MIT and Harvard.

  • View profile for Raya Khanin
    6,215 followers

    🧬 Your Skin Is a Sensor. This Wearable Reads It—No Contact Needed. A new Nature Portfolio study introduces a non-contact wearable that captures molecular fluxes from the skin in real time—tracking water vapor, VOCs, CO₂, and even the absorption of environmental chemicals. 📍 What it does: Forms a microchamber just above the skin—no adhesive, no irritation. Wireless sensors inside detect how molecules move in and out, revealing physiological and environmental signals with clinical-grade precision. 💨 What are VOCs? Volatile Organic Compounds are vapors released from the skin via: • Microbial activity (e.g. hygiene, odor, dysbiosis) • Inflammation and wound healing • UV-triggered oxidative stress • Uptake of airborne chemicals like ethanol or solvents VOC spikes preceded wound infections, revealed poor hygiene, and increased after UV exposure—correlating with oxidative DNA damage. CO₂ flux reflected metabolic stress. Inward ethanol flux measured chemical absorption through the skin. 💡 Why this matters: The skin is not just a barrier—it’s a real-time interface with the environment. This wearable turns it into a window on whole-body physiology. 🌍 Implications far beyond dermatology: ✅ Critical care: Monitor hydration and tissue integrity in neonates and ICU patients ✅ Respiratory health: CO₂ flux as a non-invasive proxy for arterial CO₂—no need for bulky equipment ✅ Wound care: Track inflammation, healing progress, and infection without disrupting dressings ✅ Occupational safety: Quantify skin absorption of industrial solvents and pollutants ✅ Metabolic monitoring: VOCs and CO₂ patterns could flag systemic conditions before symptoms appear ✅ Environmental research: Real-time exposome data from the skin itself 💡 The device weighs just 11g, runs wirelessly for 24+ hours, and operates without touching fragile skin—ideal for vulnerable populations and long-term monitoring. 
📖 Nature, 2025: https://lnkd.in/eRCqAJbY #biotech #wearables #healthtech #exposome #noninvasive #VOC #CO2monitoring #woundcare #pollutionmonitoring #criticalcare #skinbiosensor #molecularhealth #beautybiotech #metabolichealth

  • View profile for Reeba Thomas

PhD Candidate in Mechanical Engineering | Experimental Materials Enthusiast | Mentoring & Connecting One-on-One | Helping international students navigate PhD/Postdoc applications to the U.S.

    2,551 followers

How Mechanical and Materials Engineers Can Start Using AI in Their Work Artificial Intelligence is no longer limited to computer science; it's becoming an essential tool across disciplines, including engineering and academic research. For mechanical engineers, materials scientists, and educators, here are some practical ways to begin integrating AI into your workflow: 1. Automated Literature Reviews Tools like Elicit, Connected Papers, and ResearchRabbit use AI to identify relevant studies, suggest related work, and even generate summaries, saving hours of manual searching. 2. Data Analysis and Visualization AI-integrated platforms (e.g., PandasAI, ChatGPT Code Interpreter) can help analyze experimental data such as stress-strain curves, thermal profiles, or SEM image results. This can be particularly useful for high-throughput testing or large datasets. 3. Assistance with Simulations For those working with FEA or thermodynamic modeling (e.g., using COMSOL, ANSYS, or CALPHAD), AI tools can help debug code, suggest boundary conditions, or optimize parameters more efficiently. 4. AI in Teaching and Assessment Educators can use AI to generate quizzes, explain complex topics in simpler terms, and even provide feedback on written assignments. It can also support personalized learning pathways for students. 5. AI for Research Planning GPT-based tools can assist with writing research proposals, identifying potential research gaps, and even outlining experimental plans. 6. Exploring AI-Driven Design Algorithms like genetic algorithms, reinforcement learning, or neural networks can be trained to assist in materials discovery, structural optimization, or predictive modeling. Getting Started: • Choose one task from your current workflow (e.g., paper summary, data cleaning, teaching content creation). • Use a trusted AI tool to assist the process, not replace it. • Evaluate and refine your use of the tool based on outcomes.
AI is not a replacement for engineering knowledge; it’s a powerful extension of it. If you’re already using AI in your work, what tools have been most helpful to you? #AIinEngineering #MechanicalEngineering #MaterialsScience #AcademicResearch #EdTech #CALPHAD #FEA #PhDLife
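As a concrete example of item 2 above, here is the kind of short analysis script an AI assistant might produce from a plain-language request. The data, the choice of elastic region, and the units are all hypothetical; the point is that the engineer still decides what region is linear and whether the result is physically plausible.

```python
import numpy as np

# Hypothetical stress-strain data for a stiff metal (toy values).
strain = np.array([0.000, 0.001, 0.002, 0.003, 0.004, 0.005])
stress_mpa = np.array([0.0, 200.0, 400.0, 600.0, 780.0, 920.0])

# Fit only the initial linear (elastic) region -- the first four points
# here, a choice the engineer must make from the actual curve.
elastic = slice(0, 4)
modulus_mpa, intercept = np.polyfit(strain[elastic], stress_mpa[elastic], 1)

print(f"Young's modulus ~ {modulus_mpa / 1000:.0f} GPa")
```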

  • View profile for Erica Peterson

    From Marketing & Sales to Legal▪️JD Candidate 2026 ▪️Technology Transactions + Cyberlaw

    5,983 followers

    If you're a researcher who spends hours manually copying and pasting data from PDFs and web sources into spreadsheets before you can even begin analysis, this one's for you. Traditional research workflows are painfully manual: Find a document → download it → extract text → copy relevant data → paste into spreadsheet → clean and organize → finally start analyzing. Sound familiar? 😅 Last weekend, I created a streamlined system that automates this process: ✅ Step 1: Built a Hypermode agent that scrapes and performs OCR/text extraction from a given URL ✅ Step 2: Agent identifies entities and relationships ✅ Step 3: Agent creates structured database tables with a proper schema based on discovered relationships ✅ Step 4: Connected Anthropic's Claude Desktop to the database via MotherDuck's DuckDB MCP Server for querying and visualization What's so special? I gave the agent a URL and it was able to process multiple linked PDFs from that page about Iranian sanctions (my first test use-case for a project that John Doyle and a few others are working on). No manual downloads and no file uploads. The agent identified key entities, mapped their relationships, and populated a queryable database. Then using Claude Desktop, connected to that database through the MCP server, I was able to ask questions about the data, generate force graphs, and create infographics or dashboards that can be shared. For anyone drowning in manual research processes, this combination of automated data extraction + Claude's analytical capabilities through MCP servers isn't just a productivity boost...it's a fundamental shift in how people of all technical backgrounds can approach data-intensive research. What research workflows are you still doing manually that could benefit from this kind of automation? My next test use-case? Contracts, of course! 📄 #DataVisualization #CTI #OSINT #LegalTech #MCP
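Steps 2 and 3 of the workflow above — extract entities from text, then load them into a queryable table — can be sketched in a tool-agnostic way. This is not the Hypermode/MCP stack itself: entity extraction here is a toy regex over made-up text, and SQLite stands in for the actual database, but the flow from unstructured text to a schema you can query is the same.

```python
import re
import sqlite3

# Toy stand-in for text an agent extracted from linked PDFs.
extracted_text = """
Entity: Acme Shipping Ltd, sanctioned 2023-05-01.
Entity: Global Oil Corp, sanctioned 2024-01-15.
"""

# Step 2 (toy version): pull entities and their attributes out of text.
rows = re.findall(r"Entity: (.+?), sanctioned (\d{4}-\d{2}-\d{2})", extracted_text)

# Step 3: create a structured table with a proper schema and load it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entities (name TEXT, sanction_date TEXT)")
conn.executemany("INSERT INTO entities VALUES (?, ?)", rows)

# The data is now queryable like any structured source (step 4's job,
# via an MCP server in the original workflow).
count = conn.execute("SELECT COUNT(*) FROM entities").fetchone()[0]
```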
