Engineers have developed a thread-based sensor capable of monitoring the direction, angle of rotation and degree of displacement of the head. The design is a proof of principle that could be extended to measuring movements of other limbs by sensors attached like tattoos to the skin.
Tag: Machine Learning
Detecting ADHD with near-perfect accuracy
A new study led by a University at Buffalo researcher has identified how specific communication among different brain regions, known as brain connectivity, can serve as a biomarker for attention deficit hyperactivity disorder (ADHD).
Mount Sinai Researchers Build Models Using Machine Learning Technique to Enhance Predictions of COVID-19 Outcomes
Mount Sinai researchers have published one of the first studies using a machine learning technique called “federated learning” to examine electronic health records to better predict how COVID-19 patients will progress.
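The article doesn’t include the study’s code, but as a rough sketch of the federated-learning idea it names, the toy example below trains a logistic-regression model across three simulated hospital sites and shares only the model weights with a central server for averaging; the synthetic data, site count and hyperparameters are placeholders, not Mount Sinai’s actual pipeline.

```python
# Toy federated averaging (FedAvg): each simulated site trains on its own
# private records, and only the updated weights are averaged centrally.
# All data and settings here are synthetic placeholders for illustration.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Run a few epochs of gradient descent on one site's private data."""
    w = weights.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))    # predicted probabilities
        w -= lr * (X.T @ (p - y)) / len(y)  # logistic-loss gradient step
    return w

# Simulate three hospital sites whose records never leave the site.
true_w = np.array([1.5, -2.0, 0.5, 0.0, 1.0])
sites = []
for _ in range(3):
    X = rng.normal(size=(200, 5))
    y = (1.0 / (1.0 + np.exp(-X @ true_w)) > rng.random(200)).astype(float)
    sites.append((X, y))

global_w = np.zeros(5)
for _ in range(20):
    local_ws = [local_update(global_w, X, y) for X, y in sites]
    global_w = np.mean(local_ws, axis=0)    # server averages the site models

print("federated model weights:", np.round(global_w, 2))
```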
Story Tips from Johns Hopkins Experts on COVID-19
Vaccines take time to work. After getting a COVID-19 vaccine, it takes a while for the immune system to fully respond and provide protection from the virus. For the Moderna and Pfizer COVID-19 vaccines, it takes up to two weeks after the second shot to become appropriately protected.
Using neural networks for faster X-ray imaging
A team of scientists from Argonne is using artificial intelligence to decode X-ray images faster, which could aid innovations in medicine, materials and energy.
Jefferson Lab Launches Virtual AI Winter School for Physicists
Artificial intelligence is a game-changer in nuclear physics, able to enhance and accelerate fundamental research and analysis by orders of magnitude. DOE’s Jefferson Lab is exploring the expanding synergy between nuclear physics and computer science by co-hosting, with The Catholic University of America and the University of Maryland, a weeklong virtual series of lectures and hands-on exercises Jan. 11-15 for graduate students, postdoctoral researchers and even “absolute beginners.”
Advanced tools reveal critical infrastructure connections and help mitigate disasters
A cross-platform Argonne collaboration is optimizing a tool developed after Hurricane Maria to find essential connections between critical infrastructure that will help owners and operators plan for and mitigate a variety of potential hazards.
Accelerating AI computing to the speed of light
A University of Washington-led team has come up with a system that could help speed up AI performance and find ways to reduce its energy consumption: an optical computing core prototype that uses phase-change material.
Fermilab receives DOE award to develop machine learning for particle accelerators
Fermilab scientists and engineers are developing a machine learning platform to help run Fermilab’s accelerator complex alongside a fast-response machine learning application for accelerating particle beams. The programs will work in tandem to boost efficiency and energy conservation in Fermilab accelerators.
10 ways Argonne science is combatting COVID-19
Argonne scientists and research facilities have made a difference in the fight against COVID-19 in the year since the first gene sequence for the virus was published.
UCI researchers use deep learning to identify gene regulation at single-cell level
Irvine, Calif., Jan. 5, 2021 — Scientists at the University of California, Irvine have developed a new deep-learning framework that predicts gene regulation at the single-cell level. Deep learning, a family of machine-learning methods based on artificial neural networks, has revolutionized applications such as image interpretation, natural language processing and autonomous driving.
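As a hedged illustration of what “a family of machine-learning methods based on artificial neural networks” looks like in code, the toy script below trains a tiny two-layer network with plain NumPy; the data, architecture and hyperparameters are invented for the example and are unrelated to the UCI framework itself.

```python
# Minimal two-layer neural network trained by gradient descent on a toy task.
# Everything here is illustrative; it is not the UCI single-cell framework.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))              # toy input features
y = (X[:, 0] * X[:, 1] > 0).astype(float)  # nonlinear toy target

W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

for step in range(2000):
    h = np.maximum(X @ W1 + b1, 0.0)                # hidden layer (ReLU)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))        # output probability
    err = (p.ravel() - y) / len(y)                  # logistic-loss gradient
    gW2 = h.T @ err[:, None]; gb2 = err.sum(keepdims=True)
    dh = (err[:, None] @ W2.T) * (h > 0)            # backpropagate through ReLU
    gW1 = X.T @ dh; gb1 = dh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

print("training accuracy:", np.mean((p.ravel() > 0.5) == y))
```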
Advanced materials in a snap
A research team at Sandia National Laboratories has successfully used machine learning — computer algorithms that improve themselves by learning patterns in data — to complete cumbersome materials science calculations more than 40,000 times faster than normal.
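Sandia’s code isn’t shown here, but the sketch below illustrates the general surrogate-model pattern the summary describes: run an expensive calculation a limited number of times, fit a cheap model to those examples, then query the model instead. The “expensive” function, sample sizes and timings are stand-ins, not the lab’s physics code.

```python
# Illustrative surrogate model: fit a cheap polynomial to samples from a slow
# function, then evaluate the polynomial in place of the function.
import time
import numpy as np

def expensive_calculation(x):
    """Stand-in for a slow materials-science computation."""
    time.sleep(0.001)                  # pretend each call is costly
    return np.sin(3 * x) + 0.5 * x ** 2

rng = np.random.default_rng(2)
x_train = rng.uniform(-2, 2, size=200)
y_train = np.array([expensive_calculation(x) for x in x_train])

coeffs = np.polyfit(x_train, y_train, deg=8)   # cheap surrogate

x_new = rng.uniform(-2, 2, size=10_000)
t0 = time.perf_counter()
y_surrogate = np.polyval(coeffs, x_new)        # thousands of predictions at once
print(f"surrogate evaluated {len(x_new)} points in {time.perf_counter() - t0:.4f} s")
```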
Machine Learning Improves Particle Accelerator Diagnostics
Operators of Jefferson Lab’s primary particle accelerator are getting a new tool to help them quickly address issues that can prevent it from running smoothly. The machine learning system has passed its first two-week test, correctly identifying glitchy accelerator components and the type of glitches they’re experiencing in near-real-time. An analysis of the results of the first field test of the custom-built machine learning system was recently published in the journal Physical Review Accelerators and Beams.
Artificial intelligence predicts gestational diabetes in Chinese women
Machine learning, a form of artificial intelligence, can predict which women are at high risk of developing gestational diabetes, enabling earlier intervention, according to a new study published in the Endocrine Society’s Journal of Clinical Endocrinology & Metabolism.
Developing Smarter, Faster Machine Intelligence with Light
Researchers at the George Washington University, together with researchers at the University of California, Los Angeles, and the deep-tech venture startup Optelligence LLC, have developed an optical convolutional neural network accelerator capable of processing large amounts of information, on the…
UCI researchers create model to calculate COVID-19 health outcomes
Irvine, Calif., Dec. 17, 2020 — University of California, Irvine health sciences researchers have created a machine-learning model to predict the probability that a COVID-19 patient will need a ventilator or ICU care. The tool is free and available online for any healthcare organization to use. “The goal is to give an earlier alert to clinicians to identify patients who may be vulnerable at the onset,” said Daniel S.
A.I. model shows promise to generate faster, more accurate weather forecasts
A model based solely on the past 40 years of weather events uses 7,000 times less computer power than today’s weather forecasting tools. An A.I.-powered model could someday provide more accurate forecasts for rain, snow and other weather events.
Artificial Intelligence Advances Showcased at the Virtual 2020 AACC Annual Scientific Meeting Could Help to Integrate This Technology Into Everyday Healthcare
Artificial intelligence (AI) has the potential to revolutionize healthcare, but integrating AI-based techniques into routine medical practice has proven to be a significant challenge. A plenary session at the virtual 2020 AACC Annual Scientific Meeting & Clinical Lab Expo will explore how one clinical lab overcame this challenge to implement a machine learning-based test, while a second session will take a big picture look at what machine learning is and how it could transform medicine.
Synthetic Biology and Machine Learning Speed the Creation of Lab-Grown Livers
Researchers at the University of Pittsburgh School of Medicine have combined synthetic biology with a machine learning algorithm to create human liver organoids with blood and bile handling systems. When implanted into mice with failing livers, the lab-grown replacement livers extended life.
Science Snapshots from Berkeley Lab
Berkeley Lab-developed machine learning tool can also calculate the optical properties of a known structure; the CUORE experiment in Italy is designed to find a theorized process called neutrinoless double-beta decay
Automatic deep-learning, artificial-intelligence clinical tool that can measure the volume of cerebral ventricles on MRIs in children
Researchers from multiple institutions in North America have developed a fully automated, deep-learning (DL), artificial-intelligence clinical tool that can measure the volume of cerebral ventricles on magnetic resonance images (MRIs) in children within about 25 minutes.
Brookhaven’s Kevin Yager Named Oppenheimer Leadership Fellow
Yager, a group leader at the Center for Functional Nanomaterials, is exploring challenges and opportunities for the U.S. Department of Energy.
The Impact of Pruning
PNNL researchers have shown an improved binarized neural network can deliver a low-cost and low-energy computation to help the performance of smart devices and the power grid.
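For readers unfamiliar with the terms, the snippet below applies the two compression ideas the summary names, magnitude pruning and weight binarization, to a random weight matrix; the layer size, pruning fraction and scaling are arbitrary and do not describe PNNL’s actual network.

```python
# Illustrative pruning + binarization of one dense layer's weights.
import numpy as np

rng = np.random.default_rng(3)
W = rng.normal(size=(64, 32))                 # one layer's weights (toy sizes)

# Prune: zero out the 80% of weights with the smallest magnitude.
mask = np.abs(W) >= np.percentile(np.abs(W), 80)
W_pruned = W * mask

# Binarize: keep only the sign of each surviving weight, times one scale factor,
# so multiply-accumulates become cheap sign flips and additions.
scale = np.abs(W_pruned[mask]).mean()
W_binary = np.sign(W_pruned) * scale

x = rng.normal(size=32)
print("dense output[:3]  ", (W @ x)[:3].round(2))
print("pruned+binary[:3] ", (W_binary @ x)[:3].round(2))
print("weights kept:", int(mask.sum()), "of", W.size)
```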
Argonne AI methods unravel mysteries of SARS-CoV-2 viral-human cell interaction
Using a combination of AI and supercomputing resources, Argonne researchers are examining the dynamics of the SARS-CoV-2 spike protein to determine how it fuses with the human host cell, advancing the search for drug treatments.
Flame on! How AI may tame a complex materials technique and transform manufacturing
Creating nanomaterials with flame spray pyrolysis is complex, but scientists at Argonne have discovered how applying artificial intelligence can lead to an easier process and better performance.
INCITE program awards supercomputing time to 51 computational research projects
The new projects will use DOE’s leadership-class supercomputers to pursue transformational advances in science and engineering.
Virtual reality: ALCF’s remote interns tackle real-world computing projects
The Argonne Leadership Computing Facility’s internship program went virtual this year, providing students with an opportunity to work on real-world research projects that address issues at the forefront of scientific computing.
Machine learning model for COVID-19 drug discovery is a Gordon Bell finalist
A machine learning model developed by a team of Lawrence Livermore National Laboratory (LLNL) scientists to aid in COVID-19 drug discovery efforts is a finalist for the Gordon Bell Special Prize for High Performance Computing-Based COVID-19 Research.
ORNL, partners receive more than $4 million to advance AI control of complex systems
The Department of Energy’s Oak Ridge National Laboratory and three partnering institutions have received $4.2 million over three years to apply artificial intelligence to advancing complex systems in which human decision-making could be enhanced by technology.
New Machine Learning-Based Model More Accurately Predicts Liver Transplant Waitlist Mortality
A new study presented this week at The Liver Meeting Digital Experience® – held by the American Association for the Study of Liver Diseases – found that neural networks, a type of machine learning algorithm, predict waitlist mortality in liver transplantation more accurately than the older Model for End-Stage Liver Disease (MELD) score. This advancement could lead to more equitable organ allocation systems and even reduce death rates on the liver transplant waitlist.
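For context, the MELD baseline the study compares against is a fixed formula over three lab values, whereas a neural network learns its weighting (and any nonlinear interactions) from outcome data. The sketch below computes the widely published MELD formula on made-up lab values; it is background only, not the study’s model.

```python
# Classic MELD score: a fixed formula, in contrast to a learned neural network.
# Lab values below are invented for illustration.
import math

def meld_score(bilirubin_mg_dl, inr, creatinine_mg_dl):
    """Widely published MELD formula; lab values below 1.0 are floored at 1.0."""
    b = max(bilirubin_mg_dl, 1.0)
    i = max(inr, 1.0)
    c = max(creatinine_mg_dl, 1.0)
    return round(3.78 * math.log(b) + 11.2 * math.log(i) + 9.57 * math.log(c) + 6.43)

print(meld_score(bilirubin_mg_dl=2.5, inr=1.8, creatinine_mg_dl=1.4))
```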
Mount Sinai Develops Machine Learning Models to Predict Critical Illness and Mortality in COVID-19 Patients
Mount Sinai researchers have developed machine learning models that predict the likelihood of critical events and mortality in COVID-19 patients within clinically relevant time windows.
Argonne collaborates on largest COVID-19 viral sequence analysis in U.S.: Verifies mutation concern
Argonne computational resources supported the largest comprehensive analysis of COVID-19 genome sequences in the U.S. and helped corroborate growing evidence of a protein mutation.
Biologists Create “Atlas” of Gene Expression in Neurons, Documenting the Diversity of Brain Cells
New York University researchers have created a “developmental atlas” of gene expression in neurons, using gene sequencing and machine learning to categorize more than 250,000 neurons in the brains of fruit flies. Their study, published in Nature, finds that neurons exhibit the most molecular diversity during development and reveals a previously unknown type of neuron present only before the flies hatch.
Second Annual National Health Symposium Event Summary Available
The event summary for the second annual National Health Symposium, organized by the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland, is now available.
Informatics Approach Helps Reveal Risk Factors for Pressure Injuries
Researchers used informatics to examine more than 5,000 patient records and five years of data related to nursing skin assessments and hospital-acquired pressure injuries. The results underscore the importance of treating and monitoring irritated skin early and eliminating its cause to prevent pressure injuries.
Sensors driven by machine learning sniff out gas leaks fast
A new study confirms the success of a natural-gas leak-detection tool, pioneered by Los Alamos National Laboratory scientists, that uses sensors and machine learning to locate leak points at oil and gas fields, promising automatic, affordable sampling across vast natural gas infrastructure.
SoundWatch: New smartwatch app alerts d/Deaf and hard-of-hearing users to birdsong, sirens and other desired sounds
UW researchers have developed SoundWatch, a smartwatch app for deaf, Deaf and hard-of-hearing people who want to be aware of nearby sounds.
Research Team Discovers the Molecular Processes in Kidney Cells That Attract and Feed COVID-19
What about the kidneys makes them a hotspot for COVID-19’s cytokine storm? A research team says it’s the presence of a protein found on specialized renal transport cells.
DrugCell: New Experimental AI Platform Matches Tumor to Best Drug Combo
UC San Diego researchers use an experimental artificial intelligence system called DrugCell to predict the best approach to treating cancer.
AI gets a boost via LLNL, SambaNova collaboration
Lawrence Livermore National Laboratory (LLNL) has installed a state-of-the-art artificial intelligence (AI) accelerator from SambaNova Systems, the National Nuclear Security Administration (NNSA) announced today, allowing researchers to more effectively combine AI and machine learning (ML) with complex scientific workloads.
Material found in house paint may spur technology revolution
The development of a new method to make non-volatile computer memory may have solved a problem that has been holding back machine learning and has the potential to revolutionize technologies like voice recognition, image processing and autonomous driving.
Nudges Combined with Machine Learning Triple Advance Care Conversations Among Patients with Cancer
An electronic nudge to clinicians—triggered by an algorithm that used machine learning methods to flag patients with cancer who would most benefit from a conversation around end-of-life goals—tripled the rate of those discussions.
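The study’s model isn’t reproduced here; the toy sketch below only shows the general trigger pattern the summary describes, in which a machine-learning risk score above a threshold generates a nudge for the clinician. The Patient class, threshold and message wording are hypothetical.

```python
# Hypothetical nudge trigger: flag patients whose model-predicted risk exceeds
# a cutoff and produce a prompt for the treating clinician.
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    predicted_risk: float        # output of some upstream ML model (not shown)

NUDGE_THRESHOLD = 0.30           # placeholder cutoff

def nudges_for(patients):
    """Return one nudge message per patient flagged by the model."""
    return [
        f"Consider a goals-of-care conversation with {p.name} "
        f"(predicted risk {p.predicted_risk:.0%})."
        for p in patients
        if p.predicted_risk >= NUDGE_THRESHOLD
    ]

cohort = [Patient("Patient A", 0.12), Patient("Patient B", 0.45)]
for message in nudges_for(cohort):
    print(message)
```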
Creating the software that will unlock the power of exascale
Researchers nationwide are building the software and applications that will run on some of the world’s fastest supercomputers. Among them are members of DOE’s Exascale Computing Project, who recently published a paper highlighting their progress so far.
Scientists voice concerns, call for transparency and reproducibility in AI research
In an article published in Nature on October 14, 2020, scientists at Princess Margaret Cancer Centre, University of Toronto, Stanford University, Johns Hopkins, Harvard School of Public Health, Massachusetts Institute of Technology, and others, challenge scientific journals to hold computational researchers to higher standards of transparency, and call for their colleagues to share their code, models and computational environments in publications.
Assessing State of the Art in AI for Brain Disease Treatment
The range of AI technologies available for dealing with brain disease is growing fast, and exciting new methods are being applied to brain problems as computer scientists gain a deeper understanding of the capabilities of advanced algorithms. In APL Bioengineering, Italian researchers conducted a systematic literature review to understand the state of the art in the use of AI for brain disease. Their qualitative review sheds light on the most interesting corners of AI development.
Unraveling the network of molecules that influence COVID-19 severity
Researchers from the Morgridge Institute for Research, the University of Wisconsin-Madison, and Albany Medical College have identified more than 200 molecular features that strongly correlate with COVID-19 severity, offering insight into potential treatment options for those with advanced disease.
Virtual Argonne training program prepares researchers for extreme-scale computing
The annual Argonne Training Program on Extreme-Scale Computing went virtual this year, providing two weeks of instruction to ready attendees for science in the exascale era.
The Future of Precision Medicine
Precision medicine is a rapidly growing approach to health care that focuses on finding treatments and interventions that work for people based on their genetic makeup, rather than their symptoms.
Zeeshan Ahmed, director of the new Ahmed Lab at Rutgers Institute for Health, Health Care Policy and Aging Research, discusses the future of precision medicine, what needs to be done to successfully analyze the data necessary to develop individualized treatments and the role genetics play during the COVID-19 pandemic.
$20 million boost for world-leading AI research
Australia’s position as one of the world leaders in artificial intelligence (AI) and machine learning will be further boosted thanks to $20 million towards a new national centre, to be based at the University of Adelaide.