Tag: Machine Learning
Irvine, Calif., Oct. 7, 2020 – Electrical engineers, computer scientists and biomedical engineers at the University of California, Irvine have created a new lab-on-a-chip that can help researchers study tumor heterogeneity, with the goal of reducing resistance to cancer therapies. In a paper published today in Advanced Biosystems, the researchers describe how they combined artificial intelligence, microfluidics and nanoparticle inkjet printing in a device that enables the examination and differentiation of cancers and healthy tissues at the single-cell level.
Thomas J. Fuchs, DSc, Named Dean of Artificial Intelligence and Human Health and Co-Director of the Hasso Plattner Institute for Digital Health at Mount Sinai
Appointment Advances Health System’s Role as Leader in AI and Digital Health
WHOI-NOAA partnership tackles critical gap in climate knowledge
Researchers at WHOI were recently awarded a $500,000 grant from the NOAA Climate Observations and Monitoring program to develop machine learning tools to improve estimates of air-sea heat exchange in the Arctic Ocean and adjacent seas.
Computational Biologist Thomas Norman of Sloan Kettering Institute Honored with Distinguished NIH Director’s New Innovator Award
Computational biologist Thomas Norman, PhD, of Memorial Sloan Kettering’s (MSK) Sloan Kettering Institute (SKI) has been named one of 53 recipients of the prestigious 2020 National Institutes of Health (NIH) Director’s New Innovator Award. As part of the award, Dr. Norman will receive $1.5 million in direct costs upfront in the first year of a five-year award.
UCI researcher receives NIH Transformational Research Award
Irvine, Calif., Oct. 6, 2020 — University of California, Irvine biomedical engineer Chang Liu is the recipient of one of nine Director’s Transformative Research Awards this year from the National Institutes of Health under its High-Risk, High-Reward Research Program, the agency announced today. Liu’s five-year, $8.4 million grant will support a project to develop a system for making antibody generation a routine and widely accessible process.
Q&A: How machine learning helps scientists hunt for particles, wrangle floppy proteins and speed discovery
At the Department of Energy’s SLAC National Accelerator Laboratory, machine learning is opening new avenues to advance the lab’s unique scientific facilities and research.
Machine Learning Scientists Teach Computers to Read X-Ray Images
PNNL researchers used machine learning to develop a tool that helps a nonprofit identify orthopedic implants in X-ray images, improving surgical speed and accuracy
Scientists Train Computers to Recognize Which Early Stage Breast Cancers Will Spread
A new, machine-learning-based approach could help doctors separate aggressive stage 0 breast cancer from non-aggressive forms, sparing some women unnecessary mastectomies.
Machine Learning Takes on Synthetic Biology: Algorithms Can Bioengineer Cells for You
Scientists at Lawrence Berkeley National Laboratory have developed a new tool that adapts machine learning algorithms to the needs of synthetic biology to guide development systematically. The innovation means scientists will not have to spend years developing a meticulous understanding of each part of a cell and what it does in order to manipulate it.
Active learning accelerates redox-flow battery discovery
In a new study from the U.S. Department of Energy’s Argonne National Laboratory, researchers are accelerating the hunt for the best possible battery components by employing artificial intelligence.
Master’s Degree in Artificial Intelligence Now Within Reach of Low-income Students
The accelerated five-year program, which combines a bachelor’s degree in science with a master’s degree in AI, is designed to adapt curricular and co-curricular support so that students can complete degrees in AI, autonomous systems or machine learning, fields critically important to advancing America’s global competitiveness and national security. With this grant, FAU will recruit and train talented and diverse students who are economically disadvantaged and provide them with a unique opportunity to pursue graduate education in a burgeoning field.
Artificial Intelligence Detects Osteoarthritis Years Before it Develops
Researchers have created a machine-learning algorithm that can pick up on subtle signs of osteoarthritis – too abstract to register in the eye of a trained radiologist – on an MRI scan taken years before symptom onset.
Features in Breast Cancer Recurrence May Lead to New Test to Determine Level of Risk
Article title: Spatial locations of certain enzymes and transporters within pre-invasive ductal epithelial cells predict human breast cancer recurrences
Authors: Alexandra M. Kraft and Howard R. Petty
From the authors: “In the future, it will be important to increase the number of…
With Digital Phenotyping, Smartphones May Play a Role in Assessing Severe Mental Illness
Digital phenotyping approaches that collect and analyze smartphone user data on locations, activities, and even feelings – combined with machine learning to recognize patterns and make predictions from the data – have emerged as promising tools for monitoring patients with psychosis spectrum illnesses, according to a report in the September/October issue of Harvard Review of Psychiatry. The journal is published in the Lippincott portfolio by Wolters Kluwer.
APL and the Intelligence Community Tackle Malware in the Age of AI
APL scientists are working with the intelligence community to develop fundamentally new methods to inspect artificial intelligence for Trojans — vulnerabilities that deep networks are exposed to during the AI training process.
Algorithm aims to alert consumers before they use illicit online pharmacies
In a study, a team of Penn State researchers report that an algorithm they developed may be able to spot illicit online pharmacies that could be providing customers with substandard medications without their knowledge, among other potential problems.
OU Receives $20 Million Grant to Lead Inaugural National Science Foundation Artificial Intelligence Institute
NSF recently announced an investment of more than $100 million to establish five AI Institutes to support research and education hubs nationwide. Amy McGovern, an OU professor with dual appointments in the School of Computer Science in the Gallogly College of Engineering and in the School of Meteorology in the College of Atmospheric and Geographic Sciences, will lead the NSF AI Institute for Research on Trustworthy AI in Weather, Climate, and Coastal Oceanography, which received $20 million of the NSF funding.
Scientists use reinforcement learning to train quantum algorithm
Scientists are investigating how to equip quantum computers with artificial intelligence and machine learning approaches.
Auralee Edelen and Wai Ling Wu receive 2020 Panofsky Fellowships at SLAC
Their work uses machine learning to transform the way scientists tune particle accelerators for experiments and to solve longstanding mysteries in astrophysics and cosmology.
Filling in the blanks: How supercomputing can aid high-resolution X-ray imaging
Scientists are preparing for the increased brightness and resolution of next-generation light sources with a computing technique that reduces the need for human calculations to reconstruct images.
3 Awards Will Support Accelerator R&D for Medical Treatment, Miniaturization, and Machine Learning
U.S. Department of Energy awards announced in July will advance Lawrence Berkeley National Laboratory (Berkeley Lab) R&D to develop a more effective and compact particle-beam system for cancer treatment, improve particle-beam performance using artificial intelligence, and develop a high-power, rapid-fire laser system for both tabletop and large-scale applications.
LLNL pairs world’s largest computer chip from Cerebras with “Lassen” supercomputer to accelerate AI research
Lawrence Livermore National Laboratory (LLNL) and artificial intelligence computer company Cerebras Systems have integrated the world’s largest computer chip into the National Nuclear Security Administration’s (NNSA’s) Lassen system, upgrading the top-tier supercomputer with cutting-edge AI technology.
Machine learning unearths signature of slow-slip quake origins in seismic data
Combing through historical seismic data, researchers using a machine learning model have unearthed distinct statistical features marking the formative stage of slow-slip ruptures in the earth’s crust months before tremor or GPS data detected a slip in the tectonic plates. Given the similarity between slow-slip events and classic earthquakes, these distinct signatures may help geophysicists understand the timing of the devastating faster quakes as well.
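As a rough illustration of this style of analysis, the sketch below computes simple statistical features over windows of a synthetic continuous seismic record and feeds them to a random forest. The feature choices, window length and data are assumptions made for illustration, not the team's published pipeline.

# Hedged sketch: statistical features over sliding windows of continuous
# seismic data, fed to a classifier that flags windows preceding slow slip.
# Features, window size and data are illustrative assumptions only.
import numpy as np
from scipy import stats
from sklearn.ensemble import RandomForestClassifier

def window_features(signal: np.ndarray, win: int = 1000) -> np.ndarray:
    """Variance, skewness, and kurtosis per non-overlapping window."""
    n = len(signal) // win
    chunks = signal[: n * win].reshape(n, win)
    return np.column_stack([
        chunks.var(axis=1),
        stats.skew(chunks, axis=1),
        stats.kurtosis(chunks, axis=1),
    ])

rng = np.random.default_rng(11)
signal = rng.normal(size=200_000)               # synthetic continuous record
X = window_features(signal)
y = rng.integers(0, 2, len(X))                  # 1 = window precedes a slow-slip event (synthetic labels)
clf = RandomForestClassifier(n_estimators=200).fit(X, y)
print(clf.feature_importances_.round(3))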
How Cedars-Sinai Predicts Number of COVID-19 Patients
When the novel coronavirus started spreading across the U.S., hospital leaders were faced with a unique challenge: How could they accurately forecast the number of patients who would need hospitalization when no one knew what to expect from this new disease? To answer this and other questions, the data science team at Cedars-Sinai developed a machine learning platform to predict staffing needs. The team adjusted the platform’s algorithms to forecast data points related to the novel coronavirus. Now the platform tracks local hospitalization volumes and the rate of confirmed COVID-19 cases, running multiple forecasting models to help anticipate and prepare for increasing COVID-19 patient volumes with 85%-95% accuracy.
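As a rough illustration of one ingredient of such a platform, the sketch below fits a simple autoregressive model to synthetic daily inpatient counts and rolls it forward two weeks. It is a generic example under assumed lag and data choices, not the Cedars-Sinai system.

# Minimal autoregressive forecast of daily hospital census counts.
# Generic sketch on synthetic data; lag of 7 days is an assumption.
import numpy as np

def fit_ar(series: np.ndarray, lags: int = 7) -> np.ndarray:
    """Least-squares fit of an AR(lags) model; returns coefficients (intercept first)."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    X = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(series: np.ndarray, coef: np.ndarray, steps: int = 14) -> np.ndarray:
    """Roll the fitted AR model forward `steps` days."""
    lags = len(coef) - 1
    history = list(series[-lags:])
    out = []
    for _ in range(steps):
        nxt = coef[0] + np.dot(coef[1:], history[-lags:])
        out.append(nxt)
        history.append(nxt)
    return np.array(out)

rng = np.random.default_rng(0)
census = 50 + np.cumsum(rng.normal(1.0, 3.0, size=90))   # synthetic daily inpatient counts
coef = fit_ar(census, lags=7)
print(forecast(census, coef, steps=14).round(1))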
Study: Machine learning can predict market behavior
Machine learning can assess the effectiveness of mathematical tools used to predict the movements of financial markets, according to new Cornell research based on the largest dataset ever used in this area.
New Machine Learning Tool Predicts Devastating Intestinal Disease in Premature Infants
Researchers from Columbia Engineering and the University of Pittsburgh have developed a sensitive and specific early warning system for predicting necrotizing enterocolitis (NEC) in premature infants before the life-threatening intestinal disease occurs. The prototype predicts NEC accurately and early, using stool microbiome features combined with clinical and demographic information. The researchers note: “The lessons we’ve learned from our new technique could well translate to other genetic or proteomic datasets and inspire new machine learning algorithms for healthcare datasets.”
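As a rough illustration of this kind of early-warning model, the sketch below combines synthetic microbiome relative-abundance features with a few clinical variables in a standard scikit-learn classifier. The feature names, classifier choice and data are assumptions for illustration, not the published model.

# Generic sketch: combine microbiome relative abundances with clinical and
# demographic features in a single classifier. NOT the Columbia/Pitt model;
# the feature set and gradient-boosting choice are assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
microbiome = rng.dirichlet(np.ones(50), size=n)     # 50 taxa, relative abundances (synthetic)
clinical = np.column_stack([
    rng.normal(30, 3, n),                           # gestational age in weeks (assumed feature)
    rng.normal(1400, 300, n),                       # birth weight in grams (assumed feature)
    rng.integers(0, 2, n),                          # antibiotic exposure flag (assumed feature)
])
X = np.hstack([microbiome, clinical])
y = rng.integers(0, 2, n)                           # 1 = later NEC diagnosis (synthetic labels)

clf = GradientBoostingClassifier()
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())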
The University of Chicago is awarded a major federal contract to host a new COVID-19 medical imaging resource center
A new center hosted at the University of Chicago — co-led by the largest medical imaging professional organizations in the country — will help tackle the ongoing COVID-19 pandemic by curating a massive database of medical images to help better understand and treat the disease. The work is supported by a $20 million, two-year federal contract that could be renewable to $50 million over five years.
Speaker Change: International Year of Sound Events Explore Acoustics from Steelpan Music to Oceanography
The Acoustical Society of America continues to host virtual events in August as part of the International Year of Sound. The ASA Student Council will host Virtual Student Summer Talks, in which science students will present their research on topics ranging from acoustical oceanography to speech communication, and Andrew Morrison will discuss how the acoustical physics of the steelpan helps machine learning algorithms process large datasets. All events are open to the public, and admission is free.
Machine Learning Probes 3D Microstructures
Scientists have developed a machine learning technique for materials research at the atomic and molecular scales. The technique visualizes and quantifies the atomic and molecular structures in three-dimensional samples in real time. It is designed primarily to identify and characterize microstructures in 3D samples.
New machine learning method allows hospitals to share patient data — privately
Penn Medicine researchers have shown that federated learning succeeds specifically in the context of brain imaging: it can analyze magnetic resonance imaging (MRI) scans of brain tumor patients and distinguish healthy brain tissue from cancerous regions.
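The underlying idea, often called federated averaging, can be shown briefly: each site trains on its own data, only model weights leave the site, and a coordinator averages them. The sketch below uses synthetic data and logistic regression; it is a generic illustration of the technique, not Penn Medicine's brain-imaging pipeline.

# Minimal federated-averaging sketch: hospitals train locally, share only
# weights, and a coordinator averages them. Synthetic data; generic example.
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=20):
    """A few epochs of logistic-regression gradient descent on one site's data."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

rng = np.random.default_rng(42)
d = 10
true_w = rng.normal(size=d)
sites = []
for _ in range(3):                          # three participating hospitals (synthetic data)
    X = rng.normal(size=(500, d))
    y = (X @ true_w + rng.normal(scale=0.5, size=500) > 0).astype(float)
    sites.append((X, y))

w_global = np.zeros(d)
for _ in range(10):                         # federated rounds: broadcast, train locally, average
    local_ws = [local_update(w_global.copy(), X, y) for X, y in sites]
    w_global = np.mean(local_ws, axis=0)

print(w_global.round(2))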
New cell profiling method could speed TB drug discovery
A new cell profiling technology combines high throughput imaging and machine learning to provide a rapid, cost-effective way to determine how specific compounds act to destroy the bacterium that causes tuberculosis. It could speed discovery of anti-TB drugs and be applied to other pathogens.
Tulane scientists partner with U.S. Army on machine learning study
The project could pave the way for small, mobile quantum networks and possibly lead to unbreakable, secure communication systems, quantum computers and enhanced radar.
Photon-Based Processing Units Enable More Complex Machine Learning
Machine learning performed by neural networks is a popular approach to developing artificial intelligence, as researchers aim to replicate brain functionalities for a variety of applications. A paper in the journal Applied Physics Reviews proposes a new approach to perform computations required by a neural network, using light instead of electricity. In this approach, a photonic tensor core performs matrix multiplications in parallel, improving the speed and efficiency of current deep learning paradigms.
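For readers unfamiliar with the operation being accelerated: the expensive step in a dense neural-network layer is a matrix multiplication. The short sketch below shows that step in ordinary NumPy; it is a generic illustration of the workload, not a model of the optical hardware described in the paper.

# A dense layer is y = activation(x @ W + b); the matrix multiply is the
# workload a photonic tensor core would accelerate. Generic illustration only.
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=(64, 256))        # a batch of 64 input vectors
W = rng.normal(size=(256, 128))       # layer weights
b = np.zeros(128)

y = np.maximum(0.0, x @ W + b)        # the matrix multiplication dominates the cost
print(y.shape)                        # (64, 128)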
Children with type 1 diabetes may have a less desirable gut bacteria composition
Children with type 1 diabetes have a less desirable gut microbiome composition, which may play a role in the development of the disease, according to new research published in the Endocrine Society’s Journal of Clinical Endocrinology & Metabolism.
Machine Learning Speeds Molecular Motion Modeling
Molecular dynamics is central to many questions in modern chemistry. However, computer models of molecular dynamics must balance computational cost and accuracy. Scientists have now used a machine learning technique called transfer learning to create a novel model of molecular motion that is as accurate as calculations that use quantum-mechanical physics but much faster.
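The transfer-learning idea can be sketched in a few lines: pre-train a surrogate model on plentiful low-cost data, then fine-tune it on a small set of expensive, high-accuracy calculations. The toy below uses scikit-learn's MLPRegressor with warm_start on synthetic 1-D "energies"; it is an assumption-laden illustration of the general technique, not the authors' interatomic potential.

# Transfer-learning toy: pre-train on cheap approximate data, fine-tune on a
# small accurate set. Synthetic data; generic sketch, not the published model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

# Plentiful cheap data: approximate "energies" with systematic error
X_cheap = rng.uniform(-3, 3, size=(5000, 1))
y_cheap = np.sin(X_cheap[:, 0]) + 0.3 * X_cheap[:, 0]

# Scarce accurate data: the high-level ("quantum-quality") reference
X_accurate = rng.uniform(-3, 3, size=(100, 1))
y_accurate = np.sin(X_accurate[:, 0])

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, warm_start=True)
model.fit(X_cheap, y_cheap)          # pre-training on cheap data
model.fit(X_accurate, y_accurate)    # fine-tuning continues from the learned weights

X_test = np.linspace(-3, 3, 7).reshape(-1, 1)
print(np.round(model.predict(X_test) - np.sin(X_test[:, 0]), 3))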
Predicting X-ray Absorption Spectra from Graphs
Scientists built a machine learning model that can rapidly predict how atoms absorb X-rays for materials science research.
Supercomputer Simulations Help Researchers Predict Solar Wind Storms
Researchers at the University of New Hampshire used SDSC’s Comet supercomputer to validate a model using a machine learning technique called Dynamic Time Lag Regression (DTLR) to help predict the solar wind arrival near the Earth’s orbit from physical parameters of the Sun.
Artificial intelligence identifies, locates seizures in real-time
Research from the McKelvey School of Engineering at Washington University in St. Louis has shown that understanding brain activity as a network, rather than as individual EEG readings, allows for more accurate and efficient detection of seizures in real time.
Researchers use machine learning to build COVID-19 predictions
Researchers at Binghamton University, State University of New York are using machine learning to track the coronavirus and predict where it might surge next.
Neural network can determine lung cancer severity
NIBIB-funded researchers at Stanford University have created an artificial neural network that analyzes lung CT scans to provide information about lung cancer severity that can guide treatment options.
Six Argonne researchers receive DOE Early Career Research Program awards
Argonne scientists Michael Bishof, Maria Chan, Marco Govoni, Alessandro Lovato, Bogdan Nicolae and Stefan Wild have received funding for their research as part of DOE’s Early Career Research Program.
Machine Learning Has a Flaw. It’s Gullible
Potential biases that limit the effectiveness of machine learning process technologies, and the scope for human capital to be complementary in reducing such biases, are explored by Rajshree Agarwal and Evan Starr at the University of Maryland’s Robert H. Smith…
Predicting Side Effects
At a glance:
• Scientists develop AI-based tool to predict adverse drug events
• Such events are responsible for some 2 million U.S. hospitalizations per year
• The free, open-source system could enable safer drug design and optimize drug safety
New imaging method tracks brain’s elusive networks
Understanding the source and network of signals as the brain functions is a central goal of brain research. Now, Carnegie Mellon engineers have created a system for high-density EEG imaging of the origin and path of normal and abnormal brain signals.
Argonne to collaborate with Raytheon Technologies to accelerate aircraft engine design
This new agreement will dramatically improve fluid dynamics models while reducing their computational expense. Both partners aim to improve the design and durability of engine components.
ISPOR Short Course Program Now Offered Virtually
ISPOR—The Professional Society for Health Economics and Outcomes Research (HEOR) announced that its HEOR Short Course Program is now being offered virtually with 9 upcoming short courses in June and July.
RENEWABLE ENERGY ADVANCE
In order to identify materials that can improve storage technologies for fuel cells and batteries, you need to be able to visualize the actual three-dimensional structure of a particular material up close and in context. Researchers from the University of Delaware’s Catalysis Center for Energy Innovation (CCEI) have done just that, developing new techniques for characterizing complex materials.
Researchers use drones, machine learning to detect dangerous ‘butterfly’ landmines
Using advanced machine learning, drones could be used to detect dangerous “butterfly” landmines in remote regions of post-conflict countries, according to research from Binghamton University, State University of New York.
Using Machine Learning to Estimate COVID-19’s Seasonal Cycle
One of the many unanswered scientific questions about COVID-19 is whether it is seasonal like the flu – waning in warm summer months then resurging in the fall and winter. Now scientists at Lawrence Berkeley National Laboratory (Berkeley Lab) are launching a project to apply machine-learning methods to a plethora of health and environmental datasets, combined with high-resolution climate models and seasonal forecasts, to tease out the answer.
How Big Data and Artificial Intelligence Can Help Improve Healthcare Decision Making
ISPOR held its second Virtual ISPOR 2020 plenary session this afternoon, “Health Economics and Outcomes Research and Clinical Decision Making—Advancing Meaningful Progress.”