Researchers Develop Novel Analog Processor for High Performance Computing

Analog photonic solutions offer unique opportunities to address complex computational tasks with unprecedented performance in terms of energy dissipation and speed, overcoming current limitations of modern computing architectures based on electron flows and digital approaches. In a new study published…

Department of Energy Announces $15.1 Million for Integrated Computational and Data Infrastructure for Science Research

The U.S. Department of Energy (DOE) announced $15.1 million for three collaborative research projects at five universities to advance the development of a flexible multi-tiered data and computational infrastructure to support a diverse collection of on-demand scientific data processing tasks and computationally intensive simulations.

Main Attraction: Scientists Create World’s Thinnest Magnet

Scientists at Berkeley Lab and UC Berkeley have created an ultrathin magnet that operates at room temperature. The ultrathin magnet could lead to new applications in computing and electronics – such as spintronic memory devices – and new tools for the study of quantum physics.

Quantum material’s subtle spin behavior proves theoretical predictions

Using complementary computing calculations and neutron scattering techniques, researchers from the Department of Energy’s Oak Ridge and Lawrence Berkeley national laboratories and the University of California, Berkeley, discovered the existence of an elusive type of spin dynamics in a quantum mechanical system.

Coalition’s new leadership renews focus on advocating for academic scientific computation

The Coalition for Academic Scientific Computation, or CASC — a network of more than 90 research computing-focused organizations from academic institutions, national labs, and research centers around the U.S. — hopes to continue to bring this critical computing work to the forefront through its activities. Newly elected leadership and recent changes within CASC are helping the nonprofit organization excel in this mission.

Coffea speeds up particle physics data analysis

The prodigious amount of data produced at the Large Hadron Collider presents a major challenge for data analysis. Coffea, a Python package developed by Fermilab researchers, speeds up computation and helps scientists work more efficiently. Around a dozen international LHC research groups now use Coffea, which draws on big data techniques used outside physics.
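The sketch below illustrates the columnar style of analysis Coffea builds on: instead of looping over collision events one at a time, selections are expressed as operations on whole arrays. The processor-class pattern follows Coffea's documented ProcessorABC interface, but the exact module layout varies between releases, and the event fields used here (Muon, pt) are assumed NanoAOD-style names rather than anything from a specific analysis.

```python
# A minimal, illustrative Coffea-style processor (API details differ across
# Coffea versions; the Muon/pt field names assume a NanoAOD-like schema).
import awkward as ak
from coffea import processor

class LeadingMuonPt(processor.ProcessorABC):
    """Keep events with exactly two muons and summarize the leading-muon pT."""

    def process(self, events):
        muons = events.Muon                      # jagged array: muons per event
        dimuon = muons[ak.num(muons) == 2]       # events with exactly two muons
        lead_pt = ak.max(dimuon.pt, axis=1)      # leading-muon pT in each event
        return {"n_selected": len(dimuon), "mean_lead_pt": float(ak.mean(lead_pt))}

    def postprocess(self, accumulator):
        return accumulator
```

Because every step operates on whole arrays rather than a Python loop over individual events, the same code can be distributed across many files and worker nodes, which is where the speed-up over conventional event loops comes from.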

Nikhil Tiwale: Practicing the Art of Nanofabrication

Applying his passions for science and art, Nikhil Tiwale—a postdoc at Brookhaven Lab’s Center for Functional Nanomaterials—is fabricating new microelectronics components.

Four Rutgers Professors Named AAAS Fellows

Four Rutgers professors have been named fellows of the American Association for the Advancement of Science (AAAS), an honor given to AAAS members by their peers. They join 485 other new AAAS fellows as a result of their scientifically or socially distinguished efforts to advance science or its applications. A virtual induction ceremony is scheduled for Feb. 13, 2021.

Globus Moves 1 Exabyte

Globus, a leading research data management service, reached a huge milestone by breaking the exabyte barrier. While it took over 2,000 days for the service to transfer the first 200 petabytes (PB) of data, the last 200 PB were moved in just 247 days. This rapidly accelerating growth is reflected by the more than 150,000 registered users who have now transferred over 120 billion files using Globus.
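Taking the announcement's own figures at face value (over 2,000 days for the first 200 PB versus 247 days for the most recent 200 PB), a quick back-of-the-envelope calculation shows roughly an eight-fold increase in average transfer rate:

```python
# Back-of-the-envelope average rates implied by the quoted figures.
first_rate = 200 / 2000       # ~0.10 PB/day for the first 200 PB
recent_rate = 200 / 247       # ~0.81 PB/day for the most recent 200 PB
print(f"first 200 PB:   {first_rate:.2f} PB/day")
print(f"latest 200 PB:  {recent_rate:.2f} PB/day")
print(f"speed-up:       {recent_rate / first_rate:.1f}x")   # ~8x
```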

Scientists develop forecasting technique that could help advance quest for fusion energy

An international group of researchers has developed a technique that forecasts how tokamaks might respond to unwanted magnetic errors. These forecasts could help engineers design fusion facilities that create a virtually inexhaustible supply of safe and clean fusion energy to generate electricity.

White House Office of Science and Technology Policy, National Science Foundation and Department of Energy Announce Over $1 Billion in Awards for Artificial Intelligence and Quantum Information Science Research Institutes

Today, the White House Office of Science and Technology Policy, the National Science Foundation (NSF), and the U.S. Department of Energy (DOE) announced over $1 billion in awards for the establishment of 12 new artificial intelligence (AI) and quantum information science (QIS) research institutes nationwide.

Photon-Based Processing Units Enable More Complex Machine Learning

Machine learning performed by neural networks is a popular approach to developing artificial intelligence, as researchers aim to replicate brain functionalities for a variety of applications. A paper in the journal Applied Physics Reviews proposes a new approach to performing the computations required by a neural network, using light instead of electricity. In this approach, a photonic tensor core performs matrix multiplications in parallel, improving the speed and efficiency of current deep learning paradigms.
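The operation a tensor core accelerates, whether photonic or electronic, is the matrix multiplication at the heart of every dense neural-network layer. The snippet below shows only that underlying math, with NumPy as a stand-in; in the photonic approach described in the paper, the same multiply-accumulate operations would be carried out in parallel in the optical domain rather than on a CPU.

```python
import numpy as np

# A dense layer is, at its core, one matrix multiplication plus a bias:
# this multiply-accumulate workload is what a tensor core parallelizes.
batch, n_in, n_out = 4, 8, 3
x = np.random.rand(batch, n_in)     # a batch of input activations
W = np.random.rand(n_in, n_out)     # layer weights
b = np.zeros(n_out)                 # bias term
y = x @ W + b                       # output activations, shape (batch, n_out)
print(y.shape)                      # (4, 3)
```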

Preparing for exascale: LLNL breaks ground on computing facility upgrades

To meet the needs of tomorrow’s supercomputers, the National Nuclear Security Administration’s (NNSA’s) Lawrence Livermore National Laboratory (LLNL) has broken ground on its Exascale Computing Facility Modernization (ECFM) project, which will substantially upgrade the mechanical and electrical capabilities of the Livermore Computing Center.

Organic Memory Devices Show Promise for Flexible, Wearable, Personalized Computing

The advent of artificial intelligence, machine learning and the internet of things is expected to change modern electronics. The pressing question for many researchers is how to handle this technological revolution. Brain-inspired electronics with organic memristors could offer a functionally promising and cost-effective platform. Since memristors are functionally analogous to the operation of neurons, the computing units in the brain, they are optimal candidates for brain-inspired computing platforms.
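One standard way to picture the analogy, offered here as illustration rather than as a description of the devices in this particular study, is a memristive crossbar: each device's conductance stores a synaptic weight, and applying input voltages yields output currents that are weighted sums, the same multiply-accumulate a neuron performs. The conductance and voltage values below are hypothetical.

```python
import numpy as np

# Illustrative only: a memristive crossbar computes weighted sums in place.
# Conductance G[i, j] acts as a synaptic weight; with voltages V on the rows,
# Ohm's law per device and Kirchhoff's current law per column give I = G^T @ V.
G = np.array([[0.2, 0.7],        # hypothetical conductances (arbitrary units)
              [0.5, 0.1],
              [0.9, 0.4]])
V = np.array([1.0, 0.0, 0.5])    # input voltages on the three rows
I = G.T @ V                      # one weighted-sum output current per column
print(I)                         # [0.65 0.9 ]
```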

Recipe for Neuromorphic Processing Systems?

The field of “brain-mimicking” neuromorphic electronics shows great potential for basic research and commercial applications, and researchers in Germany and Switzerland recently explored the possibility of reproducing the physics of real neural circuits by using the physics of silicon. In Applied Physics Letters, they present their work to understand neural processing systems, as well as a recipe to reproduce these computing principles in mixed signal analog/digital electronics and novel materials.

ORNL researchers develop ‘multitasking’ AI tool to extract cancer data in record time

To better leverage cancer data for research, scientists at ORNL are developing an artificial intelligence (AI)-based natural language processing tool to improve information extraction from textual pathology reports. In a first for cancer pathology reports, the team developed a multitask convolutional neural network (CNN)—a deep learning model that learns to perform tasks, such as identifying key words in a body of text, by processing language as a two-dimensional numerical dataset.
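As a rough illustration of the architecture described, not of ORNL's actual model, the sketch below embeds token IDs into a two-dimensional (sequence by embedding) numerical array, runs a shared convolution over it, and attaches a separate classification head per task. All dimensions, task names, and the choice of PyTorch are assumptions made for the example.

```python
import torch
import torch.nn as nn

# Illustrative multitask text CNN (dimensions and task names are made up):
# a shared convolutional body extracts features from the embedded report text,
# and separate linear heads produce one prediction per task.
class MultitaskTextCNN(nn.Module):
    def __init__(self, vocab=5000, emb=64, channels=128, n_site=10, n_hist=4):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, channels, kernel_size=5, padding=2)
        self.site_head = nn.Linear(channels, n_site)    # e.g. cancer site
        self.hist_head = nn.Linear(channels, n_hist)    # e.g. histology

    def forward(self, token_ids):                       # (batch, seq_len) ints
        x = self.embed(token_ids).transpose(1, 2)       # (batch, emb, seq_len)
        x = torch.relu(self.conv(x)).max(dim=2).values  # global max pooling
        return self.site_head(x), self.hist_head(x)     # one output per task

model = MultitaskTextCNN()
reports = torch.randint(0, 5000, (2, 200))              # two dummy tokenized reports
site_logits, hist_logits = model(reports)
print(site_logits.shape, hist_logits.shape)             # (2, 10) and (2, 4)
```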

The Big Questions: Ian Foster on High-Performance Computing

The Big Questions series features perspectives from the five recipients of the Department of Energy Office of Science’s 2019 Distinguished Scientists Fellows Award describing their research and what they plan to do with the award. Ian Foster is the director of Argonne National Laboratory’s Data Science and Learning Division.

Reinventing the Computer: Brain-Inspired Computing for a Post-Moore’s Law Era

Since the invention of the transistor in 1947, computing development has seen a consistent doubling, roughly every two years, of the number of transistors that can fit on a chip. But that trend, Moore’s Law, may reach its limit as components of submolecular size encounter problems with thermal noise, making further scaling impossible. In this week’s Applied Physics Reviews, researchers present an examination of the computing landscape, focusing on functions needed to advance brain-inspired neuromorphic computing.

Wireless networking researcher wins Air Force’s Young Investigator Award for research into smart drones

Northern Arizona University assistant professor Fatemeh Afghah is one of 40 recipients of the grant, given to foster creative basic research in science and engineering, enhance career development and provide opportunities for engineers to address military challenges in science and engineering.

Machine learning analyses help unlock secrets of stable ‘supercrystal’

By blasting a frustrated mixture of materials with quick pulses of laser light, researchers transformed a superlattice into a supercrystal, a rare, repeating, three-dimensional structure much larger than an ordinary crystal. Using machine learning techniques, they studied the underlying structure of this sample at the nanoscale before and after applying the laser pulse treatment.

NUS deep-learning AI system puts Singapore on global map of big data analytics

A team of researchers from the National University of Singapore (NUS) has put Singapore on the global map of Artificial Intelligence (AI) and big data analytics. Their open-source project, called Apache SINGA, “graduated” from the Apache Incubator on 16 October 2019 and is now Southeast Asia’s first Top-Level Project (TLP) under the Apache Software Foundation, the world’s largest open-source software community.

The Technological Heavyweight You’ve Probably Never Heard Of: ESnet

Since connecting its first computers more than 30 years ago, ESnet has expanded to connect more than 40 major research institutions at speeds 15,000 times faster than a home network. From acting as an early adopter of protocols that now run the internet to making today’s scientific discoveries possible, ESnet is the big player in the internet you’ve probably never heard of.