AI and photonics join forces to make it easier to find ‘new Earths’


Australian scientists have developed a new type of sensor to measure and correct the distortion of starlight caused by viewing through the Earth’s atmosphere, which should make it easier to study the possibility of life on distant planets.

Using artificial intelligence and machine learning, University of Sydney optical scientists have developed a sensor that can neutralise a star’s ‘twinkle’ caused by heat variations in the Earth’s atmosphere. This will make it easier to discover and study planets in distant solar systems from optical telescopes on Earth.

“The main way we identify planets orbiting distant stars is by measuring regular dips in starlight caused by planets blocking out bits of their sun,” said lead author Dr Barnaby Norris, who holds a joint position as a Research Fellow in the University of Sydney Astrophotonic Instrumentation Laboratory and in the University of Sydney node of Australian Astronomical Optics in the School of Physics.

“This is really difficult from the ground, so we needed to develop a new way of looking up at the stars. We also wanted to find a way to directly observe these planets from Earth,” he said.

The team’s invention will now be deployed on one of the largest optical telescopes in the world, the 8.2-metre Subaru Telescope in Hawaii, operated by the National Astronomical Observatory of Japan.

“It is really hard to separate a star’s ‘twinkle’ from the light dips caused by planets when observing from Earth,” Dr Norris said. “Most observations of exoplanets have come from orbiting telescopes, such as NASA’s Kepler. With our invention, we hope to launch a renaissance in exoplanet observation from the ground.”

The research is published today in Nature Communications.



NOVEL METHODS

The new ‘photonic wavefront sensor’ will help astronomers directly image exoplanets around distant stars from Earth.

Over the past two decades, thousands of planets beyond our solar system have been detected, but only a small handful have been directly imaged from Earth. This severely limits scientific exploration of these exoplanets.

Making an image of the planet gives far more information than indirect detection methods, like measuring starlight dips. Earth-like planets might appear a billion times fainter than their host star. And observing the planet separately from its star is like looking at a 10-cent coin held in Sydney, as viewed from Melbourne.
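
For a rough sense of that scale, the coin analogy corresponds to an apparent size of only a few thousandths of an arcsecond. The snippet below is an illustrative back-of-envelope calculation, not from the paper, assuming a 23.6 mm coin and a straight-line Sydney-Melbourne distance of roughly 714 km.

```python
# Back-of-envelope check of the coin analogy (illustrative assumptions only:
# coin diameter ~23.6 mm, Sydney-Melbourne distance ~714 km).
import math

coin_diameter_m = 0.0236
distance_m = 714e3

angle_rad = coin_diameter_m / distance_m          # small-angle approximation
angle_mas = math.degrees(angle_rad) * 3600 * 1e3  # radians -> milliarcseconds

print(f"Apparent size: {angle_mas:.1f} milliarcseconds")  # about 7 mas
```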

To solve this problem, the scientific team in the School of Physics developed a ‘photonic wavefront sensor’: a new way to measure the exact distortion caused by the atmosphere, so it can then be corrected by the telescope’s adaptive optics system thousands of times a second.
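
In practice this means the sensor sits inside a closed adaptive-optics loop: measure the residual distortion, update a deformable mirror, and repeat thousands of times per second. The sketch below is a minimal, hypothetical illustration of such a loop; the sensor and deformable-mirror objects and the control gain are placeholders, not the actual Subaru/SCExAO control software.

```python
# Minimal closed-loop adaptive optics sketch (hypothetical interfaces).
import time

def run_ao_loop(sensor, deformable_mirror, rate_hz=2000, gain=0.5):
    """Measure the wavefront error and correct it ~2000 times per second."""
    period = 1.0 / rate_hz
    correction = 0.0
    while True:
        error = sensor.measure()                # photonic wavefront sensor reading
        correction = correction - gain * error  # simple integrator controller
        deformable_mirror.apply(correction)     # flatten the incoming wavefront
        time.sleep(period)                      # real systems use hardware timing
```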

“This new sensor merges advanced photonic devices with deep learning and neural network techniques to achieve an unprecedented type of wavefront sensor for large telescopes,” Dr Norris said.

“Unlike conventional wavefront sensors, it can be placed at the same location in the optical instrument where the image is formed. This means it is sensitive to types of distortions invisible to other wavefront sensors currently used in large observatories,” he said.


Professor Olivier Guyon from the Subaru Telescope and the University of Arizona is one of the world’s leading experts in adaptive optics. He said: “This is no doubt a very innovative approach and very different to all existing methods. It could potentially resolve several major limitations of the current technology. We are currently working in collaboration with the University of Sydney team towards testing this concept at Subaru in conjunction with SCExAO, which is one of the most advanced adaptive optics systems in the world.”



APPLICATION BEYOND ASTRONOMY

The scientists have achieved this remarkable result by building on a novel method to measure (and correct) the wavefront of light that passes through atmospheric turbulence directly at the focal plane of an imaging instrument. This is done using an advanced light converter, known as a photonic lantern, linked to a neural network inference process.
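
In outline, the lantern converts the focal-plane starlight into a set of single-mode outputs whose measured intensities encode the wavefront, and a trained neural network inverts that mapping into wavefront coefficients. The sketch below illustrates that idea only; the layer sizes, the number of lantern outputs (19) and the number of recovered Zernike modes (9) are assumptions for illustration, not the architecture reported in the paper.

```python
# Illustrative sketch: a small network mapping photonic-lantern output
# intensities to low-order Zernike wavefront coefficients (all sizes assumed).
import torch
import torch.nn as nn

N_LANTERN_OUTPUTS = 19   # assumed number of single-mode lantern ports
N_ZERNIKE_MODES = 9      # assumed number of wavefront modes to recover

model = nn.Sequential(
    nn.Linear(N_LANTERN_OUTPUTS, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, N_ZERNIKE_MODES),
)

# Training pairs measured lantern fluxes with known injected aberrations
# (e.g. from a calibration source or simulation); random data stands in here.
fluxes = torch.rand(64, N_LANTERN_OUTPUTS)
true_coeffs = torch.rand(64, N_ZERNIKE_MODES)

optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.functional.mse_loss(model(fluxes), true_coeffs)
loss.backward()
optimiser.step()

# At run time, one forward pass turns a set of lantern fluxes into a wavefront
# estimate that the adaptive optics system can then correct.
```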

“This is a radically different approach to existing methods and resolves several major limitations of current approaches,” said co-author Jin (Fiona) Wei, a postgraduate student at the Sydney Astrophotonic Instrumentation Laboratory.

The Director of the Sydney Astrophotonic Instrumentation Laboratory in the School of Physics at the University of Sydney, Associate Professor Sergio Leon-Saval, said: “While we have come to this to solve a problem in astronomy, the proposed technique is extremely relevant to a wide range of fields. It could be applied in optical communications, remote sensing, in-vivo imaging and any other field that involves the reception or transmission of accurate wavefronts through a turbulent or turbid medium, such as water, blood or air.”

###

DOWNLOAD a copy of the paper and photos of the research team at this link.



INTERVIEWS

Dr Barnaby Norris | [email protected]
Associate Professor Sergio Leon-Saval | [email protected]



MEDIA ENQUIRIES

Marcus Strom | [email protected] | +61 423 982 485



PUBLICATION

‘An all-photonic focal-plane wavefront sensor’, Nature Communications.
DOI: 10.1038/s41467-020-19117-w

Authors: Barnaby Norris, Jin Wei, Christopher Betters, Alison Wong, Sergio Leon-Saval (All affiliations: University of Sydney)



DECLARATION

The authors declare no external funding.

This information is sourced from https://www.eurekalert.org/pub_releases/2020-10/uos-aap102120.php
