
Boffins turn to AI to zip through piles of gravitational lenses

Fast method gives researchers more time to answer bigger questions about our Universe

A group of physicists has trained an artificial neural network to analyze gravitational lensing images ten million times faster than normal computational methods.

Gravitational lensing is "the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures," according to a paper published in Nature on Tuesday.

Such lenses arise where massive structures warp spacetime, and they are valuable to astronomers: they offer a window onto distant objects that would otherwise be difficult to observe, and can help answer questions about dark matter and the expansion rate of the universe.

The lenses act as natural telescopes. Light emitted from a far-flung source is bent as it passes through a distribution of matter, such as a cluster of galaxies, on its way to Earth, where the distorted images can be detected.

The stronger the gravitational field, the more the distant light rays are bent. The most drastic example is when the distant light rays form an Einstein ring around the gravitational lensing system, which happens when the light source, the massive lensing object, and the observer lie in a straight line.
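For scale, the angular radius of that ring (the Einstein radius) depends only on the lens mass and the distances involved. The back-of-the-envelope Python below uses the textbook point-mass formula; the lens mass and distances are made-up, galaxy-scale numbers, not values from the paper.

```python
import math

# Point-mass Einstein radius: theta_E = sqrt(4*G*M/c^2 * D_ls / (D_l * D_s))
# All numbers below are illustrative assumptions, not figures from the paper.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_SUN = 1.989e30       # solar mass, kg
GPC = 3.086e25         # one gigaparsec in metres

M = 1e12 * M_SUN                              # assumed lens mass: a large galaxy
D_l, D_s, D_ls = 1 * GPC, 2 * GPC, 1 * GPC    # assumed angular-diameter distances

theta_E = math.sqrt(4 * G * M / c**2 * D_ls / (D_l * D_s))        # radians
print(f"Einstein radius ~ {math.degrees(theta_E) * 3600:.1f} arcsec")  # roughly 2 arcsec
```

Arcsecond-scale rings like this are exactly the galaxy-galaxy lenses the new method is aimed at.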

It can take months to churn through data to analyze gravitational lenses. But a new method using a convolutional neural network (CNN), traditionally used for computer vision, can complete the process “within a fraction of a second” and with similar accuracy.

The neural net spits out vital information such as the radius of the Einstein ring, the ellipticity of the system, and the coordinates of the center of the lens.

Laurence Perreault Levasseur, co-author of the paper and a postdoctoral fellow at the SLAC National Accelerator Laboratory at Stanford University in California, said it lets researchers analyze the images “in a fully automated way and, in principle, on a cell phone’s computer chip.”
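To make the idea concrete, the toy sketch below shows the general shape of such a network: a few convolutional layers that regress a handful of lens parameters straight from an image. The layer sizes, input resolution, and five-parameter output are illustrative assumptions, not the architecture the SLAC team actually used.

```python
import torch
import torch.nn as nn

class LensParamNet(nn.Module):
    """Toy CNN that maps a single-band lens image to five lens parameters.

    The layer choices and the output set (Einstein radius, two ellipticity
    components, centre x, centre y) are assumptions for illustration only.
    """
    def __init__(self, n_params: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, n_params),        # plain regression, so no activation
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

model = LensParamNet()
fake_batch = torch.randn(8, 1, 96, 96)       # 8 images, 96x96 pixels (assumed size)
print(model(fake_batch).shape)               # torch.Size([8, 5])
```

A single forward pass through a network of this size takes milliseconds, which is where the "fraction of a second" speed-up comes from.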

You’ve got to fake it to make it

Real data is sparse, so the researchers simulated half a million gravitational lens images to train the CNN. The fake images have to be as realistic as possible for the CNN to be useful in dealing with real data, so blurring and noise effects were added.
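As a rough illustration of that degradation step, the sketch below convolves an idealised ring-shaped lens image with a Gaussian point-spread function and sprinkles on pixel noise. The PSF width and noise level are arbitrary placeholders; the paper's simulations are considerably more careful.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def degrade(clean_image: np.ndarray, psf_sigma_pix: float = 1.5,
            noise_level: float = 0.02) -> np.ndarray:
    """Blur a clean simulated lens image and add Gaussian pixel noise.

    psf_sigma_pix and noise_level are illustrative values, not the paper's.
    """
    blurred = gaussian_filter(clean_image, sigma=psf_sigma_pix)   # crude PSF
    return blurred + rng.normal(0.0, noise_level, size=blurred.shape)

# A stand-in "clean" image: a faint ring of flux around the centre.
y, x = np.mgrid[-48:48, -48:48]
r = np.hypot(x, y)
clean = np.exp(-0.5 * ((r - 20.0) / 3.0) ** 2)   # ring of radius 20 pixels
observed = degrade(clean)
print(observed.shape, observed.std())
```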

The training images, along with the corresponding Einstein radius, ellipticity, and coordinates of the center of each simulated lens, were then fed into the CNN.
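In outline, that training step is ordinary supervised regression: show the network batches of simulated images, compare its predictions with the parameters used to generate them, and nudge the weights. The loop below reuses the toy LensParamNet from the earlier sketch on random stand-in data, so it is a shape-correct outline rather than the paper's training recipe.

```python
import torch
import torch.nn as nn

# Stand-in dataset: random images with random "true" lens parameters.
images = torch.randn(512, 1, 96, 96)
labels = torch.randn(512, 5)            # Einstein radius, ellipticity, centre (scaled)

model = LensParamNet()                  # toy network from the earlier sketch
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(3):                  # a real run would use far more data and epochs
    for start in range(0, len(images), 64):
        batch_x = images[start:start + 64]
        batch_y = labels[start:start + 64]
        optimiser.zero_grad()
        loss = loss_fn(model(batch_x), batch_y)
        loss.backward()
        optimiser.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```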

Neural networks are notoriously opaque, and it's hard to pin down how they reach their answers. Yashar Hezaveh, co-author of the paper and a researcher at Stanford, told The Register: “It’s hard to say” what features the CNN learned to extract to arrive at its output answers.

“In fact we don’t really know. As we show the examples to neural networks and ask them to make the correct predictions, they may find very complex features in the data that they can use for their predictions. We can sometimes look at the features, but they will be highly non-intuitive.

“I usually think about it like opening the brain of someone and looking inside it: it doesn’t tell us much about what the person is actually thinking about or how they see the world.”

Hezaveh hopes that this tool will help other astronomers with their research. “With the new generation of sky surveys in the next few years, we will have many tens of thousands of gravitational lenses.”

"Astrophysicists will try to use these systems to answer many different questions, including: How is dark matter distributed in galaxies? What are the most distant galaxies of the universe? What is the exact expansion rate of the universe?”

These questions are some of the biggest problems in astrophysics and require careful analysis of the gravitational lenses first.

At the moment the CNNs can only deal with simple cases where the gravitational lens is a single galaxy. The researchers hope to expand to more complex situations involving clusters of galaxies.

“Neural networks can impact all the sciences that will be done with such samples,” Hezaveh said.

The researchers’ code can be found here. ®
