The advent of AI has been hailed by many as a societal game-changer, as it opens a universe of possibilities to improve nearly every aspect of our lives.
Astronomers are now using AI, quite literally, to measure the expansion of our universe.
Two recent studies led by Maria Dainotti, a visiting professor with UNLV’s Nevada Center for Astrophysics and assistant professor at the National Astronomical Observatory of Japan (NAOJ), incorporated multiple machine learning models to add a new level of precision to distance measurements for gamma-ray bursts (GRBs) – the most luminous and violent explosions in the universe.
In just a few seconds, GRBs release the same amount of energy our sun releases in its entire lifetime. Because they are so bright, GRBs can be observed at multiple distances – including at the edge of the visible universe – and aid astronomers in their quest to chase the oldest and most distant stars. But, due to the limits of current technology, only a small percentage of known GRBs have all of the observational characteristics needed to aid astronomers in calculating how far away they occurred.
Dainotti and her teams combined GRB data from NASA’s Neil Gehrels Swift Observatory with multiple machine learning models to overcome the limitations of current observational technology and more precisely estimate the distances of GRBs for which the distance is unknown. Because GRBs can be observed both far away and at relatively close distances, knowing where they occurred can help scientists understand how stars evolve over time and how many GRBs can occur in a given space and time.
“This research pushes forward the frontier in both gamma-ray astronomy and machine learning,” said Dainotti. “Follow-up research and innovation will help us achieve even more reliable results and enable us to answer some of the most pressing cosmological questions, including the earliest processes of our universe and how it has evolved over time.”
AI Pushes the Limits of Deep-Space Observation
In one study, Dainotti and Aditya Narendra, a final-year doctoral student at Poland’s Jagiellonian University, used several machine learning methods to precisely measure the distance of GRBs observed by the space-based Swift UltraViolet/Optical Telescope (UVOT) and by ground-based telescopes, including the Subaru Telescope. The measurements were based solely on other, non-distance-related GRB properties. The research was published May 23 in The Astrophysical Journal Letters.
“The outcome of this study is so precise that we can use the predicted distances to determine the number of GRBs in a given volume and time (called the rate), which comes very close to the actual observed estimates,” said Narendra.
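Under the hood, this is a regression problem: algorithms learn a mapping from distance-independent observables to redshift using GRBs whose distances are already measured, then apply that mapping to bursts whose distances are not. The sketch below is purely illustrative – the feature values, model choice, and numbers are placeholders, not the study’s actual pipeline.

```python
# Illustrative sketch only – synthetic placeholder data,
# not the study's actual features or pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical distance-independent observables (e.g., flux, duration,
# spectral index) for GRBs whose redshifts are already measured.
X_known = rng.random((200, 3))
z_known = rng.uniform(0.1, 8.0, 200)   # stand-ins for measured redshifts

model = RandomForestRegressor(n_estimators=500, random_state=0)

# Out-of-sample predictions gauge how well observables alone recover distance.
z_cv = cross_val_predict(model, X_known, z_known, cv=5)

# Fit on every GRB with a known redshift, then predict for the rest.
model.fit(X_known, z_known)
X_unknown = rng.random((50, 3))        # GRBs lacking a measured redshift
z_pred = model.predict(X_unknown)
```

In a setup like this, the cross-validated predictions are the natural check on whether predicted distances can reproduce quantities such as the observed GRB rate.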
Another study, led by Dainotti and international collaborators, successfully measured GRB distances with machine learning using afterglow data from the X-ray Telescope (XRT) aboard NASA’s Swift Observatory for what are known as long GRBs. GRBs are believed to occur in different ways. Long GRBs happen when a massive star reaches the end of its life and explodes in a spectacular supernova. Another type, known as short GRBs, happens when the remnants of dead stars, such as neutron stars, spiral together under gravity and collide.
Dainotti says the novelty of this approach comes from using several machine-learning methods together to improve their collective predictive power. This method, called Superlearner, assigns each algorithm a weight ranging from 0 to 1, with each weight corresponding to the predictive power of that individual method.
“The advantage of the Superlearner is that the final prediction always performs better than that of the singular models,” said Dainotti. “The Superlearner is also used to discard the algorithms that are the least predictive.”
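As a rough illustration of that idea – an assumption-laden sketch, not the published implementation – a superlearner can be built by scoring each base algorithm out of sample, converting the scores into weights between 0 and 1, dropping the weakest models, and averaging the survivors’ predictions:

```python
# Illustrative superlearner sketch – the weighting scheme and base models
# are assumptions for demonstration, not the study's implementation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 3))                       # placeholder GRB observables
y = 4.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(0.0, 0.3, 200)  # synthetic target

base_models = {
    "ridge": Ridge(),
    "forest": RandomForestRegressor(n_estimators=300, random_state=0),
    "knn": KNeighborsRegressor(n_neighbors=10),
}

# Score each algorithm out of sample; clip at zero so poor models get no weight.
scores = {name: max(cross_val_score(m, X, y, cv=5, scoring="r2").mean(), 0.0)
          for name, m in base_models.items()}

# Normalize scores into weights between 0 and 1 that sum to 1,
# then discard the least predictive algorithms.
total = sum(scores.values())
weights = {n: s / total for n, s in scores.items() if s / total > 0.05}
norm = sum(weights.values())

# Final prediction: weighted average of the retained models.
for name in weights:
    base_models[name].fit(X, y)
prediction = sum((w / norm) * base_models[n].predict(X)
                 for n, w in weights.items())
```

The design choice that matters here is that the weights come from out-of-sample performance, so an algorithm that merely memorizes the training data cannot dominate the ensemble.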
This study, which was published Feb. 26 in The Astrophysical Journal Supplement Series, reliably estimates the distances of 154 long GRBs for which the distance was previously unknown, significantly boosting the number of long GRBs with known distances.
Answering Puzzling Questions on GRB Formation
A third study, published Feb. 21 in The Astrophysical Journal Letters and led by Stanford University astrophysicist Vahé Petrosian together with Dainotti, used Swift X-ray data to answer puzzling questions by showing that the GRB rate – at least at relatively small distances – does not follow the rate of star formation.
“This opens the possibility that long GRBs at small distances may be generated not by a collapse of massive stars but rather by the fusion of very dense objects like neutron stars,” said Petrosian.
With support from NASA’s Swift Observatory Guest Investigator program (Cycle 19), Dainotti and her colleagues are now working to make the machine learning tools publicly available through an interactive web application.