The designers of these observatories have made herculean efforts to muffle stray noise, but when your signal is so weak, noise is a constant companion.
The first task in any gravitational wave detection is to try to extract a weak signal from that noise. Field compares the process to “driving in a car with a loud muffler and a lot of static on the radio, while thinking there might be a song, a faint melody, somewhere in that noisy background.”
Astronomers take the incoming stream of data and first ask if any of it is consistent with a previously modeled gravitational waveform. They might run this preliminary comparison against tens of thousands of signals stored in their “template bank.” Researchers can’t determine the exact black hole characteristics from this procedure. They’re just trying to figure out if there’s a song on the radio.
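This “is there a song at all” step is, at its heart, matched filtering: correlate the noisy data stream against each entry in the template bank and see whether anything stands out above the noise. Here is a toy sketch of the idea in Python; the chirp shapes, the tiny bank, the noise level, and the threshold are all invented for illustration, and the collaboration’s real pipeline is vastly more elaborate:

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 4096)

def chirp(f0, f1):
    """A toy 'chirp' template: a sinusoid whose frequency sweeps from f0 to f1 Hz."""
    return np.sin(2 * np.pi * (f0 + (f1 - f0) * t / 2) * t)

# A toy "template bank": a small grid of chirps with different frequency sweeps.
bank = [chirp(f0, f1) for f0 in (30, 50, 70) for f1 in (100, 150, 200)]

# Bury one template, scaled way down, in loud Gaussian noise.
signal = 0.3 * chirp(50, 150)
data = signal + rng.normal(0, 1, t.size)

# Matched filtering: correlate the data with each normalized template.
# A score that towers over the rest suggests there's a song on the radio.
scores = [abs(np.dot(data, h)) / np.linalg.norm(h) for h in bank]
best = int(np.argmax(scores))
print(f"best-matching template index: {best}, score: {scores[best]:.1f}")
```

Even though the signal’s amplitude is far below the noise, the correlation with the right template accumulates over thousands of samples while the noise averages away, which is why this technique can pull a faint melody out of the static.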
The next step is analogous to identifying the song and determining who sang it and what instruments are playing. Researchers run tens of millions of simulations to compare the observed signal, or waveform, with those produced by black holes of differing masses and spins. This is where researchers can really nail down the details. The frequency of the gravitational wave tells you the total mass of the system. How that frequency changes over time reveals the mass ratio, and thus the masses of the individual black holes. The rate of change in the frequency also provides information about a black hole’s spin. Finally, the amplitude (or height) of the detected wave reveals how far the system is from our detectors on Earth.
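The link between frequency and mass can be made concrete with the standard leading-order “chirp mass” formula from post-Newtonian theory, which combines the wave’s frequency with its rate of change. A sketch with round illustrative numbers loosely in the ballpark of GW150914 (the values of `f` and `fdot` are assumptions for the example, not measurements):

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_sun = 1.989e30    # solar mass, kg

def chirp_mass(f, fdot):
    """Leading-order (Newtonian) chirp mass, in kg, from the gravitational-wave
    frequency f (Hz) and its rate of change fdot (Hz/s)."""
    return (c**3 / G) * (5.0 / 96.0 * math.pi ** (-8.0 / 3.0)
                         * f ** (-11.0 / 3.0) * fdot) ** (3.0 / 5.0)

# Illustrative numbers near the end of an inspiral (assumed, not observed):
f, fdot = 75.0, 1000.0   # Hz, Hz/s
print(f"chirp mass ≈ {chirp_mass(f, fdot) / M_sun:.0f} solar masses")
```

The chirp mass is a particular combination of the two individual masses, which is why tracking how the frequency evolves, rather than reading off a single number, is what lets researchers disentangle the component black holes.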
If you have to do tens of millions of simulations, they’d better be quick. “To complete that in a day, you need to do each in about a millisecond,” said Rory Smith, an astronomer at Monash University and a member of the LIGO collaboration. Yet the time needed to run a single numerical relativity simulation—one that faithfully grinds its way through the Einstein equations—is measured in days, weeks or even months.
To speed up this process, researchers typically start with the results of full supercomputer simulations—of which several thousand have been carried out so far. They then use machine learning strategies to interpolate their data, Smith said, “filling in the gaps and mapping out the full space of possible simulations.”
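As a cartoon of this surrogate-modeling idea: treat a cheap function as a stand-in for an expensive numerical-relativity run, fit a fast model through a sparse set of precomputed anchor points, and then evaluate that model essentially instantly anywhere in between. The toy function, the parameter range, and the simple polynomial fit below are all invented for illustration; real surrogate models use far more sophisticated machine learning fits over many parameters at once:

```python
import numpy as np

# Stand-in for an expensive numerical-relativity run: a toy function mapping
# a binary's mass ratio q to some scalar feature of its waveform.
def expensive_simulation(q):
    return np.sin(q) / q

# The "several thousand" full supercomputer simulations become, in this toy,
# a handful of anchor points spread across parameter space.
q_train = np.linspace(1.0, 8.0, 20)
y_train = expensive_simulation(q_train)

# Surrogate: fit a cheap model through the anchors (here a simple polynomial).
coeffs = np.polyfit(q_train, y_train, deg=9)
surrogate = np.poly1d(coeffs)

# Evaluate in microseconds at a mass ratio that was never simulated directly.
q_new = 3.7
print(f"surrogate: {surrogate(q_new):.5f}  truth: {expensive_simulation(q_new):.5f}")
```

The catch, as the next paragraph explains, is that this interpolation is only trustworthy inside the region covered by the anchor simulations; extrapolating far beyond them, as extreme mass ratios would require, is a different matter entirely.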
This “surrogate modeling” approach works well so long as the interpolated data doesn’t stray too far from the baseline simulations. But simulations for collisions with a high mass ratio are incredibly difficult. “The bigger the mass ratio, the longer the system of two inspiraling black holes takes to evolve,” Warburton explained. For a typical low-mass-ratio computation, you need to look at 20 to 40 orbits before the black holes plunge together, he said. “For a mass ratio of 1,000, you need to look at 1,000 orbits, and that would just take too long”—on the order of years. This makes the task virtually “impossible, even if you have a supercomputer at your disposal,” Field said. “And without a revolutionary breakthrough, this won’t be possible in the near future either.”