Do gravitational wave interferometers create a single pixel of data, or do they create an "image" of gravitational distortion in a 2D region? The first few Google/Wiki hits talk about the physics of interferometry, but not the actual resulting output from real hardware. I'm assuming there is no image; otherwise the articles would show one rather than artistic renditions?
You can think of each individual detector as producing a single-channel audio signal. By combining the signals from multiple detectors it's possible to determine where the signal is coming from. But the output is neither a picture nor a single pixel: it's a brief blip of a few seconds of audio-frequency time series.
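To make the "single channel" point concrete, here's a toy sketch in Python of what a detector's output looks like as data: one strain value per time sample, forming a short audio-rate time series. The numbers (frequencies, amplitude, duration) are illustrative placeholders, not real LIGO data.

    import numpy as np

    fs = 4096                        # sample rate in Hz (LIGO open data is commonly released at 4096 Hz)
    t = np.arange(0, 1.0, 1 / fs)    # one second of samples

    # Crude "chirp": frequency and amplitude rising, as in an inspiral.
    f0, f1 = 50.0, 300.0
    freq = f0 + (f1 - f0) * t                      # toy linearly rising frequency
    phase = 2 * np.pi * np.cumsum(freq) / fs       # integrate frequency to get phase
    h = 1e-21 * (t / t[-1]) * np.sin(phase)        # strain amplitudes are of order 1e-21

    print(h.shape)   # (4096,): one number per time sample, no 2D image anywhere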
A decent analogy is to think of each LIGO detector not as a camera but as a microphone.
Or a 3-pixel camera where what matters is how the pixels change color :D
The only difference between a camera and a mic is the number of vibrating things it cares about (a mic only cares about the vibration of its single membrane; a camera has millions of "membranes" on a grid, each sensitive to photons).
LIGO's three interferometers each care about the time variation of the difference between distances measured between pairs of mirrors (the two arms). So it's more than a single mic: each channel is the difference of two distance measurements, tracked over time. It would be like a 3-pixel video, with white as the baseline for each pixel, shifting towards green or blue depending on whether the difference between the mirror distances is positive or negative (colors chosen arbitrarily).
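Here's a small sketch of that 3-pixel idea, assuming an arbitrary color mapping (white at zero strain, shading towards green for positive and blue for negative); the strain scale used for normalization is made up for illustration.

    import numpy as np

    def strain_to_rgb(h, h_max=1e-21):
        """Map a signed strain value to an RGB triple in [0, 1] (arbitrary color choice)."""
        x = float(np.clip(h / h_max, -1.0, 1.0))
        if x >= 0:
            return (1.0 - x, 1.0, 1.0 - x)   # white -> green for positive strain
        return (1.0 + x, 1.0 + x, 1.0)       # white -> blue for negative strain

    # One "frame" of the 3-pixel video: one sample from each of three detectors.
    frame = [strain_to_rgb(h) for h in (3e-22, -7e-22, 0.0)]
    print(frame)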
The microphone analogy is particularly apt, because the signals are also audio-frequency. You can listen to them. Sometimes in the control room we play the output on a loudspeaker. It can help in tuning the instrument (most of what you hear is noise).
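If you want to try the "listening" part yourself, a minimal sketch looks like the following, assuming `h` is a 1D NumPy strain array sampled at `fs` Hz (e.g. the toy chirp above). Real data is usually filtered, and sometimes frequency-shifted, before playback so the signal sits comfortably in hearing range.

    import numpy as np
    from scipy.io import wavfile

    def strain_to_wav(h, fs, path="strain.wav"):
        x = h / np.max(np.abs(h))                        # normalize to [-1, 1]
        wavfile.write(path, fs, (x * 32767).astype(np.int16))

    # strain_to_wav(h, 4096)   # writes a short clip you can play back on a loudspeaker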
There are devices that are arrays of microphones that can be used to detect what direction a noise comes from. The Army uses one to triangulate rifle fire, and Boeing started using one around the time of the 787 to locate cabin noises (planes have a sound-insulation budget, and positively identifying the ingress points allows for better ambient noise levels).
>> it's a brief blip of a few seconds of audio-frequency time series.
With a bit more lead time they should be able to point telescopes in the general direction of the event prior to a merger. Not sure what kind of directional precision can be obtained, nor whether there would even be much to see.
I'm pretty sure that they'd have to be pointed in the right direction first.
There is very little time between the start of something detectable and its finish. But in the case of a neutron star merger, we were able to point other telescopes at it and observe the resulting kilonova in the hours and days that followed.
They measure the total distortion over the baseline, so in that sense it's a single pixel. But I think measurements from several interferometers can be combined to give some very limited spatial resolution.
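The combination works mainly through arrival-time differences between sites, the same trick as the microphone arrays mentioned above. A rough sketch, using an approximate ~3000 km Hanford-Livingston separation as an assumed figure: a measured delay between two detectors only constrains the source to a ring on the sky, which is why the localization is so coarse until a third detector is added.

    import numpy as np

    C = 299_792_458.0        # speed of light in m/s
    BASELINE = 3.0e6         # rough Hanford-Livingston separation, ~3000 km (assumed figure)

    def ring_angle(delta_t, d=BASELINE):
        """Opening angle (radians) between the baseline and the source direction."""
        x = np.clip(C * delta_t / d, -1.0, 1.0)
        return float(np.arccos(x))

    # A measured 5 ms arrival-time difference puts the source on a ring about
    # 60 degrees from the baseline axis; a third detector narrows that ring
    # down to a couple of patches on the sky.
    print(np.degrees(ring_angle(5e-3)))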