INSPIRE: Glitch Zoo: Teaming Citizen Science with Machine Learning to Deepen LIGO's View of the Cosmos

Project: Research project

Project Details

Description

The Advanced Laser Interferometer Gravitational-wave Observatory (aLIGO) is the most complex experiment ever undertaken in gravitational physics. The goal of the instrument is to detect gravitational waves generated by cosmic events. Exquisitely sensitive to the merest of disturbances, the observatory is unfortunately susceptible to many sources of interference. To separate true signals from noise, the detector provides not only the gravitational-wave strain channel but also data from a vast array of environmental sensors and monitors that report in real time on the health of the interferometer and the environment it is immersed in. All told, some 30,000 channels of monitoring data are available for analysis and evaluation alongside the main science data products. From these data, the goals of “detector characterization” (detChar) are four-fold:

  • to identify which data channels are most effective for capturing glitches, the non-astrophysical, burst-like signals that contaminate the science data;
  • to identify, characterize, and classify the morphology of such glitch events in aLIGO data;
  • to use the taxonomical knowledge developed in detChar to provide reliable, high-fidelity vetoes or acceptances of events that emerge from signal searches in the aLIGO data; and
  • to use the resulting robust classification to track glitches to their origin and, through detector commissioning, to eliminate them from the data stream.

The size of the data, coupled with the complex interactions they represent, means that the problems of detChar cannot be addressed effectively and efficiently without computational tools that identify the important and relevant content of the data.
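The veto goal above amounts to a coincidence test: a trigger in the science channel is suspect if a glitch in an auxiliary channel occurs close to it in time. As a minimal sketch of that idea (the function, channel times, and 0.1 s window are illustrative assumptions, not the project's actual veto pipeline):

```python
from bisect import bisect_left

def veto_triggers(science_times, aux_times, window=0.1):
    """Split science-channel trigger times into (vetoed, accepted) lists,
    vetoing any trigger within `window` seconds of an auxiliary-channel
    glitch.  A toy time-coincidence veto, not the production algorithm."""
    aux_times = sorted(aux_times)
    vetoed, accepted = [], []
    for t in science_times:
        # Only the nearest auxiliary glitches on either side can matter.
        i = bisect_left(aux_times, t)
        neighbors = aux_times[max(i - 1, 0):i + 1]
        if any(abs(t - a) <= window for a in neighbors):
            vetoed.append(t)
        else:
            accepted.append(t)
    return vetoed, accepted

# A trigger at 10.0 s coincides with an aux glitch at 10.05 s and is vetoed;
# the trigger at 42.0 s is far from any aux glitch and is accepted.
vetoed, accepted = veto_triggers([10.0, 42.0], [10.05, 99.0], window=0.1)
```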

The LIGO-Zooniverse collaboration will develop and research a joint citizen science project that links the aLIGO data with citizen scientists and a machine-learning infrastructure as part of the detChar pipeline analysis. Such a collaboration is needed because teaching computers to identify and morphologically classify artifacts in signal data is exceedingly difficult, whereas human vision is an effective and proven tool for the same task. The difficulty with big data streams like aLIGO's is the sheer volume of data that must be evaluated quickly and efficiently; it is far too much for a single human, or even a small team of humans, to handle. To address such problems of scale, the Zooniverse has developed a workable crowd-sourcing model in which “citizen scientists” (typically, but not exclusively, members of the lay public) are shown scientific data through a web browser interface and asked to make a simple judgment based on the data’s appearance. In the present study, participants will see time-frequency and time-series images of the aLIGO detector output and of many (~100) auxiliary channels. Participants will be asked to classify whether a given image belongs to a known group of glitches or perhaps represents an unknown class of glitch. A non-functional mockup of the planned system can be seen at http://demo.zooniverse.org/ligo/#/classify.
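The time-frequency images participants classify are spectrograms: the power of the signal in short, overlapping time segments as a function of frequency. A minimal NumPy sketch of that transform follows; the function and parameters are illustrative assumptions (production detChar tools use tuned time-frequency methods such as the Q-transform), but a glitch or a steady interference line appears in this picture just as it would to a volunteer:

```python
import numpy as np

def spectrogram(x, fs, nperseg=256, noverlap=128):
    """Minimal short-time-Fourier-transform spectrogram (a sketch).
    Returns (freqs, times, power), where power[i, j] is the spectral
    power at freqs[i] Hz around segment midpoint times[j] seconds."""
    step = nperseg - noverlap
    window = np.hanning(nperseg)          # taper to reduce spectral leakage
    nseg = 1 + (len(x) - nperseg) // step
    segs = np.stack([x[i * step : i * step + nperseg] * window
                     for i in range(nseg)])
    power = np.abs(np.fft.rfft(segs, axis=1)) ** 2   # (nseg, nfreq)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    times = (np.arange(nseg) * step + nperseg / 2) / fs
    return freqs, times, power.T                     # (nfreq, nseg)

# A 60 Hz tone buried in noise shows up as a bright horizontal band,
# the signature a citizen scientist would recognize as a "line" artifact.
fs = 1024
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 60 * t) + 0.1 * np.random.randn(fs)
f, tt, S = spectrogram(x, fs)
peak = f[np.argmax(S.mean(axis=1))]   # frequency of the strongest band
```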
Status: Finished
Effective start/end date: 10/1/15 – 9/30/19

Funding

  • National Science Foundation (IIS-1547880-004)
