Challenge activity for the Open Data Workshop 2021: https://www.gw-openscience.org/s/workshop4/
Data files may be downloaded using these links:
Workshop participants may submit solutions as individuals or in teams of up to 3 people. Solutions are due at the start of the close-out session, that is, before 7:00 UTC on Day 5 (Friday the 14th).
Challenges are ordered by difficulty. Entries will be awarded a number of points that scales with the difficulty of the challenge.
Good luck to all!
Identify a loud binary black hole signal in white, Gaussian noise.
- Use the data file "challenge1.gwf". The channel name is "H1:CHALLENGE1".
- The data are white, Gaussian noise containing a simulated BBH signal.
- Load the data into memory. What are the sampling rate and duration of the data?
- Plot the data in the time domain.
- Plot a spectrogram (or q-transform) of the data, and try to identify the signal.
- What is the time of the merger? (A GWpy sketch of these steps follows this list.)
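One possible way to work through these steps with GWpy (a minimal sketch: the file and channel names come from the challenge description, while the plotting choices are illustrative):

from gwpy.timeseries import TimeSeries

# Load the strain from the frame file (file/channel names from the challenge)
strain = TimeSeries.read('challenge1.gwf', 'H1:CHALLENGE1')
print(strain.sample_rate, strain.duration)   # sampling rate and duration

# Time-domain plot of the raw strain
plot = strain.plot()
plot.show()

# Q-transform spectrogram; a BBH signal should appear as an upward chirp
qspec = strain.q_transform()
qplot = qspec.plot()
qplot.show()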
Signal in colored, Gaussian noise.
- Use the data file "challenge2.gwf", with channel name "H1:CHALLENGE2"
- The data contain a BBH signal with m1=m2=30 solar masses, spin = 0.
- What is the approximate time of the merger? (Hint: a plot of the q-transform could help.)
- Generate a time-domain template waveform using the approximant "SEOBNRv4_opt", with the same parameters as above. Plot this waveform.
- Calculate a PSD of the data and plot it on a log-log scale, with the frequency axis ranging from 20 Hz up to the Nyquist frequency.
- Use the template waveform and PSD to calculate the SNR time series. Plot the SNR time series.
- What is the matched-filter SNR of the signal? (A PyCBC sketch of these steps follows this list.)
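A minimal PyCBC sketch of this workflow (the file name, channel, masses, and approximant come from the challenge text; the 20 Hz low-frequency cutoff and the 4 s PSD segments are illustrative assumptions):

from pycbc import frame
from pycbc.filter import matched_filter
from pycbc.psd import interpolate, inverse_spectrum_truncation
from pycbc.waveform import get_td_waveform

strain = frame.read_frame('challenge2.gwf', 'H1:CHALLENGE2')

# Template: m1 = m2 = 30 solar masses, zero spin, SEOBNRv4_opt approximant
hp, hc = get_td_waveform(approximant='SEOBNRv4_opt', mass1=30, mass2=30,
                         delta_t=strain.delta_t, f_lower=20)
hp.resize(len(strain))
template = hp.cyclic_time_shift(hp.start_time)   # rotate merger to the end

# Welch PSD, interpolated to the data resolution and conditioned for filtering
psd = interpolate(strain.psd(4), strain.delta_f)
psd = inverse_spectrum_truncation(psd, int(4 * strain.sample_rate),
                                  low_frequency_cutoff=20)

# Matched-filter SNR time series; its peak gives the merger time and the SNR
snr = matched_filter(template, strain, psd=psd, low_frequency_cutoff=20)
peak = abs(snr).numpy().argmax()
print('merger time ~', snr.sample_times[peak], ', SNR ~', abs(snr[peak]))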
- Use the data file "challenge3.gwf" with channel "H1:CHALLENGE3"
- These are real LIGO data from O2, though we've adjusted the time labels and added some simulated signals.
- The data contain a loud simulated signal with m1 = m2 = 10 solar masses.
- What is the merger time of this signal?
- What is the matched-filter SNR of this signal? (See the data-conditioning sketch after this list.)
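Because these are real detector data, some conditioning before matched filtering usually helps. A hedged sketch of one approach (the 15 Hz high-pass and the 4 s crop are illustrative choices, not part of the challenge):

from pycbc import frame

strain = frame.read_frame('challenge3.gwf', 'H1:CHALLENGE3')
strain = strain.highpass_fir(15, 512)   # remove low-frequency content
strain = strain.crop(4, 4)              # drop filter-corrupted edges
# ...then estimate the PSD and matched filter with an m1 = m2 = 10 solar-mass
# template, exactly as in the sketch above for "challenge2.gwf".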
- Use the data file "challenge3.gwf" with channels "H1:CHALLENGE3" and "L1:CHALLENGE3".
- These are real LIGO data from O2, though we've adjusted the time labels and added some simulated signals.
- Any simulated signals have been added to both the H1 and L1 data
- All simulated signals have 0 spin and m1=m2, with m1 somewhere in the range 10-50 solar masses
- Identify as many signals as you can. Watch out! These are real data, and so glitches may be present. Any correct detection is +1 point but any false alarms will count -1 point against your score. For each signal you find, list:
- The merger time
- The SNR
- Your estimate of the component masses
- Identify as many glitches as you can. Make a spectrogram of each one.
- For each simulated BBH you found, use bilby to compute a posterior distribution for the mass. You can fix the spin and mass ratio to make this run faster. (A snippet for loading the data into bilby appears below, followed by a sketch of the remaining steps.)
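For the signal-identification item above, one option is to matched filter both detectors against a coarse grid of equal-mass templates and keep peaks that show up at consistent times in H1 and L1. A rough sketch under those assumptions (the 5-solar-mass grid spacing and the SNR threshold of 8 are arbitrary illustrative choices, and this only reports the loudest peak per template and detector):

from pycbc import frame
from pycbc.filter import matched_filter
from pycbc.psd import interpolate, inverse_spectrum_truncation
from pycbc.waveform import get_td_waveform

for ifo in ['H1', 'L1']:
    strain = frame.read_frame('challenge3.gwf', ifo + ':CHALLENGE3')
    strain = strain.highpass_fir(15, 512).crop(4, 4)
    psd = interpolate(strain.psd(4), strain.delta_f)
    psd = inverse_spectrum_truncation(psd, int(4 * strain.sample_rate),
                                      low_frequency_cutoff=20)
    for m in range(10, 55, 5):   # equal-mass templates, 10 to 50 solar masses
        hp, _ = get_td_waveform(approximant='SEOBNRv4_opt', mass1=m, mass2=m,
                                delta_t=strain.delta_t, f_lower=20)
        hp.resize(len(strain))
        template = hp.cyclic_time_shift(hp.start_time)
        snr = abs(matched_filter(template, strain, psd=psd,
                                 low_frequency_cutoff=20)).crop(4, 4)
        peak = snr.numpy().argmax()
        if snr[peak] > 8:   # arbitrary threshold; inspect candidates by eye too
            print(ifo, 'm1=m2=%d' % m, 'time', snr.sample_times[peak],
                  'SNR', snr[peak])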
Processing data from a local file with Bilby
import bilby

sampling_rate = 2048   # needs to be high enough for the signals found in the steps above
duration = 8           # needs to be long enough for the signals found in the steps above
start_time = 100       # set so that [start_time, start_time + duration] contains the signal

# Build an interferometer list and fill it with strain read from the frame file
interferometers = bilby.gw.detector.InterferometerList([])
for ifo_name in ['H1', 'L1']:
    ifo = bilby.gw.detector.get_empty_interferometer(ifo_name)
    ifo.set_strain_data_from_frame_file(
        'challenge3.gwf', sampling_rate, duration,
        start_time=start_time, channel=ifo_name + ':CHALLENGE3')
    interferometers.append(ifo)
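The snippet above only loads the strain into bilby. A hedged sketch of how the analysis might continue, reusing sampling_rate, duration, start_time, and interferometers from above (the prior ranges, the IMRPhenomD approximant, and the sampler settings are illustrative choices, not prescribed by the challenge):

import bilby

# Waveform model for parameter estimation (approximant choice is illustrative)
waveform_generator = bilby.gw.WaveformGenerator(
    duration=duration, sampling_frequency=sampling_rate,
    frequency_domain_source_model=bilby.gw.source.lal_binary_black_hole,
    waveform_arguments=dict(waveform_approximant='IMRPhenomD',
                            reference_frequency=20.0))

# Priors: sample the chirp mass only; fix spins and mass ratio for speed
priors = bilby.gw.prior.BBHPriorDict()
priors['chirp_mass'] = bilby.core.prior.Uniform(5, 50, name='chirp_mass')
priors['mass_ratio'] = 1.0
for key in ['a_1', 'a_2', 'tilt_1', 'tilt_2', 'phi_12', 'phi_jl']:
    priors[key] = 0.0
# Assumes the merger lies somewhere in the analysis segment
priors['geocent_time'] = bilby.core.prior.Uniform(
    start_time, start_time + duration, name='geocent_time')

likelihood = bilby.gw.likelihood.GravitationalWaveTransient(
    interferometers=interferometers, waveform_generator=waveform_generator)

result = bilby.run_sampler(likelihood=likelihood, priors=priors,
                           sampler='dynesty', nlive=500,
                           outdir='outdir', label='challenge_event')
result.plot_corner()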
To load the data in Google Colab, run this code, then click 'Restart runtime' in the Runtime menu, and run it again:
! pip install -q lalsuite
! pip install -q gwpy
! pip install -q pycbc
# -- Click "restart runtime" in the runtime menu
# -- download data
! wget https://www.gw-openscience.org/s/workshop3/challenge/challenge3.gwf
# -- for gwpy
from gwpy.timeseries import TimeSeries
gwpy_strain = TimeSeries.read('challenge3.gwf', channel="H1:CHALLENGE3")
# -- for pycbc
from pycbc import frame
pycbc_strain = frame.read_frame('challenge3.gwf', 'H1:CHALLENGE3')