-
I’m pretty sure the results from the time series are the flux sampled at the given time multiplied by the output bin width, so it is a fluence in that sense. However, if you change the output time bins you will get a different number of expected interactions from generate_time_series, whereas generate_fluence will give the same answer because it integrates the underlying model over the requested time range instead of just sampling it at the given time. Yes, it is slow, but I bet it could be sped up with a bit of effort; it has nested for loops (my fault).
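A minimal sketch of the distinction described above, using a hypothetical `flux(t)` model (not snewpy code): sampling at the bin time and multiplying by the bin width depends on the binning, while integrating the model over each bin does not.

```python
# Illustrative only: sampled-flux-times-width vs. integrated fluence.
import numpy as np
from scipy.integrate import quad

def flux(t):
    """Hypothetical time-dependent flux, e.g. an exponential decay."""
    return np.exp(-t / 0.5)

t_edges = np.linspace(0.0, 1.0, 6)            # output time bin edges
dt = np.diff(t_edges)
t_mid = 0.5 * (t_edges[:-1] + t_edges[1:])

# generate_time_series-style: sample at the bin time, multiply by bin width.
fluence_sampled = flux(t_mid) * dt

# generate_fluence-style: integrate the underlying model over each bin.
fluence_integrated = np.array(
    [quad(flux, a, b)[0] for a, b in zip(t_edges[:-1], t_edges[1:])]
)

# The sampled totals change if you rebin; the integrated totals do not.
print(fluence_sampled.sum(), fluence_integrated.sum())
```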
-
The reason for the dt and 0.2 MeV factors is that SNOwGLoBES works with numbers of neutrinos per unit area in a given energy and time bin, while the data from the simulations is usually a spectrum. The 'integration' in SNOwGLoBES that produces the total number of events in a given channel for a given detector/material is a summation. My preference is to send SNOwGLoBES data in the units that it wants rather than sending data in different units and then integrating after SNOwGLoBES has run. We could do better energy and time integrations in the generate_ methods, but those would require more calls to get_oscillated_spectra.
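A minimal sketch of that unit conversion, assuming a hypothetical spectral density `spectrum(E)` and uniform bins (the 0.2 MeV step and the time-bin width dt are taken from the comment above, everything else is illustrative):

```python
# Illustrative only: turning a spectral density d^2N/(dE dt dA) into the
# per-(energy, time)-bin fluence per unit area that SNOwGLoBES expects.
import numpy as np

def spectrum(E):
    """Hypothetical flux density in neutrinos / (cm^2 MeV s)."""
    return 1e10 * E**2 * np.exp(-E / 4.0)

dE = 0.2                              # MeV, the SNOwGLoBES energy grid step
dt = 0.01                             # s, width of one output time bin
E = np.arange(0.1, 100.0, dE)         # bin-center energies

per_bin = spectrum(E) * dE * dt       # neutrinos per cm^2 in each bin

# SNOwGLoBES' 'integration' over energy is then just a sum over the binned
# event rates it derives from these per-bin fluences.
print(per_bin.sum())
```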
-
If you want to write the code to do better integrations over time and energy, then please go ahead. I thought about it a little, and if you use the trapezoid method it does not require many more calls to get_initial_spectra: you need to add +1 to the number of times and +1 to the number of energy bins. That's just one more call to the method and a little more data in the return, which should not hurt performance much. The event rates in the SNEWS white paper were summed across channels, time and energy, so yes, they do get used. I don't know how SNOwGLoBES handles the energy-dependent cross-sections and smearing; you will have to dig into the code or ask Kate Scholberg / the GLoBES developers.
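A sketch of the trapezoid counting argument above, under the assumption that one call can return the spectra on the full grid (the array here is a stand-in for what a get_initial_spectra-style call would produce):

```python
# Illustrative only: trapezoid integration over time and energy needs grid
# *edges*, i.e. n_t + 1 time samples and n_E + 1 energy samples.
import numpy as np

n_t, n_E = 50, 200
t = np.linspace(0.0, 1.0, n_t + 1)         # n_t + 1 time samples
E = np.linspace(0.1, 100.0, n_E + 1)       # n_E + 1 energy samples

# Stand-in for the (n_t + 1, n_E + 1) array of spectral values that the
# slightly enlarged call would return.
spectra = np.exp(-t[:, None]) * E[None, :] ** 2 * np.exp(-E[None, :] / 4.0)

# Trapezoid rule along energy, then along time; per-bin values can be
# recovered by integrating over each sub-interval instead of the whole grid.
total = np.trapz(np.trapz(spectra, E, axis=1), t)
print(total)
```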
-
Hi all, I have a question about the products of snowglobes.generate_time_series. Is each table for a time bin supposed to be a fluence (i.e. integrated over the time bin) or a flux (i.e. the rate sampled at the given time)?
I see that the oscillated_spectra are multiplied by the bin sizes (snewpy/python/snewpy/snowglobes.py, lines 120 to 122 at commit 4630b45), but I'm not sure if that's a 0th-order approximation of the integral or just a changed normalization factor.
Can you help me understand?
Thank you!