This is an example Snakemake workflow for data reduction of Fermi-LAT data.
The workflow runs the standard Fermitools for a given configuration
and produces FITS files in a format that Gammapy can read.
It handles the reduction of counts, exposure, and point
spread function (PSF) for multiple PSF event classes.
If you would like to use this as a Snakemake module, add the following to your Snakefile:
module fermi_lat_data_workflow:
    snakefile:
        # here, plain paths, URLs and the special markers for code hosting providers (see below) are possible.
        github("adonath/snakemake-workflow-fermi-lat", path="workflow/Snakefile", branch="main")
    config: config["fermi-lat-data"]

use rule * from fermi_lat_data_workflow as fermi_lat_data_*
Alternatively, you can clone this repository to your local machine:
git clone https://github.com/adonath/snakemake-workflow-fermi-lat.git
If you haven't done so yet, please install conda or mamba.
Then change to the directory of the repository:
cd snakemake-workflow-fermi-lat/
And create the conda environment using:
mamba env create -f environment.yaml
Once the environment is created, activate it:
conda activate snakemake-workflow-fermi-lat
Go to https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi and download the data
you are interested in. The downloaded files should be placed in the ./data
folder.
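To check that the downloaded files ended up where the workflow expects them, a small helper like the one below can be useful. It is only a sketch and not part of the workflow; the *_PH*.fits (photon/event) and *_SC*.fits (spacecraft) filename patterns are an assumption based on how the LAT data server typically names its products, so adjust them to your actual downloads.

```python
from pathlib import Path


def group_lat_files(data_dir="./data"):
    """Group downloaded Fermi-LAT query results into event and
    spacecraft files, based on the assumed *_PH*/*_SC* name patterns."""
    data_dir = Path(data_dir)
    return {
        "events": sorted(data_dir.glob("*_PH*.fits")),
        "spacecraft": sorted(data_dir.glob("*_SC*.fits")),
    }


if __name__ == "__main__":
    for kind, files in group_lat_files().items():
        print(f"{kind}: {len(files)} file(s)")
```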
Now you should adapt the configuration in config/config.yaml to match your data.
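For orientation, such a configuration usually identifies the input data and the analysis setup (region of interest, energy range, etc.). The keys in the sketch below are purely illustrative assumptions, not the actual schema; config/config.yaml in the repository is the authoritative reference.

```yaml
# Hypothetical sketch only -- see config/config.yaml for the real keys.
name: my-config       # analysis name, used in output paths such as results/my-config/
ra: 83.63             # ROI center in degrees (here: the Crab nebula, as an example)
dec: 22.01
emin: 1000            # energy range in MeV
emax: 100000
```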
Then you are ready to run the workflow. You can preview the planned jobs with a dry run (snakemake -n) and then execute:
snakemake --cores 8
You can also create a report to see previews of the counts, exposure and PSF images:
snakemake --report report.html
open report.html
Finally, you can read and print the datasets as well as the models using Gammapy:
from gammapy.datasets import Datasets
from gammapy.modeling.models import Models
datasets = Datasets.read("results/<my-config>/datasets/<my-config>-datasets-all.yaml")
models = Models.read("results/<my-config>/model/<my-config>-model.yaml")
print(datasets)
print(models)