Repository Architecture:

- PythonInterfaceOOP: Main Python scripts folder (run scripts)
- PythonInterfaceOOP/configs: JSON config files
- PythonInterfaceOOP/templibs: Temporary DQ libraries from the O2-DQ framework (for auto-completion)
- PythonInterfaceOOP/extramodules: Extra modules providing utilities for the Python scripts
- doc: Main documentation files
- runYAPF.sh: Script for Python code re-formatting (similar in spirit to clang-format)
- .flake8: Rules for flake8, the lint wrapper for Python
- .style.yapf: YAPF rules file
- README.md: Main readme file
- .github/workflows: CI integration (automated YAPF as Python formatter and automated flake8 tests as Python linter)
This project provides a Python-based user interface for the PWG-DQ workflows, based on nightly-20222212. You can follow the instructions below and find tutorials in the table of contents at the end of the page (covering prerequisites, the installation guide for argcomplete, and some useful background information).
Ionut Cristian Arsene (Owner of O2DQWorkflows)

- Contact: [email protected] / [email protected]

Cevat Batuhan Tolon (Author of Interface)

- Contact: [email protected]
If you are having problems with the scripts, you can first ask your questions on Mattermost directly to the @ctolon account, or via e-mail at [email protected].

Since the interface is constantly updated for stability, it is recommended to keep it up to date with the git pull --rebase command.
We have a standard run template for each run workflow Python script (a short sketch of how the argument doublets map onto the JSON config follows the examples below).

Template:

python3 <script.py> <config.json> --<task-name>:<configurable|processFunc> <parameter> ...

- script.py, e.g.:
  - runAnalysis.py
  - runTableMaker.py
- config.json, e.g.:
  - configTableMakerDataRun3.json
  - configAnalysisMC.json

  Note: You should provide config.json with its full path, like this: configs/configAnalysisMC.json
- --<task-name>:<configurable|processFunc> <parameter>, e.g.:
  - --table-maker:processBarrelOnly true
  - --analysis-same-event-pairing:cfgTrackCuts jpsiPID1 jpsiPID2
  - --analysis-event-selection:processSkimmed true --analysis-track-selection:cfgTrackCuts jpsiPID1 jpsiPID2
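To make the doublet syntax concrete, here is a minimal sketch (illustrative only, with a hypothetical helper name; the real interface in extramodules handles many more cases) of how one --task-name:configurable argument maps onto the corresponding entry of a loaded JSON config:

```python
import json

def apply_cli_override(config: dict, doublet: str, values: list) -> None:
    """Apply one '--<task-name>:<configurable> <value> ...' pair to a loaded JSON config.

    Hypothetical helper for illustration; not part of the actual scripts.
    """
    task, configurable = doublet.split(":", 1)  # e.g. "table-maker", "processBarrelOnly"
    # Multiple values (e.g. several track cuts) are assumed to be stored comma-separated.
    config[task][configurable] = values[0] if len(values) == 1 else ",".join(values)

# Emulate "--table-maker:processBarrelOnly true" on a config file from the examples above:
with open("configs/configTableMakerDataRun3.json") as f:
    cfg = json.load(f)
apply_cli_override(cfg, "table-maker:processBarrelOnly", ["true"])
```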
Switcher variables (boolean type): runOverMC and runOverSkimmed (you can find the runOverMC variable under the main() functions of the runTableMaker.py and runAnalysis.py scripts, and runOverSkimmed in the runEMEfficiency.py script). A simplified sketch of how the switcher is used follows the table below.
| Variable in script | Workflow script | Value | Selected task | Task name in config |
| --- | --- | --- | --- | --- |
| runOverMC | runTableMaker.py | False | tableMaker.cxx | table-maker (for real data skimming) |
| runOverMC | runTableMaker.py | True | tableMakerMC.cxx | table-maker-m-c (for MC skimming) |
| runOverMC | runAnalysis.py | False | tableReader.cxx | analysis-event-selection (for real data analysis) |
| runOverMC | runAnalysis.py | True | dqEfficiency.cxx | analysis-event-selection (for MC analysis) |
| runOverSkimmed | runEMEfficiency.py | False | emEfficiency.cxx | analysis-event-selection (runs on not-skimmed MC) |
| runOverSkimmed | runEMEfficiency.py | True | emEfficiency.cxx | analysis-event-selection (runs on skimmed MC) |
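For illustration, a minimal sketch of how the switcher is intended to be used inside runTableMaker.py (simplified; the real main() contains the full workflow logic):

```python
# Simplified sketch only -- the actual main() in runTableMaker.py does much more.
# Switch runOverMC to True for MC skimming (tableMakerMC.cxx),
# leave it False for real data skimming (tableMaker.cxx).
runOverMC = False

def main():
    if runOverMC:
        print("Selected task: tableMakerMC.cxx (config task name: table-maker-m-c)")
    else:
        print("Selected task: tableMaker.cxx (config task name: table-maker)")

if __name__ == "__main__":
    main()
```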
IMPORTANT NOTE: The interface creates its arguments by parsing the JSON config before executing the script (dependency injection), so it is very important that you configure the JSON correctly!

E.g. to run tableMaker.cxx with the runTableMaker.py script (runOverMC has to be False in the script):
python3 runTableMaker.py configs/configTableMakerDataRun3.json --internal-dpl-aod-reader:aod-file Datas/AO2D_ppDataRun3_LHC22c.root --table-maker:processMuonOnlyWithCov true --table-maker:processBarrelOnlyWithCov true --event-selection-task:syst pp --table-maker:cfgQA true --table-maker:cfgMuonCuts muonQualityCuts muonTightQualityCutsForTests --table-maker:cfgBarrelTrackCuts jpsiPID1 jpsiPID2 jpsiO2MCdebugCuts --add_track_prop --logFile
Some scripts have been merged:

- The runTableMaker.py and runTableMakerMC.py scripts have been merged into the runTableMaker.py script.
  - IMPORTANT: You need to switch the runOverMC variable to True in the script if you want to work with tableMakerMC for MC; otherwise it runs tableMaker for data.
- The runTableReader.py and runDQEfficiency.py scripts have been merged into the runAnalysis.py script.
  - IMPORTANT: You need to switch the runOverMC variable to True in the script if you want to work with dqEfficiency for MC; otherwise it runs tableReader for data.
- The runEMEfficiency.py and runEMEfficiencyNotSkimmed.py scripts have been merged into the runEMEfficiency.py script.
  - IMPORTANT: You need to switch the runOverSkimmed variable to True in the script if you want to work with the skimmed EM efficiency; otherwise it runs the not-skimmed EM efficiency.
- The argument parameter template has changed: for each task, arguments are now given as doublets separated by a colon, followed by the parameters (template: python3 <script.py> <config.json> --<taskname>:<configurable|processFunction> <parameters>).
  - OLD TEMPLATE e.g.: python3 runTableMaker.py configs/configTableMakerDataRun3.json --process BarrelOnly -cfgBarrelTrackCuts jpsiPID1 jpsiPID2
  - NEW TEMPLATE e.g.: python3 runTableMaker.py configs/configTableMakerDataRun3.json --table-maker:processBarrelOnly true --table-maker:cfgBarrelTrackCuts jpsiPID1 jpsiPID2
- Interface arguments are no longer hard-coded; they are generated dynamically by parsing the JSON config files. That is why the separate interface scripts were deleted (a sketch of this mechanism follows this list).
- Since the arguments are configured per task, some configurables have to be set separately for each task. For example, when configuring cfgTrackCuts, the old version of the interface set every cfgTrackCuts entry in the JSON config at once; now it is configured for each task separately (important to keep in mind!).
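A minimal sketch of the dynamic argument generation (illustrative only, with a hypothetical helper name; the real implementation in extramodules also adds choices, help texts, autocompletion and validation): the JSON config is parsed and, for every task and configurable found, a matching --task:configurable argument is registered with argparse.

```python
import argparse
import json

def build_parser_from_config(json_path: str) -> argparse.ArgumentParser:
    """Register one '--<task>:<configurable>' CLI argument per entry in the JSON config.

    Hypothetical helper for illustration; not the actual interface implementation.
    """
    parser = argparse.ArgumentParser()
    with open(json_path) as f:
        config = json.load(f)
    for task, configurables in config.items():
        if not isinstance(configurables, dict):
            continue  # skip top-level entries that are not task blocks
        for configurable in configurables:
            parser.add_argument(f"--{task}:{configurable}", nargs="*",
                                help=f"override {task}/{configurable}")
    return parser

# Usage example (config path taken from the examples above):
# parser = build_parser_from_config("configs/configTableMakerDataRun3.json")
# args = parser.parse_args()
```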
You can refer to this chapter when you want to modify the scripts.

For code re-formatting we use YAPF. It is based off of clang-format (developed by Daniel Jasper).

Install YAPF for code re-formatting:

pip3 install yapf or pip install yapf

In the root folder, execute the runYAPF.sh bash script for code re-formatting:

bash runYAPF.sh
Table of Contents:

- Python Scripts And JSON Configs
- Prerequisites!!!
- Instructions for TAB Autocomplete
- Technical Informations
- Instructions for Python Scripts
  - Instructions for runTableMaker
  - Instructions for runAnalysis.py
  - Instructions for runFilterPP.py
  - Instructions for runDQFlow.py
- Tutorial Part
  - Download Datas For Tutorials
  - MC Part
    - Run tableMakerMC on LHC21i3d2 (jpsi to MuMu pp Run3Simulation)
    - Run dqEfficiency on MC (LHC21i3d2 pp Run3Simulation)
    - Run tablemakerMC on LHC21i3b (Prompt jpsi to dielectron pp Run3Simulation)
    - Run dqEfficiency on MC (LHC21i3b pp Run3Simulation)
    - Run tablemakerMC on LHC21i3f2 (Non-Prompt jpsi to dielectron pp Run3Simulation)
    - Run dqEfficiency on LHC21i3f2 (LHC21i3f2 pp Run3Simulation)
  - Data Part
    - Run tableMaker on LHC15o (LHC15o PbPb Run2Data)
    - Run tableReader on LHC15o (LHC15o PbPb Run2Data)
    - Run tableMaker on LHC15o With Generic Flow Analysis (LHC15o PbPb Run2Data)
    - Run tableReader on LHC15o with Generic Flow Analysis (LHC15o PbPb Run2Data)
    - Run dqFlow on LHC15o (LHC15o PbPb Run2Data)
    - Run v0Selector on LHC15o (LHC15o PbPb Run2Data)
    - Run tableMaker on LHC22c (LHC22c pp Run3Data)
    - Run tableReader on Data (LHC22c pp Run3Data)
    - Run filterPP on fwdprompt (fwdprompt pp Run3Data)
- Special Part : Dilepton Analysis For Non-Standart Existing Workflows in DQ
- Special Part : run tableMaker and tableReader at the same time
- Developer Guide
- Troubleshooting: Tree Not Found