For the PhEval pipelines, we have a small config element called `pheval-version`.

This will:

- install that specific version of pheval instead of the latest
- during corpus preparation, download the archive corresponding to that version (from either Zenodo or GitHub, see above)
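As a sketch, such a config entry might look like this (only the `pheval-version` key is from the discussion above; the value and the surrounding file are made-up examples):

```yaml
# Hypothetical pipeline config sketch -- only the `pheval-version` key is
# from this issue; the value shown is a made-up example.
pheval-version: 0.3.2  # pins pheval itself and the matching corpus archive
```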
In effect, this ensures:

- that our corpus version is always compatible with our tool version, for example when the schema changes, when we use different gene IDs, or when any other change to the phenopackets needs to be taken into account by preprocessing
- that all our experiments are fully reproducible
- that our PyPI releases always have an associated tag in version control
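A minimal sketch of what "download the archive corresponding to the version" could mean, assuming the corpus archives are attached to GitHub release tags (the repository path and the `v`-prefixed tag naming below are assumptions, not from this issue):

```python
# Sketch: build the download URL for the corpus archive matching a pinned
# pheval version. The repo path and the v-prefixed tag scheme are assumptions.
GITHUB_REPO = "monarch-initiative/pheval"  # assumed repository

def corpus_archive_url(version: str) -> str:
    """Return the GitHub tag-archive URL for the given pinned version."""
    # GitHub serves source archives for tags at this well-known path.
    return f"https://github.com/{GITHUB_REPO}/archive/refs/tags/v{version}.tar.gz"

print(corpus_archive_url("0.3.2"))
```

The pipeline's corpus-preparation step would then fetch exactly this URL instead of "latest", which is what makes the tool version and corpus version move in lockstep.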
I would suggest the above as the SOP for PhEval releases.
@souzadevinicius @yaseminbridges @julesjacobsen
What do you think?
There is one thing I don't like about that: all corpora go into one mega archive. If I only want to test one corpus, it would be nice if the pipeline downloaded only that one.

In theory this is easy: the GitHub Action that performs the release can zip the corpora individually and attach them to the release. In practice, I don't know whether the disk-space limits of GitHub Actions runners would be exceeded. I would vote to try this.
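The per-corpus idea could be sketched roughly as follows (the `corpora/<name>/` directory layout and the corpus names are made-up examples; the actual upload step is left as a comment because it depends on the release workflow):

```python
# Sketch: archive each corpus directory individually so a pipeline can
# download only the corpus it needs. Layout and names are assumptions.
from pathlib import Path
import shutil

# Demo scaffolding standing in for the real corpora checkout:
for name in ("corpus-a", "corpus-b"):
    d = Path("corpora") / name
    d.mkdir(parents=True, exist_ok=True)
    (d / "phenopacket.json").write_text("{}")

# One archive per corpus instead of one mega archive:
for corpus in sorted(Path("corpora").iterdir()):
    if corpus.is_dir():
        shutil.make_archive(corpus.name, "gztar",
                            root_dir="corpora", base_dir=corpus.name)
        # The release workflow would then attach each archive, e.g. with
        # the GitHub CLI:  gh release upload "$GITHUB_REF_NAME" <name>.tar.gz

print(sorted(p.name for p in Path(".").glob("*.tar.gz")))
```

Since each archive is built and attached separately, a pipeline configured for a single corpus would only ever download that one file.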