From 93ca5d7b7b3a63395a759d6e2395ec00f8d24207 Mon Sep 17 00:00:00 2001
From: carlocagnetta
Date: Tue, 27 Feb 2024 11:58:40 +0100
Subject: [PATCH] Update Readme and contributing files

---
 README.md                                     |  21 +-
 {{cookiecutter.project_name}}/CONTRIBUTING.md | 201 ------------------
 .../docs/spelling_wordlist.txt                |   4 +-
 3 files changed, 11 insertions(+), 215 deletions(-)
 delete mode 100644 {{cookiecutter.project_name}}/CONTRIBUTING.md

diff --git a/README.md b/README.md
index 2d01f43..be82181 100644
--- a/README.md
+++ b/README.md
@@ -2,22 +2,21 @@
 This repository contains a [cookiecutter](https://github.com/cookiecutter/cookiecutter) template that can be used for
 library development. The template contains several well-known "best-practices" for libraries
- (tox, sphinx, nbsphinx, coverage, pylint etc) and also some tools
+ (mypy, ruff, sphinx, nbsphinx, coverage, pylint etc) and also some tools
 inspired by projects of ours that we consider generally useful - build and release scripts, auto-generation of
 documentation files, links for jumping directly to the correct place in the source code and others.
 Earlier versions of this template were used in several industry projects as well as for open source libraries.
 
-Build, install and tests of the library are run by tox, the documentation is built with sphinx and a
-helper script (both also invoked by tox). The template includes ci/cd pipelines for gitlab CI, github actions and
-a rudimentary pipeline for azure devops. The pipeline will run the test suite and publish docu, badges and reports. On
-gitlab, [gitlab pages](https://docs.gitlab.com/ee/user/project/pages/) are used and on github,
-we make use of [github pages](https://pages.github.com/) through the
-[github-pages-deploy-action](https://github.com/JamesIves/github-pages-deploy-action). You will need to enable pages
-for gitlab, for github you should configure the pages source to be the root directory of the branch gh-pages.
+Build, installation and tests of the library are run via poetry tasks, and the documentation is built with jupyter-book.
+The template includes ci/cd pipelines for github actions and a rudimentary pipeline for azure devops.
+The pipeline will run the test suite and publish docu, badges and reports.
+We make use of [github pages](https://pages.github.com/) through the [github-pages-deploy-action](https://github.com/JamesIves/github-pages-deploy-action).
+You should configure the pages source to be the root directory of the branch gh-pages.
 In the documentation links to source code will be created, therefore you will be prompted to give the project's url.
-See the resulting repository's [developer's readme]({{cookiecutter.project_name}}/CONTRIBUTING.md) for further details. An example of the current output of this template is in [pymetrius_output](https://github.com/appliedAI-Initiative/pymetrius_output)
+See the resulting repository's [developer's readme]({{cookiecutter.project_name}}/docs/04_contributing/04_contributing.rst)
+for further details. An example of the current output of this template is in [pymetrius_output](https://github.com/appliedAI-Initiative/pymetrius_output)
 
 # Usage
 
@@ -41,8 +40,8 @@ and walk through the questions. You can also clone this repository, adjust the t
 the local file.
 
 You will get a repo in `/`, which will contain your new library installed in
-"editable mode" (i.e. with `pip install -e` ). The virtual environment is created by poetry. Documentation is built with
-jupyter-book and is published on github pages.
+"editable mode" (i.e. with `pip install -e`); we recommend installing via poetry with `poetry install --with dev`.
+The virtual environment is created by poetry. Documentation is built with jupyter-book and is published on github pages.
 # Contributing

diff --git a/{{cookiecutter.project_name}}/CONTRIBUTING.md b/{{cookiecutter.project_name}}/CONTRIBUTING.md
deleted file mode 100644
index 1ce621b..0000000
--- a/{{cookiecutter.project_name}}/CONTRIBUTING.md
+++ /dev/null
@@ -1,201 +0,0 @@
-# {{cookiecutter.project_name}} development guide
-
-This repository contains the {{cookiecutter.project_name}} python library together with utilities for building, testing,
-documentation and configuration management.
-
-This project uses the [black](https://github.com/psf/black) source code formatter
-and [pre-commit](https://pre-commit.com/) to invoke it as a Git pre-commit hook. We also use
-[isort](https://github.com/PyCQA/isort) for sorting
-imports and [nbstripout](https://github.com/kynan/nbstripout) to prevent outputs of notebooks to be committed.
-
-When first cloning the repository, run the following command (after
-setting up your virtualenv with dev dependencies installed, see below) to set up
-the local Git hook:
-
-```shell script
-pre-commit install
-```
-
-## Local Development
-Automated builds, tests, generation of docu and publishing are handled by CI/CD pipelines.
-You will find an initial version of the pipeline in this repo. Below are further details on testing
-and documentation.
-
-Before pushing your changes to the remote it is often useful to execute `tox` locally in order to
-detect mistakes early on.
-
-We strongly suggest to use some form of virtual environment for working with the library. E.g. with poetry
-(if you have created the project locally with the python-library-template, it will already include a poetry env)
-```shell script
-poetry shell
-```
-
-A very convenient way of working with your library during development is to install it in editable mode
-into your environment by running
-```shell script
-poetry install
-```
-or
-```shell script
-pip install -e .
-```
-
-
-### Additional requirements
-
-The main requirements for developing the library locally can be installed by running `poetry install --with dev`.
-For building documentation locally you can run the poetry task `poe doc-build`.
-
-### Testing and packaging
-The library is built with tox which will build and install the package, run the test suite and build documentation.
-Running tox will also generate coverage and pylint reports in html and badges.
-You can configure pytest, coverage and pylint by adjusting [pytest.ini](pytest.ini), [.coveragerc](.coveragerc) and
-[.pylintrc](.pylintrc) respectively.
-
-In order to facilitate testing without tox (which can be slow, especially when executed for the first time), the
-steps executed by tox are available as bash scripts within the [build_scripts](build_scripts) directory. For example,
-to run tests, execute notebooks in 4 parallel processes as integration tests (and docu) and build the docu,
-all without involving tox, you could run
-
-```shell
-pip install -r requirements-test.txt -r requirements-docs.txt
-./build_scripts/run-all-tests-with-coverage.sh
-./build_scripts/build-docs.sh
-```
-
-Concerning notebooks: all notebooks in the [notebooks](docs/02_notebooks) directory will be executed during test run,
-the results will be added to the docu in the _Guides and Tutorials_ section. Thus, notebooks can be conveniently used
-as integration tests and docu at the same time.
-
-You can run thew build by installing tox into your virtual environment
-(e.g. with `pip install tox`) and executing `tox`.
-
-For creating a package locally run
-```shell script
-python setup.py sdist bdist_wheel
-```
-
-### Documentation
-Documentation is built with sphinx every time tox is executed, doctests are run during that step.
-There is a helper script for updating documentation files automatically. It is called by tox on build and can
-be invoked manually as
-```bash
-python build_scripts/update_docs.py
-```
-See the code documentation in the script for more details on that. We recommend using the bash script
-```shell
-./build_scripts/build-docs.sh
-```
-to both update the docs files and to rebuild the docu with sphinx in one command.
-
-Notebooks also form part of the documentation, in case they have been rendered before (see explanation above).
-
-## Configuration Management
-If you decided to include configuration utils when generating the project from the template, this repository
-also includes [configuration utilities](config.py) that are often helpful when using data-related libraries.
-They are based on appliedAI's lightweight library [accsr](https://github.com/appliedAI-Initiative/accsr).
-
-By default the configured secrets like access keys and so on are expected to be in a file called `config_local.json`.
-In order for these secrets to be available in CI/CD during the build, _create a gitlab variable of type file called_
-`CONFIG_LOCAL` containing your CI secrets.
-Note that sometimes it makes sense for them to differ from your own local config.
-
-Generally the configuration utils support an arbitrary hierarchy of config files, you will have to adjust the
-[config.py](config.py) and the gitlab pipeline if you want to make use of that.
-
-## CI/CD and Release Process
-This repository contains ci/cd pipelines for multiple providers.
-The most sophisticated one is the [gitlab ci pipeline](.gitlab-ci.yml) (this is what we use internally at appliedAI), it
-will run the test suite and publish docu, badges and reports.
-Badges can be accessed from the pipeline's artifacts, on gitlab the url of the coverage badge will be:
-
-```
-/-/jobs/artifacts/develop/raw/badges/coverage.svg?job=tox_use_cache
-```
-
-The azure ci pipeline is rather rudimentary, pull requests are always welcome!
-
-### Development and Release Process with Gitlab and Github
-
-In order to be able to automatically release new versions of the package, the
- CI pipeline should have access to the following variables / github secrets:
-
-```
-PYPI_REPO_USER
-PYPI_REPO_PASS
-```
-
-They will be used in the release steps in the pipeline. If you want to publish packages to a private server,
-you will also need to set the `PYPI_REPO_URL` variable.
-
-On gitlab, you will need to set up a `Gitlab CI deploy key` and add it as file-type variable called `GITLAB_DEPLOY_KEY`
-for automatically committing from the develop pipeline during version bumping.
-
-#### Automatic release process
-
-In order to create an automatic release, a few prerequisites need to be satisfied:
-
-- The project's virtualenv needs to be active
-- The repository needs to be on the `develop` branch
-- The repository must be clean (including no untracked files)
-
-Then, a new release can be created using the `build_scripts/release-version.sh` script (leave off the version parameter
-to have `bumpversion` automatically derive the next release version):
-
-```shell script
-./scripts/release-version.sh 0.1.6
-```
-
-To find out how to use the script, pass the `-h` or `--help` flags:
-
-```shell script
-./build_scripts/release-version.sh --help
-```
-
-If running in interactive mode (without `-y|--yes`), the script will output a summary of pending
-changes and ask for confirmation before executing the actions.
-
-#### Manual release process
-If the automatic release process doesn't cover your use case, you can also create a new release
-manually by following these steps:
-
-1. (repeat as needed) implement features on feature branches merged into `develop`.
-Each merge into develop will advance the `.devNNN` version suffix and publish the pre-release version into the package
-registry. These versions can be installed using `pip install --pre`.
-2. When ready to release: From the develop branch create the release branch and perform release activities
-(update changelog, news, ...). For your own convenience, define an env variable for the release version
- ```shell script
- export RELEASE_VERSION="vX.Y.Z"
- git checkout develop
- git branch release/${RELEASE_VERSION} && git checkout release/${RELEASE_VERSION}
- ```
-3. Run `bumpversion --commit release` if the release is only a patch release, otherwise the full version can be specified
-using `bumpversion --commit --new-version X.Y.Z release`
-(the `release` part is ignored but required by bumpversion :rolling_eyes:).
-4. Merge the release branch into `master`, tag the merge commit, and push back to the repo.
-The CI pipeline publishes the package based on the tagged commit.
-
- ```shell script
- git checkout master
- git merge --no-ff release/${RELEASE_VERSION}
- git tag -a ${RELEASE_VERSION} -m"Release ${RELEASE_VERSION}"
- git push --follow-tags origin master
- ```
-5. Switch back to the release branch `release/vX.Y.Z` and pre-bump the version: `bumpversion --commit patch`.
-This ensures that `develop` pre-releases are always strictly more recent than the last published release version
-from `master`.
-6. Merge the release branch into `develop`:
- ```shell script
- git checkout develop
- git merge --no-ff release/${RELEASE_VERSION}
- git push origin develop
- ```
-6. Delete the release branch if necessary: `git branch -d release/${RELEASE_VERSION}`
-7. Pour yourself a cup of coffee, you earned it! :coffee: :sparkles:
-
-## Useful information
-
-Mark all autogenerated directories as excluded in your IDE. In particular docs/_build and .tox should be marked
-as excluded in order to get a significant speedup in searches and refactorings.
-
-If using remote execution, don't forget to exclude data paths from deployment (unless you really want to sync them)

diff --git a/{{cookiecutter.project_name}}/docs/spelling_wordlist.txt b/{{cookiecutter.project_name}}/docs/spelling_wordlist.txt
index 9e96009..1053587 100644
--- a/{{cookiecutter.project_name}}/docs/spelling_wordlist.txt
+++ b/{{cookiecutter.project_name}}/docs/spelling_wordlist.txt
@@ -1,4 +1,3 @@
-tianshou
 arXiv
 tanh
 lr
@@ -255,5 +254,4 @@ javascript
 plotly
 appliedAI
 init
-pycharm
-tox
\ No newline at end of file
+pycharm
\ No newline at end of file
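
Note: end to end, the workflow described by the updated README might look like the sketch below. The template URL and the generated directory name are placeholder assumptions for illustration; `poetry install --with dev`, `pre-commit install` and the `poe doc-build` task are the commands the patched docs refer to.

```shell
# Generate a new project from the template and answer the interactive
# prompts (including the project's url, used for source-code links in the docs).
# URL and directory name below are hypothetical placeholders.
cookiecutter https://github.com/appliedAI-Initiative/python-library-template
cd my_library

# Install the library in editable mode together with the dev dependencies.
poetry install --with dev

# Set up the git pre-commit hooks.
poetry run pre-commit install

# Build the jupyter-book documentation locally.
poetry run poe doc-build
```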