diff --git a/.github/workflows/conda.yml b/.github/workflows/conda.yml index 523a1f5e..33fe1134 100644 --- a/.github/workflows/conda.yml +++ b/.github/workflows/conda.yml @@ -11,8 +11,7 @@ jobs: fail-fast: false matrix: python-version: ["3.9"] - # platform: ["ubuntu-latest", "macos-latest", "windows-latest"] - platform: ["ubuntu-latest", "windows-latest"] + platform: ["ubuntu-latest", "macos-latest", "windows-latest"] runs-on: ${{ matrix.platform }} defaults: diff --git a/.github/workflows/docker.yaml b/.github/workflows/docker.yaml new file mode 100644 index 00000000..368c4664 --- /dev/null +++ b/.github/workflows/docker.yaml @@ -0,0 +1,21 @@ +on: + pull_request: + push: + branches: [master, main] +jobs: + docker-test: + name: Docker install and test + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v3 + - name: Build Docker image + run: docker build -t eis_toolkit -f ./Dockerfile-docs . + - name: Run mkdocs in Docker container + run: | + docker run -v ./site/pdf/:/eis_toolkit/site/pdf/ --env ENABLE_PDF_EXPORT=1 eis_toolkit poetry run mkdocs build + - uses: actions/upload-artifact@v4 + with: + name: document.pdf + path: ./site/pdf/document.pdf + - name: Run pytest in Docker container + run: docker run eis_toolkit poetry run pytest -v diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml index 07c874f0..d84f9330 100644 --- a/.github/workflows/tests.yml +++ b/.github/workflows/tests.yml @@ -25,8 +25,8 @@ jobs: sudo apt-add-repository ppa:ubuntugis/ubuntugis-unstable sudo apt-get update sudo apt-get install gdal-bin libgdal-dev - pip install GDAL==3.4.3 pytest - if [ -f requirements.txt ]; then pip install -r requirements.txt; fi + pip install GDAL==3.4.3 poetry + poetry install - name: Test with pytest run: | - pytest + poetry run pytest -v diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index d8d636a3..0b90fc27 100755 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -7,78 +7,119 @@ If an issue for the changes you intend to make does not exist, create one. 1. Raise an issue for the changes you wish to make (or start working on a pre-existing issue). 2. Make a feature branch for your changes. -> Name the branch as add_ + +> Name the branch as -add- or something as descriptive + 3. Base your feature branch on the master branch. + > Remember to -``` console + +```bash git pull ``` -before checking out to a new branch. - -4. Do all -- editing -- formatting -and - -- testing +before checking out to a new branch. -on the issue-specific branch. Commit only to that branch, do not edit the master branch directly. +4. Do all editing formatting and testing on the issue-specific branch. + Commit only to that branch, do not edit the master branch directly. -5. Once you have something working, make sure your commits are according to the desired coding style and that your branch contains appropriate documentation and tests. +5. Once you have something working, make sure your commits are + according to the desired coding style and that your branch contains + appropriate documentation and tests. -6. Create a pull request (PR) to merge your branch into the master. In it, summarize your changes. -Assign a reviewer / reviewers for the PR. +6. Create a pull request (PR) to merge your branch into the master. In + it, summarize your changes. Assign a reviewer / reviewers for the + PR. ## Terminology and general coding principles 1. Packages -The folders at eis_toolkit/eis_toolkit are called packages. The initial division to packages already exist. 
Feel free to suggest modifications to the -current package division via creating an issue for it! Note that the packages can split up into sub packages if needed. Subpackages' names should also represent the main purpose of the modules belonging to the particular subpackage. +The folders inside `./eis_toolkit` are called subpackages. Feel free to suggest +modifications to the current subpackage division via creating an issue for it! +Note that the subpackages can split up into more subpackages if needed. 2. Modules -Module names come from the names of the .py files containing function declarations. You will need to create a new python file for each functionality. The name of the file containing the function declaration(s) for providing the functionality will be essentially the same as the function’s name but instead of the basic form use –ing form if it makes sense. +Module names come from the names of the .py files containing function +declarations. You will need to create a new python file for each functionality. +The name of the file containing the function declaration(s) for providing the +functionality will be essentially the same as the function’s name but instead +of the basic form use –ing form if it makes sense. -- Try to create modules in a way that each module contains only one functionality. Split this functionality into two function declarations: one for external use and one (the core functionality) for internal use. See e.g. implementation of [clipping functionality](./eis_toolkit/raster_processing/clipping.py) for reference. +- Try to create modules in a way that each module contains only one + functionality. Split this functionality into two function + declarations: one for external use and one (the core functionality) + for internal use. See e.g. implementation of [clipping + functionality](./eis_toolkit/raster_processing/clipping.py) for + reference. 1. Functions -Name each function according to what it is supposed to do. Try to express the purpose as simplistic as possible. In principle, each function should be creted for executing one task. We prefer modular structure and low hierarchy by trying to avoid nested function declarations. It is highly recommended to call other functions for executing sub tasks. +Name each function according to what it is supposed to do. Try to +express the purpose as simplistic as possible. In principle, each +function should be created for executing one task. We prefer modular +structure and low hierarchy by trying to avoid nested function +declarations. It is highly recommended to call other functions for +executing sub tasks. **Example (packages, modules & functions):** -Create a function which clips a raster file with polygon -> name the function as clip. Write this function declaration into a new python file with name clipping.py inside of the eis_toolkit/eis_toolkit/raster_processing folder. +Create a function which clips a raster file with polygon -\> name the +function as clip. Write this function declaration into a new python file +with name clipping.py inside of the +`eis_toolkit/raster_processing` folder. 4. Classes -A class can be defined inside of a module or a function. Class names should begin with a capital letter and follow the CamelCase naming convention: if a class name contains multiple words, the spaces are simply ignored and each separate word begins with capital letters. +A class can be defined inside of a module or a function. 
Class names +should begin with a capital letter and follow the CamelCase naming +convention: if a class name contains multiple words, the spaces are +simply ignored and each separate word begins with capital letters. + +When implementing the toolkit functions, create classes only when they +are clearly beneficial. -When implementing the toolkit functions, create classes only when they are clearly beneficial. -> If you create new custom exception classes, add them directly into eis_toolkit/eis_toolkit/exceptions.py file. +> If you create new custom exception classes, add them directly into +> `eis_toolkit/exceptions.py` file. 5. Variables -Avoid using global variables. Name your variables clearly for code maintainability and to avoid bugs. +Avoid using global variables. Name your variables clearly for code +maintainability and to avoid bugs. 6. Docstrings and code comments -For creating docstrings, we rely on google convention (see section 3.8 in [link](https://google.github.io/styleguide/pyguide.html) for more detailed instructions). Let’s try to minimize the amount of code comments. Well defined docstrings should do most of the job with clear code structure. +For creating docstrings, we rely on google convention (see section 3.8 +in [link](https://google.github.io/styleguide/pyguide.html) for more +detailed instructions). Let's try to minimize the amount of code +comments. Well defined docstrings should do most of the job with clear +code structure. ## Naming policy -General guidelines about naming policy (applies to package, module, function, class and variable names): -- all names should be given in English -- avoid too cryptic names by using complete words -- if the name consists of multiple words, use snake_case so replace space with underscore character (_) (CamelCase is used for classes as an exception to this rule) -- do not include special characters, capital letters or numbers into the names unless in case of using numbers in variable names and there is an unavoidable need for it / using numbers significantly increases clarity +General guidelines about naming policy (applies to package, module, +function, class and variable names): + +- all names should be given in English + +- avoid too cryptic names by using complete words + +- if the name consists of multiple words, use snake_case so replace + space with underscore character (\_) (CamelCase is used for classes + as an exception to this rule) + +- do not include special characters, capital letters or numbers into + the names unless in case of using numbers in variable names and + there is an unavoidable need for it / using numbers significantly + increases clarity ## Code style ### pre-commit -> Note that pre-commit was added as the primary style check tool later in the project and you need to install and enable it manually! +> Note that pre-commit was added as the primary style check tool later +> in the project and you need to install and enable it manually! The repository contains a `.pre-commit-config.yaml` file that has configuration to run a set of [`pre-commit`](https://pre-commit.com) hooks. As the name @@ -90,14 +131,14 @@ copy of the repository to run. To install `pre-commit` on Debian or Ubuntu -based systems with `apt` as the package manager you should be able to run: -``` console +```bash apt update apt install pre-commit ``` Alternatively, it can be installed with the system installation of `Python`: -``` console +```bash pip install pre-commit ``` @@ -107,7 +148,7 @@ methods (). 
To enable the hooks locally, enter the directory with your local version of `eis_toolkit`, and run: -``` console +```bash pre-commit install ``` @@ -119,7 +160,7 @@ the commit. To disable the hooks and allow commits even with errors pointed out by `pre-commit`, you can add the `--no-verify` option to the `git` command-line: -``` console +```bash git commit -m "" --no-verify ``` @@ -129,33 +170,37 @@ out by `pre-commit`. You can also run the hooks without committing on all files. Make sure you save any text changes as `pre-commit` can modify unformatted files: -``` console +```bash pre-commit run --all-files ``` - ## Testing -Creating and executing tests improves code quality and helps to ensure that nothing gets broken -after merging the PR. +Creating and executing tests improves code quality and helps to ensure +that nothing gets broken after merging the PR. > **Please** note that creating and running tests is not optional! Create a new python file into eis_toolkit/tests folder every time you wish to add a new functionality into eis_toolkit. Name that file as _test.py. -In this test file you can declare all test functions related to the new function. Add a function at least for testing that -- the new function works as expected (in this you can utilize other software for generating the reference solution) -- custom exception class errors get fired when expected +In this test file you can declare all test functions related to the new +function. Add a function at least for testing that -You can utilize both local and remote test data in your tests. -For more information about creating and running tests, take a look at [test instructions](./instructions/testing.md). +- the new function works as expected (in this you can utilize other + software for generating the reference solution) +- custom exception class errors get fired when expected -## Documentation +You can utilize both local and remote test data in your tests. For more +information about creating and running tests, take a look at [test +instructions](./instructions/testing.md). -When adding (or editing) a module, function or class, **please** make sure the documentation stays up-to-date! -For more information, take a look at [documentation instructions](./instructions/generating_documentation.md). +## Documentation +When adding (or editing) a module, function or class, **please** make +sure the documentation stays up-to-date! For more information, take a +look at [documentation +instructions](./instructions/generating_documentation.md). ## Creating a PR @@ -171,22 +216,29 @@ If you act according to the workflow stated in this document, these PR checks should always pass since you have already run pytest through before committing :) The purpose of this automatic workflow is to double check that nothing gets broken by merge. -However, **IF** you make changes to the dependencies of the repository (i.e. edit -pyproject.toml file), you need to update requirements.txt file in order to the -workflow tests to stay up-to-date. You can do this by running the following command +However, **IF** you make changes to the dependencies of the repository (i.e. +edit pyproject.toml file), you need to update `poetry.lock` and +`environment.yaml` files in order to the workflow tests to stay up-to-date. 
You +can update the `poetry.lock` file by running the following commands: -```console -poetry export --without-hashes --format=requirements.txt > requirements.txt +```bash +# Not required if you added a package with poetry add command +poetry lock --no-update ``` -and committing the new version of the particular file into your feature -branch. Please note that this file is only used for GitHub workflows, otherwise -we utilize poetry for dependency handling. - +and committing the new version of the particular file into your feature branch. +Dependencies in `environment.yaml` need to be kept up to date manually by +including the same package, which was added to `pyproject.toml`, in +`environment.yaml`. Please note that this file is only used for GitHub +workflows, otherwise we utilize poetry for dependency handling. ## Recent changes + Some changes have been made to the style guide: -- Use `numbers.Number` as the type when both floats and integers are accepted by functions: + +- Use `numbers.Number` as the type when both floats and integers are + accepted by functions: + ```python from numbers import Number @@ -196,7 +248,11 @@ def func(int_or_float: Number): ```python raise InvalidParameterValueException(f"Window size is too small: {height}, {width}.") ``` -- Use beartype's decorator for automatic function argument type checking and import types from `beartype.typing` if a warning is raised by beartype on imports from `typing`: + +- Use beartype's decorator for automatic function argument type + checking and import types from `beartype.typing` if a warning is + raised by beartype on imports from `typing`: + ```python from beartype import beartype from beartype.typing import Sequence @@ -229,14 +285,23 @@ def my_function(parameter_1: float, parameter_2: bool, parameter_seq: Sequence): ``` ## Developer's checklist + Here are some things to remember while implementing a new tool: -- Create an issue **before or when you start** developing a functionality -- Adhere to the style guide - - Look at existing implementations and copy the form - - Enable pre-commit and fix style/other issues according to the error messages -- Remember to use typing hints -- Write tests for your functions -- Add a .md file for you functionality -- If you think the tool you are developing could use a separate general utility function, make an issue about this need before starting to develop it on your own. Also check if a utility function exists already -- Remember to implement only the minimum what is required for the tool! With data functions, you can usually assume file reading/writing, nodata handling and other such processes are done before/after executing your tool +- Create an issue **before or when you start** developing a + functionality +- Adhere to the style guide + - Look at existing implementations and copy the form + - Enable pre-commit and fix style/other issues according to the + error messages +- Remember to use typing hints +- Write tests for your functions +- Add a .md file for you functionality +- If you think the tool you are developing could use a separate + general utility function, make an issue about this need before + starting to develop it on your own. Also check if a utility function + exists already +- Remember to implement only the minimum what is required for the + tool! 
With data functions, you can usually assume file + reading/writing, nodata handling and other such processes are done + before/after executing your tool diff --git a/Dockerfile-docs b/Dockerfile-docs new file mode 100755 index 00000000..198f0d2a --- /dev/null +++ b/Dockerfile-docs @@ -0,0 +1,26 @@ +FROM ubuntu:22.04 + +EXPOSE 8888 +EXPOSE 8000 + +WORKDIR /eis_toolkit + +ARG DEBIAN_FRONTEND=noninteractive +RUN apt-get update && apt-get install -y \ + libpango-1.0-0 \ + libharfbuzz0b \ + libpangoft2-1.0-0 \ + libgdal-dev \ + python3-pip + +RUN pip install poetry pre-commit + +COPY poetry.lock pyproject.toml mkdocs.yml /eis_toolkit/ +COPY docs /eis_toolkit/docs/ +COPY docs_assets /eis_toolkit/docs_assets/ +COPY eis_toolkit /eis_toolkit/eis_toolkit/ +COPY tests /eis_toolkit/tests/ + +RUN poetry install + +# COPY . . diff --git a/docs/dependency_licenses.md b/docs/dependency_licenses.md index 04c38864..95bbbd47 100644 --- a/docs/dependency_licenses.md +++ b/docs/dependency_licenses.md @@ -1,88 +1,85 @@ -| Name | Version | License | -|------------------------------|-----------|----------------------------------------------------| -| protobuf | 3.19.4 | 3-Clause BSD License | -| tensorboard-plugin-wit | 1.8.1 | Apache 2.0 | -| absl-py | 1.2.0 | Apache Software License | -| flatbuffers | 1.12 | Apache Software License | -| ghp-import | 2.1.0 | Apache Software License | -| google-auth | 2.11.0 | Apache Software License | -| google-auth-oauthlib | 0.4.6 | Apache Software License | -| google-pasta | 0.2.0 | Apache Software License | -| grpcio | 1.48.1 | Apache Software License | -| importlib-metadata | 4.12.0 | Apache Software License | -| keras | 2.9.0 | Apache Software License | -| libclang | 14.0.6 | Apache Software License | -| requests | 2.28.1 | Apache Software License | -| rsa | 4.9 | Apache Software License | -| tenacity | 8.2.2 | Apache Software License | -| tensorboard | 2.9.1 | Apache Software License | -| tensorboard-data-server | 0.6.1 | Apache Software License | -| tensorflow | 2.9.2 | Apache Software License | -| tensorflow-estimator | 2.9.0 | Apache Software License | -| tensorflow-io-gcs-filesystem | 0.26.0 | Apache Software License | -| watchdog | 2.1.9 | Apache Software License | -| packaging | 21.3 | Apache Software License; BSD License | -| python-dateutil | 2.8.2 | Apache Software License; BSD License | -| affine | 2.3.1 | BSD | -| cligj | 0.7.2 | BSD | -| geopandas | 0.11.1 | BSD | -| Fiona | 1.8.21 | BSD License | -| Jinja2 | 3.1.2 | BSD License | -| Markdown | 3.3.7 | BSD License | -| MarkupSafe | 2.1.1 | BSD License | -| Pygments | 2.13.0 | BSD License | -| Shapely | 1.8.4 | BSD License | -| Werkzeug | 2.2.2 | BSD License | -| astunparse | 1.6.3 | BSD License | -| click | 8.1.3 | BSD License | -| click-plugins | 1.1.1 | BSD License | -| cycler | 0.11.0 | BSD License | -| gast | 0.4.0 | BSD License | -| h5py | 3.7.0 | BSD License | -| idna | 3.3 | BSD License | -| joblib | 1.1.0 | BSD License | -| kiwisolver | 1.4.4 | BSD License | -| mkdocs | 1.3.1 | BSD License | -| numpy | 1.23.2 | BSD License | -| oauthlib | 3.2.0 | BSD License | -| pandas | 1.4.4 | BSD License | -| patsy | 0.5.2 | BSD License | -| pyasn1 | 0.4.8 | BSD License | -| pyasn1-modules | 0.2.8 | BSD License | -| rasterio | 1.3.2 | BSD License | -| requests-oauthlib | 1.3.1 | BSD License | -| scikit-learn | 1.1.2 | BSD License | -| scipy | 1.9.1 | BSD License | -| statsmodels | 0.13.2 | BSD License | -| threadpoolctl | 3.1.0 | BSD License | -| wrapt | 1.14.1 | BSD License | -| eis-toolkit | 0.1.0 | European 
Union Public Licence 1.2 (EUPL 1.2) | -| Pillow | 9.2.0 | Historical Permission Notice and Disclaimer (HPND) | -| opt-einsum | 3.3.0 | MIT | -| snuggs | 1.4.7 | MIT | -| GDAL | 3.4.3 | MIT License | -| Keras-Preprocessing | 1.1.2 | MIT License | -| PyYAML | 6.0 | MIT License | -| attrs | 22.1.0 | MIT License | -| cachetools | 5.2.0 | MIT License | -| charset-normalizer | 2.1.1 | MIT License | -| fonttools | 4.37.1 | MIT License | -| mergedeep | 1.3.4 | MIT License | -| mkdocs-material | 8.4.2 | MIT License | -| mkdocs-material-extensions | 1.0.3 | MIT License | -| munch | 2.5.0 | MIT License | -| plotly | 5.14.0 | MIT License | -| pymdown-extensions | 9.5 | MIT License | -| pyparsing | 3.0.9 | MIT License | -| pyproj | 3.3.1 | MIT License | -| pytz | 2022.2.1 | MIT License | -| pyyaml_env_tag | 0.1 | MIT License | -| setuptools-scm | 6.4.2 | MIT License | -| six | 1.16.0 | MIT License | -| termcolor | 1.1.0 | MIT License | -| tomli | 2.0.1 | MIT License | -| urllib3 | 1.26.12 | MIT License | -| zipp | 3.8.1 | MIT License | -| certifi | 2022.6.15 | Mozilla Public License 2.0 (MPL 2.0) | -| matplotlib | 3.5.3 | Python Software Foundation License | -| typing_extensions | 4.3.0 | Python Software Foundation License | +| Name | Version | License | +|------------------------------|--------------|----------------------------------------------------| +| protobuf | 4.24.4 | 3-Clause BSD License | +| absl-py | 2.0.0 | Apache Software License | +| flatbuffers | 23.5.26 | Apache Software License | +| google-auth | 2.23.3 | Apache Software License | +| google-auth-oauthlib | 1.0.0 | Apache Software License | +| google-pasta | 0.2.0 | Apache Software License | +| grpcio | 1.59.0 | Apache Software License | +| keras | 2.14.0 | Apache Software License | +| libclang | 16.0.6 | Apache Software License | +| ml-dtypes | 0.2.0 | Apache Software License | +| requests | 2.31.0 | Apache Software License | +| rsa | 4.9 | Apache Software License | +| tensorboard | 2.14.1 | Apache Software License | +| tensorboard-data-server | 0.7.2 | Apache Software License | +| tensorflow | 2.14.0 | Apache Software License | +| tensorflow-estimator | 2.14.0 | Apache Software License | +| tensorflow-io-gcs-filesystem | 0.34.0 | Apache Software License | +| tzdata | 2023.3 | Apache Software License | +| packaging | 23.2 | Apache Software License; BSD License | +| python-dateutil | 2.8.2 | Apache Software License; BSD License | +| cligj | 0.7.2 | BSD | +| geopandas | 0.11.1 | BSD | +| Markdown | 3.5 | BSD License | +| MarkupSafe | 2.1.3 | BSD License | +| PyKrige | 1.7.1 | BSD License | +| Pygments | 2.16.1 | BSD License | +| Shapely | 1.8.5.post1 | BSD License | +| Werkzeug | 3.0.0 | BSD License | +| affine | 2.4.0 | BSD License | +| astunparse | 1.6.3 | BSD License | +| click | 8.1.7 | BSD License | +| click-plugins | 1.1.1 | BSD License | +| colorama | 0.4.6 | BSD License | +| contourpy | 1.1.1 | BSD License | +| cycler | 0.12.1 | BSD License | +| fiona | 1.9.5 | BSD License | +| gast | 0.5.4 | BSD License | +| h5py | 3.10.0 | BSD License | +| idna | 3.4 | BSD License | +| joblib | 1.3.2 | BSD License | +| kiwisolver | 1.4.5 | BSD License | +| numpy | 1.26.1 | BSD License | +| oauthlib | 3.2.2 | BSD License | +| pandas | 2.1.1 | BSD License | +| patsy | 0.5.3 | BSD License | +| pyasn1 | 0.5.0 | BSD License | +| pyasn1-modules | 0.3.0 | BSD License | +| rasterio | 1.3.9 | BSD License | +| requests-oauthlib | 1.3.1 | BSD License | +| scikit-learn | 1.3.2 | BSD License | +| scipy | 1.11.3 | BSD License | +| seaborn | 0.13.0 | BSD 
License | +| statsmodels | 0.14.0 | BSD License | +| threadpoolctl | 3.2.0 | BSD License | +| wrapt | 1.14.1 | BSD License | +| eis-toolkit | 0.1.0 | European Union Public Licence 1.2 (EUPL 1.2) | +| Pillow | 10.1.0 | Historical Permission Notice and Disclaimer (HPND) | +| shellingham | 1.5.4 | ISC License (ISCL) | +| imbalanced-learn | 0.11.0 | MIT | +| opt-einsum | 3.3.0 | MIT | +| snuggs | 1.4.7 | MIT | +| GDAL | 3.4.3 | MIT License | +| Rtree | 1.1.0 | MIT License | +| attrs | 23.1.0 | MIT License | +| beartype | 0.13.1 | MIT License | +| cachetools | 5.3.1 | MIT License | +| charset-normalizer | 3.3.1 | MIT License | +| fonttools | 4.43.1 | MIT License | +| markdown-it-py | 3.0.0 | MIT License | +| mdurl | 0.1.2 | MIT License | +| pyparsing | 3.1.1 | MIT License | +| pyproj | 3.6.1 | MIT License | +| pytz | 2023.3.post1 | MIT License | +| rich | 13.6.0 | MIT License | +| setuptools-scm | 8.0.4 | MIT License | +| six | 1.16.0 | MIT License | +| termcolor | 2.3.0 | MIT License | +| tomli | 2.0.1 | MIT License | +| typer | 0.9.0 | MIT License | +| urllib3 | 2.0.7 | MIT License | +| certifi | 2023.7.22 | Mozilla Public License 2.0 (MPL 2.0) | +| matplotlib | 3.8.0 | Python Software Foundation License | +| typing_extensions | 4.8.0 | Python Software Foundation License | diff --git a/eis_toolkit/__main__.py b/eis_toolkit/__main__.py new file mode 100644 index 00000000..beae11b2 --- /dev/null +++ b/eis_toolkit/__main__.py @@ -0,0 +1,4 @@ +from eis_toolkit.cli import app + +if __name__ == "__main__": + app() diff --git a/instructions/dev_setup_with_docker.md b/instructions/dev_setup_with_docker.md index b5c52063..56f44986 100644 --- a/instructions/dev_setup_with_docker.md +++ b/instructions/dev_setup_with_docker.md @@ -2,13 +2,13 @@ Build and run the eis_toolkit container. Run this and every other command in the repository root unless otherwise directed. -```console +```bash docker compose up -d ``` If you need to rebuild already existing container (e.g. dependencies have been updated), run -```console +```bash docker compose up -d --build ``` @@ -16,7 +16,7 @@ docker compose up -d --build Attach to the running container -```console +```bash docker attach eis_toolkit ``` @@ -34,24 +34,24 @@ For your workflow this means that: Whether or not using docker we manage the python dependencies with poetry. This means that a python venv is found in the container too. Inside the container, you can get into the venv like you normally would -```console +```bash poetry shell ``` and run your code and tests from the command line. For example: -```console +```bash python ``` or -```console +```bash pytest ``` You can also run commands from outside the venv, just prefix them with poetry run. For example: -```console +```bash poetry run pytest -``` \ No newline at end of file +``` diff --git a/instructions/dev_setup_without_docker.md b/instructions/dev_setup_without_docker.md index 664d2834..cb0c71ca 100755 --- a/instructions/dev_setup_without_docker.md +++ b/instructions/dev_setup_without_docker.md @@ -1,12 +1,15 @@ # Development with Poetry + If you do not have docker, you can setup your local development environment as a python virtual environment using Poetry. ## Prerequisites 0. Make sure that GDAL's dependencies -- libgdal (3.5.1 or greater) -- header files (gdal-devel) +- libgdal (3.5.1 or greater) +- header files (gdal-devel) +- See `.github/workflows/tests.yml` to see how these dependencies are + installed in CI are satisfied. If not, install them. @@ -19,21 +22,21 @@ are satisfied. 
If not, install them. 1. Install dependencies and create a virtual environment -```shell +```bash poetry install ``` 2. To use the virtual environment you can either enter it with -```shell +```bash poetry shell ``` or prefix your normal shell commands with -```shell +```bash poetry run ``` -If you want to use jpyterlab see the [instructions](./using_jupyterlab.md) +If you want to use jupyterlab see the [instructions](./using_jupyterlab.md) diff --git a/instructions/dev_setup_without_docker_with_conda.md b/instructions/dev_setup_without_docker_with_conda.md index 7b5bdce9..193ef1a2 100755 --- a/instructions/dev_setup_without_docker_with_conda.md +++ b/instructions/dev_setup_without_docker_with_conda.md @@ -18,7 +18,7 @@ the `libmamba` solver instead of the default. **If you encounter installation issues following this guide further**, especially on Windows, you can enable the `libmamba` solver globally(!) as follows: -``` shell +```bash conda install -n base conda-libmamba-solver conda config --set solver libmamba ``` @@ -34,7 +34,7 @@ further info. provided `environment.yml` file. The environment name is defined in `environment.yml` (eis_toolkit). -``` shell +```bash conda env create -f environment.yml # You can overwrite an existing environment named eis_toolkit with the --force flag conda env create -f environment.yml --force @@ -42,7 +42,7 @@ conda env create -f environment.yml --force 2. Activate the environment. -``` shell +```bash conda activate eis_toolkit ``` @@ -58,7 +58,7 @@ conda activate eis_toolkit You can add your own packages to the environment as needed. E.g. `jupyterlab`: -``` shell +```bash # -c conda-forge specifies the conda-forge channel, which is recommended conda install -n eis_toolkit -c conda-forge jupyterlab ``` diff --git a/instructions/generating_dependency_licenses.md b/instructions/generating_dependency_licenses.md index 4b5e6fb4..a059069c 100644 --- a/instructions/generating_dependency_licenses.md +++ b/instructions/generating_dependency_licenses.md @@ -11,34 +11,27 @@ changes. Start by cleaning your local `poetry` environment: -``` shell +``` bash poetry env remove python ``` -Then install only the main dependencies in pyproject.toml (not dev -dependencies): +Add `pip-licenses` to `pyproject.toml` and `poetry.lock`: -``` shell -poetry install --with main -# Old poetry version: poetry install --no-dev +``` bash +poetry add pip-licenses --lock ``` -Use either i.) poetry to add pip-licenses package (which updates pyproject.toml and -poetry.lock but these changes should not be committed) - -``` shell -poetry add pip-licenses -``` +Then install only the main dependencies (now including `pip-licenses`) in +pyproject.toml (not dev dependencies): -ii.) or the pip within the poetry environment: - -``` shell -poetry run pip install pip-licenses +``` bash +poetry install --with main +# Old poetry version: poetry install --no-dev ``` `pip-licenses` is now available in the poetry environment: -``` shell +``` bash poetry run pip-licenses --order=license --format=markdown > docs/dependency_licenses.md ``` @@ -48,7 +41,7 @@ poetry.lock` -command. 
To clean your local `poetry` environment: -``` shell +``` bash # Remove poetry environment from the current project poetry env remove python # Install main and dev packages from pyproject.toml again diff --git a/instructions/testing.md b/instructions/testing.md index 4f1bea18..56eb67a9 100755 --- a/instructions/testing.md +++ b/instructions/testing.md @@ -8,12 +8,12 @@ The tests in this repository can also serve as a starting point. All tests should be under the `tests/` directory under a correct subfolder for the module, e.g. `raster_processing/`. Put test of one module into one file, for example, `clip_test.py` to test clipping functions. running tests is as simple as executing -```console +```bash pytest ``` or if you are not inside Poetry shell -```console +```bash poetry run pytest ``` in the container's command line. diff --git a/instructions/using_jupyterlab.md b/instructions/using_jupyterlab.md index 1bb11801..56d2ea03 100755 --- a/instructions/using_jupyterlab.md +++ b/instructions/using_jupyterlab.md @@ -1,4 +1,5 @@ # Using jupyter + We include [JupyterLab](https://jupyterlab.readthedocs.io/en/stable/) as a development dependency for testing purposes. You can use it for example in cases when you want to store intermediate results in active memory or just to see your pretty plots in the same place you are experimenting. The notebooks are found under the `notebooks/` directory. You can import and use eis_toolkit's functions in these notebooks in the same way as you normally would use any other python package. @@ -6,15 +7,16 @@ The notebooks are found under the `notebooks/` directory. You can import and use *There exists three example notebook files. The first one contains general usage instructions for running and modifying JupyterLab notebooks. The second one has been created for testing that dependencies to other python packages work and the third one has been created for testing the functionality of the clip tool.* ## With docker + To start the server from your container run (inside the running container) -```shell +```bash poetry run jupyter lab --ip=0.0.0.0 --no-browser --allow-root ``` or -```shell +```bash poetry shell jupyter lab --ip=0.0.0.0 --no-browser --allow-root ``` @@ -23,9 +25,10 @@ A jupyter server should now be available. 
Access it with the last link jupyter p to the terminal (you can just click it to automatically open it in a browser) ## Without docker + Start the jupyter server with -```shell +```bash poetry run jupyter lab ``` diff --git a/pyproject.toml b/pyproject.toml index 52c0cd5e..044ec9b9 100755 --- a/pyproject.toml +++ b/pyproject.toml @@ -8,13 +8,11 @@ license = "EUPL-1.2" readme = "README.md" homepage = "https://eis-he.eu" repository = "https://github.com/GispoCoding/eis_toolkit" -# documentation = "" +# See https://pypi.org/classifiers/ keywords = [ - "Exploration", - "Mineral prospectivity", - "Horizon Europe", - "Geology", - "Packages" + "Development Status :: 4 - Beta", + "Topic :: Scientific/Engineering :: GIS", + "Programming Language :: Python :: 3 :: Only", ] [tool.poetry.scripts] diff --git a/requirements.txt b/requirements.txt deleted file mode 100644 index 76ab1338..00000000 --- a/requirements.txt +++ /dev/null @@ -1,87 +0,0 @@ -absl-py==2.0.0 ; python_version >= "3.9" and python_version < "3.11" -affine==2.4.0 ; python_version >= "3.9" and python_version < "3.11" -astunparse==1.6.3 ; python_version >= "3.9" and python_version < "3.11" -attrs==23.1.0 ; python_version >= "3.9" and python_version < "3.11" -beartype==0.13.1 ; python_version >= "3.9" and python_version < "3.11" -cachetools==5.3.1 ; python_version >= "3.9" and python_version < "3.11" -certifi==2023.7.22 ; python_version >= "3.9" and python_version < "3.11" -charset-normalizer==3.3.1 ; python_version >= "3.9" and python_version < "3.11" -click-plugins==1.1.1 ; python_version >= "3.9" and python_version < "3.11" -click==8.1.7 ; python_version >= "3.9" and python_version < "3.11" -cligj==0.7.2 ; python_version >= "3.9" and python_version < "3.11" -colorama==0.4.6 ; python_version >= "3.9" and python_version < "3.11" -contourpy==1.1.1 ; python_version >= "3.9" and python_version < "3.11" -cycler==0.12.1 ; python_version >= "3.9" and python_version < "3.11" -fiona==1.9.5 ; python_version >= "3.9" and python_version < "3.11" -flatbuffers==23.5.26 ; python_version >= "3.9" and python_version < "3.11" -fonttools==4.43.1 ; python_version >= "3.9" and python_version < "3.11" -gast==0.5.4 ; python_version >= "3.9" and python_version < "3.11" -gdal==3.4.3 ; python_version >= "3.9" and python_version < "3.11" -geopandas==0.11.1 ; python_version >= "3.9" and python_version < "3.11" -google-auth-oauthlib==1.0.0 ; python_version >= "3.9" and python_version < "3.11" -google-auth==2.23.3 ; python_version >= "3.9" and python_version < "3.11" -google-pasta==0.2.0 ; python_version >= "3.9" and python_version < "3.11" -grpcio==1.59.0 ; python_version >= "3.9" and python_version < "3.11" -h5py==3.10.0 ; python_version >= "3.9" and python_version < "3.11" -idna==3.4 ; python_version >= "3.9" and python_version < "3.11" -imbalanced-learn==0.11.0 ; python_version >= "3.9" and python_version < "3.11" -importlib-metadata==6.8.0 ; python_version >= "3.9" and python_version < "3.10" -importlib-resources==6.1.0 ; python_version >= "3.9" and python_version < "3.10" -joblib==1.3.2 ; python_version >= "3.9" and python_version < "3.11" -keras==2.14.0 ; python_version >= "3.9" and python_version < "3.11" -kiwisolver==1.4.5 ; python_version >= "3.9" and python_version < "3.11" -libclang==16.0.6 ; python_version >= "3.9" and python_version < "3.11" -markdown-it-py==3.0.0 ; python_version >= "3.9" and python_version < "3.11" -markdown==3.5 ; python_version >= "3.9" and python_version < "3.11" -markupsafe==2.1.3 ; python_version >= "3.9" and python_version < 
"3.11" -matplotlib==3.8.0 ; python_version >= "3.9" and python_version < "3.11" -mdurl==0.1.2 ; python_version >= "3.9" and python_version < "3.11" -ml-dtypes==0.2.0 ; python_version >= "3.9" and python_version < "3.11" -numpy==1.26.1 ; python_version >= "3.9" and python_version < "3.11" -oauthlib==3.2.2 ; python_version >= "3.9" and python_version < "3.11" -opt-einsum==3.3.0 ; python_version >= "3.9" and python_version < "3.11" -packaging==23.2 ; python_version >= "3.9" and python_version < "3.11" -pandas==2.1.1 ; python_version >= "3.9" and python_version < "3.11" -patsy==0.5.3 ; python_version >= "3.9" and python_version < "3.11" -pillow==10.1.0 ; python_version >= "3.9" and python_version < "3.11" -protobuf==4.24.4 ; python_version >= "3.9" and python_version < "3.11" -pyasn1-modules==0.3.0 ; python_version >= "3.9" and python_version < "3.11" -pyasn1==0.5.0 ; python_version >= "3.9" and python_version < "3.11" -pygments==2.16.1 ; python_version >= "3.9" and python_version < "3.11" -pykrige==1.7.1 ; python_version >= "3.9" and python_version < "3.11" -pyparsing==3.1.1 ; python_version >= "3.9" and python_version < "3.11" -pyproj==3.6.1 ; python_version >= "3.9" and python_version < "3.11" -python-dateutil==2.8.2 ; python_version >= "3.9" and python_version < "3.11" -pytz==2023.3.post1 ; python_version >= "3.9" and python_version < "3.11" -rasterio==1.3.9 ; python_version >= "3.9" and python_version < "3.11" -requests-oauthlib==1.3.1 ; python_version >= "3.9" and python_version < "3.11" -requests==2.31.0 ; python_version >= "3.9" and python_version < "3.11" -rich==13.6.0 ; python_version >= "3.9" and python_version < "3.11" -rsa==4.9 ; python_version >= "3.9" and python_version < "3.11" -rtree==1.1.0 ; python_version >= "3.9" and python_version < "3.11" -scikit-learn==1.3.2 ; python_version >= "3.9" and python_version < "3.11" -scipy==1.11.3 ; python_version >= "3.9" and python_version < "3.11" -seaborn==0.13.0 ; python_version >= "3.9" and python_version < "3.11" -setuptools-scm==8.0.4 ; python_version >= "3.9" and python_version < "3.11" -setuptools==68.2.2 ; python_version >= "3.9" and python_version < "3.11" -shapely==1.8.5.post1 ; python_version >= "3.9" and python_version < "3.11" -shellingham==1.5.4 ; python_version >= "3.9" and python_version < "3.11" -six==1.16.0 ; python_version >= "3.9" and python_version < "3.11" -snuggs==1.4.7 ; python_version >= "3.9" and python_version < "3.11" -statsmodels==0.14.0 ; python_version >= "3.9" and python_version < "3.11" -tensorboard-data-server==0.7.2 ; python_version >= "3.9" and python_version < "3.11" -tensorboard==2.14.1 ; python_version >= "3.9" and python_version < "3.11" -tensorflow-estimator==2.14.0 ; python_version >= "3.9" and python_version < "3.11" -tensorflow-io-gcs-filesystem==0.34.0 ; python_version >= "3.9" and python_version < "3.11" -tensorflow==2.14.0 ; python_version >= "3.9" and python_version < "3.11" -termcolor==2.3.0 ; python_version >= "3.9" and python_version < "3.11" -threadpoolctl==3.2.0 ; python_version >= "3.9" and python_version < "3.11" -tomli==2.0.1 ; python_version >= "3.9" and python_version < "3.11" -typer[all]==0.9.0 ; python_version >= "3.9" and python_version < "3.11" -typing-extensions==4.8.0 ; python_version >= "3.9" and python_version < "3.11" -tzdata==2023.3 ; python_version >= "3.9" and python_version < "3.11" -urllib3==2.0.7 ; python_version >= "3.9" and python_version < "3.11" -werkzeug==3.0.0 ; python_version >= "3.9" and python_version < "3.11" -wheel==0.41.2 ; python_version >= "3.9" and 
python_version < "3.11" -wrapt==1.14.1 ; python_version >= "3.9" and python_version < "3.11" -zipp==3.17.0 ; python_version >= "3.9" and python_version < "3.10" diff --git a/setup.py b/setup.py deleted file mode 100644 index 28756e29..00000000 --- a/setup.py +++ /dev/null @@ -1,15 +0,0 @@ -from setuptools import find_packages, setup - -setup( - name="eis_toolkit", - version="0.1", - packages=find_packages(), - install_requires=[ - # Add your package dependencies here - ], - entry_points={ - "console_scripts": [ - "eis=eis_toolkit.cli:cli", - ], - }, -)
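
Note (illustrative, not part of the patch above): the new `eis_toolkit/__main__.py` imports a Typer `app` object from `eis_toolkit.cli` and calls it, while the deleted `setup.py` previously declared an `eis=eis_toolkit.cli:cli` console script. The actual contents of `eis_toolkit/cli.py` are not shown in this diff; the sketch below is only a minimal, hypothetical example of the kind of Typer application that `__main__.py` expects, with a made-up `hello` command used purely for illustration.

```python
# Hypothetical sketch of a Typer-based CLI module (the real eis_toolkit/cli.py
# is not included in this diff). It only shows the shape that
# eis_toolkit/__main__.py relies on: a module-level `app` object that can be called.
import typer

app = typer.Typer(help="EIS Toolkit CLI (illustrative example only).")


@app.command()
def hello(name: str = "world") -> None:
    """Example command; real toolkit commands would be defined here."""
    typer.echo(f"Hello, {name}!")


if __name__ == "__main__":
    app()
```

With `__main__.py` added, such an app can be invoked with `python -m eis_toolkit`, in addition to any console script declared under `[tool.poetry.scripts]` in `pyproject.toml`.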