From 11b94878e52c1252be803161a034980cda7e3c8e Mon Sep 17 00:00:00 2001 From: Benjamin Rodenberg Date: Tue, 4 Jul 2023 14:57:50 +0200 Subject: [PATCH 01/14] Fix link (#158) --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 831d6b76..e9d82642 100644 --- a/README.md +++ b/README.md @@ -85,7 +85,7 @@ To create and install the `fenicsprecice` python package the following instructi ## Development history -The initial version of this adapter was developed by [Benjamin Rodenberg](https://www.in.tum.de/i05/personen/personen/benjamin-rodenberg/) during his research stay at Lund University in the group for [Numerical Analysis](http://www.maths.lu.se/english/research/research-divisions/numerical-analysis/) in close collaboration with [Peter Meisrimel](https://www.lunduniversity.lu.se/lucat/user/09d80f0367a060bcf2a22d7c22e5e504). +The initial version of this adapter was developed by [Benjamin Rodenberg](https://www.in.tum.de/i05/personen/personen/benjamin-rodenberg/) during his research stay at Lund University in the group for [Numerical Analysis](http://www.maths.lu.se/english/research/research-divisions/numerical-analysis/) in close collaboration with [Peter Meisrimel](https://portal.research.lu.se/en/persons/peter-meisrimel). [Richard Hertrich](https://github.com/richahert) contributed the possibility to perform FSI simulations using the adapter in his [Bachelor thesis](https://mediatum.ub.tum.de/node?id=1520579). From 1a988ee2bb05d87eab05f76c85dd531d4355e965 Mon Sep 17 00:00:00 2001 From: Benjamin Rodenberg Date: Mon, 10 Jul 2023 10:05:06 +0200 Subject: [PATCH 02/14] Update link. 
--- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index e9d82642..423b81ec 100644 --- a/README.md +++ b/README.md @@ -85,7 +85,7 @@ To create and install the `fenicsprecice` python package the following instructi ## Development history -The initial version of this adapter was developed by [Benjamin Rodenberg](https://www.in.tum.de/i05/personen/personen/benjamin-rodenberg/) during his research stay at Lund University in the group for [Numerical Analysis](http://www.maths.lu.se/english/research/research-divisions/numerical-analysis/) in close collaboration with [Peter Meisrimel](https://portal.research.lu.se/en/persons/peter-meisrimel). +The initial version of this adapter was developed by [Benjamin Rodenberg](https://www.cs.cit.tum.de/sccs/personen/benjamin-rodenberg/) during his research stay at Lund University in the group for [Numerical Analysis](http://www.maths.lu.se/english/research/research-divisions/numerical-analysis/) in close collaboration with [Peter Meisrimel](https://portal.research.lu.se/en/persons/peter-meisrimel). [Richard Hertrich](https://github.com/richahert) contributed the possibility to perform FSI simulations using the adapter in his [Bachelor thesis](https://mediatum.ub.tum.de/node?id=1520579). 
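The two patches above replace author links in the README that had gone stale. A small helper that extracts all inline markdown links makes it easy to review (or feed into a link checker) every URL in the README before a release. This is a hypothetical maintenance script, not part of the adapter itself:

```python
import re

# Matches inline markdown links of the form [text](http…url)
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")

def extract_links(markdown: str):
    """Return (link text, url) pairs for all inline links in a markdown string."""
    return LINK_RE.findall(markdown)

# Example input, taken from the README paragraph touched by these patches:
readme_snippet = (
    "developed by [Benjamin Rodenberg]"
    "(https://www.cs.cit.tum.de/sccs/personen/benjamin-rodenberg/) "
    "in collaboration with [Peter Meisrimel]"
    "(https://portal.research.lu.se/en/persons/peter-meisrimel)."
)

for text, url in extract_links(readme_snippet):
    print(text, "->", url)
```

The extracted URLs can then be probed with any HTTP client to flag dead links before they reach `master`.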
From 0a8f031b59f8b61af650a5e508a452a45165f56e Mon Sep 17 00:00:00 2001 From: Benjamin Rodenberg Date: Mon, 10 Jul 2023 10:08:02 +0200 Subject: [PATCH 03/14] Update `CITATION.cff` to use `preferred-citation` (#157) * Update citation cff * Update ReleaseGuide.md * Refer to preCICE Literature guide --- CITATION.cff | 104 ++++++++++++++++++++++++++----------------- docs/ReleaseGuide.md | 9 ++-- 2 files changed, 68 insertions(+), 45 deletions(-) diff --git a/CITATION.cff b/CITATION.cff index 34b3f499..ef7b5316 100644 --- a/CITATION.cff +++ b/CITATION.cff @@ -1,45 +1,65 @@ -# YAML 1.2 ---- -abstract: "The new software FEniCS-preCICE is a middle software layer, sitting in between the existing finite-element library FEniCS and the coupling library preCICE. The middle layer simplifies coupling (existing) FEniCS application codes to other simulation software via preCICE. To this end, FEniCS-preCICE converts between FEniCS and preCICE mesh and data structures, provides easy-to-use coupling conditions, and manages data checkpointing for implicit coupling. The new software is a library itself and follows a FEniCS-native style. Only a few lines of additional code are necessary to prepare a FEniCS application code for coupling. We illustrate the functionality of FEniCS-preCICE by two examples: a FEniCS heat conduction code coupled to OpenFOAM and a FEniCS linear elasticity code coupled to SU2. The results of both scenarios are compared with other simulation software showing good agreement." -authors: - - - affiliation: "Technical University of Munich" +# This CITATION.cff file was generated with cffinit. +# Visit https://bit.ly/cffinit to generate yours today! + +cff-version: 1.2.0 +title: FEniCS-preCICE +message: >- + If you use this software, please cite it using the metadata from this file. When using or referring to preCICE in + academic publications, please follow the [citation guidelines](https://precice.org/fundamentals-literature-guide.html). 
+type: software +authors: + - given-names: Benjamin family-names: Rodenberg - given-names: Benjamin - orcid: "https://orcid.org/0000-0002-3116-0133" - - - affiliation: "University Stuttgart" + orcid: 'https://orcid.org/0000-0002-3116-0133' + affiliation: Technical University of Munich + - given-names: Ishaan family-names: Desai - given-names: Ishaan - orcid: "https://orcid.org/0000-0002-2552-7509" - - - family-names: Hertrich + orcid: 'https://orcid.org/0000-0002-2552-7509' + affiliation: University of Stuttgart + - family-names: Hertrich given-names: Richard - orcid: "https://orcid.org/0000-0003-1722-2841" - - - affiliation: "University of Stuttgart" - family-names: Jaust - given-names: Alexander - orcid: "https://orcid.org/0000-0002-6082-105X" - - - affiliation: "University of Stuttgart" - family-names: Uekermann - given-names: Benjamin - orcid: "https://orcid.org/0000-0002-1314-9969" -cff-version: "1.1.0" -date-released: 2021-01-10 -keywords: - - FEniCS - - "Fluid-Structure Interaction" - - "Conjugate Heat Transfer" - - Multiphysics - - "Coupled Problems" - - "Finite Element Method" - - preCICE -license: "LGPL-3.0" -message: "If you use this software, please cite it using these metadata." -repository-code: "https://github.com/precice/fenics-adapter" -title: "FEniCS-preCICE: Coupling FEniCS to other Simulation Software" -version: 1.2.0 -doi: 10.1016/j.softx.2021.100807 -... + orcid: 'https://orcid.org/0000-0003-1722-2841' + - affiliation: University of Stuttgart + family-names: Jaust + given-names: Alexander + orcid: 'https://orcid.org/0000-0002-6082-105X' + - affiliation: University of Stuttgart + family-names: Uekermann + given-names: Benjamin + orcid: 'https://orcid.org/0000-0002-1314-9969' +repository-code: 'https://github.com/precice/fenics-adapter' +abstract: >- + preCICE-adapter for the open source computing platform + FEniCS. 
+license: LGPL-3.0 +commit: '9aa3e22' +version: 1.4.0 +date-released: '2022-09-22' +preferred-citation: + title: "FEniCS-preCICE: Coupling FEniCS to other Simulation Software" + type: "article" + authors: + - affiliation: "Technical University of Munich" + family-names: Rodenberg + given-names: Benjamin + orcid: "https://orcid.org/0000-0002-3116-0133" + - affiliation: "University of Stuttgart" + family-names: Desai + given-names: Ishaan + orcid: "https://orcid.org/0000-0002-2552-7509" + - family-names: Hertrich + given-names: Richard + orcid: "https://orcid.org/0000-0003-1722-2841" + - affiliation: "University of Stuttgart" + family-names: Jaust + given-names: Alexander + orcid: "https://orcid.org/0000-0002-6082-105X" + - affiliation: "University of Stuttgart" + family-names: Uekermann + given-names: Benjamin + orcid: "https://orcid.org/0000-0002-1314-9969" + doi: 10.1016/j.softx.2021.100807 + journal: "SoftwareX" + volume: 16 + pages: 100807 + year: 2021 diff --git a/docs/ReleaseGuide.md b/docs/ReleaseGuide.md index c62583a5..1a6093c4 100644 --- a/docs/ReleaseGuide.md +++ b/docs/ReleaseGuide.md @@ -2,14 +2,17 @@ Before starting this process make sure to check that all relevant changes are included in the `CHANGELOG.md`. The developer who is releasing a new version of FEniCS-preCICE adapter is expected to follow this workflow: -1. If it does not already exist, create a release branch with the version number of the planned release. Use develop as base for the branch. `git checkout develop`; `git checkout -b fenics-adapter-vX.X.X`. Perform the following steps only on the release branch, if not indicated differently. +1. If it does not already exist, create a release branch with the version number of the planned release. Use develop as base for the branch. `git checkout develop`; `git checkout -b fenics-adapter-vX.X.X`. Perform the following steps only on the release branch, if not indicated differently. 2. 
[Open a Pull Request from the branch `fenics-adapter-vX.X.X` to `master`](https://github.com/precice/fenics-adapter/compare) named after the version (i.e. `Release v1.0.0`) and briefly describe the new features of the release in the PR description. 3. Bump the version in the following places: - a) Before merging the PR, make sure to bump the version in `CHANGELOG.md` on `fenics-adapter-vX.X.X` - b) There is no need to bump the version anywhere else, since we use the [python-versioneer](https://github.com/python-versioneer/python-versioneer/) for maintaining the version everywhere else. + a) Before merging the PR, make sure to bump the version in `CHANGELOG.md` on `fenics-adapter-vX.X.X`. + + b) Update the version in `CITATION.cff` and update the release date. + + c) There is no need to bump the version anywhere else, since we use the [python-versioneer](https://github.com/python-versioneer/python-versioneer/) for maintaining the version everywhere else. 4. [Draft a New Release](https://github.com/precice/fenics-adapter/releases/new) in the `Releases` section of the repository page in a web browser. The release tag needs to be the exact version number (i.e.`v1.0.0` or `v1.0.0rc1`, compare to [existing tags](https://github.com/precice/fenics-adapter/tags)). Use `@target:master`. Release title is also the version number (i.e. `v1.0.0` or `v1.0.0rc1`, compare to [existing releases](https://github.com/precice/fenics-adapter/tags)). *Note:* If it is a pre-release then the option *This is a pre-release* needs to be selected at the bottom of the page. Use `@target:fenics-adapter-vX.X.X` for a pre-release, since we will never merge a pre-release into master. From 8143553540b06a66e36c0683d5b74e58be771818 Mon Sep 17 00:00:00 2001 From: valentin-seitz Date: Wed, 9 Aug 2023 18:58:01 +0200 Subject: [PATCH 04/14] Build `latest` docker container from master not `develop` branch (#161) * Let the games begin :) * should still fail? 
* let's try again with a bit more logic * separate dockerfile concerns in pythonbinding_ref and adapter ref * adapt to pythonbindings --------- Co-authored-by: Valentin Seitz Co-authored-by: Ishaan Desai --- .github/workflows/build-docker.yml | 39 +++++++++++++++++---- tools/releasing/packaging/docker/Dockerfile | 8 ++--- 2 files changed, 37 insertions(+), 10 deletions(-) diff --git a/.github/workflows/build-docker.yml b/.github/workflows/build-docker.yml index 0e05525c..6f229cf2 100644 --- a/.github/workflows/build-docker.yml +++ b/.github/workflows/build-docker.yml @@ -1,10 +1,18 @@ name: Update docker image on: - workflow_dispatch: # Trigger by hand from the UI + workflow_dispatch: # Trigger by hand from the UI + inputs: + branch: + type: choice + description: branch to build the container from + options: + - develop + - master push: branches: - develop + - master jobs: build-and-release-docker-image: @@ -13,10 +21,28 @@ jobs: env: docker_username: precice steps: - - name: Get branch name - if: github.event_name != 'pull_request' + - name: Set branch name for manual triggering + if: github.event_name == 'workflow_dispatch' shell: bash - run: echo "branch=$(echo ${GITHUB_REF#refs/heads/} | tr / -)" >> $GITHUB_ENV + run: | + echo "ADAPTER_REF=${{ inputs.branch }}" >> $GITHUB_ENV + - name: Set branch name for push triggering + if: github.event_name != 'pull_request' && github.event_name != 'workflow_dispatch' + shell: bash + run: | + echo "ADAPTER_REF=${{ github.ref_name }}" >> $GITHUB_ENV + - name: Set PYTHON_BINDINGS_REF and the TAG depending on branch + shell: bash + run: | + if [[ '${{ env.ADAPTER_REF }}' == 'master' ]]; then + echo "PYTHON_BINDINGS_REF=latest" >> "$GITHUB_ENV" + echo "TAG=latest" >> "$GITHUB_ENV" + echo "Building TAG: latest" + else + echo "PYTHON_BINDINGS_REF=${{ env.ADAPTER_REF }}" >> "$GITHUB_ENV" + echo "TAG=${{ env.ADAPTER_REF }}" >> "$GITHUB_ENV" + echo "Building TAG: ${{ env.ADAPTER_REF }}" + fi - name: Checkout Repository uses: 
actions/checkout@v2 - name: Set up Docker Buildx @@ -31,6 +57,7 @@ jobs: with: push: true file: "./tools/releasing/packaging/docker/Dockerfile" - tags: ${{ env.docker_username }}/fenics-adapter:${{ env.branch }},${{ env.docker_username }}/fenics-adapter:latest + tags: ${{ env.docker_username }}/fenics-adapter:${{ env.TAG }} build-args: | - branch=${{ env.branch }} + FENICS_ADAPTER_REF=${{ env.ADAPTER_REF }} + PYTHON_BINDINGS_REF=${{ env.PYTHON_BINDINGS_REF }} diff --git a/tools/releasing/packaging/docker/Dockerfile b/tools/releasing/packaging/docker/Dockerfile index 810c6c69..642a22e5 100644 --- a/tools/releasing/packaging/docker/Dockerfile +++ b/tools/releasing/packaging/docker/Dockerfile @@ -1,6 +1,6 @@ # Dockerfile to build a ubuntu image containing the installed Debian package of a release -ARG branch=develop -ARG from=precice/python-bindings:${branch} +ARG PYTHON_BINDINGS_REF=develop +ARG from=precice/python-bindings:${PYTHON_BINDINGS_REF} FROM $from USER root @@ -17,7 +17,7 @@ RUN python3 -m pip install --user --upgrade pip # Rebuild image if force_rebuild after that command ARG CACHEBUST -ARG branch=develop +ARG FENICS_ADAPTER_REF=develop # Building fenics-adapter -RUN python3 -m pip install --user git+https://github.com/precice/fenics-adapter.git@$branch +RUN python3 -m pip install --user git+https://github.com/precice/fenics-adapter.git@$FENICS_ADAPTER_REF From 7fa2f38873b0d74afa9c20abd17c111c7c1008bf Mon Sep 17 00:00:00 2001 From: Ishaan Desai Date: Fri, 11 Aug 2023 13:49:41 -0400 Subject: [PATCH 05/14] Update versioneer to 0.29 --- fenicsprecice/__init__.py | 5 +- fenicsprecice/_version.py | 320 +++++++++--- setup.cfg | 2 +- versioneer.py | 1024 ++++++++++++++++++++++++++----------- 4 files changed, 965 insertions(+), 386 deletions(-) diff --git a/fenicsprecice/__init__.py b/fenicsprecice/__init__.py index 59c289bd..5dce538e 100644 --- a/fenicsprecice/__init__.py +++ b/fenicsprecice/__init__.py @@ -7,6 +7,5 @@ "The FEniCS adapter might not work as 
expected.\n\n") from .fenicsprecice import Adapter -from ._version import get_versions -__version__ = get_versions()['version'] -del get_versions +from . import _version +__version__ = _version.get_versions()['version'] diff --git a/fenicsprecice/_version.py b/fenicsprecice/_version.py index fb2f0691..66afe360 100644 --- a/fenicsprecice/_version.py +++ b/fenicsprecice/_version.py @@ -5,8 +5,9 @@ # directories (produced by setup.py build) will contain a much shorter file # that just contains the computed version number. -# This file is released into the public domain. Generated by -# versioneer-0.19 (https://github.com/python-versioneer/python-versioneer) +# This file is released into the public domain. +# Generated by versioneer-0.29 +# https://github.com/python-versioneer/python-versioneer """Git implementation of _version.py.""" @@ -15,9 +16,11 @@ import re import subprocess import sys +from typing import Any, Callable, Dict, List, Optional, Tuple +import functools -def get_keywords(): +def get_keywords() -> Dict[str, str]: """Get the keywords needed to look up the version information.""" # these strings will be replaced by git during git-archive. 
# setup.py/versioneer.py will grep for the variable names, so they must @@ -33,15 +36,22 @@ def get_keywords(): class VersioneerConfig: """Container for Versioneer configuration parameters.""" + VCS: str + style: str + tag_prefix: str + parentdir_prefix: str + versionfile_source: str + verbose: bool -def get_config(): + +def get_config() -> VersioneerConfig: """Create, populate and return the VersioneerConfig() object.""" # these strings are filled in when 'setup.py versioneer' creates # _version.py cfg = VersioneerConfig() cfg.VCS = "git" cfg.style = "pep440" - cfg.tag_prefix = "" + cfg.tag_prefix = "v" cfg.parentdir_prefix = "fenicsprecice-" cfg.versionfile_source = "fenicsprecice/_version.py" cfg.verbose = False @@ -52,13 +62,13 @@ class NotThisMethod(Exception): """Exception raised if a method is not valid for the current scenario.""" -LONG_VERSION_PY = {} -HANDLERS = {} +LONG_VERSION_PY: Dict[str, str] = {} +HANDLERS: Dict[str, Dict[str, Callable]] = {} -def register_vcs_handler(vcs, method): # decorator +def register_vcs_handler(vcs: str, method: str) -> Callable: # decorator """Create decorator to mark a method as the handler of a VCS.""" - def decorate(f): + def decorate(f: Callable) -> Callable: """Store f in HANDLERS[vcs][method].""" if vcs not in HANDLERS: HANDLERS[vcs] = {} @@ -67,22 +77,35 @@ def decorate(f): return decorate -def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, - env=None): +def run_command( + commands: List[str], + args: List[str], + cwd: Optional[str] = None, + verbose: bool = False, + hide_stderr: bool = False, + env: Optional[Dict[str, str]] = None, +) -> Tuple[Optional[str], Optional[int]]: """Call the given command(s).""" assert isinstance(commands, list) - p = None - for c in commands: + process = None + + popen_kwargs: Dict[str, Any] = {} + if sys.platform == "win32": + # This hides the console window if pythonw.exe is used + startupinfo = subprocess.STARTUPINFO() + startupinfo.dwFlags |= 
subprocess.STARTF_USESHOWWINDOW + popen_kwargs["startupinfo"] = startupinfo + + for command in commands: try: - dispcmd = str([c] + args) + dispcmd = str([command] + args) # remember shell=False, so use git.cmd on windows, not just git - p = subprocess.Popen([c] + args, cwd=cwd, env=env, - stdout=subprocess.PIPE, - stderr=(subprocess.PIPE if hide_stderr - else None)) + process = subprocess.Popen([command] + args, cwd=cwd, env=env, + stdout=subprocess.PIPE, + stderr=(subprocess.PIPE if hide_stderr + else None), **popen_kwargs) break - except EnvironmentError: - e = sys.exc_info()[1] + except OSError as e: if e.errno == errno.ENOENT: continue if verbose: @@ -93,16 +116,20 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, if verbose: print("unable to find command, tried %s" % (commands,)) return None, None - stdout = p.communicate()[0].strip().decode() - if p.returncode != 0: + stdout = process.communicate()[0].strip().decode() + if process.returncode != 0: if verbose: print("unable to run %s (error)" % dispcmd) print("stdout was %s" % stdout) - return None, p.returncode - return stdout, p.returncode + return None, process.returncode + return stdout, process.returncode -def versions_from_parentdir(parentdir_prefix, root, verbose): +def versions_from_parentdir( + parentdir_prefix: str, + root: str, + verbose: bool, +) -> Dict[str, Any]: """Try to determine the version from the parent directory name. 
Source tarballs conventionally unpack into a directory that includes both @@ -111,15 +138,14 @@ def versions_from_parentdir(parentdir_prefix, root, verbose): """ rootdirs = [] - for i in range(3): + for _ in range(3): dirname = os.path.basename(root) if dirname.startswith(parentdir_prefix): return {"version": dirname[len(parentdir_prefix):], "full-revisionid": None, "dirty": False, "error": None, "date": None} - else: - rootdirs.append(root) - root = os.path.dirname(root) # up a level + rootdirs.append(root) + root = os.path.dirname(root) # up a level if verbose: print("Tried directories %s but none started with prefix %s" % @@ -128,39 +154,42 @@ def versions_from_parentdir(parentdir_prefix, root, verbose): @register_vcs_handler("git", "get_keywords") -def git_get_keywords(versionfile_abs): +def git_get_keywords(versionfile_abs: str) -> Dict[str, str]: """Extract version information from the given file.""" # the code embedded in _version.py can just fetch the value of these # keywords. When used from setup.py, we don't want to import _version.py, # so we do it with a regexp instead. This function is not used from # _version.py. 
- keywords = {} + keywords: Dict[str, str] = {} try: - f = open(versionfile_abs, "r") - for line in f.readlines(): - if line.strip().startswith("git_refnames ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["refnames"] = mo.group(1) - if line.strip().startswith("git_full ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["full"] = mo.group(1) - if line.strip().startswith("git_date ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["date"] = mo.group(1) - f.close() - except EnvironmentError: + with open(versionfile_abs, "r") as fobj: + for line in fobj: + if line.strip().startswith("git_refnames ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["refnames"] = mo.group(1) + if line.strip().startswith("git_full ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["full"] = mo.group(1) + if line.strip().startswith("git_date ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["date"] = mo.group(1) + except OSError: pass return keywords @register_vcs_handler("git", "keywords") -def git_versions_from_keywords(keywords, tag_prefix, verbose): +def git_versions_from_keywords( + keywords: Dict[str, str], + tag_prefix: str, + verbose: bool, +) -> Dict[str, Any]: """Get version information from git keywords.""" - if not keywords: - raise NotThisMethod("no keywords at all, weird") + if "refnames" not in keywords: + raise NotThisMethod("Short version file found") date = keywords.get("date") if date is not None: # Use only the last line. Previous lines may contain GPG signature @@ -179,11 +208,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): if verbose: print("keywords are unexpanded, not using") raise NotThisMethod("unexpanded keywords, not a git-archive tarball") - refs = set([r.strip() for r in refnames.strip("()").split(",")]) + refs = {r.strip() for r in refnames.strip("()").split(",")} # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of # just "foo-1.0". 
If we see a "tag: " prefix, prefer those. TAG = "tag: " - tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)]) + tags = {r[len(TAG):] for r in refs if r.startswith(TAG)} if not tags: # Either we're using git < 1.8.3, or there really are no tags. We use # a heuristic: assume all version tags have a digit. The old git %d @@ -192,7 +221,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): # between branches and tags. By ignoring refnames without digits, we # filter out many common branch names like "release" and # "stabilization", as well as "HEAD" and "master". - tags = set([r for r in refs if re.search(r'\d', r)]) + tags = {r for r in refs if re.search(r'\d', r)} if verbose: print("discarding '%s', no digits" % ",".join(refs - tags)) if verbose: @@ -201,6 +230,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): # sorting will prefer e.g. "2.0" over "2.0rc1" if ref.startswith(tag_prefix): r = ref[len(tag_prefix):] + # Filter out refs that exactly match prefix or that don't start + # with a number once the prefix is stripped (mostly a concern + # when prefix is '') + if not re.match(r'\d', r): + continue if verbose: print("picking %s" % r) return {"version": r, @@ -216,7 +250,12 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): @register_vcs_handler("git", "pieces_from_vcs") -def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): +def git_pieces_from_vcs( + tag_prefix: str, + root: str, + verbose: bool, + runner: Callable = run_command +) -> Dict[str, Any]: """Get version from 'git describe' in the root of the source tree. 
This only gets called if the git-archive 'subst' keywords were *not* @@ -227,8 +266,15 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] - out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root, - hide_stderr=True) + # GIT_DIR can interfere with correct operation of Versioneer. + # It may be intended to be passed to the Versioneer-versioned project, + # but that should not change where we get our version from. + env = os.environ.copy() + env.pop("GIT_DIR", None) + runner = functools.partial(runner, env=env) + + _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root, + hide_stderr=not verbose) if rc != 0: if verbose: print("Directory %s not under git control" % root) @@ -236,24 +282,57 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty] # if there isn't one, this yields HEX[-dirty] (no NUM) - describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty", - "--always", "--long", - "--match", "%s*" % tag_prefix], - cwd=root) + describe_out, rc = runner(GITS, [ + "describe", "--tags", "--dirty", "--always", "--long", + "--match", f"{tag_prefix}[[:digit:]]*" + ], cwd=root) # --long was added in git-1.5.5 if describe_out is None: raise NotThisMethod("'git describe' failed") describe_out = describe_out.strip() - full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root) + full_out, rc = runner(GITS, ["rev-parse", "HEAD"], cwd=root) if full_out is None: raise NotThisMethod("'git rev-parse' failed") full_out = full_out.strip() - pieces = {} + pieces: Dict[str, Any] = {} pieces["long"] = full_out pieces["short"] = full_out[:7] # maybe improved later pieces["error"] = None + branch_name, rc = runner(GITS, ["rev-parse", "--abbrev-ref", "HEAD"], + cwd=root) + # --abbrev-ref was added in git-1.6.3 + if rc != 0 or branch_name is None: + raise NotThisMethod("'git 
rev-parse --abbrev-ref' returned error") + branch_name = branch_name.strip() + + if branch_name == "HEAD": + # If we aren't exactly on a branch, pick a branch which represents + # the current commit. If all else fails, we are on a branchless + # commit. + branches, rc = runner(GITS, ["branch", "--contains"], cwd=root) + # --contains was added in git-1.5.4 + if rc != 0 or branches is None: + raise NotThisMethod("'git branch --contains' returned error") + branches = branches.split("\n") + + # Remove the first line if we're running detached + if "(" in branches[0]: + branches.pop(0) + + # Strip off the leading "* " from the list of branches. + branches = [branch[2:] for branch in branches] + if "master" in branches: + branch_name = "master" + elif not branches: + branch_name = None + else: + # Pick the first branch that is returned. Good or bad. + branch_name = branches[0] + + pieces["branch"] = branch_name + # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty] # TAG might have hyphens. git_describe = describe_out @@ -270,7 +349,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): # TAG-NUM-gHEX mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe) if not mo: - # unparseable. Maybe git-describe is misbehaving? + # unparsable. Maybe git-describe is misbehaving? 
pieces["error"] = ("unable to parse git-describe output: '%s'" % describe_out) return pieces @@ -295,13 +374,11 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): else: # HEX: no tags pieces["closest-tag"] = None - count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"], - cwd=root) - pieces["distance"] = int(count_out) # total number of commits + out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root) + pieces["distance"] = len(out.split()) # total number of commits # commit date: see ISO-8601 comment in git_versions_from_keywords() - date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"], - cwd=root)[0].strip() + date = runner(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[0].strip() # Use only the last line. Previous lines may contain GPG signature # information. date = date.splitlines()[-1] @@ -310,14 +387,14 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): return pieces -def plus_or_dot(pieces): +def plus_or_dot(pieces: Dict[str, Any]) -> str: """Return a + if we don't already have one, else return a .""" if "+" in pieces.get("closest-tag", ""): return "." return "+" -def render_pep440(pieces): +def render_pep440(pieces: Dict[str, Any]) -> str: """Build up version string, with post-release "local version identifier". Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you @@ -342,23 +419,71 @@ def render_pep440(pieces): return rendered -def render_pep440_pre(pieces): - """TAG[.post0.devDISTANCE] -- No -dirty. +def render_pep440_branch(pieces: Dict[str, Any]) -> str: + """TAG[[.dev0]+DISTANCE.gHEX[.dirty]] . + + The ".dev0" means not master branch. Note that .dev0 sorts backwards + (a feature branch will appear "older" than the master branch). Exceptions: - 1: no tags. 0.post0.devDISTANCE + 1: no tags. 
0[.dev0]+untagged.DISTANCE.gHEX[.dirty] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] + if pieces["distance"] or pieces["dirty"]: + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += plus_or_dot(pieces) + rendered += "%d.g%s" % (pieces["distance"], pieces["short"]) + if pieces["dirty"]: + rendered += ".dirty" + else: + # exception #1 + rendered = "0" + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += "+untagged.%d.g%s" % (pieces["distance"], + pieces["short"]) + if pieces["dirty"]: + rendered += ".dirty" + return rendered + + +def pep440_split_post(ver: str) -> Tuple[str, Optional[int]]: + """Split pep440 version string at the post-release segment. + + Returns the release segments before the post-release and the + post-release version number (or -1 if no post-release segment is present). + """ + vc = str.split(ver, ".post") + return vc[0], int(vc[1] or 0) if len(vc) == 2 else None + + +def render_pep440_pre(pieces: Dict[str, Any]) -> str: + """TAG[.postN.devDISTANCE] -- No -dirty. + + Exceptions: + 1: no tags. 0.post0.devDISTANCE + """ + if pieces["closest-tag"]: if pieces["distance"]: - rendered += ".post0.dev%d" % pieces["distance"] + # update the post release segment + tag_version, post_version = pep440_split_post(pieces["closest-tag"]) + rendered = tag_version + if post_version is not None: + rendered += ".post%d.dev%d" % (post_version + 1, pieces["distance"]) + else: + rendered += ".post0.dev%d" % (pieces["distance"]) + else: + # no commits, use the tag as the version + rendered = pieces["closest-tag"] else: # exception #1 rendered = "0.post0.dev%d" % pieces["distance"] return rendered -def render_pep440_post(pieces): +def render_pep440_post(pieces: Dict[str, Any]) -> str: """TAG[.postDISTANCE[.dev0]+gHEX] . The ".dev0" means dirty. 
Note that .dev0 sorts backwards @@ -385,7 +510,36 @@ def render_pep440_post(pieces): return rendered -def render_pep440_old(pieces): +def render_pep440_post_branch(pieces: Dict[str, Any]) -> str: + """TAG[.postDISTANCE[.dev0]+gHEX[.dirty]] . + + The ".dev0" means not master branch. + + Exceptions: + 1: no tags. 0.postDISTANCE[.dev0]+gHEX[.dirty] + """ + if pieces["closest-tag"]: + rendered = pieces["closest-tag"] + if pieces["distance"] or pieces["dirty"]: + rendered += ".post%d" % pieces["distance"] + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += plus_or_dot(pieces) + rendered += "g%s" % pieces["short"] + if pieces["dirty"]: + rendered += ".dirty" + else: + # exception #1 + rendered = "0.post%d" % pieces["distance"] + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += "+g%s" % pieces["short"] + if pieces["dirty"]: + rendered += ".dirty" + return rendered + + +def render_pep440_old(pieces: Dict[str, Any]) -> str: """TAG[.postDISTANCE[.dev0]] . The ".dev0" means dirty. @@ -407,7 +561,7 @@ def render_pep440_old(pieces): return rendered -def render_git_describe(pieces): +def render_git_describe(pieces: Dict[str, Any]) -> str: """TAG[-DISTANCE-gHEX][-dirty]. Like 'git describe --tags --dirty --always'. @@ -427,7 +581,7 @@ def render_git_describe(pieces): return rendered -def render_git_describe_long(pieces): +def render_git_describe_long(pieces: Dict[str, Any]) -> str: """TAG-DISTANCE-gHEX[-dirty]. Like 'git describe --tags --dirty --always -long'. 
@@ -447,7 +601,7 @@ def render_git_describe_long(pieces): return rendered -def render(pieces, style): +def render(pieces: Dict[str, Any], style: str) -> Dict[str, Any]: """Render the given version pieces into the requested style.""" if pieces["error"]: return {"version": "unknown", @@ -461,10 +615,14 @@ def render(pieces, style): if style == "pep440": rendered = render_pep440(pieces) + elif style == "pep440-branch": + rendered = render_pep440_branch(pieces) elif style == "pep440-pre": rendered = render_pep440_pre(pieces) elif style == "pep440-post": rendered = render_pep440_post(pieces) + elif style == "pep440-post-branch": + rendered = render_pep440_post_branch(pieces) elif style == "pep440-old": rendered = render_pep440_old(pieces) elif style == "git-describe": @@ -479,7 +637,7 @@ def render(pieces, style): "date": pieces.get("date")} -def get_versions(): +def get_versions() -> Dict[str, Any]: """Get version information or return default if unable to do so.""" # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have # __file__, we can work backwards from there to the root. Some @@ -500,7 +658,7 @@ def get_versions(): # versionfile_source is the relative path from the top of the source # tree (where the .git directory might live) to this file. Invert # this to find the root from __file__. 
-        for i in cfg.versionfile_source.split('/'):
+        for _ in cfg.versionfile_source.split('/'):
             root = os.path.dirname(root)
     except NameError:
         return {"version": "0+unknown", "full-revisionid": None,
diff --git a/setup.cfg b/setup.cfg
index b0126dc4..d39abe7e 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -17,5 +17,5 @@ VCS = git
 style = pep440
 versionfile_source = fenicsprecice/_version.py
 versionfile_build = fenicsprecice/_version.py
-tag_prefix =
+tag_prefix = v
 parentdir_prefix = fenicsprecice-
diff --git a/versioneer.py b/versioneer.py
index 1040c218..1e3753e6 100644
--- a/versioneer.py
+++ b/versioneer.py
@@ -1,5 +1,5 @@
-# Version: 0.19
+# Version: 0.29

 """The Versioneer - like a rocketeer, but for versions.

@@ -9,12 +9,12 @@
 * like a rocketeer, but for versions!
 * https://github.com/python-versioneer/python-versioneer
 * Brian Warner
-* License: Public Domain
-* Compatible with: Python 3.6, 3.7, 3.8, 3.9 and pypy3
+* License: Public Domain (Unlicense)
+* Compatible with: Python 3.7, 3.8, 3.9, 3.10, 3.11 and pypy3
 * [![Latest Version][pypi-image]][pypi-url]
 * [![Build Status][travis-image]][travis-url]

-This is a tool for managing a recorded version number in distutils-based
+This is a tool for managing a recorded version number in setuptools-based
 python projects. The goal is to remove the tedious and error-prone "update
 the embedded version string" step from your release process. Making a new
 release should be as easy as recording a new tag in your version-control
@@ -23,10 +23,38 @@

 ## Quick Install

+Versioneer provides two installation modes. The "classic" vendored mode installs
+a copy of versioneer into your repository. The experimental build-time dependency mode
+is intended to allow you to skip this step and simplify the process of upgrading.
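[Editor's note, not part of the patch: the `setup.cfg` hunk above changes `tag_prefix` from empty to `v`, so versioneer will now look for tags like `v1.4.0` and strip the prefix when deriving the version. A minimal standalone sketch of that stripping and filtering behavior, mirroring the digit check this patch adds in `git_versions_from_keywords` — not versioneer's actual API, the function name here is made up for illustration:]

```python
import re

def strip_tag_prefix(ref: str, tag_prefix: str):
    """Return the version part of a tag ref, or None if it does not qualify.

    Mirrors the patch's keyword-expansion logic: refs that lack the prefix,
    or whose remainder does not start with a digit, are skipped.
    """
    if not ref.startswith(tag_prefix):
        return None
    r = ref[len(tag_prefix):]
    # the patch filters out refs with no leading digit after the prefix
    if not re.match(r'\d', r):
        return None
    return r

# With tag_prefix = "v" (as now set in setup.cfg):
# strip_tag_prefix("v1.4.0", "v") -> "1.4.0"
# strip_tag_prefix("version", "v") -> None  (no digit after the prefix)
```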
+ +### Vendored mode + +* `pip install versioneer` to somewhere in your $PATH + * A [conda-forge recipe](https://github.com/conda-forge/versioneer-feedstock) is + available, so you can also use `conda install -c conda-forge versioneer` +* add a `[tool.versioneer]` section to your `pyproject.toml` or a + `[versioneer]` section to your `setup.cfg` (see [Install](INSTALL.md)) + * Note that you will need to add `tomli; python_version < "3.11"` to your + build-time dependencies if you use `pyproject.toml` +* run `versioneer install --vendor` in your source tree, commit the results +* verify version information with `python setup.py version` + +### Build-time dependency mode + * `pip install versioneer` to somewhere in your $PATH -* add a `[versioneer]` section to your setup.cfg (see [Install](INSTALL.md)) -* run `versioneer install` in your source tree, commit the results -* Verify version information with `python setup.py version` + * A [conda-forge recipe](https://github.com/conda-forge/versioneer-feedstock) is + available, so you can also use `conda install -c conda-forge versioneer` +* add a `[tool.versioneer]` section to your `pyproject.toml` or a + `[versioneer]` section to your `setup.cfg` (see [Install](INSTALL.md)) +* add `versioneer` (with `[toml]` extra, if configuring in `pyproject.toml`) + to the `requires` key of the `build-system` table in `pyproject.toml`: + ```toml + [build-system] + requires = ["setuptools", "versioneer[toml]"] + build-backend = "setuptools.build_meta" + ``` +* run `versioneer install --no-vendor` in your source tree, commit the results +* verify version information with `python setup.py version` ## Version Identifiers @@ -231,9 +259,10 @@ To upgrade your project to a new release of Versioneer, do the following: * install the new Versioneer (`pip install -U versioneer` or equivalent) -* edit `setup.cfg`, if necessary, to include any new configuration settings - indicated by the release notes. 
See [UPGRADING](./UPGRADING.md) for details.
-* re-run `versioneer install` in your source tree, to replace
+* edit `setup.cfg` and `pyproject.toml`, if necessary,
+  to include any new configuration settings indicated by the release notes.
+  See [UPGRADING](./UPGRADING.md) for details.
+* re-run `versioneer install --[no-]vendor` in your source tree, to replace
   `SRC/_version.py`
 * commit any changed files

@@ -256,14 +285,15 @@ dependency
 * [minver](https://github.com/jbweston/miniver) - a lightweight reimplementation
   of versioneer
+* [versioningit](https://github.com/jwodder/versioningit) - a PEP 518-based setuptools
+  plugin

 ## License

 To make Versioneer easier to embed, all its code is dedicated to the public
 domain. The `_version.py` that it creates is also in the public domain.
-Specifically, both are released under the Creative Commons "Public Domain
-Dedication" license (CC0-1.0), as described in
-https://creativecommons.org/publicdomain/zero/1.0/ .
+Specifically, both are released under the "Unlicense", as described in
+https://unlicense.org/.
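[Editor's note, not part of the patch: the upgrade steps above assume a valid `[versioneer]` section in `setup.cfg`. Mirroring the `configparser` fallback path that this patch keeps in `get_config_from_root`, reading that section with the standard library looks roughly like the following — the config values are the ones this patch sets for fenicsprecice, but the snippet itself is only a sketch:]

```python
import configparser

SETUP_CFG = """\
[versioneer]
VCS = git
style = pep440
versionfile_source = fenicsprecice/_version.py
versionfile_build = fenicsprecice/_version.py
tag_prefix = v
parentdir_prefix = fenicsprecice-
"""

parser = configparser.ConfigParser()
parser.read_string(SETUP_CFG)
# versioneer reads VCS first; a missing option raises NoOptionError
vcs = parser.get("versioneer", "VCS")
section = parser["versioneer"]
# options absent from the file come back as None via .get()
verbose = section.get("verbose")
```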
[pypi-image]: https://img.shields.io/pypi/v/versioneer.svg [pypi-url]: https://pypi.python.org/pypi/versioneer/ @@ -272,6 +302,11 @@ [travis-url]: https://travis-ci.com/github/python-versioneer/python-versioneer """ +# pylint:disable=invalid-name,import-outside-toplevel,missing-function-docstring +# pylint:disable=missing-class-docstring,too-many-branches,too-many-statements +# pylint:disable=raise-missing-from,too-many-lines,too-many-locals,import-error +# pylint:disable=too-few-public-methods,redefined-outer-name,consider-using-with +# pylint:disable=attribute-defined-outside-init,too-many-arguments import configparser import errno @@ -280,13 +315,34 @@ import re import subprocess import sys +from pathlib import Path +from typing import Any, Callable, cast, Dict, List, Optional, Tuple, Union +from typing import NoReturn +import functools + +have_tomllib = True +if sys.version_info >= (3, 11): + import tomllib +else: + try: + import tomli as tomllib + except ImportError: + have_tomllib = False class VersioneerConfig: """Container for Versioneer configuration parameters.""" + VCS: str + style: str + tag_prefix: str + versionfile_source: str + versionfile_build: Optional[str] + parentdir_prefix: Optional[str] + verbose: Optional[bool] -def get_root(): + +def get_root() -> str: """Get the project root directory. We require that all commands are run from the project root, i.e. 
the @@ -294,13 +350,23 @@ def get_root(): """ root = os.path.realpath(os.path.abspath(os.getcwd())) setup_py = os.path.join(root, "setup.py") + pyproject_toml = os.path.join(root, "pyproject.toml") versioneer_py = os.path.join(root, "versioneer.py") - if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)): + if not ( + os.path.exists(setup_py) + or os.path.exists(pyproject_toml) + or os.path.exists(versioneer_py) + ): # allow 'python path/to/setup.py COMMAND' root = os.path.dirname(os.path.realpath(os.path.abspath(sys.argv[0]))) setup_py = os.path.join(root, "setup.py") + pyproject_toml = os.path.join(root, "pyproject.toml") versioneer_py = os.path.join(root, "versioneer.py") - if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)): + if not ( + os.path.exists(setup_py) + or os.path.exists(pyproject_toml) + or os.path.exists(versioneer_py) + ): err = ("Versioneer was unable to run the project root directory. " "Versioneer requires setup.py to be executed from " "its immediate directory (like 'python setup.py COMMAND'), " @@ -314,43 +380,62 @@ def get_root(): # module-import table will cache the first one. So we can't use # os.path.dirname(__file__), as that will find whichever # versioneer.py was first imported, even in later projects. 
- me = os.path.realpath(os.path.abspath(__file__)) - me_dir = os.path.normcase(os.path.splitext(me)[0]) + my_path = os.path.realpath(os.path.abspath(__file__)) + me_dir = os.path.normcase(os.path.splitext(my_path)[0]) vsr_dir = os.path.normcase(os.path.splitext(versioneer_py)[0]) - if me_dir != vsr_dir: + if me_dir != vsr_dir and "VERSIONEER_PEP518" not in globals(): print("Warning: build in %s is using versioneer.py from %s" - % (os.path.dirname(me), versioneer_py)) + % (os.path.dirname(my_path), versioneer_py)) except NameError: pass return root -def get_config_from_root(root): +def get_config_from_root(root: str) -> VersioneerConfig: """Read the project setup.cfg file to determine Versioneer config.""" - # This might raise EnvironmentError (if setup.cfg is missing), or + # This might raise OSError (if setup.cfg is missing), or # configparser.NoSectionError (if it lacks a [versioneer] section), or # configparser.NoOptionError (if it lacks "VCS="). See the docstring at # the top of versioneer.py for instructions on writing your setup.cfg . 
- setup_cfg = os.path.join(root, "setup.cfg") - parser = configparser.ConfigParser() - with open(setup_cfg, "r") as f: - parser.read_file(f) - VCS = parser.get("versioneer", "VCS") # mandatory - - def get(parser, name): - if parser.has_option("versioneer", name): - return parser.get("versioneer", name) - return None + root_pth = Path(root) + pyproject_toml = root_pth / "pyproject.toml" + setup_cfg = root_pth / "setup.cfg" + section: Union[Dict[str, Any], configparser.SectionProxy, None] = None + if pyproject_toml.exists() and have_tomllib: + try: + with open(pyproject_toml, 'rb') as fobj: + pp = tomllib.load(fobj) + section = pp['tool']['versioneer'] + except (tomllib.TOMLDecodeError, KeyError) as e: + print(f"Failed to load config from {pyproject_toml}: {e}") + print("Try to load it from setup.cfg") + if not section: + parser = configparser.ConfigParser() + with open(setup_cfg) as cfg_file: + parser.read_file(cfg_file) + parser.get("versioneer", "VCS") # raise error if missing + + section = parser["versioneer"] + + # `cast`` really shouldn't be used, but its simplest for the + # common VersioneerConfig users at the moment. 
We verify against + # `None` values elsewhere where it matters + cfg = VersioneerConfig() - cfg.VCS = VCS - cfg.style = get(parser, "style") or "" - cfg.versionfile_source = get(parser, "versionfile_source") - cfg.versionfile_build = get(parser, "versionfile_build") - cfg.tag_prefix = get(parser, "tag_prefix") - if cfg.tag_prefix in ("''", '""'): + cfg.VCS = section['VCS'] + cfg.style = section.get("style", "") + cfg.versionfile_source = cast(str, section.get("versionfile_source")) + cfg.versionfile_build = section.get("versionfile_build") + cfg.tag_prefix = cast(str, section.get("tag_prefix")) + if cfg.tag_prefix in ("''", '""', None): cfg.tag_prefix = "" - cfg.parentdir_prefix = get(parser, "parentdir_prefix") - cfg.verbose = get(parser, "verbose") + cfg.parentdir_prefix = section.get("parentdir_prefix") + if isinstance(section, configparser.SectionProxy): + # Make sure configparser translates to bool + cfg.verbose = section.getboolean("verbose") + else: + cfg.verbose = section.get("verbose") + return cfg @@ -359,37 +444,48 @@ class NotThisMethod(Exception): # these dictionaries contain VCS-specific tools -LONG_VERSION_PY = {} -HANDLERS = {} +LONG_VERSION_PY: Dict[str, str] = {} +HANDLERS: Dict[str, Dict[str, Callable]] = {} -def register_vcs_handler(vcs, method): # decorator +def register_vcs_handler(vcs: str, method: str) -> Callable: # decorator """Create decorator to mark a method as the handler of a VCS.""" - def decorate(f): + def decorate(f: Callable) -> Callable: """Store f in HANDLERS[vcs][method].""" - if vcs not in HANDLERS: - HANDLERS[vcs] = {} - HANDLERS[vcs][method] = f + HANDLERS.setdefault(vcs, {})[method] = f return f return decorate -def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, - env=None): +def run_command( + commands: List[str], + args: List[str], + cwd: Optional[str] = None, + verbose: bool = False, + hide_stderr: bool = False, + env: Optional[Dict[str, str]] = None, +) -> Tuple[Optional[str], Optional[int]]: 
"""Call the given command(s).""" assert isinstance(commands, list) - p = None - for c in commands: + process = None + + popen_kwargs: Dict[str, Any] = {} + if sys.platform == "win32": + # This hides the console window if pythonw.exe is used + startupinfo = subprocess.STARTUPINFO() + startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW + popen_kwargs["startupinfo"] = startupinfo + + for command in commands: try: - dispcmd = str([c] + args) + dispcmd = str([command] + args) # remember shell=False, so use git.cmd on windows, not just git - p = subprocess.Popen([c] + args, cwd=cwd, env=env, - stdout=subprocess.PIPE, - stderr=(subprocess.PIPE if hide_stderr - else None)) + process = subprocess.Popen([command] + args, cwd=cwd, env=env, + stdout=subprocess.PIPE, + stderr=(subprocess.PIPE if hide_stderr + else None), **popen_kwargs) break - except EnvironmentError: - e = sys.exc_info()[1] + except OSError as e: if e.errno == errno.ENOENT: continue if verbose: @@ -400,13 +496,13 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, if verbose: print("unable to find command, tried %s" % (commands,)) return None, None - stdout = p.communicate()[0].strip().decode() - if p.returncode != 0: + stdout = process.communicate()[0].strip().decode() + if process.returncode != 0: if verbose: print("unable to run %s (error)" % dispcmd) print("stdout was %s" % stdout) - return None, p.returncode - return stdout, p.returncode + return None, process.returncode + return stdout, process.returncode LONG_VERSION_PY['git'] = r''' @@ -416,8 +512,9 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, # directories (produced by setup.py build) will contain a much shorter file # that just contains the computed version number. -# This file is released into the public domain. Generated by -# versioneer-0.19 (https://github.com/python-versioneer/python-versioneer) +# This file is released into the public domain. 
+# Generated by versioneer-0.29 +# https://github.com/python-versioneer/python-versioneer """Git implementation of _version.py.""" @@ -426,9 +523,11 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, import re import subprocess import sys +from typing import Any, Callable, Dict, List, Optional, Tuple +import functools -def get_keywords(): +def get_keywords() -> Dict[str, str]: """Get the keywords needed to look up the version information.""" # these strings will be replaced by git during git-archive. # setup.py/versioneer.py will grep for the variable names, so they must @@ -444,8 +543,15 @@ def get_keywords(): class VersioneerConfig: """Container for Versioneer configuration parameters.""" + VCS: str + style: str + tag_prefix: str + parentdir_prefix: str + versionfile_source: str + verbose: bool + -def get_config(): +def get_config() -> VersioneerConfig: """Create, populate and return the VersioneerConfig() object.""" # these strings are filled in when 'setup.py versioneer' creates # _version.py @@ -463,13 +569,13 @@ class NotThisMethod(Exception): """Exception raised if a method is not valid for the current scenario.""" -LONG_VERSION_PY = {} -HANDLERS = {} +LONG_VERSION_PY: Dict[str, str] = {} +HANDLERS: Dict[str, Dict[str, Callable]] = {} -def register_vcs_handler(vcs, method): # decorator +def register_vcs_handler(vcs: str, method: str) -> Callable: # decorator """Create decorator to mark a method as the handler of a VCS.""" - def decorate(f): + def decorate(f: Callable) -> Callable: """Store f in HANDLERS[vcs][method].""" if vcs not in HANDLERS: HANDLERS[vcs] = {} @@ -478,22 +584,35 @@ def decorate(f): return decorate -def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, - env=None): +def run_command( + commands: List[str], + args: List[str], + cwd: Optional[str] = None, + verbose: bool = False, + hide_stderr: bool = False, + env: Optional[Dict[str, str]] = None, +) -> Tuple[Optional[str], Optional[int]]: 
"""Call the given command(s).""" assert isinstance(commands, list) - p = None - for c in commands: + process = None + + popen_kwargs: Dict[str, Any] = {} + if sys.platform == "win32": + # This hides the console window if pythonw.exe is used + startupinfo = subprocess.STARTUPINFO() + startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW + popen_kwargs["startupinfo"] = startupinfo + + for command in commands: try: - dispcmd = str([c] + args) + dispcmd = str([command] + args) # remember shell=False, so use git.cmd on windows, not just git - p = subprocess.Popen([c] + args, cwd=cwd, env=env, - stdout=subprocess.PIPE, - stderr=(subprocess.PIPE if hide_stderr - else None)) + process = subprocess.Popen([command] + args, cwd=cwd, env=env, + stdout=subprocess.PIPE, + stderr=(subprocess.PIPE if hide_stderr + else None), **popen_kwargs) break - except EnvironmentError: - e = sys.exc_info()[1] + except OSError as e: if e.errno == errno.ENOENT: continue if verbose: @@ -504,16 +623,20 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, if verbose: print("unable to find command, tried %%s" %% (commands,)) return None, None - stdout = p.communicate()[0].strip().decode() - if p.returncode != 0: + stdout = process.communicate()[0].strip().decode() + if process.returncode != 0: if verbose: print("unable to run %%s (error)" %% dispcmd) print("stdout was %%s" %% stdout) - return None, p.returncode - return stdout, p.returncode + return None, process.returncode + return stdout, process.returncode -def versions_from_parentdir(parentdir_prefix, root, verbose): +def versions_from_parentdir( + parentdir_prefix: str, + root: str, + verbose: bool, +) -> Dict[str, Any]: """Try to determine the version from the parent directory name. 
Source tarballs conventionally unpack into a directory that includes both @@ -522,15 +645,14 @@ def versions_from_parentdir(parentdir_prefix, root, verbose): """ rootdirs = [] - for i in range(3): + for _ in range(3): dirname = os.path.basename(root) if dirname.startswith(parentdir_prefix): return {"version": dirname[len(parentdir_prefix):], "full-revisionid": None, "dirty": False, "error": None, "date": None} - else: - rootdirs.append(root) - root = os.path.dirname(root) # up a level + rootdirs.append(root) + root = os.path.dirname(root) # up a level if verbose: print("Tried directories %%s but none started with prefix %%s" %% @@ -539,39 +661,42 @@ def versions_from_parentdir(parentdir_prefix, root, verbose): @register_vcs_handler("git", "get_keywords") -def git_get_keywords(versionfile_abs): +def git_get_keywords(versionfile_abs: str) -> Dict[str, str]: """Extract version information from the given file.""" # the code embedded in _version.py can just fetch the value of these # keywords. When used from setup.py, we don't want to import _version.py, # so we do it with a regexp instead. This function is not used from # _version.py. 
- keywords = {} + keywords: Dict[str, str] = {} try: - f = open(versionfile_abs, "r") - for line in f.readlines(): - if line.strip().startswith("git_refnames ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["refnames"] = mo.group(1) - if line.strip().startswith("git_full ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["full"] = mo.group(1) - if line.strip().startswith("git_date ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["date"] = mo.group(1) - f.close() - except EnvironmentError: + with open(versionfile_abs, "r") as fobj: + for line in fobj: + if line.strip().startswith("git_refnames ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["refnames"] = mo.group(1) + if line.strip().startswith("git_full ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["full"] = mo.group(1) + if line.strip().startswith("git_date ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["date"] = mo.group(1) + except OSError: pass return keywords @register_vcs_handler("git", "keywords") -def git_versions_from_keywords(keywords, tag_prefix, verbose): +def git_versions_from_keywords( + keywords: Dict[str, str], + tag_prefix: str, + verbose: bool, +) -> Dict[str, Any]: """Get version information from git keywords.""" - if not keywords: - raise NotThisMethod("no keywords at all, weird") + if "refnames" not in keywords: + raise NotThisMethod("Short version file found") date = keywords.get("date") if date is not None: # Use only the last line. Previous lines may contain GPG signature @@ -590,11 +715,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): if verbose: print("keywords are unexpanded, not using") raise NotThisMethod("unexpanded keywords, not a git-archive tarball") - refs = set([r.strip() for r in refnames.strip("()").split(",")]) + refs = {r.strip() for r in refnames.strip("()").split(",")} # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of # just "foo-1.0". 
If we see a "tag: " prefix, prefer those. TAG = "tag: " - tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)]) + tags = {r[len(TAG):] for r in refs if r.startswith(TAG)} if not tags: # Either we're using git < 1.8.3, or there really are no tags. We use # a heuristic: assume all version tags have a digit. The old git %%d @@ -603,7 +728,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): # between branches and tags. By ignoring refnames without digits, we # filter out many common branch names like "release" and # "stabilization", as well as "HEAD" and "master". - tags = set([r for r in refs if re.search(r'\d', r)]) + tags = {r for r in refs if re.search(r'\d', r)} if verbose: print("discarding '%%s', no digits" %% ",".join(refs - tags)) if verbose: @@ -612,6 +737,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): # sorting will prefer e.g. "2.0" over "2.0rc1" if ref.startswith(tag_prefix): r = ref[len(tag_prefix):] + # Filter out refs that exactly match prefix or that don't start + # with a number once the prefix is stripped (mostly a concern + # when prefix is '') + if not re.match(r'\d', r): + continue if verbose: print("picking %%s" %% r) return {"version": r, @@ -627,7 +757,12 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): @register_vcs_handler("git", "pieces_from_vcs") -def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): +def git_pieces_from_vcs( + tag_prefix: str, + root: str, + verbose: bool, + runner: Callable = run_command +) -> Dict[str, Any]: """Get version from 'git describe' in the root of the source tree. 
This only gets called if the git-archive 'subst' keywords were *not* @@ -638,8 +773,15 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] - out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root, - hide_stderr=True) + # GIT_DIR can interfere with correct operation of Versioneer. + # It may be intended to be passed to the Versioneer-versioned project, + # but that should not change where we get our version from. + env = os.environ.copy() + env.pop("GIT_DIR", None) + runner = functools.partial(runner, env=env) + + _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root, + hide_stderr=not verbose) if rc != 0: if verbose: print("Directory %%s not under git control" %% root) @@ -647,24 +789,57 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty] # if there isn't one, this yields HEX[-dirty] (no NUM) - describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty", - "--always", "--long", - "--match", "%%s*" %% tag_prefix], - cwd=root) + describe_out, rc = runner(GITS, [ + "describe", "--tags", "--dirty", "--always", "--long", + "--match", f"{tag_prefix}[[:digit:]]*" + ], cwd=root) # --long was added in git-1.5.5 if describe_out is None: raise NotThisMethod("'git describe' failed") describe_out = describe_out.strip() - full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root) + full_out, rc = runner(GITS, ["rev-parse", "HEAD"], cwd=root) if full_out is None: raise NotThisMethod("'git rev-parse' failed") full_out = full_out.strip() - pieces = {} + pieces: Dict[str, Any] = {} pieces["long"] = full_out pieces["short"] = full_out[:7] # maybe improved later pieces["error"] = None + branch_name, rc = runner(GITS, ["rev-parse", "--abbrev-ref", "HEAD"], + cwd=root) + # --abbrev-ref was added in git-1.6.3 + if rc != 0 or branch_name is None: + raise NotThisMethod("'git 
rev-parse --abbrev-ref' returned error") + branch_name = branch_name.strip() + + if branch_name == "HEAD": + # If we aren't exactly on a branch, pick a branch which represents + # the current commit. If all else fails, we are on a branchless + # commit. + branches, rc = runner(GITS, ["branch", "--contains"], cwd=root) + # --contains was added in git-1.5.4 + if rc != 0 or branches is None: + raise NotThisMethod("'git branch --contains' returned error") + branches = branches.split("\n") + + # Remove the first line if we're running detached + if "(" in branches[0]: + branches.pop(0) + + # Strip off the leading "* " from the list of branches. + branches = [branch[2:] for branch in branches] + if "master" in branches: + branch_name = "master" + elif not branches: + branch_name = None + else: + # Pick the first branch that is returned. Good or bad. + branch_name = branches[0] + + pieces["branch"] = branch_name + # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty] # TAG might have hyphens. git_describe = describe_out @@ -681,7 +856,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): # TAG-NUM-gHEX mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe) if not mo: - # unparseable. Maybe git-describe is misbehaving? + # unparsable. Maybe git-describe is misbehaving? 
pieces["error"] = ("unable to parse git-describe output: '%%s'" %% describe_out) return pieces @@ -706,13 +881,11 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): else: # HEX: no tags pieces["closest-tag"] = None - count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"], - cwd=root) - pieces["distance"] = int(count_out) # total number of commits + out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root) + pieces["distance"] = len(out.split()) # total number of commits # commit date: see ISO-8601 comment in git_versions_from_keywords() - date = run_command(GITS, ["show", "-s", "--format=%%ci", "HEAD"], - cwd=root)[0].strip() + date = runner(GITS, ["show", "-s", "--format=%%ci", "HEAD"], cwd=root)[0].strip() # Use only the last line. Previous lines may contain GPG signature # information. date = date.splitlines()[-1] @@ -721,14 +894,14 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): return pieces -def plus_or_dot(pieces): +def plus_or_dot(pieces: Dict[str, Any]) -> str: """Return a + if we don't already have one, else return a .""" if "+" in pieces.get("closest-tag", ""): return "." return "+" -def render_pep440(pieces): +def render_pep440(pieces: Dict[str, Any]) -> str: """Build up version string, with post-release "local version identifier". Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you @@ -753,23 +926,71 @@ def render_pep440(pieces): return rendered -def render_pep440_pre(pieces): - """TAG[.post0.devDISTANCE] -- No -dirty. +def render_pep440_branch(pieces: Dict[str, Any]) -> str: + """TAG[[.dev0]+DISTANCE.gHEX[.dirty]] . + + The ".dev0" means not master branch. Note that .dev0 sorts backwards + (a feature branch will appear "older" than the master branch). Exceptions: - 1: no tags. 0.post0.devDISTANCE + 1: no tags. 
0[.dev0]+untagged.DISTANCE.gHEX[.dirty] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] + if pieces["distance"] or pieces["dirty"]: + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += plus_or_dot(pieces) + rendered += "%%d.g%%s" %% (pieces["distance"], pieces["short"]) + if pieces["dirty"]: + rendered += ".dirty" + else: + # exception #1 + rendered = "0" + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += "+untagged.%%d.g%%s" %% (pieces["distance"], + pieces["short"]) + if pieces["dirty"]: + rendered += ".dirty" + return rendered + + +def pep440_split_post(ver: str) -> Tuple[str, Optional[int]]: + """Split pep440 version string at the post-release segment. + + Returns the release segments before the post-release and the + post-release version number (or -1 if no post-release segment is present). + """ + vc = str.split(ver, ".post") + return vc[0], int(vc[1] or 0) if len(vc) == 2 else None + + +def render_pep440_pre(pieces: Dict[str, Any]) -> str: + """TAG[.postN.devDISTANCE] -- No -dirty. + + Exceptions: + 1: no tags. 0.post0.devDISTANCE + """ + if pieces["closest-tag"]: if pieces["distance"]: - rendered += ".post0.dev%%d" %% pieces["distance"] + # update the post release segment + tag_version, post_version = pep440_split_post(pieces["closest-tag"]) + rendered = tag_version + if post_version is not None: + rendered += ".post%%d.dev%%d" %% (post_version + 1, pieces["distance"]) + else: + rendered += ".post0.dev%%d" %% (pieces["distance"]) + else: + # no commits, use the tag as the version + rendered = pieces["closest-tag"] else: # exception #1 rendered = "0.post0.dev%%d" %% pieces["distance"] return rendered -def render_pep440_post(pieces): +def render_pep440_post(pieces: Dict[str, Any]) -> str: """TAG[.postDISTANCE[.dev0]+gHEX] . The ".dev0" means dirty. 
Note that .dev0 sorts backwards @@ -796,7 +1017,36 @@ def render_pep440_post(pieces): return rendered -def render_pep440_old(pieces): +def render_pep440_post_branch(pieces: Dict[str, Any]) -> str: + """TAG[.postDISTANCE[.dev0]+gHEX[.dirty]] . + + The ".dev0" means not master branch. + + Exceptions: + 1: no tags. 0.postDISTANCE[.dev0]+gHEX[.dirty] + """ + if pieces["closest-tag"]: + rendered = pieces["closest-tag"] + if pieces["distance"] or pieces["dirty"]: + rendered += ".post%%d" %% pieces["distance"] + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += plus_or_dot(pieces) + rendered += "g%%s" %% pieces["short"] + if pieces["dirty"]: + rendered += ".dirty" + else: + # exception #1 + rendered = "0.post%%d" %% pieces["distance"] + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += "+g%%s" %% pieces["short"] + if pieces["dirty"]: + rendered += ".dirty" + return rendered + + +def render_pep440_old(pieces: Dict[str, Any]) -> str: """TAG[.postDISTANCE[.dev0]] . The ".dev0" means dirty. @@ -818,7 +1068,7 @@ def render_pep440_old(pieces): return rendered -def render_git_describe(pieces): +def render_git_describe(pieces: Dict[str, Any]) -> str: """TAG[-DISTANCE-gHEX][-dirty]. Like 'git describe --tags --dirty --always'. @@ -838,7 +1088,7 @@ def render_git_describe(pieces): return rendered -def render_git_describe_long(pieces): +def render_git_describe_long(pieces: Dict[str, Any]) -> str: """TAG-DISTANCE-gHEX[-dirty]. Like 'git describe --tags --dirty --always -long'. 
@@ -858,7 +1108,7 @@ def render_git_describe_long(pieces): return rendered -def render(pieces, style): +def render(pieces: Dict[str, Any], style: str) -> Dict[str, Any]: """Render the given version pieces into the requested style.""" if pieces["error"]: return {"version": "unknown", @@ -872,10 +1122,14 @@ def render(pieces, style): if style == "pep440": rendered = render_pep440(pieces) + elif style == "pep440-branch": + rendered = render_pep440_branch(pieces) elif style == "pep440-pre": rendered = render_pep440_pre(pieces) elif style == "pep440-post": rendered = render_pep440_post(pieces) + elif style == "pep440-post-branch": + rendered = render_pep440_post_branch(pieces) elif style == "pep440-old": rendered = render_pep440_old(pieces) elif style == "git-describe": @@ -890,7 +1144,7 @@ def render(pieces, style): "date": pieces.get("date")} -def get_versions(): +def get_versions() -> Dict[str, Any]: """Get version information or return default if unable to do so.""" # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have # __file__, we can work backwards from there to the root. Some @@ -911,7 +1165,7 @@ def get_versions(): # versionfile_source is the relative path from the top of the source # tree (where the .git directory might live) to this file. Invert # this to find the root from __file__. - for i in cfg.versionfile_source.split('/'): + for _ in cfg.versionfile_source.split('/'): root = os.path.dirname(root) except NameError: return {"version": "0+unknown", "full-revisionid": None, @@ -938,39 +1192,42 @@ def get_versions(): @register_vcs_handler("git", "get_keywords") -def git_get_keywords(versionfile_abs): +def git_get_keywords(versionfile_abs: str) -> Dict[str, str]: """Extract version information from the given file.""" # the code embedded in _version.py can just fetch the value of these # keywords. When used from setup.py, we don't want to import _version.py, # so we do it with a regexp instead. This function is not used from # _version.py. 
- keywords = {} + keywords: Dict[str, str] = {} try: - f = open(versionfile_abs, "r") - for line in f.readlines(): - if line.strip().startswith("git_refnames ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["refnames"] = mo.group(1) - if line.strip().startswith("git_full ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["full"] = mo.group(1) - if line.strip().startswith("git_date ="): - mo = re.search(r'=\s*"(.*)"', line) - if mo: - keywords["date"] = mo.group(1) - f.close() - except EnvironmentError: + with open(versionfile_abs, "r") as fobj: + for line in fobj: + if line.strip().startswith("git_refnames ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["refnames"] = mo.group(1) + if line.strip().startswith("git_full ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["full"] = mo.group(1) + if line.strip().startswith("git_date ="): + mo = re.search(r'=\s*"(.*)"', line) + if mo: + keywords["date"] = mo.group(1) + except OSError: pass return keywords @register_vcs_handler("git", "keywords") -def git_versions_from_keywords(keywords, tag_prefix, verbose): +def git_versions_from_keywords( + keywords: Dict[str, str], + tag_prefix: str, + verbose: bool, +) -> Dict[str, Any]: """Get version information from git keywords.""" - if not keywords: - raise NotThisMethod("no keywords at all, weird") + if "refnames" not in keywords: + raise NotThisMethod("Short version file found") date = keywords.get("date") if date is not None: # Use only the last line. Previous lines may contain GPG signature @@ -989,11 +1246,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): if verbose: print("keywords are unexpanded, not using") raise NotThisMethod("unexpanded keywords, not a git-archive tarball") - refs = set([r.strip() for r in refnames.strip("()").split(",")]) + refs = {r.strip() for r in refnames.strip("()").split(",")} # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of # just "foo-1.0". 
If we see a "tag: " prefix, prefer those. TAG = "tag: " - tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)]) + tags = {r[len(TAG):] for r in refs if r.startswith(TAG)} if not tags: # Either we're using git < 1.8.3, or there really are no tags. We use # a heuristic: assume all version tags have a digit. The old git %d @@ -1002,7 +1259,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): # between branches and tags. By ignoring refnames without digits, we # filter out many common branch names like "release" and # "stabilization", as well as "HEAD" and "master". - tags = set([r for r in refs if re.search(r'\d', r)]) + tags = {r for r in refs if re.search(r'\d', r)} if verbose: print("discarding '%s', no digits" % ",".join(refs - tags)) if verbose: @@ -1011,6 +1268,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): # sorting will prefer e.g. "2.0" over "2.0rc1" if ref.startswith(tag_prefix): r = ref[len(tag_prefix):] + # Filter out refs that exactly match prefix or that don't start + # with a number once the prefix is stripped (mostly a concern + # when prefix is '') + if not re.match(r'\d', r): + continue if verbose: print("picking %s" % r) return {"version": r, @@ -1026,7 +1288,12 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose): @register_vcs_handler("git", "pieces_from_vcs") -def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): +def git_pieces_from_vcs( + tag_prefix: str, + root: str, + verbose: bool, + runner: Callable = run_command +) -> Dict[str, Any]: """Get version from 'git describe' in the root of the source tree. 
This only gets called if the git-archive 'subst' keywords were *not* @@ -1037,8 +1304,15 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] - out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root, - hide_stderr=True) + # GIT_DIR can interfere with correct operation of Versioneer. + # It may be intended to be passed to the Versioneer-versioned project, + # but that should not change where we get our version from. + env = os.environ.copy() + env.pop("GIT_DIR", None) + runner = functools.partial(runner, env=env) + + _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root, + hide_stderr=not verbose) if rc != 0: if verbose: print("Directory %s not under git control" % root) @@ -1046,24 +1320,57 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty] # if there isn't one, this yields HEX[-dirty] (no NUM) - describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty", - "--always", "--long", - "--match", "%s*" % tag_prefix], - cwd=root) + describe_out, rc = runner(GITS, [ + "describe", "--tags", "--dirty", "--always", "--long", + "--match", f"{tag_prefix}[[:digit:]]*" + ], cwd=root) # --long was added in git-1.5.5 if describe_out is None: raise NotThisMethod("'git describe' failed") describe_out = describe_out.strip() - full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root) + full_out, rc = runner(GITS, ["rev-parse", "HEAD"], cwd=root) if full_out is None: raise NotThisMethod("'git rev-parse' failed") full_out = full_out.strip() - pieces = {} + pieces: Dict[str, Any] = {} pieces["long"] = full_out pieces["short"] = full_out[:7] # maybe improved later pieces["error"] = None + branch_name, rc = runner(GITS, ["rev-parse", "--abbrev-ref", "HEAD"], + cwd=root) + # --abbrev-ref was added in git-1.6.3 + if rc != 0 or branch_name is None: + raise NotThisMethod("'git 
rev-parse --abbrev-ref' returned error") + branch_name = branch_name.strip() + + if branch_name == "HEAD": + # If we aren't exactly on a branch, pick a branch which represents + # the current commit. If all else fails, we are on a branchless + # commit. + branches, rc = runner(GITS, ["branch", "--contains"], cwd=root) + # --contains was added in git-1.5.4 + if rc != 0 or branches is None: + raise NotThisMethod("'git branch --contains' returned error") + branches = branches.split("\n") + + # Remove the first line if we're running detached + if "(" in branches[0]: + branches.pop(0) + + # Strip off the leading "* " from the list of branches. + branches = [branch[2:] for branch in branches] + if "master" in branches: + branch_name = "master" + elif not branches: + branch_name = None + else: + # Pick the first branch that is returned. Good or bad. + branch_name = branches[0] + + pieces["branch"] = branch_name + # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty] # TAG might have hyphens. git_describe = describe_out @@ -1080,7 +1387,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): # TAG-NUM-gHEX mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe) if not mo: - # unparseable. Maybe git-describe is misbehaving? + # unparsable. Maybe git-describe is misbehaving? 
pieces["error"] = ("unable to parse git-describe output: '%s'" % describe_out) return pieces @@ -1105,13 +1412,11 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): else: # HEX: no tags pieces["closest-tag"] = None - count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"], - cwd=root) - pieces["distance"] = int(count_out) # total number of commits + out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root) + pieces["distance"] = len(out.split()) # total number of commits # commit date: see ISO-8601 comment in git_versions_from_keywords() - date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"], - cwd=root)[0].strip() + date = runner(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[0].strip() # Use only the last line. Previous lines may contain GPG signature # information. date = date.splitlines()[-1] @@ -1120,7 +1425,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): return pieces -def do_vcs_install(manifest_in, versionfile_source, ipy): +def do_vcs_install(versionfile_source: str, ipy: Optional[str]) -> None: """Git-specific installation logic for Versioneer. 
For Git, this means creating/changing .gitattributes to mark _version.py @@ -1129,36 +1434,40 @@ def do_vcs_install(manifest_in, versionfile_source, ipy): GITS = ["git"] if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] - files = [manifest_in, versionfile_source] + files = [versionfile_source] if ipy: files.append(ipy) - try: - me = __file__ - if me.endswith(".pyc") or me.endswith(".pyo"): - me = os.path.splitext(me)[0] + ".py" - versioneer_file = os.path.relpath(me) - except NameError: - versioneer_file = "versioneer.py" - files.append(versioneer_file) + if "VERSIONEER_PEP518" not in globals(): + try: + my_path = __file__ + if my_path.endswith((".pyc", ".pyo")): + my_path = os.path.splitext(my_path)[0] + ".py" + versioneer_file = os.path.relpath(my_path) + except NameError: + versioneer_file = "versioneer.py" + files.append(versioneer_file) present = False try: - f = open(".gitattributes", "r") - for line in f.readlines(): - if line.strip().startswith(versionfile_source): - if "export-subst" in line.strip().split()[1:]: - present = True - f.close() - except EnvironmentError: + with open(".gitattributes", "r") as fobj: + for line in fobj: + if line.strip().startswith(versionfile_source): + if "export-subst" in line.strip().split()[1:]: + present = True + break + except OSError: pass if not present: - f = open(".gitattributes", "a+") - f.write("%s export-subst\n" % versionfile_source) - f.close() + with open(".gitattributes", "a+") as fobj: + fobj.write(f"{versionfile_source} export-subst\n") files.append(".gitattributes") run_command(GITS, ["add", "--"] + files) -def versions_from_parentdir(parentdir_prefix, root, verbose): +def versions_from_parentdir( + parentdir_prefix: str, + root: str, + verbose: bool, +) -> Dict[str, Any]: """Try to determine the version from the parent directory name. 
Source tarballs conventionally unpack into a directory that includes both @@ -1167,15 +1476,14 @@ def versions_from_parentdir(parentdir_prefix, root, verbose): """ rootdirs = [] - for i in range(3): + for _ in range(3): dirname = os.path.basename(root) if dirname.startswith(parentdir_prefix): return {"version": dirname[len(parentdir_prefix):], "full-revisionid": None, "dirty": False, "error": None, "date": None} - else: - rootdirs.append(root) - root = os.path.dirname(root) # up a level + rootdirs.append(root) + root = os.path.dirname(root) # up a level if verbose: print("Tried directories %s but none started with prefix %s" % @@ -1184,7 +1492,7 @@ def versions_from_parentdir(parentdir_prefix, root, verbose): SHORT_VERSION_PY = """ -# This file was generated by 'versioneer.py' (0.19) from +# This file was generated by 'versioneer.py' (0.29) from # revision-control system data, or from the parent directory name of an # unpacked source archive. Distribution tarballs contain a pre-generated copy # of this file. 
@@ -1201,12 +1509,12 @@ def get_versions(): """ -def versions_from_file(filename): +def versions_from_file(filename: str) -> Dict[str, Any]: """Try to determine the version from _version.py if present.""" try: with open(filename) as f: contents = f.read() - except EnvironmentError: + except OSError: raise NotThisMethod("unable to read _version.py") mo = re.search(r"version_json = '''\n(.*)''' # END VERSION_JSON", contents, re.M | re.S) @@ -1218,9 +1526,8 @@ def versions_from_file(filename): return json.loads(mo.group(1)) -def write_to_version_file(filename, versions): +def write_to_version_file(filename: str, versions: Dict[str, Any]) -> None: """Write the given version number to the given _version.py file.""" - os.unlink(filename) contents = json.dumps(versions, sort_keys=True, indent=1, separators=(",", ": ")) with open(filename, "w") as f: @@ -1229,14 +1536,14 @@ def write_to_version_file(filename, versions): print("set %s to '%s'" % (filename, versions["version"])) -def plus_or_dot(pieces): +def plus_or_dot(pieces: Dict[str, Any]) -> str: """Return a + if we don't already have one, else return a .""" if "+" in pieces.get("closest-tag", ""): return "." return "+" -def render_pep440(pieces): +def render_pep440(pieces: Dict[str, Any]) -> str: """Build up version string, with post-release "local version identifier". Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you @@ -1261,23 +1568,71 @@ def render_pep440(pieces): return rendered -def render_pep440_pre(pieces): - """TAG[.post0.devDISTANCE] -- No -dirty. +def render_pep440_branch(pieces: Dict[str, Any]) -> str: + """TAG[[.dev0]+DISTANCE.gHEX[.dirty]] . + + The ".dev0" means not master branch. Note that .dev0 sorts backwards + (a feature branch will appear "older" than the master branch). Exceptions: - 1: no tags. 0.post0.devDISTANCE + 1: no tags. 
0[.dev0]+untagged.DISTANCE.gHEX[.dirty] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] + if pieces["distance"] or pieces["dirty"]: + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += plus_or_dot(pieces) + rendered += "%d.g%s" % (pieces["distance"], pieces["short"]) + if pieces["dirty"]: + rendered += ".dirty" + else: + # exception #1 + rendered = "0" + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += "+untagged.%d.g%s" % (pieces["distance"], + pieces["short"]) + if pieces["dirty"]: + rendered += ".dirty" + return rendered + + +def pep440_split_post(ver: str) -> Tuple[str, Optional[int]]: + """Split pep440 version string at the post-release segment. + + Returns the release segments before the post-release and the + post-release version number (or -1 if no post-release segment is present). + """ + vc = str.split(ver, ".post") + return vc[0], int(vc[1] or 0) if len(vc) == 2 else None + + +def render_pep440_pre(pieces: Dict[str, Any]) -> str: + """TAG[.postN.devDISTANCE] -- No -dirty. + + Exceptions: + 1: no tags. 0.post0.devDISTANCE + """ + if pieces["closest-tag"]: if pieces["distance"]: - rendered += ".post0.dev%d" % pieces["distance"] + # update the post release segment + tag_version, post_version = pep440_split_post(pieces["closest-tag"]) + rendered = tag_version + if post_version is not None: + rendered += ".post%d.dev%d" % (post_version + 1, pieces["distance"]) + else: + rendered += ".post0.dev%d" % (pieces["distance"]) + else: + # no commits, use the tag as the version + rendered = pieces["closest-tag"] else: # exception #1 rendered = "0.post0.dev%d" % pieces["distance"] return rendered -def render_pep440_post(pieces): +def render_pep440_post(pieces: Dict[str, Any]) -> str: """TAG[.postDISTANCE[.dev0]+gHEX] . The ".dev0" means dirty. 
Note that .dev0 sorts backwards @@ -1304,7 +1659,36 @@ def render_pep440_post(pieces): return rendered -def render_pep440_old(pieces): +def render_pep440_post_branch(pieces: Dict[str, Any]) -> str: + """TAG[.postDISTANCE[.dev0]+gHEX[.dirty]] . + + The ".dev0" means not master branch. + + Exceptions: + 1: no tags. 0.postDISTANCE[.dev0]+gHEX[.dirty] + """ + if pieces["closest-tag"]: + rendered = pieces["closest-tag"] + if pieces["distance"] or pieces["dirty"]: + rendered += ".post%d" % pieces["distance"] + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += plus_or_dot(pieces) + rendered += "g%s" % pieces["short"] + if pieces["dirty"]: + rendered += ".dirty" + else: + # exception #1 + rendered = "0.post%d" % pieces["distance"] + if pieces["branch"] != "master": + rendered += ".dev0" + rendered += "+g%s" % pieces["short"] + if pieces["dirty"]: + rendered += ".dirty" + return rendered + + +def render_pep440_old(pieces: Dict[str, Any]) -> str: """TAG[.postDISTANCE[.dev0]] . The ".dev0" means dirty. @@ -1326,7 +1710,7 @@ def render_pep440_old(pieces): return rendered -def render_git_describe(pieces): +def render_git_describe(pieces: Dict[str, Any]) -> str: """TAG[-DISTANCE-gHEX][-dirty]. Like 'git describe --tags --dirty --always'. @@ -1346,7 +1730,7 @@ def render_git_describe(pieces): return rendered -def render_git_describe_long(pieces): +def render_git_describe_long(pieces: Dict[str, Any]) -> str: """TAG-DISTANCE-gHEX[-dirty]. Like 'git describe --tags --dirty --always -long'. 
@@ -1366,7 +1750,7 @@ def render_git_describe_long(pieces): return rendered -def render(pieces, style): +def render(pieces: Dict[str, Any], style: str) -> Dict[str, Any]: """Render the given version pieces into the requested style.""" if pieces["error"]: return {"version": "unknown", @@ -1380,10 +1764,14 @@ def render(pieces, style): if style == "pep440": rendered = render_pep440(pieces) + elif style == "pep440-branch": + rendered = render_pep440_branch(pieces) elif style == "pep440-pre": rendered = render_pep440_pre(pieces) elif style == "pep440-post": rendered = render_pep440_post(pieces) + elif style == "pep440-post-branch": + rendered = render_pep440_post_branch(pieces) elif style == "pep440-old": rendered = render_pep440_old(pieces) elif style == "git-describe": @@ -1402,7 +1790,7 @@ class VersioneerBadRootError(Exception): """The project root directory is unknown or missing key files.""" -def get_versions(verbose=False): +def get_versions(verbose: bool = False) -> Dict[str, Any]: """Get the project version from whatever source is available. Returns dict with two keys: 'version' and 'full'. @@ -1417,7 +1805,7 @@ def get_versions(verbose=False): assert cfg.VCS is not None, "please set [versioneer]VCS= in setup.cfg" handlers = HANDLERS.get(cfg.VCS) assert handlers, "unrecognized VCS '%s'" % cfg.VCS - verbose = verbose or cfg.verbose + verbose = verbose or bool(cfg.verbose) # `bool()` used to avoid `None` assert cfg.versionfile_source is not None, \ "please set versioneer.versionfile_source" assert cfg.tag_prefix is not None, "please set versioneer.tag_prefix" @@ -1478,13 +1866,13 @@ def get_versions(verbose=False): "date": None} -def get_version(): +def get_version() -> str: """Get the short version string for this project.""" return get_versions()["version"] -def get_cmdclass(cmdclass=None): - """Get the custom setuptools/distutils subclasses used by Versioneer. 
+def get_cmdclass(cmdclass: Optional[Dict[str, Any]] = None): + """Get the custom setuptools subclasses used by Versioneer. If the package uses a different cmdclass (e.g. one from numpy), it should be provide as an argument. @@ -1506,21 +1894,21 @@ def get_cmdclass(cmdclass=None): cmds = {} if cmdclass is None else cmdclass.copy() - # we add "version" to both distutils and setuptools - from distutils.core import Command + # we add "version" to setuptools + from setuptools import Command class cmd_version(Command): description = "report generated version string" - user_options = [] - boolean_options = [] + user_options: List[Tuple[str, str, str]] = [] + boolean_options: List[str] = [] - def initialize_options(self): + def initialize_options(self) -> None: pass - def finalize_options(self): + def finalize_options(self) -> None: pass - def run(self): + def run(self) -> None: vers = get_versions(verbose=True) print("Version: %s" % vers["version"]) print(" full-revisionid: %s" % vers.get("full-revisionid")) @@ -1530,7 +1918,7 @@ def run(self): print(" error: %s" % vers["error"]) cmds["version"] = cmd_version - # we override "build_py" in both distutils and setuptools + # we override "build_py" in setuptools # # most invocation pathways end up running build_py: # distutils/build -> build_py @@ -1545,20 +1933,25 @@ def run(self): # then does setup.py bdist_wheel, or sometimes setup.py install # setup.py egg_info -> ? + # pip install -e . and setuptool/editable_wheel will invoke build_py + # but the build_py command is not expected to copy any files. 
+ # we override different "build_py" commands for both environments if 'build_py' in cmds: - _build_py = cmds['build_py'] - elif "setuptools" in sys.modules: - from setuptools.command.build_py import build_py as _build_py + _build_py: Any = cmds['build_py'] else: - from distutils.command.build_py import build_py as _build_py + from setuptools.command.build_py import build_py as _build_py class cmd_build_py(_build_py): - def run(self): + def run(self) -> None: root = get_root() cfg = get_config_from_root(root) versions = get_versions() _build_py.run(self) + if getattr(self, "editable_mode", False): + # During editable installs `.py` and data files are + # not copied to build_lib + return # now locate _version.py in the new build/ directory and replace # it with an updated value if cfg.versionfile_build: @@ -1568,13 +1961,13 @@ def run(self): write_to_version_file(target_versionfile, versions) cmds["build_py"] = cmd_build_py - if "setuptools" in sys.modules: - from setuptools.command.build_ext import build_ext as _build_ext + if 'build_ext' in cmds: + _build_ext: Any = cmds['build_ext'] else: - from distutils.command.build_ext import build_ext as _build_ext + from setuptools.command.build_ext import build_ext as _build_ext class cmd_build_ext(_build_ext): - def run(self): + def run(self) -> None: root = get_root() cfg = get_config_from_root(root) versions = get_versions() @@ -1587,14 +1980,21 @@ def run(self): return # now locate _version.py in the new build/ directory and replace # it with an updated value + if not cfg.versionfile_build: + return target_versionfile = os.path.join(self.build_lib, - cfg.versionfile_source) + cfg.versionfile_build) + if not os.path.exists(target_versionfile): + print(f"Warning: {target_versionfile} does not exist, skipping " + "version update. 
This can happen if you are running build_ext " + "without first running build_py.") + return print("UPDATING %s" % target_versionfile) write_to_version_file(target_versionfile, versions) cmds["build_ext"] = cmd_build_ext if "cx_Freeze" in sys.modules: # cx_freeze enabled? - from cx_Freeze.dist import build_exe as _build_exe + from cx_Freeze.dist import build_exe as _build_exe # type: ignore # nczeczulin reports that py2exe won't like the pep440-style string # as FILEVERSION, but it can be used for PRODUCTVERSION, e.g. # setup(console=[{ @@ -1603,7 +2003,7 @@ def run(self): # ... class cmd_build_exe(_build_exe): - def run(self): + def run(self) -> None: root = get_root() cfg = get_config_from_root(root) versions = get_versions() @@ -1626,10 +2026,13 @@ def run(self): del cmds["build_py"] if 'py2exe' in sys.modules: # py2exe enabled? - from py2exe.distutils_buildexe import py2exe as _py2exe + try: + from py2exe.setuptools_buildexe import py2exe as _py2exe # type: ignore + except ImportError: + from py2exe.distutils_buildexe import py2exe as _py2exe # type: ignore class cmd_py2exe(_py2exe): - def run(self): + def run(self) -> None: root = get_root() cfg = get_config_from_root(root) versions = get_versions() @@ -1650,16 +2053,51 @@ def run(self): }) cmds["py2exe"] = cmd_py2exe + # sdist farms its file list building out to egg_info + if 'egg_info' in cmds: + _egg_info: Any = cmds['egg_info'] + else: + from setuptools.command.egg_info import egg_info as _egg_info + + class cmd_egg_info(_egg_info): + def find_sources(self) -> None: + # egg_info.find_sources builds the manifest list and writes it + # in one shot + super().find_sources() + + # Modify the filelist and normalize it + root = get_root() + cfg = get_config_from_root(root) + self.filelist.append('versioneer.py') + if cfg.versionfile_source: + # There are rare cases where versionfile_source might not be + # included by default, so we must be explicit + self.filelist.append(cfg.versionfile_source) + 
self.filelist.sort() + self.filelist.remove_duplicates() + + # The write method is hidden in the manifest_maker instance that + # generated the filelist and was thrown away + # We will instead replicate their final normalization (to unicode, + # and POSIX-style paths) + from setuptools import unicode_utils + normalized = [unicode_utils.filesys_decode(f).replace(os.sep, '/') + for f in self.filelist.files] + + manifest_filename = os.path.join(self.egg_info, 'SOURCES.txt') + with open(manifest_filename, 'w') as fobj: + fobj.write('\n'.join(normalized)) + + cmds['egg_info'] = cmd_egg_info + # we override different "sdist" commands for both environments if 'sdist' in cmds: - _sdist = cmds['sdist'] - elif "setuptools" in sys.modules: - from setuptools.command.sdist import sdist as _sdist + _sdist: Any = cmds['sdist'] else: - from distutils.command.sdist import sdist as _sdist + from setuptools.command.sdist import sdist as _sdist class cmd_sdist(_sdist): - def run(self): + def run(self) -> None: versions = get_versions() self._versioneer_generated_versions = versions # unless we update this, the command will keep using the old @@ -1667,7 +2105,7 @@ def run(self): self.distribution.metadata.version = versions["version"] return _sdist.run(self) - def make_release_tree(self, base_dir, files): + def make_release_tree(self, base_dir: str, files: List[str]) -> None: root = get_root() cfg = get_config_from_root(root) _sdist.make_release_tree(self, base_dir, files) @@ -1720,21 +2158,26 @@ def make_release_tree(self, base_dir, files): """ -INIT_PY_SNIPPET = """ +OLD_SNIPPET = """ from ._version import get_versions __version__ = get_versions()['version'] del get_versions """ +INIT_PY_SNIPPET = """ +from . 
import {0} +__version__ = {0}.get_versions()['version'] +""" -def do_setup(): + +def do_setup() -> int: """Do main VCS-independent setup function for installing Versioneer.""" root = get_root() try: cfg = get_config_from_root(root) - except (EnvironmentError, configparser.NoSectionError, + except (OSError, configparser.NoSectionError, configparser.NoOptionError) as e: - if isinstance(e, (EnvironmentError, configparser.NoSectionError)): + if isinstance(e, (OSError, configparser.NoSectionError)): print("Adding sample versioneer config to setup.cfg", file=sys.stderr) with open(os.path.join(root, "setup.cfg"), "a") as f: @@ -1754,62 +2197,37 @@ def do_setup(): ipy = os.path.join(os.path.dirname(cfg.versionfile_source), "__init__.py") + maybe_ipy: Optional[str] = ipy if os.path.exists(ipy): try: with open(ipy, "r") as f: old = f.read() - except EnvironmentError: + except OSError: old = "" - if INIT_PY_SNIPPET not in old: + module = os.path.splitext(os.path.basename(cfg.versionfile_source))[0] + snippet = INIT_PY_SNIPPET.format(module) + if OLD_SNIPPET in old: + print(" replacing boilerplate in %s" % ipy) + with open(ipy, "w") as f: + f.write(old.replace(OLD_SNIPPET, snippet)) + elif snippet not in old: print(" appending to %s" % ipy) with open(ipy, "a") as f: - f.write(INIT_PY_SNIPPET) + f.write(snippet) else: print(" %s unmodified" % ipy) else: print(" %s doesn't exist, ok" % ipy) - ipy = None - - # Make sure both the top-level "versioneer.py" and versionfile_source - # (PKG/_version.py, used by runtime code) are in MANIFEST.in, so - # they'll be copied into source distributions. Pip won't be able to - # install the package without this. 
- manifest_in = os.path.join(root, "MANIFEST.in") - simple_includes = set() - try: - with open(manifest_in, "r") as f: - for line in f: - if line.startswith("include "): - for include in line.split()[1:]: - simple_includes.add(include) - except EnvironmentError: - pass - # That doesn't cover everything MANIFEST.in can do - # (http://docs.python.org/2/distutils/sourcedist.html#commands), so - # it might give some false negatives. Appending redundant 'include' - # lines is safe, though. - if "versioneer.py" not in simple_includes: - print(" appending 'versioneer.py' to MANIFEST.in") - with open(manifest_in, "a") as f: - f.write("include versioneer.py\n") - else: - print(" 'versioneer.py' already in MANIFEST.in") - if cfg.versionfile_source not in simple_includes: - print(" appending versionfile_source ('%s') to MANIFEST.in" % - cfg.versionfile_source) - with open(manifest_in, "a") as f: - f.write("include %s\n" % cfg.versionfile_source) - else: - print(" versionfile_source already in MANIFEST.in") + maybe_ipy = None # Make VCS-specific changes. For git, this means creating/changing # .gitattributes to mark _version.py for export-subst keyword # substitution. 
- do_vcs_install(manifest_in, cfg.versionfile_source, ipy) + do_vcs_install(cfg.versionfile_source, maybe_ipy) return 0 -def scan_setup_py(): +def scan_setup_py() -> int: """Validate the contents of setup.py against Versioneer's expectations.""" found = set() setters = False @@ -1846,10 +2264,14 @@ def scan_setup_py(): return errors +def setup_command() -> NoReturn: + """Set up Versioneer and exit with appropriate error code.""" + errors = do_setup() + errors += scan_setup_py() + sys.exit(1 if errors else 0) + + if __name__ == "__main__": cmd = sys.argv[1] if cmd == "setup": - errors = do_setup() - errors += scan_setup_py() - if errors: - sys.exit(1) + setup_command() From 283bba933c113234823d9bfdd98f59f97398ed2b Mon Sep 17 00:00:00 2001 From: Benjamin Rodenberg Date: Thu, 28 Sep 2023 17:03:03 +0200 Subject: [PATCH 06/14] Remove link Link is again broken. Decided to remove it after just fixing it via #157 --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 423b81ec..7d0690e9 100644 --- a/README.md +++ b/README.md @@ -85,7 +85,7 @@ To create and install the `fenicsprecice` python package the following instructi ## Development history -The initial version of this adapter was developed by [Benjamin Rodenberg](https://www.cs.cit.tum.de/sccs/personen/benjamin-rodenberg/) during his research stay at Lund University in the group for [Numerical Analysis](http://www.maths.lu.se/english/research/research-divisions/numerical-analysis/) in close collaboration with [Peter Meisrimel](https://portal.research.lu.se/en/persons/peter-meisrimel). +The initial version of this adapter was developed by [Benjamin Rodenberg](https://www.cs.cit.tum.de/sccs/personen/benjamin-rodenberg/) during his research stay at Lund University in the group for [Numerical Analysis](http://www.maths.lu.se/english/research/research-divisions/numerical-analysis/) in close collaboration with Peter Meisrimel. 
[Richard Hertrich](https://github.com/richahert) contributed the possibility to perform FSI simulations using the adapter in his [Bachelor thesis](https://mediatum.ub.tum.de/node?id=1520579). From 8a21819bd5c567854e809acfadcd1e75bf539640 Mon Sep 17 00:00:00 2001 From: Benjamin Rodenberg Date: Thu, 28 Sep 2023 18:19:42 +0200 Subject: [PATCH 07/14] Compatibility update for v3.0.0 (#153) * Remove API functions that are removed in v3.0.0. Fix data initialization. * Forward dt to python bindings. * Set mesh connectivity information only if it is required. Closes #138 * Update tests * Update links of contributors * Update CHANGELOG.md * Call checkpointing functions from the interface * Change required version of pyprecice to >=3.0.0.0 * Comment out nearest projection volume coupling. --------- Co-authored-by: Ishaan Desai --- .github/workflows/build-and-test.yml | 2 +- .github/workflows/run-tutorials.yml | 20 +-- CHANGELOG.md | 6 + fenicsprecice/adapter_core.py | 48 +++---- fenicsprecice/expression_core.py | 14 +- fenicsprecice/fenicsprecice.py | 169 ++++++++++-------------- setup.py | 2 +- tests/MockedPrecice.py | 31 ++--- tests/integration/test_fenicsprecice.py | 99 +++++++------- tests/integration/test_write_read.py | 122 ++++++++--------- tests/unit/test_adapter_core.py | 5 +- 11 files changed, 242 insertions(+), 276 deletions(-) diff --git a/.github/workflows/build-and-test.yml b/.github/workflows/build-and-test.yml index b3961c57..fcf01ee2 100644 --- a/.github/workflows/build-and-test.yml +++ b/.github/workflows/build-and-test.yml @@ -36,7 +36,7 @@ jobs: run: | mkdir -p precice echo "from setuptools import setup" >> precice/setup.py - echo "setup(name='pyprecice', version='2.0.2.1')" >> precice/setup.py + echo "setup(name='pyprecice', version='3.0.0.0')" >> precice/setup.py python3 -m pip install ./precice/ - name: Run unit tests run: python3 setup.py test -s tests.unit diff --git a/.github/workflows/run-tutorials.yml b/.github/workflows/run-tutorials.yml index 
2ac02790..699aa3e1 100644 --- a/.github/workflows/run-tutorials.yml +++ b/.github/workflows/run-tutorials.yml @@ -7,15 +7,15 @@ on: pull_request: paths: - '**' - -jobs: + +jobs: run_ht_simple: name: Run HT, simple runs-on: ubuntu-latest - container: precice/precice + container: precice/precice:develop steps: - name: Checkout Repository - uses: actions/checkout@v2 + uses: actions/checkout@v2 - name: Install Dependencies & FEniCS run: | apt-get -qq update @@ -30,17 +30,17 @@ jobs: - name: Get tutorials run: git clone -b develop https://github.com/precice/tutorials.git - name: Run tutorial - run: | + run: | cd tutorials/partitioned-heat-conduction/fenics - python3 heat.py -d & python3 heat.py -n - + ./run.sh -d & ./run.sh -n + run_ht_complex: name: Run HT, complex runs-on: ubuntu-latest - container: precice/precice + container: precice/precice:develop steps: - name: Checkout Repository - uses: actions/checkout@v2 + uses: actions/checkout@v2 - name: Install Dependencies & FEniCS run: | apt-get -qq update @@ -57,4 +57,4 @@ jobs: - name: Run tutorial run: | cd tutorials/partitioned-heat-conduction-complex/fenics - python3 heat.py -d -i complex & python3 heat.py -n -i complex + ./run.sh -d & ./run.sh -n diff --git a/CHANGELOG.md b/CHANGELOG.md index 00642d1c..9c7601f9 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,11 @@ # FEniCS-preCICE adapter changelog +## latest + +* Drop support for preCICE 2.x version, as this is a breaking release. +* Update adapter to use preCICE v3 API [#153](https://github.com/precice/fenics-adapter/pull/153). +* Remove functionality to define mesh connectivity in 2D cases in the form of triangles due to lack of testing and compatibility problems (might be added again). See [#162](https://github.com/precice/fenics-adapter/issues/162). + ## 1.4.0 * Adding CITATION.cff to link the adapter repository to the relevant publication in the journal SoftwareX. 
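To make the scope of the v3 migration listed in the changelog above concrete: the v2 action strings (`is_action_required`, `mark_action_fulfilled`) become explicit predicate calls, and the time step size moves from the return value of `advance()` to `get_max_time_step_size()`. The sketch below is a minimal, runnable illustration of that control flow; `StubParticipant` is a hypothetical stand-in so the loop runs without preCICE installed, not the real `pyprecice.Participant`.

```python
# Hypothetical stand-in for pyprecice.Participant so this sketch is
# self-contained; only the v3-style calls used below are stubbed.
class StubParticipant:
    def __init__(self, n_windows=3):
        self._remaining = n_windows

    def initialize(self):
        # v3: initialize() returns nothing; initialize_data() is gone.
        pass

    def is_coupling_ongoing(self):
        return self._remaining > 0

    def get_max_time_step_size(self):
        # v3: replaces the dt that v2's advance() used to return.
        return 0.1

    def requires_writing_checkpoint(self):
        # v3: replaces is_action_required(action_write_iteration_checkpoint()).
        return True

    def requires_reading_checkpoint(self):
        # v3: replaces is_action_required(action_read_iteration_checkpoint()).
        return False  # pretend every implicit iteration converges immediately

    def advance(self, dt):
        self._remaining -= 1

    def finalize(self):
        pass


def run_coupling(participant):
    """Skeleton of a coupled time loop in the v3 style."""
    participant.initialize()
    t, checkpoint = 0.0, 0.0
    while participant.is_coupling_ongoing():
        if participant.requires_writing_checkpoint():
            checkpoint = t                  # store solver state
        dt = participant.get_max_time_step_size()
        t += dt                             # "solve" one time step
        participant.advance(dt)
        if participant.requires_reading_checkpoint():
            t = checkpoint                  # roll back and iterate again
    participant.finalize()
    return t


print(round(run_coupling(StubParticipant()), 10))
```

The adapter wraps these predicates (`requires_writing_checkpoint` / `requires_reading_checkpoint`) for the solver, which is why `store_checkpoint` and `retrieve_checkpoint` in this patch no longer call `mark_action_fulfilled`.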
diff --git a/fenicsprecice/adapter_core.py b/fenicsprecice/adapter_core.py index 36b5cb8e..8eafb520 100644 --- a/fenicsprecice/adapter_core.py +++ b/fenicsprecice/adapter_core.py @@ -383,29 +383,26 @@ def edge_is_on(subdomain, this_edge): """ Check whether edge lies within subdomain """ - assert(len(list(vertices(this_edge))) == 2) + assert (len(list(vertices(this_edge))) == 2) return all([subdomain.inside(v.point(), True) for v in vertices(this_edge)]) - vertices1_ids = [] - vertices2_ids = [] - edges_ids = [] + edge_vertices_ids = [] + fenics_edges_ids = [] for edge in edges(function_space.mesh()): if edge_is_on(coupling_subdomain, edge): v1, v2 = list(vertices(edge)) if v1.global_index() in global_ids and v2.global_index() in global_ids: - vertices1_ids.append(id_mapping[v1.global_index()]) - vertices2_ids.append(id_mapping[v2.global_index()]) - edges_ids.append(edge.index()) + edge_vertices_ids.append([id_mapping[v1.global_index()], id_mapping[v2.global_index()]]) + fenics_edges_ids.append(edge.index()) - vertices1_ids = np.array(vertices1_ids) - vertices2_ids = np.array(vertices2_ids) - edges_ids = np.array(edges_ids) + edge_vertices_ids = np.array(edge_vertices_ids) + fenics_edges_ids = np.array(fenics_edges_ids) - return vertices1_ids, vertices2_ids, edges_ids + return edge_vertices_ids, fenics_edges_ids -def get_coupling_triangles(function_space, coupling_subdomain, precice_edge_dict): +def get_coupling_triangles(function_space, coupling_subdomain, fenics_edge_ids, id_mapping): """ Extracts triangles of mesh which lie on the coupling region. @@ -415,31 +412,36 @@ def get_coupling_triangles(function_space, coupling_subdomain, precice_edge_dict Function space on which the finite element problem definition lives. coupling_subdomain : FEniCS Domain FEniCS domain of the coupling interface region. 
-    precice_edge_dict: dict
-        Dictionary with FEniCS IDs of coupling mesh edges as keys and preCICE IDs of the edges as values
+    fenics_edge_ids: numpy array
+        Array with FEniCS IDs of coupling mesh edges

     Returns
     -------
-    edges : numpy array
-        Array of edges indices (3 per triangle)
+    vertex_ids : numpy array
+        Array of indices of vertices which make up triangles (3 per triangle)
     """
     def cell_is_in(subdomain, this_cell):
         """
         Check whether cell lies within subdomain
         """
-        assert(len(list(vertices(this_cell))) == 3), "Only triangular meshes are supported"
+        assert (len(list(vertices(this_cell))) == 3), "Only triangular meshes are supported"
         return all([subdomain.inside(v.point(), True) for v in vertices(this_cell)])

-    edges_ids = []
-
+    vertex_ids = []
     for cell in cells(function_space.mesh()):
         if cell_is_in(coupling_subdomain, cell):
             e1, e2, e3 = list(edges(cell))
-            if all(edge in precice_edge_dict.keys() for edge in [e1.index(), e2.index(), e3.index()]):
-                edges_ids.append([e1.index(), e2.index(), e3.index()])
-
-    return np.array(edges_ids)
+            if all(edge_ids in fenics_edge_ids for edge_ids in [e1.index(), e2.index(), e3.index()]):
+                v1, v2 = vertices(e1)
+                _, v3 = vertices(e2)
+                assert (v3 != v1)
+                assert (v3 != v2)
+                vertex_ids.append([id_mapping[v1.global_index()],
+                                   id_mapping[v2.global_index()],
+                                   id_mapping[v3.global_index()]])
+
+    return np.array(vertex_ids)


 def get_forces_as_point_sources(fixed_boundary, function_space, data):
diff --git a/fenicsprecice/expression_core.py b/fenicsprecice/expression_core.py
index fd1a8116..5e18a0c0 100644
--- a/fenicsprecice/expression_core.py
+++ b/fenicsprecice/expression_core.py
@@ -49,7 +49,7 @@ def update_boundary_data(self, vals, coords):
         self._vals = vals

         _, self._dimension = coords.shape
-        assert(self._dimension == 2), "Coordinates are of incorrect dimensions"
+        assert (self._dimension == 2), "Coordinates are of incorrect dimensions"
         self._coords_x = coords[:, 0]
         self._coords_y = coords[:, 1]

@@ -128,10 +128,10 @@ def
is_scalar_valued(self): """ try: if self._vals.ndim == 1: - assert(self._function_type is FunctionType.SCALAR) + assert (self._function_type is FunctionType.SCALAR) return True elif self._vals.ndim > 1: - assert(self._function_type is FunctionType.VECTOR) + assert (self._function_type is FunctionType.VECTOR) return False else: raise Exception("Dimension of the function is 0 or negative!") @@ -149,10 +149,10 @@ def is_vector_valued(self): """ try: if self._vals.ndim > 1: - assert(self._function_type is FunctionType.VECTOR) + assert (self._function_type is FunctionType.VECTOR) return True elif self._vals.ndim == 1: - assert(self._function_type is FunctionType.SCALAR) + assert (self._function_type is FunctionType.SCALAR) return False else: raise Exception("Dimension of the function is 0 or negative!") @@ -170,7 +170,7 @@ class SegregatedRBFInterpolationExpression(CouplingExpression): """ def segregated_interpolant_2d(self, coords_x, coords_y, data): - assert(coords_x.shape == coords_y.shape) + assert (coords_x.shape == coords_y.shape) # create least squares system to approximate a * x ** 2 + b * x + c ~= y def lstsq_interp(x, y, w): return w[0] * x ** 2 + w[1] * y ** 2 + w[2] * x * y + w[3] * x + w[4] * y + w[5] @@ -242,7 +242,7 @@ def eval(self, value, x): :param x: coordinate where expression has to be evaluated :param value: buffer where result has to be returned to """ - assert(MPI.COMM_WORLD.Get_size() > 1) + assert (MPI.COMM_WORLD.Get_size() > 1) for i in range(self._vals.ndim): value[i] = 0 diff --git a/fenicsprecice/fenicsprecice.py b/fenicsprecice/fenicsprecice.py index 06bb1421..09e77d77 100644 --- a/fenicsprecice/fenicsprecice.py +++ b/fenicsprecice/fenicsprecice.py @@ -52,8 +52,11 @@ def __init__(self, adapter_config_filename='precice-adapter-config.json'): # Setup up MPI communicator on mpi4py self._comm = MPI.COMM_WORLD - self._interface = precice.Interface(self._config.get_participant_name(), self._config.get_config_file_name(), - self._comm.Get_rank(), 
self._comm.Get_size()) + self._participant = precice.Participant( + self._config.get_participant_name(), + self._config.get_config_file_name(), + self._comm.Get_rank(), + self._comm.Get_size()) # FEniCS related quantities self._read_function_space = None # initialized later @@ -65,7 +68,6 @@ def __init__(self, adapter_config_filename='precice-adapter-config.json'): self._unowned_vertices = Vertices(VertexType.UNOWNED) self._fenics_vertices = Vertices(VertexType.FENICS) self._precice_vertex_ids = None # initialized later - self._precice_edge_dict = dict() # read data related quantities (read data is read from preCICE and applied in FEniCS) self._read_function_type = None # stores whether read function is scalar or vector valued @@ -118,7 +120,7 @@ def create_coupling_expression(self): coupling_expression : Object of class dolfin.functions.expression.Expression Reference to object of class SegregatedRBFInterpolationExpression. """ - assert(self._fenics_dims == 2), "Boundary conditions of Expression objects are only allowed for 2D cases" + assert (self._fenics_dims == 2), "Boundary conditions of Expression objects are only allowed for 2D cases" if not (self._read_function_type is FunctionType.SCALAR or self._read_function_type is FunctionType.VECTOR): raise Exception("No valid read_function is provided in initialization. Cannot create coupling expression") @@ -161,7 +163,7 @@ def update_coupling_expression(self, coupling_expression, data): The coupling data. A dictionary containing nodal data with vertex coordinates as key and associated data as value. 
""" - assert(self._fenics_dims == 2), "Boundary conditions of Expression objects are only allowed for 2D cases" + assert (self._fenics_dims == 2), "Boundary conditions of Expression objects are only allowed for 2D cases" if not self._empty_rank: coupling_expression.update_boundary_data(np.array(list(data.values())), np.array(list(data.keys()))) @@ -190,7 +192,7 @@ def get_point_sources(self, data): return get_forces_as_point_sources(self._Dirichlet_Boundary, self._read_function_space, data) - def read_data(self): + def read_data(self, dt): """ Read data from preCICE. Data is generated depending on the type of the read function (Scalar or Vector). For a scalar read function the data is a numpy array with shape (N) where N = number of coupling vertices @@ -206,9 +208,6 @@ def read_data(self): assert (self._coupling_type is CouplingMode.UNI_DIRECTIONAL_READ_COUPLING or CouplingMode.BI_DIRECTIONAL_COUPLING) - read_data_id = self._interface.get_data_id(self._config.get_read_data_name(), - self._interface.get_mesh_id(self._config.get_coupling_mesh_name())) - read_data = None if self._empty_rank: @@ -216,10 +215,11 @@ def read_data(self): ), "having participants without coupling mesh nodes is only valid for parallel runs" if not self._empty_rank: - if self._read_function_type is FunctionType.SCALAR: - read_data = self._interface.read_block_scalar_data(read_data_id, self._precice_vertex_ids) - elif self._read_function_type is FunctionType.VECTOR: - read_data = self._interface.read_block_vector_data(read_data_id, self._precice_vertex_ids) + read_data = self._participant.read_data( + self._config.get_coupling_mesh_name(), + self._config.get_read_data_name(), + self._precice_vertex_ids, + dt) read_data = {tuple(key): value for key, value in zip(self._owned_vertices.get_coordinates(), read_data)} read_data = communicate_shared_vertices( @@ -251,9 +251,6 @@ def write_data(self, write_function): assert (self._write_function_type == determine_function_type(w_func, 
self._fenics_dims)) assert (write_function.function_space() == self._write_function_space) - write_data_id = self._interface.get_data_id(self._config.get_write_data_name(), - self._interface.get_mesh_id(self._config.get_coupling_mesh_name())) - if self._empty_rank: assert (self._is_parallel() ), "having participants without coupling mesh nodes is only valid for parallel runs" @@ -261,14 +258,11 @@ def write_data(self, write_function): write_function_type = determine_function_type(write_function, self._fenics_dims) assert (write_function_type in list(FunctionType)) write_data = convert_fenics_to_precice(write_function, self._owned_vertices.get_local_ids()) - if write_function_type is FunctionType.SCALAR: - assert (write_function.function_space().num_sub_spaces() == 0) - self._interface.write_block_scalar_data(write_data_id, self._precice_vertex_ids, write_data) - elif write_function_type is FunctionType.VECTOR: - assert (write_function.function_space().num_sub_spaces() > 0) - self._interface.write_block_vector_data(write_data_id, self._precice_vertex_ids, write_data) - else: - raise Exception("write_function provided is neither VECTOR nor SCALAR type") + self._participant.write_data( + self._config.get_coupling_mesh_name(), + self._config.get_write_data_name(), + self._precice_vertex_ids, + write_data) def initialize(self, coupling_subdomain, read_function_space=None, write_object=None, fixed_boundary=None): """ @@ -296,10 +290,10 @@ def initialize(self, coupling_subdomain, read_function_space=None, write_object= """ write_function_space, write_function = None, None - if isinstance(write_object, Function): # precice.initialize_data() will be called using this Function + if isinstance(write_object, Function): write_function_space = write_object.function_space() write_function = write_object - elif isinstance(write_object, FunctionSpace): # preCICE will use default zero values for initialization. 
+ elif isinstance(write_object, FunctionSpace): write_function_space = write_object write_function = None elif write_object is None: @@ -361,7 +355,7 @@ def initialize(self, coupling_subdomain, read_function_space=None, write_object= if fixed_boundary: self._Dirichlet_Boundary = fixed_boundary - if self._fenics_dims != self._interface.get_dimensions(): + if self._fenics_dims != self._participant.get_mesh_dimensions(self._config.get_coupling_mesh_name()): raise Exception("Dimension of preCICE setup and FEniCS do not match") # Set vertices on the coupling subdomain for this rank @@ -391,8 +385,8 @@ def initialize(self, coupling_subdomain, read_function_space=None, write_object= print("Rank {} has no part of coupling boundary.".format(self._comm.Get_rank())) # Define mesh in preCICE - self._precice_vertex_ids = self._interface.set_mesh_vertices(self._interface.get_mesh_id( - self._config.get_coupling_mesh_name()), self._owned_vertices.get_coordinates()) + self._precice_vertex_ids = self._participant.set_mesh_vertices( + self._config.get_coupling_mesh_name(), self._owned_vertices.get_coordinates()) if (self._coupling_type is CouplingMode.UNI_DIRECTIONAL_READ_COUPLING or self._coupling_type is CouplingMode.BI_DIRECTIONAL_COUPLING) and self._is_parallel: @@ -407,41 +401,36 @@ def initialize(self, coupling_subdomain, read_function_space=None, write_object= point_data = {tuple(key): None for key in self._owned_vertices.get_coordinates()} _ = filter_point_sources(point_data, fixed_boundary, warn_duplicate=True) - # Set mesh edges in preCICE to allow nearest-projection mapping - # Define a mapping between coupling vertices and their IDs in preCICE - id_mapping = {key: value for key, value in zip(self._owned_vertices.get_global_ids(), self._precice_vertex_ids)} - - edge_vertex_ids1, edge_vertex_ids2, edges_ids = get_coupling_boundary_edges( - function_space, coupling_subdomain, self._owned_vertices.get_global_ids(), id_mapping) - - for i in range(len(edge_vertex_ids1)): - 
assert (edge_vertex_ids1[i] != edge_vertex_ids2[i]) - self._precice_edge_dict[edges_ids[i]] = self._interface.set_mesh_edge( - self._interface.get_mesh_id(self._config.get_coupling_mesh_name()), - edge_vertex_ids1[i], edge_vertex_ids2[i]) - - # Configure mesh connectivity (triangles from edges) for 2D simulations - if self._fenics_dims == 2: - edges = get_coupling_triangles(function_space, coupling_subdomain, self._precice_edge_dict) - for edges_ids in edges: - self._interface.set_mesh_triangle(self._interface.get_mesh_id(self._config.get_coupling_mesh_name()), - self._precice_edge_dict[edges_ids[0]], - self._precice_edge_dict[edges_ids[1]], - self._precice_edge_dict[edges_ids[2]]) - else: - print("Mesh connectivity information is not written for 3D cases.") - - precice_dt = self._interface.initialize() - - if self._interface.is_action_required(precice.action_write_initial_data()): + # Set mesh connectivity information in preCICE to allow nearest-projection mapping + if self._participant.requires_mesh_connectivity_for(self._config.get_coupling_mesh_name()): + # Define a mapping between coupling vertices and their IDs in preCICE + id_mapping = { + key: value for key, + value in zip( + self._owned_vertices.get_global_ids(), + self._precice_vertex_ids)} + + edge_vertex_ids, fenics_edge_ids = get_coupling_boundary_edges( + function_space, coupling_subdomain, self._owned_vertices.get_global_ids(), id_mapping) + + # Surface coupling over 1D edges + self._participant.set_mesh_edges(self._config.get_coupling_mesh_name(), edge_vertex_ids) + + # Code below does not work properly. Volume coupling does not integrate well with surface coupling in this state. See https://github.com/precice/fenics-adapter/issues/162. 
+ # # Configure mesh connectivity (triangles from edges) for 2D simulations + # if self._fenics_dims == 2: + # vertices = get_coupling_triangles(function_space, coupling_subdomain, fenics_edge_ids, id_mapping) + # self._participant.set_mesh_triangles(self._config.get_coupling_mesh_name(), vertices) + # else: + # print("Mesh connectivity information is not written for 3D cases.") + + if self._participant.requires_initial_data(): if not write_function: - raise Exception("Non-standard initialization requires a write_function") + raise Exception( + "preCICE requires you to write initial data. Please provide a write_function to initialize(...)") self.write_data(write_function) - self._interface.mark_action_fulfilled(precice.action_write_initial_data()) - - self._interface.initialize_data() - return precice_dt + self._participant.initialize() def store_checkpoint(self, user_u, t, n): """ @@ -464,7 +453,6 @@ def store_checkpoint(self, user_u, t, n): # making sure that the FEniCS function provided by user is not directly accessed by the Adapter assert (my_u != user_u) self._checkpoint = SolverState(my_u, t, n) - self._interface.mark_action_fulfilled(self.action_write_iteration_checkpoint()) def retrieve_checkpoint(self): """ @@ -481,7 +469,6 @@ def retrieve_checkpoint(self): """ assert (not self.is_time_window_complete()) logger.debug("Restore solver state") - self._interface.mark_action_fulfilled(self.action_read_iteration_checkpoint()) return self._checkpoint.get_state() def advance(self, dt): @@ -496,15 +483,9 @@ def advance(self, dt): Notes ----- Refer advance() in https://github.com/precice/python-bindings/blob/develop/cyprecice/cyprecice.pyx - - Returns - ------- - max_dt : double - Maximum length of timestep to be computed by solver. 
""" self._first_advance_done = True - max_dt = self._interface.advance(dt) - return max_dt + self._participant.advance(dt) def finalize(self): """ @@ -514,7 +495,7 @@ def finalize(self): ----- Refer finalize() in https://github.com/precice/python-bindings/blob/develop/cyprecice/cyprecice.pyx """ - self._interface.finalize() + self._participant.finalize() def get_participant_name(self): """ @@ -538,7 +519,7 @@ def is_coupling_ongoing(self): tag : bool True if coupling is still going on and False if coupling has finished. """ - return self._interface.is_coupling_ongoing() + return self._participant.is_coupling_ongoing() def is_time_window_complete(self): """ @@ -554,58 +535,52 @@ def is_time_window_complete(self): tag : bool True if implicit coupling in the time window has converged and False if not converged yet. """ - return self._interface.is_time_window_complete() + return self._participant.is_time_window_complete() - def is_action_required(self, action): + def get_max_time_step_size(self): """ - Tag to check if a particular preCICE action is required. - - Parameters - ---------- - action : string - Name of the preCICE action. + Get the maximum time step from preCICE. Notes ----- - Refer is_action_required(action) in + Refer get_max_time_step_size() in https://github.com/precice/python-bindings/blob/develop/cyprecice/cyprecice.pyx Returns ------- - tag : bool - True if action is required and False if action is not required. + max_dt : double + Maximum length of timestep to be computed by solver. """ - return self._interface.is_action_required(action) + return self._participant.get_max_time_step_size() - def action_write_iteration_checkpoint(self): + def requires_writing_checkpoint(self): """ - Get name of action to convey to preCICE that a checkpoint has been written. + Tag to check if checkpoint needs to be written. 
         Notes
         -----
-        Refer action_write_iteration_checkpoint() in
+        Refer requires_writing_checkpoint() in
         https://github.com/precice/python-bindings/blob/develop/cyprecice/cyprecice.pyx

         Returns
         -------
-        action : string
-            Name of action related to writing a checkpoint.
+        tag : bool
+            True if checkpoint needs to be written, False otherwise.
         """
-        return precice.action_write_iteration_checkpoint()
+        return self._participant.requires_writing_checkpoint()

-    def action_read_iteration_checkpoint(self):
+    def requires_reading_checkpoint(self):
         """
-        Get name of action to convey to preCICE that a checkpoint has been read and the state of the system has been
-        restored to that checkpoint.
+        Tag to check if checkpoint needs to be read.

         Notes
         -----
-        Refer action_read_iteration_checkpoint() in
+        Refer requires_reading_checkpoint() in
         https://github.com/precice/python-bindings/blob/develop/cyprecice/cyprecice.pyx

         Returns
         -------
-        action : string
-            Name of action related to reading a checkpoint.
+        tag : bool
+            True if checkpoint needs to be read, False otherwise.
""" - return precice.action_read_iteration_checkpoint() + return self._participant.requires_reading_checkpoint() diff --git a/setup.py b/setup.py index 7d33accc..5e5f430a 100644 --- a/setup.py +++ b/setup.py @@ -39,6 +39,6 @@ author_email='info@precice.org', license='LGPL-3.0', packages=['fenicsprecice'], - install_requires=['pyprecice>=2.0.0', 'scipy', 'numpy>=1.13.3', 'mpi4py'], + install_requires=['pyprecice>=3.0.0.0.dev0', 'scipy', 'numpy>=1.13.3', 'mpi4py'], test_suite='tests', zip_safe=False) diff --git a/tests/MockedPrecice.py b/tests/MockedPrecice.py index b972517b..3d2e8ac2 100644 --- a/tests/MockedPrecice.py +++ b/tests/MockedPrecice.py @@ -1,11 +1,4 @@ -from unittest.mock import MagicMock - -action_read_iteration_checkpoint = MagicMock(return_value=1) -action_write_iteration_checkpoint = MagicMock(return_value=2) -action_write_initial_data = MagicMock() - - -class Interface: +class Participant: """ Mock representation of preCICE to be used in all mock tests. Dummy implementation of all functions below are to be used where the preCICE API calls via the python bindings are done in the FEniCS Adapter @@ -14,31 +7,28 @@ class Interface: def __init__(self, name, config_file, rank, procs): pass - def read_block_scalar_data(self, read_data_id, vertex_ids): - raise Exception("not implemented") - - def read_block_vector_data(self, read_data_id, vertex_ids): + def read_data(self, read_mesh_name, read_data_name, vertex_ids, dt): raise Exception("not implemented") - def write_block_scalar_data(self, write_data_id, vertex_ids, write_data): + def write_data(self, write_mesh_name, write_data_name, vertex_ids, write_data): raise Exception("not implemented") - def write_block_vector_data(self, write_data_id, vertex_ids, write_data): + def initialize(): raise Exception("not implemented") - def get_data_id(self, foo, bar): + def advance(self, foo): raise Exception("not implemented") - def get_mesh_id(self, foo): + def finalize(): raise Exception("not implemented") - def 
initialize_data(self): + def requires_initial_data(self): raise Exception("not implemented") - def advance(self, foo): + def requires_reading_checkpoint(self): raise Exception("not implemented") - def is_action_required(self, action): + def requires_writing_checkpoint(self): raise Exception("not implemented") def is_coupling_ongoing(self): @@ -50,5 +40,8 @@ def mark_action_fulfilled(self, action): def get_dimensions(self): raise Exception("not implemented") + def get_max_time_step_size(self): + raise Exception("not implemented") + def is_time_window_complete(self): raise Exception("not implemented") diff --git a/tests/integration/test_fenicsprecice.py b/tests/integration/test_fenicsprecice.py index b5a0e1ef..faec6038 100644 --- a/tests/integration/test_fenicsprecice.py +++ b/tests/integration/test_fenicsprecice.py @@ -4,9 +4,9 @@ from unittest import TestCase from tests import MockedPrecice import numpy as np -from fenics import Expression, UnitSquareMesh, FunctionSpace, VectorFunctionSpace, interpolate, dx, ds, \ - SubDomain, near, PointSource, Point, AutoSubDomain, TestFunction, grad, assemble, Function, solve, dot, \ - TrialFunction, TestFunction, lhs, inner, Constant, assemble_system +from fenics import Expression, UnitSquareMesh, FunctionSpace, VectorFunctionSpace, interpolate, dx, \ + SubDomain, near, PointSource, Point, AutoSubDomain, TestFunction, grad, dot, TrialFunction, \ + TestFunction, inner, Constant, assemble_system class MockedArray: @@ -74,22 +74,12 @@ def test_checkpoint_mechanism(self): Test correct checkpoint storing """ import fenicsprecice - from precice import Interface, action_write_iteration_checkpoint - - def is_action_required_behavior(py_action): - if py_action == action_write_iteration_checkpoint(): - return True - else: - return False - - Interface.initialize = MagicMock(return_value=self.dt) - Interface.is_action_required = MagicMock(side_effect=is_action_required_behavior) - Interface.get_dimensions = MagicMock() - 
Interface.get_mesh_id = MagicMock() - Interface.get_data_id = MagicMock() - Interface.mark_action_fulfilled = MagicMock() - Interface.is_time_window_complete = MagicMock(return_value=True) - Interface.advance = MagicMock() + from precice import Participant + + Participant.initialize = MagicMock(return_value=self.dt) + Participant.get_mesh_dimensions = MagicMock() + Participant.is_time_window_complete = MagicMock(return_value=True) + Participant.advance = MagicMock() precice = fenicsprecice.Adapter(self.dummy_config) @@ -98,7 +88,7 @@ def is_action_required_behavior(py_action): # Replicating control flow where implicit iteration has not converged and solver state needs to be restored # to a checkpoint precice.advance(self.dt) - Interface.is_time_window_complete = MagicMock(return_value=False) + Participant.is_time_window_complete = MagicMock(return_value=False) # Check if the checkpoint is stored correctly in the adapter self.assertEqual(precice.retrieve_checkpoint() == self.u_n_mocked, self.t, self.n) @@ -123,7 +113,6 @@ class TestExpressionHandling(TestCase): vector_function = interpolate(vector_expr, vector_V) n_vertices = 11 - fake_id = 15 vertices_x = [1 for _ in range(n_vertices)] vertices_y = np.linspace(0, 1, n_vertices) vertex_ids = np.arange(n_vertices) @@ -136,36 +125,54 @@ class Right(SubDomain): def inside(self, x, on_boundary): return near(x[0], 1.0) + def test_create_expression_scalar(self): + """ + Check if a sampling of points on a dolfin Function interpolated via FEniCS is matching with the sampling of the + same points on a FEniCS Expression created by the Adapter + """ + from precice import Participant + import fenicsprecice + + Participant.get_mesh_dimensions = MagicMock(return_value=2) + Participant.set_mesh_vertices = MagicMock(return_value=self.vertex_ids) + Participant.requires_mesh_connectivity_for = MagicMock(return_value=False) + Participant.requires_initial_data = MagicMock(return_value=False) + Participant.initialize = MagicMock() + 
Participant.write_data = MagicMock() + + right_boundary = self.Right() + + precice = fenicsprecice.Adapter(self.dummy_config) + precice._participant = Participant(None, None, None, None) + precice.initialize(right_boundary, self.scalar_V, self.scalar_function) + precice.create_coupling_expression() + # currently only a smoke tests. Is there a good way to test this? + def test_update_expression_scalar(self): """ Check if a sampling of points on a dolfin Function interpolated via FEniCS is matching with the sampling of the same points on a FEniCS Expression created by the Adapter """ - from precice import Interface + from precice import Participant import fenicsprecice - Interface.get_dimensions = MagicMock(return_value=2) - Interface.set_mesh_vertices = MagicMock(return_value=self.vertex_ids) - Interface.get_mesh_id = MagicMock() - Interface.get_data_id = MagicMock() - Interface.set_mesh_edge = MagicMock() - Interface.initialize = MagicMock() - Interface.initialize_data = MagicMock() - Interface.is_action_required = MagicMock() - Interface.mark_action_fulfilled = MagicMock() - Interface.write_block_scalar_data = MagicMock() + Participant.get_mesh_dimensions = MagicMock(return_value=2) + Participant.set_mesh_vertices = MagicMock(return_value=self.vertex_ids) + Participant.requires_mesh_connectivity_for = MagicMock(return_value=False) + Participant.requires_initial_data = MagicMock(return_value=False) + Participant.initialize = MagicMock() + Participant.write_data = MagicMock() right_boundary = self.Right() precice = fenicsprecice.Adapter(self.dummy_config) - precice._interface = Interface(None, None, None, None) + precice._participant = Participant(None, None, None, None) precice.initialize(right_boundary, self.scalar_V, self.scalar_function) values = np.array([self.scalar_function(x, y) for x, y in zip(self.vertices_x, self.vertices_y)]) data = {(x, y): v for x, y, v in zip(self.vertices_x, self.vertices_y, values)} - scalar_coupling_expr = 
         precice.create_coupling_expression()
-        precice.update_coupling_expression(scalar_coupling_expr, data)
+        precice.update_coupling_expression(scalar_coupling_expr, data)
 
         expr_samples = np.array([scalar_coupling_expr(x, y) for x, y in zip(self.samplepts_x, self.samplepts_y)])
         func_samples = np.array([self.scalar_function(x, y) for x, y in zip(self.samplepts_x, self.samplepts_y)])
@@ -176,24 +183,20 @@ def test_update_expression_vector(self):
         Check if a sampling of points on a dolfin Function interpolated via FEniCS is matching with the sampling
         of the same points on a FEniCS Expression created by the Adapter
         """
-        from precice import Interface
+        from precice import Participant
         import fenicsprecice
 
-        Interface.get_dimensions = MagicMock(return_value=2)
-        Interface.set_mesh_vertices = MagicMock(return_value=self.vertex_ids)
-        Interface.get_mesh_id = MagicMock()
-        Interface.get_data_id = MagicMock()
-        Interface.set_mesh_edge = MagicMock()
-        Interface.initialize = MagicMock()
-        Interface.initialize_data = MagicMock()
-        Interface.is_action_required = MagicMock()
-        Interface.mark_action_fulfilled = MagicMock()
-        Interface.write_block_vector_data = MagicMock()
+        Participant.get_mesh_dimensions = MagicMock(return_value=2)
+        Participant.set_mesh_vertices = MagicMock(return_value=self.vertex_ids)
+        Participant.requires_mesh_connectivity_for = MagicMock(return_value=False)
+        Participant.requires_initial_data = MagicMock(return_value=False)
+        Participant.initialize = MagicMock()
+        Participant.write_data = MagicMock()
 
         right_boundary = self.Right()
         precice = fenicsprecice.Adapter(self.dummy_config)
-        precice._interface = Interface(None, None, None, None)
+        precice._participant = Participant(None, None, None, None)
         precice.initialize(right_boundary, self.vector_V, self.vector_function)
         values = np.array([self.vector_function(x, y) for x, y in zip(self.vertices_x, self.vertices_y)])
         data = {(x, y): v for x, y, v in zip(self.vertices_x, self.vertices_y, values)}
@@ -294,4 +297,4 @@ def dirichlet_boundary(x, on_boundary):
         return on_boundary and abs(x[0]) < 1E-14
     for ps in forces_y:
         ps.apply(b_forces)
-    assert(np.allclose(b_dummy.get_local(), b_forces.get_local()))
+    assert (np.allclose(b_dummy.get_local(), b_forces.get_local()))
diff --git a/tests/integration/test_write_read.py b/tests/integration/test_write_read.py
index 49ffbbdb..bc11e848 100644
--- a/tests/integration/test_write_read.py
+++ b/tests/integration/test_write_read.py
@@ -7,6 +7,8 @@
 x_left, x_right = 0, 1
 y_bottom, y_top = 0, 1
 
+dummy_dt = 1
+
 
 class RightBoundary(SubDomain):
     def inside(self, x, on_boundary):
@@ -37,7 +39,6 @@ class TestWriteandReadData(TestCase):
     vector_function = interpolate(vector_expr, vector_V)
 
     n_vertices = 11
-    fake_id = 15
     vertices_x = [x_right for _ in range(n_vertices)]
     vertices_y = np.linspace(y_bottom, y_top, n_vertices)
 
@@ -45,32 +46,27 @@ def test_scalar_write(self):
         """
         Test to check if Adapter function write() passes correct parameters to the API function write_block_scalar_data()
         """
-        from precice import Interface
+        from precice import Participant
         import fenicsprecice
 
-        Interface.write_block_scalar_data = MagicMock()
-        Interface.get_dimensions = MagicMock(return_value=2)
-        Interface.get_mesh_id = MagicMock()
-        Interface.get_data_id = MagicMock(return_value=self.fake_id)
-        Interface.set_mesh_vertices = MagicMock(return_value=np.arange(self.n_vertices))
-        Interface.set_mesh_edge = MagicMock()
-        Interface.initialize = MagicMock()
-        Interface.is_action_required = MagicMock(return_value=False)
-        Interface.initialize_data = MagicMock()
+        Participant.write_data = MagicMock()
+        Participant.get_mesh_dimensions = MagicMock(return_value=2)
+        Participant.set_mesh_vertices = MagicMock(return_value=np.arange(self.n_vertices))
+        Participant.requires_mesh_connectivity_for = MagicMock(return_value=False)
+        Participant.requires_initial_data = MagicMock(return_value=False)
+        Participant.initialize = MagicMock()
 
         precice = fenicsprecice.Adapter(self.dummy_config)
-        precice._interface = Interface(None, None, None, None)
-        precice._write_data_id = self.fake_id
+        precice._participant = Participant(None, None, None, None)
         precice.initialize(RightBoundary(), self.scalar_V, self.scalar_function)
         precice.write_data(self.scalar_function)
 
-        expected_data_id = self.fake_id
         expected_values = np.array([self.scalar_expr(x_right, y) for y in self.vertices_y])
         expected_ids = np.arange(self.n_vertices)
-        expected_args = [expected_data_id, expected_ids, expected_values]
+        expected_args = [expected_ids, expected_values]
 
-        for arg, expected_arg in zip(Interface.write_block_scalar_data.call_args[0], expected_args):
+        for arg, expected_arg in zip(Participant.write_data.call_args[0], expected_args):
             if isinstance(arg, int):
                 self.assertTrue(arg == expected_arg)
             elif isinstance(arg, np.ndarray):
@@ -80,35 +76,29 @@ def test_vector_write(self):
         """
         Test to check if Adapter function write() passes correct parameters to the API function write_block_vector_data()
         """
-        from precice import Interface
+        from precice import Participant
         import fenicsprecice
-        from fenicsprecice.adapter_core import VertexType, Vertices, convert_fenics_to_precice
-
-        Interface.write_block_vector_data = MagicMock()
-        Interface.get_dimensions = MagicMock(return_value=self.dimension)
-        Interface.get_mesh_id = MagicMock()
-        Interface.get_data_id = MagicMock(return_value=self.fake_id)
-        Interface.set_mesh_vertices = MagicMock(return_value=np.arange(self.n_vertices))
-        Interface.set_mesh_edge = MagicMock()
-        Interface.initialize = MagicMock()
-        Interface.is_action_required = MagicMock(return_value=False)
-        Interface.initialize_data = MagicMock()
+
+        Participant.write_data = MagicMock()
+        Participant.get_mesh_dimensions = MagicMock(return_value=self.dimension)
+        Participant.set_mesh_vertices = MagicMock(return_value=np.arange(self.n_vertices))
+        Participant.requires_mesh_connectivity_for = MagicMock(return_value=False)
+        Participant.requires_initial_data = MagicMock(return_value=False)
+        Participant.initialize = MagicMock()
 
         precice = fenicsprecice.Adapter(self.dummy_config)
-        precice._interface = Interface(None, None, None, None)
-        precice._write_data_id = self.fake_id
+        precice._participant = Participant(None, None, None, None)
         precice.initialize(RightBoundary(), self.vector_V, self.vector_function)
         precice.write_data(self.vector_function)
 
-        expected_data_id = self.fake_id
         expected_values_x = np.array([self.vector_expr(x_right, y)[0] for y in np.linspace(y_bottom, y_top, 11)])
         expected_values_y = np.array([self.vector_expr(x_right, y)[1] for y in np.linspace(y_bottom, y_top, 11)])
         expected_values = np.stack([expected_values_x, expected_values_y], axis=1)
         expected_ids = np.arange(self.n_vertices)
-        expected_args = [expected_data_id, expected_ids, expected_values]
+        expected_args = [expected_ids, expected_values]
 
-        for arg, expected_arg in zip(Interface.write_block_vector_data.call_args[0], expected_args):
+        for arg, expected_arg in zip(Participant.write_data.call_args[0], expected_args):
             if isinstance(arg, int):
                 self.assertTrue(arg == expected_arg)
             elif isinstance(arg, np.ndarray):
@@ -121,39 +111,38 @@ def test_scalar_read(self):
         Test to check if Adapter function read() passes correct parameters to the API function read_block_scalar_data()
         Test to check if data return by API function read_block_scalar_data() is also returned by Adapter function read()
         """
-        from precice import Interface
+        from precice import Participant
         import fenicsprecice
 
         def return_dummy_data(n_points):
             data = np.arange(n_points)
             return data
 
-        Interface.read_block_scalar_data = MagicMock(return_value=return_dummy_data(self.n_vertices))
-        Interface.get_dimensions = MagicMock(return_value=self.dimension)
-        Interface.get_mesh_id = MagicMock()
-        Interface.get_data_id = MagicMock(return_value=self.fake_id)
-        Interface.set_mesh_vertices = MagicMock(return_value=np.arange(self.n_vertices))
-        Interface.set_mesh_edge = MagicMock()
-        Interface.initialize = MagicMock()
-        Interface.is_action_required = MagicMock(return_value=False)
-        Interface.initialize_data = MagicMock()
+        Participant.read_data = MagicMock(return_value=return_dummy_data(self.n_vertices))
+        Participant.get_mesh_dimensions = MagicMock(return_value=self.dimension)
+        Participant.set_mesh_vertices = MagicMock(return_value=np.arange(self.n_vertices))
+        Participant.requires_mesh_connectivity_for = MagicMock(return_value=False)
+        Participant.requires_initial_data = MagicMock(return_value=False)
+        Participant.initialize = MagicMock()
+        Participant.get_max_time_step_size = MagicMock(return_value=dummy_dt)
 
         precice = fenicsprecice.Adapter(self.dummy_config)
-        precice._interface = Interface(None, None, None, None)
-        precice._read_data_id = self.fake_id
+        precice._participant = Participant(None, None, None, None)
         precice.initialize(RightBoundary(), self.scalar_V)
-        read_data = precice.read_data()
+        dt = precice.get_max_time_step_size()
+        read_data = precice.read_data(dt)
 
-        expected_data_id = self.fake_id
         expected_ids = np.arange(self.n_vertices)
-        expected_args = [expected_data_id, expected_ids]
+        expected_args = ["Dummy-Mesh", "Dummy-Read", expected_ids, dummy_dt]
 
-        for arg, expected_arg in zip(Interface.read_block_scalar_data.call_args[0], expected_args):
-            if isinstance(arg, int):
+        for arg, expected_arg in zip(Participant.read_data.call_args[0], expected_args):
+            if isinstance(arg, int) or isinstance(arg, str):
                 self.assertTrue(arg == expected_arg)
             elif isinstance(arg, np.ndarray):
                 np.testing.assert_allclose(arg, expected_arg)
+            else:
+                self.fail(f"Unexpected combination of arg: {arg}, expected_arg: {expected_arg}")
 
         np.testing.assert_almost_equal(list(read_data.values()), return_dummy_data(self.n_vertices))
 
@@ -162,38 +151,37 @@ def test_vector_read(self):
         Test to check if Adapter function read() passes correct parameters to the API function read_block_vector_data()
         Test to check if data return by API function read_block_vector_data() is also returned by Adapter function read()
         """
-        from precice import Interface
+        from precice import Participant
         import fenicsprecice
 
         def return_dummy_data(n_points):
             data = np.arange(n_points * self.dimension).reshape(n_points, self.dimension)
             return data
 
-        Interface.read_block_vector_data = MagicMock(return_value=return_dummy_data(self.n_vertices))
-        Interface.get_dimensions = MagicMock(return_value=self.dimension)
-        Interface.get_mesh_id = MagicMock()
-        Interface.get_data_id = MagicMock(return_value=self.fake_id)
-        Interface.set_mesh_vertices = MagicMock(return_value=np.arange(self.n_vertices))
-        Interface.set_mesh_edge = MagicMock()
-        Interface.initialize = MagicMock()
-        Interface.is_action_required = MagicMock(return_value=False)
-        Interface.initialize_data = MagicMock()
+        Participant.read_data = MagicMock(return_value=return_dummy_data(self.n_vertices))
+        Participant.get_mesh_dimensions = MagicMock(return_value=self.dimension)
+        Participant.set_mesh_vertices = MagicMock(return_value=np.arange(self.n_vertices))
+        Participant.requires_mesh_connectivity_for = MagicMock(return_value=False)
+        Participant.requires_initial_data = MagicMock(return_value=False)
+        Participant.initialize = MagicMock()
+        Participant.get_max_time_step_size = MagicMock(return_value=dummy_dt)
 
         precice = fenicsprecice.Adapter(self.dummy_config)
-        precice._interface = Interface(None, None, None, None)
-        precice._read_data_id = self.fake_id
+        precice._participant = Participant(None, None, None, None)
         precice.initialize(RightBoundary(), self.vector_V)
-        read_data = precice.read_data()
+        dt = precice.get_max_time_step_size()
+        read_data = precice.read_data(dt)
 
-        expected_data_id = self.fake_id
         expected_ids = np.arange(self.n_vertices)
-        expected_args = [expected_data_id, expected_ids]
+        expected_args = ["Dummy-Mesh", "Dummy-Read", expected_ids, dummy_dt]
 
-        for arg, expected_arg in zip(Interface.read_block_vector_data.call_args[0], expected_args):
-            if isinstance(arg, int):
+        for arg, expected_arg in zip(Participant.read_data.call_args[0], expected_args):
+            if isinstance(arg, int) or isinstance(arg, str):
                 self.assertTrue(arg == expected_arg)
             elif isinstance(arg, np.ndarray):
                 np.testing.assert_allclose(arg, expected_arg)
+            else:
+                self.fail(f"Unexpected combination of arg: {arg}, expected_arg: {expected_arg}")
 
         np.testing.assert_almost_equal(list(read_data.values()), return_dummy_data(self.n_vertices))
diff --git a/tests/unit/test_adapter_core.py b/tests/unit/test_adapter_core.py
index c2b0c8a0..8775562e 100644
--- a/tests/unit/test_adapter_core.py
+++ b/tests/unit/test_adapter_core.py
@@ -29,10 +29,9 @@ def inside(self, x, on_boundary):
             if right_edge.inside(v.point(), True):
                 global_ids.append(v.global_index())
 
-        edge_vertex_ids1, edge_vertex_ids2, _ = get_coupling_boundary_edges(V, right_edge, global_ids, id_mapping)
+        edge_vertex_ids, _ = get_coupling_boundary_edges(V, right_edge, global_ids, id_mapping)
 
-        self.assertEqual(len(edge_vertex_ids1), 10)
-        self.assertEqual(len(edge_vertex_ids2), 10)
+        self.assertEqual(len(edge_vertex_ids), 10)
 
     def test_convert_fenics_to_precice(self):
         """

From e1182dc9a66dfbc9085aec65d04981134aee5a48 Mon Sep 17 00:00:00 2001
From: Benjamin Rodenberg
Date: Tue, 10 Oct 2023 20:25:50 +0200
Subject: [PATCH 08/14] Add troubleshooting hint for #154

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 7d0690e9..ebf31847 100644
--- a/README.md
+++ b/README.md
@@ -65,6 +65,7 @@ python3 -m unittest tests.test_write_read.TestWriteandReadData.test_vector_write
 **FEniCS is suddenly broken:** There are two known issues with preCICE, fenicsprecice and FEniCS:
 
 * If you see `ImportError: cannot import name 'sub_forms_by_domain'` run `pip3 uninstall -y fenics-ufl`. For details, refer to [issue #103](https://github.com/precice/fenics-adapter/issues/103).
+* If you see `ImportError: cannot import name 'cellname2facetname' from 'ufl.cell', refer to [issue #154](https://github.com/precice/fenics-adapter/issues/154).
 * If you see `ModuleNotFoundError: No module named 'dolfin'` and have installed PETSc from source, refer to [this forum post](https://fenicsproject.discourse.group/t/modulenotfounderror-no-module-named-dolfin-if-petsc-dir-is-set/4407). Short version: Try to use the PETSc that comes with your system, if possible. Note that you can also [compile preCICE without PETSc](https://www.precice.org/installation-source-configuration.html), if necessary.
 
 If this does not help, you can contact us on [gitter](https://gitter.im/precice/lobby) or [open an issue](https://github.com/precice/fenics-adapter/issues/new).

From 6f998598aac79f923235bf1288fd9dd28ae07ef6 Mon Sep 17 00:00:00 2001
From: Benjamin Rodenberg
Date: Tue, 10 Oct 2023 20:26:26 +0200
Subject: [PATCH 09/14] Fix formatting

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index ebf31847..568bc8fe 100644
--- a/README.md
+++ b/README.md
@@ -65,7 +65,7 @@ python3 -m unittest tests.test_write_read.TestWriteandReadData.test_vector_write
 **FEniCS is suddenly broken:** There are two known issues with preCICE, fenicsprecice and FEniCS:
 
 * If you see `ImportError: cannot import name 'sub_forms_by_domain'` run `pip3 uninstall -y fenics-ufl`. For details, refer to [issue #103](https://github.com/precice/fenics-adapter/issues/103).
-* If you see `ImportError: cannot import name 'cellname2facetname' from 'ufl.cell', refer to [issue #154](https://github.com/precice/fenics-adapter/issues/154).
+* If you see `ImportError: cannot import name 'cellname2facetname' from 'ufl.cell'`, refer to [issue #154](https://github.com/precice/fenics-adapter/issues/154).
 * If you see `ModuleNotFoundError: No module named 'dolfin'` and have installed PETSc from source, refer to [this forum post](https://fenicsproject.discourse.group/t/modulenotfounderror-no-module-named-dolfin-if-petsc-dir-is-set/4407). Short version: Try to use the PETSc that comes with your system, if possible. Note that you can also [compile preCICE without PETSc](https://www.precice.org/installation-source-configuration.html), if necessary.
 
 If this does not help, you can contact us on [gitter](https://gitter.im/precice/lobby) or [open an issue](https://github.com/precice/fenics-adapter/issues/new).

From 102022d5c3cdd6452bb0b5ab3900d20ec8508bca Mon Sep 17 00:00:00 2001
From: Ishaan Desai
Date: Mon, 5 Feb 2024 14:26:06 +0100
Subject: [PATCH 10/14] Use PyPI Token in the publishing action

---
 .github/workflows/pythonpublish.yml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/pythonpublish.yml b/.github/workflows/pythonpublish.yml
index e596c578..275817cd 100644
--- a/.github/workflows/pythonpublish.yml
+++ b/.github/workflows/pythonpublish.yml
@@ -20,8 +20,8 @@ jobs:
         pip install setuptools wheel twine
     - name: Build and publish
       env:
-        TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
-        TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
+        TWINE_USERNAME: __token__
+        TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
       run: |
         python setup.py sdist
         twine upload dist/*

From 6faa44d840261ca79d6d6faa3f90c627f65450c2 Mon Sep 17 00:00:00 2001
From: Benjamin Rodenberg
Date: Mon, 12 Feb 2024 12:24:48 +0100
Subject: [PATCH 11/14] Remove fields from citation cff that are hard to maintain during a release cycle.

---
 CITATION.cff | 2 --
 1 file changed, 2 deletions(-)

diff --git a/CITATION.cff b/CITATION.cff
index ef7b5316..12a05de8 100644
--- a/CITATION.cff
+++ b/CITATION.cff
@@ -32,9 +32,7 @@ abstract: >-
   preCICE-adapter for the open source computing
   platform FEniCS.
 license: LGPL-3.0
-commit: ' 9aa3e22'
 version: 1.4.0
-date-released: '2022-09-22'
 preferred-citation:
   title: "FEniCS-preCICE: Coupling FEniCS to other Simulation Software"
   type: "article"

From 9cd721e168f1b3823999f0ae1eca169fe374e8c8 Mon Sep 17 00:00:00 2001
From: Benjamin Rodenberg
Date: Mon, 12 Feb 2024 13:02:23 +0100
Subject: [PATCH 12/14] Update test to new tutorial structure.
---
 .github/workflows/run-tutorials.yml | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/.github/workflows/run-tutorials.yml b/.github/workflows/run-tutorials.yml
index 699aa3e1..67424c84 100644
--- a/.github/workflows/run-tutorials.yml
+++ b/.github/workflows/run-tutorials.yml
@@ -31,8 +31,8 @@ jobs:
       run: git clone -b develop https://github.com/precice/tutorials.git
     - name: Run tutorial
       run: |
-        cd tutorials/partitioned-heat-conduction/fenics
-        ./run.sh -d & ./run.sh -n
+        cd tutorials/partitioned-heat-conduction
+        cd dirichlet-fenics && ./run.sh & cd neumann-fenics && ./run.sh
 
   run_ht_complex:
     name: Run HT, complex
@@ -56,5 +56,5 @@
       run: git clone -b develop https://github.com/precice/tutorials.git
     - name: Run tutorial
       run: |
-        cd tutorials/partitioned-heat-conduction-complex/fenics
-        ./run.sh -d & ./run.sh -n
+        cd tutorials/partitioned-heat-conduction-complex
+        cd dirichlet-fenics && ./run.sh & cd neumann-fenics && ./run.sh

From 53c09bcaa0b3e0bb940ab6745224715139e85881 Mon Sep 17 00:00:00 2001
From: Benjamin Rodenberg
Date: Mon, 12 Feb 2024 12:23:06 +0100
Subject: [PATCH 13/14] Bump version.

---
 CHANGELOG.md | 2 +-
 CITATION.cff | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 9c7601f9..538cc7c2 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,6 +1,6 @@
 # FEniCS-preCICE adapter changelog
 
-## latest
+## 2.0.0
 
 * Drop support for preCICE 2.x version, as this is a breaking release.
 * Update adapter to use preCICE v3 API [#153](https://github.com/precice/fenics-adapter/pull/153).
diff --git a/CITATION.cff b/CITATION.cff
index 12a05de8..a1de507f 100644
--- a/CITATION.cff
+++ b/CITATION.cff
@@ -32,7 +32,7 @@ abstract: >-
   preCICE-adapter for the open source computing
   platform FEniCS.
 license: LGPL-3.0
-version: 1.4.0
+version: 2.0.0
 preferred-citation:
   title: "FEniCS-preCICE: Coupling FEniCS to other Simulation Software"
   type: "article"

From 47dc07ec6796e0e038621a013ecb2907260f50a9 Mon Sep 17 00:00:00 2001
From: Benjamin Rodenberg
Date: Mon, 12 Feb 2024 16:14:11 +0100
Subject: [PATCH 14/14] Update setup.py

Co-authored-by: Ishaan Desai

---
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index 5e5f430a..d63a54c4 100644
--- a/setup.py
+++ b/setup.py
@@ -39,6 +39,6 @@
       author_email='info@precice.org',
       license='LGPL-3.0',
       packages=['fenicsprecice'],
-      install_requires=['pyprecice>=3.0.0.0.dev0', 'scipy', 'numpy>=1.13.3', 'mpi4py'],
+      install_requires=['pyprecice>=3.0.0.0', 'scipy', 'numpy>=1.13.3', 'mpi4py'],
      test_suite='tests',
      zip_safe=False)
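Taken together, these patches migrate the adapter tests from the preCICE v2 `Interface` API to the v3 `Participant` API (string-addressed meshes and data, `read_data(dt)`, `get_max_time_step_size()`). The mocking pattern they rely on — patching methods on the class *before* the instance exists, then inspecting `call_args` — can be sketched without preCICE installed. The `Participant` class below is a hypothetical stand-in with v3-like signatures, not the real pyprecice binding:

```python
from unittest.mock import MagicMock


class Participant:
    """Hypothetical stand-in for precice.Participant (v3-style signatures)."""

    def __init__(self, name, config_file, rank, size):
        self._name = name

    def get_max_time_step_size(self):
        raise NotImplementedError  # the real call needs a running coupling

    def read_data(self, mesh_name, data_name, vertex_ids, dt):
        raise NotImplementedError  # the real call needs a running coupling


# Patch on the class, as the adapter tests do: any instance created later
# (e.g. inside the code under test) picks up the mocked methods.
Participant.get_max_time_step_size = MagicMock(return_value=1.0)
Participant.read_data = MagicMock(return_value=[0.0, 0.5, 1.0])

participant = Participant("Solver", "precice-config.xml", 0, 1)
dt = participant.get_max_time_step_size()
values = participant.read_data("Dummy-Mesh", "Dummy-Read", [0, 1, 2], dt)

# call_args[0] holds the positional arguments the mocked call received;
# the tests compare these against mesh name, data name, vertex ids and dt.
mesh_name, data_name, vertex_ids, got_dt = Participant.read_data.call_args[0]
assert (mesh_name, data_name) == ("Dummy-Mesh", "Dummy-Read")
assert vertex_ids == [0, 1, 2] and got_dt == 1.0
print("read", values, "with dt =", got_dt)
```

Patching at class level (rather than on one instance) is what lets the tests construct `Participant(None, None, None, None)` and assign it to `precice._participant` after the adapter is built, while still asserting on the arguments the adapter eventually forwards to the API.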