diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index fb0d68407..568201d4f 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -52,11 +52,11 @@ To work with the repo and contribute changes, the basic process is as follows:

 ## Getting Started

-In order to be able to contribute code changes, you will first need to [create a Github fork](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/fork-a-repo) of the official OWP repo. 
+In order to be able to contribute code changes, you will first need to [create a Github fork](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/fork-a-repo) of the official OWP repo.

-Next, set up your authentication mechanism with Github for your command line (or IDE). You can either [create an SSH key pair](https://docs.github.com/en/authentication/connecting-to-github-with-ssh/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent) and [add the public key](https://docs.github.com/en/authentication/connecting-to-github-with-ssh/adding-a-new-ssh-key-to-your-github-account) to your Github account, or you can set up a [Personal Access Token](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#using-a-personal-access-token-on-the-command-line) if you plan to clone the repo locally via HTTPS. 
+Next, set up your authentication mechanism with Github for your command line (or IDE). You can either [create an SSH key pair](https://docs.github.com/en/authentication/connecting-to-github-with-ssh/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent) and [add the public key](https://docs.github.com/en/authentication/connecting-to-github-with-ssh/adding-a-new-ssh-key-to-your-github-account) to your Github account, or you can set up a [Personal Access Token](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#using-a-personal-access-token-on-the-command-line) if you plan to clone the repo locally via HTTPS.

-After that, [clone a local development repo](https://docs.github.com/en/repositories/creating-and-managing-repositories/cloning-a-repository) from your fork, using a command similar to one of the following: 
+After that, [clone a local development repo](https://docs.github.com/en/repositories/creating-and-managing-repositories/cloning-a-repository) from your fork, using a command similar to one of the following:

     # SSH-based clone command. Change URL to match your fork as appropriate
     git clone git@github.com:your_user/DMOD.git dmod

@@ -64,7 +64,7 @@ After that, [clone a local development repo](https://docs.github.com/en/reposito

     # HTTPS-based clone command. Change URL to match your fork as appropriate
     git clone https://github.com/your_user/DMOD.git dmod

-You can now change directories into the local repo, which will have the default branch - `master` - checked out. 
+You can now change directories into the local repo, which will have the default branch - `master` - checked out.

     # Move into the repo directory "dmod"
     cd dmod

@@ -72,7 +72,7 @@ You can now change directories into the local repo, which will have the default

     # You can verify the branch by examining the output of ...
     git status

-> [!IMPORTANT] 
+> [!IMPORTANT]
 > Git will add a [Git remote](https://git-scm.com/book/en/v2/Git-Basics-Working-with-Remotes) named `origin` to the clone's configuration that points to the cloned-from repo.
 > Because of this, the recommended convention is to clone your local repo(s) from your personal fork, thus making `origin` point to your fork. This is assumed to be the case in other parts of the documentation.

 Next, add the upstream OWP DMOD repo as a second remote for the local clone. The standard convention used in this doc and elsewhere is to name that remote `upstream`. Doing the addition will look something like:

@@ -130,7 +130,7 @@ Especially if making more frequent, smaller commits as suggested above, it is a

     # The fetch is probably unnecessary unless you work from multiple local repos
     git fetch
-    
+
     # Assuming your branch of interest is still checked out:
     git status

@@ -143,7 +143,7 @@ Once a code contribution is finished, make sure all changes have been pushed to

 #### PR Review and Requested Revisions

-Once the PR is submitted, it will be reviewed by one or more other repo contributors. Often conversations will be had within the Github PR if reviewers have questions or request revisions be made to the proposed changes. If revisions are requested, you will need to make those in your locally copy of the feature/fix branch, and then re-push that branch (and the updates) to your personal fork. Then, use the PR page in Github to re-request review. 
+Once the PR is submitted, it will be reviewed by one or more other repo contributors. Conversations will often take place within the Github PR if reviewers have questions or request revisions to the proposed changes. If revisions are requested, you will need to make those in your local copy of the feature/fix branch, and then re-push that branch (and the updates) to your personal fork. Then, use the PR page in Github to re-request review.

 ## Keeping Forks Up to Date
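The CONTRIBUTING.md hunks above reference adding an `upstream` remote but are cut off before the example command they introduce. A minimal sketch of that step might look like the following; the official repo URL shown is an assumption (taken to be the NOAA-OWP GitHub organization), not quoted from the diff:

```bash
# Add the official OWP repo as a second remote named "upstream" (URL assumed, not from the diff)
git remote add upstream https://github.com/NOAA-OWP/DMOD.git

# Verify that both "origin" (your fork) and "upstream" are now configured
git remote -v
```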
diff --git a/doc/GIT_USAGE.md b/doc/GIT_USAGE.md
index 87f671d5c..744c2ff88 100644
--- a/doc/GIT_USAGE.md
+++ b/doc/GIT_USAGE.md
@@ -10,17 +10,17 @@ Note that this document goes into detail on the Git strategy and branching model

 ## Branching Model

-- The DMOD repo uses a branching model based on [Gitflow](https://nvie.com/posts/a-successful-git-branching-model/) that has two primary long-term branches: 
+- The DMOD repo uses a branching model based on [Gitflow](https://nvie.com/posts/a-successful-git-branching-model/) that has two primary long-term branches:
   - **master**: the main development and integration branch containing the latest completed development work intended for the next released version
   - **production**: the branch representing the latest code verified as production-ready and pointing to the most recently released, official version
 - Rebasing is used to integrate changes across branches, rather than merge commits
   - This allows the repo to maintain a more robust and complete history
 - Most interaction with the official OWP DMOD repo is done via pull requests (PRs) to the `master` branch
   - Independent branches for features or bug fixes are created off `master` to contain development work that is in progress
-  - Once work in a feature/fix branch is complete (or at least thought complete), it is used to create a PR 
+  - Once work in a feature/fix branch is complete (or at least thought complete), it is used to create a PR
   - PRs and their linked branches are reviewed and, once approved, have their changes integrated back into `master`
   - Typically feature/fix branches exist in personal clones and personal Github forks, but not in the official OWP repo
-- Release branches (e.g., `release-X` for pending version `X`) will be created whenever it is time to officially 
-release a new version
+- Release branches (e.g., `release-X` for pending version `X`) will be created whenever it is time to officially release a new version
   - These effectively are release candidates, with branches created from `master`
   - The release branches are managed by the core OWP contributors team
   - They do exist in the official OWP repo
@@ -30,7 +30,7 @@ Note that this document goes into detail on the Git strategy and branching model

 ### Feature Branches from `master`

 This illustrates the relationship between feature branches and `master`. They should be created from `master` and independently contain commits from their feature. Once done, the changes will be reintegrated back into `master` via rebasing.

-```mermaid 
+```mermaid
 %%{init: { 'logLevel': 'debug', 'theme': 'base', 'gitGraph': { 'showBranches': true, 'showCommitLabel':true, 'mainBranchName': 'master'}}}%%
 gitGraph
     commit id:"feature1.1"
@@ -52,7 +52,7 @@ This illustrates the relationship between feature branches and `master`. They s

 The resulting state of `master` after rebasing the two new feature branches would be:

-```mermaid 
+```mermaid
 %%{init: { 'logLevel': 'debug', 'theme': 'base', 'gitGraph': { 'showBranches': true, 'showCommitLabel':true, 'mainBranchName': 'master'}}}%%
 gitGraph
     commit id:"feature1.1"
@@ -69,7 +69,7 @@ The resulting state of `master` after rebasing the two new feature branches woul

 This illustrates the relationship between `production`, `master`, and `release-v2`. Notice that `production` has already been tagged with version `v1` at the start. Commits for `feature1` and `feature2` at some point are integrated into `master`. When it is time to prepare to release version `v2`, `release-v2` is created. A few bug fix commits were needed in `release-v2`. After that, all the changes in `release-v2` are integrated into `production`, and `production` is tagged `v2`. All the changes are also integrated back into `master`.

-```mermaid 
+```mermaid
 %%{init: { 'logLevel': 'debug', 'theme': 'base', 'gitGraph': { 'showBranches': true, 'showCommitLabel':true, 'mainBranchName': 'master'}}}%%
 gitGraph
     commit id:"v1-commit"
@@ -95,7 +95,7 @@ This illustrates the relationship between `production`, `master`, and `release-v

 The resulting state of `production` is:

-```mermaid 
+```mermaid
 %%{init: { 'logLevel': 'debug', 'theme': 'base', 'gitGraph': { 'showBranches': true, 'showCommitLabel':true, 'mainBranchName': 'production'}}}%%
 gitGraph
     commit id:"v1-commit" tag:"v1"
@@ -110,7 +110,7 @@ The resulting state of `production` is:

 The resulting state of `master` is essentially the same:

-```mermaid 
+```mermaid
 %%{init: { 'logLevel': 'debug', 'theme': 'base', 'gitGraph': { 'showBranches': true, 'showCommitLabel':true, 'mainBranchName': 'master'}}}%%
 gitGraph
     commit id:"v1-commit"
@@ -160,7 +160,7 @@ Once the utility is available, install the _pre-commit_-configured hook scripts
 pre-commit install
 ```

-The hook scripts will now run when code is committed. 
+The hook scripts will now run when code is committed.

 Alternatively, you can run the hook scripts manually via:

@@ -168,4 +168,4 @@ Alternatively, you can run the hook scripts manually via:
 pre-commit run --all-files
 ```

-For more information, see [_pre-commit_'s documentation](https://pre-commit.com/index.html).
\ No newline at end of file
+For more information, see [_pre-commit_'s documentation](https://pre-commit.com/index.html).
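As a companion to the branching-model edits above, here is a sketch of the rebase-based feature workflow those bullets describe. The remote names follow the conventions in CONTRIBUTING.md; the feature branch name is illustrative:

```bash
# Start a feature branch from the current tip of master (branch name illustrative)
git checkout master
git pull upstream master
git checkout -b feature/my-change

# ... make commits on the feature branch ...

# Replay the feature commits onto the latest upstream master before opening a PR
git fetch upstream
git rebase upstream/master feature/my-change
```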
diff --git a/doc/RELEASE_MANAGEMENT.md b/doc/RELEASE_MANAGEMENT.md
index 8802f4575..0b26b5ca5 100644
--- a/doc/RELEASE_MANAGEMENT.md
+++ b/doc/RELEASE_MANAGEMENT.md
@@ -39,11 +39,11 @@ The release process for DMOD can be summarized fairly simply:

 # Versions

-The versioning for DMOD is a little complicated. 
+The versioning for DMOD is a little complicated.

-DMOD contains the sources of several independently-versioned Python packages; e.g., `dmod-core-0.19.0`. As long as this code remains organized as multiple separate packages, the package versions need to continue to be maintained individually. 
+DMOD contains the sources of several independently-versioned Python packages; e.g., `dmod-core-0.19.0`. As long as this code remains organized as multiple separate packages, the package versions need to continue to be maintained individually.

-DMOD contains other source code wholly separate from these package, such as helper scripts, Dockerfiles, stack configurations, and other files. These are not contained in some inner organizational unit with its own versioning, and many (if not all) of them are particularly relevant to DMOD deployment. 
+DMOD contains other source code wholly separate from these packages, such as helper scripts, Dockerfiles, stack configurations, and other files. These are not contained in some inner organizational unit with its own versioning, and many (if not all) of them are particularly relevant to DMOD deployment.

 As such, DMOD utilizes another, independent versioning scheme for itself as a whole.
diff --git a/docker/main/ngen/customize/README.md b/docker/main/ngen/customize/README.md
index 9f2338cc8..ee8fb2b17 100644
--- a/docker/main/ngen/customize/README.md
+++ b/docker/main/ngen/customize/README.md
@@ -1,6 +1,6 @@
 # Locally Customizing the ngen Worker

-It simply isn't possible to bundle every possible ngen-compatible BMI module into the job worker image Dockerfile. As such, the source Dockerfile doesn't try to include everything and only integrates a small number of OWP-developed BMI modules. 
+It simply isn't possible to bundle every ngen-compatible BMI module into the job worker image Dockerfile. As such, the source Dockerfile doesn't try to include everything and only integrates a small number of OWP-developed BMI modules.

 But, it is possible to use other BMI modules outside this small subset in the DMOD job worker images.

@@ -15,7 +15,7 @@ In summary, it is possible to:

 ## Supply `requirements.txt` File

-If a `requirements.txt` file is present within this directory, it will be used by an additional call to `pip` during the image build process, installing the Python packages listed within the file. This is likely the easiest way to incorporate more Python BMI modules, as long as they are publicly accessible. 
+If a `requirements.txt` file is present within this directory, it will be used by an additional call to `pip` during the image build process, installing the Python packages listed within the file. This is likely the easiest way to incorporate more Python BMI modules, as long as they are publicly accessible.

 Keep in mind that, even if ready-made packages are not available via something like PyPI, `pip` supports [installing directly from version control systems](https://pip.pypa.io/en/stable/topics/vcs-support/) like Git.

@@ -30,12 +30,12 @@ For this to work for a provided Git repo, a few conditions must hold true:
 - the Git repository must be accessible at the given URL anonymously
 - the script doesn't provide a branch, so whatever the default branch is (e.g., `master` or `main`) must be suitable
 - the contents of the repo must be set up to build with **CMake**
-- no extra, deliberate configuration of **CMake** variables should be necessary 
+- no extra, deliberate configuration of **CMake** variables should be necessary
   - (except `CMAKE_BUILD_TYPE` and `CMAKE_INSTALL_PREFIX`, which are pre-set in the script)
 - running `cmake --build ` will build anything/everything required
   - i.e., it must not be necessary to build a specific **CMake** `target`

-## Use Manual Customization Script 
+## Use Manual Customization Script

 If the above methods are insufficient, it is possible to write a more bespoke script for configuring whatever customization is needed within the image, while also avoiding committing this script directly to the repo. This allows for finer-grained control but also puts more responsibility on the user. To do this:

@@ -45,4 +45,4 @@ If the above methods are insufficient, it is possible to write a more bespoke sc
    1. `/dmod/bin/` for executables
    2. `/dmod/bmi_module_data/` for static module data and configs
    3. `/dmod/shared_libs/` for compiled shared libraries
-   4. Note that there is also a Python virtual environment at `/dmod/venv/`, though this should be active in the environment when the script is run; i.e., installing packages using `pip` should get things there without any extra steps
\ No newline at end of file
+   4. Note that there is also a Python virtual environment at `/dmod/venv/`, though this should already be active in the environment when the script is run; i.e., installing packages using `pip` should get things there without any extra steps
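To make the `requirements.txt` mechanism in the README changes above concrete, a hypothetical file might mix an ordinary PyPI requirement with a direct-from-Git install via pip's VCS support. Both package names and the URL below are invented for illustration:

```text
# requirements.txt -- hypothetical contents; names and URL are placeholders
some-bmi-module>=1.2.0
git+https://github.com/example/other-bmi-module.git@v0.4.0
```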
diff --git a/python/lib/client/dmod/client/__main__.py b/python/lib/client/dmod/client/__main__.py
index 991263f32..b26d146f3 100644
--- a/python/lib/client/dmod/client/__main__.py
+++ b/python/lib/client/dmod/client/__main__.py
@@ -46,6 +46,8 @@ def _create_ngen_based_exec_parser(subcommand_container: Any, parser_name: str,
         The newly created and associated subparser.
     """
     new_parser = subcommand_container.add_parser(parser_name)
+    new_parser.add_argument('--worker-version', dest='worker_version', default="latest",
+                            help="Specify version of worker (e.g., Docker image tag) to use.")
     new_parser.add_argument('--partition-config-data-id', dest='partition_cfg_data_id', default=None,
                             help='Provide data_id for desired partition config dataset.')
     paradigms = [p for p in AllocationParadigm]
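With the new `--worker-version` option above, a client invocation could pin the worker image tag instead of accepting the `latest` default. The sketch below is hypothetical: the subcommand name and remaining arguments are defined elsewhere in `__main__.py` and are only placeholders here:

```bash
# Hypothetical invocation; only --worker-version comes from the diff above
python -m dmod.client exec ngen --worker-version 0.2.0 [other args...]
```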
""" new_parser = subcommand_container.add_parser(parser_name) + new_parser.add_argument('--worker-version', dest='worker_version', default="latest", + help="Specify version of worker (e.g., Docker image tag) to use.") new_parser.add_argument('--partition-config-data-id', dest='partition_cfg_data_id', default=None, help='Provide data_id for desired partition config dataset.') paradigms = [p for p in AllocationParadigm] diff --git a/python/lib/client/dmod/client/_version.py b/python/lib/client/dmod/client/_version.py index e4e49b3bb..9d1bb721b 100644 --- a/python/lib/client/dmod/client/_version.py +++ b/python/lib/client/dmod/client/_version.py @@ -1 +1 @@ -__version__ = '0.9.0' +__version__ = '0.10.0' diff --git a/python/lib/client/pyproject.toml b/python/lib/client/pyproject.toml index 60eba7c86..738fd7608 100644 --- a/python/lib/client/pyproject.toml +++ b/python/lib/client/pyproject.toml @@ -12,7 +12,7 @@ dependencies = [ "dmod.core>=0.17.0", "websockets>=8.1", "pydantic>=1.10.8,~=1.10", - "dmod.communication>=0.20.0", + "dmod.communication>=0.22.0", "dmod.externalrequests>=0.6.0", "dmod.modeldata>=0.12.0", ] diff --git a/python/lib/communication/dmod/communication/_version.py b/python/lib/communication/dmod/communication/_version.py index 8c306aa66..81edede8b 100644 --- a/python/lib/communication/dmod/communication/_version.py +++ b/python/lib/communication/dmod/communication/_version.py @@ -1 +1 @@ -__version__ = '0.21.1' +__version__ = '0.22.0' diff --git a/python/lib/communication/dmod/communication/maas_request/ngen/abstract_nextgen_request.py b/python/lib/communication/dmod/communication/maas_request/ngen/abstract_nextgen_request.py index a56217a22..0f343818c 100644 --- a/python/lib/communication/dmod/communication/maas_request/ngen/abstract_nextgen_request.py +++ b/python/lib/communication/dmod/communication/maas_request/ngen/abstract_nextgen_request.py @@ -1,7 +1,7 @@ from .partial_realization_config import PartialRealizationConfig from abc import ABC, abstractmethod from typing import List, Optional -from pydantic import PrivateAttr +from pydantic import Field, PrivateAttr from dmod.core.meta_data import ( DataCategory, @@ -37,6 +37,7 @@ class AbstractNgenRequest(DmodJobRequest, ABC): """ request_body: NGENRequestBody + worker_version: str = Field("latest", description="The desired version of the applicable worker for the request.") _hydrofabric_data_requirement = PrivateAttr(None) _forcing_data_requirement = PrivateAttr(None) diff --git a/python/lib/communication/dmod/test/test_ngen_request.py b/python/lib/communication/dmod/test/test_ngen_request.py index 0ad0aaca8..e97a6e80d 100644 --- a/python/lib/communication/dmod/test/test_ngen_request.py +++ b/python/lib/communication/dmod/test/test_ngen_request.py @@ -36,7 +36,7 @@ def create_time_range(begin, end, var=None) -> TimeRange: '{"bmi_config_data_id": "02468", "composite_config_data_id": "composite02468", "hydrofabric_data_id": ' '"9876543210", "hydrofabric_uid": "0123456789", "partition_config_data_id": "part1234", ' '"realization_config_data_id": "02468", "time_range": ' + time_range.to_json() + '}, ' - '"session_secret": "f21f27ac3d443c0948aab924bddefc64891c455a756ca77a4d86ec2f697cd13c"}') + '"session_secret": "f21f27ac3d443c0948aab924bddefc64891c455a756ca77a4d86ec2f697cd13c", "worker_version": "latest"}') self.request_jsons.append({ 'allocation_paradigm': 'SINGLE_NODE', 'cpu_count': cpu_count_ex_0, @@ -51,7 +51,8 @@ def create_time_range(begin, end, var=None) -> TimeRange: 'realization_config_data_id': '02468', 
diff --git a/python/lib/communication/dmod/test/test_scheduler_request_message.py b/python/lib/communication/dmod/test/test_scheduler_request_message.py
index a2a2a8adc..52fe1b3db 100644
--- a/python/lib/communication/dmod/test/test_scheduler_request_message.py
+++ b/python/lib/communication/dmod/test/test_scheduler_request_message.py
@@ -92,7 +92,7 @@ def setUp(self) -> None:
                                      "end": "2012-05-31 23:00:00",
                                      "subclass": "TimeRange",
                                      "variable": "TIME"})
-        raw_json_str_1 = '{"allocation_paradigm": "SINGLE_NODE", "cpus": ' + str(cpu_count_ex_1) + ', "mem": ' + str(memory_ex_1) + ', "model_request": {"allocation_paradigm": "ROUND_ROBIN", "cpu_count": ' + str(cpu_count_ex_1) + ', "job_type": "ngen", "memory": ' + str(memory_ex_1) + ', "request_body": {"bmi_config_data_id": "simple-bmi-cfe-1", "hydrofabric_data_id": "hydrofabric-huc01-copy-288", "hydrofabric_uid": "72c2a0220aa7315b50e55b6c5b68f927ac1d9b81", "realization_config_data_id": "huc01-simple-realization-config-1", "time_range": ' + str(time_range) +'}, "session_secret": "675b2f8826f69f97c01fe4d7add30420322cd21a790ddc68a5b3c149966de919"}, "user_id": "someone"}'
+        raw_json_str_1 = '{"allocation_paradigm": "SINGLE_NODE", "cpus": ' + str(cpu_count_ex_1) + ', "mem": ' + str(memory_ex_1) + ', "model_request": {"allocation_paradigm": "ROUND_ROBIN", "cpu_count": ' + str(cpu_count_ex_1) + ', "job_type": "ngen", "memory": ' + str(memory_ex_1) + ', "request_body": {"bmi_config_data_id": "simple-bmi-cfe-1", "hydrofabric_data_id": "hydrofabric-huc01-copy-288", "hydrofabric_uid": "72c2a0220aa7315b50e55b6c5b68f927ac1d9b81", "realization_config_data_id": "huc01-simple-realization-config-1", "time_range": ' + str(time_range) +'}, "session_secret": "675b2f8826f69f97c01fe4d7add30420322cd21a790ddc68a5b3c149966de919", "worker_version": "latest"}, "user_id": "someone"}'
         raw_json_obj_1 = json.loads(raw_json_str_1)
         sorted_json_str_1 = json.dumps(raw_json_obj_1, sort_keys=True)
         self.request_strings.append(sorted_json_str_1)
@@ -113,7 +113,8 @@ def setUp(self) -> None:
                     "realization_config_data_id": "huc01-simple-realization-config-1",
                     "time_range": time_range.to_dict()
                 },
-                "session_secret": "675b2f8826f69f97c01fe4d7add30420322cd21a790ddc68a5b3c149966de919"
+                "session_secret": "675b2f8826f69f97c01fe4d7add30420322cd21a790ddc68a5b3c149966de919",
+                "worker_version": "latest"
             },
             "user_id": "someone"
         })
@@ -143,7 +144,7 @@ def setUp(self) -> None:
                                      "end": "2012-05-31 23:00:00",
                                      "subclass": "TimeRange",
                                      "variable": "TIME"})
-        raw_json_str_2 = '{"allocation_paradigm": "SINGLE_NODE", "cpus": ' + str(cpu_count_ex_2) + ', "model_request": {"allocation_paradigm": "ROUND_ROBIN", "cpu_count": ' + str(cpu_count_ex_2) + ', "job_type": "ngen", "memory": ' + str(memory_ex_2) + ', "request_body": {"bmi_config_data_id": "simple-bmi-cfe-1", "hydrofabric_data_id": "hydrofabric-huc01-copy-288", "hydrofabric_uid": "72c2a0220aa7315b50e55b6c5b68f927ac1d9b81", "realization_config_data_id": "huc01-simple-realization-config-1", "time_range": ' + str(time_range) +'}, "session_secret": "675b2f8826f69f97c01fe4d7add30420322cd21a790ddc68a5b3c149966de919"}, "user_id": "someone"}'
+        raw_json_str_2 = '{"allocation_paradigm": "SINGLE_NODE", "cpus": ' + str(cpu_count_ex_2) + ', "model_request": {"allocation_paradigm": "ROUND_ROBIN", "cpu_count": ' + str(cpu_count_ex_2) + ', "job_type": "ngen", "memory": ' + str(memory_ex_2) + ', "request_body": {"bmi_config_data_id": "simple-bmi-cfe-1", "hydrofabric_data_id": "hydrofabric-huc01-copy-288", "hydrofabric_uid": "72c2a0220aa7315b50e55b6c5b68f927ac1d9b81", "realization_config_data_id": "huc01-simple-realization-config-1", "time_range": ' + str(time_range) +'}, "session_secret": "675b2f8826f69f97c01fe4d7add30420322cd21a790ddc68a5b3c149966de919", "worker_version": "latest"}, "user_id": "someone"}'
         raw_json_obj_2 = json.loads(raw_json_str_2)
         sorted_json_str_2 = json.dumps(raw_json_obj_2, sort_keys=True)
         self.request_strings.append(sorted_json_str_2)
@@ -164,7 +165,8 @@ def setUp(self) -> None:
                     "realization_config_data_id": "huc01-simple-realization-config-1",
                     "time_range": time_range.to_dict()
                 },
-                "session_secret": "675b2f8826f69f97c01fe4d7add30420322cd21a790ddc68a5b3c149966de919"
+                "session_secret": "675b2f8826f69f97c01fe4d7add30420322cd21a790ddc68a5b3c149966de919",
+                "worker_version": "latest"
             },
             "user_id": "someone"
         })
diff --git a/python/lib/scheduler/dmod/scheduler/_version.py b/python/lib/scheduler/dmod/scheduler/_version.py
index 2d7893e3d..ef9199407 100644
--- a/python/lib/scheduler/dmod/scheduler/_version.py
+++ b/python/lib/scheduler/dmod/scheduler/_version.py
@@ -1 +1 @@
-__version__ = '0.13.0'
+__version__ = '0.14.0'
diff --git a/python/lib/scheduler/dmod/scheduler/scheduler.py b/python/lib/scheduler/dmod/scheduler/scheduler.py
index 25e002a79..a1a2b2ed2 100644
--- a/python/lib/scheduler/dmod/scheduler/scheduler.py
+++ b/python/lib/scheduler/dmod/scheduler/scheduler.py
@@ -537,10 +537,10 @@ def determine_image_for_job(self, job: 'Job') -> str:
         # For now, these are the only two requests supported
         # TODO: move registry name into environment variable or other more appropriate place
         if isinstance(job.model_request, NgenCalibrationRequest):
-            return "127.0.0.1:5000/ngen-cal:latest"
+            return f"127.0.0.1:5000/ngen-cal:{job.model_request.worker_version}"

         if isinstance(job.model_request, NGENRequest):
-            return "127.0.0.1:5000/ngen:latest"
+            return f"127.0.0.1:5000/ngen:{job.model_request.worker_version}"
         else:
             msg = "Unable to determine correct scheduler image for job {} with request of {} type"
             raise DmodRuntimeError(msg.format(job.job_id, job.model_request.__class__.__name__))
diff --git a/python/lib/scheduler/pyproject.toml b/python/lib/scheduler/pyproject.toml
index 3094e2ff5..a03b8cadc 100644
--- a/python/lib/scheduler/pyproject.toml
+++ b/python/lib/scheduler/pyproject.toml
@@ -14,7 +14,7 @@ authors = [
 dependencies = [
     "docker>=7.1.0",
     "Faker",
-    "dmod.communication>=0.20.0",
+    "dmod.communication>=0.22.0",
    "dmod.modeldata>=0.7.1",
     "dmod.redis>=0.1.0",
     "dmod.core>=0.17.0",
diff --git a/python/services/dataservice/dmod/test/it_data_derive_util.py b/python/services/dataservice/dmod/test/it_data_derive_util.py
index 9b6345080..a5fb982c9 100644
--- a/python/services/dataservice/dmod/test/it_data_derive_util.py
+++ b/python/services/dataservice/dmod/test/it_data_derive_util.py
@@ -191,5 +191,3 @@ def test__generate_bmi_ds_1_a(self):

         self.assertEqual(len(ds_files), 2)
         self.assertIn(managed_ds.archive_name, ds_files)
-
-
diff --git a/python/services/requestservice/dmod/requestservice/_version.py b/python/services/requestservice/dmod/requestservice/_version.py
index f323a57be..2c7bffbf8 100644
--- a/python/services/requestservice/dmod/requestservice/_version.py
+++ b/python/services/requestservice/dmod/requestservice/_version.py
@@ -1 +1 @@
-__version__ = '0.11.0'
+__version__ = '0.12.0'
diff --git a/python/services/requestservice/pyproject.toml b/python/services/requestservice/pyproject.toml
index 8b9dd0e68..7e22792eb 100644
--- a/python/services/requestservice/pyproject.toml
+++ b/python/services/requestservice/pyproject.toml
@@ -13,7 +13,7 @@ authors = [
 dependencies = [
     "websockets",
     "dmod.core>=0.19.0",
-    "dmod.communication>=0.21.0",
+    "dmod.communication>=0.22.0",
     "dmod.access>=0.2.0",
     "dmod.externalrequests>=0.6.0",
 ]
diff --git a/python/services/schedulerservice/dmod/schedulerservice/_version.py b/python/services/schedulerservice/dmod/schedulerservice/_version.py
index 2c7bffbf8..2d7893e3d 100644
--- a/python/services/schedulerservice/dmod/schedulerservice/_version.py
+++ b/python/services/schedulerservice/dmod/schedulerservice/_version.py
@@ -1 +1 @@
-__version__ = '0.12.0'
+__version__ = '0.13.0'
diff --git a/python/services/schedulerservice/pyproject.toml b/python/services/schedulerservice/pyproject.toml
index c62cd0eda..5a8c16c6d 100644
--- a/python/services/schedulerservice/pyproject.toml
+++ b/python/services/schedulerservice/pyproject.toml
@@ -11,8 +11,8 @@ authors = [
 ]
 dependencies = [
     "dmod.core>=0.17.0",
-    "dmod.communication>=0.20.0",
-    "dmod.scheduler>=0.13.0",
+    "dmod.communication>=0.22.0",
+    "dmod.scheduler>=0.14.0",
 ]
 readme = "README.md"
 description = "Service package for service responsible for managing job scheduling, execution, and resource management in the DMOD architecture."
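Taken together, the scheduler change above means an omitted `worker_version` still resolves to a `latest`-tagged image, while a request can now pin any tag. A small self-contained sketch of that behavior, using a stand-in model rather than the real `NGENRequest`:

```python
from pydantic import BaseModel, Field  # pydantic v1, per the project pins

class FakeNgenRequest(BaseModel):
    """Stand-in for NGENRequest; only the field relevant here is modeled."""
    worker_version: str = Field("latest", description="Worker image tag to use.")

def image_for(request: FakeNgenRequest) -> str:
    # Mirrors the f-string pattern in determine_image_for_job
    return f"127.0.0.1:5000/ngen:{request.worker_version}"

print(image_for(FakeNgenRequest()))                         # 127.0.0.1:5000/ngen:latest
print(image_for(FakeNgenRequest(worker_version="0.2.0")))   # 127.0.0.1:5000/ngen:0.2.0
```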