[Cloud Deployment IIIb]: AWS batch deployment helper (#384)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: CodyCBakerPhD <[email protected]>
Co-authored-by: Heberto Mayorquin <[email protected]>
4 people authored Aug 8, 2024
1 parent ef656a0 commit b7ae085
Showing 9 changed files with 820 additions and 7 deletions.
43 changes: 43 additions & 0 deletions .github/workflows/aws_tests.yml
@@ -0,0 +1,43 @@
name: AWS Tests
on:
  schedule:
    - cron: "0 16 * * 1"  # Weekly on Mondays at 16:00 UTC
  workflow_dispatch:

concurrency:  # Cancel previous runs of this workflow on the same reference
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  DANDI_API_KEY: ${{ secrets.DANDI_API_KEY }}

jobs:
  run:
    name: ${{ matrix.os }} Python ${{ matrix.python-version }}
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.12"]
        os: [ubuntu-latest]
    steps:
      - uses: actions/checkout@v4
      - run: git fetch --prune --unshallow --tags
      - name: Setup Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Global Setup
        run: |
          python -m pip install -U pip  # Official recommended way
          git config --global user.email "[email protected]"
          git config --global user.name "CI Almighty"
      - name: Install full requirements
        run: pip install .[aws,test]

      - name: Run subset of tests that use S3 live services
        run: pytest -rsx -n auto tests/test_minimal/test_tools/aws_tools.py
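The workflow's final steps can be reproduced locally. A minimal sketch, assuming valid AWS credentials (the tests hit live S3) and that you are in the repository root:

```shell
# Install the package with the AWS and test extras, then run the live-service test subset.
pip install ".[aws,test]"
export AWS_ACCESS_KEY_ID="..."         # placeholder; real credentials required
export AWS_SECRET_ACCESS_KEY="..."
pytest -rsx -n auto tests/test_minimal/test_tools/aws_tools.py
```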
10 changes: 4 additions & 6 deletions CHANGELOG.md
@@ -1,19 +1,18 @@
# Upcoming


### Deprecations
-* Deprecated `WaveformExtractor` usage . [PR #821](https://github.com/catalystneuro/neuroconv/pull/821)
+* Deprecated `WaveformExtractor` usage. [PR #821](https://github.com/catalystneuro/neuroconv/pull/821)

### Features
* Added MedPCInterface for operant behavioral output files. [PR #883](https://github.com/catalystneuro/neuroconv/pull/883)
* Support `SortingAnalyzer` in the `SpikeGLXConverterPipe`. [PR #821](https://github.com/catalystneuro/neuroconv/pull/821)
* Add Plexon2 support [PR #918](https://github.com/catalystneuro/neuroconv/pull/918)
* Converter working with multiple VideoInterface instances [PR 914](https://github.com/catalystneuro/neuroconv/pull/914)
* Added helper function `neuroconv.tools.data_transfers.submit_aws_batch_job` for basic automated submission of AWS batch jobs. [PR #384](https://github.com/catalystneuro/neuroconv/pull/384)

### Bug fixes
* Fixed the default naming of multiple electrical series in the `SpikeGLXConverterPipe`. [PR #957](https://github.com/catalystneuro/neuroconv/pull/957)
* Write new properties to the electrode table use the global identifier channel_name, group [PR #984](https://github.com/catalystneuro/neuroconv/pull/984)


### Improvements
* The `OpenEphysBinaryRecordingInterface` now uses `lxml` for extracting the session start time from the settings.xml file and does not depend on `pyopenephys` anymore. [PR #971](https://github.com/catalystneuro/neuroconv/pull/971)
* Swap the majority of package setup and build steps to `pyproject.toml` instead of `setup.py`. [PR #955](https://github.com/catalystneuro/neuroconv/pull/955)
@@ -33,7 +32,6 @@

### Features
* Added docker image and tests for an automated Rclone configuration (with file stream passed via an environment variable). [PR #902](https://github.com/catalystneuro/neuroconv/pull/902)
* Added MedPCInterface for operant behavioral output files. [PR #883](https://github.com/catalystneuro/neuroconv/pull/883)

### Bug fixes
* Fixed the conversion option schema of a `SpikeGLXConverter` when used inside another `NWBConverter`. [PR #922](https://github.com/catalystneuro/neuroconv/pull/922)
@@ -60,7 +58,7 @@

### Improvements
* Propagated `photon_series_type` to `BaseImagingExtractorInterface` init instead of passing it as an argument of `get_metadata()` and `get_metadata_schema()`. [PR #847](https://github.com/catalystneuro/neuroconv/pull/847)

* Converter working with multiple VideoInterface instances [PR 914](https://github.com/catalystneuro/neuroconv/pull/914)



1 change: 1 addition & 0 deletions codecov.yml
@@ -19,3 +19,4 @@ ignore:
- "./setup.py"
- "./docs/"
- "./documentation/"
- "./src/neuroconv/tools/aws/"
1 change: 1 addition & 0 deletions pyproject.toml
@@ -76,6 +76,7 @@ docs = [
]
dandi = ["dandi>=0.58.1"]
compressors = ["hdf5plugin"]
aws = ["boto3"]

[tool.setuptools.packages.find]
where = ["src"]
1 change: 0 additions & 1 deletion setup.py
@@ -17,7 +17,6 @@ def read_requirements(file):
extras_require = defaultdict(list)
extras_require["full"] = ["dandi>=0.58.1", "hdf5plugin"]


for modality in ["ophys", "ecephys", "icephys", "behavior", "text"]:
modality_path = root / "src" / "neuroconv" / "datainterfaces" / modality
modality_requirement_file = modality_path / "requirements.txt"
3 changes: 3 additions & 0 deletions src/neuroconv/tools/aws/__init__.py
@@ -0,0 +1,3 @@
from ._submit_aws_batch_job import submit_aws_batch_job

__all__ = ["submit_aws_batch_job"]
44 changes: 44 additions & 0 deletions src/neuroconv/tools/aws/_dynamodb.py
@@ -0,0 +1,44 @@
"""Helper functions for operations on DynamoDB tables."""

import os


def update_table_status(
*,
submission_id: str,
status: str,
status_tracker_table_name: str = "neuroconv_batch_status_tracker",
region: str = "us-east-2",
) -> None:
"""
Helper function for updating a status value on a DynamoDB table tracking the status of EC2 jobs.
Intended for use by the running job to indicate its completion.
Parameters
----------
submission_id : str
The random hash that was assigned on submission of this job to the status tracker table.
status : str
The new status value to update.
status_tracker_table_name : str, default: "neuroconv_batch_status_tracker"
The name of the DynamoDB table to use for tracking job status.
region : str, default: "us-east-2"
The AWS region to use for the job.
"""
import boto3

aws_access_key_id = os.environ["AWS_ACCESS_KEY_ID"]
aws_secret_access_key = os.environ["AWS_SECRET_ACCESS_KEY"]

dynamodb_resource = boto3.resource(
service_name="dynamodb",
region_name=region,
aws_access_key_id=aws_access_key_id,
aws_secret_access_key=aws_secret_access_key,
)
table = dynamodb_resource.Table(name=status_tracker_table_name)

table.update_item(Key={"id": submission_id}, AttributeUpdates={"status": {"Action": "PUT", "Value": status}})

return None
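The `update_item` call above reduces to a small payload. The sketch below builds that payload in pure Python with no AWS call, so its shape can be inspected; `build_status_update` is a hypothetical helper for illustration and is not part of this commit:

```python
def build_status_update(submission_id: str, status: str) -> dict:
    """Mirror the arguments that update_table_status passes to table.update_item."""
    return {
        # The primary key selects the tracked job's row in the status table...
        "Key": {"id": submission_id},
        # ...and AttributeUpdates PUTs (overwrites) the row's status value.
        "AttributeUpdates": {"status": {"Action": "PUT", "Value": status}},
    }


payload = build_status_update(submission_id="abc123", status="completed")
print(payload["AttributeUpdates"]["status"]["Value"])  # → completed
```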