AWS Batch: split CC and stack yaml files (#240)
* AWS Batch: split CC and stack yaml files
* Missed line
* fix test
* Template testing
* fix tests
Showing 11 changed files with 85 additions and 62 deletions.
@@ -134,23 +134,3 @@ jobs:
           --xml_path s3://scedc-pds/FDSNstationXML/CI/ \
           --stations "SBC,RIO" --start_date 2022-02-02 --end_date 2022-02-04 \
           --config configs/s3_anon.yaml
-  s3_singlepath:
-    strategy:
-      fail-fast: true
-      matrix:
-        python_version: ['3.9', '3.10']
-    runs-on: ubuntu-22.04
-    steps:
-      - name: Checkout Repo
-        uses: actions/[email protected]
-      - name: Setup NoisePy
-        uses: ./.github/actions/setup
-        with:
-          python-version: ${{matrix.python_version}}
-      - name: Test S3 data with a single path
-        run: |
-          noisepy cross_correlate --raw_data_path s3://scedc-pds/continuous_waveforms/2022/2022_002/ \
-            --ccf_path $RUNNER_TEMP/CCF_S3 --freq_norm rma \
-            --xml_path s3://scedc-pds/FDSNstationXML/CI/ \
-            --stations "SBC,RIO" \
-            --config configs/s3_anon.yaml
@@ -0,0 +1,18 @@
+jobName: 'noisepy-cross-correlate'
+jobQueue: ''
+jobDefinition: '' # [REQUIRED] The job definition used by this job.
+# Uncomment to run a job across multiple nodes. The days in the time range will be split across the nodes.
+# arrayProperties:
+#   size: 16 # number of nodes
+containerOverrides: # An object with various properties that override the defaults for the job definition that specify the name of a container in the specified job definition and the overrides it should receive.
+  resourceRequirements:
+    - value: '90112' # CC requires more memory
+      type: MEMORY
+  command: # The command to send to the container that overrides the default command from the Docker image or the job definition.
+    - cross_correlate
+    - --raw_data_path=s3://scedc-pds/continuous_waveforms/
+    - --xml_path=s3://scedc-pds/FDSNstationXML/CI/
+    - --ccf_path=s3://<YOUR_S3_BUCKET>/<CC_PATH>
+    - --config=s3://<YOUR_S3_BUCKET>/<CONFIG_PATH>/config.yaml
+timeout:
+  attemptDurationSeconds: 36000 # 10 hrs
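
The file above is the AWS Batch submit-job input for the cross-correlation step. As a rough illustration only (not part of this commit), such a template could be loaded and submitted with boto3; the local file name cc_job.yaml is a placeholder, and jobQueue/jobDefinition must be filled in before submitting:

# Hypothetical sketch: submit the cross-correlation job template with boto3.
# "cc_job.yaml" is an assumed local file name; jobQueue and jobDefinition in the
# template are empty and must be set to real values first.
import boto3
import yaml

with open("cc_job.yaml") as f:
    job = yaml.safe_load(f)

batch = boto3.client("batch")
response = batch.submit_job(
    jobName=job["jobName"],
    jobQueue=job["jobQueue"],
    jobDefinition=job["jobDefinition"],
    containerOverrides=job["containerOverrides"],  # memory override + cross_correlate command
    timeout=job["timeout"],
    # arrayProperties=job["arrayProperties"],  # only if the multi-node section is uncommented
)
print(response["jobId"])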
@@ -0,0 +1,16 @@
+jobName: 'noisepy-stack'
+jobQueue: ''
+jobDefinition: '' # [REQUIRED] The job definition used by this job.
+# Uncomment to run a job across multiple nodes. The station pairs to be stacked will be split across the nodes.
+# arrayProperties:
+#   size: 16 # number of nodes
+containerOverrides: # An object with various properties that override the defaults for the job definition that specify the name of a container in the specified job definition and the overrides it should receive.
+  resourceRequirements:
+    - value: '32768'
+      type: MEMORY
+  command: # The command to send to the container that overrides the default command from the Docker image or the job definition.
+    - stack
+    - --ccf_path=s3://<YOUR_S3_BUCKET>/<CC_PATH>
+    - --stack_path=s3://<YOUR_S3_BUCKET>/<STACK_PATH>
+timeout:
+  attemptDurationSeconds: 7200 # 2 hrs
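
Since the stack step reads the --ccf_path written by the cross-correlation job, one possible way to chain the two templates (again an illustration, not documented by this commit) is an AWS Batch job dependency via boto3's dependsOn parameter; stack_job.yaml and the job ID value below are placeholders:

# Hypothetical sketch: submit the stack job so it starts only after the
# cross-correlation job succeeds. "stack_job.yaml" and cc_job_id are assumed names.
import boto3
import yaml

with open("stack_job.yaml") as f:
    job = yaml.safe_load(f)

batch = boto3.client("batch")
cc_job_id = "<JOB_ID_FROM_THE_CC_SUBMISSION>"  # e.g. response["jobId"] from the previous sketch

response = batch.submit_job(
    jobName=job["jobName"],
    jobQueue=job["jobQueue"],
    jobDefinition=job["jobDefinition"],
    containerOverrides=job["containerOverrides"],  # stack command reading <CC_PATH>
    timeout=job["timeout"],
    dependsOn=[{"jobId": cc_job_id}],  # wait for the cross-correlation job to finish
)
print(response["jobId"])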