From the uploading_s3 step of the push action in the GitHub workflow we can see:
Notice: A new release of pip is available: 23.0.1 -> 23.3.1
Notice: To update, run: pip install --upgrade pip
Starting upload_s3.py
Namespace(sha='211bb633b0', version='3.7.0')
Uploading docker-openstudio.simg to 3.7.0/OpenStudio-3.7.0.211bb633b0-Singularity.simg
Traceback (most recent call last):
File "singularity/upload_s3.py", line 40, in <module>
main()
File "singularity/upload_s3.py", line 36, in main
s3.put_object(Bucket=bucket_name, Key=s3_key, Body=data)
File "/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/botocore/client.py", line 535, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/botocore/client.py", line 980, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidAccessKeyId) when calling the PutObject operation: The AWS Access Key Id you provided does not exist in our records.
Error: Process completed with exit code 1.
Proposed fix: update the AWS access key ID used by the pipeline.
We just switched to using this action and GitHub's OIDC provider for auth on the CI in https://github.com/NREL/resstock-estimation. The setup was a little tricky, but it's supposedly more secure and has the advantage that you no longer need to manage keys/secrets. It authenticates and gets temporary credentials for the duration of the CI run. If you're interested, I could show you how we set it up.
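For reference, an OIDC-based setup along those lines typically uses `aws-actions/configure-aws-credentials` with an IAM role that trusts GitHub's OIDC provider. A rough sketch (the role ARN, region, and action version are placeholders, not the resstock-estimation values):

```yaml
permissions:
  id-token: write   # required so the job can request GitHub's OIDC token
  contents: read

jobs:
  upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          # Placeholder ARN: an IAM role whose trust policy allows this repo's OIDC claims
          role-to-assume: arn:aws:iam::123456789012:role/ci-upload-role
          aws-region: us-west-2
      - run: python singularity/upload_s3.py
```

With this, the job gets short-lived credentials per run, so there is no long-lived access key to rotate when it expires or is deleted.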