I wanted to make a LENS ensemble-mean daily climatology, and saw this "cookbook" for using AWS.
Binder fails. Is there a best place to do this calculation?
A Casper account, or my laptop (if it really uses AWS services, perhaps for free)?

------ Details of binder failure ----

I was able to launch in binder:
http://hub.mypythia.org/user/projectpythia-c-ns-aws-cookbook-25b4epah/lab/tree/notebooks/example-workflows/key-figures.ipynb

But this step fails:

dsets = col_subset.to_dataset_dict(zarr_kwargs={"consolidated": True}, storage_options={"anon": True})

One clue is in the pink box:

Exception: 'AttributeError("module 'lib' has no attribute 'OpenSSL_add_all_algorithms'")'
--> 240 datasets = dask.compute(*datasets)
...
ESMDataSourceError: Failed to load dataset with key='atm.RCP85.daily'
You can use cat['atm.RCP85.daily'].df to inspect the assets/files for this key.

But cat['atm.RCP85.daily'].df does not work:

cat: '[atm.RCP85.daily].df': No such file or directory

I also get this popup, presumably from auto-saving the notebook itself:

File Save Error for key-figures.ipynb
Invalid response: 413 Request Entity Too Large
Some of the underlying libraries have changed their call structure. Please see #83 for some updates to the notebook that may overcome the errors you are seeing. Specifically, you probably need to replace zarr_kwargs = with xarray_open_kwargs =. Also, make sure the "s3fs" library is available in the conda environment you are using.
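For concreteness, here is a minimal sketch of the updated call, assuming col_subset is the catalog subset built earlier in the notebook (the argument names follow recent intake-esm releases):

import s3fs  # confirm s3fs is importable in the environment; it is needed for S3 access

dsets = col_subset.to_dataset_dict(
    xarray_open_kwargs={"consolidated": True},  # replaces the deprecated zarr_kwargs argument
    storage_options={"anon": True},  # anonymous (unauthenticated) S3 access
)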
The climatology calculation will likely involve a lot of data transfer and processing, so it would be slow to run on a personal laptop. I don't know the current status of the binder service, which is why I am waiting for confirmation from other members of the team in charge of this notebook and the underlying services before merging the pull request mentioned above.
The "invalid response" error refers to the notebook's auto-save feature. The binder service has some kind of file size limit for the notebook, and it tries to save the notebook's figures, which makes the notebook too large. You should still be able to right-click and save/download any output images without needing to save the notebook itself.
If you have computing credits with Amazon AWS, that is one way to go; using Casper is another possibility if you have an NCAR-sponsored project that gives you access to that machine. Sorry I can't provide more specific information at this time.
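On the cat['atm.RCP85.daily'].df hint from the traceback: that is Python, not a shell command, so typing it in a terminal invokes the Unix cat utility, which produces the "No such file or directory" error you saw. A minimal sketch, assuming cat is the intake-esm catalog object opened in the notebook (it may go by a different name there):

# Run this in a notebook cell, not a terminal; `cat` is the intake-esm catalog object.
cat['atm.RCP85.daily'].df  # a pandas DataFrame listing the assets/files for this key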
Hi @brianmapes, this code has found a new home under Project Pythia. See the CESM LENS on AWS Cookbook. We have it running on a new Binder; you can find links on that page.