to_netcdf engine='h5netcdf' Chunk Dimensionality issue. #8964
Replies: 3 comments 1 reply
-
@aymanbfree Please provide more context. An MCVE (https://stackoverflow.com/help/minimal-reproducible-example) would be great. From the information given it's hard to get to the root cause. There have been changes to the chunking behaviour in h5netcdf, so my first question would be whether those are involved.
-
You have a 0 in your data shape. I would track that down.
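The zero in the reported data shape `(1, 1, 52, 0)` can be tracked down by checking each dimension's length. A minimal sketch (the dataset and dimension names here are hypothetical, chosen only to mirror the shapes in the error message):

```python
import numpy as np
import xarray as xr

# Hypothetical dataset whose last dimension has length 0, mirroring
# the (1, 1, 52, 0) shape in the error message.
ds = xr.Dataset(
    {"var": (("t", "z", "y", "x"), np.zeros((1, 1, 52, 0)))}
)

# List every dimension whose length is zero -- these are the ones that
# make any positive chunk size "greater than the data shape".
zero_dims = [dim for dim, size in ds.sizes.items() if size == 0]
print(zero_dims)  # ['x']
```

Any dimension reported here is one that HDF5 cannot chunk against, regardless of the chunk size requested.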
-
We would need to continue using h5netcdf, as it's the only engine capable of reading LZF-compressed NetCDF files. For more context, the error we are seeing:

```
File "/lgxarray.py", line 158, in write_netcdf_lzf
CRITICAL root:batch.py:525 Chunk shape must not be greater than data shape in any dimension. (1, 1, 52, 16) is not compatible with (1, 1, 52, 0)
```

And the code snippet where we hit the issue:

```python
def write_netcdf_lzf(ds, fname, *, chunks=True):
```
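The error says a requested chunk size (16) exceeds the corresponding dimension length (0). One defensive pattern is to clamp each requested chunk size to the actual dimension length before calling `to_netcdf`. A minimal sketch, assuming an encoding dict of the usual `{var: {"chunksizes": ...}}` shape (the `clamp_chunksizes` helper and the dataset are hypothetical, not from the thread):

```python
import numpy as np
import xarray as xr

def clamp_chunksizes(ds, encoding):
    # Shrink any requested chunk size down to the actual dimension
    # length, so HDF5 never sees a chunk larger than the data.
    # Note: a zero-length dimension clamps to 0, which HDF5 also
    # rejects -- the real fix for (…, 0) shapes is finding the zero.
    fixed = {}
    for name, enc in encoding.items():
        enc = dict(enc)
        if enc.get("chunksizes") is not None:
            shape = ds[name].shape
            enc["chunksizes"] = tuple(
                min(c, s) for c, s in zip(enc["chunksizes"], shape)
            )
        fixed[name] = enc
    return fixed

ds = xr.Dataset({"var": (("t", "z", "y", "x"), np.zeros((1, 1, 52, 4)))})
encoding = {"var": {"chunksizes": (1, 1, 52, 16)}}
print(clamp_chunksizes(ds, encoding))  # chunks become (1, 1, 52, 4)
```

This only papers over oversized chunk requests; it does not make a zero-length dimension writable with chunking enabled.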
-
While upgrading xarray from 2022.3.0 to 2023.8.0 we are now seeing errors:

```
Chunk shape must not be greater than data shape in any dimension. (1, 1, 52, 16) is not compatible with (1, 1, 52, 0)
```

while running:

```python
ds.to_netcdf(fname, engine='h5netcdf', encoding=encoding)
```

I'm struggling to find documentation on what change has caused this issue so I can implement a fix. Any suggestions?
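One way to diagnose this before the write fails deep inside h5py is to compare the requested `chunksizes` against each variable's actual shape. A minimal sketch (the `validate_chunk_encoding` helper, dataset, and dimension names are hypothetical, built to reproduce the shapes in the error message):

```python
import numpy as np
import xarray as xr

def validate_chunk_encoding(ds, encoding):
    # Report every (variable, dimension) pair where the requested
    # chunk size exceeds the dimension length.
    problems = []
    for name, enc in encoding.items():
        chunks = enc.get("chunksizes")
        if chunks is None:
            continue
        for dim, c, s in zip(ds[name].dims, chunks, ds[name].shape):
            if c > s:
                problems.append((name, dim, c, s))
    return problems

ds = xr.Dataset({"var": (("t", "z", "y", "x"), np.zeros((1, 1, 52, 0)))})
bad = validate_chunk_encoding(ds, {"var": {"chunksizes": (1, 1, 52, 16)}})
print(bad)  # [('var', 'x', 16, 0)]
```

Here the zero-length `x` dimension is what makes the chunk shape `(1, 1, 52, 16)` incompatible with the data shape `(1, 1, 52, 0)`.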