
[Help]: Radar Inverse Transform Computing: distributed.protocol.core - CRITICAL - Failed to deserialize #175

Closed
gg-bb opened this issue Oct 20, 2024 · 5 comments


@gg-bb

gg-bb commented Oct 20, 2024

Good morning,
I am processing two Sentinel-1 scenes (SAFE) to compute an interferogram. I have dem.grd and the orbit files in the wdir/topo and wdir/raw directories (the data were tested with GMTSAR and work fine). I ran a script based on the Turkey_Earthquake example to process the data, but it raised a critical error that I can't solve (I am not very familiar with the dask library). My pygmtsar version is 2024.8.30.post3, and I updated my anaconda environment to the latest version.

Any help would be much appreciated.
Have a nice day.
Config:
Ubuntu 22.04.5 LTS
Intel® Xeon(R) Silver 4110 CPU @ 2.10GHz × 32
RAM: 64 GB

Here is my script:

#!/usr/bin/env python3
import platform, sys, os
import xarray as xr 
import numpy as np 
import pandas as pd 
import geopandas as gpd 
import json
import shapely
from dask.distributed import Client
import dask
import psutil
import pyvista as pv
import panel
from contextlib import contextmanager
import matplotlib.pyplot as plt
from pygmtsar import S1, Stack, tqdm_dask, ASF, Tiles

@contextmanager
def mpl_settings(settings):
    original_settings = {k: plt.rcParams[k] for k in settings}
    plt.rcParams.update(settings)
    yield
    plt.rcParams.update(original_settings)

#_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-
if __name__ == "__main__":
    pv.set_plot_theme("document")
    panel.extension(comms="ipywidgets")
    panel.extension("vtk")

    plt.rcParams["figure.figsize"] = [12, 4]
    plt.rcParams["figure.dpi"] = 100
    plt.rcParams["figure.titlesize"] = 24
    plt.rcParams["axes.titlesize"] = 14
    plt.rcParams["axes.labelsize"] = 12
    plt.rcParams["xtick.labelsize"] = 12
    plt.rcParams["ytick.labelsize"] = 12

    # define Pandas display settings
    pd.set_option("display.max_rows", None)
    pd.set_option("display.max_columns", None)
    pd.set_option("display.width", None)
    pd.set_option("display.max_colwidth", 100)

    PATH = os.environ["PATH"]
    DEM = "./topo/dem.grd"
    LANDMASK = "./landmask.nc"
    POLARIZATION = "VV"
    ORBIT = "A"
    SUBSWATH = 123
    WORKDIR = "workdir"
    DATADIR = "./"
    RESOLUTION = 200.

    # download DEM and landmask covering the scenes
    AOI = S1.scan_slc(DATADIR)
    Tiles().download_dem(AOI, filename=DEM, product="3s").plot.imshow(cmap="Spectral")
    Tiles().download_landmask(AOI, filename=LANDMASK, product="3s").fillna(0).plot.imshow(cmap="binary_r")

    # cleanup for repeatable runs
    if "client" in globals():
        client.close()
    client = Client(n_workers=max(1, psutil.cpu_count() // 4),
                    threads_per_worker=min(8, psutil.cpu_count()),
                    memory_limit=max(4e9, psutil.virtual_memory().available))

    scenes = S1.scan_slc(DATADIR, polarization=POLARIZATION, orbit=ORBIT)

    sbas = Stack(WORKDIR, drop_if_exists=True).set_scenes(scenes)

    sbas.to_dataframe()
    sbas.compute_reframe()
    sbas.load_dem(DEM, AOI)
    sbas.plot_scenes()
    plt.savefig('Estimated_Scene_Locations.jpg')

    sbas.compute_align()
    sbas.compute_geocode(RESOLUTION)  # <-- fails here

[Image: Estimated_Scene_Locations.jpg]

Everything seems to work fine until the compute_geocode step. It returns the following error message:

NOTE: Found multiple scenes for a single day, use function Stack.reframe() to stitch the scenes
NOTE: Target file exists, return it. Use "skip_exist=False" or omit the filename to allow new downloading.
/home/guillaume/anaconda3/lib/python3.11/site-packages/xarray/core/dataset.py:282: UserWarning: The specified chunks separate the stored chunks along dimension "lat" starting at index 2048. This could degrade performance. Instead, consider rechunking after loading.
  warnings.warn(
/home/guillaume/anaconda3/lib/python3.11/site-packages/xarray/core/dataset.py:282: UserWarning: The specified chunks separate the stored chunks along dimension "lon" starting at index 2048. This could degrade performance. Instead, consider rechunking after loading.
  warnings.warn(
Tiles Parallel Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████| 12/12 [00:02<00:00,  4.97it/s]
NOTE: auto set reference scene 2023-09-03. You can change it like Stack.set_reference("2022-01-20")
Reframing: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 6/6 [04:05<00:00, 40.85s/it]
/home/guillaume/anaconda3/lib/python3.11/site-packages/xarray/core/dataset.py:282: UserWarning: The specified chunks separate the stored chunks along dimension "lat" starting at index 2048. This could degrade performance. Instead, consider rechunking after loading.
  warnings.warn(
/home/guillaume/anaconda3/lib/python3.11/site-packages/xarray/core/dataset.py:282: UserWarning: The specified chunks separate the stored chunks along dimension "lon" starting at index 2048. This could degrade performance. Instead, consider rechunking after loading.
  warnings.warn(
/home/guillaume/anaconda3/lib/python3.11/site-packages/xarray/core/dataset.py:282: UserWarning: The specified chunks separate the stored chunks along dimension "y" starting at index 512. This could degrade performance. Instead, consider rechunking after loading.
  warnings.warn(
/home/guillaume/anaconda3/lib/python3.11/site-packages/xarray/core/dataset.py:282: UserWarning: The specified chunks separate the stored chunks along dimension "x" starting at index 512. This could degrade performance. Instead, consider rechunking after loading.
  warnings.warn(
Save DEM on WGS84 Ellipsoid: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████| 109/109 [00:03<00:00, 35.75it/s]
Aligning Reference: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3/3 [00:09<00:00,  3.27s/it]
Aligning Repeat: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3/3 [05:57<00:00, 119.30s/it]
Merging Subswaths: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:20<00:00, 10.47s/it]
Radar Transform Computing: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [01:00<00:00, 60.91s/it]
Radar Transform Saving: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████| 11/11 [00:00<00:00, 54.59it/s]
Radar Transform Indexing: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 78/78 [00:00<00:00, 386.19it/s]
Radar Inverse Transform Computing:   0%|                                                                                                 | 0/9000000.0 [00:00<?, ?it/s]2024-10-20 23:35:50,034 - distributed.protocol.core - CRITICAL - Failed to deserialize
Traceback (most recent call last):
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/protocol/core.py", line 175, in loads
    return msgpack.loads(
           ^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 128, in unpackb
    ret = unpacker._unpack()
          ^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 565, in _unpack
    ret.append(self._unpack(EX_CONSTRUCT))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 592, in _unpack
    ret[key] = self._unpack(EX_CONSTRUCT)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 565, in _unpack
    ret.append(self._unpack(EX_CONSTRUCT))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 546, in _unpack
    typ, n, obj = self._read_header()
                  ^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 531, in _read_header
    raise ValueError(
ValueError: 1397124367 exceeds max_array_len(96405)
2024-10-20 23:35:50,053 - distributed.core - ERROR - Exception while handling op register-client
Traceback (most recent call last):
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/core.py", line 831, in _handle_comm
    result = await result
             ^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/scheduler.py", line 5902, in add_client
    await self.handle_stream(comm=comm, extra={"client": client})
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/core.py", line 886, in handle_stream
    msgs = await comm.read()
           ^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/comm/tcp.py", line 247, in read
    msg = await from_frames(
          ^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/comm/utils.py", line 78, in from_frames
    res = _from_frames()
          ^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/comm/utils.py", line 61, in _from_frames
    return protocol.loads(
           ^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/protocol/core.py", line 175, in loads
    return msgpack.loads(
           ^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 128, in unpackb
    ret = unpacker._unpack()
          ^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 565, in _unpack
    ret.append(self._unpack(EX_CONSTRUCT))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 592, in _unpack
    ret[key] = self._unpack(EX_CONSTRUCT)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 565, in _unpack
    ret.append(self._unpack(EX_CONSTRUCT))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 546, in _unpack
    typ, n, obj = self._read_header()
                  ^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 531, in _read_header
    raise ValueError(
ValueError: 1397124367 exceeds max_array_len(96405)
Task exception was never retrieved
future: <Task finished name='Task-482' coro=<Server._handle_comm() done, defined at /home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/core.py:737> exception=ValueError('1397124367 exceeds max_array_len(96405)')>
Traceback (most recent call last):
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/core.py", line 831, in _handle_comm
    result = await result
             ^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/scheduler.py", line 5902, in add_client
    await self.handle_stream(comm=comm, extra={"client": client})
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/core.py", line 886, in handle_stream
    msgs = await comm.read()
           ^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/comm/tcp.py", line 247, in read
    msg = await from_frames(
          ^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/comm/utils.py", line 78, in from_frames
    res = _from_frames()
          ^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/comm/utils.py", line 61, in _from_frames
    return protocol.loads(
           ^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/distributed/protocol/core.py", line 175, in loads
    return msgpack.loads(
           ^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 128, in unpackb
    ret = unpacker._unpack()
          ^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 565, in _unpack
    ret.append(self._unpack(EX_CONSTRUCT))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 592, in _unpack
    ret[key] = self._unpack(EX_CONSTRUCT)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 565, in _unpack
    ret.append(self._unpack(EX_CONSTRUCT))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 546, in _unpack
    typ, n, obj = self._read_header()
                  ^^^^^^^^^^^^^^^^^^^
  File "/home/guillaume/anaconda3/lib/python3.11/site-packages/msgpack/fallback.py", line 531, in _read_header
    raise ValueError(
ValueError: 1397124367 exceeds max_array_len(96405)
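For context, this ValueError comes from msgpack's safety limits during unpacking: a corrupted or version-mismatched byte stream can declare an absurdly large array length that exceeds the unpacker's limit. Below is a minimal standalone sketch using only the public msgpack API; the tiny max_array_len value is chosen purely to reproduce the same exception class and is not the limit Dask itself uses.

# Sketch: reproduce the same exception class with the public msgpack API.
# A corrupted or version-mismatched stream can declare a huge array length,
# which then exceeds the unpacker's safety limit exactly as in the log above.
import msgpack

payload = msgpack.packb(list(range(100)))  # a valid 100-element msgpack array

try:
    # Artificially small limit, only to trigger the same ValueError.
    msgpack.unpackb(payload, max_array_len=10)
except ValueError as error:
    print(error)  # "100 exceeds max_array_len(10)"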
@gg-bb
Author

gg-bb commented Oct 22, 2024

Since I installed pygmtsar locally (via pip install), I don't know if there is a command to test whether the installation dependencies are correct. I dug into some GitHub threads about "deserialize" issues, such as dask/distributed#8038, and tried the solution they came up with, without any success.
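One built-in check that may help here (a sketch, not a pygmtsar command): Dask's Client.get_versions(check=True) compares the package versions seen by the client, scheduler, and workers, including msgpack, and raises an error if they disagree. The cluster sizing below is illustrative only.

# Sketch: use Dask's own version check to compare client/scheduler/worker packages.
# Mismatched msgpack or distributed versions are a common cause of
# "Failed to deserialize" errors.
from dask.distributed import Client

client = Client(n_workers=2, threads_per_worker=1)  # illustrative sizing
versions = client.get_versions(check=True)          # raises on version mismatch
print(versions["client"]["packages"]["msgpack"])    # e.g. '1.0.8'
client.close()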

By the way, here are the data I used (though I don't think that is relevant, since I could produce the interferogram with the latest GMTSAR version):

master: S1A_IW_SLC__1SDV_20230903T183344_20230903T183412_050167_0609B4_100E.SAFE
aligned: S1A_IW_SLC__1SDV_20230915T183345_20230915T183413_050342_060F9F_85A4.SAFE
master eof: S1A_OPER_AUX_POEORB_OPOD_20230923T080643_V20230902T225942_20230904T005942.EOF
aligned eof: S1A_OPER_AUX_POEORB_OPOD_20231005T080628_V20230914T225942_20230916T005942.EOF

@AlexeyPechnikov
Owner

There are no errors when processing your scenes using the Türkiye_Earthquakes_2023.ipynb notebook. Some bursts are misaligned, but it still works.

[screenshot attached]

@gg-bb
Author

gg-bb commented Oct 22, 2024

Dear Alexey,
Thank you for your comment. I believe I figured out what happened. Before installing pygmtsar, I wanted to learn more about dask, so I installed it via both conda and pip. After that, when I installed pygmtsar, conflicting versions of msgpack likely made the code crash. I found the following thread on reinstalling dask in a cleaner way: dask/distributed#3876 (the last comment is of interest). So far so good; I will finish the processing to make sure everything is fixed before closing the issue. I will keep you posted.
Best regards,
GB
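A quick way to spot the mixed conda + pip situation described above is to enumerate the installed distributions with the standard library. This is only a sketch, not a pygmtsar or Dask feature, and the package list is just an example.

# Sketch: list every copy of msgpack/dask/distributed/pygmtsar visible on sys.path.
# More than one entry for a package suggests a conda install and a pip install
# are shadowing each other in the same environment.
from importlib.metadata import distributions

targets = {"msgpack", "dask", "distributed", "pygmtsar"}
found = {}
for dist in distributions():
    name = (dist.metadata["Name"] or "").lower()
    if name in targets:
        found.setdefault(name, []).append(dist.version)

for name in sorted(found):
    flag = "  <-- more than one copy installed" if len(found[name]) > 1 else ""
    print(f"{name:12s} {found[name]}{flag}")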

@AlexeyPechnikov
Owner

Please note that PyGMTSAR Docker images provide a ready-to-use configuration on any host.

@gg-bb
Author

gg-bb commented Oct 22, 2024

Dear Alexey,
It eventually worked, and it is really fast compared to the original version of GMTSAR (15 minutes vs. 2 h 30)!
I followed the instructions provided here: dask/distributed#3876 (last comment).
[Image: Phase, Geographic Coordinates, rad]

I will take a closer look at all the pygmtsar features, including the Docker images you provide.
Thanks again for all your efforts and for sharing your work with the InSAR community!

Best Regards,

GB

@gg-bb gg-bb closed this as completed Oct 22, 2024