Summary of problem

Cloudpickle does not work with versions of ddtrace >1.5.0 when trying to serialize objects by value. This previously worked with ddtrace <1.5.0, but ddtrace changed its module discovery to use a module watchdog that no longer relies on the standard sys.modules dictionary to detect when modules are loaded/unloaded, and that does not seem to work with cloudpickle. I've tried this on the latest versions of ddtrace (1.12.6) and cloudpickle (2.2.1).

It seems to take the following route when broken down: cloudpickle/cloudpickle_fast.py:632 attempts to dump the object, which goes to cpython/blob/3.11/Lib/pickle.py:476; that dumps the object and then attempts to save it at cpython/blob/3.11/Lib/pickle.py#L535, which calls save_reduce at cpython/blob/3.11/Lib/pickle.py#L603, and by that point the object is somehow malformed.

I've confirmed this only happens when ddtrace is imported before cloudpickle, but controlling that import order is not always feasible in practice. I opened a bug ticket with Datadog, but this seems like it may be more applicable as a feature request for cloudpickle to support this setup.
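For reference, "serialize objects by value" means cloudpickle embeds the definitions themselves in the payload rather than pickling them as references to an importable module; functions defined in __main__ get this treatment automatically, and cloudpickle 2.x can also be asked to do it for an importable module. A minimal sketch of the latter (the module and function names below are only placeholders, not part of the original report):

import cloudpickle
import mymodule  # placeholder for any importable module

# Embed mymodule's definitions in the payload instead of serializing them
# as "import mymodule" references that the consumer must resolve.
cloudpickle.register_pickle_by_value(mymodule)

payload = cloudpickle.dumps(mymodule.some_function)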
Exception has occurred: PicklingError
args[0] from __newobj__ args has the wrong class
  File "/home/lukes/cloudpickle_test/python/cloudpickle/2/2/1/dist/lib/python3.9/cloudpickle/cloudpickle_fast.py", line 632, in dump
    return Pickler.dump(self, obj)
  File "/home/lukes/cloudpickle_test/python/cloudpickle/2/2/1/dist/lib/python3.9/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/home/lukes/cloudpickle_test/python/modelingtests.py", line 24, in <module>
    x_after = pickle.dumps(to_pickle)
_pickle.PicklingError: args[0] from __newobj__ args has the wrong class
https://github.com/python/cpython/blob/3.11/Lib/pickle.py#L476
def dump(self, obj):
    """Write a pickled representation of obj to the open file."""

    # Check whether Pickler was initialized correctly. This is
    # only needed to mimic the behavior of _pickle.Pickler.dump().
    if not hasattr(self, "_file_write"):
        raise PicklingError("Pickler.__init__() was not called by "
                            "%s.__init__()" % (self.__class__.__name__,))
    if self.proto >= 2:
        self.write(PROTO + pack("<B", self.proto))
    if self.proto >= 4:
        self.framer.start_framing()
    self.save(obj)
    self.write(STOP)
    self.framer.end_framing()
https://github.com/python/cpython/blob/3.11/Lib/pickle.py#L535
def save(self, obj, save_persistent_id=True):
https://github.com/python/cpython/blob/3.11/Lib/pickle.py#L603
self.save_reduce(obj=obj, *rv)
https://github.com/python/cpython/blob/3.11/Lib/pickle.py#L621
def save_reduce(self, func, args, state=None, listitems=None,
                dictitems=None, state_setter=None, *, obj=None):
    # This API is called by some subclasses
        cls, args, kwargs = args
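For what it's worth, the error message itself comes from the __newobj__ branch a few lines further down in save_reduce; roughly (paraphrasing CPython 3.11, not quoted in the links above):

    elif self.proto >= 2 and func_name == "__newobj__":
        cls = args[0]
        if not hasattr(cls, "__new__"):
            raise PicklingError("args[0] from __newobj__ args has no __new__")
        if obj is not None and cls is not obj.__class__:
            # Raised when the class in the reduce tuple is not the object's
            # actual class -- the error seen in this report.
            raise PicklingError("args[0] from __newobj__ args has the wrong class")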
b'\x80\x05\x95\x19\x02\x00\x00\x00\x00\x00\x00\x8c\x17cloudpickle.cloudpickle\x94\x8c\x0e_make_function\x94\x93\x94(h\x00\x8c\r_builtin_type\x94\x93\x94\x8c\x08CodeType\x94\x85\x94R\x94(K\x00K\x00K\x00K\x00K\x02KCC\x08t\x00\xa0\x00\xa1\x00S\x00\x94N\x85\x94\x8c\x04test\x94\x85\x94)\x8cM/home/lukes/cloudpickle_test/python/modelingtests.py\x94\x8c\tto_pickle\x94K\x10C\x02\x00\x01\x94))t\x94R\x94}\x94(\x8c\x0b__package__\x94N\x8c\x08__name__\x94\x8c\x08__main__\x94\x8c\x08__file__\x94h\x0cuNNNt\x94R\x94\x8c\x1ccloudpickle.cloudpickle_fast\x94\x8c\x12_function_setstate\x94\x93\x94h\x17}\x94}\x94(h\x13h\r\x8c\x0c__qualname__\x94h\r\x8c\x0f__annotations__\x94}\x94\x8c\x0e__kwdefaults__\x94N\x8c\x0c__defaults__\x94N\x8c\n__module__\x94h\x14\x8c\x07__doc__\x94N\x8c\x0b__closure__\x94N\x8c\x17_cloudpickle_submodules\x94]\x94\x8c\x0b__globals__\x94}\x94h\nh\x00\x8c\tsubimport\x94\x93\x94h\n\x85\x94R\x94su\x86\x94\x86R0.'
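A payload like the bytes above can be disassembled opcode-by-opcode with the standard pickletools module; a small sketch (the file name is just an assumption about where the bytes were saved):

import pickletools

# Read the raw pickle bytes and print each opcode with its argument, which
# shows which globals, constructors, and arguments the payload references.
with open("payload.bin", "rb") as f:
    pickletools.dis(f.read())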
Traceback (most recent call last):
  File "/home/lukes/cloudpickle_test/python/modelingtests.py", line 24, in <module>
    x_after = pickle.dumps(to_pickle)
  File "/home/lukes/cloudpickle_test/python/cloudpickle/2/2/1/dist/lib/python3.9/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/home/lukes/cloudpickle_test/python/cloudpickle/2/2/1/dist/lib/python3.9/cloudpickle/cloudpickle_fast.py", line 632, in dump
    return Pickler.dump(self, obj)
_pickle.PicklingError: args[0] from __newobj__ args has the wrong class
Pip freeze
Example
main.py
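The original files weren't captured in this issue, so the following is only a rough sketch of the shape of the reproduction described in the summary; the module and function names are assumptions based on the traceback and payload above.

# main.py (hypothetical reconstruction; names are assumptions)
import ddtrace      # imported before cloudpickle -- the order reported to trigger the failure
import cloudpickle

import test         # local helper module, see test.py below


def to_pickle():
    # Defined in the top-level script, so cloudpickle serializes it by value,
    # embedding its code and its global reference to the `test` module.
    return test.test()


# Reported to raise PicklingError ("args[0] from __newobj__ args has the
# wrong class") when ddtrace >1.5.0 is imported first.
payload = cloudpickle.dumps(to_pickle)
print(len(payload))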
test.py
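And a correspondingly minimal helper module (again, the real contents were not included, so this is assumed):

# test.py (hypothetical)
def test():
    return "hello from test"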
Result
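The dumps call fails with the _pickle.PicklingError traceback shown above.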
Expected result
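The function serializes successfully, as it did with ddtrace <1.5.0 and as it still does when cloudpickle is imported before ddtrace.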
Please let me know if I can provide any more info. Thanks!