Packaging code for Dask execution #3156
-
It seems to me that since we ship a …
-
Our current Dask integration is pretty basic: it operates under the assumption that the Dask workers have access to the same Dagster definitions as the process that is executing the pipeline run.
Yes, the pipeline and solid code also need to be available on the workers.
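For reference, a minimal sketch of what such a shared definitions module might look like (module and solid names here are illustrative, the legacy pipeline/solid APIs are assumed, and the exact `dask_executor` run-config schema depends on your dagster / dagster-dask version):

```python
# repo.py -- must be importable (same code, same version) by both the
# process launching the run (e.g. dagit) and every Dask worker.
from dagster import ModeDefinition, default_executors, pipeline, solid
from dagster_dask import dask_executor


@solid
def my_solid(context):
    # This body executes on whichever Dask worker picks up the step,
    # so the worker needs this module on its PYTHONPATH too.
    context.log.info("running on a dask worker")
    return 1


@pipeline(
    mode_defs=[
        ModeDefinition(executor_defs=default_executors + [dask_executor])
    ]
)
def my_pipeline():
    my_solid()
```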
The model we use in our Kubernetes deployment is to use the same Docker image for both the "workers" and the box running dagit (or the user code deployments dagit is set up to talk to). If you're not using Docker, you could use a scheme where you check out the same git repo at the same commit in both places, or something similar to that.
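As a quick sanity check that both places really do see the same code, you can ask the Dask scheduler to run a small import probe on every worker. This is just a sketch: the scheduler address is illustrative, and `repo` refers to the hypothetical shared definitions module above.

```python
from dask.distributed import Client

client = Client("tcp://dask-scheduler:8786")  # illustrative address


def code_version():
    # Import the shared definitions on the worker and report where/what
    # they came from, so mismatched images or checkouts show up quickly.
    import dagster
    import repo  # hypothetical module holding the pipeline/solid defs

    return {"dagster": dagster.__version__, "repo_file": repo.__file__}


# client.run executes the function on every connected worker and returns
# a dict keyed by worker address.
print(client.run(code_version))
```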