I'm trying to set up a dockerized Celery deployment on a single machine:
I made a docker-compose file with rabbitmq, a dagit master, and dagster celery worker containers.
I had to use a custom celery_config.yaml to overwrite the broker URL with the internal Docker network address.
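For reference, the override I mean looks roughly like this — a sketch only, since the exact keys accepted by dagster-celery depend on your version, and `cube_rabbitmq` is just the service name from my compose file:

```yaml
# celery_config.yaml (sketch) -- passed to the celery workers / executor
# so they reach rabbitmq via the internal Docker network hostname
broker: "amqp://guest:guest@cube_rabbitmq:5672//"
backend: "rpc://"
```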
But I still want to use filesystem storage by mapping all my containers to a single shared volume on disk.
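The volume mapping is the usual docker-compose pattern — a minimal sketch with hypothetical service names and mount path, where every container mounts the same named volume so they see one filesystem:

```yaml
# docker-compose.yml (sketch) -- all services share one storage volume
services:
  dagit:
    volumes:
      - dagster_storage:/opt/dagster/storage
  celery_worker:
    volumes:
      - dagster_storage:/opt/dagster/storage

volumes:
  dagster_storage:
```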
When I try to launch my pipeline I get:
`dagster.check.CheckError: Invariant failed. Description: Must use S3 or GCS storage with non-local Celery broker: amqp://guest:guest@cube_rabbitmq:5672// and backend: rpc://`
It would be nice if our local-filesystem-related failures happened early -- right now, we fail only when a solid looks for an upstream intermediate. Maybe we could have a sentinel file written to the filesystem that sub-executions look for (though they would need to know they are sub-executions, which right now they don't). It's complicated because distributed disjoint executions without dependencies don't actually need a shared filesystem at all.
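To make the sentinel idea concrete, here's a minimal sketch of what such a check could look like — the marker filename and function names are hypothetical, not anything Dagster actually implements:

```python
import os

# Hypothetical marker name; nothing in Dagster uses this file.
SENTINEL = ".dagster_shared_fs_marker"


def write_sentinel(storage_root: str) -> None:
    """The parent execution drops a marker file into the shared storage root."""
    with open(os.path.join(storage_root, SENTINEL), "w") as f:
        f.write("ok")


def check_sentinel(storage_root: str) -> None:
    """A sub-execution fails fast if the marker is not visible on its filesystem,
    i.e. the shared volume was not mounted into this worker's container."""
    if not os.path.exists(os.path.join(storage_root, SENTINEL)):
        raise RuntimeError(
            "Shared filesystem storage is not visible from this worker; "
            "use S3/GCS storage or mount the shared volume."
        )
```

This still wouldn't cover the case above where disjoint executions need no shared filesystem at all, which is part of why the check is nontrivial.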