Hello, in Databricks Asset Bundles (DAB) you can specify experiments in the resources.experiments section. According to the documentation it uses this endpoint, which creates the experiment if it does not exist and throws an error if it already exists. This feature does not make sense if the deployment fails on the second attempt simply because the first deployment created the experiment and the second one finds that it already exists.
I assume DAB internally handles this so that it does not throw an exception in that scenario, but sometimes it does. An easy way to reproduce it is to have two bundles that reference the same experiment: deploy the first one, then the second. The second deployment fails because the experiment already exists. This scenario does not bother me too much, although I would prefer logic that creates the experiment only if it does not already exist, regardless of whether the deployment comes from a different bundle. However, from time to time deploying the same bundle also fails with this exception, and to resolve it quickly I destroy the bundle and redeploy.
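For illustration, a minimal experiment definition of the kind described above might look like this; the bundle name, experiment key, and workspace path are placeholders rather than the actual configuration from this issue:

```yaml
# databricks.yml (illustrative sketch)
bundle:
  name: create_experiment

resources:
  experiments:
    my_experiment:
      # MLflow experiments are identified by a workspace path
      name: /Users/someone@example.com/shared-experiment

targets:
  dev:
    default: true
```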
Uniqueness of the resource is tracked at the bundle level, hence a resource with the same name ends up being created multiple times.
If you need to deploy the same resource from a different bundle, you can use databricks bundle deployment bind to bind the resource in DABs to the one that was already created.
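As a rough sketch (the resource key and ID below are placeholders; the documented examples cover jobs):

```sh
# Bind the bundle's resource key to an existing workspace object by its ID,
# then redeploy so the bundle manages that object instead of creating a new one.
databricks bundle deployment bind my_job 123456789
databricks bundle deploy
```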
Do you have a reference/doc on how to do that with experiments?
I read the docs and they only mention how to do it with jobs. I tried using the experiment resource key and name, and it says that such a resource does not exist.
Configuration
create_experiment/databricks.yml
create_existing_experiment/databricks.yaml
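A sketch of the shape the second bundle's configuration presumably takes, with the experiment name matching the one declared in create_experiment (names and paths are placeholders):

```yaml
# create_existing_experiment/databricks.yaml (illustrative sketch)
bundle:
  name: create_existing_experiment

resources:
  experiments:
    existing_experiment:
      # same workspace path as the experiment created by the first bundle
      name: /Users/someone@example.com/shared-experiment
```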
Steps to reproduce the behavior
1. Run databricks bundle deploy in the create_experiment directory.
2. Run databricks bundle deploy in the create_existing_experiment directory.
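Equivalently, in shell form (assuming the two bundle directories sit side by side):

```sh
cd create_experiment
databricks bundle deploy            # creates the experiment
cd ../create_existing_experiment
databricks bundle deploy            # fails: experiment already exists
```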
Expected Behavior
It should not fail if the experiment already exists; instead, it could display a warning or message.
Actual Behavior
The deployment fails with an error: cannot create MLflow experiment because it already exists.
OS and CLI version
OS: Ubuntu 23.04 x86_64
Databricks CLI v0.219.0
Debug Logs
log.txt