NO_DUPLICATES causes deadlock as multiple tasks try to insert data into same global temp table #49
Comments
Hi @ankitbko, this happens when using `option("tableLock", True)`.
@pramodnagare See line 85 in 526f08d.
cc: @shivsood
Hello, I am encountering the same issue. I need to use the following options:
Any update on this?
I also encounter this issue when running multiple instances of the same notebook on the same cluster (DBR 7.3 LTS) on Azure Databricks. My notebook loads data from a Delta Lake source table and overwrites the data in an existing Azure SQL Database table using the following options:
The intent is to preserve the table definition (and indexes) in the sink database. I realise TABLOCK is only required for heaps, but this is a generic notebook that loads to both indexed and heap tables.
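A write like the one described above can be sketched as follows. This is a hypothetical illustration, not the reporter's actual notebook: the server, database, table, and credential values are placeholders, and the `df.write` call (commented out, since it needs a live cluster and database) shows how the options would be passed to the connector's `com.microsoft.sqlserver.jdbc.spark` format.

```python
# Placeholder connection and write options for the Apache Spark Connector
# for SQL Server; all values below are hypothetical.
options = {
    "url": "jdbc:sqlserver://myserver.database.windows.net;databaseName=mydb",
    "dbtable": "dbo.target_table",
    "user": "sql_user",
    "password": "********",
    # Truncate rather than drop, preserving the table definition and indexes.
    "truncate": "true",
    # Bulk load with TABLOCK (only strictly needed for heap tables).
    "tableLock": "true",
    # NO_DUPLICATES stages rows through a global temp table, which is the
    # code path implicated in this deadlock.
    "reliabilityLevel": "NO_DUPLICATES",
}

# df.write.format("com.microsoft.sqlserver.jdbc.spark") \
#     .mode("overwrite") \
#     .options(**options) \
#     .save()
```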
Certainly looks like it is caused by the name of the temp staging table not being granular enough. Adding the sink table name to the temp table name should resolve this, yes.
@ankitbko Could you please provide repro scripts? I am not able to repro this issue. |
Closing this issue since no more info was provided and we cannot repro it. Please reopen if more info becomes available.
Because the temp table name is unique only per worker, multiple tasks executing on the same worker end up inserting into the same global temp table and deadlock.
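The proposed fix can be sketched as follows. This is not the connector's actual code; the function name and naming scheme are hypothetical. The idea is that a staging name derived only from the worker or application collides when concurrent tasks write on the same worker, whereas including the sink table name and a per-write suffix makes each staging table unique.

```python
import uuid

def staging_table_name(app_id: str, sink_table: str) -> str:
    """Hypothetical sketch: build a unique global temp staging table name.

    A name based only on app_id collides across concurrent writes on the
    same worker; adding the sink table name and a random per-write suffix
    avoids two tasks inserting into the same global temp table.
    """
    # "##" marks a global temporary table in SQL Server.
    safe_sink = sink_table.replace(".", "_")
    return f"##{app_id}_{safe_sink}_{uuid.uuid4().hex[:8]}"

# Two concurrent writes to the same sink now get distinct staging tables.
a = staging_table_name("app-123", "dbo.orders")
b = staging_table_name("app-123", "dbo.orders")
```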