
Prevent postgres deadlocking #3256

Merged
merged 1 commit into master on Feb 11, 2019

Conversation

@clarafu (Contributor) commented Feb 8, 2019

When we update the cache indexes for all the pipelines that use a
resource config for which we found new versions, postgres can get into a
deadlock when two resource configs try to update the same pipeline's
cache indexes at the same time. By running each cache index update
by itself (not in a transaction), we prevent that from happening.
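
As a rough illustration (not the actual Concourse code — all names here are hypothetical), the fix can be sketched in Go by modeling each pipeline's cache-index row lock as a mutex. When each update runs as its own implicit single-statement transaction, a worker holds at most one "row lock" at a time, so no circular wait can form even when two resource configs touch the same pipelines in opposite orders:

```go
package main

import (
	"fmt"
	"sync"
)

// runUpdates simulates two resource configs bumping the cache indexes
// of two shared pipelines in opposite orders. Each mutex stands in for
// a Postgres row lock. Because every update locks exactly one pipeline
// at a time (like a standalone statement, rather than a transaction
// that holds every row lock until commit), the goroutines can never
// deadlock, and the function always returns.
func runUpdates() [2]int {
	var locks [2]sync.Mutex // one "row lock" per pipeline
	var indexes [2]int      // the pipelines' cache indexes

	var wg sync.WaitGroup
	// resource config A touches pipeline 0 then 1;
	// resource config B touches pipeline 1 then 0.
	for _, order := range [][2]int{{0, 1}, {1, 0}} {
		wg.Add(1)
		go func(order [2]int) {
			defer wg.Done()
			for _, p := range order {
				locks[p].Lock()
				indexes[p]++ // bump this pipeline's cache index
				locks[p].Unlock()
			}
		}(order)
	}
	wg.Wait()
	return indexes
}

func main() {
	// each pipeline is bumped once per resource config
	fmt.Println(runUpdates()) // prints "[2 2]"
}
```

Holding all the row locks inside one transaction would instead allow the classic circular wait: A holds pipeline 0's lock and waits for pipeline 1's, while B holds pipeline 1's and waits for pipeline 0's.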

@vito (Member) left a comment

note: there's probably a better way to do this, but I think it can wait until we start on bigger architecture changes.

this does the same number of updates as pre-global-resources, and there will still be fewer queries overall.

atc/db/resource_config_scope.go (review thread resolved)

Signed-off-by: Clara Fu <cfu@pivotal.io>
Co-authored-by: Alex Suraci <suraci.alex@gmail.com>
@vito vito merged commit 77eeea1 into master Feb 11, 2019
@vito vito deleted the prevent-deadlock branch February 11, 2019 18:19
@vito vito added the release/no-impact This is an issue that never affected released versions (i.e. a regression caught prior to shipping). label Feb 19, 2019