Parallel + DB Locking #322

Comments

Hi,
I have a regular batch job that saves artifacts to a local repo, and I was using mcparallel to run it. However, one of the tasks failed with a database-locking error.
What type of parallel process should I use to avoid the DB locking issue?
Very interesting use case. I do not see how this can be solved for SQLite. The best solution is to use a more advanced database, such as PostgreSQL. Try createPostgresRepo() and setPostgresRepo(); Postgres will handle concurrent access to a single database.
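A minimal sketch of what that switch could look like. Only the two function names come from the comment above; the connection-based signature and all connection details are assumptions, so check the package documentation for the actual arguments:

```r
# Hypothetical sketch: createPostgresRepo() and setPostgresRepo() are named
# in the comment above, but their signatures here are guesses.
library(archivist)

# Placeholder connection details for a running PostgreSQL server.
con <- DBI::dbConnect(RPostgres::Postgres(),
                      dbname = "archivist", host = "localhost",
                      user = "archivist", password = "secret")

createPostgresRepo(connection = con)  # assumed signature
setPostgresRepo(connection = con)     # assumed signature

# saveToRepo() calls would now go through Postgres, which handles
# concurrent writers itself, instead of a single SQLite file.
```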
Hi, I was dealing with a similar issue, and my solution was to create `save_to_repo_extended` using the flock package:

```r
save_to_repo_extended <- function(artifact,
                                  repoDir = archivist::aoptions("repoDir"),
                                  archiveData = TRUE,
                                  archiveTags = TRUE,
                                  archiveMiniature = TRUE,
                                  archiveSessionInfo = TRUE,
                                  force = TRUE, value = FALSE,
                                  ...,
                                  userTags = c(),
                                  silent = archivist::aoptions("silent"),
                                  ascii = FALSE,
                                  artifactName = deparse(substitute(artifact)),
                                  file_lock = NULL) { # path to a shared lock file
  if (!is.null(file_lock)) {
    # Lock the critical section so only one process writes to the repo at a
    # time; on.exit() releases the lock even if saveToRepo() throws an error.
    locker <- flock::lock(file_lock)
    on.exit(flock::unlock(locker))
  }
  # Arguments that follow ... in saveToRepo()'s signature must be passed by
  # name; passed positionally they would be swallowed by ... instead.
  archivist::saveToRepo(artifact, repoDir, archiveData, archiveTags,
                        archiveMiniature, archiveSessionInfo, force,
                        value, ...,
                        userTags = userTags, silent = silent,
                        ascii = ascii, artifactName = artifactName)
}
```
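A minimal usage sketch of the wrapper above, assuming the repository was already created (the repo path and lock-file location are placeholders): every forked worker passes the same file_lock path, so writes to the repo's SQLite database are serialized.

```r
# Hypothetical usage: all workers share one lock file, so saveToRepo()
# calls never hit the SQLite database concurrently.
library(parallel)

repo <- "my_repo"                              # placeholder repo path,
                                               # e.g. from createLocalRepo()
lock_path <- file.path(tempdir(), "repo.lock") # shared lock file

# Fork one job per artifact (mcparallel is Unix-only, as in the question).
jobs <- lapply(1:4, function(i) {
  mcparallel(save_to_repo_extended(data.frame(run = i),
                                   repoDir = repo,
                                   file_lock = lock_path))
})
mccollect(jobs)  # wait for all forked jobs to finish
```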
Thanks, will try later.
Thank you guys for this interesting use case. |