
disable multithreading with PETSc #2265

Merged: 1 commit into geodynamics:master on May 24, 2018

Conversation

@tjhei (Member) commented May 23, 2018

No description provided.

@bangerth (Contributor) commented

Oh, why is this? Don't we carefully synchronize the assembly of the linear systems, for example? Or are you thinking of the postprocessors accessing data vectors in parallel?

@tjhei (Member, Author) commented May 23, 2018

Read access cannot happen on two threads at the same time because we (lock, read, unlock), so yes, something like DataOut will fail. This is asserted here: https://github.com/dealii/dealii/blob/master/source/lac/petsc_vector_base.cc#L119

@bangerth (Contributor) commented

?? So you say that we (lock, read, unlock), but that we also just error out if threading is enabled? Then why do we do the lock/unlock at all?

But if I understand you correctly, then you want to disable threading with PETSc to avoid the assertion in VectorBase. Is this correct?

@tjhei (Member, Author) commented May 24, 2018

> Then why do we do the lock/unlock at all?

Oh, it's not a mutex: to read from a PETSc vector we use VecGetArray()/VecRestoreArray(), and that operation is not thread-safe. To avoid this problem, we assert that multithreading is not enabled when you create a PETSc vector.
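
For context, here is a minimal sketch of the (lock, read, unlock) pattern described above. This is not the actual deal.II code: `read_element` is just an illustrative helper, and PETSc error checking is omitted.

```cpp
// Minimal sketch of the (lock, read, unlock) access pattern discussed
// above. Not the actual deal.II code; PETSc error checking is omitted.
#include <petscvec.h>

// Illustrative helper: every element read checks the raw array out of
// PETSc and hands it back afterwards.
PetscScalar read_element(Vec v, PetscInt local_index)
{
  PetscScalar *array;
  VecGetArray(v, &array);                        // "lock": borrow the raw array
  const PetscScalar value = array[local_index];  // "read"
  VecRestoreArray(v, &array);                    // "unlock": return the array
  return value;
}

int main(int argc, char **argv)
{
  PetscInitialize(&argc, &argv, nullptr, nullptr);

  Vec v;
  VecCreateSeq(PETSC_COMM_SELF, 10, &v);
  VecSet(v, 1.0);

  // Fine from a single thread. Two threads calling read_element() on
  // the same Vec can interleave the Get/Restore calls, which is exactly
  // the situation that is not thread-safe.
  const PetscScalar x = read_element(v, 3);
  (void)x;

  VecDestroy(&v);
  PetscFinalize();
}
```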

> But if I understand you correctly, then you want to disable threading with PETSc to avoid the assertion in VectorBase. Is this correct?

Yes.
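
For reference, a sketch of what such a change could look like. This is not necessarily the literal diff of this PR, and the `ASPECT_USE_PETSC` macro is hypothetical, but `MultithreadInfo::set_thread_limit()` is deal.II's standard way to cap the number of threads:

```cpp
// Sketch only: force single-threaded execution when built with PETSc.
// ASPECT_USE_PETSC is a hypothetical configuration macro; the call to
// MultithreadInfo::set_thread_limit() is real deal.II API.
#include <deal.II/base/multithread_info.h>

void disable_threading_for_petsc()
{
#ifdef ASPECT_USE_PETSC
  // With the thread limit at 1, no two threads can ever call
  // VecGetArray() on the same PETSc vector, so the assertion in
  // PETScWrappers::VectorBase is not triggered.
  dealii::MultithreadInfo::set_thread_limit(1);
#endif
}
```

Something like this would have to run before any PETSc vector is created, e.g. early in main().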

@bangerth merged commit e8948fc into geodynamics:master on May 24, 2018