PETScWrappers::MPI::Vector: add new reinit() #12864

Merged 1 commit on Oct 23, 2021
include/deal.II/lac/petsc_vector.h (9 additions, 0 deletions)
@@ -22,6 +22,7 @@
# ifdef DEAL_II_WITH_PETSC

# include <deal.II/base/index_set.h>
# include <deal.II/base/partitioner.h>
# include <deal.II/base/subscriptor.h>

# include <deal.II/lac/exceptions.h>
@@ -356,6 +357,14 @@ namespace PETScWrappers
void
reinit(const IndexSet &local, const MPI_Comm &communicator);

/**
* Initialize the vector using the parallel partitioning described in
* @p partitioner.
*/
void
reinit(
const std::shared_ptr<const Utilities::MPI::Partitioner> &partitioner);

/**
* Return a reference to the MPI communicator object in use with this
* vector.
source/lac/petsc_parallel_vector.cc (9 additions, 0 deletions)
@@ -254,6 +254,15 @@ namespace PETScWrappers
create_vector(local.size(), local.n_elements());
}

void
Vector::reinit(
const std::shared_ptr<const Utilities::MPI::Partitioner> &partitioner)
{
this->reinit(partitioner->locally_owned_range(),
partitioner->ghost_indices(),
partitioner->get_mpi_communicator());
}


void
Vector::create_vector(const size_type n, const size_type locally_owned_size)
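
For reference, here is a minimal usage sketch of the new overload. It is not part of the PR itself; the toy index layout (10 entries per rank, one ghost entry taken from the neighboring rank) and the program scaffolding are illustrative assumptions.

```cpp
// Minimal sketch (not part of this PR): build a Utilities::MPI::Partitioner
// and hand it to the new PETScWrappers::MPI::Vector::reinit() overload.
// The 10-entries-per-rank layout and the single ghost entry are made up
// for illustration only.
#include <deal.II/base/index_set.h>
#include <deal.II/base/mpi.h>
#include <deal.II/base/partitioner.h>

#include <deal.II/lac/petsc_vector.h>

#include <memory>

using namespace dealii;

int main(int argc, char **argv)
{
  Utilities::MPI::MPI_InitFinalize mpi_init(argc, argv, 1);

  const MPI_Comm     comm    = MPI_COMM_WORLD;
  const unsigned int n_procs = Utilities::MPI::n_mpi_processes(comm);
  const unsigned int rank    = Utilities::MPI::this_mpi_process(comm);

  // Toy partition: each rank owns 10 consecutive entries ...
  const types::global_dof_index n_global = 10 * n_procs;

  IndexSet locally_owned(n_global);
  locally_owned.add_range(10 * rank, 10 * (rank + 1));

  // ... and additionally sees the first entry of the next rank as a ghost.
  IndexSet ghosts(n_global);
  if (rank + 1 < n_procs)
    ghosts.add_index(10 * (rank + 1));

  const auto partitioner =
    std::make_shared<const Utilities::MPI::Partitioner>(locally_owned,
                                                        ghosts,
                                                        comm);

  // The new overload; per the implementation above it forwards to
  //   reinit(partitioner->locally_owned_range(),
  //          partitioner->ghost_indices(),
  //          partitioner->get_mpi_communicator());
  PETScWrappers::MPI::Vector v;
  v.reinit(partitioner);
}
```

Since the overload simply forwards to the existing IndexSet-based reinit(), it is a convenience for codes that already describe their parallel layout through a Utilities::MPI::Partitioner and want a PETSc vector with the same ghosted layout.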