Implement Poisson solver based on integrated Green functions #4648

Merged: 112 commits, merged into ECP-WarpX:development on Apr 3, 2024

Conversation

@aeriforme (Member) commented Jan 30, 2024

This uses the method from:
https://journals.aps.org/prab/abstract/10.1103/PhysRevSTAB.9.044204
https://journals.aps.org/prab/pdf/10.1103/PhysRevSTAB.10.129901

Because this uses the same FFT infrastructure as PSATD, the code currently needs to be compiled with -DWarpX_PSATD=ON.
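
For example, a typical out-of-source CMake build with this option enabled could look like the following (paths and the number of build jobs are illustrative):

```bash
# Configure and build WarpX with the FFT/PSATD infrastructure enabled,
# which the IGF Poisson solver currently relies on.
cmake -S . -B build -DWarpX_PSATD=ON
cmake --build build -j 4
```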

Also: since the solver uses a global FFT, right now only one MPI rank participates in the FFT solve (but this MPI rank can still use a full GPU); this could be improved later by using heFFTe. The rest of the PIC operations (charge deposition, gather, etc.) are still performed by the different MPI ranks in parallel.
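
A minimal sketch of the single-rank gather/scatter pattern described above, using AMReX's MultiFab::ParallelCopy (variable and function names are illustrative, not the actual WarpX implementation):

```cpp
#include <AMReX_MultiFab.H>

// Gather the distributed charge density onto one box owned by MPI rank 0,
// solve there with the global FFT, then redistribute the potential.
void solve_on_single_rank (amrex::MultiFab const& rho, amrex::MultiFab& phi,
                           amrex::Box const& domain)
{
    amrex::BoxArray ba_one_box(domain);                            // a single box covering the whole domain
    amrex::DistributionMapping dm_one_rank(amrex::Vector<int>{0}); // assigned to rank 0
    amrex::MultiFab rho_global(ba_one_box, dm_one_rank, 1, 0);
    amrex::MultiFab phi_global(ba_one_box, dm_one_rank, 1, 0);

    rho_global.ParallelCopy(rho);   // every rank ships its local piece of rho to rank 0
    // ... rank 0 computes phi_global from rho_global via the FFT-based IGF solve ...
    phi.ParallelCopy(phi_global);   // scatter the result back to the original layout
}
```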

Comparison with the Bassetti-Erskine formula (see the test script in this PR):
[comparison plot attached]

TODO

  • Test on GPU
  • Add back the relativistic factor in dx
  • Overcome limitation with single MPI rank (use ParallelCopy)
  • Fix bug in the multi-GPU implementation, in the case grid_type = collocated (this could be done with the regular Poisson solver instead of the relativistic one, so that we can output phi)
  • Add timers to the FFT and the calculation of the IGF
  • Finalize the automated test
  • Decide on the user interface: introduce a new parameter warpx.poisson_solver that can be set to multigrid, fft_based, etc., and decide how to handle compatibility with boundary conditions (see the example input snippet after this list):
    if this is an electromagnetic simulation with PML and initialize_self_fields, then use the IGF solver, which implies open boundaries
    the IGF solver is only compatible with open boundaries, and vice versa
    multigrid is not compatible with open boundaries
  • This should follow the same logic as the Maxwell solver. In the PICMI interface, there is already a “method” parameter for the Poisson solver.


  • add new boundary condition “open” that is only valid for Poisson-like field solvers + organize documentation to group boundaries in similar types


  • Abort if someone tries to use mesh refinement; alternatively, if mesh refinement is on and the user has selected the IGF solver, use the IGF solver only on the coarsest level (lev = 0) and MLMG in the refined patches
  • Properly set guard cells of phi outside of the global domain (useful for grid_type = collocated)
  • Save the FFT plans in a permanent object, so that we do not need to recompute them. Probably not needed: from the timers, it seems that the creation and destruction of the FFT plans do not take any significant time.
  • Clean up; move the code to a different file, maybe IntegratedGreenFunctionSolver.H, in ablastr
  • Handle dimensionalities that are not 3D: abort if we are not in 3D
  • Address the following warning by calling the function in VectorPoissonSolver.H (for the magnetostatic solver):

```
/Users/rlehe/Documents/codes/warpx_directory/warpx/Source/ablastr/fields/PoissonSolver.H:69:1: warning: function 'computePhiIGF' is not needed and will not be emitted [-Wunneeded-internal-declaration]
computePhiIGF ( amrex::MultiFab const & rho,
```

  • Fix ParallelCopy error that appears in DEBUG mode:

```
Assertion `boxArray().ixType() == src.boxArray().ixType()' failed, file "/global/u2/a/aforment/src/war
```
  • Documentation
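
As a concrete illustration of the user interface discussed above, a hypothetical inputs snippet could look like the following (the parameter name warpx.poisson_solver, its values, and the "open" boundary name are taken from the proposals in this list, not from a finalized interface):

```
# Hypothetical inputs, following the naming proposed in the TODO list above
warpx.poisson_solver = fft_based      # or: multigrid (the default MLMG solver)
boundary.field_lo    = open open open # the IGF solver implies open boundaries
boundary.field_hi    = open open open
```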

All complex types are equal, but some complex types are more equal than others.
-- George Orwell
@ax3l (Member) left a comment:

Awesome 🚀 ✨

@RemiLehe RemiLehe enabled auto-merge (squash) April 3, 2024 14:35
@RemiLehe RemiLehe disabled auto-merge April 3, 2024 15:08
@RemiLehe (Member) commented Apr 3, 2024

@aeriforme Are you able to reproduce the Sphinx documentation build on your local computer? I wonder why it fails here and not on other PRs.

@ax3l (Member) commented Apr 3, 2024

The openbc_poisson_solver test fails with a checksum issue right now.

> Are you able to reproduce the Sphinx documentation build on your local computer? I wonder why it fails here and not on other PRs.

Found it 🎉 Just a small typo, the bib file has invalid syntax. Will fix.

Docs/source/refs.bib: 3 review comments (outdated, resolved)
@RemiLehe (Member) commented Apr 3, 2024

I updated the checksum. It was a negligible change in Bz. (Bz is supposed to be almost 0 for a relativistic beam propagating along z.)

@RemiLehe RemiLehe enabled auto-merge (squash) April 3, 2024 20:50
@RemiLehe RemiLehe merged commit 30ed0fb into ECP-WarpX:development Apr 3, 2024
45 checks passed
@ax3l ax3l mentioned this pull request May 9, 2024
Comment on lines +162 to +163:

```cpp
#if (defined(WARPX_USE_PSATD) && defined(WARPX_DIM_3D))
// Use the Integrated Green Function solver (FFT) on the coarsest level if it was selected
```
@ax3l (Member) commented May 9, 2024

Small update I would suggest:

  • If FFTs were not compiled in, but is_solver_multigrid=false is passed to this function, then this silently falls back to an MLMG solve for the lowest level. This should raise a runtime error, because we cannot fulfill the API/user request.
    • An alternative, throwing a warning and a "loud" fallback to MLMG, is not ideal in practice, because boundaries/padding & resolution would be quite different.
  • I realized the is_solver_multigrid name is a bit confusing, because we do the IGF only on level 0 and an MLMG on all others in MR. Maybe we can find a good alternative name, e.g., is_solver_igf_on_lev0 (most explicit/good?) or is_mlmg_all_levels (kinda like now).

@aeriforme (Member, Author) replied:

The first point does not really apply to WarpX, I believe.
There's an assert that checks if FFTs are compiled when the IGF solver is selected.
It's here.
Anyway, I added a similar assert in ablastr in the follow-up PR #4945.
Let me know if this solution is ok for you.
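
For reference, a minimal sketch of what such a guard could look like (the flag name is_solver_igf_on_lev0 comes from the naming discussion above; the exact check added in ablastr in #4945 may differ):

```cpp
// Abort at runtime if the IGF (FFT-based) solver is requested
// but the code was compiled without the FFT infrastructure.
#if !defined(WARPX_USE_PSATD)
    AMREX_ALWAYS_ASSERT_WITH_MESSAGE(!is_solver_igf_on_lev0,
        "The integrated Green function Poisson solver requires compiling with FFT support (-DWarpX_PSATD=ON).");
#endif
```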

@ax3l (Member) commented May 30, 2024

Thanks! Yes, we need the check in ABLASTR directly because that is a library that we also call from ImpactX 🙏

Labels: component: electrostatic (electrostatic solver)
3 participants