Moist BC wave + slab + MPI #48

Merged: merged 1 commit into main on May 20, 2022
Conversation

@LenkaNovak (Collaborator) commented May 5, 2022:

TODO

[Screenshot: Screen Shot 2022-05-19 at 8 56 55 PM]

  • performance checks (a sketch of the control setup follows this list)
    • ctrl setup: h_elem = 4, z_elem = 10, nq = 5, time_end = 1e4, dt = 1e2 (all times quoted in seconds)
    • macOS:
      • atmos-only solve!: 77.5 s (as in ClimaAtmos) [no output saved]
      • atmos-only step!: 90.7 s [output saved at every dt]
      • coupled step!: 90.9 s [output saved at every dt]
      • coupled step!: 74.4 s [no output saved]
    • HPC cluster with MPI:

[Screenshot: Screen Shot 2022-05-17 at 10 21 57 PM]
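For reference, a minimal Julia sketch of the control setup above (the field names here are illustrative, not the coupler driver's actual options):

```julia
# Illustrative restatement of the control setup (hypothetical names, not the
# driver's actual keyword arguments):
ctrl_setup = (; h_elem = 4, z_elem = 10, nq = 5, time_end = 1.0e4, dt = 1.0e2)

# Number of coupled steps per timed run implied by this setup:
n_steps = Int(ctrl_setup.time_end ÷ ctrl_setup.dt)   # 100 steps of 100 s each
```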

  • physical checks (the run crashes at day 17, even without the additional heating blob, but this is good enough for now; drag was also checked)

[Screenshot: Screen Shot 2022-05-18 at 8 04 55 AM]

  • comparison between runs with different drag coefficient values (Cd) looks reasonable (NB: the drag values themselves are difficult to check due to a field type conversion issue); a sketch of the bulk drag form follows this list

[Screenshot: Screen Shot 2022-05-19 at 8 47 28 PM]

  • (a comparison with the literature for the uncoupled case is in ClimaAtmos)
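As a point of reference for the Cd comparison above, here is a minimal sketch of a constant-coefficient bulk drag flux, using the generic bulk-aerodynamic form τ = ρ Cd |u| u (the function name and values are illustrative, not the coupler's implementation):

```julia
# Generic bulk-aerodynamic surface stress: τ = ρ_air * Cd * |u| * u
bulk_drag_flux(ρ_air, Cd, u) = ρ_air * Cd * abs(u) * u

# A larger Cd gives a proportionally larger surface stress for the same wind:
τ_weak   = bulk_drag_flux(1.2, 1.0e-3, 10.0)   # ≈ 0.12 N m⁻²
τ_strong = bulk_drag_flux(1.2, 3.0e-3, 10.0)   # ≈ 0.36 N m⁻²
```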

The clean-up PR will include:

  • introduce coupler-specific interface abstractions (Coupler-specific Interface #44)
  • substitute the latest version of the ClimaAtmos interface (TC.jl, limiters, performance improvements, etc.)
  • buildkite hook
  • find coupled cases in the literature, or produce them using existing models, to compare against

@LenkaNovak (Collaborator, Author):

@sriharshakandala nice job on the ClimaComms.jl implementation! 🚀 I was wondering, what's the neatest way of tracking simulation time if we want to add performance checks? Do you use @btime?

@sriharshakandala (Member) commented May 14, 2022:

> @sriharshakandala nice job on the ClimaComms.jl implementation! 🚀 I was wondering, what's the neatest way of tracking simulation time if we want to add performance checks? Do you use @btime?

For measuring total simulation times (on the order of a few seconds or greater), generally speaking, @elapsed should suffice! https://github.com/CliMA/ClimaCore.jl/blob/main/examples/hybrid/driver.jl#L199

@btime is better for microbenchmarking (small pieces of code that take a few nanoseconds to microseconds).
@btime is also impractical for measuring full simulation times, as it runs the function multiple times to gather statistics!
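A minimal sketch of the distinction (`run_simulation` is a hypothetical stand-in; in practice it would be the coupled step!/solve! loop):

```julia
using BenchmarkTools  # provides @btime

# Hypothetical stand-in for a full simulation run.
run_simulation() = sum(sin, 1:10^7)

# @elapsed runs the expression once and returns wall-clock seconds,
# which is appropriate for timing whole simulations:
walltime = @elapsed run_simulation()
@info "simulation walltime (s)" walltime

# @btime re-runs its expression many times to gather statistics,
# so it is suited to small kernels rather than full runs:
@btime sum(sin, 1:1_000)
```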

@LenkaNovak (Collaborator, Author):

> For measuring total simulation times (on the order of a few seconds or greater), generally speaking, @elapsed should suffice!
> @btime is better for microbenchmarking (small pieces of code that take a few nanoseconds to microseconds). @btime is also impractical for measuring full simulation times, as it runs the function multiple times to gather statistics!

Sounds good, thank you!

@LenkaNovak marked this pull request as ready for review on May 18, 2022, 05:51
bottom = Operators.SetValue(.-dif_flux_uₕ),
)

@. Yₜ.c.uₕ += ᶜdivᵥ(ᶠK_E * ᶠinterp(ᶜρ) * ᶠgradᵥ(Y.c.uₕ / ᶜρ))
Review comment (Member), suggested change:

-@. Yₜ.c.uₕ += ᶜdivᵥ(ᶠK_E * ᶠinterp(ᶜρ) * ᶠgradᵥ(Y.c.uₕ / ᶜρ))
+@. Yₜ.c.uₕ += ᶜdivᵥ(ᶠK_E * ᶠgradᵥ(Y.c.uₕ))
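One possible reading of this suggestion (not spelled out in the thread): `Y.c.uₕ` presumably stores velocity rather than a density-weighted variable such as `ρe`, so the `ᶠinterp(ᶜρ)` weighting and the division by `ᶜρ` are unnecessary, and the vertical diffusion can act on `Y.c.uₕ` directly.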

Commit messages:

  • fix
  • coupler_loop_rough
  • spatial BCs, LinearSolver deps update, runs, rough
  • spatial BCs working
  • dry conservation
  • moist conservation, SF.jl, idealised radiation(t)
  • update for new ClimaAtmos interface changes
  • depend on ClimaAtmos checkpoint
  • wip - mpi
  • race condition fix
  • mpi working + conserving
  • fixes for single processor
  • surface flux clean up
  • readme add
  • deps
  • deps
  • deps for cluster
  • scaling
  • scaling tests finished
  • format
  • rm residual output
  • fixes + add constant coefficient fluxes
  • format
@LenkaNovak (Collaborator, Author):

bors r+

@bors (bors bot, Contributor) commented May 20, 2022

bors bot merged commit 72cb15b into main on May 20, 2022
bors bot deleted the ln/bc_wave_moist_mpi branch on May 20, 2022 05:06