[REVIEW] Justify (or tone-down) performance claims in paper and docs #34
The paper and documentation make multiple claims about performance without benchmarks or justification. I am not convinced that simple models, like the examples here, are that much faster in their Julia implementations relative to good implementations in Python or R (especially because, in practice, a lot of the performance-critical steps in R/Python take advantage of highly optimized Fortran/C routines). Some suggestions (you don't have to do all of these; just some ideas):

(1) Cite existing Julia performance benchmarks to highlight why some kinds of operations in Julia are as fast as, or faster than, interpreted (and sometimes, compiled!) languages. Besides the relevant engineering/CS literature on Julia, existing scientific programming efforts in Julia, like the CliMA group, may have good numbers for this.

(2) Re-frame some sections as, "The modularity and flexibility of Julia's design allow users to take advantage of performance-enhancing libraries like FLoops.jl, DifferentialEquations.jl, etc." (see the sketch below this list).

(3) Include some performance statistics for simulations. These don't necessarily have to be comparisons: most people don't care about something being the fastest, just that it's fast enough (especially when you add all the other great functionality that this package provides).

openjournals/joss-reviews#5371
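As an illustration of suggestion (2), here is a minimal sketch of the kind of re-framing that could go in the docs, assuming a hypothetical `toy_model` function (a placeholder, not anything from this package) and using FLoops.jl to parallelize independent model runs:

```julia
# Sketch only: `toy_model` is a hypothetical stand-in for one of the
# package's example models, used to show how FLoops.jl slots in.
using FLoops

toy_model(x) = exp(-x) * sin(x)  # placeholder computation

function total_output(inputs)
    @floop for x in inputs
        y = toy_model(x)
        @reduce(total += y)  # reduction that is safe under parallel execution
    end
    return total
end

total_output(rand(10_000))
```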
@ashiklom, can you point me to any publication demonstrating performance from the CliMA group? I found this page, but there are so many publications there... We show a comparison with interpreted languages in the paper we are preparing for PlantBiophysics, but it does not represent the entire field of plant modelling, so it's not entirely relevant here. So I'm preparing a little benchmark with the new toy models I added in the examples folder.
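For example, a minimal sketch of such a benchmark with BenchmarkTools.jl, where `toy_model` is a hypothetical stand-in for one of those toy models:

```julia
# Hypothetical benchmark sketch; `toy_model` is a placeholder, not
# necessarily what is in the examples folder.
using BenchmarkTools

toy_model(x) = exp(-x) * sin(x)  # stand-in computation

x = rand(1_000)
@btime toy_model.($x);  # `$x` interpolation avoids timing global-variable access
```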
I think just the simple comparison from the PlantBiophysics paper that you describe would be sufficient. Even better than anything from CliMA, I think, because you are showing a direct comparison for exactly the computations that this package is doing. If you will not have those comparisons for some time, it's also fine to just tweak the language to say something like, "Julia can achieve significantly better performance than typical interpreted languages", and cite any studies or websites that have shown this. The performance is not the main selling point of this package, so this doesn't have to be perfect.
Thank you for the quick answer! I propose the changes in acdfc72.