Ensure type stability for Julia benchmarks #191

Open

GiggleLiu opened this issue Apr 8, 2020 · 2 comments

GiggleLiu commented Apr 8, 2020

The Julia code in this repo is not type stable, which causes a more than 10x slowdown.

Whenever a variable's type cannot be inferred, Julia cannot compile efficient code for you.
For example, using a global variable like this causes type instability:

d = size(means,1)

To avoid type instability, one should

  • avoid using global variables; if one is necessary, add the const keyword (see the sketch after this list),
  • not change the type of a variable inside a loop,
  • avoid closures when they are not necessary.
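
As a minimal sketch of the global-variable pitfall above and its fixes (the function names and toy data below are made up for illustration, not taken from this repo):

means = randn(3, 5)              # non-const global: its type is not fixed

# Type-unstable: `means` is a non-const global, so `d` is inferred as `Any`.
function unstable_dim()
    d = size(means, 1)
    return 2 * d
end

# Type-stable: pass the data as an argument so the compiler knows its type...
function stable_dim(means::AbstractMatrix)
    d = size(means, 1)           # now inferred as Int
    return 2 * d
end

# ...or, if a global really is needed, fix its type with `const`.
const MEANS = randn(3, 5)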

Type instability can be spotted by running @code_warntype f(x).
For more information, see the performance tips:
https://docs.julialang.org/en/v1/manual/performance-tips/
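
For instance, running @code_warntype on the two sketch functions above makes the difference visible:

@code_warntype unstable_dim()      # `d` and the return type show up as `Any` (highlighted in red)
@code_warntype stable_dim(MEANS)   # everything is inferred as a concrete `Int`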

Also, please take a look at this package for benchmarking:
https://github.com/JuliaCI/BenchmarkTools.jl
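
A minimal usage sketch, again reusing the toy functions above (the `$` interpolation tells BenchmarkTools to treat the argument as a local, so only the call itself is timed):

using BenchmarkTools

@btime unstable_dim()            # dominated by dynamic dispatch on the non-const global
@btime stable_dim($MEANS)        # fast, type-stable path

# For a full report including allocation statistics:
@benchmark stable_dim($MEANS)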

Fixing this issue will definitely make the benchmarks in your paper more reliable.

GiggleLiu (Author) commented Apr 16, 2020

I have set up a new repo to show how to fix the Gaussian mixture Julia code here:
https://github.com/JuliaReverse/NiGaussianMixture.jl

ForwardDiff shows at least one order of magnitude speedup.
The ForwardDiff code is here:
https://github.com/JuliaReverse/NiGaussianMixture.jl/blob/master/src/forwarddiff.jl

The benchmark script is here:
https://github.com/JuliaReverse/NiGaussianMixture.jl/blob/master/benchmarks/forwarddiff.jl
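
Presumably the linked code boils down to ForwardDiff.gradient applied to a type-stable objective. A minimal sketch of that pattern (the objective below is a placeholder, not the actual GMM log-likelihood from NiGaussianMixture.jl):

using ForwardDiff

# Placeholder objective: a type-stable function of a flat parameter vector.
function objective(θ::AbstractVector)
    return sum(abs2, θ) / 2
end

θ = randn(10)
g = ForwardDiff.gradient(objective, θ)   # forward-mode AD, no global state involved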

Applying these fixes will definitely make the benchmark results more reliable.

If you are interested in benchmarking NiLang, a reversible-programming AD framework, I will be happy to contribute to this part.

GiggleLiu (Author) commented

The code for the second benchmark, BA (bundle adjustment), is here:

https://github.com/JuliaReverse/NiBundleAdjustment.jl/blob/master/README.md

Judging from the allocations, ForwardDiff should still have some room for improvement.
NiLang also shows performance comparable to Tapenade.
