Performance Regression in 1.5 #36941
Comments
[EDIT: I can reproduce the better MWE in the next comment.] I tried your code (starting at
What I would do for your version on 1.4:
My theory is that you do not have the same versions, and the regression is in LightGraphs. I'm not saying it isn't there; just make sure you use the same versions in both. Then report it there, or wherever it turns out to be, and until it is fixed you could use the older version on 1.5 too.
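For reference, a minimal sketch of what "use the older version" could mean in practice, assuming the suspect package is LightGraphs and using a placeholder version number (not taken from the thread):

using Pkg

# Pin the package to the version that was installed under Julia 1.4.
# "1.3.5" is a placeholder; substitute the version from the 1.4 Manifest.toml.
Pkg.add(Pkg.PackageSpec(name = "LightGraphs", version = "1.3.5"))
Pkg.pin("LightGraphs")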
It's the same version of LightGraphs on Julia 1.4 and 1.5. Either way, B_t is of type Transpose(SparseMatrix), so LightGraphs is only used to initialize the matrix and doesn't show up in the performance-critical code. I investigated some more and could simplify it further:

using BenchmarkTools
N = 100
x0 = ones(N)
A_t = transpose(ones(Int64, N, 5*N))
@btime $A_t * $x0
This has the following timings on my machine:

If everything is Float, or without the transpose, there is no performance difference, so the origin here seems to be the interaction of transpose, Int and Float (a sketch of these control cases follows below). So I also had a look at pure Integer performance, and there is a major regression here:

using BenchmarkTools
N = 100
x0 = ones(Int64, N)
A_t = transpose(ones(Int64, N, 5*N))
@btime $A_t * $x0

Julia 1.4: 15.053 μs (1 allocation: 4.06 KiB)

My versions are:

I'm running on stock Ubuntu 18.04 and on the same Project.toml/Manifest.toml with only BenchmarkTools in it. Also, on Discourse someone could reproduce the original results on macOS.
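For completeness, the control cases mentioned above (everything Float, and no transpose) might look like the following sketch; the exact control code isn't shown in the thread, so this is an assumed reconstruction:

using BenchmarkTools

N = 100

# All-Float control: reportedly no performance difference between 1.4 and 1.5
x0_f = ones(N)
A_f = transpose(ones(N, 5*N))
@btime $A_f * $x0_f

# No-transpose control: also reportedly no difference
A_i = ones(Int64, 5*N, N)
x0 = ones(N)
@btime $A_i * $x0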
I can confirm:
I can also confirm the regression for the former example; there, the numbers are closer to yours.
Thanks for the nice reproducer! Can you verify whether #36975 fixes the issue for you? You can simply monkey-patch, i.e. load the changed method definitions into your running session (see the sketch below).
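In case it helps, here is a rough sketch of what monkey-patching a PR like #36975 can look like; the actual method definitions have to be copied from the PR diff and are not reproduced here:

using LinearAlgebra

# Evaluate the revised method definitions from the PR inside the stdlib module
# of the running session (no rebuild of Julia needed). The body below is a
# placeholder; paste the methods changed by #36975 in its place.
@eval LinearAlgebra begin
    # ... method definitions from the PR diff go here ...
end

# Equivalently, save the changed methods to a file and load it, as done with
# include("patch.jl") later in this thread.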
I tested #36975 on my machine with the MWE above:
OS: macOS (x86_64-apple-darwin18.7.0)
I can confirm that the second bunch of minimal working examples is fixed by this. However, the performance regression in sparse Int * Float that I reported at the top remains. Here is a new MWE that doesn't use LightGraphs for initialization:

using Pkg
Pkg.activate(@__DIR__)
println("Julia $VERSION :")
using BenchmarkTools
using SparseArrays
using Random
Random.seed!(3)
N = 100
B1 = sprand(Int64, N, 5*N, 0.1)
B = transpose(B1)
x0 = ones(N)
println("Sparse Int * Float")
@btime B * x0
include("patch.jl")
println("Sparse Int * Float patched")
@btime B * x0
Julia 1.4.1:

Julia 1.5.0:

I don't have time in the next few days to run more thorough experiments on the various permutations of sparse, types, and transpose that show up here (a sketch of such a sweep follows below). Files are attached (including the patch.jl I include above, which fixes the second regression reported) if someone else wants to have a look.
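For anyone picking this up, a sketch of how those permutations could be swept; the set of combinations below is an assumption based on the cases discussed in this thread:

using BenchmarkTools, SparseArrays, Random

Random.seed!(3)
N = 100

# Dense/sparse matrix, Int/Float elements, with and without transpose;
# the right-hand-side vector is Float64 throughout, as in the MWEs above.
for (name, A) in (
        "dense Int" => ones(Int64, N, 5*N),
        "dense Float" => ones(N, 5*N),
        "sparse Int" => sprand(Int64, N, 5*N, 0.1),
        "sparse Float" => sprand(N, 5*N, 0.1),
    )
    x = ones(5*N)            # for A * x
    xt = ones(N)             # for transpose(A) * xt
    At = transpose(A)
    println(name)
    @btime $A * $x
    @btime $At * $xt
end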
Then we'll call this fixed by #36975.
Note that only one (the far more dramatic one) of the two performance issues reported above is reported as fixed by this.
I have the following performance regression:
with the following code: