This came up while investigating this topic on Discourse. The following script iterates over a few backends and prints the gradient computed by each. Depending on the size of N (and some luck), the Enzyme backends either return different results from FiniteDiff and Zygote, or the script segfaults. When it segfaulted, I got the following error:
```
[74750] signal (11.2): Segmentation fault: 11
in expression starting at REPL[19]:1
.Ldgemm_tcopy_L8_M8_80 at ~/.julia/juliaup/julia-1.10.9+0.aarch64.apple.darwin14/lib/julia/libopenblas64_.dylib (unknown line)
Allocations: 87827445 (Pool: 87719349; Big: 108096); GC: 101

[1] 74750 segmentation fault  julia +lts --project=. -t4
```
```julia
using AbstractGPs, DifferentiationInterface, Optim, StatsFuns, Random
const DI = DifferentiationInterface
import Enzyme
import FiniteDiff
import Zygote

rng = MersenneTwister(1)
N = 1000
x = randn(rng, N)
y = randn(rng, N)

function loss_function(x, y)
    function negativelogmarginallikelihood(params)
        kernel =
            softplus(params[1]) * (Matern52Kernel() ∘ ScaleTransform(softplus(params[2])))
        noise_var = 0.1
        f = GP(kernel)
        fx = f(x, noise_var)
        return -logpdf(fx, y)
    end
    return negativelogmarginallikelihood
end

θ0 = randn(rng, 2)

losses = [
    loss_function(x[1:4], y[1:4]),
    loss_function(x, y),
]

backends = [
    AutoFiniteDiff(),
    AutoZygote(),
    AutoEnzyme(
        mode=Enzyme.Reverse,
        function_annotation=Enzyme.Duplicated,
    ),
    AutoEnzyme(
        mode=Enzyme.set_runtime_activity(Enzyme.Reverse),
        function_annotation=Enzyme.Const,
    ),
]

for loss in losses
    for backend in backends
        grad = DI.gradient(loss, backend, θ0)
        @info "Gradient" backend grad
    end
end
```
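To make the disagreement explicit rather than eyeballing the `@info` output, here is a minimal sketch that compares each backend against a FiniteDiff reference. It assumes the `losses`, `backends`, and `θ0` from the script above are in scope, and the tolerance is an arbitrary choice for illustration:

```julia
# Compare each backend's gradient against a FiniteDiff reference.
# Assumes `losses`, `backends`, and `θ0` from the script above are defined;
# the relative tolerance is an arbitrary choice for illustration.
for loss in losses
    reference = DI.gradient(loss, AutoFiniteDiff(), θ0)
    for backend in backends
        grad = DI.gradient(loss, backend, θ0)
        if !isapprox(grad, reference; rtol=1e-4)
            @warn "Gradient mismatch" backend grad reference
        end
    end
end
```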
```julia
julia> versioninfo()
Julia Version 1.10.9
Commit 5595d20a287 (2025-03-10 12:51 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: macOS (arm64-apple-darwin24.0.0)
  CPU: 12 × Apple M3 Pro
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-15.0.7 (ORCJIT, apple-m1)
Threads: 4 default, 0 interactive, 2 GC (on 6 virtual cores)
Environment:
  JULIA_EDITOR = code
```