Parallel Sequential Monte Carlo #6
@yebai: I've reproduced the code from the issue JuliaLang/julia#19450 you opened, and I do not get any segfault. Yet, it seems that JuliaLang/julia#10441 is still a problem. I can't find the …
The solution posted at https://discourse.julialang.org/t/how-do-i-deal-with-random-number-generation-when-multithreading/5636/2 seems to be a good way to deal with random numbers while using multiple threads.
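The pattern suggested in that Discourse thread is to give each thread its own RNG instead of sharing the global one. A minimal sketch of that idea (seeds and names are illustrative, not the code from the post):

```julia
using Random

# One independent RNG per thread, seeded distinctly (seeds are illustrative).
const rngs = [MersenneTwister(1234 + t) for t in 1:Threads.nthreads()]

function threaded_sum(n)
    acc = zeros(Threads.nthreads())
    # :static pins iterations to threads, so threadid() indexing stays valid
    Threads.@threads :static for i in 1:n
        tid = Threads.threadid()
        acc[tid] += rand(rngs[tid])   # each thread touches only its own stream
    end
    return sum(acc)
end
```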
Thanks for investigating this. Please find my code in the following branch: https://github.com/yebai/Turing.jl/tree/hg/parallelsmc. P.S. It might be broken since it was written quite a while ago.
I just implemented a version which seems to work on the `multithreaded-PG` branch. See: https://github.com/TuringLang/Turing.jl/blob/multithreaded-PG/src/core/container.jl#L152-L180
FYI, a few months ago I implemented a parallel version of IPMCMC: https://github.com/emilemathieu/Turing.jl/blob/7c72a238b4d278720409d845880738d5d2c44ed3/src/samplers/ipmcmc.jl#L70
Great! I’ll have a look at it.
The code on branch https://github.com/TuringLang/Turing.jl/tree/multithreaded-PG now implements a multi-threaded version of the ParticleContainer and an adaptation of the code in https://github.com/emilemathieu/Turing.jl/blob/7c72a238b4d278720409d845880738d5d2c44ed3/src/samplers/ipmcmc.jl#L70 for Julia 0.6. A test for the distributed IPMCMC is currently failing, see below. As far as I understand the compiler code, the compiler currently does not generate the inner callback functions, `*_model`, on all processes. I'll therefore have a deeper look at the compiler implementation. @emilemathieu, did you encounter the same issue?
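For context, the failing distributed test resembles the generic situation where a function is called on a worker process on which it was never defined. A minimal Julia 0.6-style sketch of that failure mode (on Julia ≥ 0.7, `using Distributed` is required first; `f` and `g` are hypothetical):

```julia
addprocs(1)

f(x) = x + 1          # defined on the master process only
# pmap(f, 1:4)        # fails: the worker throws an UndefVarError for f

@everywhere g(x) = x + 1   # @everywhere evaluates the definition on every process
pmap(g, 1:4)               # works: returns [2, 3, 4, 5]
```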
Great work!
If I'm running a simple example to test the parallel implementation, e.g.

```julia
addprocs(1)
@everywhere using Turing

srand(125)
x = [1.5, 2]

@everywhere @model gdemo(x) = begin
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    for n in 1:length(x)
        x[n] ~ Normal(m, sqrt(s))
    end
    s, m
end

inference = IPMCMC(30, 500, 4)
chain = sample(gdemo(x), inference)
```

I get an error that … Thanks!
@trappmartin You need to use …
Supporting parallelism using processes is a pain since it involves data transfer. It's probably better to stick with threads for now until we have a cleaner and more organised code base.
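To illustrate the data-transfer point: threads operate on shared memory, while worker processes receive serialized copies of their inputs and send results back. A rough sketch (on Julia ≥ 0.7, `using Distributed` is needed for `addprocs`/`pmap`):

```julia
particles = randn(10_000)

# Threads: all threads see the same array and update it in place; nothing is copied.
Threads.@threads for i in eachindex(particles)
    particles[i] += 1.0
end

# Processes: each input is serialized to a worker and the result serialized back,
# which becomes expensive for large particle states.
results = pmap(x -> x + 1.0, particles)
```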
Right, thanks for the tip! I wanted both supported, so that one can use multi-threading locally and also distributed computation if necessary. Unfortunately, I currently have a broken test for the distributed code and see a problem similar to TuringLang/Turing.jl#463, even though the non-distributed code works fine. Working on it...
For shared-memory parallelism, I suggest making use of KissThreading.jl (https://github.com/bkamins/KissThreading.jl), which offers shared-memory parallelism free of the closure-bug hassle. That package is not registered yet, but I can work on getting it registered if it turns out to be useful here. GPU support is probably also worth considering at some point, but that's a much larger commitment.
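For a sense of the intended usage, here is a sketch that assumes the `tmap!` function exported by KissThreading.jl at the time, with a `Base.map!`-like signature; the exact API is an assumption here, not a documented guarantee:

```julia
using KissThreading   # unregistered at the time, so installed via URL

xs  = randn(10_000)
out = similar(xs)

# Assumed to mirror Base.map!(f, dest, src): applies f elementwise,
# splitting the work across threads (the signature is an assumption).
tmap!(x -> 1 / (1 + exp(-x)), out, xs)
```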
Is multiprocessing SMC (that is, not multithreading) still supported? I see some mentions here, but it's difficult to find documentation.
I'll also ask: what is the current status of this? The current …
@FredericWantiez we can finally revisit this functionality based on …
I looked into the newly released `Threads.@threads` construct in Julia 0.5. It seems that adding a parallelization feature to SMC is simple; it could be done in a few lines of code (see the `parallelsmc` branch). However, the threading feature in Julia is still quite fragile (see e.g. here). I've also filed a bug in the Julia repo. Maybe we should wait until the Julia team fixes these threading bugs and revisit this feature in a few months' time.
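For a sense of what "a few lines of code" means here: the propagation step of SMC advances each particle independently, so the particle loop parallelizes directly with `Threads.@threads`. A hypothetical sketch, not the actual `parallelsmc` branch code (the particle type and transition kernel are stand-ins):

```julia
# Stand-ins for the real particle state and transition kernel (illustrative only).
mutable struct Particle
    x::Float64
end

transition!(p::Particle) = (p.x += randn(); -0.5 * p.x^2)  # dummy log-weight increment

# Propagate all particles one step in parallel; each iteration is independent.
# (On Julia ≥ 1.3 the default RNG is task-local, so randn() is thread-safe here.)
function step_all!(particles::Vector{Particle}, logweights::Vector{Float64})
    Threads.@threads for i in eachindex(particles)
        logweights[i] += transition!(particles[i])
    end
end

particles  = [Particle(0.0) for _ in 1:100]
logweights = zeros(100)
step_all!(particles, logweights)
```

Resampling, by contrast, couples all particles and has to happen at a synchronization point, which is where the per-thread RNG handling discussed above matters.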