add mechanism for using task-local vars with Threads.@threads
#50052
base: master
Conversation
(branch force-pushed from 196fb16 to cc84440)
Just for reference, Polyester has supported something similar for a while. It might make sense for future versions of Polyester to adopt whatever format gets merged into Base: https://github.com/JuliaSIMD/Polyester.jl#local-per-thread-storage
Ah, so that's what the question on Slack was about :D I haven't tried it, but does this syntax work with something like

```julia
Threads.@threads begin
    buffer = Int[]
    for i in 1:100
        # ... use buffer
    end
end
```

too? Seems more convenient/clear to write than having it all on one line.
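For context on why the hoisted `buffer` needs special handling at all, here is a minimal sketch (my own illustration, not code from the PR) of the race you get today if the buffer is simply shared, next to the `task_local_storage()` workaround the PR aims to streamline:

```julia
using Base.Threads

# A buffer hoisted outside `@threads for` is one object shared by every
# task, so concurrent mutation of it is a data race:
function racy_fill(n)
    buffer = Int[]                 # shared: all tasks push! into the same vector
    @threads for i in 1:n
        push!(buffer, i)           # unsynchronized concurrent push! -- unsafe
    end
    return buffer                  # length/contents are unreliable
end

# Per-task isolation with current Base, via the task's own storage dict:
function safe_fill(n)
    out = zeros(Int, n)
    @threads for i in 1:n
        buf = get!(() -> Int[], task_local_storage(), :buf)::Vector{Int}
        empty!(buf)                # reuse the task-local scratch vector
        push!(buf, i)
        out[i] = only(buf)         # each slot is written by exactly one task
    end
    return out
end
```

Note the `get!` lookup in `safe_fill` runs on every iteration, which is exactly the overhead the PR description mentions.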
The syntax is not very readable, but longer term we probably want a mini DSL à la OpenMP, and it's probably wiser to leave that to packages, so this seems reasonable. Is the overhead of array creation significant? Otherwise it might be simpler to return arrays rather than tuples for further manipulation. And for the outer container, a named tuple rather than a plain tuple?
Makes sense. And automatically manage returning the vectors.
Or have the user be explicit about which entries should comprise the vectors:

```julia
xs, zs = @threads let x = 1, y = 1
    for i = 1:100
        # body
    end
    z = x + y
    x, z
end
```
Alternative to #48543

This adds a form of `Threads.@threads` that allows storing variables at the task level for use in the partitioned iterations. It means the user can use `@threads` with task-local buffering/storage/state without having to read from the `task_local_storage()` dict every iteration. For instance:
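The concrete example did not survive this excerpt; based on the `let`-form quoted in the comments above, the usage presumably looks something like this (a hypothetical sketch of the proposed, unmerged syntax — it will not run on released Julia):

```julia
# Hypothetical: `@threads let` would bind one `buf` per task, so the loop
# body could reuse it without a task_local_storage() lookup per iteration.
@threads let buf = Int[]
    for i in 1:100
        empty!(buf)          # task-local scratch, reused across iterations
        # ... fill and use buf ...
    end
end
```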
Or a toy sum:
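The toy-sum example is also missing from this excerpt. For contrast, here is how the same task-local-accumulator idea reads with today's tools (my sketch, not the PR's code): spawn one task per chunk so that the accumulator is simply a local variable of the task, then merge the fetched partials.

```julia
using Base.Threads

# Toy sum with per-task accumulators using current Base only: each spawned
# task keeps its running total in a plain local variable, and the partial
# sums are combined after all tasks finish.
function toy_sum(data; ntasks = nthreads())
    chunks = Iterators.partition(data, cld(length(data), ntasks))
    tasks = map(chunks) do chunk
        @spawn begin
            acc = 0              # task-local accumulator, no dict lookup
            for x in chunk
                acc += x
            end
            acc                  # task's result is its partial sum
        end
    end
    return sum(fetch, tasks)     # merge the per-task partials
end
```

This is the structured-concurrency shape the proposed `@threads` extension would let users write without managing `@spawn`/`fetch` by hand.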