
add mechanism for using task-local vars with Threads.@threads #50052

Draft · wants to merge 3 commits into master
Conversation

@IanButterworth (Sponsor, Member) commented Jun 3, 2023

Alternative to #48543

This adds a form of Threads.@threads that allows variables to be stored at the task level for use within the partitioned iterations.

It lets users combine @threads with task-local buffers, storage, or state without having to read from the task_local_storage() Dict on every iteration.

xs, ys = Threads.@threads x = some_init() y = another_init() for i in eachindex(a)
    # do something with a[i], using x and y as buffers, state, etc.
end

# optionally do something with xs and ys, which both have length threadpoolsize()

For instance

julia> Threads.threadpoolsize()
6

julia> xs, ys = Threads.@threads x = 1 y = 1 for i in 1:100
       x = i
       y = i+10
       end
((17, 34, 51, 68, 84, 100), (27, 44, 61, 78, 94, 110))
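As an aside (ours, not part of the PR), those particular endpoints follow from how @threads partitions the range: divrem(100, 6) yields 16 iterations per task with 4 left over, so the first four tasks get 17 iterations each and the last two get 16, and each task's final x is the last index of its chunk. A sketch reproducing the endpoints, assuming the first `rem` tasks each take one extra iteration:

```julia
# Sketch (ours): recompute the chunk endpoints behind xs == (17, 34, 51, 68, 84, 100),
# assuming @threads gives the first `rem` tasks one extra iteration.
n, p = 100, 6
len, rem = divrem(n, p)                         # 16 iterations each, 4 left over
ends = cumsum([len + (k <= rem) for k in 1:p])  # last index handled by each task
```

Here `ends` comes out as `[17, 34, 51, 68, 84, 100]`, matching the xs above (and ys is simply that plus 10).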

Or a toy sum

julia> function multi_sum(a)
           buffers, = Threads.@threads buffer = 0 for i in eachindex(a)
               buffer += a[i]
           end
           return sum(buffers)
       end
multi_sum (generic function with 1 method)

julia> multi_sum(1:1_000_000)
500000500000

julia> multi_sum(1:1_000_000) == sum(1:1_000_000)
true
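For comparison (our sketch, not code from this PR), the same per-task accumulation can be written today by chunking manually with Threads.@spawn and keeping the buffer as an ordinary local in each task; the proposed syntax essentially automates this bookkeeping:

```julia
using Base.Threads

# Sketch (ours, not from the PR): manual per-task buffers via chunking.
# Each task keeps `buffer` as a plain local, so there is no per-iteration
# task_local_storage() lookup; fetch.(tasks) plays the role of the
# `buffers` collection the proposed syntax would return.
function chunked_sum(a; ntasks = Threads.nthreads())
    chunks = Iterators.partition(eachindex(a), cld(length(a), ntasks))
    tasks = [Threads.@spawn begin
                 buffer = 0              # the task-local state (`buffer = 0` in the PR syntax)
                 for i in chunk
                     buffer += a[i]
                 end
                 buffer
             end
             for chunk in chunks]
    return sum(fetch.(tasks))
end
```

With this, `chunked_sum(1:1_000_000) == sum(1:1_000_000)` holds just as in the toy example above, at the cost of writing the chunking and collection by hand.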

@IanButterworth added the domain:multithreading, needs tests, needs docs, and needs news labels on Jun 3, 2023
@Krastanov

Just for reference, Polyester has supported something similar for a while. It might make sense to aim for future versions of Polyester to adopt whatever format gets merged into Base.

https://github.com/JuliaSIMD/Polyester.jl#local-per-thread-storage

@Seelengrab (Contributor) commented Jun 4, 2023

Ah, so that's what the question on Slack was about :D

I haven't tried it, but does this syntax work with something like

Threads.@threads begin
    buffer = Int[]
    for i in 1:100
        #... use buffer
    end
end

too? Seems more convenient/clear to write than having it all on one line.

@antoine-levitt (Contributor)

The syntax is not very readable, but longer term we probably want a mini-DSL à la OpenMP, and it's probably wiser to leave that to packages, so this seems reasonable.

Is the overhead of array creation significant? If not, it might be simpler to return arrays rather than tuples, for further manipulation. And for the outer container, a named tuple rather than a plain tuple?
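To make that suggestion concrete (a hypothetical return shape, not something this PR implements), the outer container could be a NamedTuple keyed by the declared variable names, holding an array per variable:

```julia
# Hypothetical return shape (not implemented in this PR): one array per
# declared variable, wrapped in a NamedTuple keyed by the names x and y.
result = (x = [17, 34, 51, 68, 84, 100], y = [27, 44, 61, 78, 94, 110])

result.x              # access by declared name instead of destructuring position
push!(result.x, 0)    # arrays allow further manipulation, unlike tuples
```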

@fredrikekre (Member)

let would probably be more fitting, since in that context you specify variables, i.e.

@threads let x = 1, y = 1
    for i = 1:100
        # body
    end
end

@IanButterworth (Sponsor, Member, Author)

Makes sense. And it could automatically manage returning the vectors:

xs, ys = @threads let x = 1, y = 1
    for i = 1:100
        # body
    end
end

@jonas-schulze (Contributor)

Or have the user be explicit about which entries should comprise the vectors:

xs, zs = @threads let x = 1, y = 1
    for i = 1:100
        # body
    end
    z = x + y
    x, z
end
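A rough desugaring of that explicit-return variant could look like the following sketch (ours; `per_task` and its signature are made up for illustration): run the let-body once per chunk and collect only whatever the final expression evaluates to.

```julia
using Base.Threads

# Sketch (ours, not from the PR): desugar the explicit-return form into a
# helper that runs the body over each chunk and collects the body's return
# value. `per_task` is a hypothetical name.
function per_task(f, itr; ntasks = Threads.nthreads())
    chunks = Iterators.partition(itr, cld(length(itr), ntasks))
    tasks = [Threads.@spawn f(chunk) for chunk in chunks]
    return fetch.(tasks)
end

results = per_task(1:100) do chunk
    x, y = 1, 1
    for i in chunk
        x = i                 # body
    end
    z = x + y
    (x, z)                    # explicit: only x and z are collected per task
end

xs, zs = first.(results), last.(results)
```

Each task's tuple `(x, z)` is what gets collected, so `y` stays private to the task, which matches the intent of letting the user name the entries that comprise the output vectors.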

@IanButterworth IanButterworth marked this pull request as draft January 16, 2024 00:05