functions.jl
"""
    tmapreduce(f, op, A::AbstractArray...;
               [scheduler::Union{Scheduler, Symbol} = :dynamic],
               [outputtype::Type = Any],
               [init])

A multithreaded function like `Base.mapreduce`. Perform a reduction over `A`, applying a
single-argument function `f` to each element, and then combining them with the two-argument
function `op`.
Note that `op` **must** be an
[associative](https://en.wikipedia.org/wiki/Associative_property) function, in the sense
that `op(a, op(b, c)) ≈ op(op(a, b), c)`. If `op` is not (approximately) associative, you
will get undefined results.
## Example:
```
using OhMyThreads: tmapreduce
tmapreduce(√, +, [1, 2, 3, 4, 5])
```
is the parallelized version of `sum(√, [1, 2, 3, 4, 5])` in the form
```
(√1 + √2) + (√3 + √4) + √5
```
## Keyword arguments:
- `scheduler::Union{Scheduler, Symbol}` (default `:dynamic`): determines how the computation is divided into parallel tasks and how these are scheduled. See [`Scheduler`](@ref) for more information on the available schedulers.
- `outputtype::Type` (default `Any`): will work as the asserted output type of parallel calculations. We use [StableTasks.jl](https://github.com/JuliaFolds2/StableTasks.jl) to make setting this option unnecessary, but if you experience problems with type stability, you may be able to recover it with this keyword argument.
- `init`: initial value of the reduction. Will be forwarded to `mapreduce` for the task-local sequential parts of the calculation.
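For instance, a minimal sketch of using `init` (the choice of `0.0` as a neutral element for floating-point addition is ours, not mandated by the API):
```
using OhMyThreads: tmapreduce
tmapreduce(√, +, [1, 2, 3, 4, 5]; init=0.0)
```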
In addition, `tmapreduce` accepts **all keyword arguments that are supported by the selected
scheduler**. They will simply be passed on to the corresponding `Scheduler` constructor. Example:
```
tmapreduce(√, +, [1, 2, 3, 4, 5]; chunksize=2, scheduler=:static)
```
However, to avoid ambiguity, this is currently **only supported for `scheduler::Symbol`**
(but not for `scheduler::Scheduler`).
"""
function tmapreduce end
"""
    treducemap(op, f, A::AbstractArray...;
               [scheduler::Union{Scheduler, Symbol} = :dynamic],
               [outputtype::Type = Any],
               [init])

Like `tmapreduce` except the order of the `f` and `op` arguments are switched. This is
sometimes convenient with `do`-block notation. Perform a reduction over `A`, applying a
single-argument function `f` to each element, and then combining them with the two-argument
function `op`.
Note that `op` **must** be an
[associative](https://en.wikipedia.org/wiki/Associative_property) function, in the sense
that `op(a, op(b, c)) ≈ op(op(a, b), c)`. If `op` is not (approximately) associative, you
will get undefined results.
## Example:
```
using OhMyThreads: treducemap
treducemap(+, √, [1, 2, 3, 4, 5])
```
is the parallelized version of `sum(√, [1, 2, 3, 4, 5])` in the form
```
(√1 + √2) + (√3 + √4) + √5
```
## Keyword arguments:
- `scheduler::Union{Scheduler, Symbol}` (default `:dynamic`): determines how the computation is divided into parallel tasks and how these are scheduled. See [`Scheduler`](@ref) for more information on the available schedulers.
- `outputtype::Type` (default `Any`): will work as the asserted output type of parallel calculations. We use [StableTasks.jl](https://github.com/JuliaFolds2/StableTasks.jl) to make setting this option unnecessary, but if you experience problems with type stability, you may be able to recover it with this keyword argument.
- `init`: initial value of the reduction. Will be forwarded to `mapreduce` for the task-local sequential parts of the calculation.
In addition, `treducemap` accepts **all keyword arguments that are supported by the selected
scheduler**. They will simply be passed on to the corresponding `Scheduler` constructor. Example:
```
treducemap(+, √, [1, 2, 3, 4, 5]; chunksize=2, scheduler=:static)
```
However, to avoid ambiguity, this is currently **only supported for `scheduler::Symbol`**
(but not for `scheduler::Scheduler`).
"""
function treducemap end
"""
    treduce(op, A::AbstractArray...;
            [scheduler::Union{Scheduler, Symbol} = :dynamic],
            [outputtype::Type = Any],
            [init])

A multithreaded function like `Base.reduce`. Perform a reduction over `A` using the
two-argument function `op`.
Note that `op` **must** be an
[associative](https://en.wikipedia.org/wiki/Associative_property) function, in the sense
that `op(a, op(b, c)) ≈ op(op(a, b), c)`. If `op` is not (approximately) associative, you
will get undefined results.
## Example:
```
using OhMyThreads: treduce
treduce(+, [1, 2, 3, 4, 5])
```
is the parallelized version of `sum([1, 2, 3, 4, 5])` in the form
```
(1 + 2) + (3 + 4) + 5
```
## Keyword arguments:
- `scheduler::Union{Scheduler, Symbol}` (default `:dynamic`): determines how the computation is divided into parallel tasks and how these are scheduled. See [`Scheduler`](@ref) for more information on the available schedulers.
- `outputtype::Type` (default `Any`): will work as the asserted output type of parallel calculations. We use [StableTasks.jl](https://github.com/JuliaFolds2/StableTasks.jl) to make setting this option unnecessary, but if you experience problems with type stability, you may be able to recover it with this keyword argument.
- `init`: initial value of the reduction. Will be forwarded to `mapreduce` for the task-local sequential parts of the calculation.
In addition, `treduce` accepts **all keyword arguments that are supported by the selected
scheduler**. They will simply be passed on to the corresponding `Scheduler` constructor. Example:
```
treduce(+, [1, 2, 3, 4, 5]; chunksize=2, scheduler=:static)
```
However, to avoid ambiguity, this is currently **only supported for `scheduler::Symbol`**
(but not for `scheduler::Scheduler`).
"""
function treduce end
"""
    tforeach(f, A::AbstractArray...;
             [scheduler::Union{Scheduler, Symbol} = :dynamic]) :: Nothing

A multithreaded function like `Base.foreach`. Apply `f` to each element of `A` on
multiple parallel tasks, and return `nothing`. I.e. it is the parallel equivalent of
```
for x in A
f(x)
end
```
## Example:
```
using OhMyThreads: tforeach
tforeach(1:10) do i
println(i^2)
end
```
## Keyword arguments:
- `scheduler::Union{Scheduler, Symbol}` (default `:dynamic`): determines how the computation is divided into parallel tasks and how these are scheduled. See [`Scheduler`](@ref) for more information on the available schedulers.
In addition, `tforeach` accepts **all keyword arguments that are supported by the selected
scheduler**. They will simply be passed on to the corresponding `Scheduler` constructor. Example:
```
tforeach(1:10; chunksize=2, scheduler=:static) do i
println(i^2)
end
```
However, to avoid ambiguity, this is currently **only supported for `scheduler::Symbol`**
(but not for `scheduler::Scheduler`).
"""
function tforeach end
"""
    tmap(f, [OutputElementType], A::AbstractArray...;
         [scheduler::Union{Scheduler, Symbol} = :dynamic])

A multithreaded function like `Base.map`. Creates a new container, `similar` to `A`, and
fills it in parallel such that the `i`th element is equal to `f(A[i])`.
The optional argument `OutputElementType` will select a specific element type for the
returned container, and will generally incur fewer allocations than the version where
`OutputElementType` is not specified.
## Example:
```
using OhMyThreads: tmap
tmap(sin, 1:10)
```
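With the optional `OutputElementType` argument; a minimal sketch (the concrete element type `Float64` is our choice here, not required by the API):
```
tmap(sin, Float64, 1:10)
```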
## Keyword arguments:
- `scheduler::Union{Scheduler, Symbol}` (default `:dynamic`): determines how the computation is divided into parallel tasks and how these are scheduled. See [`Scheduler`](@ref) for more information on the available schedulers.
In addition, `tmap` accepts **all keyword arguments that are supported by the selected
scheduler**. They will simply be passed on to the corresponding `Scheduler` constructor. Example:
```
tmap(sin, 1:10; chunksize=2, scheduler=:static)
```
However, to avoid ambiguity, this is currently **only supported for `scheduler::Symbol`**
(but not for `scheduler::Scheduler`).
"""
function tmap end
"""
    tmap!(f, out, A::AbstractArray...;
          [scheduler::Union{Scheduler, Symbol} = :dynamic])

A multithreaded function like `Base.map!`. In parallel on multiple tasks, this function
assigns `out[i] = f(A[i])` for each index `i` of `A` and `out`.
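## Example:
A minimal sketch (the input array here is an arbitrary choice):
```
using OhMyThreads: tmap!
A = rand(10)
out = similar(A)
tmap!(sin, out, A)
```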
## Keyword arguments:
- `scheduler::Union{Scheduler, Symbol}` (default `:dynamic`): determines how the computation is divided into parallel tasks and how these are scheduled. See [`Scheduler`](@ref) for more information on the available schedulers.
In addition, `tmap!` accepts **all keyword arguments that are supported by the selected
scheduler**. They will simply be passed on to the corresponding `Scheduler` constructor.
However, to avoid ambiguity, this is currently **only supported for `scheduler::Symbol`**
(but not for `scheduler::Scheduler`).
"""
function tmap! end
"""
    tcollect([OutputElementType], gen::Union{AbstractArray, Generator{<:AbstractArray}};
             [scheduler::Union{Scheduler, Symbol} = :dynamic])

A multithreaded function like `Base.collect`. Essentially just calls `tmap` on the
generator function and inputs.
The optional argument `OutputElementType` will select a specific element type for the
returned container, and will generally incur fewer allocations than the version where
`OutputElementType` is not specified.
## Example:
```
using OhMyThreads: tcollect
tcollect(sin(i) for i in 1:10)
```
## Keyword arguments:
- `scheduler::Union{Scheduler, Symbol}` (default `:dynamic`): determines how the computation is divided into parallel tasks and how these are scheduled. See [`Scheduler`](@ref) for more information on the available schedulers.
In addition, `tcollect` accepts **all keyword arguments that are supported by the selected
scheduler**. They will simply be passed on to the corresponding `Scheduler` constructor. Example:
```
tcollect((sin(i) for i in 1:10); chunksize=2, scheduler=:static)
```
However, to avoid ambiguity, this is currently **only supported for `scheduler::Symbol`**
(but not for `scheduler::Scheduler`).
"""
function tcollect end