
support multiple return values? #15

Open
ssfrr opened this issue Feb 27, 2020 · 4 comments
Labels: enhancement (New feature or request)

ssfrr commented Feb 27, 2020

I'm trying to slice-broadcast a function that has multiple return values. Ideally I'd collect them into separate arrays. Is this possible with TensorCast?

Here's a MWE of basically what I'm trying to do:

x = randn(10, 20)
@cast v[k], i[k] := findmax(x[:, k]) 

But this throws the error:

LoadError: don't know what to do with left = (v[k], i[k])

That's a great error message, and it leads me to believe I can't do this kind of thing, but I wanted to check.

In this case it's probably not too bad to just collect into a single Vector{Tuple{Float64, Int}} and split it afterwards, but my actual example has some more complicated indexing going on that would benefit from some TensorCast magic.
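
For reference, here's a plain-Julia sketch of that collect-then-split workaround (pairs is just an illustrative name):

x = randn(10, 20)
pairs = [findmax(x[:, k]) for k in axes(x, 2)]   # Vector{Tuple{Float64, Int64}}
v = first.(pairs)    # the maxima
i = last.(pairs)     # their indices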

ssfrr (Author) commented Feb 27, 2020

For a more concrete (but less minimal) example, here's what I'm doing right now:

@cast rank1res[k] := rank1filter(permutedims(specs[k, :, :]))   # each rank1res[k] is a 4-tuple
@cast sproj[k, n, c] := rank1res[k][1][c, n]
@cast vs[k, c] := rank1res[k][2][c]
@cast λs[k] := rank1res[k][3]
@cast q[k] := rank1res[k][4]

Though actually I'm not sure how I would get rid of the permutedims stuff here if it was combined with multiple return values, so maybe this isn't a great idea after all.
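
For what it's worth, a permutedims can often be folded into the cast by permuting the inner indices while slicing. A sketch, assuming specs has axes [k, n, c], rank1filter wants a c×n matrix, and slices is an illustrative intermediate:

@cast slices[k][c, n] := specs[k, n, c]        # each slice arrives already transposed
@cast rank1res[k] := rank1filter(slices[k])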

mcabbott (Owner) commented Feb 27, 2020

I've never thought about that.

Fundamentally this package wants to make one big array from the RHS, and then afterwards there's another set of procedures to reshape/slice it as the LHS requests. What certainly ought to work (but doesn't) is this:

@cast vi[j,k] := findmax(x[:, k])[j] 

And what sort-of works is the following awful hack: the LHS parser never checks whether the destination is actually a symbol, and so happily assigns to (v, i), as in (v, i) = eachcol(rhs):

x = randn(10, 20);
Base.ndims(::Tuple) = 1   # type piracy, needed to get the tuple destination through the macro's checks
@cast (v, i)[j][k] := findmax(x[:, k])[j] lazy
v
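
The lazy version above effectively stacks the tuples into a 2×20 matrix and then destructures its rows. A plain-Base approximation, not the macro's exact expansion (stack needs Julia ≥ 1.9 here; rhs is just an illustrative name):

rhs = stack(findmax.(eachcol(x)))   # 2×20 matrix: values in row 1, indices in row 2
v, i = rhs[1, :], rhs[2, :]         # split the two rows into separate vectors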

mcabbott (Owner) commented Mar 22, 2021

FWIW, behaviour on version 0.4 is that these work without tweaks, and should probably be added to the tests:

julia> x = rand(Int8, 3, 5)
3×5 Matrix{Int8}:
  76  -128   57   -8  -52
 -85    15  -33  -55  -20
  -2    -2  -29  -23  -17

julia> @cast vi[j,k] := findmax(x[:, k])[j] 
2×5 Matrix{Signed}:
 76  15  57  -8  -17
  1   2   1   1    3

julia> @cast (v, i)[j][k] := findmax(x[:, k])[j]
2-element Vector{SubArray{Signed, 1, LinearAlgebra.Transpose{Signed, Matrix{Signed}}, Tuple{Base.Slice{Base.OneTo{Int64}}, Int64}, false}}:
 [76, 15, 57, -8, -17]
 [1, 2, 1, 1, 3]

julia> v
5-element view(transpose(::Matrix{Signed}), :, 1) with eltype Signed:
  76
  15
  57
  -8
 -17

julia> @cast (v, i)[j][k] |= findmax(x[:, k])[j]  # less lazy
2-element Vector{Vector{Signed}}:
 [76, 15, 57, -8, -17]
 [1, 2, 1, 1, 3]

julia> @pretty @cast (v, i)[j][k] := findmax(x[:, k])[j]
begin
    ndims(x) == 2 || throw(ArgumentError("expected a 2-tensor x[:, k]"))
    local sandpiper = sliceview(x, (:, *))
    local butterfly = @__dot__(findmax(sandpiper))
    local raven = transmute(stack(butterfly), (2, 1))
    (v, i) = sliceview(raven, (:, *))
end
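
In plain Base terms that expansion reads roughly as below (a sketch only: eachcol/permutedims stand in for TensorCast's sliceview/transmute, and cols/pairs/mat are illustrative names):

cols = eachcol(x)                  # sliceview(x, (:, *))
pairs = findmax.(cols)             # the dotted broadcast
mat = permutedims(stack(pairs))    # transmute(stack(pairs), (2, 1)): here 5×2
v, i = eachcol(mat)                # sliceview(mat, (:, *))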

Still not ideal in terms of the element types, I guess. With Float64:

julia> x = rand(3, 5);

julia> @cast vi[j,k] := findmax(x[:, k])[j] 
2×5 Matrix{Real}:
 0.853762  0.394276  0.739257  0.347495  0.804691
 3         2         3         3         1

julia> @cast (v, i)[j][k] := findmax(x[:, k])[j];

julia> Int.(i)
5-element Vector{Int64}:
 3
 2
 3
 3
 1
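
One way to dodge the abstract eltype is to cast each component separately, so each output gets a concrete eltype. A sketch (v2/i2 are illustrative names, and findmax runs twice per column):

@cast v2[k] := findmax(x[:, k])[1]   # Vector{Float64}
@cast i2[k] := findmax(x[:, k])[2]   # Vector{Int64}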

mcabbott (Owner) commented Aug 25, 2022

On this topic, StructArrays allows for efficient broadcasting out to several arrays, as for instance here: https://github.com/JuliaDiff/ChainRules.jl/blob/main/src/unzipped.jl#L6

It might not be hard to allow the pattern @cast v[k], i[k] := stuff[k] (no reduction, the same indices on both LHS arrays), implemented using something like this, or exactly with unzip_broadcast.
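
A sketch of that pattern written directly with StructArrays, under the assumption that collecting (value, index) tuples into a StructArray and reading off its components behaves as below:

using StructArrays
x = rand(3, 5)
sa = StructArray(findmax(col) for col in eachcol(x))   # StructArray of (value, index) tuples
v, i = StructArrays.components(sa)                     # Vector{Float64}, Vector{Int64}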

Issue about making this work with GPU arrays: JuliaArrays/StructArrays.jl#150
