
Revert array wrapper union changes #498

Merged · 6 commits · Oct 20, 2020

Conversation

maleadt
Copy link
Member

@maleadt maleadt commented Oct 19, 2020

Too costly, latency wise.

@GiggleLiu
Copy link
Contributor

Thanks! Using only CuArray definitely makes the design simpler.
What was the motivation for the previous change? Did it improve performance in some practical applications?

@maleadt
Copy link
Member Author

maleadt commented Oct 19, 2020

Using CuArray everywhere is of course simpler, but it puts a lot of burden on the GPU package, requiring us to implement and maintain functionality that's already available in Base: deciding whether a view is contiguous, computing the dimensions of a reshape, etc. I really don't want to do that, but sadly the cost in load time is too great.
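As a minimal sketch of the kind of logic Base already provides (and that a wrapper-free GPU array type would have to reimplement), `Base.iscontiguous` answers the contiguity question for `SubArray`s:

```julia
# Base can tell whether a view over an Array is contiguous in memory.
v1 = view(zeros(4, 4), :, 1:2)  # whole leading dimension selected: contiguous
v2 = view(zeros(4, 4), 1:2, :)  # strided rows: not contiguous

Base.iscontiguous(v1)  # true
Base.iscontiguous(v2)  # false
```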

@GiggleLiu
Copy link
Contributor

I see. Hopefully the load time can be optimized later. Just adding an example to the issue in #494:

```julia
julia> reshape(zeros(4,4), 2, 8) |> typeof
Array{Float64,2}

julia> reshape(CUDA.zeros(4,4), 2, 8) |> typeof
Base.ReshapedArray{Float32,2,CuArray{Float32,2},Tuple{}}
```

In a tensor package, reshaping happens quite often. It will be easier to handle if the array type is not changed.
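One common way for a downstream package to cope with the wrapper type is to dispatch on `Base.ReshapedArray` and unwrap via `parent`. A hedged sketch, where `tensorop` is a hypothetical function standing in for whatever operation the tensor package defines:

```julia
# Hypothetical sketch: handle reshaped arrays by forwarding to the
# method for the parent array instead of special-casing Array.
tensorop(A::Array) = "dense CPU path"
tensorop(A::Base.ReshapedArray) = tensorop(parent(A))  # unwrap, reuse parent's method

tensorop(reshape(zeros(4, 4), 2, 8))  # dispatches through the wrapper
```

This keeps the package generic over array types rather than relying on `reshape` returning the same concrete type.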

@maleadt
Copy link
Member Author

maleadt commented Oct 19, 2020

> Just add an example to the issue in #494

Right, but is that an issue? Your package should probably handle ReshapedArray, or you're only compatible with Array and its specific optimizations. You can't expect other array types to copy that behavior, since it involves a fair bit of complexity (not only duplicating the reshape implementation, but also reference-counting the underlying storage, and so on). That doesn't solve the type instability, of course, but you get that with other wrappers anyway:

```julia
julia> reinterpret(Float64, zeros(4,4)) |> typeof
Base.ReinterpretArray{Float64,2,Float64,Array{Float64,2}}
```
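A sketch of the generic escape hatch: all of these Base wrappers (`ReshapedArray`, `ReinterpretArray`, `SubArray`, ...) expose their underlying array through `parent`, so generic code can unwrap them uniformly:

```julia
# `parent` recovers the wrapped array from a Base wrapper type.
A = zeros(4, 4)
R = reinterpret(Float64, A)

parent(R) === A  # true: the wrapper holds a reference to the original Array
```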

@GiggleLiu
Copy link
Contributor

Hmm, you are right. It is more a matter of fixing legacy code.

@codecov
Copy link

codecov bot commented Oct 20, 2020

Codecov Report

Merging #498 into master will increase coverage by 0.09%.
The diff coverage is 91.89%.


@@            Coverage Diff             @@
##           master     #498      +/-   ##
==========================================
+ Coverage   80.45%   80.54%   +0.09%     
==========================================
  Files         166      166              
  Lines        8822     8879      +57     
==========================================
+ Hits         7098     7152      +54     
- Misses       1724     1727       +3     
Impacted Files      Coverage          Δ
test/array.jl        0.00% <ø>        (ø)
src/array.jl        88.29% <91.66%>   (+0.44%) ⬆️
src/pool.jl         72.91% <92.85%>   (+0.62%) ⬆️
lib/nvml/error.jl   71.42% <0.00%>    (+21.42%) ⬆️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 31f67d6...ec433ac.

Labels
cuda array Stuff about CuArray.