[WIP] issue DiskArrays.jl #131 #132
Conversation
```
@@ -493,7 +492,8 @@ end
#Index with range stride much larger than chunk size
a = _DiskArray(reshape(1:100, 20, 5, 1); chunksize=(1, 5, 1))
@test a[1:9:20, :, 1] == trueparent(a)[1:9:20, :, 1]
@test getindex_count(a) == 3
# now getindex_count(a) == 1
```
Doesn't this mean sparse ranges will now read all the data in some cases? I thought the idea of `batchgetindex` was to explicitly read only the required chunks.
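To make the concern concrete, here is a hypothetical one-axis sketch (not the DiskArrays.jl API; `needed_chunks` is an invented helper) of how a sparse range maps to chunks. For the test above, with chunk length 1 along the first axis, the stride-9 range touches only three chunks, which is why `getindex_count(a)` was 3:

```julia
# Hypothetical helper (not part of DiskArrays.jl): the chunk indices
# touched by a set of 1-based indices along one axis, for chunks of
# length `cs` along that axis.
needed_chunks(ids, cs) = unique(cld.(ids, cs))

# For the test above: chunk length 1, stride-9 range into a length-20
# axis. Only the chunks holding indices 1, 10 and 19 are needed.
needed_chunks(1:9:20, 1)  # → [1, 10, 19], i.e. 3 chunk reads
```

Reading everything between the extremes instead would touch all 19 chunks spanned by the range.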
```
@@ -171,6 +256,8 @@ function _readblock!(A::AbstractArray, A_ret, r::AbstractVector...)
mi, ma = extrema(ids)
return largest_jump > cs && length(ids) / (ma - mi) < 0.5
end
# What TODO?: necessary to avoid infinite recursion
need_batch = false
```
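For readers skimming the hunk, the batching heuristic it shows can be run standalone like this (a sketch, not the PR's code: `needs_batch` is an invented wrapper name, and `largest_jump` is computed here from scratch rather than passed in as in the real function):

```julia
# Batch reading pays off when the indices jump farther than one chunk
# (`largest_jump > cs`) and cover less than half of the range they span
# (density below 0.5), so contiguous reads would waste most of the I/O.
function needs_batch(ids::AbstractVector{<:Integer}, cs::Integer)
    largest_jump = maximum(diff(ids))
    mi, ma = extrema(ids)
    return largest_jump > cs && length(ids) / (ma - mi) < 0.5
end

needs_batch([1, 10, 19], 5)   # jumps of 9 > chunk size 5, density 3/18 → true
needs_batch(collect(1:20), 5) # dense range, jumps of 1 → false
```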
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
This essentially deletes half the code in this file. There may be a case for reorganising things, but I think a lot of optimisations are being thrown out this way.
We need to fix the dispatch instead.
I think the issues mentioned here have been fixed in other PRs.
Thanks Fabian!
This PR fixes the following issue, and indexing is now type stable.
It also avoids reading too much data.
I had to set `need_batch = false` to avoid infinite recursion; any advice would be appreciated. The implementation from NCDatasets is added on top of the current implementation of the special case, so that we do not remove any functionality.
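The type-stability claim can be checked with `Test.@inferred`. A minimal sketch, using a plain `Array` in place of the PR's `_DiskArray` wrapper (which only exists inside the DiskArrays.jl test suite):

```julia
using Test

# Stand-in for the chunked test array; the PR's `_DiskArray` wrapper is
# only available inside the DiskArrays.jl test suite.
a = reshape(1:100, 20, 5, 1)

# `@inferred` throws if the return type of the indexing expression
# cannot be inferred; it returns the value when inference succeeds.
b = @inferred a[1:9:20, :, 1]
size(b)  # (3, 5, 1): three stride-9 positions along the first axis
```

Running the same check against the actual disk-backed type would confirm that the new dispatch path does not introduce a type instability.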