This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
[MXNET-123] Bug fix for sparse batch loader #10124
Merged
Conversation
@rahul003 @cjolivier01 @reminisce @haojin2 could anyone help review?
anirudh2290
reviewed
Mar 22, 2018
src/io/iter_sparse_batchloader.h
Outdated
int64_t unit_size = 0;
out_.inst_index[top] = inst.index;
for (size_t i = 0; i < inst.data.size(); ++i) {
  if (!IsIndPtr(i)) {
It looks like this will be called for every element in each batch, but the IsIndPtr result for a given i should be the same across batches, correct? Does it make sense to cache it?
Hmm, good suggestion! I'll cache this.
jinhuang415
pushed a commit
to jinhuang415/incubator-mxnet
that referenced
this pull request
Mar 30, 2018
* fix a bug in sparse batch loader * fix warning * fix bug when size=0 * cache is indptr
cjolivier01
pushed a commit
to cjolivier01/mxnet
that referenced
this pull request
Mar 30, 2018
This reverts commit e08e1fd.
rahul003
pushed a commit
to rahul003/mxnet
that referenced
this pull request
Jun 4, 2018
* fix a bug in sparse batch loader * fix warning * fix bug when size=0 * cache is indptr
zheng-da
pushed a commit
to zheng-da/incubator-mxnet
that referenced
this pull request
Jun 28, 2018
* fix a bug in sparse batch loader * fix warning * fix bug when size=0 * cache is indptr
Description
Reopen #8922
The previous implementation of the sparse batch loader waited until the number of data instances reached batch_size before allocating the buffer for the copy (unlike the dense batch loader, we don't know the actual buffer size to allocate in advance, since the number of non-zeros varies). However, this delayed allocation is buggy when the batch size is extremely large, because the DataInst returned by base_ is only a reference to the parser's data. By the time the copy happens, the referenced data may already have been overwritten by the parser.
This PR modifies the sparse batch loader to be almost the same as batch loader, except that:
@cjolivier01 @reminisce @anirudh2290 @ZiyueHuang
Checklist
Essentials
make lint
Changes
Comments