
add batch_size #37

Merged
merged 1 commit into from Jun 15, 2020

Conversation

davek44
Collaborator

@davek44 davek44 commented Jun 10, 2020

Solo crashes when there are N cells where N % 128 == 1 due to the following scVI (pytorch?) issue: scverse/scvi-tools#426. They only fixed the problem for training; it still crashes when obtaining latent embeddings from a trained model.

Here, I add batch_size as a parameter, check for a single leftover cell, and change the batch size if one is detected.
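The guard described above can be sketched as a standalone function (the helper name `safe_batch_size` is hypothetical; the PR applies the check inline before training):

```python
import numpy as np

def safe_batch_size(num_cells, batch_size=128):
    """Grow the batch size until no single-cell batch is left over.

    A leftover batch containing exactly one cell crashes scVI's
    latent-embedding step (scverse/scvi-tools#426), so we bump the
    batch size whenever num_cells % batch_size == 1.
    """
    while num_cells % batch_size == 1:
        batch_size = int(np.round(1.25 * batch_size))
    return batch_size

# 129 % 128 == 1 would leave a single-cell batch; grow to 160.
print(safe_batch_size(129))   # 160
# 1000 % 128 == 104, no leftover singleton, so 128 is kept.
print(safe_batch_size(1000))  # 128
```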

@davek44 davek44 requested a review from njbernstein June 10, 2020 00:53
@njbernstein
Contributor

@davek44 This looks good, and I assume you tested the fix. I had one comment, but feel free to merge after.

valid_pct = params.get('valid_pct', 0.1)
learning_rate = params.get('learning_rate', 1e-3)
stopping_params = {'patience': params.get('patience', 10), 'threshold': 0}

# protect against single example batch
while num_cells % batch_size == 1:
    batch_size = int(np.round(1.25 * batch_size))
Contributor

What would be wrong with:

batch_size = int(np.round(1 + batch_size))

Also wouldn't need to put this in a while loop then.

Collaborator Author

Yea that's probably fine. GPUs tend to achieve better efficiency with batches divisible by powers of two, so I tried to come up with a strategy that would tend to maintain that. Either way is fine with me.
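The trade-off between the two proposals can be illustrated with a quick comparison (numbers chosen for illustration, assuming the default batch size of 128):

```python
import numpy as np

# 129 cells with a batch size of 128 leaves a single-cell batch.
batch_size = 128

plus_one = batch_size + 1                  # reviewer's suggestion: 129 (odd)
scaled = int(np.round(1.25 * batch_size))  # PR's strategy: 160 = 2**5 * 5

# The 1.25x rule keeps a larger power-of-two factor (160 is divisible
# by 32), which GPU kernels often prefer; +1 yields an odd batch size.
print(plus_one, scaled)  # 129 160
```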

Contributor

Gotcha. I figured there might be something like that. This is fine with me. Merge away!

@njbernstein
Contributor

I'm gonna merge this in

@njbernstein njbernstein merged commit 47d105d into master Jun 15, 2020