
Memory usage #7

Open
BradleyH017 opened this issue Nov 23, 2020 · 2 comments

Comments

@BradleyH017

Hi Chris,

Thanks for the great paper and the easy-to-follow analysis.

I want to repeat this, but using all healthy samples (not just the 10 in the training set). I have tried using run_analysis.r with all cells of each subset. This works fine until the ComBat batch-correction step, which fails with a memory error regardless of how much memory I request for my R session.

I have also tried running the data as a DelayedArray, but this still fails. Presumably this is because the data are still being read fully into memory by the ComBat function rather than staying on disk.
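A minimal sketch of the DelayedArray attempt described above, with hypothetical object names (`expr_matrix`, `batch_vector`); since sva::ComBat does dense matrix algebra internally, the HDF5-backed data end up realized in RAM anyway:

```r
library(HDF5Array)  # on-disk DelayedArray backend
library(sva)        # provides ComBat

# expr_matrix and batch_vector are hypothetical placeholders
delayed_mat <- writeHDF5Array(expr_matrix)

# ComBat expects an ordinary matrix; as.matrix() realizes the
# on-disk data in memory, which is where the memory error arises
corrected <- ComBat(dat = as.matrix(delayed_mat),
                    batch = batch_vector)
```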

I was wondering if you could shed some light on how much memory you used to run this analysis, or how you overcame this.

Thanks for your help,
Bradley

@maarten-devries

Hi Bradley,
Were you able to resolve this?
I'm running into the same issue.

Thanks!
Maarten

@BradleyH017
Author

Hi Maarten,

Not using ComBat. I gave Harmony a go, but reciprocal PCA (RPCA) integration from Seurat gave me the best-looking results batch-wise.

Best,
Bradley
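For readers hitting the same wall, a minimal sketch of Seurat's reciprocal-PCA (RPCA) integration workflow, assuming a hypothetical `seurat_obj` with a "batch" metadata column:

```r
library(Seurat)

# seurat_obj and its "batch" metadata column are hypothetical placeholders
obj_list <- SplitObject(seurat_obj, split.by = "batch")
obj_list <- lapply(obj_list, function(x) {
  x <- NormalizeData(x)
  FindVariableFeatures(x)
})

# shared variable features, then per-batch scaling and PCA
features <- SelectIntegrationFeatures(object.list = obj_list)
obj_list <- lapply(obj_list, function(x) {
  x <- ScaleData(x, features = features)
  RunPCA(x, features = features)
})

# reduction = "rpca" selects reciprocal PCA instead of the default CCA
anchors <- FindIntegrationAnchors(object.list = obj_list,
                                  anchor.features = features,
                                  reduction = "rpca")
integrated <- IntegrateData(anchorset = anchors)
```

RPCA anchoring is considerably faster and lighter on memory than the default CCA-based workflow, which is presumably why it succeeded here where ComBat did not.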
