cholmod error 'problem too large' at file #8
I get the cholmod error, so I attempted your solution, but then got another error message. Any ideas? Thanks for providing this tool. Andrew
Hey Andrew, sorry for the delay - I actually forgot to put in the necessary library call.
Hopefully that helps. I checked the code with and without library(Matrix) and was able to reproduce the error you saw, so I am hoping that is the whole fix, but please let me know if you have any other issues. Thanks
Hello Nick, thanks for getting back to me. That resolved the error message as you expected; unfortunately my dataset still seems to be too big, so I get the same cholmod error when running the as.matrix() step. I guess 70k cells must be too many, and I don't want to split my Seurat object into different samples. Andrew
If any user is still having this problem: there is no longer a conversion from a sparse matrix to a full matrix, which eliminates the error (this will be in the dev branch of escape, committing this afternoon).
Had an email with this error.
The issue here is that the count matrix within the Seurat or SingleCellExperiment object is too large to convert to a traditional dense matrix using the call as.matrix(). This will be an issue if you have a ton of cells or, conversely, a ton of features, since the resulting dense matrix can exceed available memory or cholmod's internal size limits.

Methods to circumvent
1. Summarize and Filter
Filter the sparse matrix down to features that are actually expressed (or raise the threshold from 0 to a percent-of-cells-expressed cutoff) before any conversion; a sketch of this is below.
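A minimal sketch of the filtering approach, assuming a Seurat object named seurat_obj and a 1% detection cutoff (both placeholders; substitute your own object and threshold):

```r
library(Seurat)
library(Matrix)

# Pull the raw counts as a sparse dgCMatrix - no dense conversion yet.
counts <- GetAssayData(seurat_obj, slot = "counts")

# Fraction of cells in which each feature is detected (non-zero).
pct_cells <- Matrix::rowSums(counts > 0) / ncol(counts)

# Keep only features detected in at least 1% of cells.
counts_filtered <- counts[pct_cells >= 0.01, ]

dim(counts)           # before filtering
dim(counts_filtered)  # after filtering - far fewer rows to densify
```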
2. Split Seurat Object and Loop
Although not ideal, it is also possible to split the Seurat object (or the initial count matrices) into pieces and loop through them, as sketched below.
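A minimal sketch of the split-and-loop approach, assuming escape's enrichIt() is the call that triggers the conversion, and that seurat_obj, the metadata column orig.ident, and the gene set list gene_sets are placeholders for your own objects:

```r
library(Seurat)
library(escape)

# Split the large object into per-sample pieces.
obj_list <- SplitObject(seurat_obj, split.by = "orig.ident")

# Run the enrichment on each piece separately, so no single
# conversion step ever sees all 70k cells at once.
per_sample <- lapply(obj_list, function(obj) {
  enrichIt(obj = obj, gene.sets = gene_sets, groups = 1000)
})

# Recombine into one data frame of per-cell enrichment scores.
enrichment <- do.call(rbind, per_sample)
```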