Dear all,
with arbitrary samples (until recently I did not know which property was decisive) I got the error 'Error in .local(x, ...) : size factors should be positive' from computeDoubletDensity.
In some cases increasing subset.row helped (e.g. selecting the top 5000 highly variable features instead of 2000), but sometimes this did not resolve the issue.
I saw the similar issue raised here (#32), but my data set had already been cleaned of cells with very low total read counts, and the error persisted.
Solution:
One has to exclude cells which have zero total reads over the features provided in subset.row:

factors <- scuttle::librarySizeFactors(expr_mat[subset.row, ])
which(factors == 0)
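A minimal base-R sketch of this workaround (`expr_mat` and `subset.row` are illustrative names for the raw count matrix and the chosen feature subset): on raw counts, scuttle::librarySizeFactors() is proportional to the column sums, so a zero size factor corresponds exactly to a zero column sum over subset.row.

```r
# Toy count matrix: 4 genes x 3 cells; cell c2 has zero counts over the subset
expr_mat <- matrix(
  c(3, 0, 1,
    2, 0, 4,
    0, 5, 0,
    1, 2, 2),
  nrow = 4, byrow = TRUE,
  dimnames = list(paste0("g", 1:4), paste0("c", 1:3))
)
subset.row <- c("g1", "g2")  # hypothetical feature subset

# Library size over the selected features only; zero here means a zero size factor
lib_sizes <- colSums(expr_mat[subset.row, , drop = FALSE])
bad_cells <- names(lib_sizes)[lib_sizes == 0]  # cells that would trigger the error
keep      <- lib_sizes > 0

# Pass the filtered matrix to computeDoubletDensity instead of expr_mat
clean_mat <- expr_mat[, keep, drop = FALSE]
```

Filtering the columns (cells) up front, rather than the rows, keeps the feature subset intact while removing exactly the cells that make the size factors non-positive.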
Would it be acceptable to add a more meaningful error message, e.g. informing the user which cell names have a library size of zero?
I expect that handling this error inside your function is not in your interest. If it is, though:
(i) What would happen if library sizes were increased by a common value to avoid zeros, i.e. adding 1 or 0.0001, similar to log1p?
(ii) If such cells are excluded, an NA (or a sentinel value such as -1) could be returned as their doublet score. Alternatively, they could be excluded from the return entirely, though that would cause other problems.
Thanks.
Thanks, and sorry for the delay. The solutions you suggest could have unintended downstream consequences, but I agree that it would help users to have a better idea of what's going on, so I have simply added a more specific error message.