I'm experiencing an issue when running computations on the ScRNA dataset with parallelization enabled: performance becomes unpredictable, which is not the expected behaviour.
In particular, the attached sample complexity graph shows a significant dip around 40,000 data points. Given the nature of the algorithm, we would expect a smooth, monotonically increasing curve rather than a dip.
The issue seems to occur only when parallelization is enabled, and only with the ScRNA dataset; I have not observed the same behaviour with other datasets.
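To help narrow this down, here is a rough sketch of the sweep I would use to isolate the dip, comparing serial and parallel runs around the 40,000-point mark. `load_scrna_subset` and `run_computation` are placeholders standing in for the project's actual entry points (not its real API), and the dummy metric only keeps the harness runnable end to end:

```python
# Hypothetical reproduction harness: sweep dataset sizes around the dip
# (~40,000 points) with parallelism on and off, and record the metric.
# Placeholder names; swap in the project's real loading/compute calls.

from multiprocessing import cpu_count

import numpy as np


def load_scrna_subset(n_points, seed=0):
    """Placeholder: stands in for subsampling the ScRNA dataset."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal((n_points, 50))


def run_computation(data, n_jobs):
    """Placeholder: stands in for the parallelized computation under test."""
    # Dummy metric so the harness runs; replace with the real call and metric.
    return float(np.linalg.norm(data)) / len(data)


def sweep(sizes, n_jobs):
    """Run the computation at each dataset size and collect the metric."""
    return {n: run_computation(load_scrna_subset(n), n_jobs=n_jobs) for n in sizes}


if __name__ == "__main__":
    sizes = [20_000, 30_000, 38_000, 40_000, 42_000, 50_000, 60_000]
    serial = sweep(sizes, n_jobs=1)
    parallel = sweep(sizes, n_jobs=cpu_count())
    for n in sizes:
        print(f"n={n:>6}  serial={serial[n]:.4f}  parallel={parallel[n]:.4f}")
```

If the dip reproduces only in the parallel column, that would point at how the data is chunked across workers at that size rather than at the algorithm itself.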