I have deployed the Deepcell example notebook Interior-Edge Segmentation 2D Fully Convolutional.ipynb on Google Colab. When I try to load the HeLa_S3.npz dataset from Deepcell's AWS example bucket, the computing environment runs out of memory and crashes. Could you advise on the hardware requirements on which Deepcell has been shown to work for this notebook, so that I can set up a computing environment that meets them?
We ran these notebooks on our own NVIDIA DGX Station, using a single GPU with 16 GB of memory.
The HeLa_S3 dataset is almost 6 GB. If you are unable to load it into memory, we also host the following two datasets, each under 2 GB:
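As a rough sketch of how to check whether a dataset will fit in RAM before loading it (the file path below is hypothetical; this assumes you have already downloaded the .npz locally), you can read the zip directory of the archive to sum the uncompressed sizes of the arrays it contains, without loading any data:

```python
import zipfile

def npz_uncompressed_bytes(path):
    """Sum the uncompressed sizes of all arrays stored in a .npz archive.

    A .npz file is a standard zip of .npy members, so the zip central
    directory reports each member's uncompressed size; no array data
    is read into memory.
    """
    with zipfile.ZipFile(path) as zf:
        return sum(info.file_size for info in zf.infolist())

# Hypothetical usage after downloading the dataset:
# size_gb = npz_uncompressed_bytes("HeLa_S3.npz") / 1e9
# print(f"Loading this dataset needs roughly {size_gb:.1f} GB of RAM")
```

Comparing that estimate against the free memory reported by your Colab runtime tells you up front whether `np.load` will fit, since loading an .npz materializes each array fully in memory.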
@manugarciaquismondo and @cornhundred