
Managing CUDA memory? #61

Closed
Data-drone opened this issue Dec 21, 2020 · 1 comment

Comments

@Data-drone

I was trying this on a Kaggle dataset and ran into CUDA out-of-memory issues.

How can I adjust the Auto and fit functions to make sure that this doesn't happen?

@carefree0910
Owner

carefree0910 commented Dec 21, 2020

Yes! There are two ways to adjust this:

  • specify the target model in Auto to reduce the scale of the experiment:
auto = cflearn.Auto(..., models="fcnn")

The CUDA out of memory issue may be caused by searching the tree_dnn hyperparameters, and I'm planning to reduce that search space soon 😉

  • specify cuda="cpu" in fit to run on the CPU instead of CUDA:
auto.fit(..., cuda="cpu")

However, this may slow down the whole process 🤣
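
Putting the two suggestions together, here is a minimal sketch. The toy arrays stand in for the Kaggle dataset, and the "clf" task type is an assumption; the exact Auto/fit signatures may differ across carefree-learn versions, so adjust to the version you have installed.

```python
import numpy as np
import cflearn

# toy tabular data standing in for the Kaggle dataset (hypothetical)
x = np.random.random([1000, 10])
y = np.random.randint(0, 2, [1000, 1])

# restrict the search to a single model family to shrink GPU memory usage
auto = cflearn.Auto("clf", models="fcnn")  # "clf" task type is an assumption

# fall back to the CPU entirely if the GPU still runs out of memory
auto.fit(x, y, cuda="cpu")
```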
