Hello, I have a dataset that is quite large: 529 x 119,000. I would like to run the following call on my bigmem cluster: nmf(data.matrix, rank = 17, nrun = 10). The limits are 3TB RAM and 24h total for the job. Do you have any recommendations for requesting and optimizing memory allocation? Do you think I should decrease the nrun to ensure that the job finishes in 24 h? Thank you in advance for your help!
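For reference, a minimal SLURM submission sketch for a job like this — the partition name, memory request, module name, and script filename are all placeholders/assumptions, not taken from this issue or repository:

```shell
#!/bin/bash
#SBATCH --job-name=nmf-run          # placeholder job name
#SBATCH --partition=bigmem          # assumed name of the big-memory partition
#SBATCH --mem=500G                  # start well below the 3TB cap; raise only if the job is killed for memory
#SBATCH --cpus-per-task=10          # one core per run if the 10 NMF runs are executed in parallel
#SBATCH --time=24:00:00             # the stated 24h wall-clock limit

module load R                       # exact module name depends on the cluster
Rscript run_nmf.R                   # placeholder script containing the nmf(data.matrix, rank = 17, nrun = 10) call
```

If wall-clock time is the binding constraint, one common pattern is to split the `nrun = 10` into several independent jobs with different random seeds and compare or combine the fits afterwards, rather than lowering `nrun` outright — assuming the package's results can be aggregated that way on your end.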