Hi,

I know this isn't the greatest contribution, but I wanted to report a problem ("bug") I found when loading a big dataset. My dataset has 86 rows and 7000 columns (a gene expression dataset), and when I saved my environment and loaded it again, I found that the memory limit of my machine (32 GB of RAM, Windows 10) was being exceeded. I understand that rsample uses memory-efficient storage for large data while you are working, but that doesn't seem to apply when loading, because R needs to read each dataset back in first?

I was using the classic commands:

save from the RStudio Environment pane
load("my_env.RData")

Error message (classic):

Memory allocation "Error: cannot allocate vector of size 75.1 Mb"

I have already increased memory.limit() and checked CPU performance.

carlos
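For anyone trying to reproduce this, here is a minimal sketch of the kind of workflow described above. The data are simulated and the object and file names are made up; only the shapes (86 rows × 7000 columns) match the report.

```r
library(rsample)

# Simulated stand-in for the gene expression data: 86 rows x 7000 columns.
set.seed(42)
expr_data <- as.data.frame(matrix(rnorm(86 * 7000), nrow = 86))
expr_data$class <- factor(rep(c("A", "B"), length.out = 86))

# While this rset is in memory, the splits reference the same copy of
# expr_data plus row indices, which is the memory-efficient behaviour
# mentioned above.
folds <- vfold_cv(expr_data, v = 10, repeats = 5)

# Saving the workspace (roughly what the RStudio Environment pane's save
# button does) and loading it back is where the allocation error appeared.
save.image("my_env.RData")
# load("my_env.RData")
```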
I just want to comment that I closed the issue because I understand this is probably not the place to ask. Just to add that I solved the problem by saving each model and its measurements, cleaning the workspace, fitting a new model, obtaining the measures, cleaning again, and so on...

It's just a matter of being a bit more clever than I was when asking...
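For reference, a minimal sketch of that fit-measure-save-clean loop using rsample's accessors. The data set, the model (a plain lm()), the metric, and the file names are placeholders, since the original post doesn't say what was actually fitted.

```r
library(rsample)

set.seed(123)
folds <- vfold_cv(mtcars, v = 5)   # stand-in data; the real data was the wide expression set

results <- vector("list", nrow(folds))

for (i in seq_len(nrow(folds))) {
  split <- folds$splits[[i]]

  # Fit on the analysis set; lm() is only a placeholder for the real model.
  fit <- lm(mpg ~ ., data = analysis(split))

  # Keep just the measurements, not the fitted model.
  preds <- predict(fit, newdata = assessment(split))
  rmse  <- sqrt(mean((assessment(split)$mpg - preds)^2))
  results[[i]] <- data.frame(fold = i, rmse = rmse)

  # Optionally write each fit to disk instead of keeping it in memory,
  # then drop the large objects before the next iteration.
  # saveRDS(fit, sprintf("model_fold_%02d.rds", i))
  rm(fit, preds)
  gc()
}

do.call(rbind, results)
```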
This is a big issue and one that I think is impossible to get around. caret doesn't save the intermediate models for this exact reason. I think people's expectation will be that the models should be saved, and it can be surprising how much space some of the models occupy.
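As a quick, self-contained illustration of that last point (not from the thread): even a small fitted lm object keeps a copy of its training data inside it, so per-resample model objects add up fast.

```r
fit <- lm(mpg ~ ., data = mtcars)
format(object.size(fit), units = "Kb")        # size of the whole fitted object
format(object.size(fit$model), units = "Kb")  # the copy of the training data kept inside it
```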
This issue has been automatically locked. If you believe you have found a related problem, please file a new issue (with a reprex https://reprex.tidyverse.org) and link to this issue.