This request follows my post on StackOverflow. I've found that fwrite does strange things in memory-constrained environments. If write.table uses more memory than the job is allowed, my high-performance computing cluster kills the job. But if I use fwrite, the job is not killed; instead, fwrite either writes nothing, writes an empty file, writes part of the file, or writes the file with errors, yet the R code appears to have finished correctly. It took me quite some time to figure this out. It would be nice if there were a way to alert users that the write operation failed because of memory constraints (i.e., so they know they just need to boost the RAM allocation instead of hunting for errors in their code that don't exist). I don't imagine that's an easy task (I know this is a rather vague request), but I thought I would bring it to your attention because I've spent the last few days wrangling with the issue.
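Until a fix lands, a defensive round-trip check after the write can catch the silent truncation described above. This is only a sketch of a possible workaround, not part of data.table itself; the data and file path are illustrative:

```r
library(data.table)

dt <- data.table(x = 1:1e6, y = rnorm(1e6))
out <- tempfile(fileext = ".csv")

fwrite(dt, out)

# Verify the write actually completed: the file must exist and
# read back with the same number of rows as the in-memory table.
# If the process was memory-starved mid-write, this should fail
# loudly instead of letting a truncated file go unnoticed.
if (!file.exists(out) || nrow(fread(out)) != nrow(dt)) {
  stop("fwrite output is missing or truncated; possible memory-related failure")
}
```

Reading the whole file back doubles the I/O cost, so on very large outputs a cheaper check (e.g., comparing a line count from the shell against `nrow(dt) + 1` for the header) may be preferable.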
Thanks for this report. I'm confident that PR #3288 by @philippechataignon, with some follow-up from me, should solve this. When that PR is merged, it will close this issue. Please reopen if it comes up again.