check/warn about filling up _TMPDIR #485
Comments
Python doesn't document a specific maximum size for temporary directories; I believe it ends up being system-dependent. No clue what Windows routine it even calls for this (on *nix systems all the …
That's what I would have guessed. Beyond our control. We need a "troubleshooting" section somewhere for problems like this.
If the user is on a FAT32 filesystem, the maximum file size will be 4 GB. Probably not that, though, because it would fail in any directory, not just tmp. There's also this:
But that seems like it would happen when we create the folder, right?
The user reports that the output files took up 400MB of disk space. The call to CmdStan didn't throw any errors; some systems might throw https://docs.python.org/3.6/library/exceptions.html#OSError when the disk is full.
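A sketch of how a wrapper could surface a clearer message when the disk backing the temp directory fills up mid-write. `write_sample_csv` is a hypothetical helper, not part of CmdStanPy; the point is that `ENOSPC` arrives as an `OSError` during `write()`, not at directory creation:

```python
import errno
import os
import tempfile


def write_sample_csv(rows):
    """Write CSV rows to a temp file; if the filesystem backing
    the temp directory fills up, re-raise with a hint to use a
    different output directory. (Hypothetical helper.)"""
    fd, path = tempfile.mkstemp(suffix=".csv")
    try:
        with os.fdopen(fd, "w") as f:
            for row in rows:
                f.write(row + "\n")
    except OSError as e:
        if e.errno == errno.ENOSPC:
            raise OSError(
                f"disk full while writing {path}; "
                "set output_dir to a drive with more free space"
            ) from e
        raise
    return path
```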
I think that Windows always puts %TEMP% on the C drive. So if it's full, you've got problems even if you have terabytes free on other drives. Sounds like that was this specific user's issue, and I'm afraid there's not much we can do. We could mention this in the error message that got reported for the other issue about mismatched CSV lengths?
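One thing a pre-sampling check could do is measure the free space on whatever filesystem backs the default temp directory (which on Windows is usually %TEMP% on C:). A minimal sketch using only the standard library; the warning wording and threshold are illustrative:

```python
import shutil
import tempfile


def tmpdir_free_bytes():
    """Free bytes on the filesystem backing the default temp
    directory, i.e. where CmdStanPy writes when no output_dir
    is given."""
    tmp = tempfile.gettempdir()
    return shutil.disk_usage(tmp).free


# A wrapper could warn before running the sampler, e.g.:
# if tmpdir_free_bytes() < estimated_output_bytes:
#     warnings.warn(f"temp dir {tempfile.gettempdir()} may fill up")
```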
Summary:
Problems reported via Discourse: https://discourse.mc-stan.org/t/analyzing-the-posterior-prediction-samples/24956/20?u=mitzimorris
Description:
User ran a model to fit a large dataset and then ran posterior predictive checks in the generated quantities block, using all default sampler settings and an input dataset of 10M items.
The CmdStan run succeeded, but CmdStanPy was unable to check the outputs when the output files were written to _TMPDIR; it fails with an error.
When output_dir is specified, the call to the sampler via CmdStanPy succeeds. The result of running the sampler with defaults is 4 Stan CSV files of 1001 rows and 10M columns.
Is this enough data to fill up _TMPDIR?
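Quite possibly. A back-of-envelope estimate, assuming roughly 10 bytes per comma-separated value (an assumption; the actual width depends on CmdStan's numeric formatting):

```python
def estimated_csv_bytes(rows=1001, cols=10_000_000,
                        bytes_per_value=10, files=4):
    """Rough total size of the Stan CSV output for the run
    described above: 4 files, 1001 rows, 10M columns, with
    each value assumed to take ~10 bytes including delimiter."""
    return rows * cols * bytes_per_value * files


print(estimated_csv_bytes() / 1e9)  # → 400.4 (GB)
```

Under that assumption the run would produce on the order of 400 GB of CSV output, far more than most temp partitions hold.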