SARIMAX gives error on model.save(): "SystemError: error return without exception set" #4006
This might be difficult to debug without a test case. Can you try to pickle or cPickle dump it directly instead of going through the save method? And maybe try different pickle options. SystemError might indicate an operating-system limitation.
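To make the suggestion concrete, here is a minimal sketch of pickling a results object directly rather than through `results.save()`. The `res` dict below is a small stand-in for a fitted SARIMAX results object (the real one would come from `SARIMAX(...).fit()`); the file name is illustrative.

```python
import os
import pickle
import tempfile

# Stand-in for a fitted results object; in practice this would be
# res = SARIMAX(endog, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24)).fit()
res = {"params": [0.51, -0.23, 0.12], "llf": -1234.5}

# Bypass results.save() and call pickle directly, using the highest
# available protocol (newer protocols handle large objects better).
path = os.path.join(tempfile.mkdtemp(), "sarimax_results.pkl")
with open(path, "wb") as f:
    pickle.dump(res, f, protocol=pickle.HIGHEST_PROTOCOL)

with open(path, "rb") as f:
    restored = pickle.load(f)
print(restored["llf"])  # -1234.5
```

If the direct `pickle.dump` fails the same way, the problem is in serializing the object itself rather than in the `save` wrapper.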
Full stack trace:
Full trace:
The last error:
But how can I check here whether the data is too big? A SARIMAX(1, 0, 1)x(1, 1, 1, 24) model dumps successfully.
I agree that this sounds like trying to pickle too much data. In your original specification you have a state vector of dimension 190, which means that some of the output arrays are pretty big. For example,
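As a rough back-of-the-envelope check, the smoothed state covariance output alone scales as k_states² × nobs × 8 bytes. The sketch below uses the state dimension of 190 from the issue and an assumed sample size of 50,000 observations (the actual nobs is not given in the thread):

```python
# Rough size of one smoothed state covariance array:
# shape (k_states, k_states, nobs) of float64.
k_states = 190      # from the issue
nobs = 50_000       # assumed for illustration; not stated in the thread
bytes_per_float = 8

size_gb = k_states * k_states * nobs * bytes_per_float / 1024**3
print(f"{size_gb:.1f} GB")  # 13.4 GB for this single array
```

A single array in the tens of gigabytes is enough to explain both the slow save and failures in older pickle protocols.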
Fair enough about the calculations. It is just weird that data which fits in memory doesn't fit* on disk. *cannot be saved
The original error sounds like it might be related to numpy/numpy#2396
Also, pickle will make much larger objects than exist in memory. For example, if you create
You may want to consider using
A common problem with large arrays: see numpy/numpy#2396
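For context on the linked numpy issue: pickle protocols 2 and below use 4-byte length fields and therefore cannot serialize a single object whose byte payload exceeds ~4 GiB, which large state-space output arrays can easily hit. Protocol 4 (available since Python 3.4) uses 64-bit framing and lifts that limit. A small sketch of the fix, using a tiny stand-in payload:

```python
import io
import pickle

# Stand-in for a multi-GiB array buffer; with protocol <= 2, payloads
# over ~4 GiB raise an error, while protocol 4 handles them.
payload = b"x" * 1024

buf = io.BytesIO()
pickle.dump(payload, buf, protocol=4)  # 64-bit framing, no 4 GiB limit
buf.seek(0)
print(pickle.load(buf) == payload)  # True
```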
Yes, it seems you are right. I managed to save the model in Python 3.5, though. Size of the file: 13 GB.
I'm keeping it open as a FAQ issue, given that this is the first time we've gotten these problems and explanations.
Separate from the technical question in this issue: @dchirikov What's your use case? Why do you want or need to pickle? I think in the long run we should get away from pickling as the only or recommended way to reuse the model estimate, for example for prediction.
@josef-pkt To be honest, I am relatively new to statistics/econometrics and can't say I am really familiar with other ways of reusing models (if any). But in my case I need to compute models for 15 time-series arrays, and every model takes 2-5 hours to fit on pretty powerful hardware. So in order not to lose those hours I need to back them up somehow. I will use predict for sure, and fittedvalues to estimate quality and/or make graphs.
@dchirikov Thanks for the reply. I guess pickle is still the best choice for backing up a full model and results. |
And if you want to evaluate the model at exactly a saved set of parameters (i.e. you don't need to re-fit), then you can just do |
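One lightweight alternative to pickling the whole results object is to persist only the fitted parameter vector, which is tiny, and rebuild the results later without re-fitting. The sketch below saves a stand-in parameter array with numpy; the statsmodels pattern for re-evaluation is shown in comments (names and orders are illustrative, not from the issue):

```python
import os
import tempfile

import numpy as np

# Stand-in for res.params from a fitted model.
params = np.array([0.51, -0.23, 0.12, 0.88])
path = os.path.join(tempfile.mkdtemp(), "sarimax_params.npy")
np.save(path, params)

# Later: rebuild the model specification and evaluate at the saved
# parameters without re-fitting (statsmodels pattern, sketched):
#   mod = sm.tsa.SARIMAX(endog, order=(1, 0, 1),
#                        seasonal_order=(1, 1, 1, 24))
#   res = mod.smooth(np.load(path))
#   res.predict(...)  # full results API is then available

restored = np.load(path)
print(np.allclose(restored, params))  # True
```

This stores kilobytes instead of gigabytes, at the cost of re-running the (fast) filtering/smoothing pass on load.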
Closing as answered. |
Hi,
I am observing an issue with a deep SARIMA(X) model:
This is quite a big dataset, and if I limit Q/q and P/p to smaller values (1-2) the model can be saved. But with the described parameters I see the following trace.
The statsmodels version is 0.8.0.