For some nonlinear queries, the peak memory usage of z3 can be quite high. If system memory is insufficient (e.g., on the GitHub-hosted Ubuntu and Windows runners), the z3 process may be killed due to OOM, crashing halmos.
To prevent this, introduce an additional option, say --solver-max-memory, that sets the maximum memory of z3:
solver.set(max_memory=<size-in-megabytes>)
Also, set a default max memory for z3 based on the system memory size, e.g., using psutil.
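A minimal sketch of how the default could be computed, assuming psutil is available (with a POSIX sysconf fallback) and a default fraction of 90% of system RAM; the helper names and the fraction are illustrative, not part of halmos:

```python
import os

def system_memory_mb():
    # Prefer psutil when available; fall back to POSIX sysconf otherwise.
    try:
        import psutil
        total_bytes = psutil.virtual_memory().total
    except ImportError:
        total_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    return total_bytes // (1024 * 1024)

def default_max_memory_mb(fraction=0.9):
    # Default cap: a fraction of system RAM, overridable by a hypothetical
    # --solver-max-memory option (in megabytes, as z3's max_memory expects).
    return int(system_memory_mb() * fraction)

# Applying the cap to a z3 solver (guarded, in case z3 is not installed):
try:
    import z3
    solver = z3.Solver()
    solver.set(max_memory=default_max_memory_mb())
except ImportError:
    pass
```

Using megabytes throughout matches the unit z3's max_memory parameter takes, so the CLI value can be passed through unconverted.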
I think max_memory could be automatically set by default to, e.g., 80%-90% of the system capacity. Advanced users could then override this value with --solver-max-memory.