This may be more of a discussion item, but starting here.
We encountered an interesting case of an out-of-memory exception when constructing large problems in scipy/scipy#15888. While there's not necessarily anything to do about it (sometimes problems are large and you need more memory), in this case it appears the situation could be helped by not requiring a double allocation of memory (once for the Python C array copies and once for the HiGHS std::vector copies).
The Stack Overflow answer above suggests that a custom allocator could be used that simply "reuses" the array allocation for the vector, but it's not currently clear to me how well Cython supports custom C++ allocators/deleters.
EDIT (follow-up): without C++17 features and a big rewrite of the HiGHS C++ interfaces, achieving this with custom allocators is probably not possible.
I'm curious whether there's a possible refactor from std::vector to something able to take ownership of existing memory allocations, most importantly C arrays. Most interactions between HiGHS and other programming languages (Python, Julia, Fortran, etc.) are going to be facilitated through C interfaces. In these use cases, HiGHS will require at least double the amount of memory for a single problem unless another internal container is used that supports copy-less assignment.
I'm happy to try a few things if there's an appetite for this kind of overhaul. I'd also like to know whether you consider this model-creation "optimization" worth the effort.
I do consider this worth it because, as you said, "sometimes problems are large and you need more memory" -- but until that point, with the same amount of memory, we can do a lot of work if we do not use memory naively.