Shared memory for multiple cones of the same type #84

Closed

migarstka opened this issue May 28, 2019 · 1 comment
Labels
enhancement New feature or request

Comments

migarstka (Member) commented May 28, 2019

It would be helpful to allow multiple cones to share the same workspace during projection. Right now every cone pre-allocates the memory it needs for its projection. However, if the dimensions of the cones are the same (as for exponential cones) and the projections are carried out sequentially, there is no need for every cone to allocate its own temporary workspace variables. Instead, a single set of variables should be shared by all the cones.
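The idea above can be sketched as follows. This is a minimal illustrative Python example (the class and field names are hypothetical, not the project's actual API): every cone of the same dimension holds a reference to one pre-allocated buffer instead of allocating its own.

```python
import numpy as np

class ExpCone:
    """Hypothetical 3-dimensional exponential cone that borrows a shared
    scratch buffer instead of allocating its own temporary storage."""
    DIM = 3

    def __init__(self, workspace):
        # All cones of this type reference the same pre-allocated buffer;
        # this is safe only because projections run sequentially.
        self.work = workspace

    def project(self, x):
        # Use the shared buffer as temporary storage during the projection.
        np.copyto(self.work, x)
        # ... the actual projection logic would operate on self.work ...
        return self.work.copy()

# One buffer serves every exponential cone.
shared = np.empty(ExpCone.DIM)
cones = [ExpCone(shared) for _ in range(4)]
assert all(c.work is shared for c in cones)
```

The allocation cost then scales with the cone dimension rather than with the number of cones.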

@migarstka migarstka added the enhancement New feature or request label May 28, 2019
goulart-paul (Member) commented
I thought that the goal of giving every cone its own memory space was to:

  1. Allow for parallel projection at some point in the future, and
  2. Allow us to use inexact/iterative factorisation methods for the PSD cone, where each projection would need to carry its own internal state.

If you really want to have a shared memory space for the exponential cones, then wouldn't it be better to just define a multivariate / vectorized exp cone type?
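The vectorized-cone suggestion could look something like the following hedged sketch (again with hypothetical names, and an identity placeholder standing in for the real per-cone projection): one cone object represents n stacked 3-dimensional exponential cones and owns a single workspace for all of them.

```python
import numpy as np

class VectorizedExpCone:
    """Hypothetical vectorized cone: n exponential cones of dimension 3
    represented as one block, sharing one workspace by construction."""
    def __init__(self, n):
        self.n = n
        # Single workspace for all n sub-cones, allocated once.
        self.work = np.empty((n, 3))

    def project(self, x):
        # x has length 3*n; view it as one row per sub-cone.
        X = x.reshape(self.n, 3)
        np.copyto(self.work, X)
        # ... the per-row exponential-cone projection would go here;
        # an identity placeholder is used for illustration ...
        return self.work.reshape(-1).copy()

cone = VectorizedExpCone(5)
x = np.arange(15.0)
y = cone.project(x)
```

This keeps the one-workspace property without introducing cross-cone aliasing: the shared memory is internal to a single cone object rather than threaded through many.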
