
(De)serialization of graphs #308

Closed
casadibot opened this issue Oct 31, 2012 · 12 comments

@casadibot (Member) commented Oct 31, 2012

It would be very handy to be able to save MX and SX graphs to the filesystem and load them back.
In a related effort, Python pickling should become possible.

Created by jgillis at: 2011-12-22T13:59:40
Last updated at: 2012-08-14T11:01:15
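The standard recipe for making SWIG-wrapped objects picklable (as in the Stack Overflow pointer below) is to route `pickle` through a string serialization via `__reduce__`. A minimal sketch with a toy stand-in class (the class and method names are invented for illustration; they are not CasADi's actual API):

```python
import pickle

# Toy stand-in for a SWIG-wrapped expression object. In a real binding,
# serialize()/deserialize() would call into C++ to flatten and rebuild
# the graph; here they just carry a string.
class Expr:
    def __init__(self, text):
        self.text = text  # e.g. "sin(x) + x**2"

    def serialize(self):
        return self.text

    @staticmethod
    def deserialize(data):
        return Expr(data)

    def __reduce__(self):
        # Tell pickle to rebuild the object from its serialized form,
        # bypassing the unpicklable SWIG pointer.
        return (Expr.deserialize, (self.serialize(),))

e = Expr("sin(x) + x**2")
e2 = pickle.loads(pickle.dumps(e))
print(e2.text)  # sin(x) + x**2
```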


Comment 1 by jgillis at 2012-08-14T01:15:19

Pointers: http://stackoverflow.com/questions/9310053/how-to-make-my-swig-extension-module-work-with-pickle


Comment 2 by jgillis at 2012-08-14T11:01:15

Another pointer: http://www.picklingtools.com/

@jgillis (Member) commented May 27, 2013

Some serialization formats: yaml, json, bson

@jaeandersson (Member) commented May 27, 2013

Storing an SX graph is trivial: all the information is contained in the algorithm. One problem is that reconstructing the graph may duplicate common subexpressions.

MX is much harder, and you cannot really do it without also covering FX and its derived classes. That would require a systematic approach.

I'm not sure this is the right way to go. So far we have done a pretty good job of making the symbolics fast enough that people aren't really complaining about regenerating all the expression graphs every time. Code generation also makes this less important.
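The sharing concern can be illustrated with a toy deserializer: if nodes are referenced by index into a flat "algorithm" listing (a hypothetical format, not CasADi's actual one), each shared subexpression is rebuilt exactly once and sharing is preserved:

```python
# Rebuild an expression graph from a flat instruction list. Each
# instruction is (op, operand_indices); operands refer to earlier
# instructions by index, so a shared node is constructed only once.
class Node:
    def __init__(self, op, args):
        self.op, self.args = op, args

def rebuild(algorithm):
    nodes = []
    for op, arg_idx in algorithm:
        nodes.append(Node(op, [nodes[i] for i in arg_idx]))
    return nodes[-1]

# y = (x + 1) * (x + 1), with "x + 1" stored once and referenced twice
alg = [
    ("var_x", []),     # 0: x
    ("const_1", []),   # 1: 1
    ("add", [0, 1]),   # 2: x + 1
    ("mul", [2, 2]),   # 3: (x + 1) * (x + 1)
]
root = rebuild(alg)
print(root.args[0] is root.args[1])  # True: sharing preserved
```

A naive recursive reconstruction from a nested textual form would instead build `x + 1` twice, which is the duplication problem mentioned above.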

@jaeandersson (Member) commented May 27, 2013

About serialization of SX, my proposal would be to create a .nl exporter.
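For context, the .nl format stores nonlinear expressions in Polish prefix notation. A minimal sketch of the expression-tree part of such an exporter follows; the opcode numbers (`o0` for plus, `o2` for multiply, `o41` for sin) reflect my understanding of the ASL operator table and should be treated as assumptions, not a spec:

```python
# Sketch: emit an expression tree in .nl-style prefix notation.
# Opcode numbers are assumptions based on the ASL operator table.
OPS = {"+": "o0", "*": "o2", "sin": "o41"}

def to_nl(expr):
    """expr: ('var', i) | ('num', c) | (op, child, ...) -> list of .nl lines."""
    kind = expr[0]
    if kind == "var":
        return [f"v{expr[1]}"]
    if kind == "num":
        return [f"n{expr[1]}"]
    lines = [OPS[kind]]
    for child in expr[1:]:
        lines += to_nl(child)
    return lines

# sin(x0) + 2*x1
tree = ("+", ("sin", ("var", 0)), ("*", ("num", 2), ("var", 1)))
print("\n".join(to_nl(tree)))
```

A real .nl file would additionally need the header block, linear parts, and bounds sections that the format prescribes.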

@jaeandersson (Member) commented May 27, 2013

Created two issues for serialization of NLPs, #755 and #756.

jgillis added a commit that referenced this issue Jul 18, 2013
jgillis added a commit that referenced this issue Jul 19, 2013
@jaeandersson (Member) commented Apr 27, 2015

I'm starting to think that this would really be possible for MX as well. It's a major effort, but it would offer a really nice platform-independent way of passing symbolic expressions around.

@ghorn (Member) commented Apr 28, 2015

great!!!

probably need a version scheme also

@jaeandersson (Member) commented Apr 28, 2015

What do you mean by version scheme?

@ghorn (Member) commented Apr 28, 2015

If you save a graph with one CasADi version and try to load it with another after the serialization format has changed, a clear assertion failure would be nice. Maybe that's too much complexity to worry about.

@jaeandersson (Member) commented Apr 28, 2015

I think the first step has to be to get any serialization working at all. Before there is a stable serialization format, versioning is of little use.
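A minimal version check of the kind ghorn suggests could look like this (the magic bytes and layout are invented for illustration):

```python
import struct

MAGIC = b"CSDI"        # hypothetical magic bytes
FORMAT_VERSION = 2     # bumped whenever the serialized layout changes

def save(payload: bytes) -> bytes:
    # Prefix the payload with magic bytes and a little-endian version word.
    return MAGIC + struct.pack("<I", FORMAT_VERSION) + payload

def load(blob: bytes) -> bytes:
    if blob[:4] != MAGIC:
        raise ValueError("not a serialized expression graph")
    (version,) = struct.unpack("<I", blob[4:8])
    if version != FORMAT_VERSION:
        raise ValueError(
            f"serialized with format v{version}, this build reads v{FORMAT_VERSION}")
    return blob[8:]

blob = save(b"graph-data")
print(load(blob))  # b'graph-data'
```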

@jgillis (Member) commented Sep 14, 2016

Upvote from Raoul Herzog.

@pstjohn commented Oct 10, 2016

I probably mentioned this in a previous SourceForge issue, but I'll add my 👍 to this effort:

Another benefit that might be overlooked is that the vast majority of Python parallelization libraries rely on pickling to pass objects to and from processes. Being able to pickle CasADi's symbolic objects would make running optimizations in parallel significantly more straightforward.
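The pickling constraint can be seen directly: worker pools (multiprocessing, joblib, dask, ...) move tasks and results between processes as pickles, so everything they touch must survive a dumps/loads round trip. Plain data does; objects without pickle support (a lambda here, SWIG-wrapped objects in CasADi's case) do not:

```python
import pickle

# Plain data survives the round trip that a worker pool requires.
task = {"x0": [0.0, 1.0], "lbx": -5.0, "ubx": 5.0}
assert pickle.loads(pickle.dumps(task)) == task

# Objects without pickle support break it, which is why unpicklable
# symbolic objects block parallel optimization workflows.
try:
    pickle.dumps(lambda x: x ** 2)
    print("picklable")
except Exception as err:
    print("not picklable:", type(err).__name__)
```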

jgillis added a commit that referenced this issue Nov 7, 2017
(This is NOT a desirable serialization format yet at all)
jgillis added a commit that referenced this issue Nov 8, 2017
(This is NOT a desirable serialization format yet at all)
jgillis pushed a commit that referenced this issue Feb 13, 2018
jgillis pushed a commit that referenced this issue Feb 13, 2018
jgillis pushed a commit that referenced this issue Feb 14, 2018
jgillis pushed a commit that referenced this issue Feb 14, 2018
jgillis pushed a commit that referenced this issue Feb 14, 2018
jgillis pushed a commit that referenced this issue Feb 15, 2018
jgillis pushed a commit that referenced this issue Apr 26, 2018
jgillis pushed a commit that referenced this issue Apr 26, 2018
jgillis pushed a commit that referenced this issue Apr 27, 2018
jgillis pushed a commit that referenced this issue Apr 27, 2018
jgillis pushed a commit that referenced this issue Apr 30, 2018
jgillis pushed a commit that referenced this issue Jun 7, 2018
@jgillis jgillis added this to the Version 3.5 milestone Jun 7, 2018
@jgillis (Member) commented Jun 7, 2018

Implemented and tested.
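As I understand the 3.5 API that closed this issue, serialization is exposed through `Function.save`/`Function.load`; the sketch below is unverified against a specific release, so check the CasADi documentation for the exact names:

```python
import casadi as ca

# Build a function from a symbolic expression...
x = ca.MX.sym("x")
f = ca.Function("f", [x], [ca.sin(x) + x**2])

# ...serialize it to disk, and load it back (possibly on another machine).
f.save("f.casadi")
g = ca.Function.load("f.casadi")
```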

@jgillis jgillis closed this Jun 7, 2018
jgillis pushed a commit that referenced this issue Jun 15, 2018
jgillis pushed a commit that referenced this issue Jun 15, 2018
jgillis pushed a commit that referenced this issue Jun 28, 2018
jgillis pushed a commit that referenced this issue Aug 6, 2018
jgillis pushed a commit that referenced this issue Aug 6, 2018
jgillis pushed a commit that referenced this issue Aug 6, 2018
jgillis pushed a commit that referenced this issue Aug 22, 2018
jgillis pushed a commit that referenced this issue Aug 31, 2018
jgillis pushed a commit that referenced this issue Sep 3, 2018
jgillis pushed a commit that referenced this issue Sep 5, 2018
jgillis pushed a commit that referenced this issue Oct 1, 2018
jgillis pushed a commit that referenced this issue Oct 12, 2018
jgillis pushed a commit that referenced this issue Oct 12, 2018
jgillis pushed a commit that referenced this issue Oct 12, 2018