
Function to generate object and objective gradient evaluator functions #115

Merged
merged 10 commits into from
Feb 7, 2024

Conversation

tjstienstra
Copy link
Contributor

This PR proposes a utility function that lets one specify an analytic expression as the objective function.
It is a relatively simple function with one major limitation, namely that one cannot combine time-dependent and time-independent variables in the same objective expression.

Additionally, I noticed while opening this PR that #31 aims to implement a similar utility in an entirely different way. Will have to look into that one to see if it would be a better solution.
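For illustration only (this is not code from the PR), one way to detect such a mixed objective with SymPy is to check whether an expression contains both applied functions of time and plain symbols; `mixes_time_dependence` below is a hypothetical helper:

```python
import sympy as sm
import sympy.physics.mechanics as me

t = me.dynamicsymbols._t
f = me.dynamicsymbols("f")
m = sm.symbols("m")


def mixes_time_dependence(expr):
    """Return True if expr contains both time-dependent functions and
    plain (time-independent) symbols."""
    time_dependent = expr.atoms(sm.Function)  # e.g. f(t)
    time_independent = expr.free_symbols - {t}  # e.g. m
    return bool(time_dependent) and bool(time_independent)


mixes_time_dependence(sm.Integral(f**2, t))  # only f(t): False
mixes_time_dependence(m**2 + f**2)           # mixes m and f(t): True
```

Inside a bounded integral the integration variable is no longer free, so only genuinely time-independent symbols like m survive the `free_symbols - {t}` check.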

@tjstienstra
Copy link
Contributor Author

Okay, based on the other PR, I would say that this function covers quite a few cases. However, ideally we want to define the objective expression as a continuous function, possibly containing an integral.

import numpy as np
import sympy as sm
import sympy.physics.mechanics as me

x, v, f = me.dynamicsymbols("x v f")
t = me.dynamicsymbols._t
m = sm.symbols("m")
integration_method = "backward euler"
node_times = np.linspace(0, 2, 101)  # duration=2.0 ; N=101 ; interval_value=0.02

obj_expr = m**2 + sm.Integral(f**2, (t, 0.5, 1.5))
obj, obj_grad = create_objective_function(
    obj_expr, (x, v), (f,), (m,), integration_method, node_times)

# Should give something like
# free[303] is just the value of m
# free[228] is f.subs(t, 0.52) (we are using backward euler), therefore it is not f.subs(t, 0.5)
# free[277] is f.subs(t, 1.50), so the slice free[228:278] covers the integration interval
def obj(free):
    return free[303] ** 2 + np.sum(free[228:278] ** 2)

def obj_grad(free):
    return np.hstack((np.zeros(101), np.zeros(101), np.zeros(26), 2 * free[228:278], np.zeros(25), 2 * free[303]))

Not yet fully sure whether this is the ideal way to construct obj_grad, but it is the version that looks most like what this PR currently does.
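One way to gain confidence in a hand-written gradient like the sketch above is to compare it against central finite differences. This is a hypothetical check on a small objective of the same form (a parameter term plus a sum over a slice of nodes), not code from the PR:

```python
import numpy as np

# Toy objective of the same shape as the sketch above:
# last entry of free is a parameter, entries 2:5 are "nodes".
def obj(free):
    return free[-1] ** 2 + np.sum(free[2:5] ** 2)

def obj_grad(free):
    grad = np.zeros_like(free)
    grad[2:5] = 2.0 * free[2:5]  # per-node gradient of the sum term
    grad[-1] = 2.0 * free[-1]    # gradient with respect to the parameter
    return grad

rng = np.random.default_rng(0)
free = rng.standard_normal(8)

# Central finite differences, one unit vector per entry of free.
eps = 1e-6
fd = np.array([(obj(free + eps * e) - obj(free - eps * e)) / (2 * eps)
               for e in np.eye(free.size)])

assert np.allclose(fd, obj_grad(free), atol=1e-6)
```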

@tjstienstra
Copy link
Contributor Author

A temporary simplification would be to just ignore the bounds of the integral for now, i.e. assert integral.limits == ((t,),)
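That simplification could look something like the following sketch (a hypothetical guard, not code from the PR), which rejects any integral that carries explicit bounds:

```python
import sympy as sm
import sympy.physics.mechanics as me

t = me.dynamicsymbols._t
f = me.dynamicsymbols("f")


def assert_unbounded(expr):
    """Raise if expr contains an integral with explicit bounds."""
    for integral in expr.atoms(sm.Integral):
        if integral.limits != ((t,),):
            raise NotImplementedError(
                "Bounded integrals are not supported yet.")


assert_unbounded(sm.Integral(f**2, (t,)))  # fine: no bounds
# assert_unbounded(sm.Integral(f**2, (t, 0.5, 1.5)))  # would raise
```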

@tjstienstra
Copy link
Contributor Author

tjstienstra commented Feb 3, 2024

Here is a fun objective that causes quite a few problems, even when trying to keep things simple.

import numpy as np
import sympy as sm
import sympy.physics.mechanics as me

x, v, f = me.dynamicsymbols("x v f")
t = me.dynamicsymbols._t
m, k, c = sm.symbols("m k c")
integration_method = "backward euler"
node_times = np.linspace(0, 2, 11)

states = sm.ImmutableDenseMatrix([x, v])
inputs = sm.ImmutableMatrix([f])
params = sm.ImmutableMatrix([c, k, m])
n, q, r = states.shape[0], inputs.shape[0], params.shape[0]

obj = m**2 + sm.Integral(f**2, (t,)) + sm.Integral(c**2 * f, (t,)) + sm.Integral(k**2, (t,))
obj_grad = sm.ImmutableMatrix([obj]).jacobian(states.col_join(inputs).col_join(params))

@tjstienstra
Copy link
Contributor Author

@moorepants I think I figured out a method to combine it all. It is not the most beautiful function I have ever written, but it seems to do the job as the tests show.

@moorepants
Copy link
Member

#116 fixes the test failures (pytest 8 just came out with a deprecation). You can merge master.


# Compute analytical gradient of the objective function
objective_grad = sm.ImmutableMatrix([objective]).jacobian(
    states[:] + inputs[:] + params[:])
Copy link
Member


Just a note for me to see what this looks like:

import sympy as sm
a, t = sm.symbols('a, t')
f, g = sm.Function('f')(t), sm.Function('g')(t)
expr = f**2 + a*sm.sin(g)
int_expr = sm.Integral(expr, t)
sm.ImmutableMatrix([int_expr]).jacobian([a, f, g])
Matrix([[Integral(sin(g(t)), t), Integral(2*f(t), t), Integral(a*cos(g(t)), t)]])

obj_grad_time_expr_eval = lambdify_function(
    objective_grad[:n + q], np.hstack((0, np.ones(N - 1))), False)
obj_grad_param_expr_eval = lambdify_function(
    objective_grad[n + q:], np.hstack((0, np.ones(N - 1))), True)
Copy link
Member


I don't follow these lines perfectly. What is the reason to force the sum or not?

Copy link
Contributor Author

@tjstienstra tjstienstra Feb 6, 2024


Yeah, it all feels a bit like a cheat.

By multiplying with an array I achieve two goals:

  • I make sure that we always get the correct shape: both sm.Integral(f(t), t) and sm.Integral(m, t) should work. In the latter case m is broadcast to an array.
  • It allows setting weights for the nodes. In backward Euler the first node has a weight of 0.

Whether to take the sum depends on whether we want to carry out the actual integration or keep the per-node terms separate.
When computing the gradient with respect to a state, we want the gradient at each node separately, so we should not take the sum and only apply the right weight via the multiplication array.
However, if we want the gradient of sm.Integral(f(t)**2 * m**2, t) with respect to m, i.e. sm.Integral(2 * f(t)**2 * m, t), then we do need to actually integrate by taking the sum.
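Numerically, the weight-array/summation distinction looks something like this (an illustrative sketch with made-up values, not the PR's lambdify_function):

```python
import numpy as np

# Backward Euler quadrature weights for N nodes with spacing h:
# the first node gets weight 0, the rest get h.
N, h = 5, 0.5
weights = h * np.hstack((0.0, np.ones(N - 1)))

f_vals = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # f at the nodes
m = 2.0

# Objective Integral(f**2 * m**2, t): weighted sum over the nodes.
obj = np.sum(weights * f_vals**2 * m**2)

# Gradient wrt f: one entry per node, weighted, NOT summed.
grad_f = weights * 2 * f_vals * m**2

# Gradient wrt m, i.e. Integral(2*f**2*m, t): weighted AND summed,
# since m is a single unknown.
grad_m = np.sum(weights * 2 * f_vals**2 * m)
```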

@tjstienstra tjstienstra marked this pull request as ready for review February 6, 2024 08:17
opty/utils.py (review comment resolved)
@moorepants
Copy link
Member

You happy with this? I think it is good to go in if you do.

@tjstienstra
Copy link
Contributor Author

I think it is good for now. Unless you're planning to add a lot of advanced algorithms soon, I really don't see the need to make it more advanced; the internals may change later. The only caveat is that the keyword arguments expect fixed time stepping.
All other limitations seem to be covered by errors.

@moorepants
Copy link
Member

Ok, let's merge. Nice addition!

@moorepants moorepants merged commit f380c07 into csu-hmc:master Feb 7, 2024
12 checks passed