Vectorization refactor #205

Merged: 27 commits into main on Jun 9, 2022

Conversation

@luisenp (Contributor) commented on Jun 6, 2022:

Refactored the vectorization logic. I ran some quick tests on motion planning and tactile pose estimation, and it seems to be working.

Some TODOs left:

  • Add unit tests for:
    • Shared aux. var computation
    • Schemas being computed correctly (see the sketch after this list)
    • Validating vectorized cost values
  • Add vectorized retraction after computing the descent direction (separate PR; we can move retract_and_update_variables in that one as well).
  • More thorough testing in the example applications.
  • Add robust costs (separate PR; see #146, "Add Robust Cost Function").
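
For the schema tests above, the grouping idea looks roughly like the following. This is only an illustrative sketch of the CostFunctionSchema -> List[CostFunction] grouping described in the commits, not the PR's actual code; the attribute access (dof(), shape) and the function names are assumptions.

```python
from collections import defaultdict

def cost_schema(cost_fn):
    # A schema captures what must match for two costs to share one batched
    # call: the cost class plus the dimensions of its optimization and
    # auxiliary variables (attribute names assumed for illustration).
    optim_dims = tuple(v.dof() for v in cost_fn.optim_vars)
    aux_dims = tuple(tuple(v.shape[1:]) for v in cost_fn.aux_vars)
    return (type(cost_fn).__name__, optim_dims, aux_dims)

def group_by_schema(cost_fns):
    # Schema -> list of cost functions that can be evaluated together.
    groups = defaultdict(list)
    for cf in cost_fns:
        groups[cost_schema(cf)].append(cf)
    return groups
```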

@facebook-github-bot added the CLA Signed label on Jun 6, 2022
@luisenp added the enhancement and refactor labels and removed the CLA Signed label on Jun 6, 2022
@facebook-github-bot re-added the CLA Signed label on Jun 6, 2022
@mhmukadam requested a review from rtqichen on Jun 7, 2022
@fantaosha (Contributor) left a comment:

Overall LGTM. The only thing I'm concerned about is making sure that _cached_errs and cached_jacobians in CostFunctionWrapper are cleared if the variables in the corresponding cost functions are updated.
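
A minimal sketch of the invalidation pattern this comment asks for, with hypothetical names (this is not theseus code): the wrapper must drop its cached outputs whenever the wrapped cost's variables change.

```python
class CachedCostWrapper:
    def __init__(self, cost_fn):
        self.cost_fn = cost_fn
        self._cached_errs = None
        self._cached_jacobians = None

    def on_vars_updated(self):
        # Hook that must run on any update to the wrapped cost's variables;
        # otherwise stale errors/Jacobians would be served on the next solve.
        self._cached_errs = None
        self._cached_jacobians = None

    def error(self):
        if self._cached_errs is None:
            self._cached_errs = self.cost_fn.error()
        return self._cached_errs
```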

Resolved review threads: theseus/core/vectorizer.py (2), theseus/core/objective.py (2, outdated)
@mhmukadam (Member) left a comment:

Nice work on the refactor, @luisenp, and of course nice work on the original features, @fantaosha! Reviewed everything so far and left comments.

@mhmukadam requested a review from bamos on Jun 9, 2022
@@ -64,6 +64,16 @@ def register_aux_vars(self, aux_var_names: Sequence[str]):
         for name in aux_var_names:
             self.register_aux_var(name)

+    def register_vars(self, vars: Iterable[Variable], is_optim_vars: bool = False):
+        for var_ in vars:
Reviewer (Member): Nit: trailing underscore.

@luisenp (Author): Good catch. Changed.
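
The hunk above truncates the method body; after the rename, a plausible shape is the sketch below. The dispatch to register_optim_var is an assumption by symmetry with register_aux_var from the surrounding context, so treat this as illustration rather than the merged code.

```python
from typing import Iterable

def register_vars(self, vars: Iterable["Variable"], is_optim_vars: bool = False):
    # Sketch: route each variable to the matching registration method.
    for var in vars:  # loop variable renamed from var_ per the nit above
        if is_optim_vars:
            self.register_optim_var(var.name)  # assumed counterpart
        else:
            self.register_aux_var(var.name)
```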

@@ -80,7 +80,7 @@ def ee_pose_err_fn(optim_vars, aux_vars):
     max_iterations=15,
     step_size=0.5,
 )
-theseus_optim = th.TheseusLayer(optimizer)
+theseus_optim = th.TheseusLayer(optimizer, vectorize=False)
Reviewer (Member): True?

@luisenp (Author): Oh, forgot about these two. Added.
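
Presumably, then, the documentation snippets ended up opting in explicitly, along these lines:

```python
theseus_optim = th.TheseusLayer(optimizer, vectorize=True)
```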

@luisenp merged commit 24e1295 into main on Jun 9, 2022
@luisenp deleted the lep.vectorize branch on Jun 9, 2022
@mhmukadam linked an issue on Jul 12, 2022 that may be closed by this pull request
suddhu pushed a commit to suddhu/theseus that referenced this pull request Jan 21, 2023
* Created a wrapper cost function class that combines the aux vars for a cost function and its weight.

* Disabled support for optimization variables in cost weights.

* Changed Objective to iterate over CFWrapper if available, and TheseusLayer to create them at init time.

* Added a Vectorizer class and moved CFWrappers there.

* Renamed vectorizer as Vectorize, added logic to replace Objective iterator, and added it to TheseusLayer.

* Added a CostFunctionSchema -> List[CostFunction] mapping to use for vectorization grouping.

* _CostFunctionWrapper is now meant to just store a cached value coming from vectorization. Otherwise it works just like the cost function it wraps.

* Added code to automatically compute shared vars in Vectorize.

* Changed vectorized costs construction to ensure that their weight is also a copy.

* Implemented main cost function vectorization logic.

* Fixed a bug that was causing detached gradients.

* Fixed invalid check in theseus end-to-end unit tests.

* Added unit test for schema and shared var computation.

* Added a test to check that computed vectorized errors are correct.

* Moved vectorization update call to base linearization class.

* Changed code to allow batch_size > 1 in shared variables.

* Fixed unit test and added call to Objective.update() in update_vectorization() if batch_size is None.

* Added new private iterator for vectorized costs.

* Replaced _register_vars_in_list with TheseusFunction.register_vars.

* Renamed vectorize_cost_fns kwarg as vectorize.

* Added license headers.

* Small refactor.

* Fixed a bug that was preventing vectorized costs from working with to(). End-to-end tests now run with vectorize=True.

* Renamed the private Objective cost function iterator to _get_iterator().

* Renamed kwarg in register_vars.

* Set vectorize=True for inverse kinematics and backward tests.

* Removed lingering comments.


Co-authored-by: Taosha Fan <6612911+fantaosha@users.noreply.github.com>
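
As a concrete, self-contained illustration of the design the commits above describe (plain PyTorch, not theseus API): structurally identical residuals can be evaluated in one stacked call instead of one call each, and the per-cost slices of the result are what _CostFunctionWrapper would cache.

```python
import torch

def residual(x, target):
    # Stand-in for a cost function error; same structure for every cost.
    return x - target

xs = [torch.randn(1, 3) for _ in range(100)]
targets = [torch.randn(1, 3) for _ in range(100)]

# Unvectorized: one call per cost function.
errs_loop = [residual(x, t) for x, t in zip(xs, targets)]

# Vectorized: a single call on stacked inputs, then split back per cost.
errs_batch = residual(torch.cat(xs), torch.cat(targets)).split(1)

assert all(torch.allclose(a, b) for a, b in zip(errs_loop, errs_batch))
```
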
Labels
CLA Signed (managed by the Facebook bot; authors must sign the CLA before a PR can be reviewed) · enhancement (New feature or request) · refactor (Refactor library components)
Development

Successfully merging this pull request may close these issues:

  • Automatic cost function grouping

6 participants