Added clearer explanation at the end of Tutorial 0 and fixed doc typos #2

Merged 2 commits on Dec 6, 2021
2 changes: 1 addition & 1 deletion examples/motion_planning_2d.py
@@ -25,7 +25,7 @@
# From the root project folder do:
# mkdir expts
# cd expts
-# wget https://dl.fbaipublicfiles.com/theseus/motion_planning_dataset.tar.gz
+# wget https://dl.fbaipublicfiles.com/theseus/motion_planning_data.tar.gz
# tar -xzvf motion_planning_data.tar.gz
# cd ..
# python examples/motion_planning_2d.py
5 changes: 5 additions & 0 deletions examples/tactile_pose_estimation.py
@@ -1,3 +1,8 @@
+# Copyright (c) Meta Platforms, Inc. and affiliates.
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+
import pathlib
import random

6 changes: 3 additions & 3 deletions tutorials/00_introduction.ipynb
@@ -537,11 +537,11 @@
"id": "d9cc32be",
"metadata": {},
"source": [
"The `TheseusLayer` allows for backpropagation, and is semantically similar to a layer in a PyTorch neural network. Backpropagating through the `TheseusLayer` allows for learning of any necessary quantities of the problem, e.g., `CostWeight`, `Variable`, etc. The following tutorials will illustrate several applications for learning with a `TheseusLayer`.\n",
"The `TheseusLayer` allows for backpropagation, and is semantically similar to a layer in a PyTorch neural network. Backpropagating through the `TheseusLayer` allows for learning of any necessary quantities of the problem, such as cost weights, initial values for the optimization variables, and other parameters for the optimization. The following tutorials will illustrate several applications for learning with a `TheseusLayer`.\n",
"\n",
"To distinguish between the optimization done by the Theseus optimizers, and those done outside the Theseus optimizers (e.g., by PyTorch's autograd during learning), we will refer to them as *inner loop optimization* and *outer loop optimization* respectively. Note that the inner loop optimization optimizes only the optimization variables, and the outer loop optimization can optimize only (selected) auxiliary variables provided to the PyTorch autograd optimizers. A call to `TheseusLayer` `forward()` performs only inner loop optimization; typically the PyTorch autograd learning steps will perform the outer loop optimizations. We will see examples of this in the following tutorials.\n",
"To distinguish between the optimization done by the Theseus optimizers, and those done outside the Theseus optimizers (e.g., by PyTorch's autograd during learning), we will refer to them as *inner loop optimization* and *outer loop optimization*, respectively. Note that the inner loop optimization optimizes only the optimization variables, and the outer loop optimization can optimize torch tensors associated with selected variables provided to the PyTorch autograd optimizers. A call to `TheseusLayer` `forward()` performs only inner loop optimization; typically the PyTorch autograd learning steps will perform the outer loop optimizations. We will see examples of this in the following tutorials.\n",
"\n",
"Any updates to the auxiliary variables during the learning loop are best done via the `forward` method of the `TheseusLayer`. While variables and objectives can be updated independently without going through the `TheseusLayer`, this may result in an error during optimization, depending on the states of the internal data structures. Therefore, we recommend that any updates during learning be performed only via the `TheseusLayer`."
"During the outer loop, we will commonly want to update Theseus variables before running inner loop optimization; for example, to set initial values for optimization variables, or to update auxiliary variables with tensors learned by the outer loop. We recommend that such updates to Theseus variables are done via `TheseusLayer.forward()`. While variables and objectives can be updated independently without going through `TheseusLayer.forward()`, following this convention makes it explicitly what the latest inputs to the `TheseusLayer` are, helping to avoid hidden errors and unwanted behavior. Therefore, we recommend that any updates during learning be performed only via the `TheseusLayer`."
]
}
],
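Below is a minimal sketch (not part of this PR) of the convention the revised tutorial text recommends: routing all variable updates through `TheseusLayer.forward()`, with PyTorch's autograd performing the outer loop optimization. It assumes the `theseus-ai` package's `Vector`, `Difference`, `ScaleCostWeight`, `GaussNewton`, and `TheseusLayer` APIs; the variable names `x` and `target`, the toy objective, and all hyperparameters are hypothetical.

```python
import torch
import theseus as th

# Inner-loop problem (assumed Difference cost): min_x || x - target ||^2.
x = th.Vector(1, name="x")                                   # optimization variable
target = th.Vector(tensor=torch.zeros(1, 1), name="target")  # auxiliary variable
objective = th.Objective()
objective.add(
    th.Difference(x, target, th.ScaleCostWeight(torch.tensor(1.0)), name="diff")
)
layer = th.TheseusLayer(th.GaussNewton(objective, max_iterations=10))

# Outer loop: learn the tensor behind `target` so that the inner-loop
# solution for `x` approaches 2.0. All updates go through layer.forward().
learned = torch.nn.Parameter(torch.ones(1, 1))
outer_optim = torch.optim.Adam([learned], lr=0.1)
for _ in range(20):
    outer_optim.zero_grad()
    # The initial value for `x` and the learned tensor for `target` are
    # passed to forward() rather than set on the variables directly.
    values, _ = layer.forward({"x": torch.zeros(1, 1), "target": learned})
    loss = (values["x"] - 2.0).pow(2).sum()  # toy outer-loop loss
    loss.backward()                          # backpropagates through the inner solve
    outer_optim.step()
```

Here the inner loop (Gauss-Newton) solves for `x` given the current `target`, and the outer loop (Adam) updates `learned`; because the inner solve is differentiated through by default, gradients flow from the outer-loop loss back to `learned`.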