
Rework adding costs #32

Open
HEmile opened this issue Mar 5, 2020 · 0 comments
Labels
enhancement New feature or request

Comments

HEmile (Owner) commented Mar 5, 2020

Now that wrappers are mostly outside of the API (apart from exceptional cases), having `@cost` unwrap a tensor and then add it as a cost seems unnecessary. The wrapper could be removed in favour of `storch.add_cost`, or alternatively `tensor.backward()` could implement what `storch.backward` currently does, making it more "PyTorchy".

I'm still not a big fan of that last idea, as it does not allow collecting the extra cost nodes and adding them together into one big loss, which is much more efficient. So I think `storch.add_cost` should be the API to settle on.

@HEmile HEmile added the enhancement New feature or request label Mar 5, 2020
@HEmile HEmile added this to To do in Stochastic computation graphs via automation Mar 5, 2020
HEmile added a commit that referenced this issue Mar 5, 2020
- Removed the cost wrapper, as it no longer had much of a logical use. See #32.
- Replaced `DeterministicTensor` with `CostTensor`, as its only functionality was to accommodate positive `is_cost` checks.
- Added an experimental `reduce` wrapper that can reduce a batched dimension without raising an error. It might be removed, as the use case did not actually need it.
- Reworked `storch.nn.b_binary_cross_entropy` to better accommodate the current API.
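For reference, the quantity `b_binary_cross_entropy` computes is the standard binary cross entropy, applied over a batch. A minimal plain-Python sketch of that formula (not storch's batched, stochastic-tensor implementation):

```python
import math

def binary_cross_entropy(p, y, eps=1e-12):
    """Standard BCE for one predicted probability p and target y in {0, 1}."""
    p = min(max(p, eps), 1 - eps)  # clamp for numerical stability
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def b_binary_cross_entropy(probs, targets):
    """Mean elementwise BCE over a batch of (probability, target) pairs.

    Hypothetical helper for illustration; the real storch version operates
    on batched stochastic tensors rather than Python lists.
    """
    return sum(binary_cross_entropy(p, y)
               for p, y in zip(probs, targets)) / len(probs)

loss = b_binary_cross_entropy([0.9, 0.1], [1, 0])
```
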