
SameDiff: training nets with mixed precision variables fails on updater state view array type #6992

AlexDBlack opened this issue Jan 14, 2019 · 1 comment



Suppose I have two weight arrays: one double, one float.
SameDiff can handle this (with appropriate casts), but it currently creates a single INDArray for the updater state. Thus, when we try to update both parameters, one will fail due to mixed datatypes in the updater ops (double/float or float/double).

Two possibilities are available here:

  1. Split updater state by variable (updater state datatype matches variable datatype)
  2. Keep single updater state array, but add casting to the updaters to handle various input datatypes
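To make option 1 concrete, here is a minimal plain-Java sketch (not ND4J API; the class and method names are hypothetical) of per-variable updater state whose element type always matches the parameter's type, so no cross-datatype op ever occurs:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of option 1: one updater-state buffer per variable,
// with the state's datatype mirroring the variable's datatype.
// All names here (UpdaterStateDemo, stateFor, sgdMomentum) are hypothetical.
public class UpdaterStateDemo {
    // State keyed by variable name; double[] for double params, float[] for float params.
    private final Map<String, Object> state = new HashMap<>();

    private double[] stateFor(String name, double[] param) {
        return (double[]) state.computeIfAbsent(name, k -> new double[param.length]);
    }

    private float[] stateFor(String name, float[] param) {
        return (float[]) state.computeIfAbsent(name, k -> new float[param.length]);
    }

    // SGD with momentum: the state holds the velocity, same dtype as the parameter.
    public void sgdMomentum(String name, double[] param, double[] grad, double lr, double momentum) {
        double[] v = stateFor(name, param);
        for (int i = 0; i < param.length; i++) {
            v[i] = momentum * v[i] + grad[i];
            param[i] -= lr * v[i];
        }
    }

    public void sgdMomentum(String name, float[] param, float[] grad, float lr, float momentum) {
        float[] v = stateFor(name, param);
        for (int i = 0; i < param.length; i++) {
            v[i] = momentum * v[i] + grad[i];
            param[i] -= lr * v[i];
        }
    }

    public static void main(String[] args) {
        UpdaterStateDemo u = new UpdaterStateDemo();
        double[] wDouble = {1.0, 2.0};
        float[] wFloat = {1.0f, 2.0f};
        // Each parameter is updated against state of its own dtype.
        u.sgdMomentum("wDouble", wDouble, new double[]{0.5, 0.5}, 0.1, 0.9);
        u.sgdMomentum("wFloat", wFloat, new float[]{0.5f, 0.5f}, 0.1f, 0.9f);
        System.out.println(wDouble[0] + " " + wFloat[0]);
    }
}
```

Option 2 would instead keep the single flat state array and insert a cast before and after each updater op; option 1 avoids those casts at the cost of fragmenting the view array.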

@AlexDBlack AlexDBlack added the SameDiff label Jan 14, 2019

AlexDBlack added a commit that referenced this issue May 29, 2019
Fixes and SameDiff functionality (#7807)
* #6992 SameDiff mixed precision training support

* Placeholder shape validation

* Checkpoint listener

* SameDiff checkpoint listener

* SameDiff: Remove no longer required trainable params config from TrainingConfig

* SameDiff: add name scopes

* SameDiff name scopes - javadoc and tests

* #7802 Evaluation class - report single class not macro avg in stats() for binary case

* #7804 Arbiter - update score functions to use ND4J evaluation metric enums

* SameDiff flatbuffers export: don't export arrays for array type variables (not required)


@AlexDBlack AlexDBlack closed this Jun 3, 2019
