Suppose I have 2 weight arrays: one double, one float.
SameDiff can handle this (with appropriate casts), but currently SameDiff creates a single INDArray for updater state. Thus when we try to update both parameters, one will fail due to mixed datatypes in ops (double/float or float/double).
Two possibilities are available here:
* Split updater state by variable (updater state datatype matches variable datatype)
* Keep a single updater state array, but add casting to the updaters to handle various input datatypes
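The first option can be sketched in plain Java (no ND4J dependency; class and method names here are illustrative, not the actual SameDiff API). Each variable's momentum state is stored in that variable's own datatype, so the update op never mixes float and double operands:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of option 1: per-variable updater state whose datatype
// matches the variable's datatype, instead of one flat array for all parameters.
public class PerVariableUpdaterState {

    // Momentum state kept separately per variable, in the variable's own dtype
    static final Map<String, float[]> floatState = new HashMap<>();
    static final Map<String, double[]> doubleState = new HashMap<>();

    // SGD-with-momentum step for a float parameter: state is float[]
    static void update(String name, float[] param, float[] grad, float lr, float momentum) {
        float[] v = floatState.computeIfAbsent(name, k -> new float[param.length]);
        for (int i = 0; i < param.length; i++) {
            v[i] = momentum * v[i] + grad[i];
            param[i] -= lr * v[i];
        }
    }

    // Same step for a double parameter: state is double[], so no
    // cross-datatype op ever occurs
    static void update(String name, double[] param, double[] grad, double lr, double momentum) {
        double[] v = doubleState.computeIfAbsent(name, k -> new double[param.length]);
        for (int i = 0; i < param.length; i++) {
            v[i] = momentum * v[i] + grad[i];
            param[i] -= lr * v[i];
        }
    }

    public static void main(String[] args) {
        float[] wFloat = {1.0f};
        float[] gFloat = {0.5f};
        double[] wDouble = {1.0};
        double[] gDouble = {0.5};

        // Both parameters update successfully, each in its own dtype
        update("wFloat", wFloat, gFloat, 0.1f, 0.9f);
        update("wDouble", wDouble, gDouble, 0.1, 0.9);
        System.out.println(wFloat[0] + " " + wDouble[0]);
    }
}
```

Option 2 would instead keep one state array (say, in the widest dtype present) and cast each parameter's gradient into that dtype before the update and back afterward; option 1 avoids those casts at the cost of fragmenting the state into one array per variable.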
* #6992 SameDiff mixed precision training support
* Placeholder shape validation
* Checkpoint listener
* SameDiff checkpoint listener
* SameDiff: Remove no longer required trainable params config from TrainingConfig
* SameDiff: add name scopes
* SameDiff name scopes - javadoc and tests
* #7802 Evaluation class - report single class not macro avg in stats() for binary case
* #7804 Arbiter - update score functions to use ND4J evaluation metric enums
* SameDiff flatbuffers export: don't export arrays for array type variables (not required)