
ActivationListener isn't called on execBackwards #8319

Closed
orausch opened this issue Oct 28, 2019 · 2 comments · Fixed by KonduitAI/deeplearning4j#21
orausch commented Oct 28, 2019

Issue Description

I want to calculate the loss and gradients for a batch at the same time (related: #8318) in my training loop. @AlexDBlack suggested adding an activation listener to the variable, but this doesn't seem to work. The following example does not print the expected output from the listener:
https://gist.github.com/orausch/f78035f42940e5f614c04f32cd53a271
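For context, a listener registration along these lines would be expected to print the loss during `execBackwards`. This is a sketch against the beta5-era SameDiff listener API: the class and method names (`BaseListener`, `activationAvailable`, `setListeners`, `execBackwards`) are assumed from that API, and the graph construction is elided.

```java
import java.util.HashMap;
import java.util.Map;

import org.nd4j.autodiff.listeners.At;
import org.nd4j.autodiff.listeners.BaseListener;
import org.nd4j.autodiff.listeners.Operation;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.autodiff.samediff.internal.SameDiffOp;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.api.MultiDataSet;

public class LossListenerSketch {
    public static void main(String[] args) {
        SameDiff sd = SameDiff.create();
        // ... graph construction elided; assume it defines a variable named "loss" ...

        // Register a listener that reports the "loss" activation whenever it is computed.
        sd.setListeners(new BaseListener() {
            @Override
            public boolean isActive(Operation operation) {
                return true;  // stay active for all operation types, including backprop
            }

            @Override
            public void activationAvailable(SameDiff sd, At at, MultiDataSet batch,
                                            SameDiffOp op, String varName, INDArray activation) {
                if ("loss".equals(varName)) {
                    System.out.println("loss = " + activation);
                }
            }
        });

        Map<String, INDArray> placeholders = new HashMap<>();
        // ... fill placeholders with the batch ...
        // sd.execBackwards(placeholders);  // in beta5 this did not trigger the listener (this issue)
    }
}
```

This is the behavior the reporter expected; per the issue, in beta5 the listener was not invoked on the `execBackwards` path.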

Version Information

beta5

orausch (Author) commented Oct 28, 2019

As a user, I would also expect sd.getArrForVarName("loss") to work, but it returns null.

@AlexDBlack AlexDBlack added this to the 1.0.0-beta6 milestone Oct 31, 2019
@AlexDBlack AlexDBlack self-assigned this Nov 1, 2019
AlexDBlack added a commit to KonduitAI/deeplearning4j that referenced this issue Nov 1, 2019
… called

Signed-off-by: AlexDBlack <blacka101@gmail.com>
AlexDBlack (Contributor) commented Nov 1, 2019

Thanks for reporting this; it has been fixed here: KonduitAI#21.
It will be merged back to Eclipse master soon.

AlexDBlack added a commit to KonduitAI/deeplearning4j that referenced this issue Nov 2, 2019

* MKLDNN LSTM forward implementation (disabled pending eclipse#8331)
* eclipse#8318 add SameDiff.calculateGradientsAndOutputs
* Disable mkldnn backprop for now - pending fix, issue eclipse#8335
* eclipse#8337 Fix CudaExecutioner unnecessary result array allocation/replacement
* Small FlatBuffers serde fix, UInt8
* eclipse#8135 ImagePreProcessingScaler - add segmentation support
* eclipse#8319 Ensure listeners are called when they are supposed to be called
* eclipse#8214 UNet (non-pretrained) last conv layer kernel size fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>