
fc_out, relu, top_k_op #2

Open
WuJunhui opened this issue Jun 6, 2018 · 2 comments

Comments

WuJunhui commented Jun 6, 2018

https://github.com/USTC-Video-Understanding/I3D_Finetune/blob/master/Demo_Transfer_rgb.py#L139
Hi,
I think the last fc layer should not use an activation unit.
If the output of the fc layer is all negative, it becomes all zeros after the ReLU unit.
In that case, top_k_op = tf.nn.in_top_k(fc_out, label_holder, 1) will always return True.
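
For reference, a minimal sketch (TensorFlow 1.x; only the names fc_out and label_holder follow the issue, the shapes and class count are made up) of why all-zero logits make the top-1 check pass for every label: tf.nn.in_top_k counts classes that tie across the top-k boundary as being in the top k, and with all-zero logits every class ties for the top spot.

```python
import numpy as np
import tensorflow as tf

# Placeholders mimic the names used in the issue; 5 classes are just for illustration.
fc_out = tf.placeholder(tf.float32, [None, 5])    # logits after the final ReLU
label_holder = tf.placeholder(tf.int32, [None])   # ground-truth labels
top_k_op = tf.nn.in_top_k(fc_out, label_holder, 1)

with tf.Session() as sess:
    # All negative pre-activations become exactly zero after ReLU.
    zero_logits = np.zeros((2, 5), dtype=np.float32)
    labels = np.array([3, 1], dtype=np.int32)
    print(sess.run(top_k_op, {fc_out: zero_logits, label_holder: labels}))
    # -> [ True  True]  for any labels, so the measured accuracy is inflated.
```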

@Rhythmblue

I think you are right.
When I train with this code, the results are sometimes all the same.
You can just remove the activation parameter.
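
The exact change depends on how the fc layer is built at Demo_Transfer_rgb.py#L139; a hedged sketch, assuming a tf.layers.dense head (the layer wrapper, feature size, and class count here are assumptions, not the repo's actual code):

```python
import tensorflow as tf

def classification_head(features, num_classes):
    """Final fc layer with no activation, so fc_out are raw logits."""
    # Before (hypothetical): activation=tf.nn.relu clipped negative logits to zero.
    # After: keep the layer linear; cross-entropy and in_top_k consume raw logits.
    return tf.layers.dense(features, num_classes, activation=None)

features = tf.placeholder(tf.float32, [None, 1024])  # pooled I3D features (shape assumed)
fc_out = classification_head(features, num_classes=101)
```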

vra added a commit that referenced this issue Jun 10, 2018
vra (Contributor) commented Jun 10, 2018

Hi @WuJunhui ,
Thanks for your helpful advice; we have fixed this issue in the latest version of this repo. Please run git pull to download the newest code.

vra added a commit that referenced this issue Jun 15, 2018