
[Keras] fix weight shape in dilated conv #1715

Merged
4 commits merged into apache:master on Sep 15, 2018

Conversation

@Huyuwei (Contributor) commented on Sep 13, 2018

The weight shape of a dilated conv passed from the frontend to the nnvm compiler should be the original h and w, not including the inserted zeros.
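A minimal illustrative sketch of the distinction (not the frontend code itself; assumes an OIHW weight layout and example sizes):

    # Illustrative only: original kernel size vs. dilated (effective) extent.
    kernel_h, kernel_w = 3, 3        # spatial size of the stored weights
    dilation_h, dilation_w = 2, 2    # dilation rate of the conv layer
    in_channels, out_channels = 16, 32

    # Effective extent after conceptually inserting zeros; used only for
    # padding / output-size arithmetic, never for the weight tensor itself.
    dilated_h = (kernel_h - 1) * dilation_h + 1   # = 5
    dilated_w = (kernel_w - 1) * dilation_w + 1   # = 5

    # The weight shape handed to the compiler must describe the stored
    # weights (original h and w), not the dilated extent.
    weight_shape = (out_channels, in_channels, kernel_h, kernel_w)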

@Huyuwei (Contributor, Author) commented on Sep 13, 2018

Not ready for merge; one issue left to fix.

@Huyuwei (Contributor, Author) commented on Sep 13, 2018

Ready to merge now.

@tqchen (Member) commented on Sep 14, 2018

Please rebase against master. @kazum @PariksheetPinjari909 please review.

@kazum (Contributor) left a review comment

Added a minor comment. Other parts look good to me.

    - alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1.6732
    - gamma = keras_layer.gamma if hasattr(keras_layer, "gamma") else 1.0507
    + alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1.67326324235437728481704299
    + gamma = keras_layer.gamma if hasattr(keras_layer, "gamma") else 1.05070098735548049341933498
@kazum (Contributor) commented on the diff:

Let's use the exact same values as the Keras project's ones.

https://github.com/keras-team/keras/blob/master/keras/activations.py#L80

    alpha = 1.6732632423543772848170429916717
    scale = 1.0507009873554804934193349852946
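For reference, these constants parameterize the SELU activation, which is scale * x for x > 0 and scale * alpha * (exp(x) - 1) otherwise; a minimal NumPy sketch, illustrative only and not the frontend code:

    import numpy as np

    ALPHA = 1.6732632423543772848170429916717
    SCALE = 1.0507009873554804934193349852946

    def selu(x):
        # scale * x for positive inputs, scale * alpha * (exp(x) - 1) otherwise
        return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1))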

@Huyuwei (Contributor, Author) replied on Sep 14, 2018

Updated. BTW, I previously used shorter alpha and scale values so as not to exceed the 100-characters-per-line lint limit.

@tqchen (Member) commented on Sep 15, 2018

@kazum (Contributor) left a review comment

LGTM, thanks!

@tqchen merged commit 8f5d3bd into apache:master on Sep 15, 2018
@tqchen (Member) commented on Sep 15, 2018

Thanks @Huyuwei @kazum, this is now merged.
