This repository has been archived by the owner on Nov 22, 2022. It is now read-only.

implemented gelu activations #829

Closed

Conversation

shreydesai
Contributor

Summary: Implements Gaussian Error Linear Units (GELUs) as an activation for the Deep CNN representation. Also creates an interface for leveraging different types of activation functions.

Differential Revision: D16462672
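
For context, a minimal sketch of what this change describes: a GELU module plus a small factory that selects an activation by name. This is illustrative, not the PR's actual code; the `GELU` class and `get_activation` helper are hypothetical names, and the forward pass uses the tanh approximation from Hendrycks & Gimpel (2016).

```python
import math

import torch
import torch.nn as nn


class GELU(nn.Module):
    """Gaussian Error Linear Unit (Hendrycks & Gimpel, 2016)."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
        return 0.5 * x * (1.0 + torch.tanh(
            math.sqrt(2.0 / math.pi) * (x + 0.044715 * x.pow(3))
        ))


def get_activation(name: str) -> nn.Module:
    """Hypothetical interface: map a config string to an activation module."""
    activations = {
        "relu": nn.ReLU(),
        "tanh": nn.Tanh(),
        "gelu": GELU(),
    }
    if name not in activations:
        raise ValueError(f"unknown activation: {name}")
    return activations[name]
```

A CNN layer would then pick its nonlinearity from config rather than hard-coding ReLU, e.g. `act = get_activation("gelu"); y = act(torch.randn(2, 8))`.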

@facebook-github-bot added the CLA Signed label on Jul 25, 2019.
@facebook-github-bot
Contributor

This pull request has been merged in 3a3aea4.
