
Maxout-PyTorch

This is a PyTorch implementation of the Maxout layer from the paper https://arxiv.org/pdf/1302.4389.pdf. In the forward pass, maxout is computed over the feature maps of a CNN. In the backward pass, the derivative of the error with respect to the input is propagated only through the cells of the feature-map array that were activated in the forward pass, which is exactly the derivative of Maxout. To verify correctness, the results were compared against Maxout in Theano (there called the feature pool layer) and matched. To adapt the layer to your needs, change the max_out parameter in the forward function. The code is efficient and fast.
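
The sketch below illustrates the general idea under the same assumptions: a custom `torch.autograd.Function` whose forward takes the maximum over groups of `max_out` consecutive feature maps, and whose backward routes the gradient only to the cells that won the max. The names (`MaxoutFunction`, `max_out`, the consecutive-channel grouping) are illustrative and are not taken from the repository's code.

```python
import torch


class MaxoutFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, max_out=4):
        # x: (N, C, H, W), with C divisible by max_out.
        # Group the channels and keep the maximum of each group.
        n, c, h, w = x.shape
        grouped = x.view(n, c // max_out, max_out, h, w)
        values, indices = grouped.max(dim=2)
        ctx.save_for_backward(indices)
        ctx.input_shape = x.shape
        ctx.max_out = max_out
        return values

    @staticmethod
    def backward(ctx, grad_output):
        # Propagate the gradient only through the cells that were
        # activated (i.e. won the max) in the forward pass.
        (indices,) = ctx.saved_tensors
        n, c, h, w = ctx.input_shape
        grad_grouped = torch.zeros(
            n, c // ctx.max_out, ctx.max_out, h, w,
            dtype=grad_output.dtype, device=grad_output.device,
        )
        grad_grouped.scatter_(2, indices.unsqueeze(2), grad_output.unsqueeze(2))
        # No gradient for the max_out parameter.
        return grad_grouped.view(n, c, h, w), None


# Usage: maxout over groups of 4 feature maps.
x = torch.randn(8, 16, 32, 32, requires_grad=True)
y = MaxoutFunction.apply(x, 4)   # -> shape (8, 4, 32, 32)
y.sum().backward()
```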
