
CapsNet-PyTorch

A PyTorch implementation of CapsNet based on Geoffrey Hinton's paper Dynamic Routing Between Capsules.

Figure: capsule vs. traditional neuron comparison (capsVSneuron). This figure is from CapsNet-Tensorflow.

This implementation is a fork of motokimura's implementation, revised in the following ways:

  • Fix softmax dimension in routing.
  • Initialize W in DigitCaps with uniform distribution.
  • Use Conv2D(in_channels, out_capsules * out_capsule_dim) as capsule layers in PrimaryCaps, for efficient computation.
  • Set initial learning rate to 0.001.
  • Mask with true label in reconstruction.
  • Update b_ij with agreement for each sample.
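Several of the fixes above concern the dynamic routing loop. The sketch below illustrates them under stated assumptions; it is not the repository's actual code, and the tensor layout `(batch, in_capsules, out_capsules, out_dim)` is an assumption for illustration:

```python
import torch
import torch.nn.functional as F

def squash(s, dim=-1):
    # Squashing non-linearity from the paper: shrinks short vectors
    # toward zero and scales long vectors toward unit length.
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + 1e-8)

def dynamic_routing(u_hat, num_iterations=3):
    # u_hat: prediction vectors, assumed shape
    # (batch, in_capsules, out_capsules, out_dim).
    batch, in_caps, out_caps, out_dim = u_hat.shape
    # Routing logits b are kept per sample, so agreement updates are
    # not shared across the batch (one of the fixes listed above).
    b = torch.zeros(batch, in_caps, out_caps, device=u_hat.device)
    v = None
    for _ in range(num_iterations):
        # Softmax over the *output*-capsule dimension, so each input
        # capsule's coupling coefficients sum to 1 (the softmax-dim fix).
        c = F.softmax(b, dim=2)
        s = (c.unsqueeze(-1) * u_hat).sum(dim=1)   # (batch, out_caps, out_dim)
        v = squash(s)
        # Agreement between predictions and outputs updates b per sample.
        b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)
    return v
```

A uniformly initialized weight tensor `W` (also listed above) would produce `u_hat` from the lower-level capsule outputs before this loop runs.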

Requirements

Usage

Step 1. Clone this repository

$ git clone https://github.com/spikefairway/CapsNet-PyTorch.git
$ cd CapsNet-PyTorch

Step 2. Start the training

$ python main.py

Step 3. Check training status and validation accuracy from TensorBoard

# In another terminal window:
$ cd CapsNet-PyTorch
$ tensorboard --logdir ./runs

# Then open http://localhost:6006 in your browser; you will see
# something like the screenshots in the `Results` section.

Some training hyperparameters can be specified via the command-line options of main.py.

By default, the batch size is 128 for both training and validation, and training runs for 10 epochs. The learning rate of the Adam optimizer is set to 0.001 and is exponentially decayed by a factor of 0.9 after every epoch.
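The default schedule described above corresponds to the following PyTorch setup. This is a minimal sketch (with a placeholder model), not the repository's actual training code:

```python
import torch

# Placeholder model for illustration only.
model = torch.nn.Linear(10, 2)

# Adam at lr=0.001, decayed by a factor of 0.9 after every epoch.
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

for epoch in range(10):
    # ... one epoch of training would go here ...
    optimizer.step()   # dummy step standing in for the training loop
    scheduler.step()   # lr after epoch e is 0.001 * 0.9 ** (e + 1)
```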

For more details, type python main.py --help.

Results

Results obtained with the default training settings are shown below.

Train & test loss

License

MIT License

References
