SKNet-tensorflow


Simple tensorflow implementation of Selective Kernel Networks

If you want to see the original author's code, please refer to their GitHub repository.

Version 1.0 : SKNet block without groups and without BN params.

Version 1.1 : Added BN params and removed the BN ops from the fc layer.

Version 1.2 : Coming soon.

Requirements :

  • Python >= 3.6
  • Tflearn >= 0.3.2
  • Tensorflow >= 1.9.0

SKNet Block Structure :
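The block follows the split / fuse / select pattern from the Selective Kernel Networks paper. Below is a minimal TF-slim sketch of that structure for orientation only; the function name sk_block and its arguments are illustrative assumptions and do not correspond to this repository's SKNet function.

import tensorflow as tf
import tensorflow.contrib.slim as slim

def sk_block(inputs, channels, M=2, r=16, is_training=True, scope='sk_block'):
    # Hypothetical sketch of split / fuse / select; not the repository's SKNet.
    with tf.variable_scope(scope):
        # Split: M branches with growing kernel sizes (3x3, 5x5, ...).
        branches = []
        for i in range(M):
            k = 3 + 2 * i
            branches.append(slim.conv2d(
                inputs, channels, [k, k],
                normalizer_fn=slim.batch_norm,
                normalizer_params={'is_training': is_training},
                scope='split_%d' % i))

        # Fuse: sum the branches, squeeze spatial dims, reduce channels.
        U = tf.add_n(branches)                       # [N, H, W, C]
        s = tf.reduce_mean(U, axis=[1, 2])           # global average pooling -> [N, C]
        d = max(channels // r, 32)
        z = slim.fully_connected(s, d, scope='fuse_fc')

        # Select: per-branch logits, softmax over the branch axis, weighted sum.
        logits = slim.fully_connected(z, M * channels, activation_fn=None,
                                      scope='select_fc')
        logits = tf.reshape(logits, [-1, 1, 1, M, channels])
        attention = tf.nn.softmax(logits, axis=3)    # softmax across the M branches
        V = tf.stack(branches, axis=3)               # [N, H, W, M, C]
        return tf.reduce_sum(V * attention, axis=3)  # [N, H, W, C]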

Usage :

from SKNet import SKNet
import tensorflow.contrib.slim as slim

...
conv1 = slim.conv2d(inputs, 64, [3, 3], scope='conv1')  # ordinary 3x3 convolution
conv2 = SKNet(conv1, 3, 2, is_training=True)            # selective kernel block
conv3 = slim.conv2d(conv2, 3, [3, 3], scope='out')      # output conv takes the SK output
...

See the training code in this repository for a complete example.

Be careful!

At training time, make sure the moving_mean and moving_var of the batch-norm layers are updated:

update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    # Ensures that update_ops run before the train_step.
    train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)
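
For reference, a minimal sketch of driving train_step in a session; x, y, next_batch, and num_steps are assumed names from a hypothetical training script, not symbols defined in this repository.

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(num_steps):
        batch_x, batch_y = next_batch()  # hypothetical data-loading helper
        # Running train_step also runs update_ops through the control dependency,
        # so moving_mean / moving_var of every batch-norm layer are refreshed.
        sess.run(train_step, feed_dict={x: batch_x, y: batch_y})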

Contact :

Improvements and bug fixes are welcome.

Please open a pull request or an issue.

License :

The MIT License
