Decoupled Networks

By Weiyang Liu*, Zhen Liu*, Zhiding Yu, Bo Dai, Rongmei Lin, Yisen Wang, James Rehg, Le Song

(* equal contribution)


Decoupled Networks is released under the MIT License (refer to the LICENSE file for details).


  • Examples for ImageNet-2012
  • Examples for CIFAR-100


  1. Introduction
  2. Short Video Introduction
  3. Citation
  4. Requirements
  5. Usage


Introduction

Inner product-based convolution has been a central component of convolutional neural networks (CNNs) and the key to learning visual representations. Inspired by the observation that CNN-learned features are naturally decoupled, with the norm of a feature corresponding to the intra-class variation and the angle corresponding to the semantic difference, we propose a generic decoupled learning framework which models the intra-class variation and semantic difference independently.

Specifically, we first reparametrize the inner product to a decoupled form and then generalize it to the decoupled convolution operator which serves as the building block of our decoupled networks. We present several effective instances of the decoupled convolution operator. Each decoupled operator is well motivated and has an intuitive geometric interpretation. Based on these decoupled operators, we further propose to directly learn the operator from data.
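The reparametrization above can be sketched in a few lines of NumPy. This is a minimal illustration, not the repository's actual API: the helper names and the `alpha` hyperparameter are assumptions, while the TanhConv magnitude function and cosine angular activation are two of the instances described in the paper.

```python
import numpy as np

def tanh_magnitude(w_norm, x_norm, alpha=1.0):
    """TanhConv magnitude function (illustrative): h = alpha * tanh(||x|| / alpha).
    Bounds the response in the input norm, modeling intra-class variation."""
    return alpha * np.tanh(x_norm / alpha)

def cosine_angular(theta):
    """Cosine angular activation: g(theta) = cos(theta),
    encoding the semantic (angular) difference."""
    return np.cos(theta)

def decoupled_product(w, x):
    """Decoupled form f(w, x) = h(||w||, ||x||) * g(theta),
    replacing the standard inner product <w, x> = ||w|| ||x|| cos(theta)."""
    w_norm = np.linalg.norm(w)
    x_norm = np.linalg.norm(x)
    cos_theta = np.dot(w, x) / (w_norm * x_norm + 1e-12)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    return tanh_magnitude(w_norm, x_norm) * cosine_angular(theta)
```

Sliding this operator over input patches, in place of the usual dot product, yields the decoupled convolution that serves as the building block of the networks.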

The latest version of our paper is available on arXiv. Our work is largely inspired and motivated by the observation that CNN-learned features are naturally decoupled: the norm of a feature corresponds to the intra-class variation, while its angle corresponds to the semantic difference.

The central idea of decoupled networks is the decoupled convolution operator, which replaces all of the original convolution operators.

Short Video Introduction

The following is a short video introduction by Zhen Liu.



Citation

If you find our work useful in your research, please consider citing:

    @inproceedings{liu2018decoupled,
    author = {Liu, Weiyang and Liu, Zhen and Yu, Zhiding and Dai, Bo and Lin, Rongmei and Wang, Yisen and Rehg, James M. and Song, Le},
    title = {Decoupled Networks},
    booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
    year = {2018}
    }

Requirements

  1. Python 2.7
  2. TensorFlow (tested on version 1.0.1)
  3. numpy


Usage

Part 1: Clone the repository

  • Clone the repository.

    git clone

Part 2: CIFAR-100

  • Training DCNets with TanhConv + Cosine on CIFAR-100:

    cd $DCNET_ROOT/dcnet_cifar100/tanh_cos
  • To train other models, replace the model name (tanh_cos) in the path above with the desired one.

Part 3: ImageNet-2012

  • Download ImageNet-2012 dataset and process the dataset with TensorFlow-Slim.

  • We provide one example with a modified ResNet-18 for ImageNet-2012. This implementation uses the TanhConv magnitude function with the cosine angular activation. The magnitude and angular functions can be replaced with any of the other choices mentioned in the paper, or with customized functions.

    cd $DCNET_ROOT/dcnet_imagenet
  • We provide our result for this implementation, which matches the 88.9% result reported in the paper.
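To make the "replace the magnitude and angular functions" idea concrete, here is a hypothetical NumPy sketch of how the two families of functions can be swapped independently. The dictionary keys and function names are illustrative, not the repository's API; the linear, SphereConv, and TanhConv magnitude functions and the linear, cosine, and square-cosine angular activations are instances described in the paper.

```python
import numpy as np

# Magnitude functions h(||x||): model intra-class variation.
MAGNITUDE_FNS = {
    'linear': lambda x_norm, alpha=1.0: x_norm,                          # standard norm
    'sphere': lambda x_norm, alpha=1.0: alpha,                           # SphereConv: constant
    'tanh':   lambda x_norm, alpha=1.0: alpha * np.tanh(x_norm / alpha), # TanhConv: bounded
}

# Angular activations g(theta): model semantic difference.
ANGULAR_FNS = {
    'linear': lambda theta: 1.0 - 2.0 * theta / np.pi,
    'cos':    lambda theta: np.cos(theta),
    'sqcos':  lambda theta: np.sign(np.cos(theta)) * np.cos(theta) ** 2,
}

def decoupled_response(w, x, magnitude='tanh', angular='cos'):
    """Compute h(||x||) * g(theta) for any pairing of the functions above."""
    cos_theta = np.dot(w, x) / (np.linalg.norm(w) * np.linalg.norm(x) + 1e-12)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    return MAGNITUDE_FNS[magnitude](np.linalg.norm(x)) * ANGULAR_FNS[angular](theta)
```

For example, `decoupled_response(w, x, 'tanh', 'cos')` corresponds to the TanhConv + Cosine combination used in the provided ImageNet example, and swapping either keyword selects a different operator without touching the rest of the pipeline.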

