
Few-Shot Learning with Global Class Representations

Created by Tiange Luo*, Aoxue Li*, Tao Xiang, Weiran Huang and Liwei Wang



This is the repository for our ICCV 2019 paper (arXiv: 1908.05257).

In this paper, we propose to tackle the challenging few-shot learning (FSL) problem by learning global class representations from both base and novel class training samples. In each training episode, an episodic class mean computed from a support set is registered with the global representation via a registration module, producing a registered global class representation that is used to compute the classification loss on a query set. Although our method follows a similar episodic training pipeline to existing meta-learning approaches, it differs significantly in that novel class training samples are involved in training from the beginning. To compensate for the scarcity of novel class training samples, an effective sample synthesis strategy is developed to avoid overfitting. Importantly, thanks to joint base-novel class training, our approach extends easily to a more practical yet challenging FSL setting, generalized FSL, where the label space of the test data covers both base and novel classes. Extensive experiments show that our approach is effective in both FSL settings.
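The episodic step above can be sketched in a few lines: average the support embeddings into an episodic class mean, then register that mean with the bank of global class representations. The similarity-weighted mixing below is a minimal illustrative stand-in; the paper's actual registration module, feature extractor, and hyperparameters are not specified in this README, so treat every function here as a hypothetical sketch rather than the released implementation.

```python
import numpy as np

def episodic_class_mean(support_set):
    """Episodic representation of one class: the mean of its support embeddings.

    support_set: array of shape (n_shot, feat_dim).
    """
    return support_set.mean(axis=0)

def register(episodic_mean, global_reps, temperature=1.0):
    """Hypothetical registration: soft-assign the episodic mean to the global
    class representations via cosine-similarity attention, then return the
    attention-weighted mixture as the registered global class representation.
    (The paper's registration module may differ; this is an assumption.)
    """
    # L2-normalize both sides so the dot product is a cosine similarity.
    g = global_reps / np.linalg.norm(global_reps, axis=1, keepdims=True)
    e = episodic_mean / np.linalg.norm(episodic_mean)
    sims = g @ e                      # (num_classes,) similarities
    weights = np.exp(sims / temperature)
    weights /= weights.sum()          # softmax over global class slots
    # Registered representation: convex combination of global representations.
    return weights @ global_reps

# Toy usage with random embeddings (5-shot, 8-dim features, 10 classes).
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 8))
global_reps = rng.normal(size=(10, 8))
registered = register(episodic_class_mean(support), global_reps)
```

The registered representation would then score query-set embeddings (e.g. by cosine similarity) to produce the classification loss that trains both the feature extractor and the global representations jointly.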

For more details of our framework, please refer to our paper or project website.


If you find our work useful in your research, please consider citing:

    @article{luo2019few,
      title={Few-Shot Learning with Global Class Representations},
      author={Luo, Tiange and Li, Aoxue and Xiang, Tao and Huang, Weiran and Wang, Liwei},
      journal={arXiv preprint arXiv:1908.05257},
      year={2019}
    }

About this repository

Due to company and patent issues, the authors are working toward releasing the source code. We will do our best to release at least the core code of the proposed module.
