Transfer Learning or Self-supervised Learning? A Tale of Two Pretraining Paradigms

PyTorch code and models for the paper:

Transfer Learning or Self-supervised Learning? A Tale of Two Pretraining Paradigms

Xingyi Yang*, Xuehai He*, Yuxiao Liang, Yue Yang, Shanghang Zhang, Pengtao Xie (* equal contribution)

This repository contains the code and pre-trained models used in the paper, along with two demos:

  1. Code for a comprehensive study of SSL versus TL, examining which paradigm works better under:
    • domain difference between source and target tasks,
    • the amount of pretraining data,
    • class imbalance in the source data,
    • usage of target data for additional pretraining.
  2. Code to calculate the domain distance between the source and target domains in terms of (1) visual distance and (2) class similarity; a sketch follows this list.
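
As a minimal sketch of the visual-distance side of item 2: one common approach is a Frechet-style distance between features extracted by an ImageNet-pretrained encoder from the source and target images. This is an illustrative assumption, not the repository's exact implementation (which lives in domain/); the encoder choice and the metric here are placeholders.

```python
# Sketch: visual domain distance as a Frechet distance between Gaussians
# fitted to pretrained-encoder features of the two domains.
# NOT the paper's exact metric (see domain/); encoder and metric are assumptions.
import numpy as np
import torch
import torchvision
from scipy import linalg

@torch.no_grad()
def extract_features(loader, device="cuda"):
    # ImageNet-pretrained ResNet-50 with the classification head removed.
    model = torchvision.models.resnet50(pretrained=True)
    model.fc = torch.nn.Identity()
    model.eval().to(device)
    # loader is assumed to yield (image_batch, label) pairs.
    feats = [model(x.to(device)).cpu().numpy() for x, _ in loader]
    return np.concatenate(feats)

def frechet_distance(feat_a, feat_b):
    # Frechet distance between Gaussians fitted to each feature set.
    mu_a, mu_b = feat_a.mean(0), feat_b.mean(0)
    cov_a = np.cov(feat_a, rowvar=False)
    cov_b = np.cov(feat_b, rowvar=False)
    covmean = linalg.sqrtm(cov_a @ cov_b).real
    return ((mu_a - mu_b) ** 2).sum() + np.trace(cov_a + cov_b - 2 * covmean)
```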

Dependencies:

Datasets

In the paper, we used data from 5 source and 4 target datasets:

File organization

    - ssl (self-supervised pretraining)
        - moco (MoCo pretraining)
    - tl (supervised pretraining)
    - finetune (finetuning on target tasks; see the sketch after this list)
    - dataset (data split for Caltech256)
    - domain (visual domain distance & label similarity)
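
A hedged illustration of the finetune stage (the repository's actual scripts are in finetune/): load a pretrained backbone, swap in a head sized for the target task, and train as usual. The checkpoint path, class count, and hyperparameters below are placeholders, not values from the paper.

```python
# Sketch: finetuning a pretrained ResNet-50 on a target task.
# NOT the repository's exact script (see finetune/); checkpoint path,
# num_classes, and hyperparameters are illustrative assumptions.
import torch
import torchvision

def build_finetune_model(checkpoint_path, num_classes):
    model = torchvision.models.resnet50(pretrained=False)
    state = torch.load(checkpoint_path, map_location="cpu")
    # MoCo checkpoints store the encoder under 'module.encoder_q.*';
    # supervised (TL) checkpoints can usually be loaded directly.
    state = {k.replace("module.encoder_q.", ""): v
             for k, v in state.get("state_dict", state).items()}
    model.load_state_dict(state, strict=False)  # head weights will not match
    # Replace the head with one sized for the target task.
    model.fc = torch.nn.Linear(model.fc.in_features, num_classes)
    return model

# Usage (placeholder checkpoint path and class count):
model = build_finetune_model("pretrained/moco_v2.pth.tar", num_classes=4)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = torch.nn.CrossEntropyLoss()
```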

Reference

  1. MoCo: https://github.com/facebookresearch/moco
