
XinshaoAmosWang/DeepCriticalLearning


Deep Critical Learning (i.e., Deep Robustness) In The Era of Big Data

Here are related papers on the fitting and generalization of deep learning:

See Citation Details

Please cite the following papers if you find this repo useful.

@article{wang2022proselflc,
  title={ProSelfLC: Progressive Self Label Correction Towards A Low-Temperature Entropy State},
  author={Wang, Xinshao and Hua, Yang and Kodirov, Elyor and Mukherjee, Sankha Subhra and Clifton, David A and Robertson, Neil M},
  journal={bioRxiv},
  year={2022}
}
@inproceedings{wang2021proselflc,
  title={ {ProSelfLC}: Progressive Self Label Correction
  for Training Robust Deep Neural Networks},
  author={Wang, Xinshao and Hua, Yang and Kodirov, Elyor and Clifton, David A and Robertson, Neil M},
  booktitle={CVPR},
  year={2021}
}
@phdthesis{wang2020example,
  title={Example weighting for deep representation learning},
  author={Wang, Xinshao},
  year={2020},
  school={Queen's University Belfast}
}
@article{wang2019derivative,
  title={Derivative Manipulation for General Example Weighting},
  author={Wang, Xinshao and Kodirov, Elyor and Hua, Yang and Robertson, Neil},
  journal={arXiv preprint arXiv:1905.11233},
  year={2019}
}
@article{wang2019imae,
  title={{IMAE} for Noise-Robust Learning: Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude’s Variance Matters},
  author={Wang, Xinshao and Hua, Yang and Kodirov, Elyor and Robertson, Neil M},
  journal={arXiv preprint arXiv:1903.12141},
  year={2019}
}

PyTorch Implementation of ProSelfLC, Derivative Manipulation, and Improved MAE (IMAE)

  • Easy to install
  • Easy to use
  • Easy to extend: new losses, new networks, new datasets and batch loaders (a minimal sketch follows this list)
  • Easy to run experiments and sink (save) the results
  • Easy to put the saved results into your technical reports and academic papers
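As a taste of the extension points, a new loss can be written as an ordinary PyTorch module, as in the sketch below. This is only an illustration: the class name is hypothetical, and how a loss is registered with this repo's trainers and config files is not shown here.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelSmoothingCrossEntropy(nn.Module):
    """Hypothetical example of a new loss: cross entropy against smoothed targets."""

    def __init__(self, epsilon: float = 0.1):
        super().__init__()
        self.epsilon = epsilon

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        num_classes = logits.size(-1)
        log_probs = F.log_softmax(logits, dim=-1)
        # Smooth the one-hot targets: (1 - eps) on the annotated class, eps spread uniformly.
        one_hot = F.one_hot(targets, num_classes).float()
        smoothed = (1.0 - self.epsilon) * one_hot + self.epsilon / num_classes
        return -(smoothed * log_probs).sum(dim=-1).mean()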

The demos include extra configurations to obtain deterministic results, so that fair comparisons can be made.
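For reference, a typical deterministic PyTorch setup looks roughly like the sketch below (an illustration, not this repo's exact code). The CUBLAS_WORKSPACE_CONFIG value matches the one used in the run commands further down.

import os
import random

import numpy as np
import torch

def set_deterministic(seed: int = 0) -> None:
    # Seed every common source of randomness used during training.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Force deterministic kernels; cuBLAS additionally needs this environment
    # variable (ideally set before the process starts, as in the run commands below).
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
    torch.use_deterministic_algorithms(True)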

Install

See Install Guidelines

Set up Pipenv from scratch

  • sudo apt update && sudo apt upgrade
  • sudo apt install python3.8
  • curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
  • python3.8 get-pip.py
  • vim ~/.bashrc -> add export PATH="/home/ubuntu/.local/bin:$PATH" (adjust to your home directory) -> source ~/.bashrc
  • pip3 install pipenv

Build the environment for this repo using Pipenv

  • git clone this repo
  • cd this repo
  • pipenv install -e . --skip-lock

How to use

Run experiments

  • cd this repo
  • pipenv shell
  • pre-commit install
  • run experiments (a conceptual sketch of the ProSelfLC loss follows this list):
    • CIFAR-100: CUDA_VISIBLE_DEVICES=0 CUBLAS_WORKSPACE_CONFIG=:4096:8 TOKENIZERS_PARALLELISM=false python -W ignore tests/convnets_cifar100/trainer_calibration_vision_cifar100_covnets_proselflc.py
    • Protein classification: CUDA_VISIBLE_DEVICES=0 CUBLAS_WORKSPACE_CONFIG=:4096:8 TOKENIZERS_PARALLELISM=false python -W ignore tests/protbertbfd_deeploc/MS-with-unknown/test_trainer_2MSwithunknown_proselflc.py
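For orientation, the core idea behind the ProSelfLC trainers invoked above can be sketched as follows. This is a toy simplification of the CVPR 2021 formulation, not the repo's implementation: the training target is a convex combination of the human annotation and the model's own prediction. In the papers, the self-trust epsilon is derived from the global learning time and the example's confidence (with a low-temperature variant in the 2022 paper); here it is just a scalar argument.

import torch
import torch.nn.functional as F

def proselflc_like_loss(
    logits: torch.Tensor,   # (batch, classes) raw model outputs
    labels: torch.Tensor,   # (batch,) integer annotations
    epsilon: float,         # trust in self knowledge; grows during training in the paper
) -> torch.Tensor:
    num_classes = logits.size(-1)
    one_hot = F.one_hot(labels, num_classes).float()
    self_pred = F.softmax(logits, dim=-1).detach()
    # Corrected target: (1 - eps) * human annotation + eps * model's own prediction.
    target = (1.0 - epsilon) * one_hot + epsilon * self_pred
    # Cross entropy between the corrected target and the model prediction.
    log_probs = F.log_softmax(logits, dim=-1)
    return -(target * log_probs).sum(dim=-1).mean()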

Visualize results

The experimental results are automatically saved (sinked) and well organised, e.g.,

Examples of saved experimental configs and results

See Sinked Results
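If you want to turn the saved result files into figures for your own reports, a sketch like the one below may be a starting point. The file path and column names here are hypothetical; adapt them to the files this repo actually sinks.

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical file path and column names -- adapt them to the actual saved files.
df = pd.read_csv("experiments/cifar100_proselflc/accuracy_vs_epoch.csv")
plt.plot(df["epoch"], df["test_accuracy"], label="test accuracy")
plt.xlabel("epoch")
plt.ylabel("accuracy")
plt.legend()
plt.savefig("accuracy_curve.png", dpi=200)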

How to extend this repo

Supplementary material

Talks

Notes and disclaimer

  • For any specific research discussion or potential future collaboration, please feel free to contact me.
  • This work is a personal research project, developed in my personal time and still in progress.