UniStab: A unified predictor of protein stability changes across all mutation types

Built with PyTorch Lightning and Hydra · paper on arXiv

(Figure: UniStab model overview)

Introduction

UniStab is a unified deep learning framework that predicts protein stability changes across single-point, multi-point, and indel mutations in an end-to-end manner.
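To make the three mutation classes concrete, the sketch below parses a common mutation-string notation (wild-type residue, position, mutant residue; "-" for a deletion, multiple letters for an insertion) and classifies a variant as single-point, multi-point, or indel. The string format and helper functions are illustrative assumptions, not UniStab's actual input API.

```python
import re

def parse_mutation(mut: str) -> dict:
    """Classify one mutation string, e.g. 'A123G' (substitution),
    'A123-' (deletion), 'A123AG' (insertion).
    Illustrative only -- UniStab's real input format may differ."""
    m = re.fullmatch(r"([A-Z])(\d+)([A-Z-]+)", mut)
    if m is None:
        raise ValueError(f"unrecognized mutation string: {mut}")
    wt, pos, alt = m.group(1), int(m.group(2)), m.group(3)
    if alt == "-":
        kind = "deletion"
    elif len(alt) > 1:
        kind = "insertion"
    else:
        kind = "substitution"
    return {"wt": wt, "pos": pos, "alt": alt, "kind": kind}

def classify_variant(muts: list[str]) -> str:
    """Map a list of mutations to one of the three classes
    named in the introduction."""
    parsed = [parse_mutation(m) for m in muts]
    if any(p["kind"] in ("deletion", "insertion") for p in parsed):
        return "indel"
    return "single-point" if len(parsed) == 1 else "multi-point"
```

For example, classify_variant(["A123G", "K45E"]) is a multi-point variant, while classify_variant(["A123-"]) is an indel.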

Installation

git clone https://github.com/xlab-BioAI/UniStab.git
cd UniStab

Requirements

conda env create -f env.yaml  
conda activate UniStab

Downloading weights

Download the pre-trained model weights from Google Drive and place them in the appropriate directory.

Training

To train the model with default configurations:

python src/train_lightning.py

You can modify training parameters in config/default.yaml.
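Since training is driven by Hydra, individual parameters can typically also be overridden from the command line without editing the YAML. The key names below (trainer.max_epochs, data.batch_size, optimizer.lr) are hypothetical placeholders; consult config/default.yaml for the keys the project actually defines.

```shell
# Override selected Hydra config keys at launch time.
# Key names are illustrative assumptions -- check config/default.yaml.
python src/train_lightning.py \
    trainer.max_epochs=100 \
    data.batch_size=32 \
    optimizer.lr=1e-4
```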

Inference

To run inference with a trained model:

sh infe.sh

Data

License

Acknowledgments

Parts of the codebase are adapted from several excellent open-source projects; we gratefully acknowledge their contributions.

Citation

% TODO: citation entry for the paper

Reference

@article{lin2023evolutionary,
  title={Evolutionary-scale prediction of atomic-level protein structure with a language model},
  author={Lin, Zeming and Akin, Halil and Rao, Roshan and Hie, Brian and Zhu, Zhongkai and Lu, Wenting and Smetanin, Nikita and Verkuil, Robert and Kabeli, Ori and Shmueli, Yaniv and others},
  journal={Science},
  volume={379},
  number={6637},
  pages={1123--1130},
  year={2023},
  publisher={American Association for the Advancement of Science}
}

@article{dieckhaus2025protein,
  title={Protein stability models fail to capture epistatic interactions of double point mutations},
  author={Dieckhaus, Henry and Kuhlman, Brian},
  journal={Protein Science},
  volume={34},
  number={1},
  pages={e70003},
  year={2025},
  publisher={Wiley Online Library}
}
