KBioXLM: A Multilingual Biomedical Pre-trained Language Model
```python
from transformers import RobertaModel

model = RobertaModel.from_pretrained('ngwlh/KBioXLM')
```
- PyTorch
- Transformers
- DeepSpeed
- TensorBoard
- Pre-training: ① XLMR+Pretraining: first process the data, then pre-train once the data is ready; ② KBioXLM follows the same pre-training steps as XLMR+Pretraining.
- After obtaining the KBioXLM model, load it in downstream tasks to verify its effectiveness.
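As a minimal illustration of the downstream step, the sketch below encodes a short token-id sequence and inspects the resulting contextual embeddings. To keep the snippet runnable offline, a tiny randomly initialized `RobertaModel` (same architecture family, made-up hyperparameters) stands in for the released checkpoint; in practice you would load `'ngwlh/KBioXLM'` with `from_pretrained` instead.

```python
import torch
from transformers import RobertaConfig, RobertaModel

# Stand-in for the released checkpoint: a tiny randomly initialized model
# so this sketch runs without downloading weights. For real use, replace
# with: model = RobertaModel.from_pretrained('ngwlh/KBioXLM')
config = RobertaConfig(
    vocab_size=250002,        # XLM-R-sized vocabulary (illustrative)
    hidden_size=64,           # toy sizes; the real model is much larger
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
)
model = RobertaModel(config)
model.eval()

# Placeholder token ids standing in for a tokenized biomedical sentence.
input_ids = torch.tensor([[0, 100, 200, 2]])

with torch.no_grad():
    outputs = model(input_ids)

# One contextual embedding per input token: (batch, seq_len, hidden_size).
print(outputs.last_hidden_state.shape)
```

For an actual downstream task (e.g. cross-lingual NER or relation extraction), these token embeddings would feed a task-specific head fine-tuned on labeled data.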