
Domain Adaptation for Large Language Models Multi-label Classifiers


mirunabetianu/DALLMi


The file requirements.txt contains all the packages needed to run the code; only the dependencies required for this part are included.
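Assuming a standard Python environment (exact setup may vary), the dependencies can be installed with pip:

pip install -r requirements.txt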

The main script is pu_bert_trainer.py. Run it with:

python3 pu_bert_trainer.py

The hyperparameters are set in the param_grid dictionary at line 53 of pu_bert_trainer.py:

param_grid = {
    'learning_rate': [1e-4],
    'batch_size': [16], # unlabelled (combined) samples
    'num_epochs': [12],
    'gamma': [1.0],
    'alpha': [1.0],
    'batch_size_small': [4], # positive samples, 1 batch for each label
    'dropout': [0.5],
    'weight_decay': [0.01]
}
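As a rough illustration only (not the script's actual code), the single-value lists above can be expanded into individual hyperparameter configurations with a small grid-search loop. The train(config) call mentioned in the comment is a hypothetical placeholder for the trainer's entry point:

from itertools import product

param_grid = {
    'learning_rate': [1e-4],
    'batch_size': [16],        # unlabelled (combined) samples
    'num_epochs': [12],
    'gamma': [1.0],
    'alpha': [1.0],
    'batch_size_small': [4],   # positive samples, 1 batch for each label
    'dropout': [0.5],
    'weight_decay': [0.01]
}

keys = list(param_grid)
for values in product(*(param_grid[key] for key in keys)):
    config = dict(zip(keys, values))
    # Each `config` is one hyperparameter combination; a trainer would
    # launch one run per combination, e.g. train(config) (hypothetical).
    print(config)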
