It looks like the shared models are the pretrained models before fine-tuning.
Would it be possible to get access to two finetuned models (MIMIC-CXR/Open-I generation finetuned models)? I would use them exclusively for research purposes and cite this work accordingly.
Thank you so much for this repository.
@jungokasai I'm sorry, but I can't directly provide the dozens of finetuned models (56 different models: 7 models per task x 4 downstream tasks x 2 datasets). However, fine-tuning is rather straightforward: starting from the pre-trained models we provide, you can fine-tune a model for each downstream task yourself.
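For reference, the fine-tuning step suggested above could look roughly like the following. This is a minimal, hypothetical PyTorch sketch — the actual model class, checkpoint path, and downstream-task dataloader all come from this repo's own scripts and are replaced here with stand-ins:

```python
# Hypothetical sketch of fine-tuning a provided pre-trained checkpoint on one
# downstream task. The model, checkpoint path, and data are placeholders, not
# this repo's actual API.
import torch
import torch.nn as nn

# Stand-in for the repo's pre-trained model (hypothetical toy architecture).
model = nn.Linear(16, 2)

# In practice you would restore the shared checkpoint, e.g.:
# model.load_state_dict(torch.load("path/to/pretrained.pt"))

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Toy batch standing in for a downstream-task dataloader
# (e.g. MIMIC-CXR or Open-I report generation data).
x = torch.randn(8, 16)
y = torch.randint(0, 2, (8,))

model.train()
for step in range(3):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)  # task-specific loss
    loss.backward()
    optimizer.step()
```

The same loop would be repeated per downstream task and dataset, which is why sharing all 56 resulting checkpoints is impractical while reproducing any one of them is easy.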
Jungo
https://homes.cs.washington.edu/~jkasai/