
Running locally with custom models #20

Open
Jess0-0 opened this issue Dec 9, 2020 · 4 comments

Jess0-0 commented Dec 9, 2020

Hi,

Thank you so much for this awesome project! I'm trying to visualize the attention weights of the model I trained (RoBERTa model with a multi-task final layer, fine-tuned using HuggingFace) and I'm wondering if that is supported by exBERT.
Also, I'm wondering if there will be any modification of the code needed if I trained my model using HuggingFace version later than v2.8. Thanks a lot!

bhoov commented Jan 4, 2021

Apologies for taking so long to respond -- exBERT should work out of the box with Roberta but I haven't run it through tests for models trained on more recent versions of huggingface. My guess is that it would work as I am not aware of differences to how they save the model's weights. Would you be willing to try and see what issues you run into with your model? I will gladly take a look to fix/update anything that is not working.
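
A quick way to check whether a checkpoint from a newer HuggingFace version is compatible is to round-trip it through `save_pretrained` / `from_pretrained` yourself: if the generic `Auto*` classes can reload the saved directory, a downstream loader is likely to handle it too. A minimal sketch (the tiny randomly initialized config is illustrative, standing in for a real fine-tuned RoBERTa):

```python
import tempfile

from transformers import AutoModel, RobertaConfig, RobertaModel

# Tiny randomly initialized RoBERTa as a stand-in for a fine-tuned checkpoint.
config = RobertaConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
)
model = RobertaModel(config)

# Save with the standard HuggingFace API, as you would after fine-tuning...
save_dir = tempfile.mkdtemp()
model.save_pretrained(save_dir)

# ...then reload through the generic Auto class; this is the round-trip a
# downstream tool relies on when pointed at a saved model directory.
reloaded = AutoModel.from_pretrained(save_dir)
print(type(reloaded).__name__)  # RobertaModel
```

If this reload fails, the save format itself has changed and that is the first thing to report.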

@daddydrac

@bhoov - I am unclear on how to connect this to my existing ClinicalBERT model; can you please provide some guidance so I can see metrics in the web UI?

bhoov commented Jan 20, 2021

The following instructions pull the model from https://huggingface.co/emilyalsentzer/Bio_ClinicalBERT with some sensible defaults. They should also work for a local model saved with HuggingFace's API.

There are 2 main parts to exbert:

  1. Part A: The attention visualization and layer exploration (quick)
  2. Part B: Annotating a custom (small-ish) corpus and searching for nearest embeddings over them. (harder)

After you setup the environment:

  1. Part A: python server/main.py --model emilyalsentzer/Bio_ClinicalBERT --kind bidirectional
  2. Part B: From the root of the project:
cd server/data_processing
python create_corpus.py -f MY_LOCAL_CLINICAL_CORPUS.txt -o ../../clinical_corpus --model emilyalsentzer/Bio_ClinicalBERT
cd ../..
python server/main.py --model emilyalsentzer/Bio_ClinicalBERT --kind bidirectional --corpus clinical_corpus/
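
Since the instructions above "should also work for a local model", the same flags can presumably point at a checkpoint directory instead of a hub name. A hypothetical sketch, where `./my_clinical_model` stands in for a directory produced by `save_pretrained` (containing `config.json`, the weights file, and the tokenizer files):

```shell
# Part A against a local checkpoint directory instead of a hub model name.
# ./my_clinical_model is a placeholder for your own saved checkpoint.
python server/main.py --model ./my_clinical_model --kind bidirectional

# Part B, annotating a corpus with the same local checkpoint.
# Note the path is relative to server/data_processing after the cd.
cd server/data_processing
python create_corpus.py -f MY_LOCAL_CLINICAL_CORPUS.txt -o ../../clinical_corpus --model ../../my_clinical_model
cd ../..
python server/main.py --model ./my_clinical_model --kind bidirectional --corpus clinical_corpus/
```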

daddydrac commented Jan 30, 2021

@bhoov Thank you 🙏 kindly for that explanation; it helps a lot.

As for the Makefile, it does break, and I have been unsuccessful starting the project via the other instructions.

If you can help me get it going, I'd be more than happy to contribute a Dockerfile so it will run everywhere. I also think that is an opportunity for me to refactor some of the code and upgrade to PyTorch 1.7 and HuggingFace:latest while I'm in there containerizing it. Let me know, please.
