How can I use bertviz for BERT Question Answering? #26
Comments
Hi, I've recently pushed some significant changes that enable you to run any model from the transformers library. If you are using the head view, you should be able to adapt the following notebook for your use case: https://github.com/jessevig/bertviz/blob/master/head_view_bert.ipynb
Thanks for the quick reply. I tried it out, but it didn't show anything. I did the following:
I also tried loading the model this way:
ERROR:
Here
Hmm, I suspect the model is not getting loaded correctly. I believe the directory might need a configuration file as well, though I can't recall. Are you able to display the value of attention that is being returned when you call your model?
Okay, I think I see the problem in the second version. You need to set output_attentions=True. Want to try again with that change and let me know if it works?
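For readers hitting the same wall: the difference output_attentions makes can be demonstrated even offline with a tiny randomly-initialized model. The config sizes and token ids below are placeholders, not from this thread; with a fine-tuned checkpoint you would call from_pretrained on your model directory instead.

```python
import torch
from transformers import BertConfig, BertModel

# Tiny randomly-initialized BERT so the example runs offline; the sizes are
# arbitrary placeholders. A real run would load a fine-tuned checkpoint.
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)
model = BertModel(config)
model.eval()

input_ids = torch.tensor([[1, 5, 7, 2]])  # placeholder token ids

with torch.no_grad():
    without = model(input_ids)                            # flag off
    with_attn = model(input_ids, output_attentions=True)  # flag on

print(without.attentions)         # None -> nothing for the head view to draw
print(len(with_attn.attentions))  # one (batch, heads, seq, seq) tensor per layer
```

The head view needs that per-layer tuple of attention tensors; if attentions comes back as None, there is nothing to visualize, which would explain the blank output described above.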
I tried the following:
But it ended up showing nothing in the view, as in the figure below. I got the same output as in the first version :(
Are you able to display the value of the attention returned again?
Do I need to check the website settings (d3) for the visualization?
The attention looks good. It might relate to the website settings. Do you still have the following at the top?

%%javascript

I'm working on a change where this is not needed, but currently it is. Sometimes there's some weirdness where you have to refresh your browser. Which browser are you using, by the way?
The javascript code looks good. Yeah, I was thinking the same. Maybe I have some problem with the browser. I am currently using Google Chrome.
Hmm, not sure if the problem is with the browser or not. Are you able to run the original version of the notebook without using your fine-tuned model?
Occasionally it doesn't work the first time. Could you try re-running from the beginning? If that doesn't work, could you close and re-open your browser? If that still doesn't work, could you check the Javascript console for any errors?
Sure. Thank you so much! :)
Hi, I found this error in the web browser:
I see, thanks. So you're running on Colab? Sorry, I was assuming you were running it on your local machine with a Jupyter notebook. Colab requires a different javascript setup, as shown here: https://colab.research.google.com/drive/1PEHWRHrvxQvYr9NFRC-E_fr3xDq1htCj#scrollTo=Mv6H9QK9yLLe

Instead of:
%%javascript

You need:
def call_html():

And then you need to add this line right before you invoke head_view():
call_html()

Please let me know if that works. Sorry, I'm working on making this more uniform between platforms. Javascript is always a challenge, especially in Colab!
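A sketch of that Colab workaround, reconstructed from the linked notebook; the CDN paths and d3 version here are assumptions from memory of bertviz's example Colab, so double-check them against the link above.

```python
from IPython.display import HTML, display

def call_html():
    # Colab sandboxes each cell's output frame, so notebook-level
    # %%javascript require.js setup is lost; re-inject it per cell instead.
    # Script URLs below are assumed from the bertviz example Colab.
    display(HTML('''
        <script src="/static/components/requirejs/require.js"></script>
        <script>
          requirejs.config({
            paths: {
              base: '/static/base',
              "d3": "https://cdnjs.cloudflare.com/ajax/libs/d3/3.5.8/d3.min",
              jquery: '//ajax.googleapis.com/ajax/libs/jquery/2.0.0/jquery.min',
            },
          });
        </script>
    '''))

# Then, in the same cell, right before rendering:
# call_html()
# head_view(attention, tokens)
```

The function has to run in the same cell as head_view() because each Colab output frame is a fresh sandbox.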
Yeah, it works now. Thank you so much!!
Awesome, glad to hear. Working on changes to make this easier in the future. Thanks. |
One more question: sometimes the Colab notebook throws a "Runtime Disconnected" error when using long sequences. Is that common?
I hadn't seen that before, but I'm not surprised that it might happen for very long sequences. How long are the sequences? |
Also, do you know if that error happens after the attentions have been returned, or before? I'm assuming it probably happens between where the attention is computed and where the visualization is rendered, but I'm curious.
The length of sentence_a is 40. Yes, it happens between attention calculation and rendering.
sentence_a = "who plays lady talisa in game of thrones"
Oona Castilla Chaplin ( ˈuna kasˈtija ˈt͡ʃaplin ) ( born 4 June 1986 ) , known professionally as Oona Chaplin , is a Spanish actress . Her roles include Talisa Maegyr in the HBO TV series Game of Thrones , The Crimson Field and the series Taboo . "
Thanks for sharing that. The tool doesn't currently scale well to long texts unfortunately. |
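A rough back-of-envelope for why long inputs hurt: the head view serializes one seq_len x seq_len weight matrix per head per layer into the page, so the payload grows quadratically with sequence length. The 12 layers and 12 heads below are BERT-base's dimensions; the helper name is made up for illustration.

```python
def attention_values(num_layers=12, num_heads=12, seq_len=40):
    # Scalar attention weights the visualization must serialize:
    # one seq_len x seq_len matrix per head per layer.
    return num_layers * num_heads * seq_len ** 2

print(attention_values(seq_len=40))   # 230400 weights for the 40-token case above
print(attention_values(seq_len=160))  # 3686400 -- 4x the tokens, 16x the payload
```

That quadratic growth, on top of the DOM elements d3 draws per token pair, is consistent with the browser or Colab runtime choking on long passages.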
Is there any way to see the attention visualization for a BERT Question Answering model? I couldn't find a BertForQuestionAnswering class in bertviz.pytorch_transformers_attn. I have fine-tuned on a QA dataset using Hugging Face transformers and want to see the visualization for it. Can you suggest a way of doing it?
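To close the loop on the original question: with the transformers-backed version mentioned at the top of the thread, BertForQuestionAnswering itself can return the attention weights the head view needs, so no special class inside bertviz is required. A minimal offline sketch with a tiny randomly-initialized config follows; the sizes and token ids are placeholders, and a real run would instead call from_pretrained on your fine-tuned QA checkpoint.

```python
import torch
from transformers import BertConfig, BertForQuestionAnswering

# Placeholder config so this runs offline; in practice swap for
# BertForQuestionAnswering.from_pretrained(<your fine-tuned dir>).
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)
model = BertForQuestionAnswering(config)
model.eval()

input_ids = torch.tensor([[1, 5, 7, 2]])  # placeholder question+passage ids
with torch.no_grad():
    outputs = model(input_ids, output_attentions=True)

# QA span logits, plus the per-layer attention tuple the head view consumes
print(outputs.start_logits.shape)  # torch.Size([1, 4])
print(len(outputs.attentions))     # 2 (one tensor per layer)
```

The attentions tuple here has exactly the shape the plain BertModel example earlier in the thread produces, so it can be passed to the head view the same way.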