Hello Shivanshu, I'm not entirely sure what you mean. BERT-based (and
similar) keyword extractors work in a supervised manner and are a
different branch of methods. You could theoretically generate keywords
with RaKUn and then learn them with BERT; however, that sounds a bit like
the fully neural approaches such as
https://gitlab.com/matej.martinc/tnt_kid/-/tree/master
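For context, the embedding-based branch of keyword extraction mentioned above can be sketched roughly as follows. This is not RaKUn's API: the helper names, the candidate list, and the model-loading snippet are illustrative assumptions about how such a pipeline is typically wired up.

```python
# Hedged sketch of embedding-based (KeyBERT-style) keyword extraction:
# embed the document and a set of candidate phrases, then rank candidates
# by cosine similarity to the document embedding.
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def rank_candidates(doc_vec, cand_vecs, candidates, top_k=5):
    """Return the top_k candidate phrases whose embeddings are most
    similar to the document embedding."""
    scored = sorted(
        zip(candidates, (cosine(doc_vec, v) for v in cand_vecs)),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return [cand for cand, _ in scored[:top_k]]


# To plug in an actual model such as bert-base-uncased (assumption:
# the sentence-transformers package is installed; note that vanilla
# bert-base-uncased is not fine-tuned for sentence similarity, so
# results may be mediocre compared to dedicated sentence encoders):
#
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("bert-base-uncased")
#   doc_vec = model.encode(document)
#   cand_vecs = [model.encode(c) for c in candidates]
```

This is unsupervised at ranking time but relies on a pretrained encoder, which is what sets it apart from RaKUn's graph-based statistics.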
On Wed, Oct 21, 2020 at 8:36 AM Shivanshu Purohit ***@***.***> wrote:
Can I use a specific BERT model with this code? Say I want to use
bert-base-uncased to extract keywords. Will it work here?
Can I use a specific BERT model with this code? Say I want to use bert-base-uncased to extract keywords. Will it work here?