
About the entity disambiguation performance without candidate set #26

Closed
hitercs opened this issue Apr 20, 2021 · 2 comments


hitercs commented Apr 20, 2021

Hi,

Thanks for your work.

I ran the experiment evaluating entity disambiguation performance without a candidate set.
As reported in the paper, the performance should be:
[image: entity disambiguation results reported in the paper]

However, when I run entity disambiguation without a candidate set using the provided checkpoint:
python evaluate_kilt_dataset.py path_to/fairseq_entity_disambiguation_aidayago path_to/datasets path_to/predictions --trie path_to/kilt_titles_trie_dict.pkl --batch_size 64 --device "cuda:0"
I get the following performance:

[image: results from my run]

Is there anything wrong with my run?
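
For reference, this is roughly the setup I understand evaluate_kilt_dataset.py to use, following the repository README (paths are placeholders):

```python
import pickle

from genre.fairseq_model import GENRE
from genre.trie import Trie

# Load the prefix trie over all KILT entity titles (same file as in the command above).
with open("path_to/kilt_titles_trie_dict.pkl", "rb") as f:
    trie = Trie.load_from_dict(pickle.load(f))

# Load the provided entity-disambiguation checkpoint.
model = GENRE.from_pretrained("path_to/fairseq_entity_disambiguation_aidayago").eval()

# Constrained beam search: the decoder may only produce sequences that are
# valid entity titles according to the trie.
sentences = ["Einstein was a [START_ENT] German [END_ENT] physicist."]
predictions = model.sample(
    sentences,
    prefix_allowed_tokens_fn=lambda batch_id, sent: trie.get(sent.tolist()),
)
```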

nicola-decao (Contributor) commented Apr 20, 2021

Hi,

DISCLAIMER: I no longer have access to the corporate machine I used for these experiments, so I cannot check exactly what setting I used.

However, it might be that I didn't use the KILT trie but rather the YAGO trie. The KB that AIDA uses is not the whole of Wikipedia (~5M items) but a roughly 10x smaller set (~400k, if I remember correctly). Additionally, the KILT KB is from 2019, whereas the AIDA KB is older and the titles differ. Thus, when using that smaller trie, the results should be higher.

Again, I unfortunately don't have the code and data I used for that, so if you want to use a different trie you need to build it yourself.
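
If it helps, building a trie over a different title set should look roughly like this, following the pattern in the repository README (the title list and paths below are placeholders):

```python
import pickle

from genre.fairseq_model import GENRE
from genre.trie import Trie

# The model is only needed here for its BPE encoder.
model = GENRE.from_pretrained("path_to/fairseq_entity_disambiguation_aidayago").eval()

# Placeholder list: in practice this would be the ~400k titles of the AIDA/YAGO KB.
aida_titles = ["Albert Einstein", "Germany", "Ulm"]

# Build a prefix trie over the BPE-encoded titles. Token 2 is the decoder start
# token used by the fairseq BART checkpoints, as in the README examples.
trie = Trie([
    [2] + model.encode(title)[1:].tolist()
    for title in aida_titles
])

# Save it as a pickled dict (presumably the same format as kilt_titles_trie_dict.pkl)
# so it can be passed to evaluate_kilt_dataset.py via --trie.
with open("path_to/aida_titles_trie_dict.pkl", "wb") as f:
    pickle.dump(trie.trie_dict, f)
```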

hitercs (Author) commented Apr 20, 2021

Thanks for your clarification. Well understood.
