
RepBERT #4

Closed
wangjiajia5889758 opened this issue May 3, 2021 · 1 comment

Comments

@wangjiajia5889758

Is the way this program generates passage embeddings the same as in the RepBERT program?

python precompute.py --load_model_path ./data/ckpt-350000 --task doc
python precompute.py --load_model_path ./data/ckpt-350000 --task query_dev.small
python precompute.py --load_model_path ./data/ckpt-350000 --task query_eval.small

@jingtaozhan
Owner

This repo uses the output embedding of the [CLS] token as the query/passage embedding.
RepBERT uses the average of the token embeddings as the query/passage embedding.
So the encoder implementations are different.
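
A minimal sketch of the two pooling strategies described above, not taken from either repo, assuming a Hugging Face `transformers` BERT encoder and a toy input batch:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

texts = ["what is a dense retriever?"]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
hidden = outputs.last_hidden_state                 # (batch, seq_len, hidden)
mask = inputs["attention_mask"].unsqueeze(-1).float()

# This repo (per the reply above): take the output embedding of the [CLS] token.
cls_embedding = hidden[:, 0]                       # (batch, hidden)

# RepBERT (per the reply above): average the token embeddings, ignoring padding.
mean_embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
```

Because the two pooling strategies produce different vectors, precomputed embeddings from one codebase are not interchangeable with the other even if the underlying checkpoint were the same.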
