This repo uses the output embedding of [cls] as the query/passage embedding.
RepBERT uses the average token embedding as the query/passage embedding.
The encoder implementation is different.
Does this program generate passage_embeddings in the same way as the RepBERT program when running the following commands?
python precompute.py --load_model_path ./data/ckpt-350000 --task doc
python precompute.py --load_model_path ./data/ckpt-350000 --task query_dev.small
python precompute.py --load_model_path ./data/ckpt-350000 --task query_eval.small
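For reference, the pooling difference described above can be sketched as follows. This is a minimal illustration with toy hidden states (not the actual encoder code of either repo): [CLS] pooling takes the first position's output vector, while RepBERT-style mean pooling averages over non-padding positions using the attention mask.

```python
import numpy as np

# Toy encoder output: last_hidden_state of shape (seq_len, hidden_dim),
# with an attention mask marking real (non-padding) tokens.
hidden = np.array([[1.0, 2.0],   # [CLS]
                   [3.0, 4.0],   # token 1
                   [5.0, 6.0],   # token 2
                   [0.0, 0.0]])  # [PAD]
mask = np.array([1, 1, 1, 0])

# This repo's strategy: use the [CLS] output embedding (first position).
cls_embedding = hidden[0]                                    # -> [1.0, 2.0]

# RepBERT's strategy: average token embeddings over non-padding positions.
mean_embedding = (hidden * mask[:, None]).sum(axis=0) / mask.sum()  # -> [3.0, 4.0]
```

The two strategies generally produce different vectors from the same encoder output, which is why the precompute step is not interchangeable between the two repos unless the pooling is changed to match.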