Commit

Fixed typos (#508)
lintool committed Apr 26, 2021
1 parent 4b6e900 commit 6d48609
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion docs/experiments-distilbert_kd.md
@@ -1,6 +1,6 @@
 # Pyserini: Reproducing DistilBERT KD Results
 
-This guide provides instructions to reproduce the TCT-ColBERT dense retrieval model on the MS MARCO passage ranking task, described in the following paper:
+This guide provides instructions to reproduce the DistilBERT KD dense retrieval model on the MS MARCO passage ranking task, described in the following paper:
 
 > Sebastian Hofstätter, Sophia Althammer, Michael Schröder, Mete Sertkan, and Allan Hanbury. [Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation.](https://arxiv.org/abs/2010.02666) arXiv:2010.02666, October 2020.
2 changes: 1 addition & 1 deletion docs/experiments-sbert.md
@@ -40,7 +40,7 @@ Hybrid retrieval with dense-sparse representations (without document expansion):
 - dense retrieval with SBERT, brute force index.
 - sparse retrieval with BM25 `msmarco-passage` (i.e., default bag-of-words) index.
 
-```bas
+```bash
 $ python -m pyserini.hsearch dense --index msmarco-passage-sbert-bf \
                                    --encoded-queries sbert-msmarco-passage-dev-subset \
                             sparse --index msmarco-passage \
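The corrected fence opens a hybrid dense-sparse retrieval command. As background, hybrid search typically combines the two systems' per-document scores by weighted interpolation; the sketch below illustrates that general idea only — `hybrid_fuse` and `alpha` are illustrative names, not Pyserini's actual `hsearch` implementation:

```python
# Illustrative sketch of dense-sparse score fusion (hypothetical helper,
# not Pyserini's API): combine per-document scores from a dense retriever
# and a sparse (BM25) retriever by weighted interpolation.

def hybrid_fuse(dense, sparse, alpha=0.5, k=10):
    """Fuse two {docid: score} dicts: fused = dense + alpha * sparse."""
    docids = set(dense) | set(sparse)
    fused = {d: dense.get(d, 0.0) + alpha * sparse.get(d, 0.0) for d in docids}
    # Rank by fused score, highest first, and keep the top k.
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)[:k]

# Example: a document retrieved by both systems rises to the top.
dense_scores = {"d1": 0.9, "d2": 0.5}
sparse_scores = {"d2": 1.0, "d3": 0.8}
print(hybrid_fuse(dense_scores, sparse_scores))  # d2 ranks first
```

In practice the two score distributions are on different scales, so the interpolation weight (Pyserini exposes this kind of knob as a fusion parameter) matters; treat the value above as a placeholder rather than a tuned setting.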
