# Run
| RanxHub ID: | msmarco-passage/trec-dl-2019/ranxhub/distilbert-kd |
|---|---|
| Version: | 1.0 |
| Description: | DistilBERT KD run reproduced using Pyserini (see the sketch after this table). |
| Tags: | Retrieval |
| Date: | 3 February 2023 |
| Run Authors: | Elias Bassani |
| From Paper: | |
| Paper Authors: | |
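
The run can be reproduced with Pyserini's dense retrieval pipeline. Below is a minimal sketch, assuming Pyserini's `FaissSearcher`, the prebuilt index name `msmarco-passage-distilbert-dot-margin_mse-t2`, and the `sebastian-hofstaetter/distilbert-dot-margin_mse-T2-msmarco` query encoder; verify the exact identifiers against the Pyserini documentation.

```python
from pyserini.search.faiss import FaissSearcher

# Prebuilt index and query-encoder names are assumptions; check the
# Pyserini docs for the exact identifiers before running.
searcher = FaissSearcher.from_prebuilt_index(
    'msmarco-passage-distilbert-dot-margin_mse-t2',
    'sebastian-hofstaetter/distilbert-dot-margin_mse-T2-msmarco',
)

# Retrieve the top-1000 passages for one TREC DL 2019 query.
hits = searcher.search('what is the most popular food in switzerland', k=1000)
for i, hit in enumerate(hits[:10]):
    print(f'{i + 1:2} {hit.docid:10} {hit.score:.5f}')
```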
# Model
| Name: | DistilBERT KD |
|---|---|
| Description: | The authors use an ensemble of BERTcat models (the vanilla BERT passage re-ranking model) as teachers to improve a DistilBERT ranker with a Margin-MSE loss (sketched after this table). |
| Tags: | Dense Retrieval · Knowledge Distillation · DistilBERT |
| Paper: | Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation |
| Authors: | Sebastian Hofstätter · Sophia Althammer · Michael Schröder · Mete Sertkan · Allan Hanbury |
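
Margin-MSE distillation trains the student to reproduce the teacher's score *margin* between a relevant and a non-relevant passage, rather than the absolute scores. A minimal PyTorch sketch, where the function name and tensor shapes are illustrative and the scores are assumed to come from the student and the teacher ensemble:

```python
import torch
import torch.nn.functional as F

def margin_mse_loss(
    student_pos: torch.Tensor,  # student scores for positive passages, shape (batch,)
    student_neg: torch.Tensor,  # student scores for negative passages, shape (batch,)
    teacher_pos: torch.Tensor,  # teacher-ensemble scores for positives, shape (batch,)
    teacher_neg: torch.Tensor,  # teacher-ensemble scores for negatives, shape (batch,)
) -> torch.Tensor:
    # Match the student's positive-negative margin to the teacher's margin.
    return F.mse_loss(student_pos - student_neg, teacher_pos - teacher_neg)
```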
# Results
| NDCG@10 | MAP@1000 | Recall@1000 |
|---|---|---|
| 0.6994 | 0.4053 | 0.7653 |
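
These metrics can be recomputed with ranx. A minimal sketch, assuming the run is fetched from RanxHub by the ID above and that the TREC DL 2019 passage qrels are available locally (the `qrels.dl19-passage.txt` path is illustrative):

```python
from ranx import Qrels, Run, evaluate

# Load the TREC DL 2019 passage qrels (path is illustrative).
qrels = Qrels.from_file('qrels.dl19-passage.txt', kind='trec')

# Fetch the run from RanxHub by its ID.
run = Run.from_ranxhub('msmarco-passage/trec-dl-2019/ranxhub/distilbert-kd')

print(evaluate(qrels, run, ['ndcg@10', 'map@1000', 'recall@1000']))
```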