# Run
| RanxHub ID: | msmarco-passage/trec-dl-2020/ranxhub/distilbert-kd-tasb |
|---|---|
| Version: | 1.0 |
| Description: | DistilBERT KD TASB run reproduced using Pyserini. |
| Tags: | Retrieval |
| Date: | 3 February 2023 |
| Run Authors: | Elias Bassani |
| From Paper: | |
| Paper Authors: | |
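The run can be reproduced with Pyserini's dense retrieval API. Below is a minimal sketch: the prebuilt index name (`msmarco-passage-distilbert-dot-tas_b-b256-bf`) and the encoder checkpoint (`sebastian-hofstaetter/distilbert-dot-tas_b-b256-msmarco`) are assumptions, so verify them against Pyserini's prebuilt-index listing before use.

```python
from pyserini.search.faiss import FaissSearcher, AutoQueryEncoder

# Assumed identifiers -- check Pyserini's documentation for the exact names.
INDEX = "msmarco-passage-distilbert-dot-tas_b-b256-bf"
ENCODER = "sebastian-hofstaetter/distilbert-dot-tas_b-b256-msmarco"

# Encode queries with the distilled TAS-B model and search the dense index.
encoder = AutoQueryEncoder(ENCODER, pooling="cls")
searcher = FaissSearcher.from_prebuilt_index(INDEX, encoder)

hits = searcher.search("how long is life cycle of flea", k=1000)
for rank, hit in enumerate(hits[:10], start=1):
    print(f"{rank:2} {hit.docid:10} {hit.score:.5f}")
```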
# Model
| Name: | DistilBERT KD TASB |
|---|---|
| Description: | The authors use an ensemble of BERT_CAT models (the vanilla BERT passage re-ranking model) to teach and improve a DistilBERT ranker with a Margin-MSE loss. The authors also use a topic-aware query and balanced margin sampling technique, called TAS-Balanced. |
| Tags: | Dense Retrieval · Knowledge Distillation · DistilBERT |
| Paper: | Efficiently Teaching an Effective Dense Retriever with Balanced Topic Aware Sampling |
| Authors: | Sebastian Hofstätter · Sheng-Chieh Lin · Jheng-Hong Yang · Jimmy Lin · Allan Hanbury |
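The distillation objective described above matches the *margin* between positive and negative passage scores rather than the absolute scores. A minimal PyTorch sketch of Margin-MSE (the function name is illustrative, not from the paper's codebase):

```python
import torch
import torch.nn.functional as F

def margin_mse_loss(student_pos: torch.Tensor, student_neg: torch.Tensor,
                    teacher_pos: torch.Tensor, teacher_neg: torch.Tensor) -> torch.Tensor:
    """Margin-MSE: align the student's positive-negative score margin
    with the teacher ensemble's margin, instead of the raw scores."""
    return F.mse_loss(student_pos - student_neg, teacher_pos - teacher_neg)
```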
# Results
| NDCG@10 | MAP@1000 | Recall@1000 |
|---|---|---|
| 0.6854 | 0.4698 | 0.8727 |
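These figures can be recomputed by loading the run from RanxHub with ranx. A sketch follows; the ir-datasets id with the `/judged` suffix is an assumption about which qrels split matches the reported numbers.

```python
from ranx import Qrels, Run, evaluate

# Qrels via ir-datasets and the run via its RanxHub ID from the card above.
qrels = Qrels.from_ir_datasets("msmarco-passage/trec-dl-2020/judged")
run = Run.from_ranxhub("msmarco-passage/trec-dl-2020/ranxhub/distilbert-kd-tasb")

print(evaluate(qrels, run, ["ndcg@10", "map@1000", "recall@1000"]))
```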