## Run

| RanxHub ID | msmarco-passage/dev/ranxhub/distilbert-kd |
|---|---|
| Version | 1.0 |
| Description | DistilBERT KD run reproduced using Pyserini. |
| Tags | Retrieval |
| Date | 3 February 2023 |
| Run Authors | Elias Bassani |
| From Paper | |
| Paper Authors | |
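For reference, a run like this can be regenerated with Pyserini's dense FAISS searcher. A minimal sketch follows; the prebuilt index and query encoder identifiers are assumptions based on Pyserini's naming for the Margin-MSE DistilBERT checkpoint, so verify them against the current Pyserini documentation:

```python
from pyserini.search.faiss import FaissSearcher

# Assumed identifiers: prebuilt index and Hugging Face query encoder for the
# Margin-MSE-distilled DistilBERT checkpoint (check Pyserini docs for current names).
searcher = FaissSearcher.from_prebuilt_index(
    'msmarco-passage-distilbert-dot-margin_mse-T2-bf',
    'sebastian-hofstaetter/distilbert-dot-margin_mse-T2-msmarco',
)

# Retrieve the top 1000 passages for one dev query and print the first few hits.
hits = searcher.search('what is a lobster roll', k=1000)
for rank, hit in enumerate(hits[:3], start=1):
    print(f'{rank:4} {hit.docid:10} {hit.score:.5f}')
```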
## Model

| Name | DistilBERT KD |
|---|---|
| Description | The authors use an ensemble of BERT_CAT models (the vanilla BERT passage re-ranking model) as teachers to improve a DistilBERT ranker with a Margin-MSE loss. |
| Tags | Dense Retrieval · Knowledge Distillation · DistilBERT |
| Paper | Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation |
| Authors | Sebastian Hofstätter · Sophia Althammer · Michael Schröder · Mete Sertkan · Allan Hanbury |
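The Margin-MSE objective is simple to state: the student is trained so that its score margin between a positive and a negative passage matches the teacher ensemble's margin. A minimal PyTorch sketch (tensor names are illustrative, not from the paper's code):

```python
import torch
import torch.nn.functional as F

def margin_mse(student_pos: torch.Tensor, student_neg: torch.Tensor,
               teacher_pos: torch.Tensor, teacher_neg: torch.Tensor) -> torch.Tensor:
    # Match the student's positive-negative score margin to the teacher's margin.
    return F.mse_loss(student_pos - student_neg, teacher_pos - teacher_neg)

# Toy usage with a batch of 4 query-passage score pairs.
s_pos, s_neg = torch.randn(4), torch.randn(4)
t_pos, t_neg = torch.randn(4), torch.randn(4)
loss = margin_mse(s_pos, s_neg, t_pos, t_neg)
```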
## Results

| MRR@10 | MAP@1000 | Recall@1000 |
|---|---|---|
| 0.3250 | 0.3308 | 0.9553 |
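Since the run is published on RanxHub, the metrics above can be re-computed with ranx: `Run.from_ranxhub` fetches the run by the ID in the Run table, and the qrels are loaded via ir_datasets (the dataset ID below is an assumption; confirm it matches the dev subset used here):

```python
from ranx import Qrels, Run, evaluate

# Fetch the run by its RanxHub ID (see the Run table above).
run = Run.from_ranxhub("msmarco-passage/dev/ranxhub/distilbert-kd")

# MS MARCO passage dev qrels via ir_datasets (assumed dataset ID).
qrels = Qrels.from_ir_datasets("msmarco-passage/dev/small")

print(evaluate(qrels, run, ["mrr@10", "map@1000", "recall@1000"]))
```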