ranxhub

Run

RanxHub ID: msmarco-passage/trec-dl-2020/ranxhub/distilbert-kd
Version: 1.0
Description: DistilBERT KD run reproduced using Pyserini (a reproduction sketch follows the fields below).
Tags: Retrieval
Date: 3 February 2023
Run Authors: Elias Bassani
From Paper:
Paper Authors:
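
As a rough guide, the run can be reproduced with Pyserini's dense retrieval tooling. This is a minimal sketch, not the exact reproduction recipe: the prebuilt index name, the Hugging Face encoder id, and the sample query are assumptions based on Pyserini's published resources for the Margin-MSE distilled DistilBERT; check the Pyserini documentation for the exact identifiers.

```python
from pyserini.search.faiss import FaissSearcher, AutoQueryEncoder

# Assumed identifiers; verify against Pyserini's prebuilt-resource listings.
encoder = AutoQueryEncoder('sebastian-hofstaetter/distilbert-dot-margin_mse-T2-msmarco')
searcher = FaissSearcher.from_prebuilt_index(
    'msmarco-passage-distilbert-dot-margin_mse-T2-bf',  # brute-force FAISS index
    encoder,
)

# Retrieve the top 1000 passages for one (illustrative) query,
# then print the ten highest-scoring hits.
hits = searcher.search('how long is the life cycle of a flea', k=1000)
for i, hit in enumerate(hits[:10]):
    print(f'{i + 1:3} {hit.docid:10} {hit.score:.4f}')
```

A full run would loop over the TREC DL 2020 topics and write results in TREC format before evaluation.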

Model

Name: DistilBERT KD
Description: The authors use an ensemble of BERT_CAT models (the vanilla BERT passage re-ranking model) as teachers to improve a DistilBERT ranker via knowledge distillation with a Margin-MSE loss (sketched below).
Tags: Dense Retrieval · Knowledge Distillation · DistilBERT
Paper: Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
Authors: Sebastian Hofstätter · Sophia Althammer · Michael Schröder · Mete Sertkan · Allan Hanbury
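
For reference, the Margin-MSE objective distills the teacher's score margin between a relevant and a non-relevant passage into the student, rather than matching absolute scores. A minimal PyTorch sketch, with all tensor names hypothetical:

```python
import torch
import torch.nn.functional as F

def margin_mse_loss(student_pos: torch.Tensor,
                    student_neg: torch.Tensor,
                    teacher_pos: torch.Tensor,
                    teacher_neg: torch.Tensor) -> torch.Tensor:
    """Margin-MSE: match the student's score margin to the teacher's.

    Each tensor holds relevance scores for a batch of (query, passage) pairs:
    `*_pos` for positive passages, `*_neg` for sampled negatives. In the paper,
    the teacher scores come from an ensemble of BERT_CAT models and can be
    precomputed once, so distillation adds little training overhead.
    """
    student_margin = student_pos - student_neg
    teacher_margin = teacher_pos - teacher_neg
    return F.mse_loss(student_margin, teacher_margin)
```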

Results

Metric        Score
NDCG@10       0.6446694985165204
MAP@1000      0.41588803860428253
Recall@1000   0.795297885501652
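
Since the run is hosted on RanxHub, the scores above should be verifiable with ranx. A short sketch, assuming the qrels are available through ir_datasets under the id shown (that id is an assumption; the RanxHub id is taken from this card):

```python
from ranx import Qrels, Run, evaluate

# Fetch this run from RanxHub using the id listed above.
run = Run.from_ranxhub("msmarco-passage/trec-dl-2020/ranxhub/distilbert-kd")

# Assumed ir_datasets id for the judged TREC DL 2020 passage qrels.
qrels = Qrels.from_ir_datasets("msmarco-passage/trec-dl-2020/judged")

scores = evaluate(qrels, run, ["ndcg@10", "map@1000", "recall@1000"])
print(scores)  # expected to match the Results table above
```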