ranxhub

Run

RanxHub ID: msmarco-passage/trec-dl-2019/ranxhub/distilbert-kd-tasb
Version: 1.0
Description: DistilBERT KD TASB run reproduced using Pyserini.
Tags: Retrieval
Date: 3 February 2023
Run Authors: Elias Bassani
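The run can be loaded directly by its RanxHub ID; a minimal sketch, assuming ranx is installed (pip install ranx):

    from ranx import Run

    # Download and load the run from RanxHub by its ID
    run = Run.from_ranxhub("msmarco-passage/trec-dl-2019/ranxhub/distilbert-kd-tasb")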

Model

Name: DistilBERT KD TASB
Description: The authors use an ensemble of BERT_CAT models (the vanilla BERT passage re-ranking model) as teachers to train a DistilBERT-based dense retriever with a Margin-MSE loss. They also introduce a topic-aware query sampling and balanced margin sampling technique, called TAS-Balanced (a sketch of the Margin-MSE loss follows this section).
Tags: Dense Retrieval · Knowledge Distillation · DistilBERT
Paper: Efficiently Teaching an Effective Dense Retriever with Balanced Topic Aware Sampling
Authors: Sebastian Hofstätter · Sheng-Chieh Lin · Jheng-Hong Yang · Jimmy Lin · Allan Hanbury
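To make the distillation objective concrete, below is a minimal sketch of a Margin-MSE loss in PyTorch. The function name, tensor names, and shapes are illustrative assumptions, not code from the paper: the loss matches the student's positive-negative score margin to the teacher's margin for each (query, positive, negative) triple.

    import torch
    import torch.nn.functional as F

    def margin_mse_loss(student_pos, student_neg, teacher_pos, teacher_neg):
        """Margin-MSE: MSE between the student's and the teacher's
        score margins. All inputs are 1-D relevance-score tensors
        of shape (batch,)."""
        student_margin = student_pos - student_neg
        teacher_margin = teacher_pos - teacher_neg
        return F.mse_loss(student_margin, teacher_margin)

    # Example with random scores for a batch of 32 triples
    b = 32
    loss = margin_mse_loss(torch.randn(b), torch.randn(b), torch.randn(b), torch.randn(b))

Because only the margin is matched, the student is free to produce scores on a different scale than the teacher ensemble, which is what makes the objective suitable for cross-architecture distillation.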

Results

Metric        Score
NDCG@10       0.7210
MAP@1000      0.4590
Recall@1000   0.8406
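These scores can be recomputed with ranx once the run is loaded; a minimal sketch, assuming the TREC DL 2019 passage qrels are fetched through ranx's ir_datasets integration (the dataset ID below is an assumption):

    from ranx import Qrels, Run, evaluate

    # Load the judged TREC DL 2019 passage qrels via ir_datasets
    qrels = Qrels.from_ir_datasets("msmarco-passage/trec-dl-2019/judged")

    # Load the run from RanxHub and score it with the reported metrics
    run = Run.from_ranxhub("msmarco-passage/trec-dl-2019/ranxhub/distilbert-kd-tasb")
    scores = evaluate(qrels, run, ["ndcg@10", "map@1000", "recall@1000"])
    print(scores)  # expected to be close to the table above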