ranxhub

Run

RanxHub ID: msmarco-passage/trec-dl-2020/ranxhub/tct-colbert-v2-hn+
Version: 1.0
Description: TCT-ColBERT-V2-HN+ run reproduced using Pyserini.
Tags: Retrieval
Date: 3 February 2023
Run Authors: Elias Bassani
From Paper:
Paper Authors:
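
This run can be fetched programmatically with ranx. A minimal sketch, assuming a recent ranx version with RanxHub support (`Run.from_ranxhub`):

```python
from ranx import Run

# Fetch the run from RanxHub by the ID listed above
run = Run.from_ranxhub("msmarco-passage/trec-dl-2020/ranxhub/tct-colbert-v2-hn+")
```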

Model

Name: TCT-ColBERT-V2-HN+
Description: The authors employ a knowledge distillation approach based on the ColBERT late-interaction ranking model to reduce GPU memory consumption, enabling in-batch negatives during training.
Tags: Dense Retrieval · Knowledge Distillation · BERT
Paper: In-Batch Negatives for Knowledge Distillation with Tightly-Coupled Teachers for Dense Retrieval
Authors: Sheng-Chieh Lin · Jheng-Hong Yang · Jimmy Lin
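
Since the run was reproduced with Pyserini, a minimal dense-retrieval sketch follows. The prebuilt index name (msmarco-passage-tct_colbert-v2-hnp-bf) and encoder checkpoint (castorini/tct_colbert-v2-hnp-msmarco) are assumptions based on Pyserini's public resources, not taken from this card:

```python
from pyserini.search.faiss import FaissSearcher, TctColBertQueryEncoder

# Assumed encoder checkpoint and prebuilt index identifiers; check the
# Pyserini documentation for the exact names.
encoder = TctColBertQueryEncoder("castorini/tct_colbert-v2-hnp-msmarco")
searcher = FaissSearcher.from_prebuilt_index(
    "msmarco-passage-tct_colbert-v2-hnp-bf", encoder
)

# Retrieve the top 1000 passages for an example query
hits = searcher.search("how long is life cycle of flea", k=1000)
for hit in hits[:5]:
    print(hit.docid, round(hit.score, 4))
```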

Results

NDCG@10   MAP@1000   Recall@1000
0.6882    0.4754     0.8429
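
The figures above can be recomputed with ranx. A minimal sketch, assuming the ir_datasets id msmarco-passage/trec-dl-2020/judged for the judged TREC DL 2020 qrels:

```python
from ranx import Qrels, Run, evaluate

# Load the judged TREC DL 2020 qrels via ir_datasets (dataset id is an assumption)
qrels = Qrels.from_ir_datasets("msmarco-passage/trec-dl-2020/judged")
run = Run.from_ranxhub("msmarco-passage/trec-dl-2020/ranxhub/tct-colbert-v2-hn+")

# Compute the metrics reported in the Results section
print(evaluate(qrels, run, ["ndcg@10", "map@1000", "recall@1000"]))
```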