## Run
| RanxHub ID: | msmarco-passage/trec-dl-2019/ranxhub/tct-colbert-v2-hn+ |
|---|---|
| Version: | 1.0 |
| Description: | TCT-ColBERT-V2-HN+ run reproduced using Pyserini. |
| Tags: | Retrieval |
| Date: | 3 February 2023 |
| Run Authors: | Elias Bassani |
| From Paper: | |
| Paper Authors: | |
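
Since the run was reproduced with Pyserini, a retrieval sketch along the following lines should recover it. This is a minimal illustration, not the exact reproduction script: the prebuilt index name `msmarco-passage.tct_colbert-v2-hnp` and the encoder checkpoint `castorini/tct_colbert-v2-hnp-msmarco` are assumed from Pyserini's naming conventions, so verify them against Pyserini's prebuilt-index listing.

```python
from pyserini.search.faiss import FaissSearcher, TctColBertQueryEncoder

# Assumed identifiers (verify against Pyserini's documentation):
# - prebuilt index:  msmarco-passage.tct_colbert-v2-hnp
# - query encoder:   castorini/tct_colbert-v2-hnp-msmarco
encoder = TctColBertQueryEncoder("castorini/tct_colbert-v2-hnp-msmarco")
searcher = FaissSearcher.from_prebuilt_index(
    "msmarco-passage.tct_colbert-v2-hnp", encoder
)

# Retrieve the top 1000 passages for one TREC DL 2019 query.
hits = searcher.search("do goldfish grow", k=1000)
for hit in hits[:5]:
    print(f"{hit.docid} {hit.score:.4f}")
```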
## Model
| Name: | TCT-ColBERT-V2-HN+ |
|---|---|
| Description: | The authors employ a knowledge distillation approach based on the ColBERT late-interaction ranking model to reduce GPU memory consumption, enabling the use of in-batch negatives during training. |
| Tags: | Dense Retrieval · Knowledge Distillation · BERT |
| Paper: | In-Batch Negatives for Knowledge Distillation with Tightly-Coupled Teachers for Dense Retrieval |
| Authors: | Sheng-Chieh Lin · Jheng-Hong Yang · Jimmy Lin |
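
The description above can be made concrete with a toy loss function. The sketch below only illustrates knowledge distillation with in-batch negatives and is not the authors' code: `teacher_scores` stands in for the ColBERT teacher's late-interaction scores over the same in-batch query–passage pairs, and the `temperature` parameter is a hypothetical softening knob.

```python
import torch
import torch.nn.functional as F

def in_batch_distillation_loss(
    query_emb: torch.Tensor,       # (B, D) student query embeddings
    passage_emb: torch.Tensor,     # (B, D) student passage embeddings
    teacher_scores: torch.Tensor,  # (B, B) precomputed teacher scores (e.g., ColBERT)
    temperature: float = 1.0,      # hypothetical softening knob
) -> torch.Tensor:
    # Every query is scored against every passage in the batch, so each
    # query sees B - 1 in-batch negatives at no extra encoding cost.
    student_scores = query_emb @ passage_emb.T  # (B, B)
    teacher_dist = F.softmax(teacher_scores / temperature, dim=-1)
    student_log_dist = F.log_softmax(student_scores / temperature, dim=-1)
    # KL divergence pushes the student's score distribution toward the teacher's.
    return F.kl_div(student_log_dist, teacher_dist, reduction="batchmean")

# Toy usage with random tensors.
B, D = 8, 768
loss = in_batch_distillation_loss(
    torch.randn(B, D), torch.randn(B, D), torch.randn(B, B)
)
```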
## Results
| NDCG@10 | MAP@1000 | Recall@1000 |
|---|---|---|
| 0.7204 | 0.4469 | 0.8261 |
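
The scores above can, in principle, be recomputed with ranx by pulling this run from RanxHub and the TREC DL 2019 judgments from ir_datasets. A minimal sketch, assuming the `Run.from_ranxhub` and `Qrels.from_ir_datasets` helpers available in recent ranx releases and the `msmarco-passage/trec-dl-2019/judged` dataset id:

```python
from ranx import Qrels, Run, evaluate

# Assumed ids: the RanxHub id from the table above and the
# ir_datasets id for the judged TREC DL 2019 query subset.
run = Run.from_ranxhub("msmarco-passage/trec-dl-2019/ranxhub/tct-colbert-v2-hn+")
qrels = Qrels.from_ir_datasets("msmarco-passage/trec-dl-2019/judged")

scores = evaluate(qrels, run, ["ndcg@10", "map@1000", "recall@1000"])
print(scores)  # should roughly match the table above
```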