The MSR-Video to Text dataset with clean annotations

Bibliographic Details
Published in: Computer Vision and Image Understanding, Vol. 225, p. 103581
Main Authors: Chen, Haoran; Li, Jianmin; Frintrop, Simone; Hu, Xiaolin
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.12.2022
ISSN: 1077-3142; 1090-235X
DOI: 10.1016/j.cviu.2022.103581

Summary: Video captioning automatically generates short descriptions of video content, usually in the form of a single sentence. Many methods have been proposed for solving this task. A large dataset called MSR Video to Text (MSR-VTT) is often used as the benchmark for testing the performance of these methods. However, we found that the human annotations, i.e., the descriptions of the video contents in the dataset, are quite noisy: for example, there are many duplicate captions, and many captions contain grammatical problems. These problems may make it harder for video captioning models to learn the underlying patterns. We cleaned the MSR-VTT annotations by removing these problems, then tested several typical video captioning models on the cleaned dataset. Experimental results showed that data cleaning boosted the performance of the models as measured by popular quantitative metrics. We also recruited subjects to evaluate the output of a model trained on the original and on the cleaned dataset. This human behavior experiment demonstrated that, when trained on the cleaned dataset, the model generated captions that were more coherent and more relevant to the contents of the video clips.

Highlights:
• Identify duplicate captions and grammatical problems in the MSR-VTT dataset.
• Clean the dataset and compare model performance before and after data cleaning.
• Inspect the impact of each step in data cleaning.
• Human evaluation shows the positive impact of data cleaning.
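The abstract names duplicate captions as one of the noise sources removed during cleaning. As an illustration only, the minimal Python sketch below shows one simple way such per-video duplicates could be filtered; the data layout (a dict mapping video IDs to caption lists), the function name, and the whitespace/case normalization are assumptions for this sketch and do not reflect the authors' actual cleaning pipeline, which also addresses grammatical problems.

```python
# Illustrative sketch only: drop exact-duplicate captions per video.
# Assumes annotations are a dict {video_id: [caption, ...]}; this is
# not the paper's actual cleaning procedure.
from collections import OrderedDict

def dedup_captions(annotations):
    """Remove captions that are identical after simple normalization."""
    cleaned = {}
    for video_id, captions in annotations.items():
        seen = OrderedDict()
        for caption in captions:
            # Normalize whitespace and case so trivially identical
            # captions collapse to a single entry.
            key = " ".join(caption.lower().split())
            if key and key not in seen:
                seen[key] = caption.strip()
        cleaned[video_id] = list(seen.values())
    return cleaned

if __name__ == "__main__":
    demo = {"video0": ["a man is singing", "A man is singing ",
                       "a man sings a song"]}
    print(dedup_captions(demo))
    # {'video0': ['a man is singing', 'a man sings a song']}
```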