SoccerNet: A Scalable Dataset for Action Spotting in Soccer Videos

Silvio Giancola, Mohieddine Amine, Tarek Dghaily, Bernard Ghanem

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

In this paper, we introduce SoccerNet, a benchmark for action spotting in soccer videos. The dataset is composed of 500 complete soccer games from six main European leagues, covering three seasons from 2014 to 2017 and a total duration of 764 hours. A total of 6,637 temporal annotations are automatically parsed from online match reports at a one-minute resolution for three main classes of events (Goal, Yellow/Red Card, and Substitution). As such, the dataset is easily scalable. These annotations are manually refined to a one-second resolution by anchoring them at a single timestamp following well-defined soccer rules. With an average of one event every 6.9 minutes, this dataset focuses on the problem of localizing very sparse events within long videos. We define the task of spotting as finding the anchors of soccer events in a video. Making use of recent developments in the realm of generic action recognition and detection in video, we provide strong baselines for detecting soccer events. We show that our best model for classifying temporal segments of length one minute reaches a mean Average Precision (mAP) of 67.8%. For the spotting task, our baseline reaches an Average-mAP of 49.7% for tolerances δ ranging from 5 to 60 seconds. Our dataset and models are available at https://silviogiancola.github.io/SoccerNet.
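The abstract defines spotting as localizing a single anchor timestamp per event and scores it by averaging a detection metric over tolerances δ from 5 to 60 seconds. The Python sketch below illustrates only the core idea of tolerance-based matching; the function names and the recall-style score averaged over tolerances are illustrative simplifications, not the authors' exact Average-mAP evaluation code.

def match_spots(predictions, ground_truth, delta):
    """Greedy one-to-one matching of predicted spots to ground-truth anchors.

    predictions:  predicted timestamps (seconds), ordered by confidence (desc).
    ground_truth: ground-truth anchor timestamps (seconds) for one class.
    delta:        tolerance in seconds.
    Returns the number of true positives under tolerance delta.
    """
    unmatched = list(ground_truth)
    tp = 0
    for p in predictions:
        # Find the closest still-unmatched anchor within the tolerance window.
        best, best_dist = None, None
        for g in unmatched:
            d = abs(p - g)
            if d <= delta and (best_dist is None or d < best_dist):
                best, best_dist = g, d
        if best is not None:
            unmatched.remove(best)
            tp += 1
    return tp


def average_score_over_tolerances(predictions, ground_truth,
                                  tolerances=range(5, 65, 5)):
    """Average a simple recall-style score over tolerances of 5 to 60 seconds,
    in the spirit of the Average-mAP reported in the paper (simplified)."""
    scores = []
    for delta in tolerances:
        tp = match_spots(predictions, ground_truth, delta)
        scores.append(tp / max(len(ground_truth), 1))
    return sum(scores) / len(scores)


if __name__ == "__main__":
    preds = [62.0, 300.5, 1210.0]   # hypothetical predicted spots (seconds)
    gts = [60.0, 305.0, 2000.0]     # hypothetical ground-truth anchors (seconds)
    print(average_score_over_tolerances(preds, gts))

Tighter tolerances penalize spots that land far from the anchor, which is why averaging over δ rewards temporally precise predictions.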
Original language: English (US)
Title of host publication: 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 1792-1802
Number of pages: 11
ISBN (Print): 9781538661000
DOIs
State: Published - Dec 18 2018
