Key result
98.6% classification accuracy, 4 ms per-frame inference, and 34.04 ms end-to-end pipeline latency.
Why it matters
- Live sports production needs instant understanding of game situations for automated broadcast decisions.
- Manual control rooms do not scale well for lower-tier matches with limited production teams.
Approach
- Trained a deep-learning classifier over labeled handball frames covering key tactical situations.
- Integrated the model into a real-time pipeline designed for production timing constraints.
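The real-time constraint can be made concrete with a minimal sketch of such a frame-classification loop. All names here are hypothetical and a stub stands in for the trained network; the point is only the per-frame latency check against a 25 fps budget (40 ms per frame), not the paper's actual model or pipeline.

```python
import time
import numpy as np

FRAME_BUDGET_MS = 40.0  # 25 fps => at most 40 ms per frame

# Hypothetical game-state labels; the paper's own label set may differ.
STATES = ["open_play", "penalty", "time_out", "goal_throw"]

def classify_frame(frame: np.ndarray) -> str:
    """Stub standing in for the trained classifier.

    Uses an arbitrary deterministic rule on the mean pixel value so the
    sketch is self-contained and runnable without a model file.
    """
    return STATES[int(frame.mean()) % len(STATES)]

def process_stream(frames):
    """Classify each frame and record whether it met the real-time budget."""
    results = []
    for frame in frames:
        t0 = time.perf_counter()
        state = classify_frame(frame)
        latency_ms = (time.perf_counter() - t0) * 1000.0
        results.append((state, latency_ms, latency_ms <= FRAME_BUDGET_MS))
    return results

# Synthetic 720p frames in place of a live video feed.
frames = [np.full((720, 1280, 3), i, dtype=np.uint8) for i in range(4)]
for state, ms, ok in process_stream(frames):
    print(f"{state}: {ms:.2f} ms", "(within budget)" if ok else "(over budget)")
```

In a production pipeline the stub would be replaced by GPU-accelerated inference, and decode/preprocess time would count against the same 40 ms budget, which is why the paper reports end-to-end latency (34.04 ms) and not just inference time.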
Results
- Reached near-production latency while retaining high recognition accuracy.
- Demonstrated the feasibility of AI-assisted autonomous match direction.
Abstract
During the broadcast of sporting events, certain situations occur, such as a penalty or a time-out, that require a specific action. In traditional broadcasting, many people are involved in making decisions based on what is happening at any given moment. To broadcast matches with high quality and fully automatically, it is necessary to classify the important situations and then make decisions based on them. This paper presents a deep-learning solution able to classify the main states of a handball match. The model was trained on 127,600 images from 13 local team matches. On a test set of 118,129 images from 7 other matches, it classifies these situations with an accuracy of 98.6% in only 4 milliseconds, allowing the state of the game to be analyzed in real time. The full pipeline takes only 34.04 milliseconds using GPU acceleration, processing more than 25 frames per second.
B. Cabado, B. Guijarro-Berdiñas and E. J. Padrón, ‘Real-time classification of handball game situations’, 2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI), Macao, China, 2022, pp. 686-691, doi: 10.1109/ICTAI56018.2022.00106.