Exploration of Applying Pose Estimation Techniques in Table Tennis

Bibliographic Details
Published in: Applied Sciences, Vol. 13, No. 3, p. 1896
Main Authors: Wu, Chih-Hung; Wu, Te-Cheng; Lin, Wen-Bin
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.02.2023
ISSN: 2076-3417
DOI: 10.3390/app13031896

Summary: Computer vision pose estimation, a recently developed artificial intelligence (AI) technique, is an emerging technology with potential advantages for the sports industry, such as high efficiency and contactless detection, that can improve competitive performance. The literature currently lacks an integrated, comprehensive discussion of the applications and limitations of pose estimation. The purpose of this study was to apply AI pose estimation techniques and to discuss their concepts, possible applications, and limitations in table tennis. This study implemented the OpenPose algorithm on real-world video of a table tennis game. The results show that the algorithm performs well in estimating table tennis players' poses from video in a graphics processing unit (GPU)-accelerated environment. This study proposes an innovative two-stage AI pose estimation method that effectively addresses the current difficulties in applying AI to table tennis players' pose estimation. Finally, this study offers recommendations for the sports industry, discusses the benefits of pose estimation in table tennis from several perspectives (training vs. tactics), and outlines its limitations.