Analysis of Aerobics Auxiliary Training Based on Deep Learning

Bibliographic Details
Published in: Scientific Programming, Vol. 2022, pp. 1-7
Main Author: Li, Can
Format: Journal Article
Language: English
Published: New York: Hindawi / John Wiley & Sons, Inc., 22.03.2022
ISSN: 1058-9244, 1875-919X
DOI: 10.1155/2022/9269988

Summary: With the deepening integration of information technology into subject teaching, applying modern information technology to aerobics instruction has become an inevitable trend. In this paper, the N-best algorithm is applied to aerobics videos and real-time camera footage so that human posture parameters can be estimated from single-frame images. The relative position and motion direction of each body part are used to describe aerobics movements, and the Laplacian score method reduces the dimensionality of the data, yielding discriminative human-motion feature vectors with a strong local topological structure. Finally, the iterative self-organizing data analysis technique (ISODATA algorithm) dynamically determines the keyframes. In aerobics-video keyframe extraction experiments, the ST-FMP model improves recognition accuracy on nondeterministic body parts by about 15 percentage points over the flexible mixture-of-parts (FMP) model and achieves 81% keyframe extraction accuracy, outperforming the KFE and motion-block keyframe algorithms. The proposed algorithm is sensitive to human motion features and human pose and is suitable for motion-video annotation and review.
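
The pipeline the summary describes (per-frame pose features, Laplacian-score dimensionality reduction, then clustering to pick keyframes) can be sketched briefly. Below is a minimal NumPy implementation of the Laplacian score criterion (He et al., 2005), which the summary's "Laplacian score method" appears to refer to; the function name and the parameters k (neighbourhood size) and t (heat-kernel width) are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist

def laplacian_score(X, k=5, t=1.0):
    """Score each column (feature) of X; lower scores mark features
    that better preserve the local manifold structure of the frames.

    X -- (n_frames, n_features) matrix of pose descriptors, e.g.
         relative joint positions and motion directions per frame.
    """
    n, m = X.shape
    # Heat-kernel affinity restricted to each frame's k nearest neighbours.
    dist2 = cdist(X, X, "sqeuclidean")
    S = np.exp(-dist2 / t)
    far = np.argsort(dist2, axis=1)[:, k + 1:]   # drop all but self + k NN
    np.put_along_axis(S, far, 0.0, axis=1)
    S = np.maximum(S, S.T)                        # symmetrise the graph
    d = S.sum(axis=1)                             # degree vector
    L = np.diag(d) - S                            # graph Laplacian
    scores = np.empty(m)
    for r in range(m):
        f = X[:, r]
        f_t = f - (f @ d) / d.sum()               # remove D-weighted mean
        den = (f_t ** 2) @ d                      # f_t^T D f_t
        scores[r] = (f_t @ L @ f_t) / den if den > 1e-12 else np.inf
    return scores
```

For the keyframe step, the paper uses ISODATA so that the number of keyframes is determined dynamically. ISODATA has no standard implementation in common Python libraries, so the sketch below substitutes k-means with a fixed cluster count (a simplification, not the paper's method) and takes the frame nearest each cluster centre as a keyframe.

```python
import numpy as np
from sklearn.cluster import KMeans

def pick_keyframes(reduced, n_clusters=8):
    """Cluster the reduced per-frame features and return, for each
    cluster, the index of the frame closest to its centre."""
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(reduced)
    keyframes = []
    for c in range(n_clusters):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(reduced[members] - km.cluster_centers_[c], axis=1)
        keyframes.append(int(members[np.argmin(dists)]))
    return sorted(keyframes)

# Hypothetical usage: frame_features is an (n_frames, n_features) array.
# scores = laplacian_score(frame_features, k=7)
# reduced = frame_features[:, np.argsort(scores)[:20]]
# print(pick_keyframes(reduced))
```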