Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation

Bibliographic Details
Published in: Ergonomics, Vol. 59, No. 11, pp. 1526-1539
Main Authors: Tsao, Liuxing; Ma, Liang
Format: Journal Article
Language: English
Published: England: Taylor & Francis, 01.11.2016
ISSN: 0014-0139, 1366-5847
DOI: 10.1080/00140139.2016.1151554

Summary: Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation can be limited by the percentile-based approach used to construct the digital human model. To enhance the accuracy of the size and shape of the models, we propose a framework for generating digital human models from three-dimensional (3D) anthropometric data. 3D scan data of specific subjects' hands were segmented based on the estimated centres of rotation, and the segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were verified, validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be used to guide product design and workspace arrangement.

Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.
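To illustrate the kind of pipeline the summary describes, below is a minimal forward-kinematics sketch in Python. It is not the authors' code: the segment representation, the Rodrigues rotation, and the joint names, axes, and angles are illustrative assumptions. It shows how scan vertices, once segmented per phalanx, could be re-posed by composing rotations about estimated centres of rotation from the most proximal joint outward.

import numpy as np

def rotation_about_axis(axis, angle):
    """Rodrigues' formula: 3x3 rotation by `angle` radians about unit `axis`."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def pose_finger(segments, joint_centres, joint_axes, joint_angles):
    """Pose one finger chain by composing joint rotations proximal to distal.

    segments      -- list of (n_i, 3) rest-pose vertex arrays, one per phalanx,
                     ordered proximal to distal (hypothetical segmentation output)
    joint_centres -- (3,) estimated centre of rotation for each joint, rest pose
    joint_axes    -- (3,) flexion axis for each joint (assumed known)
    joint_angles  -- flexion angle in radians for each joint
    Returns the posed vertex arrays.
    """
    posed = []
    T = np.eye(4)  # accumulated transform inherited from proximal joints
    for verts, c, a, theta in zip(segments, joint_centres, joint_axes, joint_angles):
        R = rotation_about_axis(a, theta)
        # Rotation about the rest-pose joint centre c: x -> R (x - c) + c,
        # written as a homogeneous 4x4 transform.
        local = np.eye(4)
        local[:3, :3] = R
        local[:3, 3] = c - R @ c
        T = T @ local  # distal segments inherit all proximal rotations
        homo = np.hstack([verts, np.ones((len(verts), 1))])
        posed.append((homo @ T.T)[:, :3])
    return posed

Because each local transform is expressed about a rest-pose joint centre and accumulated into T, a distal phalanx is carried along by every proximal flexion before its own, which is the proximal-to-distal coupling forward kinematics requires; calling pose_finger with, say, three phalanx vertex arrays and three joint angles would yield one flexed functional posture of that finger.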