Augmented reality display based on user behavior


Bibliographic Details
Published in: Computer Standards & Interfaces, Vol. 55, pp. 171-181
Main Authors: Tsai, Chung-Hsien; Huang, Jiung-Yao
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 01.01.2018
ISSN: 0920-5489, 1872-7018
DOI: 10.1016/j.csi.2017.08.003


More Information
Summary:
• The study presents a user-behavior-driven augmented content display approach.
• A user behavior perception algorithm that infers the current state of the user by cross-checking his/her past behavior is presented.
• Five augmented content display patterns corresponding to the modeled user behavior states are designed accordingly.
• The experimental results show that iDisplay can accurately infer user states and manage augmented content display efficiently.

The development and commercialization of smart glasses in recent years have made it practical to explore one's surroundings with mobile augmented reality (MAR) browsers anytime and anywhere. However, users often suffer from cognitive overload and inconvenient interaction when operating MAR browsers on smart glasses, owing to the constraints of screen resolution and size. To overcome these problems, this paper presents a user-behavior-driven augmented content display approach called iDisplay. First, user behaviors while using the smart glasses were modeled. A user behavior perception algorithm was then developed that infers the current state of the user by cross-checking his/her past behavior against feature data extracted from the built-in sensors of the smart glasses. Five augmented content display patterns corresponding to the modeled user behavior states were designed accordingly. To verify that iDisplay can adaptively manage the smart glasses display based on the perceived user states, a prototype system was built and a series of experiments was conducted. The experimental results show that iDisplay can accurately infer user states and manage the augmented content display accordingly. A user study also shows that iDisplay successfully reduces the user's cognitive load and split attention when searching for specific point-of-interest information while moving. Furthermore, all subjects reported that iDisplay caused less dizziness during the experiments than the native overview + detail augmented reality interface.
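This record does not describe how the perception algorithm or the five display patterns are actually implemented; the Python sketch below is purely illustrative. It assumes hypothetical state names (standing, walking, turning_head), simple thresholding of sensor magnitudes, a majority vote over recent frames as a stand-in for the "cross-checking past behavior" step, and an invented state-to-pattern mapping, none of which come from the paper.

```python
from collections import Counter, deque

# Hypothetical mapping from inferred user states to display patterns.
# The paper's actual five patterns and state set are not given in this record.
STATE_TO_PATTERN = {
    "standing": "full_detail",       # assumed: show rich POI content
    "walking": "minimal_overlay",    # assumed: keep the view uncluttered
    "turning_head": "suppress",      # assumed: hide content during head turns
}

class UserStatePerceiver:
    """Toy perception loop: smooth instantaneous sensor-based guesses
    against recent history (a stand-in for cross-checking past behavior)."""

    def __init__(self, history_len=10):
        self.history = deque(maxlen=history_len)

    def classify_frame(self, accel_mag, gyro_mag):
        # Thresholds are invented for illustration only.
        if gyro_mag > 1.5:
            return "turning_head"
        if accel_mag > 1.2:
            return "walking"
        return "standing"

    def perceive(self, accel_mag, gyro_mag):
        guess = self.classify_frame(accel_mag, gyro_mag)
        self.history.append(guess)
        # Majority vote over recent frames suppresses spurious transitions.
        state, _ = Counter(self.history).most_common(1)[0]
        return state, STATE_TO_PATTERN[state]

perceiver = UserStatePerceiver()
for accel, gyro in [(0.3, 0.1), (1.4, 0.2), (1.5, 0.1), (0.4, 2.0)]:
    print(perceiver.perceive(accel, gyro))
```

In a scheme like this the history window trades responsiveness for stability: a longer window suppresses spurious state flips at the cost of reacting more slowly to genuine transitions.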