Research Papers

Augmented Affective-Cognition for Usability Study of In-Vehicle System User Interface

Author and Article Information
Feng Zhou

Mem. ASME
The G.W. Woodruff School
of Mechanical Engineering,
Georgia Institute of Technology,
813 Ferst Drive,
Atlanta, GA 30332-0405
e-mail: fzhou35@gatech.edu

Yangjian Ji

Industrial Engineering Centre,
Department of Mechanical Engineering,
Zhejiang University,
38 Zheda Rd, Hangzhou,
Zhejiang 310027, China
e-mail: mejyj@zju.edu.cn

Roger J. Jiao

The G.W. Woodruff School
of Mechanical Engineering,
Georgia Institute of Technology,
813 Ferst Drive,
Atlanta, GA 30332-0405
e-mail: rjiao@gatech.edu

Corresponding author.

Contributed by the Computers and Information in Engineering Division of ASME for publication in the JOURNAL OF COMPUTING AND INFORMATION SCIENCE IN ENGINEERING. Manuscript received July 19, 2013; final manuscript received December 1, 2013; published online February 12, 2014. Editor: Bahram Ravani.

J. Comput. Inf. Sci. Eng. 14(2), 021001 (Feb. 12, 2014) (11 pages); Paper No: JCISE-13-1128; doi: 10.1115/1.4026222. History: Received July 19, 2013; Revised December 1, 2013.

Usability of in-vehicle systems has become increasingly important for ease of operation and driving safety. The user interface (UI) of in-vehicle systems is a critical focus of usability studies. This paper studies how advanced computational, physiology-based, and behavior-based tools and methodologies can determine the affective/emotional states and behavior of an individual in real time, and in turn how human-vehicle interaction can be adapted to meet users' cognitive needs based on this real-time assessment. Specifically, we set up a set of physiological sensors that collect EEG, facial EMG, skin conductance response (SCR), and respiration data, together with motion-sensing and tracking equipment that captures eye movement and the objects with which the user is interacting. All hardware and software components are integrated into an augmented sensor platform that performs as "one coherent system," enabling multimodal data processing and information inference for context-aware analysis of emotional states and cognitive behavior based on a rough set inference engine. Meanwhile, subjective data are also recorded for comparison. A usability study of an in-vehicle system UI demonstrates the potential of the proposed methodology.
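The abstract names a rough set inference engine as the mechanism for mapping multimodal physiological features to emotional states. The Python sketch below is a minimal illustration of the rough-set idea only, not the authors' implementation: it computes lower and upper approximations of a decision class over a tiny, hypothetical decision table whose attribute names (scr_amp, eeg_alpha, resp_rate) and state labels are invented for illustration.

from collections import defaultdict

# Hypothetical decision table: discretized multimodal features -> emotional state.
table = [
    ({"scr_amp": "high", "eeg_alpha": "low",  "resp_rate": "fast"}, "stressed"),
    ({"scr_amp": "high", "eeg_alpha": "low",  "resp_rate": "slow"}, "stressed"),
    ({"scr_amp": "low",  "eeg_alpha": "high", "resp_rate": "slow"}, "relaxed"),
    ({"scr_amp": "high", "eeg_alpha": "low",  "resp_rate": "fast"}, "relaxed"),
]

def indiscernibility_classes(rows, attrs):
    """Group row indices that are indistinguishable on the chosen attributes."""
    classes = defaultdict(list)
    for i, (cond, _) in enumerate(rows):
        classes[tuple(cond[a] for a in attrs)].append(i)
    return classes.values()

def lower_upper(rows, attrs, decision):
    """Rough-set lower/upper approximations of the rows labeled `decision`."""
    target = {i for i, (_, d) in enumerate(rows) if d == decision}
    lower, upper = set(), set()
    for cls in indiscernibility_classes(rows, attrs):
        cls = set(cls)
        if cls <= target:
            lower |= cls   # certainly `decision`: all matching rows agree
        if cls & target:
            upper |= cls   # possibly `decision`: at least one matching row agrees
    return lower, upper

attrs = ["scr_amp", "eeg_alpha", "resp_rate"]
lo, up = lower_upper(table, attrs, "stressed")
print("lower:", lo)   # {1}: the only observation certainly implying "stressed"
print("upper:", up)   # {0, 1, 3}: observations possibly implying "stressed"

In this sketch, rules derived from the lower approximation classify an emotional state with certainty, while the boundary between lower and upper approximations captures inconsistent multimodal observations, which is the kind of ambiguity a rough-set engine is designed to tolerate.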

Copyright © 2014 by ASME


Figures

Fig. 1: System architecture of augmented affective cognition for usability studies of in-vehicle system UIs

Fig. 2: Sensor platform for usability studies of in-vehicle system UIs

Fig. 3: Illustration of SCR features

Fig. 4: Boxplot of task statistics

Fig. 5: Usability issues in the navigation system UI

Fig. 6: Usability issues in the radio and music player UI

Fig. 7: Usability issues in the air conditioner UI

Fig. 8: Usability issues in the dashboard UI
