Research Papers

Evaluation of System-Directed Multimodal Systems for Vehicle Inspection

Author and Article Information
Lauren Cairco Dukes

Graduate Research Assistant
Virtual Environments Group,
School of Computing,
Clemson University,
Clemson, SC 29632
e-mail: LCairco@clemson.edu

Amy Ulinski Banic

Department of Computer Science,
University of Wyoming,
Laramie, WY 82071
e-mail: abanic@cs.uwyo.edu

Jerome McClendon

Graduate Research Assistant
e-mail: jmcclen@clemson.edu

Toni Bloodworth Pence

Graduate Research Assistant
e-mail: tbloodw@clemson.edu
Virtual Environments Group,
School of Computing,
Clemson University,
Clemson, SC 29632

James Mathieson

Graduate Research Assistant
e-mail: jmathie@clemson.edu

Joshua Summers

Associate Professor
e-mail: jsummer@clemson.edu
Department of Mechanical Engineering,
Clemson University,
Clemson, SC 29632

Larry F. Hodges

School of Computing,
Clemson University,
Clemson, SC 29632
e-mail: lfh@clemson.edu

Contributed by the Computers and Information Division of ASME for publication in the Journal of Computing and Information Science in Engineering. Manuscript received February 13, 2012; final manuscript received July 30, 2012; published online January 7, 2013. Editor: Bahram Ravani.

J. Comput. Inf. Sci. Eng 13(1), 011002 (Jan 07, 2013) (9 pages) Paper No: JCISE-12-1026; doi: 10.1115/1.4023004 History: Received February 13, 2012; Revised July 30, 2012

Multimodal systems have previously been used as aids to improve quality and safety inspection in various domains, though few studies have evaluated these systems for accuracy and user comfort. Our research combines a software interface designed for high usability with multimodal hardware configurations and evaluates the resulting systems to determine their user performance benefits and to gather user acceptance data. We present two multimodal systems that use a novel system-directed interface to aid in inspecting vehicles along the assembly line: (1) a wearable monocular display with speech input and audio output and (2) a large screen display with speech input and audio output. We conducted two evaluations: (a) an experimental evaluation with novice users, yielding accuracy, timing, user preference, and other performance results, and (b) an expert-based usability evaluation conducted on and off the assembly line, providing insight on user acceptance, preferences, and performance potential in the production environment. We also compared these systems to the technology currently used in the production environment: a handheld display without speech input/output. Our results show that for visual and tactile tasks, the benefits of system-directed interfaces are best realized when they are used with multimodal systems that reduce visual and tactile interaction per item and instead deliver system-directed information on the audio channel. Interface designers who combine system-directed interfaces with multimodal systems can expect faster and more efficient user performance when the delivery channel is different from the channels necessary for task completion.

Copyright © 2013 by ASME




Fig. 1

Display for multimodal configurations using an abstracted task list. The display is rendered on one of three hardware configurations and is paired with audio, touch, and/or speech recognition.

Fig. 2

Visual display for the handheld configuration using an abstracted task list. Left: the handheld GUI when the application starts. Right: the GUI after several items have been checked off. The most recently checked item is green because it passed inspection, and the progress bar is color coded according to the inspection results of all items checked so far.

Fig. 3

(a) Large screen configuration and (b) one-handed touch for an abstracted task list using the handheld device

Fig. 4

(a) and (b) Monocular display hardware configuration; (c) example of a two-handed touch using the monocular display configuration

Fig. 5

(a) Vehicle body used for experimental evaluation. (b) Example of shapes used for the abstracted inspection task. This item would be called "Small Blue Square," would require a one-handed touch for the inspection action since the shape is a pentagon, and would pass inspection since an odd number of dots are shaded.



