Research Papers

Object Tracking With a Range Camera for Augmented Reality Assembly Assistance

Author and Article Information
Rafael Radkowski

Assistant Professor
Department of Mechanical Engineering,
Iowa State University,
Ames, IA 50011
e-mail: rafael@iastate.edu

Contributed by the Computers and Information Division of ASME for publication in the JOURNAL OF COMPUTING AND INFORMATION SCIENCE IN ENGINEERING. Manuscript received January 2, 2015; final manuscript received November 5, 2015; published online January 13, 2016. Editor: Bahram Ravani.

J. Comput. Inf. Sci. Eng 16(1), 011004 (Jan 13, 2016) (8 pages) Paper No: JCISE-15-1002; doi: 10.1115/1.4031981 History: Received January 02, 2015; Revised November 05, 2015

This paper introduces a 3D object tracking method for an augmented reality (AR) assembly assistance application. The tracking method relies on point clouds; it uses 3D feature descriptors and point cloud matching with the iterative closest point (ICP) algorithm. The feature descriptors identify an object in a point cloud; ICP aligns a reference object with this point cloud. The challenge is to achieve high fidelity while maintaining camera frame rates. The sampling densities of the point cloud and the reference object are key factors in meeting this challenge. In this research, three point-sampling methods and two point-cloud search algorithms were compared to assess their fidelity when tracking typical mechanical engineering products. The results indicate that uniform sampling maintains the best fidelity at camera frame rates.
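As a rough illustration of the pipeline the abstract describes — decimating the clouds by uniform sampling, then aligning a reference object with point-to-point ICP — the following NumPy sketch shows a minimal version. The helper names, the brute-force nearest-neighbour search, and the SVD-based rigid-transform step are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def uniform_sample(points, step):
    """Uniformly decimate a point cloud by keeping every step-th point."""
    return points[::step]

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(reference, scene, iterations=30, tol=1e-6):
    """Align the reference cloud to the scene cloud; return (R, t, final RMS error)."""
    src = reference.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_rms = np.inf
    for _ in range(iterations):
        # brute-force nearest-neighbour correspondences (a k-d tree would be faster)
        d = np.linalg.norm(src[:, None, :] - scene[None, :, :], axis=2)
        nn = scene[d.argmin(axis=1)]
        R, t = best_rigid_transform(src, nn)
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        rms = np.sqrt(((src - nn) ** 2).sum(axis=1).mean())
        if abs(prev_rms - rms) < tol:  # stop when the error no longer improves
            break
        prev_rms = rms
    return R_total, t_total, rms
```

The sampling step directly trades fidelity against speed: the nearest-neighbour search dominates the cost, so decimating either cloud shrinks the per-frame work, which is the trade-off the paper evaluates.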

Copyright © 2016 by ASME



Fig. 1

Overview of the tracking process

Fig. 2

(a) A typical part that needs to be tracked in the addressed research area. (b) The associated 3D model that is used as reference object for tracking.

Fig. 3

(a) The input depth image Rt and (b) the resulting vertex map Vt

Fig. 4

Camera pose estimation: an initial position for reference model Pi and the result from ICP ([Rt]) are used to calculate the camera pose
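The relationship in this caption — an initial reference-model pose Pi refined by the ICP result, then inverted to yield the camera pose — can be sketched with 4×4 homogeneous matrices. This is a hypothetical composition; the paper's exact conventions for Pi and [Rt] are not given in this excerpt:

```python
import numpy as np

def camera_pose(P_init, T_icp):
    """Chain the initial model pose with the ICP refinement to get the
    model-to-camera transform, then invert it to obtain the camera pose
    (assumed convention; both inputs are 4x4 homogeneous matrices)."""
    model_to_camera = T_icp @ P_init
    return np.linalg.inv(model_to_camera)
```

For example, if the model sits 1 m in front of the camera and ICP reports the identity, the camera pose is the inverse translation.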

Fig. 5

Iterations to minimize the distance between a reference object Pi and an environment point cloud X

Fig. 6

The AR application with (a) the reference object overlaid over the physical part and (b) with a virtual object superimposed on the physical part

Fig. 7

The test setup to record the datasets

Fig. 8

Screenshot of the simulation tool icptracksim

Fig. 9

Number of iterations from different camera positions

Fig. 11

Comparison of RMS over number of iterations

Fig. 12

Sample RMS error of the second experiment



