Research Papers

Immersive Distributed Design Through Real-Time Capture, Translation, and Rendering of Three-Dimensional Mesh Data

Author and Article Information
Kevin Lesniak

Computer Science and Engineering,
The Pennsylvania State University,
University Park, PA 16802
e-mail: kal5544@psu.edu

Janis Terpenny

Industrial and Manufacturing Engineering,
The Pennsylvania State University,
University Park, PA 16802
e-mail: jpt5311@engr.psu.edu

Conrad S. Tucker

Engineering Design,
Industrial and Manufacturing Engineering,
The Pennsylvania State University,
University Park, PA 16802
e-mail: ctucker4@psu.edu

Chimay Anumba

Design, Construction and Planning,
University of Florida,
Gainesville, FL 32611
e-mail: anumba@ufl.edu

Sven G. Bilén

Engineering Design,
Electrical Engineering,
The Pennsylvania State University,
University Park, PA 16802
e-mail: sbilen@psu.edu

Contributed by the Computers and Information Division of ASME for publication in the JOURNAL OF COMPUTING AND INFORMATION SCIENCE IN ENGINEERING. Manuscript received September 15, 2016; final manuscript received October 10, 2016; published online February 16, 2017. Editor: Bahram Ravani.

J. Comput. Inf. Sci. Eng. 17(3), 031010 (Feb 16, 2017) (9 pages). Paper No: JCISE-16-2084; doi: 10.1115/1.4035001. History: Received September 15, 2016; Revised October 10, 2016.

With design teams becoming more distributed, the sharing and interpreting of complex data about design concepts/prototypes and environments have become increasingly challenging. The size and quality of the data that can be captured and shared directly affect the ability of those receiving the data to collaborate and provide meaningful feedback. To mitigate these challenges, the authors of this work propose the real-time translation of physical objects into an immersive virtual reality environment using readily available red, green, blue, and depth (RGB-D) sensing systems and standard networking connections. The emergence of commercial, off-the-shelf RGB-D sensing systems, such as the Microsoft Kinect, has enabled the rapid three-dimensional (3D) reconstruction of physical environments. The authors present a method that employs 3D mesh reconstruction algorithms and real-time rendering techniques to capture physical objects in the real world and represent their 3D reconstructions in an immersive virtual reality environment with which the user can then interact. These features allow distributed design teams to share and interpret complex 3D data in a natural manner. The method reduces the processing requirements of the data capture system while keeping it portable, and it provides an immersive environment in which designers can view and interpret the data remotely. A case study involving a commodity RGB-D sensor and multiple computers connected through standard TCP internet connections demonstrates the viability of the proposed method.
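
The abstract describes streaming the reconstructed, subdivided mesh data from the capture machine to remote renderers over standard TCP connections, but no wire format is reproduced on this page. The following minimal sketch (Python, standard library only) shows one plausible scheme under stated assumptions: each mesh chunk is serialized with a small header and sent as a length-prefixed frame, since TCP delivers an ordered byte stream rather than discrete messages. The host name, port, and chunk layout are illustrative assumptions, not the authors' protocol.

    import socket
    import struct

    def pack_chunk(chunk_id, vertices, triangles):
        # Header: chunk id, vertex count, triangle count (little-endian uint32).
        header = struct.pack("<III", chunk_id, len(vertices), len(triangles))
        # Flatten (x, y, z) vertex tuples and (a, b, c) index triples.
        vert_bytes = struct.pack("<%df" % (len(vertices) * 3),
                                 *(c for v in vertices for c in v))
        tri_bytes = struct.pack("<%dI" % (len(triangles) * 3),
                                *(i for t in triangles for i in t))
        payload = header + vert_bytes + tri_bytes
        # Length prefix lets the receiver split frames out of the byte stream.
        return struct.pack("<I", len(payload)) + payload

    def _recv_exact(sock, n):
        # TCP is a stream, not a message protocol; loop until n bytes arrive.
        buf = b""
        while len(buf) < n:
            part = sock.recv(n - len(buf))
            if not part:
                raise ConnectionError("socket closed mid-frame")
            buf += part
        return buf

    def recv_chunk(sock):
        # Receiver side: read the length prefix, then exactly one frame.
        size, = struct.unpack("<I", _recv_exact(sock, 4))
        return _recv_exact(sock, size)

    if __name__ == "__main__":
        # Hypothetical usage: stream a single-triangle chunk to a render host
        # (host name and port are placeholders, not from the paper).
        verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
        tris = [(0, 1, 2)]
        with socket.create_connection(("render-host.example", 9000)) as sock:
            sock.sendall(pack_chunk(0, verts, tris))

Without some framing convention such as this length prefix, the rendering side cannot tell where one mesh chunk ends and the next begins, which matters when chunks arrive continuously during real-time capture.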

Copyright © 2017 by ASME

Figures

Fig. 1: Head-mounted virtual reality display
Fig. 2: Flow diagram of the proposed method
Fig. 3: Mesh constructed from sensor data
Fig. 4: Subdivided meshes from the reconstructed mesh
Fig. 5: Distributed components of the method
Fig. 6: Real-time mesh reconstruction in the Unity VR environment
