Research Papers

Immersive Distributed Design Through Real-Time Capture, Translation, and Rendering of Three-Dimensional Mesh Data

Author and Article Information
Kevin Lesniak

Computer Science and Engineering,
The Pennsylvania State University,
University Park, PA 16802
e-mail: kal5544@psu.edu

Janis Terpenny

Industrial and Manufacturing Engineering,
The Pennsylvania State University,
University Park, PA 16802
e-mail: jpt5311@engr.psu.edu

Conrad S. Tucker

Engineering Design,
Industrial and Manufacturing Engineering,
The Pennsylvania State University,
University Park, PA 16802
e-mail: ctucker4@psu.edu

Chimay Anumba

Design, Construction and Planning,
University of Florida,
Gainesville, FL 32611
e-mail: anumba@ufl.edu

Sven G. Bilén

Engineering Design,
Electrical Engineering,
The Pennsylvania State University,
University Park, PA 16802
e-mail: sbilen@psu.edu

Contributed by the Computers and Information Division of ASME for publication in the JOURNAL OF COMPUTING AND INFORMATION SCIENCE IN ENGINEERING. Manuscript received September 15, 2016; final manuscript received October 10, 2016; published online February 16, 2017. Editor: Bahram Ravani.

J. Comput. Inf. Sci. Eng. 17(3), 031010 (Feb. 16, 2017) (9 pages); Paper No. JCISE-16-2084; doi: 10.1115/1.4035001

With design teams becoming more distributed, the sharing and interpreting of complex data about design concepts/prototypes and environments have become increasingly challenging. The size and quality of data that can be captured and shared directly affects the ability of receivers of that data to collaborate and provide meaningful feedback. To mitigate these challenges, the authors of this work propose the real-time translation of physical objects into an immersive virtual reality environment using readily available red, green, blue, and depth (RGB-D) sensing systems and standard networking connections. The emergence of commercial, off-the-shelf RGB-D sensing systems, such as the Microsoft Kinect, has enabled the rapid three-dimensional (3D) reconstruction of physical environments. The authors present a method that employs 3D mesh reconstruction algorithms and real-time rendering techniques to capture physical objects in the real world and represent their 3D reconstruction in an immersive virtual reality environment with which the user can then interact. Providing these features allows distributed design teams to share and interpret complex 3D data in a natural manner. The method reduces the processing requirements of the data capture system while enabling it to be portable. The method also provides an immersive environment in which designers can view and interpret the data remotely. A case study involving a commodity RGB-D sensor and multiple computers connected through standard TCP internet connections is presented to demonstrate the viability of the proposed method.
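To make the networking step concrete: transmitting reconstructed mesh data between distributed sites over a standard TCP connection requires some framing scheme so the receiver knows where one mesh update ends and the next begins. The sketch below is a minimal illustration of one such scheme (length-prefixed binary frames), not the authors' implementation; all function names and the wire format are hypothetical.

```python
import socket
import struct

def send_mesh(sock, vertices, triangles):
    """Serialize a mesh and send it as one length-prefixed frame.

    vertices:  list of (x, y, z) float tuples
    triangles: list of (i, j, k) vertex-index tuples
    """
    payload = bytearray()
    payload += struct.pack("<I", len(vertices))
    for v in vertices:
        payload += struct.pack("<3f", *v)          # 12 bytes per vertex
    payload += struct.pack("<I", len(triangles))
    for t in triangles:
        payload += struct.pack("<3I", *t)          # 12 bytes per triangle
    # The 4-byte length prefix delimits one complete mesh update.
    sock.sendall(struct.pack("<I", len(payload)) + payload)

def recv_mesh(sock):
    """Read one length-prefixed frame and rebuild the mesh arrays."""
    def read_exact(n):
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("sender closed the stream")
            buf += chunk
        return buf

    total = struct.unpack("<I", read_exact(4))[0]
    data = read_exact(total)
    off = 0
    n_verts = struct.unpack_from("<I", data, off)[0]
    off += 4
    vertices = [struct.unpack_from("<3f", data, off + 12 * i)
                for i in range(n_verts)]
    off += 12 * n_verts
    n_tris = struct.unpack_from("<I", data, off)[0]
    off += 4
    triangles = [struct.unpack_from("<3I", data, off + 12 * i)
                 for i in range(n_tris)]
    return vertices, triangles
```

Because TCP delivers bytes reliably and in order, each frame arrives intact on the rendering side, where it can be handed to the game engine as vertex and index buffers; a production system would also subdivide large meshes and compress texture data before transmission, as the method described above does.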

Copyright © 2017 by ASME




Fig. 1: Head-mounted virtual reality display

Fig. 2: Flow diagram of proposed method

Fig. 3: Mesh constructed from sensor data

Fig. 4: Subdivided meshes from reconstructed mesh

Fig. 5: Distributed components of method

Fig. 6: Real-time mesh reconstruction in the Unity VR environment


