With design teams becoming more distributed, sharing and interpreting complex data about design concepts, prototypes, and environments has become increasingly challenging. The size and quality of the data that can be captured and shared directly affect the ability of those receiving the data to collaborate and provide meaningful feedback. To mitigate these challenges, the authors propose the real-time translation of physical objects into an immersive virtual reality environment using readily available red, green, blue, and depth (RGB-D) sensing systems and standard network connections. The emergence of commercial, off-the-shelf RGB-D sensing systems, such as the Microsoft Kinect, has enabled the rapid 3D reconstruction of physical environments. The authors present a method that employs 3D mesh reconstruction algorithms and real-time rendering techniques to capture physical objects in the real world and represent their 3D reconstructions in an immersive virtual reality environment with which the user can then interact. These features allow distributed design teams to share and interpret complex 3D data in a natural manner. The method reduces the processing requirements of the data capture system while keeping it portable, and it provides an immersive environment in which designers can view and interpret the data remotely. A case study involving a commodity RGB-D sensor and multiple computers connected through standard TCP internet connections demonstrates the viability of the proposed method.