Guest Editorial

J. Comput. Inf. Sci. Eng. 2008;8(4):040301-040301-2. doi:10.1115/1.3006351.

Virtual Reality emerged as a new frontier of engineering during the 1990s. The development of the Cave Automatic Virtual Environment (CAVE) and large display systems changed the landscape of many areas of product development, including design of products, visualization of assemblies, and simulation of manufacturing systems and factories. Early applications of such virtual reality systems utilized 3D visualization using stereoscopy and user input devices such as the data glove and wand for interaction. However, it was realized very early in the development of such systems that a key capability missing from these interactive systems was the sensation of touch—an area that has come to be known as haptics.

Topics: Haptics

Research Papers

J. Comput. Inf. Sci. Eng. 2008;8(4):041001-041001-7. doi:10.1115/1.2987400.

This paper describes the design of a haptic system that allows interactive modification of cutter orientation during five-axis finishing cuts, with the aim of improving surface finish quality and collision avoidance. The system supports two haptic models that provide three-degree-of-freedom (3DOF) force feedback and six-degree-of-freedom (6DOF) posture sensing. Details of five key functions of the system are given: (1) a rendering conversion that uses a 3DOF (instead of 5DOF) force feedback haptic representation, (2) an efficient force feedback design that allows accurate results to be obtained from the user’s manipulation, (3) a fast collision detection scheme that achieves real-time feedback, (4) use of active haptic guidance to assist cutter-path generation, and (5) a design that supports both ball-end and flat-end tools with partial optimization.

J. Comput. Inf. Sci. Eng. 2008;8(4):041002-041002-6. doi:10.1115/1.2987402.

This paper presents a novel registration method for augmented reality systems based on multiplanar structures and natural feature tracking. Our method distinguishes itself from other methods in the following ways. First, it can use multiple arbitrary planes without any information on the physical relationship between the planes. Second, we put forward a new method to establish the world coordinate system; compared with the traditional approach, it needs only three noncollinear points in the real world, which substantially enhances the usability of our system. Third, we propose novel classification-based local and global tracking strategies to match features between reference images and the current frame directly. Our method removes the requirement that the initial camera position be close to the reference images while overcoming the problem of error accumulation. Experiments have been carried out to demonstrate the validity of the proposed approach.
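The world-coordinate-system construction from three noncollinear points can be sketched as follows. This is a minimal illustration under the assumption that the three points are given as 3D coordinates; the function name and the axis conventions are hypothetical, not the authors' implementation:

```python
import numpy as np

def world_frame(p0, p1, p2):
    """Build an orthonormal world coordinate frame from three noncollinear
    points: origin at p0, x-axis toward p1, z-axis normal to their plane."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    x = p1 - p0
    x /= np.linalg.norm(x)                 # unit x-axis
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)                 # unit normal of the plane
    y = np.cross(z, x)                     # completes a right-handed frame
    return np.column_stack([x, y, z]), p0  # rotation matrix R and origin

R, origin = world_frame((0, 0, 0), (2, 0, 0), (0, 3, 0))
print(R)  # identity for these axis-aligned points
```

Noncollinearity is exactly what makes the cross product nonzero, which is why three such points suffice to fix the frame.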

J. Comput. Inf. Sci. Eng. 2008;8(4):041003-041003-6. doi:10.1115/1.2987403.

Haptic feedback is known to improve teleoperation performance for a number of tasks, and an important question is which haptic cues matter most for each specific task. This research quantifies human performance in an assembly task for two types of haptic cues: low-frequency (LF) force feedback and high-frequency (HF) force feedback. A human-subjects study was performed with these two main factors: LF force feedback on/off and HF force (acceleration) feedback on/off. All experiments were performed using a three-degree-of-freedom teleoperator in which the slave device has a low intrinsic stiffness while the master device is stiff. The results show that LF haptic feedback reduces impact forces but does not influence low-frequency contact forces or task completion time. The HF information did not improve task performance, but it did reduce the operator’s mental load, though only in combination with the LF feedback.

J. Comput. Inf. Sci. Eng. 2008;8(4):041004-041004-11. doi:10.1115/1.2988340.

This paper presents an infrastructure that integrates a haptic interface into a mainstream computer-aided design (CAD) system. A haptic interface, by providing force feedback in human-computer interaction, can improve the working efficiency of CAD/computer-aided manufacturing (CAM) systems in a unique way. The full potential of haptic technology is best realized when it is integrated effectively into the product development environment and process. For large manufacturing companies this means integration into a commercial CAD system (Stewart, 1997, “Direct Integration of Haptic User Interface in CAD Systems,” ASME Dyn. Syst. Control Div., 61, pp. 93–99). Mainstream CAD systems typically use constructive solid geometry (CSG) and boundary representation (B-Rep) as their native formats, while internally they automatically maintain triangulated meshes for graphics display and for numerical evaluation tasks such as surface-surface intersection. In this paper, we propose rendering point-based haptic force feedback by leveraging built-in functions of the CAD system. The burden of collision detection and haptic rendering computation is alleviated by using bounding spheres and an OpenGL feedback buffer. The major contribution of this paper is a sound structure and methodology for haptic interaction with native CAD models inside mainstream CAD systems, developed by analyzing CAD application models and examining haptic rendering algorithms. The technique enables the user to directly touch and manipulate native 3D CAD models in mainstream CAD systems with force/touch feedback. It lays the foundation for future tasks such as direct CAD model modification, dynamic simulation, and virtual assembly with the aid of a haptic interface. Hence, by integrating a haptic interface directly with mainstream CAD systems, the powerful built-in functions of CAD systems can be leveraged and enhanced to realize more agile 3D CAD design and evaluation.
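The generic idea behind point-based haptic rendering with bounding-sphere collision checks can be sketched as follows. Note that the paper itself renders native B-Rep CAD surfaces via the CAD system's built-in functions; this sketch only illustrates the standard penalty-force scheme for a point probe against a sphere, with an illustrative stiffness value that is not from the paper:

```python
import math

def sphere_penalty_force(hip, center, radius, k=800.0):
    """Penalty force on a point probe (haptic interface point, HIP) that
    penetrates a sphere: F = k * depth * n along the outward normal.
    k is an illustrative stiffness (N/m), not a value from the paper."""
    d = [h - c for h, c in zip(hip, center)]
    dist = math.sqrt(sum(v * v for v in d))
    depth = radius - dist              # penetration depth; <= 0 means no contact
    if depth <= 0.0 or dist == 0.0:
        return (0.0, 0.0, 0.0)         # outside the sphere: no force
    n = [v / dist for v in d]          # outward contact normal at the probe
    return tuple(k * depth * v for v in n)

print(sphere_penalty_force((0.999, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0))
# probe 1 mm inside a unit sphere -> a push of about 0.8 N along +x
```

Checking cheap bounding spheres first is what keeps the collision query fast enough for the roughly 1 kHz update rates haptic rendering requires.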

J. Comput. Inf. Sci. Eng. 2008;8(4):041005-041005-7. doi:10.1115/1.2988341.

This paper presents a novel operational space calibration approach for robotic remote welding based on surface tracking with shared force control. A human-machine shared force controller is designed to combine manual control with local force control. A position-based force control strategy for surface tracking on the constrained motion plane is adopted. A precise method to measure the contact point’s spatial location during the surface tracking process is proposed. The operational space calibration for the L-pipe part is solved by a direct least-squares fitting of the elliptic tracking trajectory combined with an L-pipe part calibration algorithm. The experimental results show that the proposed calibration method keeps the operational space model error below 1 mm, which meets the requirements for carrying out assembly contact tasks with passive compliance during the remote welding process.
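The least-squares fitting of the elliptic tracking trajectory can be illustrated with a simplified conic fit. This is a sketch assuming noise-free tracked points; the unconstrained SVD null-space fit below stands in for the constrained direct ellipse-fitting algorithm, and all names are hypothetical:

```python
import numpy as np

def fit_conic(x, y):
    """Least-squares fit of the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0.
    Returns the unit-norm coefficient vector minimizing the algebraic residual
    (the right singular vector of the design matrix with smallest singular value)."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]

# synthetic elliptic trajectory: semi-axes 3 and 1, centered at (1, 2)
t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
x, y = 1.0 + 3.0 * np.cos(t), 2.0 + np.sin(t)
coef = fit_conic(x, y)
res = np.abs(np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)]) @ coef)
print(res.max())  # essentially zero for noise-free points
```

With noisy measurements the same machinery applies; the fitted conic's center and axes then parameterize the pipe geometry used by the calibration.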

J. Comput. Inf. Sci. Eng. 2008;8(4):041006-041006-10. doi:10.1115/1.2988383.

This paper presents a haptic system the authors have developed for shape exploration in the field of industrial design. The system consists of a novel haptic-based digital technology that allows designers to add the tactile experience to the visual one. The haptic interface allows designers to see and, through free-hand motions, feel an object’s surface during its creation and evolution. The system closes the loop between shape modification and its subsequent evaluation: it is possible to evaluate the “what if” of a new product shape by applying a modification, comparing alternative solutions repeatedly, and generating and maintaining different versions. This improves designers’ level of interaction with the digital models, exploits their skills, and shortens the product development life cycle.

J. Comput. Inf. Sci. Eng. 2008;8(4):041007-041007-9. doi:10.1115/1.2988385.

This paper presents analytical and experimental results on a new haptic telemanipulation environment for microrobot control. The proposed environment comprises a 5DOF force feedback mechanism, acting as the master, and a 2DOF microrobot, acting as the slave. The fact that the slave microrobot is driven by two centripetal-force vibration micromotors makes the presented telemanipulation environment exceptional and challenging. The unique characteristics and challenges that arise during haptic micromanipulation with this device are described and analyzed, and the developed solutions are presented and discussed. Several experiments show that, despite the disparity between master and slave, the proposed environment facilitates functional and simple microrobot control during micromanipulation operations.

Topics: Haptics, Force
J. Comput. Inf. Sci. Eng. 2008;8(4):041008-041008-9. doi:10.1115/1.3006304.

A grasp exoskeleton actuated by a string-based platform is proposed to provide force feedback to a user’s hand in human-scale virtual environments. The user of this interface has access to seven active degrees of freedom when interacting with virtual objects: three of translation, three of rotation, and one of grasping. The exoskeleton has a light, ergonomic structure and supports the grasp gesture for five fingers. It is actuated by eight strings that form the parallel arms of the platform. Each string is connected to a block comprising a motor, a rotary encoder, and a force sensor, with a novel design that provides the force and precision the interface requires. A hybrid control method based on the string tension measured by the force sensor is developed to resolve the common problems of string-based interfaces. The blocks can be moved on a cubic frame around the virtual environment. Finally, results of preliminary experiments with the interface are presented to show its practical characteristics, and the interface is mounted on an automotive model to demonstrate its industrial adaptability.
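One common way to realize tension-based control of a single string actuator can be sketched as follows. This is a minimal, hypothetical sketch, not the paper's controller: the gain and the minimum-tension threshold are illustrative, and the switching rule is a generic approach to keeping cables taut:

```python
def string_tension_step(t_des, t_meas, kp=0.8, t_min=0.5):
    """One control tick for a single string actuator.
    In free motion (t_des below t_min) the string is servoed to a small
    positive tension so it never goes slack; in contact it tracks the
    tension commanded by the haptic renderer. Gains are illustrative."""
    target = max(t_des, t_min)        # never command a slack string
    return kp * (target - t_meas)     # motor drive correction
```

Keeping every string under positive tension is the central constraint of string-driven platforms, since a cable can pull but never push.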

J. Comput. Inf. Sci. Eng. 2008;8(4):041009-041009-12. doi:10.1115/1.3009670.

We developed a wearable exoskeleton haptic interface that fits the human body. It generates a force constituting a counteracting moment by pulling a wire with a dc motor. We also developed a control system comprising a motor controller, an interface, and control software. We evaluated the performance of our interface by conducting a simple task experiment. To execute a task, the control data for each joint must be prepared, and we used force control data generated from a rectified and filtered electromyogram (EMG) curve. The force representation experiments showed that a force curve based on EMG data can be used for a haptic interface, and we confirmed that a suitable force curve could be obtained for each subject.
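The rectify-and-filter step that turns a raw EMG trace into a force curve can be sketched as follows. This is a simplified stand-in: the window length and gain are illustrative, and the paper's exact filter is not reproduced here:

```python
def emg_to_force(emg, window=5, gain=1.0):
    """Full-wave rectify an EMG trace and smooth it with a trailing moving
    average to obtain a force-command curve (window/gain are illustrative)."""
    rect = [abs(v) for v in emg]                  # full-wave rectification
    out = []
    for i in range(len(rect)):
        lo = max(0, i - window + 1)
        seg = rect[lo:i + 1]
        out.append(gain * sum(seg) / len(seg))    # trailing average
    return out
```

A trailing (causal) average is used here because a real-time haptic controller cannot look at future samples.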


Technical Briefs

J. Comput. Inf. Sci. Eng. 2008;8(4):044501-044501-5. doi:10.1115/1.2988384.

A series of experiments was conducted to evaluate the operating characteristics of the small DC motors that are often used in tactile displays. The results indicated that these motors are reliable in terms of their frequency and amplitude of oscillation, but that the frequency varies across motors. A simulated skin material was developed to provide a substrate for evaluating the performance of the motors. There was a marked attenuation in frequency when the tactors were mounted on this material, and the surface waves could be detected 60 mm from the site of activation. These findings suggest that the spacing between tactors should be at least 60–80 mm if tactile cues are used to locate events in the environment.

J. Comput. Inf. Sci. Eng. 2008;8(4):044502-044502-8. doi:10.1115/1.3006306.

Virtual reality (VR) technology holds promise as a virtual prototyping (VP) tool for mechanical assembly; however, several developmental challenges still need to be addressed before VP applications can successfully be integrated into the product realization process. This paper describes the development of the System for Haptic Assembly and Realistic Prototyping (SHARP), a portable virtual assembly system. SHARP uses physics-based modeling to simulate realistic part-to-part and hand-to-part interactions in virtual environments. A dual-handed haptic interface for realistic part interaction using PHANToM® haptic devices is presented. The capability of creating subassemblies enhances the application’s ability to handle a wide variety of assembly scenarios at the part level as well as at the subassembly level. Swept volumes are implemented for addressing maintainability issues, and a network module is added for communicating with different VR systems at dispersed geographic locations. Support for various types of VR systems allows an easy integration of SHARP into the product realization process, resulting in faster product development, faster identification of assembly and design issues, and a more efficient and less costly product design process.

