Research Papers

Toward Safe Human Robot Collaboration by Using Multiple Kinects Based Real-Time Human Tracking

Author and Article Information
Carlos Morato

Department of Mechanical Engineering &
Institute for Systems Research,
University of Maryland,
College Park, MD 20742
e-mail: cmorato@umd.edu

Krishnanand N. Kaipa

Department of Mechanical Engineering &
Institute for Systems Research,
University of Maryland,
College Park, MD 20742
e-mail: kkrishna@umd.edu

Boxuan Zhao

Department of Mechanical Engineering &
Institute for Systems Research,
University of Maryland,
College Park, MD 20742
e-mail: zhaoboxuan@gmail.com

Satyandra K. Gupta

Department of Mechanical Engineering &
Institute for Systems Research,
University of Maryland,
College Park, MD 20742
e-mail: skgupta@umd.edu

Collection of all points in the work cell that are occupied by the robot.

Angle between the sensor axis and the horizontal plane.

Ratio of area covered by the Kinect and total area of the workspace.

Ratio of workspace coverage and the total area of the work cell.

Angle between the Kinect axis and the side wall.

The Kinect index is omitted for brevity.

Corresponding author.

Contributed by the Computers and Information Division of ASME for publication in the JOURNAL OF COMPUTING AND INFORMATION SCIENCE IN ENGINEERING. Manuscript received July 27, 2013; final manuscript received October 20, 2013; published online January 22, 2014. Assoc. Editor: Joshua D. Summers.

J. Comput. Inf. Sci. Eng. 14(1), 011006 (Jan 22, 2014) (9 pages) Paper No: JCISE-13-1139; doi: 10.1115/1.4025810 History: Received July 27, 2013; Revised October 20, 2013

We present a multiple-Kinect-based exteroceptive sensing framework to achieve safe human-robot collaboration during assembly tasks. Our approach relies on a real-time replication of the human and robot movements inside a physics-based simulation of the work cell. This enables evaluation of the human-robot separation in 3D Euclidean space, which can be used to generate safe motion goals for the robot. For this purpose, we develop an N-Kinect system to build an explicit model of the human, and a roll-out strategy in which we forward-simulate the robot's trajectory into the near future. We then use a precollision strategy that allows a human to operate in close proximity to the robot, pausing the robot's motion whenever an imminent collision between the human model and any part of the robot is detected. Whereas most previous range-based methods analyzed physical separation based on depth data pertaining to 2D projections of the robot and human, our approach evaluates the separation in 3D space based on an explicit human model and a forward physical simulation of the robot. The real-time behavior (≈30 Hz) observed during experiments, in which a 5 DOF articulated robot and a human safely collaborated to perform an assembly task, validates our approach.
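The authors' implementation is not reproduced here; as a rough sketch of the precollision logic the abstract describes, the code below assumes both the human model and each forward-simulated robot pose are represented as sets of bounding spheres, and pauses the robot when the minimum surface-to-surface separation over the roll-out drops below a safety threshold. The 0.15 m threshold and the (center, radius) data layout are illustrative assumptions, not the paper's.

```python
import numpy as np

def min_separation(human_spheres, robot_spheres):
    """Minimum surface-to-surface distance between two sets of
    bounding spheres, each given as (center_xyz, radius)."""
    best = float("inf")
    for hc, hr in human_spheres:
        for rc, rr in robot_spheres:
            d = np.linalg.norm(np.asarray(hc) - np.asarray(rc)) - hr - rr
            best = min(best, d)
    return best

def precollision_check(human_spheres, robot_rollout, d_safe=0.15):
    """Scan the forward-simulated (rolled-out) robot poses; pause the
    robot if any pose comes within d_safe meters of the human model."""
    for robot_spheres in robot_rollout:  # one sphere set per future pose
        if min_separation(human_spheres, robot_spheres) < d_safe:
            return "PAUSE"
    return "CONTINUE"
```

Checking every pose of the roll-out, rather than only the current one, is what lets the system stop the robot before the imminent collision actually occurs.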

Copyright © 2014 by ASME


Figures

Fig. 1

Overall system overview: (a) Work cell used to evaluate human-robot interaction. (b) 5 DOF robot used for the experiments. (c) Physical simulation used to evaluate the interference between the human and the robot in real-time.

Fig. 2

The Microsoft Kinect directly outputs a 20-joint model of a human observed in a 3D scene

Fig. 3

Coverage (horizontal projection) obtained by using four Kinect sensors. The blue-color regions are fully covered. Red- and white-colored regions represent the dead regions of the work cell.
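The paper's coverage computation is not reproduced here; as an illustration of how a horizontal-projection coverage map like Fig. 3 might be produced, the sketch below rasterizes a rectangular work-cell floor and marks cells falling inside at least one sensor's horizontal field-of-view cone. The workspace size, grid resolution, and sensor poses are illustrative assumptions; the 57 deg horizontal FOV and ~4 m range are nominal Kinect figures.

```python
import numpy as np

def coverage_ratio(sensors, workspace=((0.0, 4.0), (0.0, 4.0)),
                   res=0.05, fov=np.radians(57.0), max_range=4.0):
    """Estimate the fraction of a rectangular workspace floor covered
    by at least one sensor. Each sensor is (x, y, heading) in the
    horizontal plane, heading in radians."""
    (x0, x1), (y0, y1) = workspace
    gx, gy = np.meshgrid(np.arange(x0, x1, res), np.arange(y0, y1, res))
    covered = np.zeros_like(gx, dtype=bool)
    for sx, sy, heading in sensors:
        dx, dy = gx - sx, gy - sy
        dist = np.hypot(dx, dy)
        # smallest signed angle between cell bearing and sensor heading
        ang = np.abs((np.arctan2(dy, dx) - heading + np.pi)
                     % (2 * np.pi) - np.pi)
        covered |= (dist <= max_range) & (ang <= fov / 2)
    return covered.mean()
```

Cells left uncovered by every sensor correspond to the dead regions shown in red and white in Fig. 3; placement can then be tuned to drive this ratio toward 1.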

Fig. 4

Filter tracking performance along x axis

Fig. 5

Filter tracking performance along y axis

Fig. 6

Filter tracking performance along z axis

Fig. 7

Filter tracking performance in 3D
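The material reproduced here does not name the filter behind Figs. 4-7; a common choice for smoothing noisy per-joint Kinect measurements is a constant-velocity Kalman filter applied per coordinate axis, sketched below. The process and measurement noise values q and r are illustrative assumptions; only the ~30 Hz update rate comes from the paper.

```python
import numpy as np

def kalman_track(zs, dt=1.0 / 30.0, q=1e-3, r=5e-3):
    """Constant-velocity Kalman filter for one joint coordinate.
    zs: sequence of scalar position measurements along one axis."""
    F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
    H = np.array([[1.0, 0.0]])                # observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])       # process noise
    R = np.array([[r]])                       # measurement noise
    x = np.array([[zs[0]], [0.0]])            # initial [position, velocity]
    P = np.eye(2)
    estimates = []
    for z in zs:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

Running one such filter per axis (x, y, z) for each tracked joint yields the smoothed 3D trajectories whose performance the figures report.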

Fig. 8

Postures used to test the estimation accuracy of the overall system

Fig. 9

Discrepancy between projections of ground truth and estimated values on the XY plane

Fig. 10

Discrepancy between projections of ground truth and estimated values on the YZ plane

Fig. 11

Discrepancy between ground truth and estimated values for each joint averaged over 15 locations

Fig. 12

Illustration of precollision strategy: (a) Human is far away from the robot. As the distance between the spheres is significant, robot performs its intended task. (b) An imminent collision is detected by the system; therefore, the robot is paused and a visual alarm is raised (bounding spheres change color). (c) and (d) Human returns to a safety zone; therefore, the robot resumes its motion.

Fig. 13

Robot and human collaborate to assemble the third part (radio box) onto the chassis
