Research Papers

The Correcting Approach of Gyroscope-Free Inertial Navigation Based on the Applicable Topological Map

Author and Article Information
Mehdi Dehghani

Faculty of Electrical & Computer Engineering,
University of Tabriz,
Tabriz 5166616471, Iran
e-mail: Dehghani.mehdi@hotmail.com

Hamed Kharrati

Faculty of Electrical & Computer Engineering,
University of Tabriz,
Tabriz 5166616471, Iran
e-mail: kharrati@tabrizu.ac.ir

Hadi Seyedarabi

Faculty of Electrical & Computer Engineering,
University of Tabriz,
Tabriz 5166616471, Iran
e-mail: seyedarabi@tabrizu.ac.ir

Mahdi Baradarannia

Faculty of Electrical & Computer Engineering,
University of Tabriz,
Tabriz 5166616471, Iran
e-mail: mbaradaran@tabrizu.ac.ir

¹Corresponding author.

Manuscript received January 9, 2017; final manuscript received November 3, 2018; published online February 4, 2019. Assoc. Editor: Monica Bordegoni.

J. Comput. Inf. Sci. Eng. 19(2), 021001 (Feb. 04, 2019) (14 pages). Paper No: JCISE-17-1005; doi: 10.1115/1.4041969

Accumulated error and noise sensitivity are two common problems of ordinary inertial sensors. An accurate gyroscope is expensive and therefore rarely applicable in low-cost mobile-robot missions. Since accelerometers are considerably cheaper than gyroscopes of comparable grade, using redundant accelerometers is an alternative; this mechanism is called gyroscope-free navigation. This article deals with autonomous mobile robot (AMR) navigation based on the gyroscope-free method. The navigation errors of the gyroscope-free method in long-duration missions are demonstrated. To compensate for the position error, aiding information from low-cost stereo cameras and a topological map of the workspace is incorporated into the navigation system. After precise sensor calibration, an amendment algorithm is presented that fuses the measurements of the gyroscope-free inertial measurement unit (GFIMU) with stereo-camera observations. The advantages of vision-aided navigation over purely gyroscope-free navigation of mobile robots are also discussed. The experimental results show the improved accuracy of vision-aided navigation of the mobile robot.
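The amendment algorithm itself is not reproduced on this page, but its core idea, correcting the drifting inertial estimate with a stereo-camera position fix whenever the robot recognizes a node of the topological map, can be illustrated with a minimal sketch. The function names, the blending gain, and the toy acceleration profile below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def dead_reckon_step(p, v, a, dt):
    """One Euler step of inertial dead reckoning: integrate acceleration
    into velocity, velocity into position."""
    v_next = v + a * dt
    p_next = p + v * dt
    return p_next, v_next

def fuse_with_vision(p_inertial, p_camera, gain=0.8):
    """Weighted correction toward the stereo-camera fix at a map node.
    A gain near 1 trusts the camera; since GFIMU drift grows without
    bound, the camera is weighted heavily here (an assumption)."""
    return p_inertial + gain * (p_camera - p_inertial)

# Toy run: noisy constant acceleration, with a vision fix at step 50.
p, v = np.zeros(2), np.zeros(2)
dt = 0.1
for k in range(100):
    a = np.array([0.05, 0.0]) + np.random.normal(0.0, 0.01, 2)  # noisy accel
    p, v = dead_reckon_step(p, v, a, dt)
    if k == 50:                            # robot recognizes a map node
        p_camera = np.array([1.3, 0.0])    # hypothetical stereo fix
        p = fuse_with_vision(p, p_camera)
print(p)
```

The paper's actual fusion weighs the two sources after calibration; the fixed gain here simply stands in for that weighting to keep the sketch self-contained.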

Copyright © 2019 by ASME

Figures

Fig. 2: A rigid body in the GFIMU coordinate frame
Fig. 3: E-puck mobile robot
Fig. 4: The structure of sensors in the GFIMU
Fig. 5: The CAD environment of the workspace
Fig. 6: Specified edges of elements
Fig. 7: The base and inertial frames
Fig. 8: The desired trajectory (in meters)
Fig. 9: Special points in epipolar geometry
Fig. 10: The place of the board in the robot workspace
Fig. 11: (a) Simulation results of the movements of the mobile robot in the environment from the images of the left camera; (b) results of the movements of the mobile robot in the environment from the stereo cameras, top-down view
Fig. 12: Distortion reduction in one of the chessboard images
Fig. 13: Feature-point extraction in stereo images (node 1)
Fig. 14: Feature-point extraction in stereo images (node 2)
Fig. 15: Collinear feature points (node 1)
Fig. 16: Collinear feature points (node 2)
Fig. 17: The first-degree polynomial (node 1)
Fig. 18: The first-degree polynomial (node 2)
Fig. 19: The second-degree polynomial (node 1)
Fig. 20: The second-degree polynomial (node 2)
Fig. 21: The third-degree polynomial (node 1)
Fig. 22: The third-degree polynomial (node 2)
Fig. 23: Schematic of the amendment algorithm
Fig. 24: Gyroscope-free inertial measurement unit sensor coordinates, front view
Fig. 25: Gyroscope-free inertial measurement unit sensor coordinates, top-down view
Fig. 26: The outputs of tri-axial accelerometer 1
Fig. 27: The outputs of tri-axial accelerometer 2
Fig. 28: The outputs of tri-axial accelerometer 3
Fig. 29: The outputs of tri-axial accelerometer 4
Fig. 30: Position estimation in gyroscope-free navigation; Euler step size is (a) 0.66 and (b) 0.44
Fig. 31: Position estimation in gyroscope-free navigation with Runge–Kutta (RK4) integration
Fig. 32: The robot observation nodes in the topological map
Fig. 33: Fabricated mobile robot and structure of sensors in the GFIMU
Fig. 34: The place of the board in the robot workspace and calibration images taken by the left camera
Fig. 35: Feature-point extraction in stereo images based on the SIFT algorithm at node one (px is the pixel number)
Fig. 36: Feature-point extraction in stereo images based on the SIFT algorithm at node two (px is the pixel number)
Fig. 37: Robot position estimation in gyroscope-free navigation and in gyroscope-free navigation plus vision-aid utilities
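Figures 30 and 31 contrast first-order Euler integration (at step sizes 0.66 and 0.44) with fourth-order Runge–Kutta (RK4) integration for position estimation. A minimal sketch of the two integrators, applied to a made-up smooth acceleration profile rather than the paper's GFIMU data, shows why RK4 drifts far less at the same coarse step size:

```python
import numpy as np

def accel(t):
    return 0.2 * np.sin(0.5 * t)   # hypothetical body acceleration

def f(t, x):
    """Navigation kinematics: state x = [position, velocity],
    so dx/dt = [velocity, acceleration]."""
    return np.array([x[1], accel(t)])

def euler_step(t, x, h):
    return x + h * f(t, x)

def rk4_step(t, x, h):
    k1 = f(t, x)
    k2 = f(t + h / 2, x + h / 2 * k1)
    k3 = f(t + h / 2, x + h / 2 * k2)
    k4 = f(t + h, x + h * k3)
    return x + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate 30 s at the coarse step from Fig. 30(a); the Euler estimate
# accumulates O(h) error per unit time, RK4 only O(h^4).
h, T = 0.66, 30.0
x_e, x_r, t = np.zeros(2), np.zeros(2), 0.0
while t < T:
    x_e = euler_step(t, x_e, h)
    x_r = rk4_step(t, x_r, h)
    t += h
print("Euler position:", x_e[0], "RK4 position:", x_r[0])
```

The same trade-off drives the figures: shrinking the Euler step from 0.66 to 0.44 reduces the drift, while RK4 achieves a far better estimate without a finer step.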
