
A Practical Multirobot Localization System


Abstract

We present a fast and precise vision-based software system intended for multiple robot localization. The core component of the software is a novel and efficient algorithm for black-and-white pattern detection. The method is robust to variable lighting conditions, achieves sub-pixel precision, and its computational complexity is independent of the processed image size. With off-the-shelf computational equipment and low-cost cameras, the core algorithm is able to process hundreds of images per second while tracking hundreds of objects with millimeter precision. In addition, we present the method’s mathematical model, which allows one to estimate the expected localization precision, area of coverage, and processing speed from the camera’s intrinsic parameters and the hardware’s processing capacity. The correctness of the presented model and the performance of the algorithm in real-world conditions are verified in several experiments. Apart from the method description, we also make the source code publicly available at http://purl.org/robotics/whycon, so it can be used as an enabling technology for various mobile robotic problems.
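The claim that the detector's computational cost is independent of the image size can be illustrated by a tracking-seeded flood fill: once a marker has been found, the next frame is searched starting from its previous position, so only the pixels of the pattern itself (plus a small neighbourhood) are ever visited. The C++ sketch below shows this idea together with a simple bounding-box circularity test and a sub-pixel centroid. It is only an illustration under assumed parameters, not the actual WhyCon implementation; the identifiers (detectAtSeed, Detection), the fixed threshold, and the 0.6–0.95 fill band are hypothetical.

    // Illustrative sketch only: tracking-seeded flood fill with a circularity test.
    // Not the WhyCon source code; all names and thresholds are hypothetical.
    #include <algorithm>
    #include <cstdint>
    #include <optional>
    #include <queue>
    #include <utility>
    #include <vector>

    struct Detection { double cx, cy; int area; };  // sub-pixel centroid + segment size

    // Flood-fills the dark segment containing the seed pixel (e.g. the marker's
    // position in the previous frame) and accepts it as a marker candidate if its
    // bounding box is filled roughly like an ellipse.
    std::optional<Detection> detectAtSeed(const std::vector<uint8_t>& gray,
                                          int width, int height,
                                          int seedX, int seedY,
                                          uint8_t threshold = 100) {
        auto idx = [width](int x, int y) { return y * width + x; };
        if (seedX < 0 || seedY < 0 || seedX >= width || seedY >= height) return {};
        if (gray[idx(seedX, seedY)] >= threshold) return {};  // seed pixel is not dark

        // A real tracker would keep this buffer persistent across frames so that
        // only the segment's pixels are touched per frame.
        std::vector<bool> visited(gray.size(), false);
        std::queue<std::pair<int, int>> frontier;
        frontier.push({seedX, seedY});
        visited[idx(seedX, seedY)] = true;

        long long sumX = 0, sumY = 0;
        int area = 0;
        int minX = seedX, maxX = seedX, minY = seedY, maxY = seedY;

        while (!frontier.empty()) {
            auto [x, y] = frontier.front();
            frontier.pop();
            sumX += x; sumY += y; ++area;
            minX = std::min(minX, x); maxX = std::max(maxX, x);
            minY = std::min(minY, y); maxY = std::max(maxY, y);
            const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
            for (int k = 0; k < 4; ++k) {
                int nx = x + dx[k], ny = y + dy[k];
                if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                if (visited[idx(nx, ny)] || gray[idx(nx, ny)] >= threshold) continue;
                visited[idx(nx, ny)] = true;
                frontier.push({nx, ny});
            }
        }

        // Circularity test: a filled, axis-aligned ellipse covers about pi/4 (~0.785)
        // of its bounding box; reject segments far outside an assumed tolerance band.
        double boxArea = double(maxX - minX + 1) * double(maxY - minY + 1);
        double fill = double(area) / boxArea;
        if (fill < 0.6 || fill > 0.95) return {};

        // Averaging the coordinates of all segment pixels yields a sub-pixel centroid.
        return Detection{double(sumX) / area, double(sumY) / area, area};
    }

Averaging over all segment pixels is what gives the centroid its sub-pixel precision; the published method goes further, validating the concentric black-and-white pattern and estimating the marker's spatial position from the detected segment, which this sketch does not attempt.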


Author information

Corresponding author

Correspondence to Tomáš Krajník.


About this article


Cite this article

Krajník, T., Nitsche, M., Faigl, J. et al. A Practical Multirobot Localization System. J Intell Robot Syst 76, 539–562 (2014). https://doi.org/10.1007/s10846-014-0041-x


Keywords

Navigation