Insect-inspired holistic visual navigation methods for wheeled and flying robots in outdoor environments

Subject areas: Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics; Image and Language Processing, Computer Graphics and Visualisation, Human-Computer Interaction, Ubiquitous and Wearable Computing
Funding: 2014 to 2019
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 259361312

Report created: 2019

Summary of project results

The main goals of this project were to extend and improve holistic visual navigation methods for the challenging conditions of outdoor environments. The most important aspects in this context are additional degrees of freedom caused by uneven ground or slopes, strong changes in illumination, and complex depth structure. Holistic methods for visual navigation use the entire image content, in contrast to the more widely applied feature-based methods, which rely on feature detection, description, and matching and discard all information apart from the detected interest points. At present, holistic methods show competitive performance in indoor environments but are strongly restricted by their underlying assumptions.

Within this project, we addressed unsolved aspects of holistic navigation methods, especially MinWarping and the multi-snapshot model, in the context of challenging outdoor environments. As main achievements, we improved tolerance to illumination changes, extended MinWarping by additional degrees of freedom, generalized and extended the multi-snapshot model to homing and route following in 3D, integrated metric measures for use in topometric maps, and built a broad basis of image databases for the structured analysis of aspects related to visual navigation methods. Additionally, we performed a first detailed study comparing holistic and feature-based visual navigation methods, thoroughly analyzed loop-closing methods, and presented a method for holistic sequence-based along-route localization. Motivated by the finding that insects simplify visual homing by means of compass cues, we developed a polarization sensor and an efficient method for estimating the azimuth and elevation angles of the sun, which were integrated and tested on a multicopter.

Furthermore, we proposed view-based models of the visual guidance and control of learning and return flights in ground-nesting wasps. Two outdoor application scenarios formed the basis for this project: a wheeled robot in the context of a lawn-mower application, and a flying robot in an inspection or surveillance task.
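To illustrate the holistic idea in its simplest form: rather than detecting and matching features, a holistic method compares two panoramic images pixel by pixel. The sketch below is a minimal "visual compass" that recovers the azimuthal rotation between a stored snapshot and the current view by testing all column shifts; it is an illustration of the general principle only, not the project's MinWarping implementation, and all names are hypothetical.

```python
import numpy as np

def visual_compass(snapshot, current):
    """Estimate the azimuthal rotation between two panoramic images.

    Both inputs are 2-D grayscale arrays (rows = elevation,
    columns = azimuth). Each candidate rotation of the current view
    is undone by a column shift and compared with the snapshot via
    the sum of squared pixel differences (SSD); the shift with the
    smallest SSD gives the rotation estimate.
    """
    n_cols = snapshot.shape[1]
    ssd = np.empty(n_cols)
    for shift in range(n_cols):
        # Undo a candidate rotation and compare whole images.
        ssd[shift] = np.sum((snapshot - np.roll(current, -shift, axis=1)) ** 2)
    best = int(np.argmin(ssd))
    # Convert the column shift to an angle in degrees.
    return best * 360.0 / n_cols, ssd
```

Because the whole image enters the comparison, no interest-point detection is needed; the price is sensitivity to illumination changes, which is one of the outdoor challenges the project addressed.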

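The polarization compass mentioned in the summary exploits the sky's polarization pattern as a compass cue. A textbook relation (single-scattering Rayleigh sky model, not the project's actual sensor pipeline) is that light from the zenith is polarized perpendicular to the solar meridian, so the measured e-vector orientation at the zenith constrains the solar azimuth up to a 180-degree ambiguity. A minimal sketch under that assumption, with hypothetical names:

```python
import math

def sun_azimuth_candidates(aop_zenith_rad):
    """Two candidate solar azimuths from the angle of polarization
    (AoP) measured at the zenith.

    Under the Rayleigh sky model the solar azimuth is offset by 90
    degrees from the zenith e-vector orientation. Since the e-vector
    is only defined modulo 180 degrees, two candidates remain; real
    systems disambiguate them, e.g. via the sky intensity gradient.
    """
    a = (aop_zenith_rad + math.pi / 2.0) % (2.0 * math.pi)
    b = (a + math.pi) % (2.0 * math.pi)
    return a, b
```

A full compass additionally needs the vehicle's attitude (to locate the true zenith) and, as in the project's multicopter experiments, fusion with inertial measurements.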
Project-related publications (selection)

  • Stürzl, W., Grixa, I., Mair, E., Narendra, A., and Zeil, J. (2015). Three-dimensional models of natural environments and the mapping of navigational information. Journal of Comparative Physiology A, 201(6):563–584. (Available online at https://doi.org/10.1007/s00359-015-1002-y)
  • Stürzl, W., Zeil, J., Boeddeker, N., and Hemmi, J. M. (2016). How wasps acquire and use views for homing. Current Biology, 26(4):470–482. (Available online at https://doi.org/10.1016/j.cub.2015.12.052)
  • Stürzl, W. (2017). A lightweight single-camera polarization compass with covariance estimation. In International Conference on Computer Vision (ICCV), pages 5363–5371. (Available online at https://doi.org/10.1109/ICCV.2017.572)
  • Hoffmann, A. and Möller, R. (2017). Cloud-edge suppression for visual outdoor navigation. Robotics, 6(4):38. (Available online at https://doi.org/10.3390/robotics6040038)
  • Fleer, D. and Möller, R. (2017). Comparing holistic and feature-based visual methods for estimating the relative pose of mobile robots. Robotics and Autonomous Systems, 89:51–74. (Available online at https://doi.org/10.1016/j.robot.2016.12.001)
  • Horst, M. and Möller, R. (2017). Visual place recognition for autonomous mobile robots. Robotics, 6(2):9. (Available online at https://doi.org/10.3390/robotics6020009)
  • Grixa, I. L., Schulz, P., Stürzl, W., and Triebel, R. (2018). Appearance-based along-route localization for planetary missions. In International Conference on Intelligent Robots and Systems (IROS). (Available online at https://doi.org/10.1109/IROS.2018.8594518)
  • Müller, M. G., Steidle, F., Schuster, M., Lutz, P., Maier, M., Stoneman, S., Tomic, T., and Stürzl, W. (2018). Robust visual-inertial state estimation with multiple odometries and efficient mapping on an MAV with ultra-wide FOV stereo vision. In International Conference on Intelligent Robots and Systems (IROS). (Available online at https://doi.org/10.1109/IROS.2018.8594117)
  • Schulte, P., Zeil, J., and Stürzl, W. (2019). An insect-inspired model for acquiring views for homing. Biological Cybernetics, 113(4):439–451. (Available online at https://doi.org/10.1007/s00422-019-00800-1)
  • Steidle, F., Stürzl, W., and Triebel, R. (2019). Visual-inertial sensor fusion with a bio-inspired polarization compass for navigation of MAVs. In International Micro Air Vehicle Competition and Conference (IMAV), pages 83–88.
 
 
