Mandyam V. Srinivasan, Ph.D.
Queensland Brain Institute and School of Information Technology and Electrical Engineering, University of Queensland
Audio and slides from the 2014 Keynote Address are available on the Cambridge Research Systems website.
MORE THAN A HONEY MACHINE: Vision and Navigation in Honeybees and Applications to Robotics
Saturday, May 17, 2014, 7:15 pm, Talk Room 1-2
Flying insects are remarkably adept at seeing and perceiving the world and navigating effectively in it, despite possessing a brain that weighs less than a milligram and carries fewer than 0.01% as many neurons as ours does. Although most insects lack stereo vision, they use a number of ingenious strategies for perceiving their world in three dimensions and navigating successfully in it.
The talk will describe how honeybees use their vision to stabilize and control their flight, and navigate to food sources. Bees and birds negotiate narrow gaps safely by balancing the apparent speeds of the images in the two eyes. Flight speed is regulated by holding constant the average image velocity as seen by both eyes. Visual cues based on motion are also used to compensate for crosswinds, and to avoid collisions with other flying insects. Bees landing on a surface hold constant the magnitude of the optic flow that they experience as they approach the surface, thus automatically ensuring that flight speed decreases to zero at touchdown. Foraging bees gauge distance flown by integrating optic flow: they possess a visually-driven “odometer” that is robust to variations in wind, body weight, energy expenditure, and the properties of the visual environment. Mid-air collisions are avoided by sensing cues derived from visual parallax, and using appropriate flight control maneuvers.
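The landing strategy described above lends itself to a simple worked example: if the bee holds the optic flow ω = v/h constant while descending, then forward speed v must shrink in proportion to height h, reaching zero at touchdown. The sketch below is an illustrative simulation of that control law, not the implementation described in the talk; the function name, heights, and gains are assumptions chosen for the sketch.

```python
# Illustrative sketch (not Srinivasan's code): a landing controller that
# holds optic flow omega = v / h constant during descent. Because speed
# is set to v = omega * h, it falls in proportion to height and
# approaches zero at touchdown.

def simulate_landing(h0=2.0, omega=2.0, glide_slope=0.5, dt=0.01):
    """Descend from height h0 (m), keeping image angular velocity
    omega (rad/s) constant. Returns the (height, speed) trajectory."""
    h, trajectory = h0, []
    while h > 0.01:                # stop just above the surface
        v = omega * h              # choose speed so v / h == omega
        h -= glide_slope * v * dt  # descend along a fixed glide slope
        trajectory.append((h, v))
    return trajectory

traj = simulate_landing()
start_speed, final_speed = traj[0][1], traj[-1][1]
# final_speed is a tiny fraction of start_speed: a smooth touchdown
```

Because height decays exponentially under this rule, the deceleration needs no explicit measurement of distance or speed to the ground, only the image velocity itself.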
Some of the insect-based strategies described above are being used to design, implement and test biologically-inspired algorithms for the guidance of autonomous terrestrial and aerial vehicles. Applications to attitude stabilization, terrain following, obstacle avoidance, automated landing, and the execution of extreme aerobatic manoeuvres will be described.
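As one concrete example of such a guidance algorithm, the gap-negotiation behaviour of balancing the apparent image speeds in the two eyes translates directly into a steering law for a vehicle in a corridor. The sketch below is a minimal simulation under assumed geometry and gains; the function names and numbers are illustrative, not taken from the systems described in the talk.

```python
# Illustrative sketch: centring a vehicle in a corridor by balancing
# the translational optic flow seen on the two sides. The nearer wall
# produces faster image motion, so the vehicle steers away from it.

def centering_command(flow_left, flow_right, gain=0.5):
    """Steer away from the side whose image moves faster (positive = right)."""
    return gain * (flow_left - flow_right)

def simulate_corridor(y0=0.3, v=1.0, half_width=1.0, steps=200, dt=0.05):
    """Fly at speed v (m/s) down a corridor of half-width half_width (m),
    starting offset y0 toward the left wall. Translational optic flow
    from each wall is v / distance; balancing the two re-centres the path."""
    y = y0                                 # lateral offset (+ = toward left wall)
    for _ in range(steps):
        flow_l = v / (half_width - y)      # left wall is nearer when y > 0
        flow_r = v / (half_width + y)
        y -= centering_command(flow_l, flow_r) * dt
    return y

final_offset = simulate_corridor()
# final_offset is close to 0: the vehicle has centred itself
```

Note that the law needs no range measurement: image speed alone encodes relative proximity, which is what makes the strategy attractive for lightweight aerial platforms.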
This research was supported by ARC Centre of Excellence in Vision Science Grant CE0561903, ARC Discovery Grant DP0559306, and by a Queensland Smart State Premier’s Fellowship.
Srinivasan’s research focuses on the principles of visual processing, perception and cognition in simple natural systems, and on the application of these principles to machine vision and robotics.
He holds an undergraduate degree in Electrical Engineering from Bangalore University, a Master’s degree in Electronics from the Indian Institute of Science, a Ph.D. in Engineering and Applied Science from Yale University, a D.Sc. in Neuroethology from the Australian National University, and an Honorary Doctorate from the University of Zurich. Srinivasan is presently Professor of Visual Neuroscience at the Queensland Brain Institute and the School of Information Technology and Electrical Engineering of the University of Queensland. Among his awards are Fellowships of the Australian Academy of Science, of the Royal Society of London, and of the Academy of Sciences for the Developing World, the 2006 Australian Prime Minister’s Science Prize, the 2008 U.K. Rank Prize for Optoelectronics, the 2009 Distinguished Alumni Award of the Indian Institute of Science, and Membership of the Order of Australia (AM) in 2012.