[ASC-media] Media Release: Bees’ perfect landing inspires robot aircraft
The Vision Centre (via SciNews)
releases at scinews.com.au
Mon Oct 28 12:33:07 PDT 2013
:: The Vision Centre
:: MEDIA RELEASE
Bees’ perfect landing inspires robot aircraft
October 29, 2013 – for immediate release
Scientists at Australia’s Vision Centre have discovered how the honeybee can land anywhere with utmost precision and grace – and the knowledge may soon help build robot aircraft that land just as deftly.
By sensing how rapidly their destination ‘zooms in’ as they fly towards it, honeybees can control their flight speed in time for a perfect touchdown without needing to know how fast they’re flying or how far away the destination is.
This discovery may advance the design of cheaper, lighter robot aircraft that only need a video camera to land safely on surfaces of any orientation, says Professor Mandyam Srinivasan of The Vision Centre (VC) and The University of Queensland Brain Research Institute.
“Orchestrating a safe landing is one of the greatest challenges for flying animals and airborne vehicles,” says Prof. Srinivasan. “To achieve a smooth landing, it’s essential to slow down in time for the speed to be close to zero at the time of touchdown.”
Humans can judge their distance from an object using stereo vision, because their two eyes – separated by about 65 mm – capture slightly different views of the object. Insects, however, can’t do the same because their eyes are set too close together, Prof. Srinivasan explains.
“So in order to land on the ground, they use their eyes to sense the speed of the image of the ground beneath them,” he says. “By keeping the speed of this image constant, they slow down automatically as they approach the ground, stopping just in time for touchdown.
“However, in the natural world, bees would only occasionally land on flat, horizontal surfaces. So it’s important to know how they land on rough terrain, ridges, vertical surfaces or flowers with the same delicacy and grace.”
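The ground-landing rule described above can be sketched numerically. If the ground image sweeps past the eye at an angular speed equal to forward speed divided by height, then holding that image speed constant forces the bee's speed to fall in proportion to its height. A minimal illustration (the target image speed below is an invented value, not a figure from the study):

```python
def required_speed(height, omega=2.0):
    """Forward speed (m/s) that keeps the ground-image angular
    speed at `omega` (rad/s) when flying `height` metres up."""
    return omega * height

# As height drops, the speed needed to hold the image speed
# constant drops in proportion - an automatic deceleration.
for h in [2.0, 1.0, 0.5, 0.1]:
    print(f"height {h:4.1f} m -> speed {required_speed(h):4.1f} m/s")
```

No distance sensor or speedometer appears anywhere in the rule: the image speed alone drives the slowdown.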
In the study, the VC researchers trained honeybees to land on vertically mounted discs, filming them with high-speed video cameras.
“The discs carried spiral patterns that could be rotated at various speeds by a motor,” says Prof. Srinivasan. “When we spun the spiral to make it appear to expand, the bees ‘hit the brakes’ because they thought they were approaching the disc much faster than they really were.
“When we spun the spiral the other way to make it appear to contract, the bees sped up, sometimes crashing into the disc. This shows that landing bees keep track of how rapidly the image ‘zooms in’, and they adjust their flight speed to keep this ‘zooming rate’ constant.”
“Imagine you’re in space and you don’t know how far away you are from a star,” Prof. Srinivasan says. “As you fly towards it, the other stars ‘move away’ and it becomes the focus. Then when the star starts to ‘zoom in’ faster than the regular rate, you’ll slow down to keep the ‘zooming rate’ constant.
“It’s the same for bees – when they’re about to reach a flower, the image of the flower will expand faster than usual. This causes them to slow down more and more as they get closer, eventually stopping when they reach it.”
The VC researchers also developed a mathematical model for guiding landings, based on the bees’ landing strategy. Prof. Srinivasan says that, unlike current engineering-based methods, this visually guided technique requires no knowledge of the distance to the surface or the speed at which it is approached.
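The core of the strategy can be sketched as a simple simulation. For an object of fixed size, the relative rate at which its image expands equals approach speed divided by distance, so holding that rate at a constant value makes the commanded speed proportional to the remaining distance – and the approach decays smoothly to zero. The constants below are illustrative choices, not parameters from the paper:

```python
def simulate_landing(d0=10.0, k=0.5, dt=0.01, d_touch=0.01):
    """Approach a surface while holding the relative image expansion
    rate (speed / distance) at k. Returns (time, distance) samples."""
    d, t = d0, 0.0
    samples = [(t, d)]
    while d > d_touch:
        v = k * d        # speed command that keeps the expansion rate at k
        d -= v * dt      # Euler step toward the surface
        t += dt
        samples.append((t, d))
    return samples

samples = simulate_landing()
t_end, d_end = samples[-1]
v_end = 0.5 * d_end      # speed is proportional to distance, so it is
                         # close to zero at touchdown - a soft landing
```

Because the controller only needs the expansion rate of the image, neither the true distance nor the true speed ever has to be measured.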
“The problem with current robot aircraft technology is that it needs radar, sonar or laser beams to work out how far away the surface is,” Prof. Srinivasan says. “Not only is the equipment expensive and cumbersome, but using active radiation can also give the aircraft away.
“On the other hand, this vision-based system only requires a simple video camera that can be found in smartphones. The camera, by ‘seeing’ how rapidly the image expands, allows the aircraft to land smoothly and undetected on a wide range of surfaces with the precision of a honeybee.”
The study “A universal strategy for visually guided landing” by Emily Baird, Norbert Boeddeker, Michael R. Ibbotson and Mandyam V. Srinivasan is published in the latest issue of the Proceedings of the National Academy of Sciences (PNAS). See: http://bit.ly/1aarog4
The Vision Centre is funded by the Australian Research Council as the ARC Centre of Excellence in Vision Science.
Prof. Mandyam Srinivasan, The Vision Centre and UQ, ph +61(0)7 3346 6322 or 0434 603 082
Prof. Ted Maddess, Director, The Vision Centre, ph +61 (0)2 6125 4099 or 0411 443 415
Bryony Webster, COO, The Vision Centre, ph +61 (0)2 6125 5398
Mikaeli Costello, Queensland Brain Institute, UQ, ph +61(0)401 580 685
Mandy Thoo, The Vision Centre media contact, ph +61(0)435 759 182
Distributed by http://www.scinews.com.au
:: SciNews - ACT Registered Business F00130078 - http://www.scinews.com.au ::