Robotic models for the honeybee visual odometer

Lucia Bergantin
ISM, Aix-Marseille Université
https://fr.linkedin.com/in/lucia-bergantin-b7b026153

Date: 16/03/2023
2:00 pm – 3:00 pm

In the hive, foraging honeybees inform their nestmates about the location of a food source by performing a waggle dance, which carries knowledge about the direction and distance to travel. Previous studies have suggested that the odometer (serving as a distance-meter) of flying honeybees assesses distance by mathematically integrating the raw angular velocity of the image sweeping backwards across their ventral viewfield, which is known as the translational optic flow.

In aerial robotic applications, performing visual odometry onboard micro- and nano-drones is a particularly challenging task due to the low computational and perception resources available. Several winged insects, such as bees and butterflies, oscillate up and down while flying forward, adding an expansion and contraction component to their ventral optic flow vector field: this is the optic flow divergence. The question arises as to how raw integration of the optic flow (expressed in rad/s) could reliably encode a distance, since the optic flow depends on both the ground speed and the ground height.

In this thesis, a model for the honeybee visual odometer, called SOFIa, is presented. The current ground height is estimated solely by means of an Extended Kalman Filter (EKF) and the optic flow divergence generated by the oscillating trajectory. The ground height estimate scales the translational optic flow, which is then mathematically integrated to obtain the distance travelled. By measuring the translational and divergence optic flow cues with optic flow sensors, the SOFIa visual odometer was tested onboard a hexarotor, both indoors and outdoors.

A second model for the visual odometer, called SuRf, was also developed and tested in simulation. The SuRf visual odometer is likewise based on scaling the translational optic flow, but in this case the optic flow taken into account is always perceived perpendicularly to the surface below. For this purpose, an active reorientation process was added so as to always keep the visual plane parallel to the ground below. The SuRf model improved the odometric performance over uneven terrain in comparison with the raw SOFIa model.

Modelling the honeybee visual odometer using biologically plausible vision is therefore of great interest for two main reasons: (i) it sheds new light on the neuro-ethological processes at work in winged insects, and (ii) it opens the way to providing micro flying robots with minimalistic visual odometric equipment and abilities.
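The scale-then-integrate principle behind SOFIa can be illustrated with a deliberately simplified sketch, not the thesis's EKF: assuming flat ground, a known sinusoidal vertical oscillation, and noise-free optic flow cues, the ground height can be recovered directly from the divergence cue and used to rescale the translational optic flow before integration. All variable names and the sign convention for divergence are illustrative assumptions.

```python
import math

# Simulated flight: constant forward speed over flat ground, with the
# up-and-down oscillation that generates optic flow divergence.
# Cues available to the odometer:
#   w = vx / h   translational optic flow (rad/s)
#   D = vz / h   optic flow divergence (1/s), sign convention assumed here
# Scaling w by a height estimate h_hat = vz / D and integrating recovers
# the distance travelled -- a crude stand-in for SOFIa's EKF.

dt = 0.01                      # time step (s)
vx = 1.0                       # forward ground speed (m/s)
h0, A, f = 2.0, 0.2, 1.0       # mean height (m), oscillation amplitude (m), frequency (Hz)

h_hat = 1.0                    # deliberately wrong initial height guess (m)
distance_est = 0.0
true_distance = 0.0

for k in range(1, 1001):       # 10 s of flight
    t = k * dt
    h = h0 + A * math.sin(2 * math.pi * f * t)                 # true ground height
    vz = A * 2 * math.pi * f * math.cos(2 * math.pi * f * t)   # vertical speed
    w = vx / h                                                 # translational optic flow
    D = vz / h                                                 # optic flow divergence
    if abs(D) > 1e-3:          # divergence observable: update the height estimate
        h_hat = vz / D
    distance_est += w * h_hat * dt                             # scaled optic flow integration
    true_distance += vx * dt

print(round(distance_est, 2), round(true_distance, 2))  # prints: 10.0 10.0
```

Integrating the raw optic flow w alone would have yielded a height-dependent value in radians; the divergence-based height estimate is what turns it into a distance in metres.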

https://www.researchgate.net/profile/Lucia-Bergantin-2


IOSSB Seminar


Location
Virtual event

