Abstract: Anyone who has tried to swat a fly has likely been frustrated by its remarkable agility. This ability stems from its visual neural perception system, particularly the collision-selective neurons within its small brain. For autonomous robots operating in complex and unfamiliar environments, achieving similar agility is highly desirable but often constrained by the trade-off between computational cost and performance. In this context, insect-inspired intelligence offers a parsimonious route to low-power, computationally efficient frameworks. In this paper, we propose an attention-driven visuomotor control strategy inspired by a specific class of fly visual projection neurons, the lobula plate/lobula columnar, type 2 (LPLC2) neurons, and their associated escape behaviors. To our knowledge, this represents the first embodiment of an LPLC2 neural model in the embedded vision of a physical mobile robot, enabling collision perception and reactive evasion. The model was simplified and optimized to a 70 KB memory footprint to suit the computational constraints of a vision-based micro robot, the Colias, while preserving its key neural perception mechanisms. We further incorporated multi-attention mechanisms to emulate the distributed nature of LPLC2 responses, allowing the robot to detect and react to approaching targets both rapidly and selectively. We systematically evaluated the proposed method against a state-of-the-art locust-inspired collision-detection model. Results showed that the fly-inspired visuomotor model achieved comparable robustness, with a collision-detection success rate of 96.1%, while producing more adaptive and elegant evasive maneuvers. Beyond demonstrating an effective collision-avoidance strategy, this work highlights the potential of fly-inspired neural models for advancing research into collective behaviors in insect intelligence.
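The key computation behind an LPLC2-style detector is radial motion opponency: outward (expanding) optic flow around a receptive-field center excites the unit, while inward flow inhibits it, and the rectified difference is passed through a saturating nonlinearity that can trigger evasion. The Python sketch below is only a minimal illustration of this idea under those assumptions; the flow input, function names, thresholds, and the squaring nonlinearity are illustrative choices, not the authors' optimized 70 KB Colias implementation.

# Minimal sketch (not the authors' Colias implementation): an LPLC2-style
# looming cue computed from a dense optic-flow field via radial motion
# opponency. Names, parameters, and the nonlinearity are illustrative.
import numpy as np

def radial_opponency_response(flow_x, flow_y, center):
    """Excite on outward (expanding) flow, inhibit on inward flow,
    pooled over the receptive field centered at `center` (row, col)."""
    h, w = flow_x.shape
    rows, cols = np.mgrid[0:h, 0:w]
    # Unit vectors pointing away from the receptive-field center.
    dy, dx = rows - center[0], cols - center[1]
    norm = np.sqrt(dx**2 + dy**2) + 1e-6
    ux, uy = dx / norm, dy / norm
    # Radial component of the flow: positive = outward, negative = inward.
    radial = flow_x * ux + flow_y * uy
    excitation = np.clip(radial, 0, None).sum()   # outward motion
    inhibition = np.clip(-radial, 0, None).sum()  # inward motion
    # Half-wave rectified opponency followed by a saturating nonlinearity.
    drive = max(excitation - inhibition, 0.0)
    return drive**2 / (drive**2 + 1e3)            # response in [0, 1)

def should_evade(response, threshold=0.5):
    """Trigger an evasive turn when the looming response exceeds a threshold."""
    return response > threshold

if __name__ == "__main__":
    # Synthetic expanding-flow field, as produced by an approaching object.
    h, w = 64, 64
    rows, cols = np.mgrid[0:h, 0:w]
    fx, fy = (cols - w / 2) * 0.1, (rows - h / 2) * 0.1
    r = radial_opponency_response(fx, fy, center=(h // 2, w // 2))
    print(f"looming response: {r:.3f}, evade: {should_evade(r)}")

In an embodied setting, a response like this would feed the robot's motor layer, with the turn direction chosen away from the receptive-field center that fired; that mapping is likewise only sketched here.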




Abstract: Lobula plate/lobula columnar, type 2 (LPLC2) visual projection neurons in the fly's visual system possess highly looming-selective properties, making them ideal for developing artificial collision detection systems. The four dendritic branches of individual LPLC2 neurons, each tuned to specific directional motion, enhance the robustness of looming detection by utilizing radial motion opponency. Existing models of LPLC2 neurons either concentrate on individual cells to detect centroid-focused expansion or utilize population-voting strategies to obtain global collision information. However, their potential for addressing multi-target collision scenarios remains largely untapped. In this study, we propose a numerical model for LPLC2 populations, leveraging a bottom-up attention mechanism driven by motion-sensitive neural pathways to generate attention fields (AFs). This integration of AFs with highly nonlinear LPLC2 responses enables precise and continuous detection of multiple looming objects emanating from any region of the visual field. We began by conducting comparative experiments to evaluate the proposed model against two related models, highlighting its unique characteristics. Next, we tested its ability to detect multiple targets in dynamic natural scenarios. Finally, we validated the model using real-world video data collected by aerial robots. Experimental results demonstrate that the proposed model excels in detecting, distinguishing, and tracking multiple looming targets with remarkable speed and accuracy. This advanced ability to detect and localize looming objects, especially in complex and dynamic environments, holds great promise for overcoming collision-detection challenges in mobile intelligent machines.
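As a rough illustration of how attention fields might gate a population of LPLC2-like responses to localize several looming targets at once, the sketch below combines a motion-driven saliency map (the AF) with a per-location looming map and reads out local maxima of the gated, nonlinearly amplified response. Everything here (the Gaussian attention field, the expansive nonlinearity, the peak-picking window and thresholds) is an assumption for illustration and not the paper's exact formulation.

# Illustrative sketch only: LPLC2-like population responses gated by
# motion-driven attention fields (AFs), so several looming objects can be
# localized at once. All names and parameter values are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def attention_fields(motion_energy, sigma=4.0):
    """Bottom-up AF: smoothed, normalized motion energy acting as a spatial gate."""
    af = gaussian_filter(motion_energy, sigma)
    return af / (af.max() + 1e-6)

def gated_population_response(looming_map, af, gain=2.0):
    """Gate per-location looming responses by the AF and apply an expansive
    nonlinearity, mimicking the highly nonlinear LPLC2 output."""
    gated = af * looming_map
    return gated**gain / (gated**gain + 0.1)

def localize_targets(response_map, threshold=0.3, window=9):
    """Return (row, col) peaks of the gated response, one per looming target."""
    peaks = (response_map == maximum_filter(response_map, size=window))
    peaks &= response_map > threshold
    return np.argwhere(peaks)

if __name__ == "__main__":
    h, w = 96, 128
    rng = np.random.default_rng(0)
    # Synthetic inputs: two looming "blobs" plus weak background motion noise.
    looming = np.zeros((h, w))
    looming[20:30, 20:30] = 1.0
    looming[60:75, 90:110] = 0.8
    motion = looming + 0.05 * rng.random((h, w))
    af = attention_fields(motion)
    resp = gated_population_response(looming, af)
    print("detected looming targets at:", localize_targets(resp).tolist())

Because the AF is computed bottom-up from motion energy, each looming object carries its own attention peak, which is what allows the gated population response to keep multiple targets separated rather than collapsing them into a single global collision signal.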