NeuroMechFly v2: Advancing Brain-Body Coordination in Robotics and AI

Researchers have unveiled a sophisticated digital twin of the adult fruit fly, known as NeuroMechFly v2, marking a significant leap in our ability to simulate how biological organisms process sensory information to navigate the physical world. Developed at the Neuroengineering Laboratory at EPFL in Lausanne, Switzerland, this model provides a high-fidelity environment to study the complex interplay between a nervous system and a physical body.

The project focuses on Drosophila melanogaster, the common fruit fly, creating a virtual counterpart that can see, smell, and walk across challenging terrains. By integrating vision and olfaction with a detailed biomechanical framework, NeuroMechFly v2 allows scientists to investigate hierarchical sensorimotor control—essentially how the brain and motor systems coordinate to produce complex behaviors.

This advancement is more than a biological curiosity; it serves as a bridge between neuroscience and robotics. By building biologically inspired controllers, the team is exploring how the principles of insect navigation can inform more efficient machine learning-based controllers for autonomous artificial agents and robots.

The Anatomy of a Digital Twin: Vision, Smell, and Body

At its core, NeuroMechFly v2 is built upon a rigorous biomechanical model derived from micro-CT scans of a real adult female fly. According to the project’s documentation, researchers adjusted several body segments, specifically within the antennae, so that the model better matches the actual physical properties of the insect.

The model’s sensory capabilities are designed to mimic the specialized organs of a fruit fly:

  • Vision: The fly is equipped with simulated compound eyes. These consist of individual units called ommatidia, arranged on a hexagonal lattice, which sample visual inputs across the simulated retinas.
  • Olfaction: The simulation includes odor receptors located in the maxillary palps and the antennae. The system computes chemical intensity at these specific locations to simulate how a fly experiences smell.
  • Locomotion: The model supports the navigation of complex terrains, utilizing simulated leg adhesion to move across various surfaces.
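
To make the olfaction idea concrete, here is a minimal sketch of sampling odor intensity at body-attached sensor sites. The inverse-power decay model, the coordinates, and all names below are illustrative assumptions, not the FlyGym implementation, which computes chemical intensity from its own scene description.

```python
import numpy as np

def odor_intensity(source_pos, source_peak, sensor_pos, decay=2.0):
    """Toy model: intensity falls off with distance from a point odor source.

    The decay law and parameters are assumptions for illustration only.
    """
    dist = np.linalg.norm(np.asarray(sensor_pos) - np.asarray(source_pos))
    return source_peak / (1.0 + dist ** decay)

# Sample at the four simulated sensor sites (two antennae, two maxillary palps).
# Positions are made up; the source sits slightly to the fly's left (positive y).
source = np.array([10.0, 2.0, 1.0])
sensors = {
    "antenna_L": np.array([0.3, 0.2, 0.8]),
    "antenna_R": np.array([0.3, -0.2, 0.8]),
    "palp_L": np.array([0.2, 0.1, 0.6]),
    "palp_R": np.array([0.2, -0.1, 0.6]),
}
readings = {name: odor_intensity(source, 100.0, pos) for name, pos in sensors.items()}
```

The left-right difference between paired readings is exactly the kind of signal a controller can steer on: turn toward the side with the stronger reading.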

Controlling these systems is a simulated Central Nervous System (CNS). The architecture follows a hierarchy consisting of the brain and the Ventral Nerve Cord (VNC), a structure that is biologically analogous to the brain-spinal cord organization found in humans.

Bridging the Gap with Sensorimotor Control

One of the primary goals of NeuroMechFly v2 is to move beyond simple motor control and instead explore how the brain and motor systems work in tandem. To achieve this, the researchers implemented ascending motor feedback, which allows the model to utilize information from the body to inform the brain’s decisions.

The team demonstrated the power of this approach by constructing controllers capable of head stabilization and path integration. To test the integration of multiple senses, they used reinforcement learning to train a controller on a multimodal navigation task that requires the fly to use vision and smell simultaneously, as detailed in their research indexed on PubMed.
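
Path integration itself reduces to dead reckoning: the animal accumulates its own estimated self-motion to maintain a running "home vector." The sketch below shows that core idea only; it is not the paper's controller, and the function name and inputs are assumptions.

```python
import numpy as np

def path_integrate(headings, speeds, dt):
    """Dead-reckoning position estimate from egocentric self-motion.

    Integrates forward speed along the instantaneous heading at each step.
    A minimal illustration of path integration, not the published controller.
    """
    pos = np.zeros(2)
    for theta, v in zip(headings, speeds):
        pos += v * dt * np.array([np.cos(theta), np.sin(theta)])
    return pos

# A fly walking a closed square loop should integrate back to its start point.
dt = 1.0
headings = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
speeds = [1.0, 1.0, 1.0, 1.0]
home_vector = path_integrate(headings, speeds, dt)
```

Negating the accumulated vector at any moment gives the straight-line direction home, which is what makes path integration useful for navigation.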

Further experiments pushed the boundaries of bio-realistic modeling. These included simulating the navigation of complex odor plumes and “fly-fly following,” a behavior achieved through a visual network constrained by the fly’s actual connectome (the map of neural connections in the brain).

FlyGym 2.x.x: A Massive Leap in Performance

To make these simulations accessible and efficient, the team developed FlyGym, the Python library that powers NeuroMechFly. In a major update in March 2026, they introduced the FlyGym 2.x.x API, featuring a complete code rewrite and a redesigned interface, according to the official NeuroMechFly update.

The performance gains from this rewrite are substantial, allowing for much faster iteration in research. The updated API delivers:

  • CPU-based simulations: Approximately 10x speed-up, resulting in roughly 2x real-time throughput.
  • GPU-based simulations: Approximately 300x speed-up via Warp/MJWarp, achieving roughly 60x real-time throughput.
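
As a back-of-envelope check, the two quoted figures are mutually consistent if both speed-ups are measured against the same 1.x baseline (an assumption on my part): each implies the old code ran at roughly 0.2x real time.

```python
# Quoted figures: speed-up factor and resulting real-time throughput.
cpu_speedup, cpu_realtime = 10, 2    # ~10x faster  -> ~2x real time
gpu_speedup, gpu_realtime = 300, 60  # ~300x faster -> ~60x real time

# Implied pre-rewrite baseline throughput (fraction of real time).
baseline_cpu = cpu_realtime / cpu_speedup  # ~0.2x real time
baseline_gpu = gpu_realtime / gpu_speedup  # ~0.2x real time
```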

Beyond raw speed, the 2.x.x version introduced an interactive viewer, a simplified dependency stack, and an improved workflow for scene composition. Since this version is not backward compatible, the original FlyGym 1.x.x has been migrated to flygym-gymnasium, with its own dedicated documentation.

Why This Matters for AI and Robotics

The implications of NeuroMechFly v2 extend far beyond the study of insects. In the field of robotics, creating agents that can navigate unpredictable environments is a perennial challenge. Most current AI controllers rely on massive datasets or simplified physics; however, by studying the “embodied” intelligence of a fruit fly, engineers can discover more efficient ways to handle sensorimotor feedback.

The use of connectome-constrained networks—where the AI’s architecture is limited by the actual physical wiring of a biological brain—suggests a path toward more energy-efficient and robust autonomous systems. By understanding how a tiny insect manages to stabilize its head or track an odor plume with minimal computing power, researchers can develop artificial agents that are more agile and adaptive.
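
The constraint itself is simple to express: trainable weights are allowed only where the connectome records a synapse, which is typically implemented by masking the weight matrix. The toy network below illustrates that pattern; the sizes, random connectome, and update rule are all invented for illustration and are not the paper's visual network.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6  # toy circuit size; a real connectome spans many thousands of neurons

# Hypothetical binary connectome: entry (i, j) = 1 if neuron j synapses onto neuron i
connectome = (rng.random((n, n)) < 0.3).astype(float)

# Weights may only be nonzero where the connectome has a synapse
weights = rng.normal(size=(n, n)) * connectome

def step(activity):
    """One update of a rate network whose sparsity is fixed by the connectome."""
    return np.tanh(weights @ activity)

state = step(np.ones(n))
```

During training, gradients would be masked the same way, so learning can tune synapse strengths but never add wiring the biological brain lacks.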

Key Technical Specifications

NeuroMechFly v2 technical overview:

  • Biological base: adult female Drosophila melanogaster (micro-CT scan)
  • Visual system: hexagonal lattice of ommatidia
  • Olfactory system: antennae and maxillary palp receptors
  • Control hierarchy: brain and Ventral Nerve Cord (VNC)
  • Software interface: FlyGym 2.x.x (Python)
  • GPU acceleration: Warp/MJWarp (~60x real-time throughput)

As the Neuroengineering Laboratory continues to refine the model, the focus will likely shift toward even more complex behaviors and the integration of more detailed neural circuits. For those interested in the technical implementation, the team encourages feature requests and contributions via their GitHub repository.

We will continue to monitor updates from the EPFL Neuroengineering Laboratory regarding new capabilities added to the FlyGym API and further publications on connectome-based control.

Do you think bio-inspired robotics will eventually outperform traditional AI in navigation? Share your thoughts in the comments below.
