AI Silences Fruit Fly Courtship Song & Controls Neurons in Real-Time

by Rachel Kim – Technology Editor

In a breakthrough that blends artificial intelligence with neuroscience, researchers have developed a system capable of identifying and interrupting a fruit fly’s courtship ritual in real time. The technology, detailed in the journal Science Advances, allows scientists to pinpoint the neural circuits responsible for complex social behaviors with unprecedented precision.

The system, dubbed YORU, utilizes a novel approach to behavioral detection. Rather than tracking individual body parts – a method often hampered by overlapping limbs and obscured views – YORU identifies entire postures as single behaviors within a single video frame. This “whole posture detection” proved remarkably accurate, achieving 90% to 98% accuracy across fruit flies, ants, and zebrafish for several challenging social behaviors, according to the study.

“Traditional tools tracked key points frame by frame, but social contact hid those points and made the math unstable,” explained Professor Azusa Kamikouchi of Nagoya University, the study’s senior author. “By treating the whole posture as the clue, YORU kept working in crowds.”

The speed of YORU is critical. The entire process, from camera frame capture to triggering a response, averages just 31 milliseconds – fast enough to intervene before a behavior, such as a wing extension during courtship, is completed. That is roughly 30% faster than comparable pose trackers, whose average delay was 47 milliseconds.
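As a back-of-the-envelope check of that speedup, the two reported latencies work out as follows (the helper function here is ours, not part of YORU):

```python
def percent_faster(baseline_ms: float, new_ms: float) -> float:
    """Return how much faster new_ms is relative to baseline_ms, in percent."""
    return (baseline_ms - new_ms) / baseline_ms * 100

# YORU's reported 31 ms end-to-end latency vs. the 47 ms pose-tracker average
speedup = percent_faster(47.0, 31.0)
print(f"{speedup:.0f}% faster")  # about 34%, i.e. "approximately 30%"
```

A 16-millisecond saving sounds small, but at typical video frame rates it is the difference between triggering within the same behavior and reacting one or two frames too late.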

To link behavior detection with neural control, the researchers employed optogenetics, a technique that allows specific neurons to be activated or silenced using light. Flies were genetically engineered to express light-sensitive proteins in targeted brain cells. When YORU detected the initiation of a courtship wing extension, it triggered a precisely aimed light pulse that silenced the relevant neurons, effectively halting the behavior. During two-fly tests, the light remained focused on the intended target 89.5% of the time.
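The closed-loop logic can be sketched roughly as follows. This is a simplified illustration rather than the authors' code; the frame source, posture detector, and light controller are stand-in interfaces:

```python
def closed_loop_step(frame, detect_posture, fire_light):
    """One iteration of the detect-then-stimulate loop.

    detect_posture(frame) -> a behavior label such as "wing_extension", or None
    fire_light()          -> triggers the aimed optogenetic light pulse
    Returns the behavior that triggered stimulation, or None.
    """
    behavior = detect_posture(frame)   # whole-posture classification, one frame
    if behavior == "wing_extension":   # courtship onset detected
        fire_light()                   # silence the targeted neurons with light
        return behavior
    return None

# Minimal demonstration with stand-in components:
pulses = []
result = closed_loop_step(
    frame="fake_frame",
    detect_posture=lambda f: "wing_extension",
    fire_light=lambda: pulses.append("pulse"),
)
```

In the real system the entire step, from frame capture through light trigger, averages the 31 milliseconds reported above, and the light pulse is spatially aimed so that only the detected fly is illuminated.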

This level of precision allows researchers to isolate the impact of specific neural circuits on social interactions without disrupting the behavior of nearby animals. “We can silence fly courtship neurons the instant YORU detects wing extension,” Kamikouchi stated.

Beyond controlling behavior, the researchers demonstrated YORU’s ability to interpret brain activity. By combining the behavioral data with calcium imaging – a technique that tracks neuron activity – they were able to correlate specific running and grooming behaviors in mice with distinct patterns of neural activity. The maps generated by YORU aligned with those created by human scoring, validating the tool’s reliability.
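The behavior-to-activity correlation described above can be illustrated with a toy calculation: average a neuron's calcium signal (dF/F) over the frames assigned to each behavior label. All numbers below are invented for illustration:

```python
from collections import defaultdict

def mean_activity_by_behavior(labels, dff):
    """Average a neuron's dF/F signal over the frames labeled with each behavior."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for label, value in zip(labels, dff):
        sums[label] += value
        counts[label] += 1
    return {behavior: sums[behavior] / counts[behavior] for behavior in sums}

# Toy example: frame-by-frame behavior labels and a matching calcium trace
labels = ["running", "running", "grooming", "grooming", "running"]
dff    = [0.8,        0.9,       0.2,        0.3,        0.7]
print(mean_activity_by_behavior(labels, dff))
```

Comparing such per-behavior activity maps against maps built from human-scored labels is one straightforward way to validate an automated detector, as the study did for mouse running and grooming.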

The researchers acknowledge limitations to the system. Some complex social behaviors unfold over multiple frames, potentially causing the single-frame detector to miss crucial details. The system currently lacks built-in identity tracking, meaning it can identify a behavior but not always confirm which individual is performing it over time. Hardware limitations, such as projector and controller delays, also pose a challenge.

To promote wider adoption, the researchers developed a user-friendly graphical interface, allowing scientists without extensive coding experience to train new behavior detectors and run experiments. The system is designed to integrate with existing laboratory equipment, such as cameras and lights.

The team is now focused on capturing longer, more complex behaviors and reducing hardware delays to improve targeting accuracy in larger, more dynamic groups. Future research will explore the ethical implications of this technology as it becomes more widely available.
