A male fruit fly, poised for courtship, abruptly ceased his mating song as a brief flash of green light interrupted his advance. The female, unimpressed by the broken serenade, moved away. This wasn’t a case of cold feet, but a demonstration of a new artificial intelligence system capable of identifying animal behaviors and controlling the brain circuits that drive them.
Developed by a collaborative team from Nagoya University, Osaka University, and Tohoku University in Japan, the AI, dubbed YORU (Your Optimal Recognition Utility), represents a significant leap forward in behavioral neuroscience. The research, published in Science Advances, details an AI system that can not only recognize which animal is performing a behavior within a group, but also selectively target that individual’s brain cells during social interactions.
YORU’s accuracy in recognizing behaviors across species exceeds 90%, encompassing actions like food-sharing in ants, social orientation in zebrafish, and grooming in mice. The system’s capabilities were dramatically demonstrated with fruit flies, where researchers combined YORU with optogenetics – a technique using light to control genetically engineered neurons – to silence the neurons responsible for the male’s courtship song, effectively reducing his chances of mating success.
Traditional methods of behavioral analysis rely on tracking individual body parts frame by frame, a process akin to motion capture used in video games. This approach becomes exceedingly complex when multiple animals are interacting or overlapping, and it struggles to keep pace with experiments requiring split-second timing. YORU circumvents these limitations by recognizing entire behaviors from a single video frame, rather than tracking points over time. According to Hayato Yamanouchi, co-first author from Nagoya University’s Graduate School of Science, YORU spotted behaviors in flies, ants, and zebrafish with 90-98% accuracy and operated 30% faster than existing tools.
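The article does not describe YORU's code, but the contrast it draws can be sketched. In the hedged example below, `detect_behavior` is a hypothetical stand-in for a frame-level detector: it labels whole behaviors in a single frame, with no need to track keypoints or maintain animal identities across frames. The function names and data are illustrative, not from the YORU codebase.

```python
# Hypothetical sketch: single-frame behavior detection, the approach the
# article attributes to YORU, versus tracking body parts over time.
# All names and data here are illustrative, not from the actual tool.

def detect_behavior(frame):
    """Return (behavior_label, bounding_box) for each animal in one frame.

    A real system would run a trained detection network here; this stub
    reads pre-labelled data just to show the stateless, per-frame interface.
    Because no pose history is needed, overlapping animals and tight timing
    are easier to handle than with frame-by-frame keypoint tracking.
    """
    return [(animal["label"], animal["bbox"]) for animal in frame["animals"]]

# A single simulated frame containing two flies.
frame = {
    "animals": [
        {"label": "wing_extension", "bbox": (40, 60, 90, 110)},
        {"label": "resting",        "bbox": (200, 30, 240, 70)},
    ]
}

detections = detect_behavior(frame)
print(detections[0][0])  # -> wing_extension
```

One frame in, one set of labelled behaviors out: there is no tracking state to initialize or lose, which is what makes a stateless detector suited to split-second closed-loop experiments.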
The true innovation, explained senior author Azusa Kamikouchi, lies in the integration of YORU with optogenetics. “You can silence fly courtship neurons the instant YORU detects wing extension,” she stated. “In a separate experiment, we used targeted light that followed individual flies and blocked just one fly’s hearing neurons while others moved freely nearby.”
This ability to focus control on individual animals addresses a critical shortcoming of previous methods, which could only illuminate entire experimental chambers, affecting all subjects simultaneously and hindering the study of individual roles in social dynamics. The process relies on a three-step approach:
First, animals are genetically modified to express light-sensitive proteins, known as opsins, in specific neurons. These proteins either activate or silence neurons, depending on the opsin used. Second, YORU captures the animal’s behavior in real time using a camera. When the AI detects the target behavior, it sends an electrical signal to a light source, which then illuminates the target animal. Finally, the light activates or blocks the genetically modified neurons, altering the animal’s brain activity and, in turn, its behavior.
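The three steps above amount to a closed loop: detect, trigger, illuminate. The sketch below is a minimal, hypothetical rendering of that loop; `run_closed_loop`, `LightSource`, and the simulated detector output are all assumptions for illustration, not the actual YORU or hardware API.

```python
# Hypothetical closed-loop sketch of the three-step approach described above.
# LightSource and run_closed_loop are illustrative stand-ins, not YORU APIs.

from dataclasses import dataclass, field

@dataclass
class LightSource:
    """Stand-in for the optogenetic light driver (step 3)."""
    pulses: list = field(default_factory=list)

    def illuminate(self, bbox):
        # A real driver would aim the light at the target animal's location,
        # hitting only that individual's opsin-expressing neurons.
        self.pulses.append(bbox)

def run_closed_loop(frames, target_behavior, light):
    """Step 2: scan each frame's detections; on a match, fire the light."""
    for detections in frames:
        for label, bbox in detections:
            if label == target_behavior:
                light.illuminate(bbox)

# Simulated per-frame detector output: (behavior_label, bounding_box).
frames = [
    [("resting", (0, 0, 10, 10))],
    [("wing_extension", (40, 60, 90, 110)), ("resting", (200, 30, 240, 70))],
    [("wing_extension", (42, 61, 92, 112))],
]

light = LightSource()
run_closed_loop(frames, "wing_extension", light)
print(len(light.pulses))  # -> 2
```

Because the light is aimed using the detected animal's position, only the target individual is stimulated, which is the capability the article contrasts with whole-chamber illumination.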
YORU’s versatility extends beyond its cross-species applicability. The AI can be trained to recognize new behaviors with minimal data and requires no specialized programming skills to operate. The Nagoya University team has made YORU freely available online, aiming to empower scientists worldwide studying the neural basis of social interactions.