When Maurizio Porfiri set out a decade ago to build a robot that could control how a group of animals behaved, he envisioned using it in the wild to perform environmental functions such as steering fish away from danger. “If there was an oil spill or a natural disaster, then you could use the robot as kind of a sheepdog for driving the fish away from the polluted region,” he says.

Robot meets real. Credit: Illustration: E. Dewalt / Springer Nature; background image: Getty

His team first embarked on the project from a distinctly engineering point of view—what they cared about was whether the fish could be directed from one place to another by the robotic device. But their interest soon veered in a more philosophical direction. “As we were doing the experiments, we got interested in what’s going on in their heads,” says Porfiri, a mechanical engineer at New York University Tandon School of Engineering. “How do they perceive the mimicry of the robot? How do they respond to it? And does their response depend on their personality?”

These are the questions Porfiri investigates today with zebrafish, his species of choice, and a workshop for building any manner of robotic rig. He and a handful of other researchers working at the intersection of biology and robotics are exploring the intriguing possibilities and the inherent challenges of creating robot versions of animals that can interact with their flesh-and-blood counterparts.

There’s a lot to be learned from letting robots loose in a group of behaving animals. For one thing, to understand animal behavior—be it directional decision-making in fish, communication in honeybees, or shelter-seeking in cockroaches—researchers implicitly or explicitly create a conceptual model of it. Recreating that behavior in a robot through cues convincing enough that the robot is accepted by its unmechanized peers provides a way of validating that model, says José Halloy, professor of physics at the Université Paris Diderot, who has built robotic cockroaches, chicks, and now zebrafish.

Such robots allow researchers to probe animals’ reactions to different variables in highly standardized ways. For example, it can be tough to tease out how an animal’s size affects how it interacts with its conspecifics, notes Porfiri, since size is usually accompanied by other factors such as age and fitness, which can in turn affect behavior. “With a robot, you can keep everything the same, and just change the size.”

Engineering robotic interlopers that can have sustained social interactions with their target organisms isn’t easy, however. “At the end of the day it has to be accepted by the animal,” says Halloy. Invariably, researchers encounter limitations—often unexpected ones—in biological knowledge. Which specific cues would make the artificial creatures most realistic to the real ones? What kinds of information should be programmed in the algorithms that would allow the robots to dynamically interact with the animals? Then there are the seemingly more mundane issues: Can the programs that ensure the robot doesn’t bump into animals, or the walls of a testing space, run in parallel with those that govern its higher-order interactions? Will the hum of the motor be too loud?

Follow the cues

Animal-inspired design has long been a theme in robotics, but much of it involves creating robots for human use—or simply for human fascination. Think of the fantastical creations of the companies Boston Dynamics and Festo, mimicking everything from ants, fleas and spiders to dogs, cheetahs and kangaroos. Since about the early 1990s, researchers have also designed robots that mimic features of specific animals in order to investigate how they perform certain behaviors. For example, bioroboticist Barbara Webb at the University of Edinburgh creates robotic insects to study complex behaviors, such as how ants navigate. Auke Ijspeert, at the Swiss Federal Institute of Technology in Lausanne, uses robotic salamanders to explore how the modeled animal’s neural circuitry supported its evolutionary shift from aquatic to terrestrial locomotion.

But only in the last decade have researchers begun to study how animals interact with robotic versions of themselves. One root of such efforts stretches decades back to ethologists such as Nikolaas Tinbergen, who shared the 1973 Nobel Prize in Physiology or Medicine for showing he could elicit instinctual behaviors, such as fighting, from fish using wooden dummies that carried species-specific cues. That work revealed that artificial animals could trick the real ones into interacting with them if they conveyed the right signals. Robotics, however, opens another dimension because the possibilities for interaction can go both ways and are significantly more complex.

In 2007, Halloy and his colleagues created cockroach robots that could integrate into a social group of real roaches and influence its dynamics [1]. Cockroaches aren’t very visual, so their mechanized brethren didn’t have to look like them. Instead, the researchers made a concoction of chemicals that Halloy calls Cockroach Chanel #5, essentially rebuilding the olfactory cue through which cockroaches communicate.

Cockroaches tend to scurry out of the light, so the team created two shelters in a well-lit enclosed space, one invitingly dark and the other a bit brighter. When the natural roaches and their four robot relatives were first released into the enclosure, group social dynamics prevailed and both real and robotic roaches gravitated to the darker shelter. But when the robo-roaches were programmed to prefer the lighter shelter, they could lure the insects to follow them there. The robots allowed the researchers to test their understanding of the animals’ behavior by pushing it in a direction that wouldn’t naturally arise, Halloy says.
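
In models of this kind, shelter choice emerges from simple rules of attraction to darkness and to conspecifics. The Python sketch below is a hedged illustration of that style of model; the probabilities, parameter values and saturating social term are assumptions for demonstration, not the published equations.

```python
import random

DARK, LIGHT = 0, 1
SHELTER_DARKNESS = {DARK: 1.0, LIGHT: 0.4}   # hypothetical darkness scores

def p_stay(shelter, n_here, light_pref=False):
    """Probability an agent stays put: rises with occupancy and with
    darkness (or brightness, for robots programmed to prefer light)."""
    quality = (1 - SHELTER_DARKNESS[shelter]) if light_pref else SHELTER_DARKNESS[shelter]
    social = n_here / (n_here + 3)           # saturating social attraction
    return min(0.95, 0.3 + 0.4 * quality + 0.3 * social)

def simulate(n_roaches=12, n_robots=4, steps=200, robot_prefers_light=True):
    agents = [dict(robot=False, at=random.choice([DARK, LIGHT])) for _ in range(n_roaches)]
    agents += [dict(robot=True, at=random.choice([DARK, LIGHT])) for _ in range(n_robots)]
    for _ in range(steps):
        counts = {s: sum(a["at"] == s for a in agents) for s in (DARK, LIGHT)}
        for a in agents:
            pref = robot_prefers_light and a["robot"]
            if random.random() > p_stay(a["at"], counts[a["at"]], light_pref=pref):
                a["at"] = 1 - a["at"]        # leave and try the other shelter
    return sum(a["at"] == LIGHT for a in agents if not a["robot"])

print("roaches settled in the lighter shelter:", simulate())
```

With the social term switched on, a handful of light-preferring robots can tip the whole group toward the brighter shelter, mirroring the published result in miniature.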

Animal-robot encounters in the lab can also reveal gaps in researchers’ understanding of behaviors, says Tim Landgraf, who heads the Biorobotics Lab at Free University Berlin. About a decade ago, Landgraf set out to create a robot that could communicate with honeybees using the waggle dance, a form of encoded communication these pollinators use to tell their hive-mates the distance and direction of forage sites. The waggle dance has been studied for decades; indeed, Austrian ethologist Karl von Frisch first decoded it in the late 1920s and shared Tinbergen’s Nobel Prize for his work. “But there are still so many details we don’t understand,” Landgraf says.

Waggle and roll: Tim Landgraf's robotic bees don't need to look like an actual bee to perform a convincing waggle dance. Credit: T. Landgraf, Free University Berlin

Initially, he and his colleagues conducted a computer analysis of 108 dances to pin down the moves that varied least across performances. They then created a RoboBee that performed those movements for an audience of bees inside the hive. In its early iteration, the robot could entice bees to leave the hive, but couldn’t convey to them the direction of the food source [2]. Since then, the researchers have improved both the software and the hardware in the robot, and now some bees do follow its signal to the food source. Getting there required much tweaking: Landgraf made the robot’s body softer and changed a few other features, but what helped most, he suspects, was closing the opening to the hive so that the bees were less disturbed by changes in wind and temperature. “In the end it was intuition,” he says. “Nobody could really tell me how to do it right.”
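
The geometry the robot must reproduce is well established: the angle of the waggle run relative to vertical on the comb encodes the food source’s bearing relative to the sun, and the duration of the run grows with distance. Below is a hedged Python sketch of that code; the roughly one-second-per-kilometer rate is a textbook approximation, not Landgraf’s calibration.

```python
import math  # kept for any downstream trigonometry; encoding itself is linear

def dance_parameters(food_bearing_deg, sun_azimuth_deg, distance_m):
    """Encode a forage site as (waggle-run angle from vertical, run duration)."""
    run_angle = (food_bearing_deg - sun_azimuth_deg) % 360  # degrees from vertical
    run_duration_s = distance_m / 1000.0                    # ~1 s of waggling per km
    return run_angle, run_duration_s

def decode(run_angle_deg, run_duration_s, sun_azimuth_deg):
    """Invert the code: what a follower bee would infer from the dance."""
    bearing = (sun_azimuth_deg + run_angle_deg) % 360
    distance_m = run_duration_s * 1000.0
    return bearing, distance_m

angle, duration = dance_parameters(food_bearing_deg=135, sun_azimuth_deg=180, distance_m=750)
print(f"waggle run: {angle:.0f} deg from vertical, {duration:.2f} s")
print("decoded:", decode(angle, duration, sun_azimuth_deg=180))
```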

The robots and the bees: Evolutionary algorithms can learn from the animals, and encourage specific behaviors. Credit: EU FET project ASSISIbf (project coordinator: T. Schmickl)

Thomas Schmickl, professor of zoology at Karl-Franzens University Graz in Austria, heads an international consortium that is also designing robots that interact with honeybees. These robots will modulate temperature in the hive and create subtle patterns of vibrations and air flow that mimic wing beats. The aim is to motivate the bees to create more brood, or eggs, and thereby grow new generations more quickly. Such a boost, he hopes, might prevent colonies from collapsing. It’s pitch dark inside the hives, so the robots don’t have to look like bees—they simply have to emit the right stimuli in a distinct pattern.

Sometimes Schmickl and other researchers doing such work use software that predetermines the robots’ behaviors, but often the robots are not preprogrammed with how best to interact among their living counterparts. Instead, researchers generally use evolutionary computation—essentially, an algorithm that evolves based on feedback from the animals—to allow the robots to train themselves to produce the correct stimuli patterns. Such algorithms have revealed that the simple models researchers devise to describe behaviors are often wrong, Schmickl says. For example, researchers believed that individual bees can differentiate between small differences in temperature, and that they choose the toastier spot by following the temperature gradient uphill, he says. “But that’s not true, we found. They move around more or less randomly, and they solve the problem socially, by interacting with each other.”
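
A minimal sketch of such an evolutionary loop, in Python, is shown below. The fitness function here is a hypothetical placeholder for real sensor feedback (for instance, how strongly bees aggregate around a robot emitting a candidate vibration pattern); the population size and mutation scheme are likewise illustrative assumptions, not the consortium’s implementation.

```python
import random

def score_response(pattern):
    """Placeholder fitness: in a real rig this would be measured from the
    animals' behavior, e.g. how many bees cluster near the robot."""
    target = [0.2, 0.8, 0.5, 0.1]                       # unknown 'good' stimulus
    return -sum((p - t) ** 2 for p, t in zip(pattern, target))

def mutate(pattern, sigma=0.1):
    """Jitter each stimulus parameter, clamped to the valid range [0, 1]."""
    return [min(1, max(0, p + random.gauss(0, sigma))) for p in pattern]

# start from random stimulus patterns and evolve toward higher-scoring ones
population = [[random.random() for _ in range(4)] for _ in range(20)]
for generation in range(50):
    population.sort(key=score_response, reverse=True)
    parents = population[:5]                            # keep the best patterns
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print("best stimulus pattern found:", [round(p, 2) for p in population[0]])
```

The point of the design is that nothing in the loop presumes a behavioral model: whatever the animals actually respond to gets amplified, which is how such systems can contradict researchers’ prior assumptions.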

Tail beats: 3D-printing and a moveable tail help make Porfiri's robotic fish feel more convincing. Credit: NYU-Poly

Robot see, robot do

One widely roboticized animal in this still-nascent field is the fish. As vertebrates, fish behave more variably than insects do, and although a school is not technically a superorganism, fish have strong collective behaviors ripe for robotic probing. Porfiri, Schmickl and Halloy all work with zebrafish these days; Landgraf works with guppies. As zebrafish in particular are already a widely used model species, introducing the robotics component can add a valuable dimension to biomedical research, Porfiri says. His group recently began using their robotic platform to explore how fish change their social behavior in response to pharmacological substances, such as drugs of abuse. They also provide the complete specs for the system to any interested researchers. Porfiri says researchers studying cancer and autism, among other research areas, have reached out to his group for the design of zebrafish replicas and computer codes to score their behavior and conduct experiments.

He and his team whip up hard-plastic or silicone fish on a 3D printer, then paint them—yellow heads and fins, blue-striped grey bodies—and stick on some beady eyes to create convincing zebrafish lookalikes. The dummy is then attached via a rod to a robotic platform mounted onto the side of the testing tank, which drives it along complex trajectories typical of how zebrafish move naturally. Cameras provide real-time video feedback of any real fish in the tank, which the robot uses to adapt the dummy’s behavior. The more the robots are able to mimic the real fish, the more responsive the fish, in turn, grow to them [3]. The real need now is to create a robotic system that matches the complexity of behavioral interactions that the fish have among themselves, and that can respond appropriately to the animals’ cues. “The interaction I get now is rather sad,” he says. “Imagine you are talking with somebody, and they always say what you say.”
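
The closed-loop idea can be sketched in a few lines of Python. In this hedged illustration, a placeholder tracker stands in for the camera pipeline, and the replica’s next waypoint blends a scripted target with attraction toward the school’s centroid; all function names and gains are assumptions for demonstration, not Porfiri’s actual control code.

```python
import random  # stands in for a camera/tracking pipeline in this sketch

def get_fish_positions():
    """Placeholder for real-time video tracking of the live fish (cm)."""
    return [(random.uniform(0, 60), random.uniform(0, 30)) for _ in range(5)]

def centroid(points):
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def next_waypoint(replica_xy, preferred_xy, fish_xy, social_gain=0.4):
    """Blend the replica's scripted target with attraction to the school."""
    px, py = preferred_xy
    fx, fy = fish_xy
    tx = (1 - social_gain) * px + social_gain * fx
    ty = (1 - social_gain) * py + social_gain * fy
    # step a fraction of the way toward the blended target each frame
    rx, ry = replica_xy
    return rx + 0.2 * (tx - rx), ry + 0.2 * (ty - ry)

replica = (10.0, 15.0)
for frame in range(3):
    school = centroid(get_fish_positions())
    replica = next_waypoint(replica, preferred_xy=(50.0, 15.0), fish_xy=school)
    print(f"frame {frame}: replica at ({replica[0]:.1f}, {replica[1]:.1f})")
```

Porfiri’s “they always say what you say” complaint maps onto the limits of a loop like this: a fixed blending rule can only react to the fish, not negotiate with them.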

Rise of the rat-bots: Robotic rats built in Japan could make it easier for researchers to study one-on-one social interactions. Credit: H. Ishii and A. Takanishi Laboratory, Waseda University

Halloy’s lab, working within the consortium led by Schmickl, is trying to solve this problem by creating robot fish that use social cues to lure real fish into changing direction as they swim or moving between connected tanks [4]. This search for behavioral complexity is precisely what makes the field interesting from both a robotics and an ethology perspective, Halloy says. “Fish, in general, are stochastic agents—there’s something random in their behavior,” he explains. “But most deep learning algorithms are deterministic, so they cannot cope easily with random processes.” This forces advances on both sides of the equation. “It’s a kind of mirror effect, with the animal inspiring the artificial intelligence and robotics, and the AI and robotics inspiring what we think of how to model the animals.”
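
The contrast Halloy draws can be made concrete with a toy example: a deterministic policy maps the same input to the same turn every time, whereas a stochastic one samples each turn from a distribution. The Python sketch below is one simple, assumed way to add that randomness; it is not the lab’s model.

```python
import random

def deterministic_turn(angle_to_school_deg):
    """Always turn exactly half-way toward the school: same input, same output."""
    return 0.5 * angle_to_school_deg

def stochastic_turn(angle_to_school_deg, noise_deg=20.0):
    """Sample the turn from a Gaussian centered on the deterministic answer,
    mimicking the trial-to-trial variability of a real fish."""
    return random.gauss(deterministic_turn(angle_to_school_deg), noise_deg)

for trial in range(3):
    print(f"deterministic: {deterministic_turn(60):+.1f} deg, "
          f"stochastic: {stochastic_turn(60):+.1f} deg")
```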

Follow the leader: Controlled by a rig below the tank, robotic guppies can follow the crowd, and one day may lead it too. Credit: T. Landgraf, Free University Berlin

It is not yet possible to realize Porfiri’s original vision of deploying these mechanical swimmers in nature to address environmental problems. The world outside the lab is just too rich and full of quickly changing features for today’s algorithms to cope with. Making autonomous, untethered robots that can respond appropriately in the wild “is very ambitious and far away,” he says. But slowly, there’s movement in that direction. Earlier this year, for example, Daniela Rus’s lab at the Massachusetts Institute of Technology described a soft robotic fish called SoFi that moves like a real fish and can potentially swim alongside marine organisms [5]. That robot is controlled by humans through a wireless signal, but it could potentially observe ocean animals without disturbing them the way a human diver would.

As the field of biorobotics and artificial infiltrators advances, researchers will surely expand the list of behaviors and species that can be investigated through robotic interactions, and will begin to focus on individual as well as group behaviors. Already, researchers at Waseda University in Tokyo are developing a rodent-like robot that interacts with rats. Project lead Hiroyuki Ishii and his former supervisor Atsuo Takanishi have designed the robot with an algorithm that allows it to learn how to play with a rat one-on-one [6]. Ishii is currently using the robot in collaboration with a group of psychologists studying social integration in rats, but would like it to be capable of more complex behaviors, like grooming. “We need to develop more rat-like hardware and a more adaptive machine learning system to realize that,” Ishii says.
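
One common way to frame such learning is reinforcement learning, in which the robot is rewarded whenever the rat engages. The Python sketch below is a heavily hedged toy of that framing; the states, actions, reward function and stand-in rat model are all invented placeholders, since the Waseda team’s actual algorithm is not detailed here.

```python
import random

ACTIONS = ["approach", "retreat", "freeze"]
STATES = ["rat_near", "rat_far"]
q = {(s, a): 0.0 for s in STATES for a in ACTIONS}  # tabular Q-values

def rat_engages(state, action):
    """Toy stand-in for the rat: it tends to play when approached up close."""
    p = 0.7 if (state, action) == ("rat_near", "approach") else 0.2
    return random.random() < p

state = "rat_far"
for step in range(500):
    # epsilon-greedy choice between exploring and exploiting
    if random.random() < 0.1:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: q[(state, a)])
    reward = 1.0 if rat_engages(state, action) else 0.0
    next_state = random.choice(STATES)           # toy state transition
    best_next = max(q[(next_state, a)] for a in ACTIONS)
    q[(state, action)] += 0.1 * (reward + 0.9 * best_next - q[(state, action)])
    state = next_state

# report the learned preferred action in each state
print({s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES})
```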

Just how widely the approach can be applied might depend on how well the behavior under investigation is understood, says Landgraf. On the other hand, animals such as mammals may be more capable of conditioned learning, in which interactions with the robot can be rewarded, notes Schmickl. Until then, there is much to learn. “The basic thing we learned [so far] is that it’s not so simple as people thought,” he says. “We are really at the very beginning: We are defining the first principles, the first recipes in this very first cookbook of how to design such systems.”