
Dr. Steven Wiederman's project explores potential links between a dragonfly's target acquisition and lock-on capability and the challenges of autonomous driving. (Image: The University of Adelaide)

Dragonfly study yields insight into vehicle autonomy

A predatory dragonfly’s ability to detect, track and anticipate the escape maneuvers of a juicy target may provide a link to making autonomous driving safer.

Dr. Steven Wiederman, Research Supervisor at the University of Adelaide Medical School in Australia, believes that a target-detecting neuron in the dragonfly's tiny brain, one that anticipates movement, could inform vehicle vision systems. The dragonfly's visual focus on prey is so strong that it can track a target even when it is embedded against a background of "clutter."

To demonstrate the neuron's potential for safer autonomous mobility applications, a university research team is using an autonomous wheeled robot platform to test sensing techniques derived from the dragonfly. It's part of a collaborative research project being conducted by Dr. Wiederman's group and a team at Lund University in Sweden.

The Australian researchers discovered that the target-detecting neuron enhances a dragonfly's responses in a small focus area just ahead of the moving object being chased. Even if the dragonfly loses sight of its prey, the forward spread of this focus over time allows the insect's brain to predict the target's likely track and subsequent reappearance, re-establishing target acquisition.
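
To make this "focus spread" idea concrete, here is a minimal, hypothetical sketch in Python of a predictive focus that coasts forward along a target's last estimated velocity and fades during occlusion. The class, parameter names and values are illustrative assumptions, not the Adelaide group's actual model:

```python
import numpy as np

class PredictiveFocus:
    """Toy model of a predictive 'focus' that travels ahead of a tracked
    target and persists briefly when the target is occluded.
    Illustrative only; parameters and values are assumptions."""

    def __init__(self, decay=0.9, lookahead=1.0):
        self.decay = decay          # how quickly the focus fades without input
        self.lookahead = lookahead  # seconds of forward prediction
        self.pos = None             # last estimated target position (x, y)
        self.vel = np.zeros(2)      # estimated target velocity
        self.gain = 0.0             # strength of the attentional focus

    def update(self, detection, dt):
        """detection: (x, y) position, or None when the target is lost."""
        if detection is not None:
            detection = np.asarray(detection, dtype=float)
            if self.pos is not None:
                self.vel = (detection - self.pos) / dt  # crude velocity estimate
            self.pos = detection
            self.gain = 1.0
        elif self.pos is not None:
            # Target lost: coast the focus forward along the last velocity
            # and let its strength decay, so a reappearing target near the
            # predicted track is re-acquired preferentially.
            self.pos = self.pos + self.vel * dt
            self.gain *= self.decay

    def predicted_region(self):
        """Center of the small region where responses are enhanced."""
        if self.pos is None:
            return None, 0.0
        return self.pos + self.vel * self.lookahead, self.gain
```

Boosting detector responses near predicted_region() would let a target reappearing close to the anticipated track win re-acquisition over distracters elsewhere.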

A problem facing autonomous vehicles is making priority decisions among two or more choices in a given traffic situation (or, in dragonfly terms, among targets). In an interview with Automotive Engineering, Dr. Wiederman explained how the entomological study can help.

“Biological brains have the ability to competitively select one stimulus amidst distracters. We found a neuron in the dragonfly brain that exhibits such selective attention. When presented with two moving targets, the neuron selects just one—sometimes even switching between them mid-trial. 

"Sometimes the neuron can ‘lock-on’ to a less salient stimulus," he noted. "We are currently investigating what properties of the target make it the one chosen—is it timing, saliency or trajectory? Is it only attributes of the stimulus, or is the dragonfly choosing the target by some high-order, internal workings in its brain? Finally, when is it appropriate to lock-on, and when is it time to switch to a more salient object?"

By studying this tractable model system, the researchers hope to gain insight into how more complex brains select stimuli, such as a human driving along a road amid multiple stimuli. To that end, they're developing models for autonomous, moving platforms based on the dragonfly's selection processes.
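
As a toy illustration of such competitive selection, the following sketch (an assumed function and parameters, not CSTMD1's actual mechanism) picks one of several target responses and "locks on" until a rival clearly outcompetes it:

```python
import numpy as np

def winner_take_all(responses, bias=None, hysteresis=0.2, current=None):
    """Select one target among competing responses, 'locking on' to the
    current winner unless a rival exceeds it by a hysteresis margin.
    A hypothetical sketch of competitive selection, not CSTMD1 itself."""
    scores = np.asarray(responses, dtype=float)
    if bias is not None:
        scores = scores + np.asarray(bias, dtype=float)  # e.g. a predictive focus
    challenger = int(np.argmax(scores))
    if current is None:
        return challenger
    # Switch only if the challenger clearly beats the locked-on target;
    # this reproduces lock-on to a less salient stimulus.
    if scores[challenger] > scores[current] + hysteresis:
        return challenger
    return current
```

For example, winner_take_all([0.6, 0.7], current=0) keeps the lock on target 0 despite target 1's slightly higher response, mimicking lock-on to a less salient stimulus; pushing the rival's response above the hysteresis margin triggers a mid-trial switch.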

A unique robotic platform

There are three reasons for using the dragonfly, according to Dr. Wiederman, who also heads the Visual Physiology and Neurobotics Laboratory at the Australian Research Council (ARC) Centre for Nanoscale Biophotonics. First, it's a fine animal model for electrophysiological recordings; second, it's one of the world's most effective predators; and third, it exhibits interesting "high-order" processing, such as prediction and selection, that may not be exhibited by simpler insects such as the house fly.

Does the dragonfly's eye have similarities to that of a human? Dr. Wiederman explained that the compound eye has thousands of individual lenses focusing light onto a single retina, while human eyes have one lens focusing light onto a single retina. The dragonfly has lower resolution (visual acuity of only ~0.8°), while humans have a central fovea of very high acuity. The dragonfly has only about 10° of binocular overlap and its eyes are too close together for useful stereo vision, so it must use other techniques for depth perception (e.g. motion parallax).
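
Motion parallax reduces to simple geometry: a static point at distance d, seen at bearing θ from the direction of travel by an observer moving at speed v, sweeps across the eye at angular rate ω = v·sin(θ)/d. The small helper below inverts this relation; it is standard textbook geometry offered as illustration, not the dragonfly's neural circuit:

```python
import math

def depth_from_parallax(v, theta, omega):
    """Estimate distance to a static point from motion parallax.
    v: observer speed (m/s); theta: bearing from the heading direction (rad);
    omega: apparent angular rate of the point (rad/s).
    Inverts omega = v * sin(theta) / d."""
    if omega == 0:
        return math.inf  # no apparent motion: the point is effectively at infinity
    return v * math.sin(theta) / omega
```

For instance, depth_from_parallax(1.0, math.pi / 2, 0.5) returns 2.0: a point directly abeam of an observer moving at 1 m/s and drifting at 0.5 rad/s lies about 2 m away.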

But Dr. Wiederman noted that there are many similarities in how the underlying neurons process visual information across a diverse range of animal species. Using human psychophysics experiments, his team examines which types of processing are evident in both humans and dragonflies.

The camera-equipped robotic platform is a Clearpath Robotics Husky A200, which uses an open-source serial protocol. Configured at the university, it has been designed to replicate the dragonfly's target-tracking capability via its predictive pursuit of prey. The researchers believe it to be a technology "first" in such a context.

“We use different camera systems depending on what we are testing. We test movement of the camera to emulate eye and head movements independent of body direction," he said. The use of a larger ground vehicle provides flexibility to test computationally expensive algorithms.

"We test ‘active vision’—how the moving platform itself affects the algorithms in a closed loop. From the computational neuroscience, the team develops models for autonomous selection and pursuit of a moving target.

"We hit ‘go’ and see what the Clearpath Husky ground vehicle autonomously pursues,” Dr. Wiederman asserted.

The CSTMD1 neuron

While it is one thing for artificial systems to see moving targets, tracking movement so that a vehicle can then steer out of the way of those objects is a significant aspect of self-driving vehicles. The researchers found that the CSTMD1 neuron in dragonflies not only predicts where a target will reappear but also traces movement from one eye to the other, even across the brain's hemispheres, Dr. Wiederman reported.

A study of CSTMD1 was recently published in the journal eLife. The article stated that a diverse range of animals can detect moving objects within cluttered environments, and that this discrimination is a complex task, particularly when a small target generates very weak contrast as it moves against a highly textured background.

The study refers to the "winner-takes-all" neuron in the dragonfly, which is likely to promote such competitive selection of an individual target while ignoring distracters.

More information on the implementation of CSTMD1-inspired processing in the robot was published in July 2017 in the Journal of Neural Engineering.

The research project is an international collaboration funded by the Swedish Research Council, the Australian Research Council and the Swedish Foundation for International Co-operation in Research and Higher Education.
