They don’t miss very often. Dragonflies hit what they aim at.
What makes this visually guided, in-flight navigation even more remarkable is that you and I might look at a typical dragonfly environment and see something like this (and yes, there is a dragonfly in there):
A dragonfly looking at the view might see something more like this:
This is about one-sixtieth the resolution of the original. Dragonfly eyes are probably even lower resolution than that.
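To get a feel for what "reducing the resolution" means here, a simple way to mimic it is block averaging: replace each block of pixels with its mean. This is only a rough sketch of the idea (the function name and the list-of-lists image format are my own, not anything from the post), but it shows how a coarse eye collapses fine detail:

```python
def downsample(image, factor):
    """Average non-overlapping factor x factor blocks of a 2-D
    grayscale image (given as a list of lists of numbers),
    mimicking a low-resolution view of a scene."""
    rows = len(image) // factor
    cols = len(image[0]) // factor
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Gather every pixel in this block and average it.
            block = [image[r * factor + i][c * factor + j]
                     for i in range(factor) for j in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A tiny 4x4 "image" with four uniform quadrants collapses to 2x2:
tiny = [[0, 0, 2, 2],
        [0, 0, 2, 2],
        [4, 4, 6, 6],
        [4, 4, 6, 6]]
print(downsample(tiny, 2))  # → [[0.0, 2.0], [4.0, 6.0]]
```

A one-pixel target smaller than the block size simply vanishes into the average, which is part of why small-target detection at low resolution is so impressive.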
To catch prey as they do, dragonflies have a series of neurons in their brains that are highly tuned to specific properties in their visual field. A lot of progress has been made studying these cells with very abstract, precisely controlled visual stimuli in the lab. But the real world that the dragonfly has to navigate is one of blooming, buzzing confusion. How do these neurons compute in such messy real-world settings?
Wiederman and O’Carroll examined this with a visual neuron given the descriptive but uninspiring name of Centrifugal Small Target Motion Detector 1 (CSTMD1). Using 360° pictures of a woodland and a parking garage (‽), they placed a small black square that was predicted to drive the visual neuron bonkers. There were also some natural “distractors” in the image, and Wiederman and O’Carroll were able to photoshop those out to see the difference in response.
These sorts of results are probably best appreciated in relatively raw form (slightly modified version of part of their Figure 2; click to enlarge). There's a lot going on here, but I'll walk you through it.
At the top is the picture they used. There are three “targets” expected to make CSTMD1 fire at high frequency, indicated by the three vertical lines. The first on the left is a set of leaves, the middle one is a tree stump, and the right one is the “perfect” stimulus added by the experimenters to the scene.
The three boxes with red and black tick marks show when the CSTMD1 neuron is firing an action potential. Each row is one pass of the photograph. Each animal was tested multiple times, and each animal’s data is grouped together by alternating colours. All the recordings from one animal are in black, the recordings from the second animal are grouped together in red, and the recordings from a third animal are back in black again.
The lines on top of the boxes show the averaged firing rates of the neuron.
The CSTMD1 has a little spontaneous activity, but you can definitely see the responses of the neuron as the visual field passes over the targets. Keep in mind that the CSTMD1 neuron is an interneuron in the brain, not simply a sensory cell in the eye. It is receiving and integrating input from many photoreceptors in the eyes, which is why the response to the targets is wider than the targets themselves.
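That widening of the response can be pictured as a convolution: each photoreceptor feeds the interneuron through a spread-out receptive field, so activity from a narrow target smears across neighbouring positions. Here is a minimal one-dimensional sketch (the receptive-field weights are made up for illustration, not taken from the paper):

```python
def neuron_response(stimulus, receptive_field):
    """Full 1-D convolution of a stimulus with a receptive-field
    kernel. Because each stimulated point drives the neuron over
    the whole width of the kernel, the output is wider than the
    stimulus itself."""
    n, k = len(stimulus), len(receptive_field)
    out = [0.0] * (n + k - 1)
    for i, s in enumerate(stimulus):
        for j, w in enumerate(receptive_field):
            out[i + j] += s * w
    return out

# A one-pixel "target" seen through a 3-point receptive field:
stimulus = [0, 0, 1, 0, 0]
rf = [0.25, 0.5, 0.25]   # hypothetical weights, just for illustration
print(neuron_response(stimulus, rf))
# The response is non-zero over three positions, not one.
```

The point is only the qualitative one from the figure: integration over many inputs trades spatial precision for sensitivity.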
These data show that this neuron is able to filter out a substantial amount of visual “noise” from the environment.
The second box is interesting, because it shows the response to the tree stump is enhanced when the “perfect” target that the experimenters added is taken out of the picture. The reason for this seems to be that the left CSTMD1 neuron gets inhibited by the right CSTMD1 neuron. Again, because these neurons integrate information over a wide field, the presence of a strong target on the left side can dampen the response to a stimulus on the right side.
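The left–right inhibition described above can be captured in a toy model: each neuron's output is its own drive minus some fraction of the other side's drive. The inhibition weight here is entirely hypothetical (the paper doesn't give one), but the sketch shows the qualitative effect, where removing a strong target on one side enhances the response on the other:

```python
def cstmd1_pair(drive_left, drive_right, inhibition=0.5):
    """Toy mutual-inhibition model of the paired CSTMD1 neurons.
    Each side's output is its own excitatory drive minus a
    (hypothetical) fraction of the opposite side's drive,
    floored at zero since firing rates can't be negative."""
    left = max(0.0, drive_left - inhibition * drive_right)
    right = max(0.0, drive_right - inhibition * drive_left)
    return left, right

# Strong target on the left dampens the right neuron's response:
print(cstmd1_pair(10, 6))  # → (7.0, 1.0)
# Remove the left target and the right response recovers:
print(cstmd1_pair(0, 6))   # → (0.0, 6.0)
```

This mirrors the second box of the figure: take out the experimenters' "perfect" target, and the response to the tree stump goes up.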
The authors don’t have an explanation, though, for why the response to the tree stump gets smaller when the leaves are retouched out of the photo (third box). There are obviously some other influences on this neuron that haven’t been documented yet, and Wiederman and O’Carroll suggest this is a good reason to keep making computer models of these circuits to track all these influences.
This paper is a nice demonstration of how neuroethological research can slowly move from completely artificial lab settings back into the things that we want to know about: behaviour in a natural setting. We’re not there with dragonflies yet, but we’re getting closer. The next step would be to present dragonflies with naturalistic video instead of just still pictures, especially videos with moving prey items flitting about. Or better yet, take the recording rigs outside and do some science in the sun.
Wiederman S, O'Carroll D. 2011. Discrimination of features in natural scenes by a dragonfly neuron. The Journal of Neuroscience 31(19): 7141-7144. DOI: 10.1523/JNEUROSCI.0970-11.2011
Photo by freeform systems on Flickr; used under a Creative Commons license.