
The Upside Down: Seeing Is Predicting

John Mitchinson explores why perception is as much about what we know as what we see

Fundus photograph of a normal right eye. Photo: Wikimedia Commons


At some point in the early 1820s, a small boy was gazing up at the belltower of the Garrison Church in Potsdam. In the belfry at the top of the tower, almost 300 ft above him, he saw some people. Thinking they were dolls, he asked his mother to reach up and get them for him. Years later he would write: “The circumstances were impressed on my memory, because it was by this mistake that I learned to understand the law of foreshortening in perspective.”

The boy was Hermann von Helmholtz and he would grow up to become one of the great polymaths of 19th Century science. His 1867 Treatise on Physiological Optics would transform the way perception was studied and understood, in the same way his invention of the ophthalmoscope transformed the study of the eye itself.  

This early lesson had demonstrated the unreliability of his own eyes. 

Later, backed up by careful scientific experimentation, he reached a conclusion that has revolutionised not just our study of perception, but of the brain itself. He proved that, even with the simplest of perceptions, the brain is making what he called “unconscious inferences” about what is going on. It is filtering the visual stimuli and actively constructing a version of reality based on previous experience. What we see isn’t ‘reality’ but the result of a negotiation between our brain and the world. 

One good example of this was the eye’s blind spot, which Helmholtz described in detail. 

This is the pinhead-sized point, slightly to the right or left of each eye’s centre of vision, where the optic nerve exits the retina. There are no light receptors there, so, as Helmholtz concluded, “light that falls on the place where the optic nerve enters the eye is not perceived”. And yet we do not, unless we concentrate hard, notice any gap in our vision: our brain fills it in for us.

The experiments also confirmed a bias in our visual system towards moving images. There might be an evolutionary element to this – chasing prey and avoiding predators would tend to select for eyes attuned to movement. In fact, we now know movement is essential to vision: our eyes are continually making tiny flickering movements known as microsaccades. The rod and cone cells that line our retina soon stop responding to an image that never changes, so to keep sending nerve impulses to the brain they need their stimulation constantly refreshed. When we look at something stationary, microsaccades do that refreshing: they keep the image shifting minutely across the retina – although, once again, our brain edits these movements out as unnecessary.


One way of demonstrating your own brain’s visual editing function is to stand facing a mirror with a friend. Look first at one of your eyes and then the other. You won’t be able to see your eyes moving, but your friend will.

In order to ‘see’, we don’t even need eyes. 

Daniel Kish has been completely blind since losing his eyes to cancer before the age of 13 months. He has pioneered his own method of echolocation called “flash sonar” and in 2000 set up a charity, World Access for the Blind, which specialises in training instructors who can teach the method, particularly to blind children. He is very clear that echolocation is a form of “sight”. 

“We know from other studies that those who use human sonar as a principal means of navigation are activating their visual brain,” he observes. 

Kish’s descriptions of what he experiences are also remarkably visual for someone who has no memory of being able to see: “You do get a continuous sort of vision, the way you might if you used flashes to light up a darkened scene… It is in 3D, it has a 3D perspective, and it is a sense of space and spatial relationships. You also have a pretty strong sense of density and texture, that are like the colours of flash sonar.”

Human echolocation shows just how adept the brain is at moulding itself to experience. To make sense of the electrical impulses it receives – whether they originate as clicks, textures or visual images – the brain constructs a model of the world that is useful rather than complete. It must assess whether something is helpful or threatening, familiar or unknown, very quickly. 

The most efficient way of doing this is to focus only on unexpected data: an obstacle on the road, a napkin on fire, an unfamiliar word. The expected progress of our lives sinks to a level that is mostly unconscious, thereby burning up much less neural energy and preventing us from being overwhelmed. 

The idea of our brains as a prediction engine is now a cornerstone of modern neuroscience. And it all started with a curious boy doubting the evidence of his senses.

