Have you ever seen your blind spot?

There’s a fantastically interesting demonstration over on Serendip, a science hub supported by Bryn Mawr College. Check it out.

[Diagram: vertebrate eye (left) and cephalopod eye (right)]

The gist of it is that vertebrate eyes, thanks to the way they evolved, are wired in a way that leaves a hole in our vision. The retina, which senses light, sits behind the nerve fibers that carry its signals to the brain. To get those fibers out of the eye, they bundle together near the back and pass through the retina on their way into the skull, and at that exit point there are no light-sensitive cells at all. (Check out the diagram above in case that description’s not clear.)

The interesting thing, however, is that this doesn’t result in a gaping hole in your vision. Make no mistake: given the anatomy, there is absolutely no way that portion of your eye detects light. So why is your vision whole?

Enter the brain. Your brain, as the Serendip demonstrations show, is remarkably adept at filling in holes. In the demonstration, you cover one eye, fixate on a single point, and adjust your distance from the screen until a large black dot elsewhere on the page disappears. If you do this on a white background, what you see is an unblemished field of white. But more remarkably, if there is a line running through the blind spot, your brain fills in that line, even though it’s receiving no information from that spot. If the blind spot occurs in a field filled with a regular pattern, your brain fills in the pattern appropriately.
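
You can try a crude text version right here (the spacing below is approximate, and screens vary, so be patient): close your left eye, fixate on the + with your right eye, and slowly move your face toward or away from the screen. At some distance, the dot will simply drop out of existence.

      +                                        ●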

I’m constantly amazed by this for a couple of reasons. First, it’s downright incredible. That our brains are sophisticated enough to patch up the image we’re seeing based on what’s around our blind spot– that’s just cool. And we don’t even notice it. Before I experienced the Serendip demonstration, I had no idea that my brain was doing all this work behind the scenes. Clearly, it’s doing its job, and it’s doing it well.

But the second thing that hits me about this is what it implies about our knowledge. The blind spot demonstration shows the power of our brains, which is incredible, but it’s also a reminder about the limitations of what we can know through perception. My brain is working hard and doing a great job at patching up that image and drawing the line right through the blind spot when that dot disappears, but it doesn’t change the fact that the dot disappears. True, if I stayed in that particular arrangement forever, never moving my eyes, I’d never realize that what I was seeing was not the whole picture. The whole thing seems accurate and truthful– but it’s not. The brain may continue the patterns, but it fails to perceive the big honking black dot in front of it. What we see is not always how it is.

I bring it up because I’ve been thinking about skepticism lately. Between the recent hubbub over the zodiac somehow changing; news about the manufacture of bogus hologram “Power Bands”; the continued exposure of the anti-vaccine fraud Andrew Wakefield; the flawed analysis of Cornell’s Professor Daryl Bem, whose mishandling of his data led him to conclude that humans have extrasensory perception; and a raft of other New Age, pseudoscientific malarkey, I’ve been pondering what it means to be skeptical, and on what bases I reject these claims. What I see over and over, in both the originators of these claims and the people who repeat them, is a failure of critical thinking. I want to lay out why critical thought is necessary, what it does and doesn’t entail, and then use these foundations to systematically pick apart some of the more prevalent forms of bunk that are out there.

I’m not going to be the first to do this, I know. The Skeptic’s Dictionary is a fantastic resource for debunking a wide variety of these prevalent misconceptions, and I’m sure I’ll refer to it a number of times during this project. There are skeptical bloggers out there who have written tomes on this subject. I’m not naive enough to think my little project is going to rid the world of superstitious thinking and herald an era of critical thought, and that’s not my goal. My goal is to learn about skepticism and critical thought and exercise it, while hopefully leaving a clear path for anyone who wants to follow.

That said, let’s get back to blind spots.

The brain’s solution to the blind spot, as I mentioned above, is an incredibly impressive function. Its purpose is clear and laudable, since an educated patch over the blind spot is almost certainly preferable to an empty space of no perception. It’s a software solution that makes up for a hardware deficiency.

It’s not the only impressive piece of “software,” either. To continue the metaphor, the brain is constantly running thousands of programs, all of which have been selected by eons of evolution to help the species. We’ve got pattern-recognition software that allows us to learn and predict. We’ve got agency-ascribing software that helped us outsmart predators by imagining ourselves “in the shoes” of other beings. All of these systems have worked for so long because we implicitly trusted them, and without them, we probably would not have survived as a species.

Unfortunately, as anyone who’s worked with computers can tell you, computers are dumb. A program will do exactly what it’s told to do. To use an old example, give a computer the following commands:

  1. Say, “Hello world!”
  2. Go back to step 1

[Image: an endless stream of “Hello world!” output]

A computer with no other system in place to check against behavior like this will repeat “Hello world!” forever. It doesn’t know any better; it’s doing exactly what it’s been told. The computer isn’t to blame for the output, because it’s doing its job perfectly. The blame falls on the programmer, whose instructions stuck the computer in an infinite loop. This is the spirit of the old programming adage “garbage in, garbage out”: a computer will not question even the most nonsensical instructions it’s given (unless, of course, it’s told to). Someone else has to be aware.
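
Translated into actual code, those two commands look something like this (a minimal Python sketch; it will happily print forever until something outside the program, like you mashing Ctrl-C, steps in):

    # The two commands above, taken perfectly literally.
    while True:                # step 2: go back to step 1, forever
        print("Hello world!")  # step 1: say hello

Nothing inside that loop will ever question whether a millionth greeting is useful; that judgment has to come from outside.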

If you think of our brains as incredibly powerful biological supercomputers, which they kinda are, then you have to understand their limitations. The “programs” our brains are running have undoubtedly helped us for millennia, but their limitations sometimes get in the way of our understanding of the world. For instance, the way our brains process optics and judge slopes undoubtedly makes coordination easier, but we’re not very good at perceiving slight curves, such as the curvature of the Earth. If you stand on the beach and stare out to the horizon, what you see is a flat plane stretching off into the distance. Our ancestors, naturally trusting what their senses told them, assumed that the Earth was flat, and if you relied only upon your own observations, you would probably come to a similar conclusion. We know the Earth is not flat, however; it’s an oblate spheroid (that is to say, a ball that’s a little tubby around the middle, not unlike Uncle Bob after Thanksgiving dinner). What we perceive is not necessarily what is.
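
To put a rough number on just how subtle that curve is: on a sphere of radius R, an observer with eyes at height h above the surface sees the horizon at a distance of about √(2Rh), and that horizon dips below eye level by about √(2h/R) radians. A quick back-of-the-envelope sketch in Python (the eye height is a made-up but typical figure):

    import math

    R = 6_371_000  # Earth's mean radius, in meters
    h = 1.7        # assumed eye height of someone standing on the beach

    d = math.sqrt(2 * R * h)                  # distance to the horizon
    dip = math.degrees(math.sqrt(2 * h / R))  # horizon's dip below eye level

    print(f"The horizon is about {d / 1000:.1f} km away")        # ~4.7 km
    print(f"...and dips below eye level by ~{dip:.3f} degrees")  # ~0.042

A few hundredths of a degree is hopelessly below what our visual systems can distinguish from “flat,” so the beach really does look like a plane.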

[Optical illusion: squares A and B are the same color. No, really.]

Our “blind spots” aren’t only in our vision, either. There are a number of ways that our brains misfire when processing information, leading to conclusions that are not logically supported. These are fallacies, or errors of reasoning, and they’re so prevalent and so pernicious that they deserve their own post. Fallacious reasoning is a particularly insidious form of blind spot because the conclusions it leads to often seem to make total sense, and they appear to be reasoned out. For instance, if a gambler rubs his lucky coin every time he throws the dice, and in a week of doing this wins big, he’s likely to conclude that rubbing his lucky coin caused him to win (the classic post hoc ergo propter hoc). He did something and experienced an outcome; it makes sense, right? Unfortunately, like all the other blind spots, what we perceive and what seems to make sense are not necessarily what is.
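
To see how easily pure chance can manufacture that kind of “evidence,” here’s a toy simulation: a thousand gamblers, each dutifully rubbing a coin that does absolutely nothing, playing a game with slightly worse than even odds. (All the parameters here are invented for illustration.)

    import random

    random.seed(42)  # reproducible luck

    def ahead_after_a_week(rolls=50, win_prob=0.49):
        """True if a gambler wins more rolls than they lose in a week."""
        wins = sum(random.random() < win_prob for _ in range(rolls))
        return wins > rolls - wins

    # Every gambler rubs the lucky coin before every single roll.
    lucky = sum(ahead_after_a_week() for _ in range(1000))
    print(f"{lucky} of 1000 coin-rubbers came out ahead anyway")

Hundreds of them finish the week as winners, and every one of them has just “confirmed” that the coin works.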

Along with reasoning failures, there are also psychological processes that can distort our perceptions of the world. For instance, psychologists Carol Tavris and Elliot Aronson have written an excellent book called Mistakes Were Made (But Not by Me), which describes the psychological phenomenon of cognitive dissonance and how our brains cope with it when it occurs. It’s a great book, and I highly recommend reading it yourself, but the gist is that the more an individual has riding on a belief, the more likely they are to subconsciously distort other perceptions and memories in order to preserve that belief. People can, and do, firmly internalize false memories, to the point that they are indistinguishable from genuine ones. Another psychological process that skews our perception of the world is confirmation bias: the tendency to seek out and remember information that supports our beliefs, and to throw out, ignore, or forget information that runs contrary to them. Both of these phenomena happen all the time, to everyone, as a side effect of the many programs running in our brains.

What this all means is this: our brains and our bodies do not provide clear, unfiltered reflections of the world around us. When it comes to explaining the universe, we are woefully unequipped as individual human beings. We cannot rely solely upon our experiences, our insights as to what “feels” correct, or our memories to explain the cosmos, because all of them are untrustworthy and can report what is not necessarily the case. If we want to discover and explain how the world works, we have to recognize our own limitations, and use tools that allow us to circumvent or account for them. Critical thinking, and by extension, the scientific method, provide such tools, allowing us to work around our blind spots– instead of blindly charging ahead.

Science is a way of trying not to fool yourself. The first principle is that you must not fool yourself, and you are the easiest person to fool.

– Richard Feynman