Getting Smart About Intelligence


So what is intelligence, or more importantly, how does it work? The U.S. federal government recently announced it has budgeted some $3 billion to map the human brain in hopes of answering these kinds of questions. (I would be happy to know why I struggle with driving directions.)

The larger question for the scientific world is: how does the brain make sense of the trillions of data bits thrown at it every day? Many scientists subscribe to the theory that once data goes into the brain and gets processed, we can measure intelligence as subsequent behavioral output. The idea being: the more complicated the behavior, the more demonstrated intelligence. For example, the fact that I can read a newspaper suggests I am more intelligent than my dog, who can fetch the newspaper but couldn't make it past the first headline. Famed mathematician and computer scientist Alan Turing took this concept to its logical conclusion: something is intelligent to the degree that it behaves like a human.

But neuroscientist Jeff Hawkins, founder of Palm Computing and Handspring, says not so fast. Hawkins rejects human behavior as the gold standard for intelligence. Instead, Hawkins's theory defines intelligence as "prediction based on pattern." And at the core of his prediction theory is memory.

The mechanics look like this:

Information is delivered to the brain through the senses. The first stop is the older section of the brain, sometimes referred to as our reptilian brain. It got the nickname because reptiles work only with this chunk of their gray matter. But for humans, evolution bestowed another processing center right on top of the old one. This new brain is called the cerebral cortex ("cortex" being Latin for bark). This is the pinkish, wrinkly mass we see when we picture the brain, and it is where some of the information goes for higher processing after passing through our more instinctive reptilian brain. The newer, shinier system can analyze, look at our past, and use memory to predict future outcomes.

Hawkins suggests that memory doesn't necessarily store information as bits of data, like a computer, but instead as a sequence of patterns. For example, cheese wouldn't only be stored under the heading C. You are capable of understanding cheese as part of a larger possibility, one involving pepperoni, sauce, and crust. In other words, through experience, your brain has come to understand cheese as a free-standing item but, more importantly, to recognize its role in the pattern known as pizza.

It is this associative quality of memory that radically improves our ability to predict future outcomes. These memorized patterns, built through association, allow us to connect complex ideas and concepts in an instant. This computational ability currently leaves even the most advanced computers in our dust.
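To make the idea concrete, here is a toy sketch in Python of what "prediction based on pattern" might look like. This is my own cartoon of the concept, not Hawkins's actual model: it simply memorizes which items follow which in experienced sequences, then uses those stored associations to predict what comes next. All the names (PatternMemory, the pizza ingredients) are invented for illustration.

```python
from collections import defaultdict

class PatternMemory:
    """Toy associative memory: learns sequences, predicts what follows.
    A caricature of Hawkins's memory-prediction idea, for illustration only."""

    def __init__(self):
        # Maps each item to the set of items observed to follow it.
        self.transitions = defaultdict(set)

    def learn(self, sequence):
        # Memorize the pattern by storing each item -> next-item association.
        for current, nxt in zip(sequence, sequence[1:]):
            self.transitions[current].add(nxt)

    def predict(self, item):
        # Prediction is just recall: what has followed this item before?
        return self.transitions.get(item, set())

memory = PatternMemory()
memory.learn(["crust", "sauce", "cheese", "pepperoni"])   # the pizza pattern
memory.learn(["cracker", "cheese", "wine"])               # a different pattern

# "cheese" is stored not under C, but by its role in multiple patterns.
print(memory.predict("cheese"))
```

The point of the sketch is that nothing is filed alphabetically: "cheese" is retrievable only through the patterns it participates in, and prediction falls out of recall for free.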

Your brain is at a constant hum, measuring the outside world against new and pre-existing memory patterns, and then acting on this information. (So theoretically, once you've had the multi-sensory food experience known as Lou Malnati's Chicago-style deep dish pizza, the intelligent brain would reject the inferior pattern known as Domino's...)

Sadly, if you wipe out those patterns, you wipe out the ability to make informed decisions, because you literally don’t have anything to use as a reference. This is why Alzheimer’s disease is so devastating.

The good news is that you can reduce your chances of developing Alzheimer's by 65% if you exercise five days a week for thirty minutes a day at an aerobic level. (This means getting to a point of being out of breath.) This level of exercise builds new neurons in your dentate gyrus, which is where memory patterns live and, according to Hawkins, is the horsepower driving intelligence.

See you down at the jogging path, provided I can remember the route.

