Tuesday, June 30, 2009

Little Gray Cells

One of the reasons I love mysteries—at least the classical, cerebral whodunits of writers like Agatha Christie and Rex Stout—is that they employ the intelligence not only of the detectives doing the sleuthing, but also that of the reader or viewer. Nero Wolfe, Hercule Poirot, and Miss Marple all solve mysteries using various kinds of intelligence and wisdom, and their plots depend less on sensation than on intrigue. Good puzzles to be solved are their mainstays, not gruesomeness; when gruesomeness does appear, it’s usually suggested rather than described in gory detail.

It was with some amusement, therefore, that I watched the two most recent Hercule Poirot episodes Sunday night on Masterpiece Mystery—right after watching a silly and rather gratuitous story reported by Lesley Stahl on 60 Minutes about the latest "mind-reading" advances in neuroscience. The claim, according to the blurb on the program's website, is that "Neuroscience has learned so much about how we think and the brain activity linked to certain thoughts that it is now possible - on a very basic scale - to read a person's mind."

The reported research, which uses brain scans to locate particular kinds of activity, is nothing new. But Stahl’s breathless and eager reportage implied that we’re essentially one step away from reading people’s thoughts via MRIs and other technomarvels.

Baloney. What the report actually showed is that, given a choice between two objects that activate different regions of the brain, the machine can tell which object is being thought about. Set up the experiment in a particular way, predispose the subject (and the machine) to think about or look at specific objects, and Bingo! The stuff of science fiction becomes the stuff of reality. Only the media’s thirst for new and spectacular results from the hottest scientific “breakthroughs” could translate this information into something it’s not: it is not all that new, not all that interesting, and, ultimately, of questionable use. The researchers, drooling at the chance to keep their funds rolling in by hitting the big time on 60 Minutes, play along gleefully. Yes! This is the first step toward . . . whatever.
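For readers who want to see just how unmysterious this is, here is a minimal sketch in Python, with entirely simulated data and hypothetical "hammer" and "house" categories (nothing here reflects the study’s actual method): once the experiment guarantees that each choice activates a different region, "decoding" a thought is ordinary pattern classification.

# A minimal sketch, not the study's actual method: simulated "scans"
# for two hypothetical categories, each boosting a different block of
# simulated voxels, then classified by nearest centroid.
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 100  # stand-in for brain regions; real fMRI has far more

def simulate_scan(category):
    scan = rng.normal(0.0, 1.0, N_VOXELS)  # background noise
    if category == "hammer":
        scan[:50] += 1.5   # hypothetical "tool" region lights up
    else:
        scan[50:] += 1.5   # hypothetical "place" region lights up
    return scan

# "Training": average twenty labeled scans per category.
centroids = {c: np.mean([simulate_scan(c) for _ in range(20)], axis=0)
             for c in ("hammer", "house")}

# "Mind reading": report whichever centroid the new scan sits closer to.
def decode(scan):
    return min(centroids, key=lambda c: np.linalg.norm(scan - centroids[c]))

trials = [(c, decode(simulate_scan(c)))
          for c in ("hammer", "house") for _ in range(50)]
accuracy = sum(truth == guess for truth, guess in trials) / len(trials)
print(f"accuracy: {accuracy:.0%}")  # high, because the setup builds the answer in

The machine scores nearly perfectly here not because it reads minds but because the experiment was arranged so that the two answers look different by construction.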

By no means do I intend to condemn all of contemporary neuroscience with my snarky attitude toward this story. As anyone can see on the well-designed and interesting Brain Briefings page from the Society for Neuroscience, potentially useful and perhaps necessary research is going on all the time. And maybe it’s my own personal failing to see the research featured on 60 Minutes as not only useless but bogus. It’s just that repeatedly hearing people refer to the brain as a machine makes my own gray matter see red.

The brain is not a machine. The brain is in some ways like a machine, but in other ways it’s dramatically different. At times seeing the brain as a computer can be useful, but the analogy breaks down very quickly. Until we have organic machines that evolve through physical experience and embody being not just in brains but throughout their biosystems, there won’t be anything like a "thinking" machine.

Computers are often referred to as mechanical (or digital, or synthetic) "brains," and the most frequent metaphor employed is the same human brain/computer metaphor run in reverse: the computer is like a brain; a brain is like a computer. Both versions are limited, and both exhaust their usefulness quickly.

A computer is not a brain. It’s programmed by human beings to collect and process various kinds of information. Thanks to rapidly evolving technologies, computers are now capable of completing ever more sophisticated tasks. That this evolution might bring us Cylons or Borg or Replicators is the stuff of a different, and equally well-loved, literary genre—but we’re not there yet, and we’re actually limited by our own models (so far, the only thinking machines we can imagine look like human beings and other organic life forms).

I have no doubt that without serious evaluation of emerging technologies we might eventually do ourselves in with our own cleverness. And although I’m not afraid that anybody’s going to come up with a Replicator of the sort featured on Stargate SG-1 (and Atlantis), nanotechnology does have the potential to be used badly and to cause problems we haven’t even thought of yet—precisely because we don’t take the time to imagine what might come along.

Information, may I remind some of these people, is not knowledge. Knowledge grows out of experience (the broader the better), and if we limit our kids’ experiences to video games, television, personal technologies, and all things digital, I’m not sure what will emerge. It’s a mystery to me how anybody expects us to develop mechanical brains when we no longer exercise the organic versions in the ways that made all of our fancy technologies possible in the first place.

I doubt that any of the men and women who worked on ENIAC, the first general-purpose electronic computer (originally housed at the Moore School of Electrical Engineering at Penn, across the walk from where I lived on campus), could have imagined the emergence of the microchip when they were busy programming their room-sized machine. The trouble with technology, as I’ve most likely mentioned before, is that the human brains that produce it don’t often take the time to puzzle through to the consequences.

What we really need is to foster the kind of thinking and imagination my favorite sleuths employ as they go about solving mysteries: the ability to see through the hype and sensation and get to the roots of a problem. Nero Wolfe did it by ruminating over literature and potting orchids, Miss Marple whilst at her knitting, and Poirot by exercising his "little gray cells." Were we all to expand our avocations to include physical activities and a bit of reading, we might be less caught up in notions of “mind-reading” and more able to put our minds to worthier uses.

Image credits: The epithalamus nicely illustrated, a low-res image from an MRI, and a detail of the back of a panel of ENIAC, showing vacuum tubes, all via Wikimedia Commons. What would I do without these people?
