One of the reasons I love mysteries—at least the classical, cerebral whodunits of writers like Agatha Christie and Rex Stout—is that they engage the intelligence not only of the detectives doing the sleuthing, but also that of the reader or viewer. Nero Wolfe, Hercule Poirot, and Miss Marple all solve mysteries using various kinds of intelligence and wisdom, and depend not so much on sensation as on intrigue for their plot lines. Good puzzles are their mainstay, not gruesomeness; when gore does appear, it’s usually suggested rather than described in detail.
It was with some amusement, therefore, that I watched the two most recent Hercule Poirot episodes Sunday night on Masterpiece Mystery—right after watching a silly and rather gratuitous story reported by Lesley Stahl on 60 Minutes about the latest "mind-reading" advances in neuroscience. The claim, according to the blurb on the program's website, is that "Neuroscience has learned so much about how we think and the brain activity linked to certain thoughts that it is now possible - on a very basic scale - to read a person's mind."
The research itself, which uses brain scans to locate particular kinds of activity, is nothing new. But Stahl’s breathless and eager reportage implied that we’re essentially one step away from being able to read people’s thoughts via MRIs and other technomarvels.
Baloney. What the report did show is that, given a choice between two different objects that activate different regions of the brain, the machine can tell which object is being thought about. Set up the experiment in a particular way, predispose the subject (and the machine) to think about or look at specific objects, and Bingo! The stuff of science fiction becomes the stuff of reality. Only the media thirst for new and spectacular results from the hottest scientific “breakthroughs” would translate this information into something it’s not: not all that new, not all that interesting, and, ultimately, of questionable use. The researchers, drooling at the chance to keep their funds rolling in by hitting the big time on 60 Minutes, play along gleefully. Yes! This is the first step toward . . . whatever.
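Just to make concrete how narrow that result is, here is a toy sketch of my own (in Python with NumPy and scikit-learn, emphatically not the researchers' actual pipeline, and with made-up "voxel" data): fake two objects that light up different clusters of voxels, and an ordinary classifier will happily tell them apart. The "mind reading" lives entirely in the forced choice between two known alternatives.

```python
# Toy illustration only: with two known alternatives and distinct activation
# patterns, a plain classifier can "read" which one the subject saw.
# All data here are simulated; nothing comes from the actual study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50                 # simulated scans and voxel signals

labels = rng.integers(0, 2, n_trials)        # 0 = object A, 1 = object B
scans = rng.normal(0.0, 1.0, (n_trials, n_voxels))
scans[labels == 0, :10] += 1.5               # hypothetical "region A" responds to object A
scans[labels == 1, 10:20] += 1.5             # hypothetical "region B" responds to object B

X_train, X_test, y_train, y_test = train_test_split(
    scans, labels, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Forced-choice accuracy: {clf.score(X_test, y_test):.2f}")
```

Run it and the accuracy comes out near perfect, which tells you more about the setup than about anybody's mind.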
By no means do I intend to condemn all of contemporary neuroscience with my snarky attitude toward this story. As anyone can see on the well-designed and interesting Brain Briefings page from the Society for Neuroscience, potentially useful and perhaps necessary research is going on all the time. And maybe it’s my own personal failing that I see the research featured on 60 Minutes as not only useless but bogus. It’s just that repeatedly hearing people refer to the brain as a machine makes my own gray matter see red.
The brain is not a machine. The brain is in some ways like a machine, but in other ways it’s dramatically different. At times seeing the brain as a computer can be useful, but the analogy breaks down very quickly. Until we have organic machines that evolve through physical experience and embody being not just in brains but throughout their biosystems, there won’t be anything like a "thinking" machine.
Computers are often referred to as mechanical (or digital or synthetic) "brains," and the metaphor is simply the human brain/computer analogy run in reverse. The computer is like a brain; the brain is like a computer. Both versions are limited, and both exhaust their usefulness quickly.
A computer is not a brain. It’s programmed by human beings to collect and process various kinds of information. Thanks to rapidly evolving technologies, computers are now capable of completing more and more sophisticated tasks. That this evolution might bring us Cylons or Borg or Replicators is the stuff of a different, and equally well-loved, literary genre—but we’re not there yet, and we’re still limited by our own models (so far the only things we can imagine look like human beings and other organic life forms).
I have no doubt that without serious evaluation of emerging technologies we might eventually do ourselves in with our own cleverness. And although I’m not afraid that anybody’s going to come up with a Replicator of the sort featured on Stargate: SG1 (and Atlantis), nanotechnology does have the potential to be used badly and to cause problems we haven’t even thought of yet—precisely because we don’t take time to imagine what might come along.
Information, may I remind some of these people, is not knowledge. Knowledge grows out of experience (the broader the better), and if we limit the experiences of our kids to video games, television, personal technologies, and all things digital, I’m not sure what will emerge. It’s a mystery to me how anybody expects us to be able to develop mechanical brains when we no longer exercise the organic versions in ways that originally made all of our fancy technologies possible in the first place.
I doubt that any of the men and women who worked on ENIAC, the first general-purpose electronic computer (originally housed at the Moore School of Electrical Engineering at Penn, across the walk from where I lived on campus), could have imagined the emergence of the microchip while they were busy programming their room-sized machine. The trouble with technology, as I’ve most likely mentioned before, is that the human brains that produce it don’t often take the time to puzzle through to the consequences.
What we really need is to foster the kind of thinking and imagination employed by my favorite sleuths as they go about solving mysteries: the ability to see through the hype and sensation and get to the roots of the problems. Nero Wolfe did it by ruminating over literature and potting orchids, Miss Marple whilst at her knitting, and Poirot by employing his "little gray cells." Were we all to expand our avocations to include physical activity and a bit of reading, we might be less caught up in notions of “mind-reading” and more inclined to put our minds to better use.
Image credits: The epithalamus nicely illustrated, a low-res image from an MRI, and a detail of the back of a panel of ENIAC, showing vacuum tubes, all via Wikimedia Commons. What would I do without these people?
Saturday, June 27, 2009
Shop Talk
In previous posts on the Farm, I've outlined my fondness for William Morris's philosophy of work and the dichotomy he saw between useful work and useless toil. I'm also an advocate of experiential education, the education of the whole person (as opposed to the education of the intellect alone), and (again inspired by Morris), the education of desire--which lies at the very core of my views on the environment and sustainability.
Last week on the Colbert Report, one of the guests was Matthew Crawford, a contributing editor for The New Atlantis and author of a rather intriguing book, Shop Class as Soulcraft: An Inquiry into the Value of Work. I'm already thinking, without having actually read the book (it was preceded by an essay of the same name in The New Atlantis, which I have read), that Morris would love it. And its premise, that doing things is every bit as important intellectually as talking about them, is enough to make me a fan for life. After all, my own little-read tome, More News From Nowhere, describes a society built on both doing and thinking, and fosters a kind of organic education that involves both the concrete and the abstract.
Some of my students may recall our conversations about the Bauhaus, and my arguments against the separation of art and craft. The Bauhaus, after all, combined a Foundation Course in theory (including material on the study of nature and the study of materials) with subsequent workshops, under the tutelage of craftsmen and artists, that put theory into practice.
I was personally affected by the American predilection for "tracking" students in elementary and secondary schools when I returned to the States in time for high school. "Smart" kids were put onto a college prep track, and "the others" were channeled into vocational studies. What that meant for me was no home ec, no art classes, no shop classes. Only academic studies. The one time I managed to break out of the mold was when I was a junior and talked somebody into letting me take a typing class, arguing that it would help me write papers in college.
As it was set up, tracking was overall a bad model and led to a system of inequality that still exists. What seems to have sprung up in its stead (probably in an effort to address issues of "classism" in the seventies) is another bastardization of educational management: the idea that everybody can and should go to college. Now, I'm all for equal opportunity, but what if somebody really wants to spend the rest of his or her life building cars or airplanes or houses? Of course, one can argue that educated citizens make better construction workers--but why do they have to go to college to "get educated"? Especially if these folks are going to sit and stare at their philosophy teacher with the standard "why the hell do I have to take this class" look on their faces.
Matthew Crawford points to the origins of the problem in his interview on the Colbert show, where he spoke of the "pernicious and long history of tracking into vocational and college prep" courses, based on the "dichotomy between mental vs. manual" education--which in turn is based on a perception that "knowledge work" is better than manual work. Somehow, along the line, vocational training got a bad rap, and "knowledge work" took on an elitist mantle.
The idea that four-year colleges are necessary for future success in any field is just bogus. Community colleges and technical schools that provide continuing education in basic subjects like writing, maths, and general science can foster cultural literacy while leaving more time for the practical education students need in order to fight fires, build homes, or arrest felons.
I'm actually a product of the basic prejudice, as the first member of my working-class family to finish college, and I've resented the fact that, as a "smart" kid, I was channeled along paths I might not have taken. Many of my generation (cousins and siblings) followed me along the college route, but ended up in rather more practical professions: nursing and civil engineering. I don't regret the emphasis on intellectual pursuits, nor even the "impracticality" of studying Classics, archaeology, and philosophy. But the emphasis on preparing myself for a purely intellectual career sidetracked me from art and design, and it was only jobs that involved developing design skills that provided the needed education. Tracking backfired and probably started me down the path toward my present anarchic stance on education.
Now, of course, I live in the best of both worlds--for me. I get to teach history and philosophy to design students, and am constantly learning to combine the intellectual with the practical. But the lingering effects of valorizing the former over the latter can be seen in recent debates among my colleagues about how we should teach our students. Some have been educated in environments that lack the structure vocational training maintains, and would prefer to inspire their students to creativity without the strictures of formal lesson plans--and the debate will continue as long as the dichotomy survives.
On the other hand, I've been inspired to focus more carefully on Morris and the Bauhaus in my history classes, in order to emphasize a different history: one in which "knowledge" isn't confined to an intellectual model, but pervades learning in both the physical and mental realms.
Since I frequently have to address the question, "what does this have to do with becoming a graphic designer?" (or a fashion designer or a filmmaker or a web guy), my response usually has something to do with "knowing what's in the box" (and I'm getting rather weary of my own version of the box cliché). I'm actually pretty good at convincing design students that knowing about art history is a valuable adjunct to creativity (if for no other reason than showing them what's been done before). But I'm not sure how we're going to keep educating people in the classics anyway, now that popular culture is hell-bent on denigrating intelligence and shortening attention spans.
As literature and the performing arts continue to lean toward the lowest common denominator, the endurance of any kind of canon is in question. We may need to turn to something like John Ruskin's Working Men's College (which still exists in London) to provide continuing education beyond the fulfillment of vocational requirements--were we to begin paying attention to the training needs of the people who do some of the most important work around. After all, where would most of the folks in Dallas be had there not been electricians to repair downed lines after the last whopping thunderstorm? I strongly suspect that the intellectual vacuum created by reality TV and inane movies will eventually drive at least some of our future plumbers and such to seek intellectual stimulation, on their own terms, just as the workers of nineteenth-century London eagerly took advantage of Ruskin's drawing classes.
I think this issue is well worth pursuing in later posts, so this is by no means my final take. For related earlier comments on Owl's Farm, see the links to the right, and stay tuned for further ruminations.
Image credits: "Workshop" by Felipe Micaroni Lalli; Bauhaus image by Jim Hood; John Ruskin, self portrait, watercolour touched with bodycolour over pencil, all from Wikimedia Commons.Ruskin.jpg
Sunday, June 7, 2009
Venus Revisited: Out of Africa?
So busy was I protesting the innate sexism involved in interpreting Paleolithic art that I neglected yet another, and perhaps even more important, intrinsic bias: racism. All too frequently this aspect of the Big Picture escapes my notice because I'm comfortably ensconced in my lily-white skin. It's fortunate, therefore, that the likes of John McWhorter exist, and I do get reminded every now and then that my views are not the only ones--nor the only reasoned ones. McWhorter is a linguist and fellow of the Manhattan Institute, and one of the most eloquent and thoughtful conservative columnists around. I've started looking on him as the true heir of William F. Buckley, and he's given me a reason to start reading The National Review again.
At any rate, McWhorter's May 15 column for TNR, Big Bosoms and the Big Bang: Did the Human Condition Really Emerge in Europe?? (bowdlerized in the Dallas Morning News on Sunday, May 31, using the subtitle as the title) pointed out a glaring omission in my assessment of the Hohle Fels figure: the underlying assumption behind most descriptions of art from Pleistocene Europe. What McWhorter argues so elegantly against is the notion that some kind of mutation allowed us to become truly human--in Europe, about 30,000 years ago.
Now, as I always point out in my first History of Art & Design I lecture, fully modern human beings provided evidence of artistic inclinations in southern Africa at least 70,000 years ago, by incising designs into bars of ochre and apparently decorating themselves with shell necklaces. Not only that, but Aboriginal rock drawings in Australia date to as early as 40,000 years ago, and some specimens from South Asia may be even older. (See, for evidence, articles at Aboriginal Art Online, and Robert G. Bednarik's 2007 paper, "Lower Paleolithic Rock Art in India and Its Global Context.")
Just as Martin Bernal alerted us to the Eurocentrism involved in our understanding of how the Classical tradition arose in Greece (although he overstated his case by insisting that the Greeks "stole" ideas from Africa and the Near East, and his assertions about the origins of Greek culture in general are highly controversial; see his book Black Athena: The Afroasiatic Roots of Classical Civilization), McWhorter does us a favor by noting that if we look at human origins through the polarized lenses of the Anglo-American archaeological tradition, we end up ignoring very good evidence that if we "became human" at any specific moment, it probably happened in Africa, Asia, or Australia (or all of the above) rather than in Europe.
Although all archaeological evidence is problematic because the record is, by nature, incomplete (rendering female impact sketchy at best, if we spent our early years weaving and gathering and tending babies), the idea of locating our humanity in one small region at one particular time is just silly--whether it's Germany or India or the northwest coast of Australia. So many factors seem to have gone into making us what we are that a small female figure, a carved penis, a disemboweled bison, a design scratched on a pigmented mineral, or a stone adze doesn't even begin to tell us who we are or how we got to be us in the first place.
To my mind, if you're looking for Big Bangs, pick up Richard Wrangham's new book Catching Fire: How Cooking Made Us Human. Now that language, tool-making, and even laughter seem to have been removed from the list of candidates for what makes us "uniquely unique," the remaining difference seems to be this: we're the only species that cooks its food.
Image credit: A bar of incised ochre and other tools from Blombos Cave in South Africa, dated to about 70,000 years BP. Copyright held by Chris Henshilwood, photo from Wikimedia Commons.