Tuesday, December 1, 2009

Bread and Circuses

The conflict between privacy and celebrity has finally reached the point of pure idiocy. Actually, it already had, but the recent flap concerning Tiger Woods (who has actually earned his fame) points to the absurdity with which modern media attend to matters that would once have been purely private.

I don't know why Woods slammed into a tree in the middle of the night. And unless it somehow affects national security (as the also-recent flap concerning party crashers at the White House might), I'm content to keep my nose where it belongs. In my own business.

But prurient interest in what goes on in other people's houses seems to have become an international pastime. Witness the tragicomic antics of the "new" royal family in Britain, where the intensity of this kind of curiosity seems to have ramped up, and of which I was reminded last night as we watched Helen Mirren's performance as Elizabeth II in Stephen Frears's film The Queen. Never having met QEII herself, I can only judge the acting job superficially, but I was completely taken in by the compelling portrayal of a woman caught between the worlds of restrained tradition and media frenzy.

Queen Elizabeth's initial reaction to Princess Diana's death seems to have been that even though Diana was a public figure, her funeral should be private (there were, after all, children involved here). But nobody in the public eye--especially anyone who has played the public for sympathy (whether deserved or not)--gets privacy any more. The outpouring of grief over Diana's death fueled tabloid hysteria almost to the point of conflagration, as if the papers and television realized that they were losing a cash cow and had to wring out every byline and lurid headline before the furor itself died out.

Nowadays, celebrity is being sought for its own sake. Andy Warhol's fifteen minutes of fame idea has become a life-goal for some people. Substance and accomplishment are no longer criteria of fame. Instead, publicity seems to be the sole measure: whoever can garner the most coverage becomes the news item du jour.

All this is one more indication of the shallowing (probably not a word) of the cultural pool of metaphor. No longer deepening through shared experience, interesting connections, or intellectual freshness, the pool of which I frequently speak is just plain drying up. Metaphors are becoming predictable and stale (like the fifteen minutes of fame quip itself), and even the word "like" (which used to signal an upcoming comparison) has become a vocal mannerism equivalent to "uh" or "you know."

The only still-rich sources of metaphor seem to be science and technology, but even these are turning out tropes based on the rampant growth of social media. So "tweet" and "twitter" no longer refer to sounds made by birds, but rather to noises made by people seeking fame in cyberspace. Science contributes new words that refer to increasingly short attention spans: nanosecond and (frequently used inappropriately) quantum leap. A quantum is actually really tiny ("quantum theory deals with the tiniest things we know, the particles that atoms are made of"), rather like the brains of those who seem to be operating the news media these days. Ahem.

I don't really want to belabor my point (because, as you know, I tend to go on and on about things), but I do so miss the days when I didn't really know anything about the lives of those making the news. In fact, all I really care about Tiger Woods is how he plays golf. Although I appreciate the example he has been setting for young people, I'm not going to be using any sports figure as an example when it comes to raising children--unless, of course, that sports figure also turns out to be a Rhodes Scholar or a brilliant scientist or an inspiring teacher as well.

Metaphor is about complexity; the more we know, the more creative the connections we can make. Perhaps now is the time for us to turn off our television sets and bake our own bread. Who knows what would happen if we taught our kids to notice what happens when raisin bread rises, rather than what the latest talentless nincompoop is doing in the latest media circus.

Image credit: Henryk Siemiradzki, A Christian Dirce, nineteenth century. The painting depicts Nero with an executed Christian woman in his circus (Wikimedia Commons). Comparisons of current cultural events and those of imperial Rome are commonplace these days, but not inappropriate. For more on this topic, see Cullen Murphy's recent book, Are We Rome?

Monday, November 23, 2009

Songs of a Distant Earth

Last week I worked for two hours on a post (using WordPad, which doesn’t have an automatic backup) before being thwarted by one of Microsoft’s incessant updates to Vista. I was using WordPad in the first place because I often spend several hours on a post, so I don't compose online, and because Word embeds so much code that it screws with Blogger’s layouts. At any rate, I lost it, and haven’t been back since. But my purpose remained intact, even though I lost steam along with my prose.

The purpose of the original post was to assure anyone who happens upon this blog that, contrary to current popular silliness, the absolutely only thing of any importance scheduled to occur on December 21, 2012, is my sixty-fifth birthday. No, the world is not going to end—not even according to the Maya, misconceptions about whose calendar are at the source of popular culture’s latest charlatan show.

It’s actually a good thing that I didn’t post on this last week for a couple of reasons. The first is that I finally caught up with the November 8 edition of the New York Times’s entertainment section, in which Tyler Gray notes that Roland Emmerich’s film, 2012 (now enjoying huge monetary success and making my job measurably more difficult) isn’t the least bit interested in knowing why the Maya calendar “ends” on this particular date. Gray did the smart thing and interviewed David Stuart (who won a MacArthur “genius” grant when he was eighteen for his already significant work on Mayan language). Stuart points out that December 21, 2012 is simply the end of the latest Baktun and the beginning of the next.

The Maya were, in fact, pretty scientific (rather than prophetic) about their calendrical system, drawing on previous historical patterns to suggest what might happen next. I was amused at Stuart’s expectation that he’d be “dealing with Mr. Emmerich’s misuse of Maya history for his whole career.” I just hope he’s right, however, in thinking that nobody will really take the movie seriously.

But there’s such a raft of idiocy plying the waves of the internet these days that I’m not all that sanguine; maybe folks won’t believe the movie, but they certainly seem to be tuning in to the noise. I urge anyone who’s the least bit interested in the whole 2012 mishegoss to pick up the latest issue of Skeptic magazine (or the National Geographic page linked below), which does a good job of poking holes in twenty different spurious claims about a variety of catastrophes supposed to occur three years from now.

My main beef with all this rigmarole is that it pokes two of my “buttons”: the demise of intelligent thought and the growing lack of understanding—especially among the American populace—of how science works.

What I don’t understand is why people are so gullible, and why Americans in particular seem to have chosen to be believers rather than scientists. American exceptionalism (the view that we’re special, not only politically and economically, but in the eyes of god) may be at the root of it. After all, the United States exists primarily because it was founded by colonists who wanted to practice religions that opposed, or were opposed by, the established churches in the countries from whence they came, and since then politics and religion (though officially separated by a Constitutional clause meant to forestall the establishment of a state religion) have mixed pretty freely. If you're not a professed believer, and preferably a Christian believer, good luck being elected to public office.

But religion doesn’t automatically make one susceptible to goofy claims. The nuns responsible for my early education planted in my nascent brain a love of learning, and especially a love of learning science. Unfortunately, American-style religion, unlike mainstream religion in Europe, has a more damaging aspect: its tendency to instill doubt about science. Recent polls are full of statistics showing that nearly half of Americans think that human beings were created within the last 10,000 years, and claim not to believe in evolution.

The number of people who “believe” in evolution rises significantly as the level of education increases, but asking people if they “believe” in evolution in the first place betrays a fundamental misunderstanding of how science works.

In terms of logic, the problem lies in the fallacy of equivocation: using a word as though it meant the same thing in contexts where it means different things. The classic example is the word theory, which to most people represents an idea or even a guess; in science, however, a theory is a coherent explanation for observable evidence. Although there’s always room for falsification in science, theory “behaves” like law, in that experiments and practice are grounded in it.

So gravity is, in fact, “only” a theory, but we behave as though it’s a law, even though our ideas about what it entails have changed somewhat since the time of Newton (largely because of Albert Einstein’s work). So yes, evolution is a theory; but despite a few gaps here and there, by far the preponderance of evidence leads us to the conclusion that Darwin pretty much had it right when he published Origin of Species 150 years ago tomorrow.

Scientists even use the word belief differently than most folks do—in the sense of expectation rather than pure faith. Astronomers, for example, expect there to be life on other planets based on current evidence.

The existence of extrasolar planets leads them to suspect, with some statistical confidence, that planets bearing some form of life exist. Ideas like these are hard to disprove, because our galaxy alone contains so many stars that not finding the evidence we need to confirm alien life isn’t going to disprove it. But evidence could emerge that strongly suggests life somewhere else. Unless ET comes calling, though (which physics as we currently understand it pretty much precludes), we’re never going to know for sure. So scientific belief really describes a reason-based expectation, perhaps mixed with hope that discovery will occur in our lifetime—but it’s not blind faith.

Quite by coincidence I picked up Arthur Clarke’s Songs of A Distant Earth last Saturday (in part about what it would really take to travel to other planets if we had to; a rather serendipitous event that gave me the title for the post), and a few hours later snagged a copy of the December 2009 National Geographic, with its article, “Seeking New Earths,” by Timothy Ferris. The article talks about efforts to locate earth-like planets in orbit around sun-like stars, because they’re the most likely to have developed the kind of life that evolved here. The statistical likelihood of life seems to be pretty high, but life as we know it not so much. As Ferris points out, “Biological evolution is so inherently unpredictable that even if life originated on a planet identical to Earth at the same time it did here, life on that planet today would almost certainly be very different from terrestrial life.” (93)

The issue also conveniently includes a fine debunking of the 2012 myths: a fitting way to celebrate the publication of Origin of Species, I think, and a reminder that it might be possible to emerge from our quagmire of silliness and start to better use the brains we’ve got, god-given or not. However much I would like there to be extra-terrestrial life, unless somebody discovers an error in Einstein's cosmic speed limit, our fine, big brains will have to depend on what Timothy Ferris calls the richness of the human imagination to discover new planets, to articulate new theories to explain what's happening to our home planet, and to suggest how we might fix it before we're forced to go looking for a new one.

Whether or not we believe in a god, our future depends on embracing science not as a substitute for religion, but as a means to understanding the world. Such understanding does not preclude the possibility of faith, as many established religions have shown. Science and religion are not incompatible, unless one seeks absolute certainty (which science does not afford), or reads scripture literally. Even as a child, when I actually did believe in several deities in sequence, I couldn't help but marvel at the genius of any being who could have invented the complex and labyrinthine process that produced us all. Both our origins and our fate are far more interesting and astounding when seen through a microscope or a telescope than when viewed through a veil of ignorance and wishful thinking.

Image credits: Nick Risinger's conception of the Milky Way galaxy, from Wikimedia Commons, where you can also find an annotated version that shows where we live. The painting of Darwin (also from the Commons) is by George Richmond; Darwin sat for the portrait not long before he sailed on the Beagle.

Wednesday, November 4, 2009

The Multitasking Myth

I am not generally fond of using "myth" in the sense of "lie" or "untruth widely accepted as truth." (I much prefer the notion of myth as cultural storytelling and as a means of preserving cultural identity.) But since this particularly negative sense of the word is being applied to a phenomenon I find extremely deleterious to learning, I'm ready to go along on this one.

So here's the big question. Is it possible for people in general (and young people in particular) to engage in two or more activities at the same time and do any of them well enough to accomplish the purposes attached to each?

The answer seems to be, according to a resounding majority of researchers, "no."

My General Studies colleagues and I have been saved from some measure of dispute by the fact that our department has banned the use of extraneous technologies in the classroom: no laptops, no iPods, no mobile phones, no recording devices--at least without documents supporting ADA accommodation.

Students occasionally try to circumvent the policy by texting under the desk, but their demeanor is so obvious that I usually catch them and suggest strongly that they put the damned phone away and pay attention. As you can imagine, I'm very well loved for this. But it does make me wonder about students' priorities when they can't shut the "communication" devices off for an hour and a half.

I recently came across Rebecca Clay's article for the APA's online journal, Monitor on Psychology, "Mini-multitaskers." Her thesis is pointed, and reveals the fallacy inherent in the notion that young folk can work effectively on several assignments or tasks at a time.

Multitasking may seem modern and efficient, but research suggests that it slows children's productivity, changes the way they learn and may even render social relationships more superficial.

She provides evidence from recent studies that multitaskers do not save time, but actually take longer to accomplish individual efforts because they're not really doing them simultaneously; they're switching back and forth, and the switching adds time to the job rather than reducing it. And the more complex the job, the more time is lost.
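
To make the switching arithmetic concrete, here is a toy sketch in Python. Every number in it (thirty-minute tasks, a two-minute penalty for each shift of attention, five-minute work slices) is invented purely for illustration; none of it comes from Clay's article or the studies she cites. The only point is that the penalty gets paid every single time attention moves.

```python
# Toy illustration of task-switching overhead (all numbers hypothetical).
# Two homework tasks, each needing 30 minutes of focused work, plus an
# assumed 2-minute cost every time attention shifts to a different task.

TASKS = {"essay": 30, "math_problems": 30}   # minutes of real work required
SWITCH_COST = 2                              # minutes lost per switch (assumed)

def one_at_a_time(tasks):
    """Finish each task before starting the next: no switch penalty."""
    return sum(tasks.values())

def interleaved(tasks, slice_minutes=5):
    """Alternate between tasks in short slices, paying the switch cost
    each time attention moves to a different task."""
    remaining = dict(tasks)
    total, current = 0, None
    while any(v > 0 for v in remaining.values()):
        for name in remaining:
            if remaining[name] <= 0:
                continue
            if current is not None and current != name:
                total += SWITCH_COST          # pay for refocusing
            work = min(slice_minutes, remaining[name])
            remaining[name] -= work
            total += work
            current = name
    return total

print("One at a time:", one_at_a_time(TASKS), "minutes")   # 60
print("Interleaved:  ", interleaved(TASKS), "minutes")     # 82 with these assumptions
```

With these made-up numbers, doing the tasks one after the other costs 60 minutes, while interleaving them costs 82; slice the work more finely, or raise the switch cost to reflect a more complex task, and the gap only widens.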

As for my favorite in-class bit of naughty behavior, Clay notes that

Text messaging during class isn't just a high-tech version of passing notes. Because of its demands on attention, multitasking also may impair young people's ability to learn.

This is because, as research out of UCLA indicates, information is processed differently and less effectively when multitasking than it is when one devotes one's full attention to an activity. This may, in turn, exacerbate the problem already bequeathed us by the current emphasis in elementary and high school on rote learning. Dividing attention seems to make it harder for students to truly understand what is being taught; instead, they're more likely to respond by rote, able only to barf back what they've been told without digesting it. Sorry for the ugly analogy, but it's apt.

So, if we combine the short attention spans instilled in our kids by Sesame Street (90 seconds) and television commercials (30 seconds), with the superficiality of rote learning and a constant barrage of digital media (cell phones, iPods, Twitter, Facebook), why doesn't every student have ADHD?

According to Tamara Waters-Wheeler, a North Dakota school psychologist quoted in the article, attention problems are increasing and even if students aren't diagnosed with attention-deficit hyperactivity disorder, they're nonetheless exhibiting symptoms because they've grown up multitasking.

The evidence also suggests that the communities established by text-messaging and social networking may be much more superficial than face-to-face friendship. This is particularly important because as media-mediated friendships increase, the quality of these relationships will change the way we look at community, friendship, and other human interactions. The implications for cultural change are manifold. Clay quotes a developmental psychologist, Patricia Greenfield, who finds this multitasking version of friendship troubling:

We evolved as human beings for face-to-face interaction. As more and more interaction becomes virtual, we could lose qualities like empathy that are probably stimulated by face-to-face interaction.

Clay's is only one of myriad articles and blogs devoted to the emerging problems associated with short attention spans and media addiction. One of the best is from Blogger in Middle Earth, Ken Allen, a New Zealand educator whose post on "Binge Thinking" gets at the meat of the matter, which he describes as "cognitive overload."

In order to deal with this phenomenon, it's probably time to start rethinking how we present material to our students. Even in a fast-paced program like ours (cramming a semester's worth of information and learning into eleven weeks), and even in the face of ever more intense scrutiny by the assessment regime, we've simply got to figure out how to slow things down enough to help students change habits that they've acquired over the last twenty years.

Some of what I've read has inspired me to reduce the amount of information and increase the depth to which we explore it each quarter. This is a tough slog for fact- and information-based courses like art history, but in order to address problems associated with superficial learning (and its potential effect on creativity, which I'll address in a later post), some sort of new approach seems necessary. My students are already hinting at a solution: more workshops. My course-evaluation comments are rife with requests for more hands-on activities to connect theory and practice.

I can assure you, however, that I will address the issues one at a time, and use one medium at a time in order to develop some useful strategies. No multitasking will be employed in pursuit of solutions.

As they say in the funnies, "Stay 'tooned."

Image credit: Printed books will always be my medium of choice, but the updated British Museum Reading Room combines the old and the new so beautifully that I thought I'd encourage readers to check it out. The full resolution photo is available here. It was taken in February, 2006, by Diliff and uploaded in its present form to Wikimedia Commons in November 2008. When I mentioned the British Library last week, and wondered what the new library looked like, I hadn't realized that the old reading room still existed at its original site. Because of its marriage of printed codex and digital technology, this seemed like an appropriate illustration for this post. No short attention spans allowed in this place!

Sunday, October 25, 2009

Technology and Education--Really Cool Toys

This will probably be an ongoing feature of the Owl of Athena, because I keep happening on items that encourage optimism about the future of education. Such encounters aren't frequent enough to make me generally more sanguine about where we're going, but they do help make up for all the rain we've been having lately, which has damaged some books I had stupidly stored in my garage.

The British Library has applied digital page-turning software to a selection of books in its collection, and the results are stunning: The Luttrell Psalter, the Lindisfarne Gospel (with its bejewelled cover in exquisite detail), one of Leonardo's notebooks, Jane Austen's History of England (written by hand when she was thirteen years old)--and more.

The technology, which allows readers to use a mouse to "turn" the pages, makes it almost like actually performing the act of page-turning, except for the texture of the page itself (which you wouldn't be able to feel anyway, because you'd be wearing cotton gloves if you had the text in front of you). The pages aren't merely visible, either: you can rotate, magnify, zoom, or move them about on the screen, and listen to or read commentary on them. For my History of Art & Design I students, this may well prove a valuable resource when it comes time to solve the final design problem--creating an illuminated manuscript of their own.

The technology serves not only codices (both right- and left-reading) but also scrolls, like the Dering Roll, a three-meter-long roll of heraldic devices (featured on the Turning the Pages website). If you can't figure out how to use the icons, the "get help" link supplies clear instructions, and you can minimize the tool bar and otherwise "personalise" your experience as well. Unfortunately, only select pages from each book are featured on the British Library site (twenty-three books at the moment), but each selection includes rich examples of what the book contains.

Some aspects seem a little silly--like the 3D option on the menu, which is otherwise helpful because it offers a choice of categories: handy if, for example, you're interested in Science and Nature but not in Religious texts. Some of the categories are empty, but they provide an idea of what will eventually be available.

Alas, the program currently (in its TTP 2.0 form) only works on PCs running XP or Vista. Some small recompense, perhaps, for those who haven't been completely converted to the Mac cosmos.

The same browser application powers the "Explore the Manuscripts" segment of the English Heritage page on Charles Darwin's home, Down House. Five of Darwin's notebooks are available, and all but the Beagle Diary can be viewed in their entirety.

The folks who made this software possible, Armadillo Systems, have developed it (according to managing director Michael Stocking's blog) "to help museums and libraries provide access and interpretation for their collections." Since many of us will probably never go abroad again, much less obtain access to the rare book collections of the British Library, I can't think of a more noble cause. I also love the emphasis on interpretation, because that's the engine that drives learning, and the broader our perspective (including encounters with primary texts in their original form), the more fluent the interpretation. Now I just have to drive more of my students toward learning Greek and/or Latin, and I can die happy. Some considerable time in the future, I hope.

Image credit: The first page of Matthew's Gospel, from the Lindisfarne Gospel. Wikimedia Commons. For a much more intimate and true-to-life encounter, go to the British Library link mentioned in the post.

Wednesday, October 14, 2009

Metaphors Be With You

I usually start the quarter out talking about the necessity of metaphor in the learning process, but the manifold uses of metaphor and translation in everyday life have been popping up everywhere, prompting this post.

As some of you already know, I'm not an enormous fan of "self help" culture or new-agey spiritual quests, but every now and then I come across a book that falls into one or both of these categories and that proves both useful and interesting. One, Michael Gelb's How to Think Like Leonardo da Vinci, is especially helpful in teaching my students about the importance of making connections (connessione) and embracing ambiguity (sfumato) in the development of creativity. It's so useful, in fact, that I carry it around on my cart (portable office) from class to class, and more than one student has been inspired to check it out of the library.

More recently I happened on Norman Fischer's Sailing Home: Using the Wisdom of Homer's Odyssey to Navigate Life's Perils and Pitfalls (New York: Free Press, 2008), which was on the cheap deal rack at Borders. Always on the lookout for material my students can use for research on their illuminated manuscript project (the "Lotus Eaters" episode in The Odyssey is one of their choices), I thumbed through it, and what little I read I found rather engaging. Fischer is a Buddhist monk, and my long association with the East has made me more sympathetic to Buddhists than to most established religions (I've actually known some pretty cool Buddhists who used to hang out with the Benedictine priests we visited in mountain missions in Taiwan). So I bought it. At three bucks it was cheap at twice the price (as the old saw goes).

While reading through the first chapter, "The Sea of Stories," I found Fischer's treatment of metaphor and its value to storytelling insightful and potentially useful. As he points out, "Creating, processing, and interpreting stories is a major industry" (13), and my students are some of that industry's future purveyors. They're taking my classes in order to help them become better storytellers. In his introduction, Fischer notes that "metaphors condition, far more than we realize, the way we think about ourselves and our world, and therefore the way we are and act" (7), and thus significantly influence the kinds of stories we tell, and our ability to tell them creatively.

As I've mentioned elsewhere on this blog, I'm fond of Gerald Holton's concept of the "cultural pool of metaphor" from which we draw images and ideas that have collected through the cultural experiences of ourselves and our ancestors. And my students probably get tired of hearing me say that "the more you know, the more you can know," because the more you know, the more metaphors you absorb, and the more lenses you have through which to interpret the world. The lens Fischer offers in his book asks us to look at an odyssey as a journey--not a quest, but a journey home. Students who come to understand that metaphor not only know Homer's world better, but can in turn (in Fischer's view) apply that understanding to their own world.

Not long after I finished the first chapter of Sailing Home, and sat down to work, I remembered that I was supposed to look up the word that refers to the human tendency to see patterns where they don't necessarily exist. I found it in the Skeptic's Dictionary: pareidolia, from the Greek para (beside) and eidos (image). It's the phenomenon that encourages people to see faces on Mars and Jesus in tortillas, and has people running to see the Virgin's image in a tree trunk. It's actually an aspect of metaphorical thinking, at least in Gregory Bateson's sense of "seeing as." Since Bateson was a psychologist as well as an anthropologist, it makes sense that he would recognize metaphor as the basis of some psychological practices and processes--like using the Rorschach test.

Pareidolia may also be at the root of some cave paintings, like the "Wounded" or "Leaping" bison from Altamira Cave, where cracks and contours on the cave wall may have suggested the positions of some of the figures. When we look up at the clouds like Opal and Earl (in one of my favorite comic strips, Pickles) or the folks in Strange Brew or Red and Rover and imagine seeing images of animals or other miscellaneous items, we're engaging in metaphor-making on a very basic level.

Parody is another example of "seeing as" and also an example of Terry Barrett's adage that "All art is in part about other art." Being able to see Michelangelo's David in a pair of Calvin Klein jeans is pretty silly--but it probably could sell the jeans, because David is a pretty fair model of a man (another metaphorical construct), as any giggly high school girl in Florence can tell you. I used to have my students produce parodies of famous art works--that is, at least, until I got one too many "booty call" renditions of Ingres's Grande Odalisque. Various artists have created major works that parody those of others, such as Picasso's play on Velázquez's Las Meninas.

Metaphor is such a basic aspect of human intellectual experience that I can't imagine creativity operating without it. Unfortunately, reduced to its most banal, metaphor becomes cliché, and loses its instructive and inspirational energy. But as long as we keep encouraging our students to learn more for the sake of learning, and to expand their storytelling horizons by filling up that cultural pool with new, inventive, different, meaningful, and invigorating metaphors, we'll be doing them and ourselves an enormous favor.

Image Source: "Nausicaa Playing Ball with her Maidens," one of John Flaxman's illustrations for the 1810 edition of The Odyssey. I chose this one because it illustrates the moment when Odysseus begins the final leg of his journey home and entertains Nausicaa's parents with stories of his adventures. It's also the segment of the story that inspired the opening scene for More News From Nowhere. I cleaned the image up a little, but got it from Wikimedia Commons. Fittingly enough, Athena herself appears, floating behind the young women.

A note on the title of this post: I stole it from a bumper sticker.

Saturday, September 5, 2009

A Confederacy of Nincompoops

Since the decline of Western Civilization--at least as manifested in these United States--currently shows signs of accelerating rather than abating any time soon, I thought I'd post this both on Owl's Farm and here on the Owl of Athena because it relates both to political economy and to education.

I have, of course, stolen the title--cheesily altered--from John Kennedy Toole's Pulitzer Prize-winning novel. His hero, Ignatius J. Reilly, was a fully-realized snark and I loved the book--but haven't read it in twenty years. If Toole had lived to see what happened to New Orleans a few years ago, he'd probably have written a sequel; but the world was already too much for him, and he died by his own hand over ten years before A Confederacy of Dunces was finally published.

Toole was a latter-day Jonathan Swift, critical of cultural excess and stupidity in the '60s, and he drew his title from Swift's own "Thoughts on Various Subjects": "When a true genius appears in the world, you may know him by this sign: that the dunces are all in confederacy against him."

Now, I'm not saying that Barack Obama is a "true genius," even though he well may be. He is, after all, a politician, and he seems poised to weasel out of a host of platform promises in some quixotic quest of his own--bipartisanship. But for sure, the dunces and nincompoops have arisen, if not in actual hordes, at least in loud numbers that make the evening news every bloody night, and promise to make it more and more difficult for him to accomplish what he was elected to do.

Another of Swift's aphorisms (from the same source as Toole's title) speaks to the current phenomenon of loud-mouthed, ill-informed rantings that go on in the so-called "town meetings" and that may well signal the end of civil discourse in this country.

Few are qualified to shine in company; but it is in most men's power to be agreeable. The reason, therefore, why conversation runs so low at present, is not the defect of understanding, but pride, vanity, ill-nature, affectation, singularity, positiveness, or some other vice, the effect of a wrong education.

Or of no education at all, perhaps. Otherwise, why would any reasonable human being yank a kid out of class in fear of being "indoctrinated" by the duly elected President of the United States? The promised topic is a pep-talk on staying in school in order to excel in life. Objections to the address on the basis of some paranoid fear of subliminal persuasion to embrace Socialism sound to me like these parents--who really don't want their kids in public school anyway, but can't "afford" to home school them, or sacrifice anything to send them to private parochial schools--just don't want their children educated at all, in any meaningful sense of the word.

They want the Bible taught in school, but they sure as hell don't want Biblical hermeneutics taught because that might cause kids to question particular interpretations of the book itself, or perhaps to insist that it be read in context. They want Creationism or Intelligent Design taught to "balance" the godlessness of "Darwinism," and they don't "believe" in the evidence emerging from science in regard to climate change. They want their kids to read "the classics," but only the ones they approve of. History has to tell it the way it was told when they were kids, despite any evidence that's emerged during the last hundred years or so that contradicts received views. And the United States must never, ever, be portrayed in a negative light. Art history can be taught, but parents want to be assured that little Chase won't see breasts on a Greek statue, or little Britney won't see the giant phallus on a Pacific Island totem, so don't take 'em to a museum.

I know that not all parents act this way, but the furor over President Obama's speech has brought back a flood of memories about recent skirmishes in local public schools, and the constant battles that go on over Texas textbook choices. I long for a new William F. Buckley to appear to bring intelligent voices back into Conservative conversations. David Brooks and Rod Dreher can only pull so much weight, and even they're drowned out when the screamers take the podium and start out-shouting reasonable discussion.

The truth is, if we keep on this path toward willful ignorance, afraid to let our children make up their own minds about issues, they'll never learn to think critically, and there won't be anyone around in the future to solve the problems we're not addressing today.

As I discussed the Norman Conquest in my art and design history classes this week, I was once again reminded that our children don't know much about the history of the world. Medieval life is a mystery to them (except the Disney or Monty Python versions), not because it isn't interesting, but because some nincompoop school system doesn't think "ancient history" is very important. This despite the parallels between the Middle Ages and the present that keep popping up.

Around here it's because you'd have to talk about controversial religious matters, and point out conflicting ideas about the role of religion in the formation of the modern world. But modernity and change are issues that parents don't seem to want to deal with, and they don't seem to be particularly worried about being condemned to repeat what we've forgotten about history.

The focus on education is increasingly seen as "elitism," even as we're told to send everyone to college who can breathe, whether or not he or she is really interested in doing so, or prepared to work at it. Those of us who have gone beyond college are suspect, because so many of us favor thinking carefully instead of proceeding headlong into an argument with nothing but opinions as grounds.

Among Swift's other remarks (not all of which are particularly useful) is this: "Some people take more care to hide their wisdom than their folly."

Some people seem to be reveling in their folly, at the expense of ever attaining wisdom. The old guard--the politicians and commentators who could discuss issues rationally despite their differences, like Ted Kennedy and Bill Buckley--is gone, and I for one miss the folks who could keep us honest and reasonably well-informed. Their measured assessments of current issues are swiftly being replaced by squawking and flummery, and our country is intellectually poorer as a result.

Images: William Hogarth's Chairing the Member, from The Humours of an Election series, 1755. When considering how to "illuminate" this post, I immediately thought of Hogarth's rabble-rousers in this series on popular elections. New Orleans, Mardi Gras Day, 2006: Rex Parade float commemorating A Confederacy of Dunces on Canal Street near the corner of Chartres. It's good to see that Toole's book is still celebrated in his home town. Photo by Infrogmation. Jonathan Swift, portrait by Charles Jervas, 1718. All from Wikimedia Commons.

Saturday, July 25, 2009

Design Lessons

The first two weeks of the summer quarter are now complete, and I thought I'd comment on the experience for those students who happen on the blog--whether on purpose, or because they accidentally poke a link somewhere on the course web page.

I'm trying a couple of newish things this quarter; I say new-ish because I've tried them before with varying levels of success. The first of these is the Protokoll, the old German-university tool taught me by one of my esteemed professors in grad school. The word means something like "minutes" (as in the minutes of a meeting), or "official record," although the English word "protocol" means something more akin to procedure (especially diplomatic) or a kind of etiquette.

My aim in this exercise is a bit of both, but its reinstatement is tied to our new lesson-planning effort in General Studies. While I originally balked at the idea of planning out my "lessons" several years ago when first told that I needed to do it, I've come to embrace the notion enthusiastically. Not only does a well-laid plan help me not forget stuff (which I'm wont to do in my state of advancing age and decrepitude, evoked nicely, I think, by the Per Lindroth caricature that illustrates the post), it also connects to the practice of good design. Since I've been lecturing about the relationship between art and design since week 1, it seems only appropriate that my lectures themselves be well designed.

The Protokoll helps me accomplish, with the aid of my students, the requirement that I bridge one week's topic to the next week's. Asking the students to take over the task accomplishes two further objectives: to engage them in the material, and to encourage participation and interaction among their fellows. So far this is all working fairly well. Most of the small groups assigned to deliver the first Protokoll in their sections have approached it enthusiastically enough, and have acquitted themselves nicely. Organization and logic aren't their strong suits, but I can stress these elements in future, and later efforts will probably be better. My colleague who teaches another section of the second-level course reports equal success with the assignment in his class.

Alas, I'm not faring as well with the other effort, which is to provoke more verbal interaction with the images I'm showing. I'm not as good at the Socratic thing as my philosophy-professor/tennis-coach husband is; but then he doesn't show slides. I'm trying, but the sheer volume of material I have to cover in 3.5 hours is daunting, and I'll have to reduce that if I'm going to get much more conversation. But that's a goal I can work on, and is part of the reason behind planning lessons in the first place.

At the moment I'm still working on a proto-technological model. Not all that long ago, I showed 35 mm slides (on a projector I called "Dead Bessie" in reference to an episode of Firefly)--before the advent of PowerPoint and the in-room technology kiosks now available. In the "olden days" I'd have to haul in whacking great huge television sets with VCRs attached, and I still use the cart that was once Dead Bessie's mode of transportation. PowerPoint provides a superb platform for image-delivery (although I steadfastly refuse to do bullet-pointed lectures that outline what I'm saying, and then hand them out to students; I'm pretty sure that's counter-productive if one is trying to get them to pay attention to what one is saying), and I love being able to show details of images I could once provide only if I had a slide of that detail. But an artifact of the old delivery system is that I'm still psychologically bound to the old images.

In some cases I've modified the sequence of slides, but I'll bet that if I went back and dug up a slide list for Romanticism from ten years ago, it would contain substantially the same examples as this year's list does.

So my lesson planning over the next few quarters will involve going back over what I show, and perhaps coming up with better examples of what I'm trying to teach, and a better design for engaging students in the process of understanding. The ever-swelling popularity of the internet as an educational tool makes it easier to link materials from major museums, both local and international, and my aim is to focus on their collections--rather than on my own.

Stay tuned. Fewer but pithier slide examples, more evocative questions, and stunningly inventive assignments are just around the corner.

Image credit: "The Absent-minded Professor" by Per Lindroth (1878-1933) from the Runeberg Project (an effort to provide online editions of classic Scandinavian literature) via Wikimedia Commons.

Tuesday, June 30, 2009

Little Gray Cells

One of the reasons I love mysteries—at least the classical, cerebral whodunits of writers like Agatha Christie and Rex Stout—is that they employ the intelligence not only of the detectives doing the sleuthing, but also that of the reader or viewer. Nero Wolfe, Hercule Poirot, and Miss Marple all solve mysteries using various kinds of intelligence and wisdom, and depend not so much on sensation as on intrigue for their plot lines. Good puzzles to be solved are their mainstays—not gruesomeness; when there is any, it's usually suggested rather than described in gory detail.

It was with some amusement, therefore, that I watched the two most recent Hercule Poirot episodes Sunday night on Masterpiece Mystery—right after watching a silly and rather gratuitous story reported by Lesley Stahl on 60 Minutes about the latest "mind-reading" advances in neuroscience. The claim, according to the blurb on the program's website, is that "Neuroscience has learned so much about how we think and the brain activity linked to certain thoughts that it is now possible - on a very basic scale - to read a person's mind."

The research reported, on scanning brains to locate certain kinds of activities, is nothing new. But Stahl’s breathless and eager reportage implied that we’re essentially one step away from developing the ability to read people’s thoughts via MRIs and other technomarvels.

Baloney. What the report did show is that given choices between two different objects that activate different regions of the brain, the machine can tell which object is being thought about. Set up the experiment in a particular way, predispose the subject (and the machine) to think about or look at specific objects, and Bingo! The stuff of science fiction becomes the stuff of reality. Only the media thirst for new and spectacular results of the hottest scientific “breakthroughs” would translate this information into something it’s not: not all that new, not all that interesting, and, ultimately, of questionable use. The researchers, drooling at the chance to keep their funds rolling in by hitting the big time on 60 Minutes, play along gleefully. Yes! This is the first step toward . . . whatever.
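
For what it's worth, the parlor trick is easy to reproduce in miniature. Here is a minimal sketch in Python on simulated data; the fifty "voxels," the noise levels, and the nearest-pattern decision rule are all stand-ins of my own invention, not anything taken from the research 60 Minutes reported. Note that the "decoder" can only ever answer "hammer" or "house," because those are the only two patterns it was ever given.

```python
import numpy as np

# Toy stand-in for a forced-choice brain "decoder" (all data simulated).
# Each "scan" is a vector of 50 voxel activations. Thinking about a hammer
# vs. a house is simulated as two different average patterns plus noise.
# The decoder simply picks whichever average a new scan is closer to.

rng = np.random.default_rng(0)
VOXELS = 50
hammer_pattern = rng.normal(0, 1, VOXELS)   # assumed "hammer" signature
house_pattern = rng.normal(0, 1, VOXELS)    # assumed "house" signature

def simulate_scan(pattern, noise=1.0):
    """One noisy 'scan' of a subject thinking about the given object."""
    return pattern + rng.normal(0, noise, VOXELS)

# "Train" by averaging a handful of scans per category.
hammer_mean = np.mean([simulate_scan(hammer_pattern) for _ in range(20)], axis=0)
house_mean = np.mean([simulate_scan(house_pattern) for _ in range(20)], axis=0)

def decode(scan):
    """Return whichever of the two known categories is nearer.
    The decoder cannot answer anything outside this two-item menu."""
    d_hammer = np.linalg.norm(scan - hammer_mean)
    d_house = np.linalg.norm(scan - house_mean)
    return "hammer" if d_hammer < d_house else "house"

# Test on fresh simulated scans: accuracy is high, but only because the
# question was rigged to have exactly two possible answers.
tests = [(simulate_scan(hammer_pattern), "hammer") for _ in range(50)] + \
        [(simulate_scan(house_pattern), "house") for _ in range(50)]
correct = sum(decode(scan) == label for scan, label in tests)
print(f"{correct}/100 correct on a two-way forced choice")
```

It will score near-perfectly on that rigged two-way question, and it still has nothing whatever to say about what anyone is actually thinking.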

By no means do I intend to condemn all of contemporary neuroscience with my snarky attitude toward this story. As anyone can see on the well-designed and interesting page, Brainbriefings, from The Society for Neuroscience, potentially useful and perhaps necessary research is going on all the time. And maybe it’s my own personal failing that sees the research featured on 60 Minutes as not only useless but bogus. It’s just that repeatedly hearing people refer to the brain as a machine makes my own gray matter cause me to see red.

The brain is not a machine. The brain is in some ways like a machine, but in other ways it’s dramatically different. At times seeing the brain as a computer can be useful, but the analogy breaks down very quickly. Until we have organic machines that evolve through physical experience and embody being not just in brains but throughout their biosystems, there won’t be anything like a "thinking" machine.

Computers are often referred to as mechanical (or digital or synthetic) "brains," and the most frequent metaphor employed is the same human brain/computer metaphor in reverse. The computer is like a brain. A brain is like a computer. Both are limited, and both exhaust their usefulness quickly.

A computer is not a brain. It’s programmed by human beings to collect and process various kinds of information. Thanks to rapidly evolving technologies, computers are now capable of completing more and more sophisticated tasks. That this evolution might bring us Cylons or Borg or Replicators is the stuff of a different, and equally well-loved literary genre—but we’re not there yet, and we’re actually limited by our own models (so far the only things we can imagine look like human beings and other organic life forms).

I have no doubt that without serious evaluation of emerging technologies we might eventually do ourselves in with our own cleverness. And although I’m not afraid that anybody’s going to come up with a Replicator of the sort featured on Stargate: SG1 (and Atlantis), nanotechnology does have the potential to be used badly and to cause problems we haven’t even thought of yet—precisely because we don’t take time to imagine what might come along.

Information, may I remind some of these people, is not knowledge. Knowledge grows out of experience (the broader the better), and if we limit the experiences of our kids to video games, television, personal technologies, and all things digital, I’m not sure what will emerge. It’s a mystery to me how anybody expects us to be able to develop mechanical brains when we no longer exercise the organic versions in ways that originally made all of our fancy technologies possible in the first place.

I doubt that any of the men and women who worked on ENIAC, the first general-purpose electronic computer ever made (originally housed at the Moore School of Electrical Engineering at Penn, across the walk from where I lived on campus), could have imagined the emergence of the microchip when they were busy programming their room-sized machine. The trouble with technology, as I’ve most likely mentioned before, is that the human brains that produce it don’t often take the time to puzzle through to the consequences.

What we really need is to foster the kind of thinking and imagination employed by my favorite sleuths as they go about solving mysteries: the ability to see through the hype and sensation and get to the roots of the problems. Nero Wolfe did it by ruminating over literature and potting orchids, Miss Marple whilst at her knitting, and Poirot by employing his "little gray cells." Were we all to expand our avocations to include physical activities and a bit of reading, we might be less caught up in notions of “mind-reading” and better able to put our minds to good use.

Image credits: The epithalamus nicely illustrated, a low-res image from an MRI, and a detail of the back of a panel of ENIAC, showing vacuum tubes, all via Wikimedia Commons. What would I do without these people?

Saturday, June 27, 2009

Shop Talk

In previous posts on the Farm, I've outlined my fondness for William Morris's philosophy of work and the dichotomy he saw between useful work and useless toil. I'm also an advocate of experiential education, the education of the whole person (as opposed to the education of the intellect alone), and (again inspired by Morris), the education of desire--which lies at the very core of my views on the environment and sustainability.

Last week on the Colbert Report, one of the guests was Matthew Crawford, a contributing editor for The New Atlantis and author of a rather intriguing book, Shop Class as Soulcraft: An Inquiry into the Value of Work. I'm already thinking, without having actually read the book (it was preceded by an essay of the same name in The New Atlantis, which I have read), that Morris would love it. And its premise, that doing things is every bit as important intellectually as talking about them, is enough to make me a fan for life. After all, my own little-read tome, More News From Nowhere, describes a society built on both doing and thinking, and fosters a kind of organic education that involves both the concrete and the abstract.

Some of my students may recall our conversations about the Bauhaus, and my arguments against the separation of art and craft. The Bauhaus, after all, combined a Foundation Course in theory (including material on the study of nature and the study of materials) followed by workshops under the tutelage of craftsmen and artists that put theory into practice.

I was personally affected by the American predilection for "tracking" students in elementary and secondary schools when I returned to the States in time for high school. "Smart" kids were put onto a college prep track, and "the others" were channeled into vocational studies. What that meant for me was no home ec, no art classes, no shop classes. Only academic studies. The one time I managed to break out of the mold was when I was a Junior and talked somebody into letting me take a typing class because I argued that it would help me write papers in college.

As it was set up, tracking was overall a bad model and led to a system of inequality that still exists. What seems to have sprung up in its stead (probably in an effort to address issues of "classism" in the seventies) is another bastardization of educational management: the idea that everybody can and should go to college. Now, I'm all for equal opportunity, but what if somebody really wants to spend the rest of his or her life building cars or airplanes or houses? Of course, one can argue that educated citizens make better construction workers--but why do they have to go to college to "get educated"? Especially if these folks are going to sit and stare at their philosophy teacher with the standard "why the hell do I have to take this class" look on their faces.

Matthew Crawford points to the origins of the problem in his interview on the Colbert show, where he spoke of the "pernicious and long history of tracking into vocational and college prep" courses, based on the "dichotomy between mental vs. manual" education--which in turn is based on a perception that "knowledge work" is better than manual work. Somehow, along the line, vocational training got a bad rap, and "knowledge work" took on an elitist mantle.

The idea that four-year colleges are necessary for future success in any field is just bogus. Community colleges or technical schools that provide continuing education in basic subjects like writing, maths, and general science can foster cultural literacy, leaving more time for practical education in how to fight fires, build homes, or arrest felons.

I'm actually a product of the basic prejudice, as the first member of my working-class family to finish college, and I've resented the fact that, as a "smart" kid, I was channeled along paths I might not have taken. Many of my generation (cousins and siblings) followed me along the college route, but ended up in rather more practical professions: nursing and civil engineering. I don't regret the emphasis on intellectual pursuits, nor even the "impracticality" of studying Classics, archaeology, and philosophy. But the emphasis on preparing myself for a purely intellectual career sidetracked me from art and design, and it was only jobs that involved developing design skills that provided the needed education. Tracking backfired and probably started me down the path toward my present anarchic stance on education.

Now, of course, I live in the best of both worlds--for me. I get to teach history and philosophy to design students, and am constantly learning to combine the intellectual with the practical. But the lingering effects of valorizing the former over the latter can be seen in recent debates among my colleagues about how we should teach our students. Some have been educated in environments that lack the structure vocational training maintains, and would prefer to inspire their students to creativity without the strictures of formal lesson plans--and the debate will continue as long as the dichotomy survives.

On the other hand, I've been inspired to focus more carefully on Morris and the Bauhaus in my history classes, in order to emphasize a different history: one in which "knowledge" isn't confined to an intellectual model, but pervades learning in both the physical and mental realms.

Since I frequently have to address the question, "what does this have to do with becoming a graphic designer?" (or a fashion designer or a filmmaker or a web guy), my response usually has something to do with "knowing what's in the box" (and I'm getting rather weary of my own version of the box cliché). I'm actually pretty good at convincing design students that knowing about art history is a valuable adjunct to creativity (if for no other reason than showing them what's been done before). But I'm not sure how we're going to keep educating people in the classics anyway, now that popular culture is hell-bent on denigrating intelligence and shortening attention spans.

As literature and the performing arts continue to lean toward the lowest common denominator, the endurance of any kind of canon is in question. We may need to turn to something like John Ruskin's Working Men's College (which still exists in London) to provide continuing education beyond the fulfillment of vocational requirements--were we to begin paying attention to the training needs of the people who do some of the most important work around. After all, where would most of the folks in Dallas be had there not been electricians to repair downed lines after the last whopping thunderstorm? I strongly suspect that the intellectual vacuum created by reality TV and inane movies will eventually drive at least some of our future plumbers and such to seek intellectual stimulation, on their own terms, just as the workers of nineteenth-century London eagerly took advantage of Ruskin's drawing classes.

I think this issue is well worth pursuing in later posts, so this is by no means my final take. For related earlier comments on Owl's Farm, see the links to the right, and stay tuned for further ruminations.

Image credits: "Workshop" by Felipe Micaroni Lalli; Bauhaus image by Jim Hood; John Ruskin, self portrait, watercolour touched with bodycolour over pencil, all from Wikimedia Commons.

Sunday, June 7, 2009

Venus Revisited: Out of Africa?

So busy was I protesting the innate sexism involved in interpreting Paleolithic art that I neglected yet another, and perhaps even more important intrinsic bias: racism. All too frequently this aspect of the Big Picture escapes my notice because I'm comfortably ensconced in my lily white skin. It's fortunate, therefore, that the likes of John McWhorter exist, and I do get reminded every now and then that my views are not the only ones--nor the only reasoned ones. McWhorter is a linguist and fellow of the Manhattan Institute, and one of the most eloquent and thoughtful conservative columnists around. I've started looking on him as the true heir of William F. Buckley, and he's given me a reason to start reading The New Republic again.

At any rate, McWhorter's May 15 column for TNR, Big Bosoms and the Big Bang: Did the Human Condition Really Emerge in Europe? (bowdlerized in the Dallas Morning News on Sunday, May 31, using the subtitle as the title), pointed out a glaring omission in my assessment of the Hohle Fels figure: the underlying assumption behind most descriptions of art from Pleistocene Europe. What McWhorter argues so elegantly against is the notion that some kind of mutation allowed us to become truly human--in Europe, about 30,000 years ago.

Now, as I always point out in my first History of Art & Design I lecture, fully modern human beings provided evidence of artistic inclinations in southern Africa at least 70,000 years ago, by incising designs into bars of ochre and apparently decorating themselves with shell necklaces. Not only that, but aboriginal rock drawings in Australia date to as early as 40,000 years ago, and some specimens from South Asia may be even older. (See, for evidence, articles at Aboriginal Art Online, and Robert G. Bednarik's 2007 paper, "Lower Paleolithic Rock Art in India and Its Global Context.")

Just as Martin Bernal alerted us to the Eurocentrism involved in our understanding of how the Classical tradition arose in Greece (although he overstated his case by insisting that the Greeks "stole" ideas from Africa and the Near East, and his assertions about the origins of Greek culture in general are highly controversial; see his book Black Athena: The Afroasiatic Roots of Classical Civilization), McWhorter does us a favor by noting that if we look at human origins through the polarized lenses of the Anglo-American archaeological tradition, we end up ignoring very good evidence that if we "became human" at any specific moment, it probably took place in Africa, Asia, or Australia (or all of the above) rather than in Europe.

Although all archaeological evidence is problematic because the record is, by nature, incomplete (rendering female impact sketchy at best, if we spent our early years weaving and gathering and tending babies), the idea of locating our humanity in one small region at one particular time is just silly--whether it's Germany or India or the northwest coast of Australia. So many factors seem to have gone into making us what we are that a small female figure, a carved penis, a disemboweled bison, a design scratched on a pigmented mineral, or a stone adze doesn't even begin to tell us who we are or how we got to be us in the first place.

To my mind, if you're looking for Big Bangs, pick up Richard Wrangham's new book Catching Fire: How Cooking Made Us Human. Now that language, tool-making, and even laughter seem to have been crossed off the list of candidates for what makes us "uniquely unique," the remaining difference seems to be this: we're the only species that cooks its food.

Image credit: A bar of incised ochre and other tools from Blombos Cave in South Africa, dated to about 70,000 years BP. Copyright held by Chris Henshilwood, photo from Wikimedia Commons.

Monday, May 25, 2009

Women, Sex, and Paleolithic Art

This is a subject upon which I rant in class with measurable frequency (at least once per quarter, two classes, four times a year). Although my students have been spared this quarter because I've been on leave, I was reminded forcefully of the problem by recent headlines about a little carved ivory figurine found in Hohle Fels Cave in Germany. The tiny figure, possibly meant to be worn around the neck on a string, has been described as "Prehistoric Porn" and characterized by University of Tübingen archaeologist Nicholas Conard (who should know better) as "sexually charged." A series of detailed photos is available at Spiegel Online, and it's pretty obvious that media hype took over on this one fairly quickly.

I've already mentioned this event (because the media silliness has overshadowed the discovery itself) on The Farm in the "Nature and/as Nurture" post. But it's time to inflict my angst on any hapless reader who finds this essay (and they well might if they're looking for prehistoric pornography, due to the tags I've chosen), because the story is really about ignorance and the need for enlightenment. I am bloody well sick and tired of bad interpretation in general, and in particular of the idiocy surrounding the steatopygous figures (so-called "Venuses," even though they have nothing to do with the Roman goddess of erotic love, who didn't come along until tens of thousands of years later) found in prehistoric Europe and Asia Minor. It's another example of bad metaphor at work, and probably more evidence of the impact of what Elizabeth Fisher calls "the pernicious analogy."

Fisher's book, Woman's Creation: Sexual Evolution and the Shaping of Society, was published in 1979, but I knew nothing of it until I read Ursula K. Le Guin's essay, "The Carrier Bag Theory of Fiction" (reprinted in 1996 in The Ecocriticism Reader, and available online through Google Books; I first read it in her essay anthology, Dancing at the Edge of the World, published in 1989). I located Woman's Creation at the university library, and later copied it in toto because it was already out of print. In her book, Fisher introduces an interesting hypothesis, which is probably even more convincing now that we know more about early technologies than we once did.

Here's the basic problem. When I first studied anthropology in the mid-sixties, the prevailing notion of human nature was homo faber: man the maker (emphasis on man, for my purposes). What made us human, and significantly different from other species, so the idea went, was that we made and used tools. Of course, later evidence emerged that smudged that theory pretty seriously, when other animals (primate and otherwise) were seen to engage in tool-making and -using. But the initial observation was based on a bogus idea in the first place--and one you can see working in the "Dawn of Man" sequence of the Clarke/Kubrick film, 2001: A Space Odyssey. What makes us human in that film is 1) the influence of a higher intelligence that 2) teaches us how to whack each other over the head with "tools" made of animal bones. (If this sounds familiar to Farm readers, I've probably already held forth on a similar topic; while I was looking for such mention I discovered just how repetitive I tend to be.)

One of the problems with the archaeological record in the first place is that it's always incomplete. It consists only of artifacts that have managed to survive the millennia. This automatically precludes anything perishable: tissue and fiber especially (unless preserved in a medium like a peat bog or ice). So anything woven from plant materials is very likely to have been devoured by the same little beasties that make sure human flesh doesn't stay on buried bones. If, as Elizabeth Wayland Barber (in Women's Work: The First 20,000 Years) and Elizabeth Fisher both claim, women's technologies had to do with carrying babies and food (weaving, basket-making, etc.), little of that activity will be represented. All we find (even in my own limited experience as an archaeologist) are sturdier materials, such as the stuff of which tools are made. Hence the original assumption.

The neglect of women in early considerations of human nature is based on the absence of evidence, not its presence. It's also based on a notion of male prowess and sexual power that may have been entirely different during the late Pleistocene from what it is today (with our current preoccupations with drugs to alleviate "diseases" such as "ED" and "Low T"). Elizabeth Fisher suggests that until human beings settled into more or less permanent agricultural communities with their recently domesticated animals, men may not have known that they had anything to do with "fathering" children at all. Even in some extant small-scale societies, sex and parenthood are separate functions, because children are seen as gifts from spirits or gods. Fatherhood in such situations is a social role rather than a biological one, because the "actual" father may be someone else entirely (especially in cultures where young girls are married off to much older men). Women are the givers and nurturers of children, not men. The relationship between sex and procreation becomes clearer once the consequences are observable in animals with shorter gestation periods, and that's when any egalitarian relationship between men and women that may have existed before settled agriculture begins to erode.

The year I started college, 1966, marked a sea change in the understanding of hunting and gathering cultures. As research by Richard Lee and Irven DeVore among peoples like the !Kung San in southwestern Africa showed us, gathering was responsible for a substantial portion of any group's caloric intake. Not hunting: gathering, performed by women collecting food and carrying it home in baskets, nets, and other containers, also created by women.

Settled agriculture and the planting of seeds changed all that, as Fisher makes clear. Not only did men discover their role in making babies, but they began to imagine a connection between the planting and growing of seeds and the "planting" of "seeds" in the womb of the mother. Of course we know now that the analogy is faulty because a seed is itself a fertilized ovum; a man cannot "plant" a baby in its mother--he needs the mother's egg. But back on the early farmstead, one can imagine the scenario:

Women are powerful creatures. They bleed every month, have babies, give milk, gather and prepare food, weave clothing and containers. Men hunt occasionally, distribute the meat, make tools to hunt; later, they perhaps make the implements to cultivate crops, and they may plant the fields. Perhaps they harvest. And then one of them rises up, beats his chest and says to his buddies: We are the planters of the seed. The woman is only the dirt in which we plant the seed that makes the child. She is an empty vessel. We have the power of life.

Hence what Fisher calls the "pernicious analogy": semen = seed. So it shouldn't be surprising that when modern males (even some who are smart enough to know what the role of women was really like) uncover voluptuous-looking figures of women, they're quite comfortable seeing these figures as the prehistoric equivalent of Playboy.

I always thought it odd that men might carry around miniature dollies to arouse their passions, and thought it far more likely that the shape of these figures might instead provide an inspiration for women. Gatherers don't usually carry around a lot of body fat, and if a woman's body fat falls below about 10%, her monthly menstrual/ovulatory cycles cease and she can't conceive. Women with a substantial amount of body fat, however, especially in the breasts, stomachs, hips, buttocks, and thighs, are surely fertile or have already borne children.

So before folks start going off on porn, perhaps they should consider the very real possibility that these examples of portable art were carried by women as fertility talismans. Men were undoubtedly too busy polishing their spears (ahem) and telling hunting yarns to play with dolls. In truth, the only things it's really possible to know about these figures are where they were found, what they were made of, and roughly how old they are. The date might be interesting (we, Homo sapiens sapiens, keep getting older and older with each new discovery), but more bad metaphors don't do anybody any good.

Addendum: A more recent article in the Huffington Post offers a different interpretation, and points to the existence of a goddess culture in Paleolithic Europe. As sympathetic as I am to ideas other than the notion of "Venuses" and pornography, the jury on the goddess culture is still out--for reasons that I address above: we can only be certain about the location, the material composition, and the date of these figures. The older they are, the harder it is for us to know anything else.

Image credits: I pinched the Hohle Fels figurine image from this excellent Science Daily article because it invited me to "enlarge" it. The Woman from Willendorf is from Wikimedia Commons and taken by Oke.

Thursday, April 30, 2009

Losing Language, Losing Meaning

In my first outing post-surgery (after being told by my surgeon that I was well on the mend and good to go for driving, fewer than three weeks from the day he cracked my chest), my daughter and I celebrated at Starbucks and then headed for Half Price Books. I always look in the nature writers and science fiction sections first, and was rewarded on both counts. The best find was Ursula Le Guin's Lavinia, a story based on the last part of Vergil's Aeneid, and on a woman (Aeneas's "native" wife) barely mentioned in the poem. But no one is better at imagining worlds not her own than Ursula K. Le Guin, and I snatched the book up in a nonce. Well, maybe two nonces (you'd have to be a fan of A Funny Thing Happened on the Way to the Forum to get that one). Please note that the owl is only a coincidence.

Le Guin's opening remarks lament the loss of Latin, and the fact that it won't be long before nobody will be reading Vergil in the original. Already I have to explain to my students who Vergil was and why Dante would be following him around in the Divine Comedy. My own Latin is barely functional; I learnt what I know only because it was required of a Greek major at U.C. Riverside, but I can find my way through the geography of grammar and syntax with the help of a primer and a dictionary. Still, I know exactly what she means. These languages are on their way to the cultural dustbin and will soon become the sole purview of wizened scholars and other odd folk.

If only we knew what we were losing! If only today's students understood how much richer their own language and their lives would be were they supplemented by the words and wisdom of the ancient world (people who already made the mistakes we're in the process of making now). But even the basic etymologies of English words are becoming lost to them. They no longer care about, or are even vaguely interested in, why our words are shaped the way they are. Most don't know what "sanguine" means, much less its subtleties: it can mean both optimistic and bloody (as the character Zoe points out in an episode of Firefly; but that show was canceled, so there goes another ed-op).

Speaking of blood, which is actually where all this is headed, I'm now a member of a very large community of people dependent on the drug Coumadin (generic name warfarin), an anticoagulant that will keep my blood flowing freely through my shiny new mechanical aortic valve. Unfortunately, these drugs also suffer from lost language and bad metaphor, because they are commonly known as "blood thinners." But they don't thin the blood at all; that's simply somebody's lame attempt at helping the dumb and dumberer understand what "anticoagulation" means, without simply pointing out the etymology of the term: "anti" = "against"; "coagulate" (from the Latin coagulare, to curdle) = clot; hence a usable definition: "prevention of clotting." It has nothing to do with the "thickness" of blood, but apparently people have less trouble with inappropriate metaphors (they can understand thick and thin better than clot-prevention for some reason) than they do with knowing that a word was derived from a more useful metaphor: the curdling process involved in cheese-making. (It occurs to me, as an afterthought, that most people no longer know much about how cheese is made, so perhaps that's the reason for the bad metaphor.)

So better metaphors are out there; after all, it would be quite simple to understand that you want the consistency of your blood to be more like single curds flowing freely in whey than like mozzarella! Or, for that matter, to keep the consistency balanced so that it isn't entirely whey and certainly doesn't become even as dense as ricotta. The "whey" would represent the condition (not thickness) that might generate free bleeding (the absence of clotting), and the flow of curds through whey would represent blood that retains the potential to clot when necessary (so you don't bleed to death), but not so easily that the offending conglomerations would jam up and cause a stroke. "Thick" vs. "thin" is not only simple, it's simplistic; it doesn't even come close to describing the process.

In the world of tweets and twitters and twits, there's no room for the subtle--especially if it takes up more characters (anticoagulation = 15; "blood thinning" = 13, not counting the space). Also, if you don't know anything about language, you have to look the word up in a good dictionary. How many folks even own one any more?

Where does this leave us? Not in a happy place, I think. Several years ago I began to notice that students were becoming less and less able to identify where artists were from by looking at their names. They can no longer tell that Leonardo and Michelangelo are Italian, that Chardin and Watteau are French, that van der Weyden is Flemish, and that Turner is English. They can't identify a Russian or a Greek name without being told where the artist is from. They're better at Spanish names, and Arabic, which makes sense, but they've lost everything else unless they've studied a language in high school (no longer a requirement for graduation, or if it is, they just take Spanish by default) or have parents or relatives from other parts of the world. Asian names seem to be recognizable in general, but not those from specific countries (they can't tell Korean from Japanese from Chinese). I attribute this vague familiarity to manga, anime, and video games, which tend to be only vaguely Asian and not ethnically specific.

While I was still in hospital, that hub of multiculturalism in otherwise xenophobic America, I played a game at which I ended up batting about .850: guessing staff members' nationalities by their names. I missed one, because his surname (Tan) was a remnant of the Chinese in his ancestry, but all of his immediate family were Filipino. I guessed Chinese, of course--even though he didn't look it (Filipinos have a more varied ethnic background, including European and native islander ancestry, than most Chinese do, and it shows in their physiognomy). I also spent three of my five years in Taiwan being educated by Filipina nuns, so I've had a lot of experience with both. At any rate, I caught the Thais, the Vietnamese, the Chinese, and even (sort of) the Danish Colombian gas-passer who ministered to me during surgery. I guessed Swedish, but he forgave me.

Video games and anime have done their part in spurring some students to learn some Japanese, and sometimes Chinese, but there's less of an interest in European languages, seemingly because "they all speak English." Meanwhile, the State Department, the armed forces, and the CIA are on constant vigil for speakers of Arabic, Farsi, African languages, and even French, German, and Russian--but I'm pretty sure they end up having to train most of their promising candidates in specific languages.

The only spark of hope in the current situation is the increasing evidence on language and aging. Studies indicate that acquiring a second or third language engages one's brain and helps to sustain cognitive function. This doesn't make much sense to a twenty-year-old, but if you're my age (and in danger of stroking out if you don't take your meds properly), any such news is good news.

Now, where did I shelve that Latin grammar?

Image credits: Æneas lands on the shores of Latium with his son Ascanius behind him; on the left, a sow tells him where to found his city. (Lavinia's father was the king of Latium.) British Museum. Roman marble relief, 140-150 CE. Copyright Marie-Lan Nguyen/Wikimedia Commons.