Monday, November 23, 2009

Songs of a Distant Earth

Last week I worked for two hours on a post (using WordPad, which doesn’t have an automatic backup) before being thwarted by one of Microsoft’s incessant updates to Vista. I was using WordPad in the first place because I often spend several hours on a post, so I don’t tend to compose online; and since Word embeds so much code that it screws with Blogger’s layouts, I compose in WordPad instead. At any rate, I lost the post, and haven’t been back since. But my purpose remained intact, even though I lost steam along with my prose.

The purpose of the original post was to assure anyone who happens upon this blog that, contrary to current popular silliness, absolutely the only thing of any importance scheduled to occur on December 21, 2012, is my sixty-fifth birthday. No, the world is not going to end—not even according to the Maya, misconceptions about whose calendar lie at the root of popular culture’s latest charlatan show.

It’s actually a good thing that I didn’t post on this last week, for a couple of reasons. The first is that I finally caught up with the November 8 edition of the New York Times’s entertainment section, in which Tyler Gray notes that Roland Emmerich’s film, 2012 (now enjoying huge monetary success and making my job measurably more difficult), isn’t the least bit interested in knowing why the Maya calendar “ends” on this particular date. Gray did the smart thing and interviewed David Stuart (who won a MacArthur “genius” grant when he was eighteen for his already significant work on Mayan language). Stuart points out that December 21, 2012, is simply the end of the latest Baktun and the beginning of the next.

The Maya were, in fact, pretty scientific (rather than prophetic) about their calendrical system, drawing on previous historical patterns to suggest what might happen next. I was amused at Stuart’s expectation that he’d be “dealing with Mr. Emmerich’s misuse of Maya history for his whole career.” I just hope he’s right, however, in thinking that nobody will really take the movie seriously.

But there’s such a raft of idiocy plying the waves of the internet these days that I’m not all that sanguine; maybe folks won’t believe the movie, but they certainly seem to be tuning in to the noise. I urge anyone who’s the least bit interested in the whole 2012 mishegoss to pick up the latest issue of Skeptic magazine (or the National Geographic page linked below), which does a good job of poking holes in twenty different spurious claims about a variety of catastrophes supposed to occur three years from now.

My main beef with all this rigmarole is that it pushes two of my “buttons”: the demise of intelligent thought and the growing lack of understanding, especially among the American populace, of how science works.

What I don’t understand is why people are so gullible, and why Americans in particular seem to have chosen to be believers rather than scientists. American exceptionalism (the view that we’re special, not only politically and economically, but in the eyes of god) may be at the root of it. After all, the United States exists primarily because it was founded by colonists who wanted to practice religions that were opposed to, or opposed by, the established churches in the countries whence they came, and since then politics and religion (though officially separated by a Constitutional clause meant to forestall the establishment of a state religion) have mixed pretty freely. If you’re not a professed believer, and preferably a Christian believer, good luck being elected to public office.

But religion doesn’t automatically make one susceptible to goofy claims. The nuns responsible for my early education planted in my nascent brain a love of learning, and especially a love of learning science. The more damaging aspect of American-style religion, unfortunately, is its tendency to instill doubt about science in a way that mainstream religion in Europe does not. Recent polls are full of statistics showing that nearly half of Americans think that human beings were created within the last 10,000 years, and claim not to believe in evolution.

The number of people who “believe” in evolution rises significantly as the level of education increases, but asking people if they “believe” in evolution in the first place betrays a fundamental misunderstanding of how science works.

In terms of logic, the problem lies in the fallacy of equivocation: using a single word as though it meant the same thing in two different contexts. The classic example is the word theory, which to most people means an idea or even a guess; in science, however, a theory is a coherent explanation for observable evidence. Although there’s always room for falsification in science, a theory “behaves” like a law, in that experiments and practice are grounded in it.

So gravity is, in fact, “only” a theory, but we behave as though it’s a law, even though our ideas about what it entails have changed somewhat since the time of Newton (largely because of Albert Einstein’s work). So yes, evolution is a theory; but despite a few gaps here and there, the preponderance of evidence leads us to the conclusion that Darwin pretty much had it right when he published Origin of Species 150 years ago tomorrow.

Scientists even use the word belief differently than most folks do—in the sense of expectation rather than pure faith. Astronomers, for example, expect there to be life on other planets based on current evidence.

The existence of extrasolar planets leads them to suspect, with some statistical confidence, that planets bearing some form of life do exist. Ideas like these are hard to disprove, because our galaxy alone contains so many stars that failing to find the evidence we need to confirm alien life isn’t going to rule it out; but evidence could well emerge that strongly suggests life somewhere else. Unless ET comes calling, though (which physics as we currently understand it pretty much precludes), we’re never going to know for sure. So scientific belief really describes a reason-based expectation, perhaps mixed with hope that the discovery will occur in our lifetime—but it’s not blind faith.
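To make the idea of a reason-based expectation concrete, here is a minimal sketch of the kind of back-of-the-envelope arithmetic astronomers use, in the spirit of Frank Drake's famous equation. The Python below is mine, not anything from this post or from the astronomers it mentions, and every numerical value is a hypothetical placeholder chosen purely for illustration.

```python
# Drake-style back-of-the-envelope estimate of how many planets in the Milky Way
# might host some form of life. Every value below is a hypothetical placeholder,
# not a measured figure.

stars_in_galaxy = 2e11               # rough number of stars in the Milky Way (assumed)
fraction_with_planets = 0.5          # fraction of stars hosting planetary systems (assumed)
habitable_planets_per_system = 0.1   # average habitable planets per system (assumed)
fraction_where_life_arises = 0.01    # fraction of habitable planets that develop life (assumed)

expected_life_bearing_planets = (
    stars_in_galaxy
    * fraction_with_planets
    * habitable_planets_per_system
    * fraction_where_life_arises
)

print(f"Expected life-bearing planets: {expected_life_bearing_planets:,.0f}")
```

Swap in different guesses and the expectation swings by orders of magnitude, which is exactly the point: this kind of "belief" is only as strong as the evidence behind each factor, and better data revises it up or down rather than confirming an article of faith.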

Quite by coincidence I picked up Arthur C. Clarke’s The Songs of Distant Earth last Saturday (in part a novel about what it would really take to travel to other planets if we had to, and a serendipitous find that gave me the title for this post), and a few hours later snagged a copy of the December 2009 National Geographic, with its article “Seeking New Earths,” by Timothy Ferris. The article describes efforts to locate earth-like planets in orbit around sun-like stars, because they’re the most likely to have developed the kind of life that evolved here. The statistical likelihood of life seems to be pretty high, but of life as we know it, not so much. As Ferris points out, “Biological evolution is so inherently unpredictable that even if life originated on a planet identical to Earth at the same time it did here, life on that planet today would almost certainly be very different from terrestrial life” (93).

The issue also conveniently includes a fine debunking of the 2012 myths: a fitting way to celebrate the publication of Origin of Species, I think, and a reminder that it might be possible to emerge from our quagmire of silliness and start to make better use of the brains we’ve got, god-given or not. However much I would like there to be extraterrestrial life, unless somebody discovers an error in Einstein’s cosmic speed limit, our fine, big brains will have to depend on what Timothy Ferris calls the richness of the human imagination to discover new planets, to articulate new theories to explain what’s happening to our home planet, and to suggest how we might fix it before we’re forced to go looking for a new one.

Whether or not we believe in a god, our future depends on embracing science not as a substitute for religion, but as a means to understanding the world. Such understanding does not preclude the possibility of faith, as many established religions have shown. Science and religion are not incompatible, unless one seeks absolute certainty (which science does not afford) or reads scripture literally. Even as a child, when I actually did believe in several deities in sequence, I couldn’t help but marvel at the genius of any being who could have invented the complex and labyrinthine process that produced us all. Both our origins and our fate are far more interesting and astounding when seen through a microscope or a telescope than when viewed through a veil of ignorance and wishful thinking.

Image credits: Nick Risinger's conception of the Milky Way galaxy, from Wikimedia Commons, where you can also find an annotated version that shows where we live. The painting of Darwin (also from the Commons) is by George Richmond; Darwin sat for the portrait not long after he returned from his voyage on the Beagle.

Wednesday, November 4, 2009

The Multitasking Myth

I am not generally fond of using "myth" in the sense of "lie" or "untruth widely accepted as truth." (I much prefer the notion of myth as cultural storytelling and as a means of preserving cultural identity.) But since this particularly negative sense of the word is being applied to a phenomenon I find extremely deleterious to learning, I'm ready to go along on this one.

So here's the big question. Is it possible for people in general (and young people in particular) to engage in two or more activities at the same time and do any of them well enough to accomplish the purposes attached to each?

The answer seems to be, according to a resounding majority of researchers, "no."

My General Studies colleagues and I have been saved from some measure of dispute by the fact that our department has banned the use of extraneous technologies in the classroom: no laptops, no iPods, no mobile phones, no recording devices, at least not without documentation supporting an ADA accommodation.

Students occasionally try to circumvent the policy by texting under the desk, but their demeanor is so obvious that I usually catch them and suggest strongly that they put the damned phone away and pay attention. As you can imagine, I'm very well loved for this. But it does make me wonder about students' priorities when they can't shut the "communication" devices off for an hour and a half.

I recently came across Rebecca Clay's article for the APA's online journal, Monitor on Psychology, "Mini-multitaskers." Her thesis is pointed, and reveals the fallacy inherent in the notion that young folk can work effectively on several assignments or tasks at a time.

Multitasking may seem modern and efficient, but research suggests that it slows children's productivity, changes the way they learn and may even render social relationships more superficial.

She provides evidence from recent studies that multitaskers do not save time, but actually take longer to accomplish individual tasks because they're not really doing them simultaneously; they're switching back and forth, and the switching adds time to the job rather than reducing it. And the more complex the job, the more time is lost.

As for my favorite in-class bit of naughty behavior, Clay notes that

Text messaging during class isn't just a high-tech version of passing notes. Because of its demands on attention, multitasking also may impair young people's ability to learn.

This is because, as research out of UCLA indicates, information is processed differently and less effectively when one is multitasking than when one devotes one's full attention to an activity. This may, in turn, exacerbate the problem already bequeathed to us by the current emphasis on rote learning in elementary and high school. Dividing attention seems to make it harder for students to truly understand what is being taught; instead, they're more likely to respond by rote, able only to barf back what they've been told without digesting it. Sorry for the ugly metaphor, but it's apt.

So, if we combine the short attention spans instilled in our kids by Sesame Street (90 seconds) and television commercials (30 seconds), with the superficiality of rote learning and a constant barrage of digital media (cell phones, iPods, Twitter, Facebook), why doesn't every student have ADHD?

According to Tamara Waters-Wheeler, a North Dakota school psychologist quoted in the article, attention problems are increasing; even if students aren't diagnosed with attention-deficit hyperactivity disorder, they're nonetheless exhibiting its symptoms because they've grown up multitasking.

The evidence also suggests that the communities established by text messaging and social networking may be much more superficial than face-to-face friendships. This is particularly important because, as media-mediated friendships increase, they will change the way we look at community, friendship, and other human interactions. The implications for cultural change are manifold. Clay quotes developmental psychologist Patricia Greenfield, who finds this multitasking version of friendship troubling:

We evolved as human beings for face-to-face interaction. As more and more interaction becomes virtual, we could lose qualities like empathy that are probably stimulated by face-to-face interaction.

Clay's is only one of myriad articles and blog posts devoted to the emerging problems associated with short attention spans and media addiction. One of the best is from Blogger in Middle Earth, the blog of Ken Allen, a New Zealand educator whose post on "Binge Thinking" gets at the meat of the matter, which he describes as "cognitive overload."

In order to deal with this phenomenon, it's probably time to start rethinking how we present material to our students. Even in a fast-paced program like ours (cramming a semester's worth of information and learning into eleven weeks), and even in the face of ever more intense scrutiny by the assessment regime, we've simply got to figure out how to slow things down enough to help students change habits that they've acquired over the last twenty years.

Some of what I've read has inspired me to reduce the amount of information and increase the depth to which we explore it each quarter. This is a tough slog for fact- and information-based courses like art history, but in order to address problems associated with superficial learning (and its potential effect on creativity, which I'll address in a later post), some sort of new approach seems necessary. My students are already hinting at a solution: more workshops. My course-evaluation comments are rife with requests for more hands-on activities to connect theory and practice.

I can assure you, however, that I will address the issues one at a time, and use one medium at a time in order to develop some useful strategies. No multitasking will be employed in pursuit of solutions.

As they say in the funnies, "Stay 'tooned."

Image credit: Printed books will always be my medium of choice, but the updated British Museum Reading Room combines the old and the new so beautifully that I thought I'd encourage readers to check it out. The full-resolution photo is available here. It was taken in February 2006 by Diliff and uploaded in its present form to Wikimedia Commons in November 2008. When I mentioned the British Library last week, and wondered what the new library looked like, I hadn't realized that the old reading room still existed at its original site. Because of its marriage of printed codex and digital technology, this seemed like an appropriate illustration for this post. No short attention spans allowed in this place!