Category Archives: Literature

Revisiting Richard III

by Doug McCarty

Garage Sale Books.

I remember when I was quite young, not quite eight years old, and I visited a neighborhood garage sale. I liked to read and was fairly precocious, so it was natural for me to be drawn to a box of old books in a corner, barely visible among all the various knickknacks and junk. Most of what was in the box was pretty uninteresting, the usual Reader’s Digest collections in hardback and the like. I did find something, though, that caught my eye—a tattered three-volume collection of Shakespeare’s works. I think I paid a dime for all three books.

William Shakespeare.

Now, what is a seven-year-old going to do with Shakespeare? Not that much, at first. The collection sat on the bookshelves in my room until I got a little older. Once I started, however, I read the entire set, all the plays, all the poems. I may not have understood everything I was reading all that well, but I kept on and on.

Certain plays held my attention more than others, Richard III especially. Any of you who have read the play or seen one or more of the numerous film adaptations know well the historical description of this king as a deformed hunchback with a withered arm who had his nephews (the sons of his late brother Edward IV) murdered in the Tower of London and who seized the crown through nefarious means. This description bothered me from the beginning, although I could never put my finger on why. Shakespeare, of course, could hardly write about the late king in a positive light, given that in 1591, when the play was probably written, the Tudor line that overthrew and killed Richard was still in charge in England. It always seemed strange that such an enigmatic figure was written in such a way, particularly when this character was given some of the finest lines in all the plays.

Sir Laurence Olivier as Richard III in 1955.

Years went by, and I reviewed the play from time to time, then very little after graduate school. In recent years, I discovered that there was a revival of interest in Richard, especially among those who wanted to reform the negative image created chiefly by Sir Thomas More and dramatized by Shakespeare. After a few moments of online research, I realized that I had been sleeping while new data and information on Richard was being uncovered. Unknown to me, there are several societies and research projects devoted to discovering the truth about Richard, who is, by the way, the current Queen of England’s 14th great-grand-uncle. There is a bunch of new stuff out there, but I will avoid boring anyone with all of it, even though it excites me.

Sir Ian McKellen as a historically reimagined Richard III in 1995.

The most interesting thing I found out is that Richard III's remains, lost for over five centuries, were discovered in 2012 underneath a parking lot at the site of the medieval Greyfriars Church, where he was reputed to have been interred. The skeleton was identified, according to researchers, beyond a reasonable doubt as belonging to Richard.

This reasonably positive identification was made through comparison of his mitochondrial DNA with that of two matrilineal descendants of Richard's eldest sister, Anne of York. The spine showed considerable deformity consistent with scoliosis, which may have given Richard the appearance of having one shoulder higher than the other.

The Skeleton in the Grave.

The skeleton sustained at least ten wounds consistent with battlefield injuries of Richard's day. He was killed by the forces of Henry Tudor at the Battle of Bosworth Field in 1485, which ended the Wars of the Roses and is typically considered to mark the end of the Middle Ages in England. There are plenty of gruesome details and conjectures about the events leading up to his death, and I invite you to do a web search for yourself, if you are at all inclined.

Portrait of Richard III in the National Portrait Gallery.

At this point, the team from the University of Leicester that participated in Richard III's discovery, among other players, is planning to sequence his genome, which has set off a flurry of controversy and protests, even involving Buckingham Palace.

Although history has been unkind to Richard, perhaps justly so in some instances, he was by many accounts an able and talented military commander, a king who reputedly enacted laws to protect the poor, and a noble who lived at the very least according to the principles of his time. He remains an enigma and perhaps always will, but that is what makes him so interesting, at least to me.

It’s So Bad It’s Good

By Claude Tate

I devoted one of my blog entries last fall to a movie, Raise the Red Lantern,  which I felt was not only an excellent movie in and of itself, but a movie of educational value in that it provided a window onto traditional Confucian society in early 20th century China.  In fact, I liked it so much I’ve used it a number of times in classes and recommended it on numerous occasions to my BLS students.

Gong Li in Raise the Red Lantern

This blog entry is also devoted to a movie, but a very different kind of movie. This movie is far removed from excellent. It’s a bad movie. A very bad movie. The general consensus is that it is the worst movie ever made. In fact, it is so bad that even some critics see it as good. For example, Phil Hall, writing on the film review site Rotten Tomatoes, said the exceptionally poor quality of the movie made him laugh so much that he could not put it at the top of his ‘worst of’ list. Another source, Videohound’s Complete Guide to Cult Flicks and Trash Pics, states that, “In fact, the film has become so famous for its own badness that it’s now beyond criticism.”

Also, this movie does not have the clear educational value that “Lantern” has. But if one defines educational value somewhat loosely—strike that, very loosely—it is not without some value. In fact, I used it on several occasions in a face-to-face class I used to teach entitled “US History Since 1945”. Since we had time in that class to go beyond the high points that one is limited to in the broad surveys, I tried to include things that would allow students to re-imagine what everyday life was like. Toward that end, when I covered the 1950s, among other things, I brought in clips of some of the old classic TV shows as well as some movies. The atomic bomb and the possibility of nuclear annihilation became a part of our lives during the ’50s, so a number of movies were made that were built around that theme. Some were good, while others were not so good. We also became very much aware of space during the ’50s, so a number of movies were made devoted to that theme. Again, some were good, while others were bad. While I mainly used clips, when there was time I would try to work in an entire movie using one or both of those themes. I tried a couple of the really good ones, but film-making has changed quite a bit since then, so students didn’t seem to appreciate the quality of what they were seeing. So I decided to look for stinkers. Luckily, not only did I find a stinker, but the stench from this ‘masterpiece’ could encircle the earth several times over. Students generally really liked the movie, so I thought I would suggest it here.

Plan 9 From Outer Space original poster

It is (music please) Plan 9 From Outer Space (made in 1956, released in 1959). It is a horror movie that incorporates an invasion-from-outer-space theme, as aliens plan to conquer the earth by raising the dead against us. It was conceived, produced, written, and directed by the infamous Edward D. Wood, Jr. After years of criticism, in 1980 Michael Medved and Harry Medved named it the “worst movie ever made” and awarded it their Golden Turkey Award. That same year Ed Wood (who died in 1978) was also posthumously awarded the Golden Turkey Award as the worst director ever. I don’t know whether any of the actors in the movie received any ‘worst’ awards for their performances, but a number of them should have. It also played at a Worst Films Festival in New Orleans, figured prominently in an episode of Seinfeld, and was the centerpiece of the movie Ed Wood (1994), which was directed and produced by Tim Burton and starred Johnny Depp.

Johnny Depp as Ed Wood

For those of you who haven’t seen the movie, I don’t want to spoil the beauty of it by saying too much about the number of ‘gems’ it contains. But a few hints will not hurt. Bela Lugosi was going to make a comeback with this movie, but he died after only a few test shots. Ed ‘ingeniously’ included those few shots at a number of points in the movie. For other shots, his wife’s chiropractor, who looked nothing like Lugosi, was the stand-in. That is why the DVD, released through Image Entertainment, states “Almost Starring Bela Lugosi” on the cover. The special effects are also a thing of beauty, as the flying saucers are campy even for the ’50s. And one just has to love how night turns to day and back to night in back-to-back scenes. But if you wish to know more, a plot summary can be found on “The Movie Club Annuals…” website.

I own the DVD. I just had to have a physical copy. But the movie is in the public domain and can be accessed on YouTube. You should also be able to download it from other sources.

Ed Wood

By the way, Ed Wood made movies before and after this one. I have not seen them, so I cannot attest to their quality, but given the talent for movie making he showed in Plan 9, and given their titles, I would assume they are bad also. But movies evidently weren’t his only passion. He wrote a large number of books, which I have not read and do not intend to, but from the sampling of titles I’ve seen, he seems to have been just as good at writing books as at making movies. Ed also led an interesting personal life, which you get some hint of in the movie Ed Wood. If you want to find out more about Ed or his other “artistic” endeavors, you’re on your own. I’m only recommending Plan 9 From Outer Space.

If you plan to watch Plan 9, I would suggest you watch Ed Wood first. I think it will help you understand and appreciate both Ed and Plan 9, since it focuses on Ed’s early career and the making of this ‘masterpiece’. By the way, Ed Wood is a very good movie. It won two Academy Awards, one for Best Supporting Actor (Martin Landau) and one for Best Makeup (Rick Baker, Ve Neill, and Yolanda Toussieng). Unfortunately, this movie is not in the public domain, so you will need to rent it.

Vampira (Maila Nurmi) in Plan 9 From Outer Space

I would also suggest you watch it with friends. It’s always more fun to see an awful flick with friends, as they may see things to make fun of that you may miss. So I guess it has value beyond just its dubious educational value: it’s a great excuse for friends to get together.

All Hallows Eve…and Errors

by Matt McKinnon

All Hallows Eve, or Hallowe’en for short, is one of the most controversial and misunderstood holidays celebrated in the United States—its controversy owing in large part to its misunderstanding.  More so than the recent “War on Christmas” that may or may not be raging across the country, or the most important of all Christian holidays—Easter—blatantly named after the pagan goddess (Eostre), Halloween tends to separate Americans into those who enjoy it and find it harmless and those who disdain it and find it demonic.  Interestingly enough, both groups tend to base their ideas about Halloween on the same erroneous “facts” about its origins.

A quick perusal of the internet (once you have gotten by the commercialized sites selling costumes and the like) will offer the following generalizations about the holiday, taken for granted by most folks as historical truth.

Common ideas from a secular and/or Neopagan perspective:

  • Halloween developed from the  pan-Celtic feast of Samhain (pronounced “sah-ween”)
  • Samhain was the Celtic equivalent of New Year’s
  • This was a time when the veil between the living and dead was lifted
  • It becomes Christianized as “All Hallows Day” (All Saints Day)
  • The eve of this holy day remained essentially Pagan
  • Celebrating Halloween is innocent fun

Common ideas from an Evangelical Christian perspective (which would accept the first five of the above):

  • Halloween is Pagan in origin and outlook
  • It became intertwined with the “Catholic” All Saints Day
  • It celebrates evil and/or the Devil
  • It glorifies death and the macabre
  • Celebrating Halloween is blasphemous, idolatrous, and harmful

Even more “respectable” sites like those from History.com and the Library of Congress continue to perpetuate the Pagan-turned-Christian history of Halloween despite scarce evidence to support it, and considerable reason to be suspicious of it.

To be sure, like most legends, this “history” of Halloween contains some kernel of fact, though, again like most things, its true history is much more convoluted and complex.

The problem with Halloween and its Celtic origins is that the Celts were a semi-literate people who left only some inscriptions: all the writings we have about the pre-Christian Celts (the pagans) are the product of Christians, who may or may not have been completely faithful in their description and interpretation.  Indeed, all of the resources for ancient Irish mythology are medieval documents (the earliest being from the 11th century—some 600 years after Christianity had been introduced to Ireland).

It may be the case that Samhain indeed marked the Irish commemoration of the change in seasons “when the summer goes to its rest,” as the Medieval Irish tale “The Tain” records.  (Note, however, that our source here is only from the 12th century, and is specific to Ireland.)  The problem is that the historical evidence is not so neat.

A heavy migration of Irish to the Scottish Highlands and Islands in the early Middle Ages introduced the celebration of Samhain there, but the earliest Welsh (also Celtic) records afford no significance to the same dates.  Nor is there any indication that there was a counterpart to this celebration in Anglo-Saxon England from the same period.

So the best we can say is that, by the 10th century or so, Samhain was established as an Irish holiday denoting the end of summer and the beginning of winter, but that there is no evidence that November 1 was a major pan-Celtic festival, and that even where it was celebrated (Ireland, the Scottish Highlands and Islands), it did not have any religious significance or attributes.

If the supposed Celtic origins of the holiday are uncertain, its “Christianization” by a Roman Church determined to stamp out ties to a pagan past is even more problematic.

It is assumed that because the Western Christian churches now celebrate All Saints Day on November 1st—with the addition of the Roman Catholic All Souls Day on November 2nd—there must have been an attempt by the clergy of the new religion to co-opt and supplant the holy days of the old.  After all, the celebrations of the death of the saints and of all Christians seem to directly correlate with the accumulated medieval suggestions that Samhain celebrated the end and the beginning of all things, and recognized a lifting of the veil between the natural and supernatural worlds.

The problem is that All Saints Day was first established by Pope Boniface IV on 13 May, 609 (or 610) when he consecrated the Pantheon at Rome.  It continued to be celebrated in Rome on 13 May, but was also celebrated at various other times in other parts of the Western Church, according to local usage (the medieval Irish church celebrated All Saints Day on April 20th).

Its Roman celebration was moved to 1 November during the reign of Pope Gregory III (d. 741), though with no suggestion that this was an attempt to co-opt the pagan holiday of Samhain.  In fact, there is evidence that the November date was already being kept by some churches in England and Germany as well as the Frankish kingdom, and that the date itself is most probably of Northern German origin.

Thus the idea that the celebration of All Saints Day on November 1st had anything to do either with Celtic influence or Roman concern to supersede the pagan Samhain has no historical basis: instead, Roman and Celtic Christianity followed the lead of the Germanic tradition, the reason for which is lost to history.

The English historian Ronald Hutton concludes that, while there is no doubt that the beginning of November was the time of a major pagan festival that was celebrated in all of the pastoral areas of the British Isles, there is no evidence that it was connected with the dead, and no proof that it celebrated the new year.

By the end of the Middle Ages, however, Halloween—as a Christian festival of the dead—had developed into a major public holiday of revelry, drink, and frolicking, with food and bonfires, and the practice of “souling” (a precursor to our modern trick-or-treating?) culminating in the most important ritual of all: the ringing of church bells to comfort the souls of people in purgatory.

The antics and festivities that most resemble our modern Halloween celebrations come directly from this medieval Christian holiday: the mummers and guisers (performers in disguise) of the winter festivals also being active at this time, and the practice of “souling” where children would go around soliciting “soul cakes” representing souls being freed from purgatory.

The tricks and pranks and carrying of vegetables (originally turnips) carved with scary faces (our jack o’ lanterns) are not attested to until the nineteenth century, so their direct link with earlier pagan practices is sketchy at best.

While the Celtic origins of Samhain may have had some influence on the celebration of Halloween as it began to take shape during the Middle Ages, the Catholic Christian culture of the Middle Ages had a much more profound effect, as the ancient notion of the spiritual quality of the dates October 31st/November 1st became specifically associated with death—and later with the macabre of more recent times.

Thus modern Halloween is more directly a product of the Christian Middle Ages than it is of Celtic Paganism.  That some deny its rootedness in Christianity, or deride its essence as pagan, is more an indication of how these groups feel about medieval Catholic Christianity than about Celtic Paganism (about which we know so very little).

And to the extent that we fail to realize just how Christian many of these practices and festivities were, we fail to see how successful the Reformation and the later movements of pietism and rationalism have been in redefining exactly what “Christian” is.

As such, Halloween is no less Christian and no more Pagan than either Christmas or Easter.

Happy trick-or-treating!

How Free Should Freedom of Speech Be?

by Matt McKinnon

Only now, weeks after the blatantly anti-Islamic film “Innocence of Muslims” was posted on YouTube, making headlines and spawning violent reactions from Muslims across the globe, have tensions begun to ease a bit.  Oh, to be sure, mainstream media and American attention have moved on, only to return when the next powder keg blows, while much of the rest of the world is left to grapple with serious questions of rights and responsibilities in this new age of technology.

Riots in Libya in response to “Innocence of Muslims”

These recent riots across the Muslim world bring into high relief serious questions about the freedom of speech, for us as citizens of the United States, as well as participants in a global society made smaller and smaller with the advance of technology.

The problem, at first glance, seems like the usual violent overreaction by Muslim extremists whose narrow view of religion and politics seeks only to protect their view at the expense of the rights of others who may disagree.

Salman Rushdie

We are quickly reminded of the fatwa pronounced on Salman Rushdie for the horrific action of writing a novel (itself a work of fiction), as well as the assassination of Dutch director Theo van Gogh for his film “Submission,” about the treatment of women in Islam.

But a deeper look into this issue, instead of bringing clarity and self-assured anti-jihadist jihad, reveals a real problem that is not so black and white, not so clearly one of the fundamental right of freedom of speech versus ignorant fundamentalism, but rather one that is complicated, with subtleties and nuances not conducive to entertainment-news sound bites and the glib remarks of politicians.

The question becomes, not so much what freedoms we as U.S. citizens have with respect to our Constitution and domestic laws, but rather what extent should we, as participants in a larger global society, respect the various notions of freedom of speech at work in other countries and cultures different from our own.

Theo van Gogh

For, when we look closer at the latest incident itself, instead of finding the literary musings of a great novelist or the social critique of a world-class film director, we find a joke of a movie—though not really a movie at all: more like a few scenes of such low quality and disconnect that one doubts the existence of a larger work.  Some have called it “repugnant,” but what makes it so is not its content, which can scarcely be taken seriously, but rather the intention behind it.

The title itself (Innocence of Muslims) makes little sense—if predicated of the film’s contents.  However, if the title was a display of ironic/sarcastic foresight, then it fits all too well.  It would be hard to find anyone in the Western world who would take it seriously.  There is no plot, no character development, in fact, no real characters—only thinly disguised stereotypes meant to offend.

And that—the intention to offend—seems to be the only real purpose of the work itself.

Now it must be said that intention to offend is not, in and of itself, always a bad thing.  And the intention to offend religion especially is not either.  (I find myself doing both as often as possible.)  The problem arises when the intention to offend is also an intention to provoke, not discussion and debate, a la Rushdie and van Gogh, but violence and riot.

A few details are warranted:

As it turns out, the video was posted on YouTube in early July, 2012 as “The Real Life of Muhammad” and “Muhammad Movie Trailer,” but received little or no attention.  It was then dubbed into Arabic, re-titled with the aforementioned ironic/sarcastic name, and broadcast on Egyptian television on September 9th—two days before the eleventh anniversary of 9/11.

And the rest, as they say, is history.

(Except that it’s still going on, and will continue to escalate in the future.)

Holocaust survivors in Skokie, IL

So why is this case NOT the same as that of Rushdie and van Gogh, overlooking of course the artistic and social merit of those two?  Well, it strikes me that this incident is less like writing a book or making a movie criticizing specific points of any (and all) religions, and more like yelling “Fire!” in a crowded theatre or assembling Nazis to march through Jewish neighborhoods in Skokie, IL in 1977.

The first are clearly examples of protected free speech, the second is not, and the third is still hotly debated 35 years after the case went to court.

Here is where American jurisprudence only helps so far.  For U.S. laws are clear that free speech can be limited if the immediate result is incitement to riot.  (This is what the city of Skokie argued—and lost—in their case against the Nazis: that their uniformed presence in a neighborhood with a significant number of Holocaust survivors would incite riot.)

Members of Westboro Baptist Church

To complicate matters, the freedom of speech laws are even more restrictive in other countries—and not just those “narrow-minded,” theocratic Muslim ones either.  In fact, Canada as well as much of Europe has free-speech laws that significantly restrict its exercise.   For example, the infamous members of Westboro Baptist Church, who recently won a Supreme Court case supporting their right to picket the funerals of U.S. soldiers, are not allowed to protest north of the border in Canada, as that country’s hate speech laws forbid such activity.  (Many in the U.S. disagreed with SCOTUS’s ruling and were in favor of restricting free speech in this case.)

And in most European countries, publicly denying the Holocaust is a crime.  As is racist hate speech (just ask Chelsea footballer John Terry).

The production and “marketing” of “Innocence of Muslims” might or might not meet the criteria of hate-speech, but that is neither my concern nor my point.

The fact that the video was produced with the intention to inflame Muslims, and that when it initially failed to do so it was dubbed into Arabic and specifically presented to Egyptian television to be broadcast to millions, makes it hard to deny that its real intention was to promote and incite riot.

The missing pieces here are the internet and technology—and the laws that have failed to keep up with them.  For with social media, YouTube, smart phones, iPads, Skype, etc…, placing an inflammatory video in the right hands with the capability to reach millions almost instantaneously just may be the 21st century version of standing on a street corner inciting folks to riot.

In fact, it may be worse.

Not recognizing this, I fear, will have even more dire consequences in the future.

__________

Editor’s note: The usual practice in the BLS Program is to provide direct links to primary sources when possible. However, in the case of the video discussed in this entry, we decided it would be imprudent to link to it directly. If you want to view the video for yourself, a search of the title will lead you to the original posting, various repostings, and sundry articles and editorials about the video and its aftermath.

Spiders and Toads

By Marc Williams

Laurence Olivier as Richard III.

“Now is the winter of our discontent
Made glorious summer by this sun of York.”
~Richard III (Act 1, scene 1).

King Richard III is among Shakespeare’s greatest villains. Based on the real-life Richard of Gloucester, Shakespeare’s title character murders his way to the throne, bragging about his deeds and ambitions to the audience in some of Shakespeare’s most delightful soliloquies. Shakespeare’s Richard is famously depicted as a hunchback, and he uses his physical deformity as justification for his evil ambitions:

Why, I, in this weak piping time of peace,
Have no delight to pass away the time,
Unless to spy my shadow in the sun
And descant on mine own deformity:
And therefore, since I cannot prove a lover,
To entertain these fair well-spoken days,
I am determined to prove a villain
And hate the idle pleasures of these days.

For stage actors, Richard III is a tremendously challenging role. On one hand, he is pure evil—but he must also be charming and likeable. If you aren’t familiar with the play, its second scene features Richard successfully wooing Lady Anne as she grieves over her husband’s corpse! And Richard is her husband’s killer! Shakespeare’s Richard is both evil and smooth.

Simon Russell Beale as Richard III.

Actors must also deal with the issue of Richard’s physical disability. For instance, Richard is described as a “poisonous bunch-back’d toad,” an image that inspired Simon Russell Beale’s 1992 performance at the Royal Shakespeare Company, while Antony Sher’s iconic 1984 interpretation was inspired by the phrase “bottled spider,” an insult hurled at Richard in Act I.

Antony Sher’s “bottled spider” interpretation of Richard III.

While much of the historical record disputes Shakespeare’s portrayal of Richard as a maniacal mass-murderer, relatively little is known about Richard’s disability. According to the play, Richard is a hunchback with a shriveled arm. However, there is little evidence to support these claims.

This uncertainty may soon change. Archaeologists in Leicester, England have uncovered the remnants of a chapel that was demolished in the 16th century. That chapel, according to historic accounts of Richard’s death at the Battle of Bosworth, was Richard’s burial site. Not only have researchers found the church, but they have also located the choir area, where Richard’s body was allegedly interred. And indeed, last week, the archaeologists uncovered bones in the choir area:

If the archaeologists have indeed found the remains of Richard III, the famous king was definitely not a hunchback. It appears he suffered from scoliosis—a lateral curve or twist of the spine—but not from kyphosis, a different kind of spinal curvature that leads to a pronounced forward-leaning posture. As Dr. Richard Taylor explains in the video, the excavated remains suggest this person would have appeared to have one shoulder slightly higher than the other as a result of scoliosis.

Interestingly, Ian McKellen’s performance as Richard III, captured in Richard Loncraine’s 1995 film, seems to reflect the kind of physical condition described by Dr. Taylor, with one shoulder slightly higher than the other. At the 6:45 mark in this video, one can see how McKellen dealt with Richard’s condition.

So it appears Shakespeare not only distorted historical details in Richard III, he also apparently distorted the title character’s shape. Of Shakespeare’s Richard, McKellen wrote:

Shakespeare’s stage version of Richard has erased the history of the real king, who was, by comparison, a model of probity. Canny Shakespeare may well have conformed to the propaganda of the Tudor Dynasty, Queen Elizabeth I’s grandfather having slain Richard III at the Battle of Bosworth. Shakespeare was not writing nor rewriting history. He was building on his success as the young playwright of the Henry VI trilogy, some of whose monstrously self-willed men and women recur in Richard III.

It seems likely that Shakespeare wanted Richard to seem as evil as possible in order to flatter Queen Elizabeth I, depicting her grandfather as England’s conquering hero. But why distort Richard’s physical disability as well?

In describing Richard’s body shape, it is difficult to ascertain what Shakespeare’s motives might have been and perhaps even more difficult to assess his attitudes toward physical difference in general. For example, in my “Big Plays, Big Ideas” class in the BLS program, we discuss the issue of race in Othello, even though we don’t know much about what Shakespeare thought about race. Many scholars have investigated the subject of physical difference in Shakespeare, of course: there are papers on Richard’s spine, naturally, but also Othello’s seizures, Lavinia’s marginalization in Titus Andronicus after her hands and feet are severed, the depiction of blindness in King Lear, and even Hermia’s height in A Midsummer Night’s Dream. And just as one must ask, “is Othello about race,” we might also ask, “is Richard III about shape?” I doubt many would argue that physical difference is the primary focus of Shakespeare’s Richard III, but it will be interesting to observe how the apparent discovery of Richard’s body will affect future performances of the play. Will actors continue to twist their bodies into “bottled spiders,” or will they focus on the historical Richard’s scoliosis—and perhaps ask why such vicious language is used to describe such a minor difference?

Nimrod: What’s In a Name?

by Matt McKinnon

My teenage son is a nimrod. Or so I thought.

And if you have teenagers, and were born in the second half of the twentieth century, you have probably thought at one time or another that your son (or daughter) was a nimrod too, and would not require any specific evidence to explain why.

Of course this is the case only if you are of a certain age: namely, a Baby Boomer or Gen Xer (like myself).

For if you are any older, and if you are rather literate, then you would be perplexed as to why I would think that my son was a nimrod, and why I was not capitalizing Nimrod as it should be, since it is, after all, a proper noun.

It is?

Yes, it is.  Or rather it was.  Let me explain.

It turns out, the word “nimrod” (or more properly “Nimrod”) has a fascinating history in which it undergoes a substantial reinterpretation.  (Any nimrod can find this out by searching the web, though there is precious little explanation there.)   This, by itself, isn’t surprising, as many words that make their way through the ages transform as well.  But the transformation of “Nimrod” to “nimrod” is particularly interesting in what it tells us about ourselves and our culture.

Nimrod, you see, was a character from the Hebrew Scriptures, or as Christians know it, the Old Testament:

“Cush became the father of Nimrod; he was the first on earth to become a mighty warrior. He was a mighty hunter before the LORD; therefore it is said, ʻLike Nimrod a mighty hunter before the LORD.ʼ”  (Genesis 10:8-9 NRSV)

This is the manner in which older, biblically literate folks will understand the term: as a mighty hunter.

But there’s more here, for these folks also might understand the term as referencing a tyrannical ruler.

Why?  Well, the etymology of the word links it to the Hebrew “to rebel,” not for anything that Nimrod actually does in the Old Testament, but because, as many scholars attest, it is probably a distortion of the name for the Mesopotamian war-god Ninurta.  And the later chroniclers of Israelite religion didn’t have much sympathy for the polytheism of their Mesopotamian neighbors—especially when it so obviously informed their own religious mythology.

So the word, when it very early on enters the biblical narrative, already shows signs of transformation and tension as referencing both a mighty hunter as well as someone rebellious against the Israelite god.

In fact, Jewish and Christian traditions name Nimrod as the leader of the folks who built the Tower of Babel, though this is not found anywhere in the scriptures.  This, then, is how Nimrod is now portrayed in more conservative circles, despite the lack of biblical attestation.

And as the word is already attested in Middle English, by the 16th century it was clearly being used in both manners in the English language: as a tyrant and as a great warrior or hunter.

Now I can assure you, neither of these describes my teenage son.  So what gives?

Well, “Nimrod” shows up in a 1932 Broadway play (which ran for only 11 performances) about two lovesick youngsters:

“He’s in love with her. That makes about the tenth. The same old Nimrod. Won’t let her alone for a second.”

Here, however, the emphasis is still on the term’s former meaning as a hunter, though its use in the play to describe a somewhat frivolous and hapless fellow who moves from one true love to the next points us in the right direction.

And in a 1934 film You’re Telling Me, W.C. Fields’ character, a bit of a buffoon himself (and a drunkard), takes a few swings of a limp golf club and hands it back to his dim-witted caddy, saying in a way only W.C. Fields could:

“Little too much whip in that club, nimrod.”

So here we have the first recorded instance of the word’s transformation from a great hunter or tyrant to a stupid person or jerk.

But that’s not the end of the story.  After all, how many of us have seen You’re Telling Me?  (I haven’t, at least, not until I did the research.)

So the last, and arguably most important piece to the puzzle is not the origination of the word or its transformation, but rather the dissemination of it.

And that, as I’m sure many of you are aware, is none other than TV Guide’s greatest cartoon character of all time: Bugs Bunny, who debuted in the 1940s, not long after You’re Telling Me premiered.

In this context, the one most folks born after World War II are familiar with, Bugs Bunny refers to the inept hunter Elmer Fudd as a “little nimrod.”  And the rest, as they say, is history.

For what emerges from Bugs’ usage is not the traditional reference to Fudd as a hunter (though this is the obvious, albeit ironic, intention), but rather Fudd’s more enduring (and endearing?) quality of ineptitude and buffoonery.

And anyone who has (or knows) a teenager can certainly attest to the applicability of this use of the term in describing him or her.

But the important thing is what this says about literacy and our contemporary culture.

For whereas my parents’ generation and earlier were more likely than not to receive their cultural education from Classical stories, the great literature of Europe, and the Bible, those of us born in the latter half of the 20th century and later are much more likely to receive our cultural education from popular culture.

I have seen this firsthand when teaching, for example, Nietzsche’s “Thus Spake Zarathustra,” which offers a critique of Judaism and Christianity by parodying scripture.  The trouble is, when students don’t know the referent, they can’t fully understand or appreciate the allusion.  And this is as true of Shakespeare and Milton as it is of Nietzsche…or Bugs Bunny for that matter.

And the ramifications of this are far greater than my choosing the proper term to criticize my teenage son.

(Though ya gotta admit, “nimrod” sounds pretty apropos.)

Choose Your Own Adventure

By Carrie Levesque

Recently in the Russian Novel of Conscience course we have been discussing Yevgeny Zamyatin’s 1921 dystopian novel We, about a highly mechanized and regimented totalitarian society (the One State) hundreds of years in the future where citizens have achieved the ultimate happiness: unfreedom.  Taking as our starting point Marx’s claim that machines, invented to help man, have become the symbols of his servitude, we debate the extent to which machines and technology have enslaved or liberated men in today’s world.

As a class, we’ve compiled a pretty good list of technology’s benefits (efficiency, convenience, online degree programs!) and costs (myriad media addictions, a privileging of online relationships at the expense of face-to-face ones).  At midterm, several students have written excellent papers on how we have created a sort of One State within the United States through certain government policies and technologies which reduce rather than foster our individuality and humanity.

Many of these discussions have stirred up nostalgia for simpler times, when it seems people had different values and a different relationship to one another.  They’ve made me think about a book I read recently on a more extreme response to this question, the back-to-the-land movement (which is a great deal more complex than just ‘living simply,’ but I’m limited to 800 words…).

I grew up in a remote area of northern Maine that has always attracted Back-to-the-landers.  What possesses these diehards who apparently find the southern Maine homesteads of the followers of Scott and Helen Nearing not austere or isolated enough, that they would haul their few remaining possessions to the place where the logging roads end and call it home, I can’t say for sure.

But I’ll admit to having a touch of that idealism myself: to unplug, to live off the land, to disconnect from our nonstop media and rampant consumerism of all the latest technology (though you’d have to be insane to choose the wilds of northern Maine, a place with two seasons: Brutal Winter and Rainy Black Fly Infestation).  Though I now prefer a more comfortable climate, I understand the appeal of living in a beautiful, natural setting, devoting most of one’s time to work in the outdoors without a care for whatever new technology or entertainment the rest of the world is enthralled with.

Coleman Family

And yet, through a closer examination of life ‘off the grid,’ I’ve also come to a greater appreciation of many benefits of our modern life. I recently read Melissa Coleman’s memoir This Life Is In Your Hands about growing up the daughter of famous homesteader and Nearing mentee Eliot Coleman. She chronicles the great strain that the demands of homesteading put on her family, resulting in her father’s ill health, her baby sister’s tragic death and her parents’ divorce.  (On a lighter note, she also reveals some of the purist Nearings’ well-kept secrets: Helen’s love of ice cream, mail order fruit and other delicacies.  Even the folks who wrote the (sometimes a tad righteous) book on living local and off the grid indulged a little on occasion).  Though there were certainly aspects of their lives on the homestead that were richly satisfying, some readers may come away wondering if their chosen cure for the ills of modern life wasn’t in some ways as harmful as the disease, physically as well as spiritually.

Another interesting look at the real-life struggles of those who lived in those idealized ‘simpler times’ is the PBS reality series Frontier House.   In 2001, three families (selected from among some 5,000 applicants!) lived off the land for six months on the simulated frontier of 1880s Montana.  The success of their venture was assessed by historians based on whether each family had put by enough food and fuel over the summer and fall to survive a Montana winter.  Though they labored admirably, through all sorts of drama, if memory serves it was decided all would have perished.  The simpler times were never as simple as they seem.   (Frontier House is available in UNCG’s Instructional Film Collection, but sadly, not on Netflix).

There are no easy answers to the question of man’s relationship to technology.  Most people I know lament their dependency on smart phones, social media and a food supply so highly engineered that many of us have no idea what we’re really eating half the time (pink slime, anyone?).   Yet we have so much to be grateful for.  We live in a time of amazing medical advances.  Whatever may plague or disappoint us in our lives, we have the freedom and resources at our fingertips to research alternatives and connect with like-minded people to find a solution.  For all our similarities, thankfully these United States are not the One State.  Our ultimate happiness is not to be found in our unfreedom, but in our freedom to negotiate these complex choices and relationships, to choose our own adventure.

New Year’s Resolutions

By Carrie Levesque

Is it too early to start thinking about New Year’s Resolutions?  It’s not something I usually get to until the holiday insanity is over and the next wave of media bombardment starts pushing gym memberships and Nutrisystem packages.   Though I filter out most of the blah blah blah, I do get to thinking about how to best use this gift of a new year.

While procrastinating online a few days ago, I came across an article, “Five lessons learned from living in Paris.”   I was struck by its epigraph, a quote from Hemingway’s letters, where he writes, “Paris is so very beautiful that it satisfies something in you that is always hungry in America.”   Hemingway’s quote and this article on Jennifer L. Scott’s new book, “Lessons from Madame Chic: The Top 20 Things I Learned While Living in Paris,” are both indirectly about Americans’ hunger for beauty and grace in a commerce-driven culture that sometimes doesn’t seem to have a lot of use for either.

Two of Scott’s ‘lessons’ resonate with me as I reevaluate my life and habits at year’s end.  First was her observation that “Parisians often turned mundane aspects of everyday life into something special.”  She recalls how her host father made an event of savoring a bit of his favorite cheese every night at the end of dinner; it was just routine enjoyment of a cheese course, but he was so passionate about it that it became a special ritual whose memory Scott savored after she returned to American life.  Meals tend to be something we rush through in the US, a pit-stop refueling on our way to getting done more of whatever it is we are always rushing to get done.  It seems if we’re looking for anywhere in our lives to slow down and satisfy a spiritual hunger along with our physical hunger, mealtime is a worthy candidate.

She also comments on the European tendency to take more pride and care in one’s dress while at the same time owning fewer clothes and accessories than Americans tend to, which simplifies the process of making oneself presentable each day.  This calls to mind web projects like Project 333, one of many movements today exploring the benefits of ‘voluntary simplicity.’   Owning fewer things doesn’t only save you money, it also frees you from having to care for and about heaps of stuff, so that you have more time and energy to indulge in things like Scott’s first point: really enjoying and being present for life’s smaller, even routine, pleasures.

OK, so none of this is anything new.  Oceans of ink have already been spilled on these topics (sorry, Ms. Scott).  And yet we still make New Year’s resolutions, and we still hunger and struggle, in all sorts of ways, to be better (whatever each of us decides that means), to improve the world around us.  Which is why at New Year’s I find myself looking to accounts I’ve come across of people closer to home who decided to devote their year to some sort of radical reevaluation of the way we live (like a New Year’s Resolution on steroids), and the lessons they learned from it.

Two of my favorites in the ‘less is more’ genre are Judith Levine’s “Not Buying It: My Year without Shopping” (fairly self-explanatory) and Colin Beavan’s “No Impact Man” (which became a movie in 2009: he and his family sought to live in NYC, for a whole year, in a way that made no environmental impact).    For giving more consideration to the sacred place of food in our lives, I think of Barbara Kingsolver’s wonderful “Animal, Vegetable, Miracle,” her account of eating only in-season food produced by or close by her Virginia mountain farm for one year.  Two other works I might get to this year are A.J. Jacobs’ “The Year of Living Biblically” (in which Jacobs, an agnostic, tries to take literal direction from the Good Book and considers the place of the sacred in our lives) and Sara Bongiorni’s “A Year Without ‘Made in China’: One Family’s True Life Adventure in the Global Economy” (in which she walks the walk of the ‘Buy American’ talk).

While I don’t see myself devoting my year to anything requiring quite this level of discipline (baby steps!), it is inspiring to live vicariously through these authors and reflect on what they gained from their projects.  They and their families forged meaningful new traditions and found a new sense of community that arises when one disconnects a bit from the mainstream economy (potlucks instead of takeout, preparing a fresh meal as a family instead of popping a frozen meal in the microwave, sharing communal resources instead of everyone needing to own and maintain their own whoziwhatsits).   If you’ve got the money, it’s easy to escape to an exotic foreign locale to feed your need for beauty.  But it’s possible that, if we take them to heart, these authors’ efforts can challenge us to find a beauty in our own communities that can leave us all a little less hungry in America.

Confederates in the Attic

By Carrie Levesque

It’s an experience not uncommon to people who love books: every so often, you stumble across one that changes your life.

Years ago, I was catsitting for a neighbor, a Duke administrator with an extensive personal library of contemporary world literature.   It was two weeks before I was to leave for Prague to begin research on one dissertation topic, when I found myself plunging headlong into an entirely new topic after being blown away by a book I had randomly selected from his shelves.  I didn’t know then as I pored over Dubravka Ugresic’s The Culture of Lies that I had found not only a new dissertation topic, but also a timely new teaching field that would inspire me for many years to come.  I’m very thankful today that this book and I crossed paths (and especially that my advisor was willing to go along with that abrupt change of plans).  I don’t know that “Czech Women’s Symbolist Literature” would have found the same broad appeal as “Women, War and Terror” as I joined the BLS program a few years later.

So this summer, it happened again.  A recommended title I jotted down from a newspaper article on Southern culture turned up a few weeks later at my county’s Friends of the Library sale, and bam: I had a live one.

I devoured the first 300 or so pages of Tony Horwitz’s wildly thought-provoking and entertaining Confederates in the Attic and then dawdled through the last 90 because I could not bear for it to end.  In his quest to understand “how the Lost Cause still resonates in the memory and rituals of the South” (book jacket), journalist Horwitz interviews some often colorful, always impassioned figures, white and black, from urban eccentrics to quiet, small-town folks, struggling to preserve or just make sense of what it means to be Southern roughly 130 years (at the time of writing) after the War Between the States.   He probes the relationship between the Civil War and the South through its symbols (the rebel flag controversy), its branding and marketing (lunch with Scarlett O’Hara, anyone?), and most fascinating, its reenactors, the most hardcore of whom spend many damp, cold nights in the fields of Virginia and Pennsylvania, huddled, hungry and stinky, trying to get close to the experience of poorly equipped soldier-ancestors on the march into battle.

Prior to reading this book, I confess I had an attitude toward the Civil War common to many of us born Yankee: I knew just what I needed to know: the North whupped the South and put an end to a certain deplorable ‘peculiar institution.’  It was, to me, in my shamefully glossed-over understanding, not much more than a tragic means to a well-overdue end.   But, of course, it was that and infinitely more than that.  It was the most horrific four years this country has ever seen, with over 620,000 dead and unspeakable suffering on the front lines and on the home front.  In some of the fiercest battles, a soldier died nearly every second.   It was a conflict started in part by tensions between two culturally and economically very different parts of this country, and both its liberating and shattering effects continue to shape this enduring tension 150 years later.

Since I finished Confederates, the pile of Civil-War-related books at my bedside grows almost daily.  With over 70,000 books published on the Civil War as of 2002 (Library of Congress), I’ve got my work cut out for me in pursuing this new personal and academic obsession.

My daughter at the burial site of Stonewall Jackson's arm.

Though my husband and I aren’t yet outfitting ourselves for a march to Gettysburg, we have begun dragging our daughters and any other willing family members to battlefields and other related sites (though not yet on quite as intensive a schedule as the “Civil Wargasm” tours of Horwitz’s star reenactor, Robert Lee Hodge).  A pilgrimage to Lexington and Appomattox is planned for spring.

And of course I’m intrigued about the possibility of bringing this interest into an updated version of Women, War and Terror.  There is a considerable body of exciting research on the role of women in the Civil War.  Mary Chesnut’s famous diary is cited in every source I come across.  I’m about halfway through historian Carol Berkin’s Civil War Wives, which has been a riveting read.  And I’m dying to know more about Jennie Hodgers, aka Albert Cashier, one of 400 documented cases of women who disguised themselves as men in order to fight as soldiers in the Civil War (http://www.civilwar.org/education/history/biographies/jennie-hodgers.html).

So what book has changed your life?  Or another question I’d love a response to: what is your relationship to the Civil War?  If you’ve got family stories to tell, you’ve got an eager ear right here!

But They’re Just Words

By Joyce Clapp

In an online forum that I frequent, we recently had a disagreement (it happens in large, diverse online communities). The forum is for people in married-like relationships, however the community member defines “married-like”. But this particular community skews very heavily female, and people forget that not everyone in the community is a biological woman legally married to a biological man. One of the moderators posted, reminding us to use inclusive language in posts (such as “significant other”). Referring to your own SO however you like is fine, but we’re asked not to display assumptions through our language about other people’s relationships.

As someone who lives with my partner of eight years (and my partner’s mother, and our three cats), I appreciated the reminder, but didn’t think too much of it. Inclusive language is easy and just makes sense, right?

Apparently not, for the post is at 763 comments of discussion.

In another online forum, people were discussing the difference between e-books and paper books. However, they were referring to paper books as “real books”. When I commented on it, my best friend said “Oh, you know what they’re talking about.” I may, but that doesn’t stop people’s words from downplaying the growing prominence of e-books in our society. Intentional or not, that is the meaning that comes through when calling paper books “real”.

Words are so important. Words are symbols that carry weight in our society. Language is one of the few tools we use every day (every minute!). Because language is so common and so necessary in our lives, our culture is simultaneously dependent upon our language and yet doesn’t encourage treating that language as important. People will say “I get so tired of having to be PC!” That statement reveals something about the importance we put on how the people around us feel and the care we put (or don’t) into our interactions. Our language and our choice of words communicate (consciously or not) information about our political beliefs, how we feel about the people around us, our educational levels, how we use that education, and how much we respect ourselves and our environments (and that’s just the start of what our language says).

For students, instructors, and other professionals in an online environment, they’re not “just words”. They’re all we have to convey our knowledge and our intentions. I often find myself typing “I think I know what you meant here, but I can’t grade you on what I think you meant – I have to grade you on what’s here.” Over time in a variety of online educational settings, I’ve found myself adjudicating disputes in discussion boards between well-meaning parties who didn’t choose their words well. Words matter.

I call on you to be deliberate in both your written and spoken language, to use words well, and to be kind in your communications with other people. The next time you start to say “Oh, it’s just a word!” ask yourself what that word, as a label, direction, or endearment, might mean to the persons you are communicating with – and then act accordingly.

President Obama’s “Words Don’t Matter” speech from 2008.