
All Hallows Eve…and Errors

by Matt McKinnon

All Hallows Eve, or Hallowe’en for short, is one of the most controversial and misunderstood holidays celebrated in the United States—its controversy owing in large part to its misunderstanding.  More so than the recent “War on Christmas” that may or may not be raging across the country, or the most important of all Christian holidays—Easter—blatantly named after the pagan goddess (Eostre), Halloween tends to separate Americans into those who enjoy it and find it harmless and those who disdain it and find it demonic.  Interestingly enough, both groups tend to base their ideas about Halloween on the same erroneous “facts” about its origins.

A quick perusal of the internet (once you have gotten past the commercialized sites selling costumes and the like) will yield the following generalizations about the holiday, taken for granted by most folks as historical truth.

Common ideas from a secular and/or Neopagan perspective:

  • Halloween developed from the pan-Celtic feast of Samhain (pronounced “sow-in”)
  • Samhain was the Celtic equivalent of New Year’s Day
  • This was a time when the veil between the living and dead was lifted
  • It was Christianized as “All Hallows Day” (All Saints Day)
  • The eve of this holy day remained essentially Pagan
  • Celebrating Halloween is innocent fun

Common ideas from an Evangelical Christian perspective (which would accept the first five of the above):

  • Halloween is Pagan in origin and outlook
  • It became intertwined with the “Catholic” All Saints Day
  • It celebrates evil and/or the Devil
  • It glorifies death and the macabre
  • Celebrating Halloween is blasphemous, idolatrous, and harmful

Even more “respectable” sites like those from History.com and the Library of Congress continue to perpetuate the Pagan-turned-Christian history of Halloween despite scarce evidence to support it, and considerable reason to be suspicious of it.

To be sure, like most legends, this “history” of Halloween contains a kernel of fact, though, as with most things, its true history is far more convoluted and complex.

The problem with Halloween and its Celtic origins is that the Celts were a semi-literate people who left behind only some inscriptions: all the writings we have about the pre-Christian Celts (the pagans) are the product of Christians, who may or may not have been completely faithful in their description and interpretation.  Indeed, all of the sources for ancient Irish mythology are medieval documents (the earliest being from the 11th century—some 600 years after Christianity had been introduced to Ireland).

It may be the case that Samhain indeed marked the Irish commemoration of the change in seasons “when the summer goes to its rest,” as the medieval Irish tale “The Tain” records.  (Note, however, that our source here is only from the 12th century, and is specific to Ireland.)  The problem is that the historical evidence is not so neat.

A heavy migration of Irish to the Scottish Highlands and Islands in the early Middle Ages introduced the celebration of Samhain there, but the earliest Welsh (also Celtic) records afford no significance to the same dates.  Nor is there any indication that there was a counterpart to this celebration in Anglo-Saxon England from the same period.

So the best we can say is that, by the 10th century or so, Samhain was established as an Irish holiday denoting the end of summer and the beginning of winter, but that there is no evidence that November 1 was a major pan-Celtic festival, and that even where it was celebrated (Ireland and the Scottish Highlands and Islands), it did not have any religious significance or attributes.

As if the supposed Celtic origins of the holiday were not uncertain enough, its “Christianization” by a Roman Church determined to stamp out ties to a pagan past is even more problematic.

It is assumed that because the Western Christian churches now celebrate All Saints Day on November 1st—with the addition of the Roman Catholic All Souls Day on November 2nd—there must have been an attempt by the clergy of the new religion to co-opt and supplant the holy days of the old.  After all, the celebrations of the death of the saints and of all Christians seem to directly correlate with the accumulated medieval suggestions that Samhain celebrated the end and the beginning of all things, and recognized a lifting of the veil between the natural and supernatural worlds.

The problem is that All Saints Day was first established by Pope Boniface IV on 13 May, 609 (or 610) when he consecrated the Pantheon at Rome.  It continued to be celebrated in Rome on 13 May, but was also celebrated at various other times in other parts of the Western Church, according to local usage (the medieval Irish church celebrated All Saints Day on April 20th).

Its Roman celebration was moved to 1 November during the reign of Pope Gregory III (d. 741), though with no suggestion that this was an attempt to co-opt the pagan holiday of Samhain.  In fact, there is evidence that the November date was already being kept by some churches in England and Germany as well as the Frankish kingdom, and that the date itself is most probably of Northern German origin.

Thus the idea that the celebration of All Saints Day on November 1st had anything to do either with Celtic influence or Roman concern to supersede the pagan Samhain has no historical basis: instead, Roman and Celtic Christianity followed the lead of the Germanic tradition, the reason for which is lost to history.

The English historian Ronald Hutton concludes that, while there is no doubt that the beginning of November was the time of a major pagan festival that was celebrated in all of the pastoral areas of the British Isles, there is no evidence that it was connected with the dead, and no proof that it celebrated the new year.

By the end of the Middle Ages, however, Halloween—as a Christian festival of the dead—had developed into a major public holiday of revelry, drink, and frolicking, with food and bonfires and the practice of “souling” (a precursor to our modern trick-or-treating?), all culminating in the most important ritual of all: the ringing of church bells to comfort the souls of people in purgatory.

The antics and festivities that most resemble our modern Halloween celebrations come directly from this medieval Christian holiday: the mummers and guisers (performers in disguise) of the winter festivals were also active at this time, and in the practice of “souling,” children went door to door soliciting “soul cakes” that represented souls being freed from purgatory.

The tricks and pranks and the carrying of vegetables (originally turnips) carved with scary faces (our jack-o’-lanterns) are not attested until the nineteenth century, so their direct link with earlier pagan practices is sketchy at best.

While the Celtic origins of Samhain may have had some influence on the celebration of Halloween as it began to take shape during the Middle Ages, the Catholic Christian culture of those centuries had a much more profound effect, in which the ancient notion of the spiritual quality of the dates October 31st/November 1st became specifically associated with death—and later with the macabre of more recent times.

Thus modern Halloween is more directly a product of the Christian Middle Ages than of Celtic Paganism.  That some deny its rootedness in Christianity while others deride its essence as pagan says more about how these groups feel about medieval Catholic Christianity than about Celtic Paganism (about which we know so very little).

And to the extent that we fail to realize just how Christian many of these practices and festivities were, we fail to see how successful the Reformation and the later movements of Pietism and Rationalism have been in redefining exactly what “Christian” is.

As such, Halloween is no less Christian and no more Pagan than either Christmas or Easter.

Happy trick-or-treating!

Nimrod: What’s In a Name?

by Matt McKinnon

My teenage son is a nimrod. Or so I thought.

And if you have teenagers, and were born in the second half of the twentieth century, you have probably thought at one time or another that your son (or daughter) was a nimrod too, and would not require any specific evidence to explain why.

Of course this is the case only if you are of a certain age: namely, a Baby Boomer or Gen Xer (like myself).

For if you are any older, and if you are rather literate, then you would be perplexed as to why I would think that my son was a nimrod, and why I was not capitalizing Nimrod as it should be.  Since it is, after all, a proper noun.

It is?

Yes, it is.  Or rather it was.  Let me explain.

It turns out, the word “nimrod” (or more properly “Nimrod”) has a fascinating history in which it undergoes a substantial reinterpretation.  (Any nimrod can find this out by searching the web, though there is precious little explanation there.)  This, by itself, isn’t surprising, as many words that make their way through the ages transform as well.  But the transformation of “Nimrod” to “nimrod” is particularly interesting in what it tells us about ourselves and our culture.

Nimrod, you see, was a character from the Hebrew Scriptures, or as Christians know it, the Old Testament:

“Cush became the father of Nimrod; he was the first on earth to become a mighty warrior. He was a mighty hunter before the LORD; therefore it is said, ʻLike Nimrod a mighty hunter before the LORD.ʼ”  (Genesis 10:8-9 NRSV)

This is the manner in which older biblically literate folks will understand the term: “as a mighty hunter.”

But there’s more here, for these folks also might understand the term as referencing a tyrannical ruler.

Why?  Well, the etymology of the word links it to the Hebrew “to rebel,” not for anything that Nimrod actually does in the Old Testament, but because, as many scholars attest, it is probably a distortion of the name for the Mesopotamian war-god Ninurta.  And the later chroniclers of Israelite religion didn’t have much sympathy for the polytheism of their Mesopotamian neighbors—especially when it so obviously informed their own religious mythology.

So the word, when it very early on enters the biblical narrative, already shows signs of transformation and tension as referencing both a mighty hunter as well as someone rebellious against the Israelite god.

In fact, Jewish and Christian tradition names Nimrod as the leader of the folks who built the Tower of Babel, though this is not found anywhere in the scriptures.  This, then, is how Nimrod is now portrayed in more conservative circles, despite the lack of biblical attestation.

And as the word is already attested in Middle English, by the 16th century it was clearly being used in both senses in the English language: as a tyrant and as a great warrior or hunter.

Now I can assure you, neither of these describes my teenage son.  So what gives?

Well, “Nimrod” shows up in a 1932 Broadway play (which ran for only 11 performances) about two lovesick youngsters:

“He’s in love with her. That makes about the tenth. The same old Nimrod. Won’t let her alone for a second.”

Here, however, the emphasis is still on the term’s former meaning as a hunter, though its use in the play to describe a somewhat frivolous and hapless fellow who moves from one true love to the next points us in the right direction.

And in the 1934 film You’re Telling Me, W.C. Fields’ character, a bit of a buffoon himself (and a drunkard), takes a few swings with a limp golf club and hands it back to his dim-witted caddy, saying in a way only W.C. Fields could:

“Little too much whip in that club, nimrod.”

So here we have the first recorded instance of the word’s transformation from a great hunter or tyrant to a stupid person or jerk.

But that’s not the end of the story.  After all, how many of us have seen You’re Telling Me?  (I hadn’t, at least, not until I did the research.)

So the last, and arguably most important, piece of the puzzle is not the origin of the word or its transformation, but rather its dissemination.

And the agent of that dissemination, as I’m sure many of you are aware, is none other than TV Guide’s greatest cartoon character of all time: Bugs Bunny, who debuted in the 1940s, not long after You’re Telling Me premiered.

In this context, the one most folks born after World War II are familiar with, Bugs Bunny refers to the inept hunter Elmer Fudd as a “little nimrod.”  And the rest, as they say, is history.

For what emerges from Bugs’ usage is not the traditional reference to Fudd as a hunter (though this is the obvious, albeit ironic, intention), but rather Fudd’s more enduring (and endearing?) quality of ineptitude and buffoonery.

And anyone who has (or knows) a teenager can certainly attest to the applicability of this use of the term in describing him or her.

But the important thing is what this says about literacy and our contemporary culture.

For whereas my parents’ generation and earlier were more likely than not to receive their cultural education from Classical stories, the great literature of Europe, and the Bible, those of us born in the latter half of the 20th century and later are much more likely to receive ours from popular culture.

I have seen this firsthand when teaching, for example, Nietzsche’s “Thus Spake Zarathustra,” which offers a critique of Judaism and Christianity by parodying scripture.  The trouble is, when students don’t know the referent, they can’t fully understand or appreciate the allusion.  And this is as true of Shakespeare and Milton as it is of Nietzsche…or Bugs Bunny for that matter.

And the ramifications of this are far greater than my choosing the proper term to criticize my teenage son.

(Though ya gotta admit, “nimrod” sounds pretty apropos.)

Video Games: Live!

By Marc Williams

RSC's production of "The Winter's Tale," photo from the RSC.

This summer’s Lincoln Center Festival in New York hosts one of the most highly anticipated theatrical events in recent memory: a six-week residency by the Royal Shakespeare Company that began on July 6.  The RSC is located in Stratford-upon-Avon, Shakespeare’s hometown, and is widely considered the world’s top producer of classical theatre.  For this New York residency, not only has the RSC brought five of its exquisite productions to the Lincoln Center Festival, but it has also reconstructed its Stratford performance space right inside the Park Avenue Armory.  For American theatre enthusiasts, the residency is a dream come true: a chance to see five RSC productions without purchasing five airline tickets.

While the RSC has generated appropriate buzz over the past few weeks, another Shakespearean experiment has stolen some headlines.  Sleep No More, a loose adaptation of Macbeth, is produced in a 1930s Manhattan hotel on 27th Street.  The production embraces mansion-platea staging, a technique we study in my Eye Appeal class in the BLS program at UNCG.  Mansion-platea staging involves small performance areas (“mansions”), each representing a particular location in the story, with several mansions lined up in a row or circle.  The actors and audience move together from one mansion to the next as the story progresses.  This isn’t how most of us encounter theatre today, so Sleep No More may seem highly unusual.  However, walking through a haunted house or even sitting on Disney’s Pirates of the Caribbean ride replicates mansion-platea staging faithfully.

Sleep No More’s producing organization, PUNCHDRUNK, has introduced an unusual twist on mansion-platea staging: audience choice.  This production not only immerses the audience in the playing space with the actors but also gives the audience freedom to wander about the six-story building however they like.  They can follow characters from one room to the next for a linear narrative experience, or move randomly around the building for a more fractured one.  However they choose, the audience cannot possibly witness the entire production at once, since events are happening simultaneously in different areas of the building.

One of the rooms from "Sleep No More," photo from the NY Times.

I’ve not seen the production but have read much about it.  I was surprised when I read Wired.com writer Jason Schreier’s review of this production:

[Sleep No More is] a nonlinear narrative in which the order of events — and consequently, the plot — is determined by what you see.

The primary problem with this method of storytelling is that you’re not really part of it.

Sleep No More has two rules: Keep your mask on and don’t talk to anybody. Outside those restrictions, you can do whatever and go wherever you want. At one point you might wind up in a dimly lit graveyard, alone and terrified. Then you’re in a ballroom, where garishly dressed gentlemen and ladies are dancing to an infectious beat. Next you’re in a pantry, opening jars of candy and trying to decide whether eating them will kill you. Problem is, nothing you do really matters.

A screen shot from "L.A. Noire," by Rockstar Games.

The title of his review (“Interactive Play Sleep No More Feels Like a Game, But More Confusing”) suggests the experience is intended to be interactive, but that isn’t true.  He goes on to compare the production to several popular video games like L.A. Noire and Fallout: New Vegas, arguing that the games are superior experiences primarily because the game player isn’t “just an observer.”  This perspective probably seems reasonable to Schreier, who is primarily a video game critic.  My question for Schreier is: are audiences accustomed to having an effect on the outcome of a theatrical performance?

In a way, all theatre is interactive in that actor and audience inhabit the same space; the audience’s reactions and attitudes psychologically affect the actors and this effect subtly (and sometimes boldly) influences the performance.  But audiences aren’t typically expected to participate in the action, which is what Schreier seems to expect: a kind of theatrical “Choose Your Own Adventure.”

In contrast to Schreier’s dissatisfaction, several major theatre critics have responded positively to Sleep No More, and audiences have been attracted to the production’s novelty.  However, many theatre practitioners have long wondered how video games and other electronic media might affect the next generation of theatregoers.  Will the theatre adapt to its changing audience?  Will there be an audience at all?  Schreier’s review makes me wonder if the next generation of theatregoers is already clamoring for theatrical evolution.  And while PUNCHDRUNK and other organizations are experimenting with theatrical form, one has to wonder how (or if) a theatrical institution like the Royal Shakespeare Company will adapt when the time comes.  How will other artistic forms evolve with the video game generation?

What are “liberal studies” anyway?

By Marc Williams

This entry begins the official blog life of UNCG’s Bachelor of Arts in Liberal Studies program.  We’ve begun our Facebook life with some discussions on education.  Perhaps it would be worthwhile to begin our blog with the term “Liberal Studies.”  What exactly does that mean?

Image from "Education in Ancient Greece," Michael Lahanas

The phrase derives from artes liberales (“liberal arts”), which describes the kinds of knowledge (“arts”) befitting free (“liberal”) citizens.  This definition of the liberal arts can be traced from Ancient Greece all the way through the Middle Ages.  Artes liberales can be contrasted with artes illiberales, which refers to education pursued specifically for economic gain (such as vocational training).  The branches of knowledge that make up the liberal arts include mathematics, music, literature, logic, rhetoric, grammar, and oratory.  In this regard, a “liberal studies” education is intended to be broad in focus and inclusive of a variety of disciplines.

Most universities today offer degree programs with a liberal arts structure. First-year university students are often surprised by how many courses are required outside of their intended field of study.  “How will a philosophy course help me if I want to work in advertising?”

One answer to this question comes from a Carnegie Foundation study, recently outlined on Businessweek.com by William M. Sullivan:

More than ever, American business needs leaders who are creative and flexible enough to innovate in a complex, competitive, global economy. The recent near-collapse of the world economy underscores the importance of business professionals who can act with foresight and integrity, aware of the public impact of their decisions. […].

The Carnegie Foundation study found that undergraduate business programs are too often narrow in scope. They rarely challenge students to question their assumptions, think creatively, or understand the place of business in larger institutional contexts. […].

The study, soon to appear as Rethinking Undergraduate Business Education: Liberal Learning for the Profession (Jossey-Bass/Wiley), went in search of business programs that set out to provide students with more than tools for advancing their careers, as important as those tools are. […].

Not surprisingly, this report suggests business programs include a healthy dose of liberal arts courses—courses that specifically develop the analytical and critical thinking skills required to deal with ambiguous and complex questions, as well as courses that manage to connect to the business curriculum.  These critical thinking, analytical, and creative skills are precisely the focus of the Bachelor of Arts in Liberal Studies program at the University of North Carolina at Greensboro.  To learn more, please visit http://www.uncg.edu/aas/bls.