Category Archives: Religion and Spirituality

Why All Babies Deserve to Die: Science and Theology in the Abortion Debate

by Matt McKinnon

The debate rages on…


Just a few of the headlines on the abortion debate from the last few weeks:

I would say that the abortion issue has once again taken center stage in the culture wars, but it never really left. Unlike homosexual marriage, which seems to be making steady progress toward resolution as a majority of Americans conclude that the freedom of consenting adults to marry is a basic civil right, the abortion debate continues to divide a populace torn between adjudicating the competing rights of the mother and the “potential” child.

I say “potential” child because herein is where the real debate lies: exactly when does a fertilized human egg, a zygote, become a “person,” endowed with certain human if not specifically civil rights?

Is it a person yet?


Dougherty’s main point in his article on liberal denial focuses on the “fact” of the beginnings of human life. He claims that liberals tend to make one of two types of arguments where science and human life are concerned: either they take the unresolved legal issue regarding the idea of personhood and transfer it back to the “facts” of biology, concluding that we cannot really know what human life is or when it begins, or they acknowledge the biological fact of the beginning of human life but claim that this has no bearing on how we should think about the legality of abortion.

Both sorts of arguments, he claims, are obscurantist, and fail to actually take into account the full weight of science on the issue.

But the problem, I contend, isn’t one of science: it’s one of theology—or philosophy for those less religiously inclined.

The problem is not the question of “what” human life is or “when” it begins. Dougherty points out:

After the fusion of sperm and egg, the resulting zygote has unique human DNA from which we can deduce the identity of its biological parents. It begins the process of cell division, and it has a metabolic action that will not end until it dies, whether that is in a few days because it never implants on the uterine wall, or years later in a gruesome fishing accident, or a century later in a hospital room filled with beloved grandchildren.


Two-cell zygote. Is this a person?

So basically, human life begins at conception because at that point science can locate a grouping of cells from which it can deduce all sorts of things from its DNA, and this grouping of cells, if everything goes nicely, will result in the birth, life, and ultimate death of a human being.

He even gets close to the heart of the problem when, in arguing against an article by Ryan Cooper, he claims that many people are not fine with the idea that an abortion represents the end of a life, nor are they comfortable with having a category of human life that is not granted the status of “humanity”—and thus not afforded basic human rights.

The problem with all of these discussions is that they dance around the real issue here—the issue not of “human life” and its definition and beginning, but rather the philosophical and often theological question of the human “person.”

If we look closely at Dougherty’s remarks above, we note two distinct examples of why the generation of human life is a “fact”: (1) we can locate DNA that tells us all sorts of things about the parents (and other ancestors) of the fetus and (2) this fetus, if everything works properly, will develop into a human being, or rather, I would argue, a human “person.”

For there’s the distinction that makes the difference.

After all, analyze any one of my many bodily fluids and a capable technician would be able to locate the exact same information that Mr. Dougherty points out is right there from the first moments of a zygote’s existence. But no one claims that any of these bodily fluids or the cells my body regularly casts off are likewise deserving of being labeled “human life,” though the sperm in my semen and the cells in my saliva are just as much “alive” as any zygote (believe me, I’ve looked).

No, the distinction and the difference is in the second example: The development of this zygote into a human person. My sperm, without an egg and the right environment, will never develop into a human being. The cells in my saliva have no chance at all—even with an egg and the right conditions.

Nope, not people.


So the real force of Dougherty’s argument lies in the “potential” of the zygote to develop into what he and anti-abortion folks would claim is already there in the “reality” of a human person.

The debate thus centers on the question of human personhood, what we call theological or philosophical anthropology. For one side, this personhood is the result of a development and is achieved sometime during the embryonic stage (like “viability”) or even upon birth. For others, it is there at conception. For some in both camps it would include a “soul.” For others it would not.

So the reason that the abortion debate is sui generis, or “of its own kind,” is that here the issue is not the rights of a minority versus the rights of a majority, as it is in the debate about homosexual marriage, or even the rights of the mother versus the rights of the child. Rather, the real debate is about when “human life” is also a human “person” (note that this also informs the debate over whether or not to end the life of someone in a vegetative state).


Fetus at four weeks. Is this a person?

To this end, Mr. Dougherty is correct: We can and do know what human life is and when it begins. And he is correct that many are uncomfortable with the idea that abortion means the death of a human life. But he fails to recognize that the reason this is the case is that while those on one side regard this “life” as a human person, others do not. Potentially, perhaps, but not a “person” yet. And certainly not one whose “right to life” (if there even is such a thing: nature says otherwise—but that’s another blog post) trumps the rights of the mother.

So what does all of this have to do with all babies deserving to die? It’s simple: this is what the (necessary?) intrusion of theology into public policy debates entails. Once theological ideas are inserted (and note that I am not arguing that they should or shouldn’t be), how do we adjudicate between their competing claims or limit the extent that they go?

For the two great Protestant Reformers Martin Luther and John Calvin, representing the two dominant trajectories of traditional Protestant Christianity, humans are, by nature, sinful. We are conceived in sin and born into sin, and this “Original Sin” is only removed in Baptism (here the Roman Catholic Church would agree). Furthermore, we are prone to keep sinning due to the concupiscence of our sinful nature (here is where the Roman Church would disagree). The point is that, for Protestants, all people are not only sinful, but are also deserving of the one chief effect of sin: Death.


“For the wages of sin is death.” — Romans 6:23

Calvin was most explicit in Book 2, Chapter 1 of his famous Institutes:

Even babies bring their condemnation with them from their mother’s wombs: they suffer for their own imperfections and no one else’s. Although they have not yet produced the fruits of sin, they have the seed within. Their whole nature is like a seedbed of sin and so must be hateful and repugnant to God.

Since babies, like all of us, are sinful in their very nature, and since they will necessarily continually bear the fruits of those sins (anyone who’s ever tried to calm a screaming infant can attest to this), and since the wages of those sins is death, then it’s not a far-fetched theological conclusion that all babies deserve to die. And remember: “they suffer for their own imperfections.”

But they don’t just deserve to die—they deserve to go to hell as well (but that’s also another blog post). And this, not from the fringes of some degenerate religious thinker, but from the theology of one of Protestant Christianity’s most influential thinkers.


A sinner in the eyes of God (according to John Calvin, anyway).

Of course, it should be noted that Calvin does not imply that we should kill babies, or even that their death at human hands would be morally justifiable: though he does argue (and here all Christian theology would agree) that their death at the hand of God is not just morally justifiable, it is also deserved. It should also be noted that the Roman Catholic theology behind the idea that children cannot sin until they reach the age of reason is predicated on the notion that this is only the case once their Original Sin has been removed in Baptism (so Jewish, Muslim, and Hindu kids would be sinful, unlike their Christian counterparts).

Again, this is not to argue that philosophical and theological principles should not be employed in the abortion debate, or in any debate over public policy. Only that (1) this is what is occurring when pro-choice and anti-abortion folks debate abortion and (2) it is fraught with complexities and difficulties that few on either side seem to recognize.

And contrary to Mr. Dougherty, this is beyond the realm of science, which at best tells us only about states of nature.

But the only way we have a “prayer” of real sustained dialogue—as opposed to debates that ignore our competing fundamental positions—is to take seriously the philosophical and theological issues that frame the question (even if my own example is less than serious).

But I’m not holding my breath. I would most certainly die if I did.

Merry and Bright: The Spectacle of the Christmas Tree

By Marc Williams

“Spectacle” can be broadly defined as a visually striking display, event, or performance. Spectacle has long been associated with live performance, since costumes, scenery, lighting, dance, and other visual elements are frequently used to enhance the performance experience. In my BLS class, Eye Appeal, we focus on the spectacles that occur not only on stage but also in everyday life. In my most recent blog entry, I wrote about the spectacle of the Macy’s Thanksgiving Day Parade and its spectacular precursors, the cycle plays of medieval Europe. In this post I’ll focus on Christmas trees and other holiday displays.

The day after Thanksgiving, November 29, 2013, an 18.5-foot Douglas fir was delivered to the White House. Since 1966, the White House Christmas Tree has been provided annually by the National Christmas Tree Association. Since that time, the First Lady has been responsible for creating a theme for the tree each year, and its decoration and lighting have become an annual spectacle for those in the Washington, D.C. area—an interesting blend of politics, religion, and spectacle. There were indeed White House Christmas trees before 1966, but more on that later.

The delivery of the 2013 White House Christmas Tree

Evergreens have been associated with winter solstice for many centuries. In Ancient Egypt and later in Ancient Rome, for example, evergreens were brought into homes to celebrate the continuation or return of life following the winter. Some believe these pagan solstice traditions were adapted by early Christians and evolved into our modern Christmas tree. The earliest recorded Christmas trees were found in 16th century Germany and were typically decorated with apples. The apple decorations are associated with December 24, as the medieval Christian calendar celebrated Adam and Eve’s Day on that date. Christmas trees were introduced to the United States in the early 1800s and were sold commercially by the 1850s. At the time, Christmas trees were a new “fad” in America and many people associated Christmas trees with the German settlers who introduced them.

Interestingly, the White House Christmas Tree has a controversial past. The first White House Christmas tree was displayed by President Franklin Pierce in 1853. In 1899, while Christmas trees had become more common in America, they were still considered by many to be a fad. A White House Christmas tree was by no means obligatory. That year, Chicago Tribune readers mounted a letter-writing campaign urging President McKinley to buck the Christmas tree trend for a variety of reasons—many letters focused on deforestation, with one writer calling Christmas trees “arboreal infanticide.” Other letter writers called Christmas trees “un-American,” since Christmas trees were still considered a German tradition by many. Given the Christmas tree’s pagan connections, some letter-writers viewed the White House tree as anti-Christian. Controversy surrounding the tree continues today, as some critics wonder if the White House Christmas Tree should focus on tradition rather than religion, or if the tree should exist at all.


The Tree at Rockefeller Center

Perhaps the most iconic Christmas tree in the United States is found in New York City at Rockefeller Center. The tree is positioned just above the famous ice skating rink and immediately in front of 30 Rockefeller Plaza. The Rockefeller Center tree has been a tradition for over eighty years and its lighting has become a major entertainment event. The 2013 tree is 76 feet tall, weighs twelve tons, features over 45,000 lights, and is topped with a nine-foot wide Swarovski star.


Two rows of trumpeting angels are installed along the plaza, forming a lane that frames the tree beautifully when viewed from Fifth Avenue. The lighting ceremony has now become a televised event with celebrity hosts and performers; the 2013 lighting ceremony featured Mariah Carey, Mary J. Blige, the Radio City Rockettes, and many others.

The Radio City Rockettes at the 2013 tree lighting ceremony.


Here in Greensboro, residents of the Sunset Hills neighborhood create a unique annual holiday spectacle: a neighborhood-wide display of lighted “ball” decorations. This local tradition began with Jonathan Smith’s family, residents of Sunset Hills, about sixteen years ago. The balls are homemade, constructed from chicken wire shaped into spheres, then wrapped with a strand of Christmas lights. The balls hang from tree branches, some nearly thirty feet off the ground.

Sunset Hills in Greensboro, NC.


The video below features the 2008 display and Smith discussing how the tradition started.

Lighted Christmas Balls In Greensboro, North Carolina

Have you seen any of these holiday spectacles in person? What role does spectacle play in your holiday celebrations?

Happy Holidays?!

by Ann Millett-Gallant

Ah, Christmas.  ‘Tis the season, deck the halls, joy to the world, and all that stuff.

Christmas Tree, 2013


Christmas is supposed to be the most wonderful time of the year, and yet, celebrations often come with obligations and complications.  Christmas, for those of us who partake in it, has multiple, and sometimes conflicting, meanings.  For some, Christmas is a religious observance.  Although I respect the Christian origins of the holiday, I don’t follow any particular religion, have been to church only a handful of times, and associate Christmas more with elves than I do wise men.

Christmas is primarily a commercial event for many people, and although I do enjoy giving and receiving, I appreciate the thought put into a gift more so than the dollar amount.

Finally, Christmas, for many, surrounds family traditions.  For me, the term “family” is as diverse as the specific activities that I enjoy with different family members.  “Families” can be groups you are born or adopted into, ones your divorced parents marry you into, ones you yourself marry into, and ones that accumulate throughout your life from circles of friends and colleagues.  I enjoy spending time with all these individuals, but I value more personal time with smaller groups than large gatherings where I exchange small talk with a variety of people.  I also appreciate private time during the holidays, in which I have more time to paint, write, read, and watch movies.  I believe the notion of “celebrating” should not be limited to one set of activities on one specific day.  My rituals aren’t always traditional, but they are genuine and specific to me.

So far this season, my mom has come to NC from Ohio to visit me, and we did a lot of Christmas shopping.  I gave her one of my watercolor paintings, and she showered me and my husband with gifts.  Giving to her children and grandchildren throughout the year and especially at Christmas is a great joy to her.  And I certainly benefit and appreciate that joy!  It was fun shopping with her for me, as well as for other members of the family.


Ann with the Shark

Here is a photograph of me at the mall, holding a giant, stuffed shark she bought for her grandson (my nephew).  I smiled as I traveled through the mall carrying the shark, and people smiled back at me.  I wish I had thought to photo-bomb the mall Santa, but it gives me a new activity for next year.

The week before Christmas, I went to a matinee of American Hustle with friends who also had free time.  My friend Jay picked me up, and my friend Julie drove me home and helped me dye my hair a festive red.  In appreciation, I made a small painting for her of her favorite flower, the hibiscus.  These kinds of activities may not directly relate to Christmas, but they contribute to my seasonal cheer.


Hibiscus, 2013

My husband and I have a non-traditional Christmas palm tree he usually puts up, but this year, I wanted to contribute more, so I made a collage of a tree with acrylic paint and images from magazines.  For me, it represents how we design our own, unique Christmas rituals.  I bought him a smoker as an early Christmas present, so he could smoke us a Thanksgiving turkey.  It was delicious, and I am looking forward to the Christmas turkey!  He enjoys cooking, and I enjoy eating everything he prepares, according to my tastes.  It makes him happy when I am happy.  On Dec. 25, as he prepares our meal, I will wear my Christmas-themed pajamas, binge-watch Modern Family on DVD, and indulge in an afternoon glass of wine.

After Christmas day, my dad and step-mom will come from Ohio to Durham to visit me, my step-sister, and our families.  There will be dinners, movies, and possible cookie decorating with my 5-year-old niece, as we try not to wake up her 6-month-old brother.  Every year, my dad jokes about how I used to behave during Christmas as a child.  When I was young, Christmas was primarily about the presents, although I reveled in all the Christmas activities: making and decorating cookies; hanging all the ornaments, garland, and lights, inside and outside the house; sitting on Santa’s lap; and caroling.  My dad chuckles as he recalls how I would be awake almost all night on Christmas Eve in anticipation.  I loved opening presents and then throughout the day, despite my sleep deprivation, playing with new toys with my cousins from out of town, or playing board games with my grandparents.  I also adored eating a Christmas dinner that I “helped” my mom prepare and falling asleep early in front of the Christmas tree.  Then in a few days, after New Year’s was over, the tree came down, I went back to school, and I crashed.  Teachers would call my parents to express concern about my melancholy.  I now attribute this partly to it being winter in Ohio, but I know it was largely due to just letdown and exhaustion.  I would cheer up over time.

These activities may be familiar to some of my readers and foreign to others, raising the question: What is the true meaning of Christmas?  I think there are many answers.  Perhaps the more important question to ask is how to take the elements of joy we feel at Christmas and spread them throughout the year.  I would say to everyone, regardless of the season: bake, if you enjoy it; sing, if you feel compelled; decorate your surroundings with color; relish time with loved ones; be generous and thoughtful; and have a piece of red velvet cake with ice cream and savor every bite.


Holiday Luminaries on UNCG Campus

Santa, Jesus, and Race

by Matt McKinnon

So, as I’m sure most of you have heard, Santa Claus and Jesus are both White.  They just are.  And these are historical facts.


Megyn Kelly.

Or so proclaimed Fox television personality Megyn Kelly on her show The Kelly File, while discussing an article by Slate contributor Aisha Harris proposing that the beloved American icon be “made over” in a more culturally neutral manner. (Harris’s suggestion is a cartoon gift-bearing penguin complete with red suit and obviously fake beard.)

What ensued was a predictable lampooning by late-night comedians and news commentators, followed by Fox News’ just as predictably recalcitrant dismissal of a “liberal media’s” obsession with race and race-baiting.  For her part, Ms. Kelly refused to apologize for her position, calling the entire episode “tongue-in-cheek,” though she did acknowledge that the issue of Jesus’ ethnicity is “unsettled.”


Harris’ Santa (illustration by Mark Stamaty).

There are many issues here that deserve commentary and discussion, not the least of which is Ms. Harris’ initial suggestion that an ethnically-neutral Santa Claus is much better suited for a multi-ethnic culture like that of the contemporary United States than the image of a fat white man landing on rooftops and essentially performing what would be interpreted as “breaking and entering” in most legal jurisdictions.  (And at least one comedian has suggested the ramifications for a black counterpart, given the Stand Your Ground laws in places like Florida.)

But I, like most in the entertainment and media industries, am more interested in what Kelly said, and how she said it, than in what precipitated it.

Kelly, of course, was simply pointing out what most Americans probably assume: Santa Claus as a fictionalized character is based on the historical fourth century Christian bishop Saint Nicholas of Myra.  Santa has always been portrayed as a white man because, duh, his likeness is based on a real white man.  The notion that he can be portrayed any other way than that of mainstream American culture and its hegemonic ethnicity (white) is practically unthinkable—at least by white folks in the mainstream.

In fact, when watching the video from the show, it is more Kelly’s insistence of the brute factuality of Santa’s (and Jesus’s) ethnicity that is so problematic—and not necessarily her position.

And on this subject, a few ideas deserve discussion.

First and most obvious is that what the historical Saint Nicholas and Jesus both share is, if not a common ethnicity, then at least a common geography and culture—that of the Hellenistic Near East.  And while Nicholas was most probably a Greek from “Asia Minor,” and Jesus a Palestinian Jew, neither of them would have considered himself “white”—and would probably not be considered so by today’s use of the term.  So if Santa Claus is simply a reinterpretation of the Anatolian bishop Nicholas of Myra, then Ms. Kelly is mistaken: neither he (nor Jesus) is white.

They just aren’t.


St. Nicholas’ face, in a forensic reconstruction and traditional icons.

But, without getting into the specifics, our Santa Claus’s development most probably owes more to Pagan characters, practices, and legends than he does to the Christian bishop of Myra.  And so, arguably, on this point, Kelly is correct: Santa Claus, inasmuch as he evolves from Germanic mythology, is “white” or Northern European (though the same cannot be said for Jesus).


A Medieval European “Wild Man.”

Of course, the real issue in all of this is the assumption that Santa is white—must be white, can only be white—whether rooted in history or mythology.  And that is the assumption of hegemony: Where whiteness is the presumed sole and legitimate norm.  Where the way things are is the way they have been and should be.

And pointing this out to someone who is oblivious to the ubiquity of whiteness in our culture can be awkward, unsettling, and even shocking.

Hence the insistence that Santa, like Jesus, is white—without any thought or consideration that such a proclamation may be controversial or that it could possibly be historically and culturally conditioned—tells us more about the speaker, her audience, and our culture than it does about either Santa or Jesus.

But this is not to belittle Ms. Kelly or chasten her about her firmly-held beliefs and ethnic identity.  For the real “White Man’s (sic) Burden” is not to rule over and “civilize” the non-White world, but rather to recognize and confront examples of “White Privilege” in our pluralistic and multi-ethnic culture.  And while there are much more important aspects of white privilege in present day America (like being arrested, prosecuted, convicted, and sentenced to jail far less often than non-whites for similar crimes), the ubiquity of whiteness as the aesthetic norm is not to be dismissed.

But this is easier said than done, since, while the ubiquity of whiteness is quite obvious to most non-white folks, it tends to be invisible to most whites.

And the same thing that happened to Ms. Kelly happened to me, though I was seven years old and in the second grade at the time.

I had the same reaction to the “Black Santa” that my African-American teacher Mrs. Watson had put up on her classroom wall: “Santa Claus is not black.  He’s white,” I thought to myself, aghast at the suggestion.  Luckily, I didn’t have a television show to proclaim this, and didn’t actually articulate it to any of my classmates either.

But the memory has stayed with me—as one where, for the first time, I was confronted with the hegemony of my own culture: me, a little white boy bused from my mostly white neighborhood to school in the projects: a minority among minorities, but whose whiteness was still the norm.

Or should be, as I mistakenly assumed then.


Still from the Good Times episode “Black Jesus” (1974).

And around the same time (1974) and in much the same way, I was confronted by the “Black Jesus” episode on Good Times, where JJ had used Ned the Wino as the model for his painting of the Christ, since—being passed out in the gutter—he was the only one who could hold the pose long enough for JJ to paint.

Much later, of course, I was confronted by a long history of Jesus being portrayed in the varying ethnicity of the portrayers—across the centuries, from Asia to Africa to Europe and the Americas.


Jesus in Ethiopian and Chinese depictions.

And then by the African-American Liberation Theologian James Cone’s insistence that “Jesus is a black man,” in that, according to Christian theology, God comes to his people in the guise of the most repressed and outcast among us.  And in the United States, the reasoning goes, who has been more marginalized than the African slave?

But, arguably, I may not have been as sympathetic and understanding of art history or liberation theology had I not first been confronted in the privileged place of my whiteness in American culture by folks like Mrs. Watson, and the producers, writers, and cast of Good Times.

So what is troubling in Ms. Kelly’s remarks is not her assumption that both Santa and Jesus are white—but her insistence that they are, an insistence that suggests that she has probably never been confronted by the hegemony of her whiteness or the problems that such an unyielding hegemony can produce, at least in a multi-ethnic culture like ours where the power to portray is the power to influence.

Her insistence caters to the very understandable perspective of a certain portion of the American public that times are changing, have changed, and that “their America,” for better or worse, is gone and not coming back.  And while such a perspective is frightening and should be met with sympathy and sensitivity in order to soothe and defuse it, it is usually met with a demagoguery of entrenchment from the one side and outright scorn from the other.

Where are Mrs. Watson and JJ when we need them?

(Of course, I must admit, I don’t particularly care for the Penguin Santa myself, whatever ethnicity it is.)

A Holiday Spectacle

by Marc Williams

The Macy’s Thanksgiving Day Parade processes through Times Square.

Thanksgiving has always been my favorite holiday. I remember waking up on Thanksgiving with the smell of sage wafting through the house as my mother began a long day of cooking and baking. Of course my morning was spent in front of the television watching the Macy’s Thanksgiving Day Parade. And now that I have children of my own, Thanksgiving has come full circle: I’m the one doing the cooking and my son, now four years old, is the one enjoying the parade.

The Macy’s parade has been around since 1924, when Macy’s department store on 34th Street decided to hold a parade as a marketing ploy. Macy’s employees dressed in costumes—clowns, cowboys, and the like—and walked with Central Park Zoo animals on a six-mile route through Manhattan. The parade was a success for Macy’s, became an annual event, and is now the most popular holiday parade in America. The Rose Parade in Pasadena, California and Mardi Gras krewes in New Orleans are also major parades, and communities all over the country stage parades in celebration of St. Patrick’s Day, Independence Day, fall harvests, and other occasions. Parades big and small are commonplace, yet many viewers probably don’t realize that modern parades owe a debt both to theatre and to the church.

During the early Middle Ages, the Catholic Church in Europe began dramatizing events from the Bible in an attempt to share the liturgy with a growing population that was largely illiterate. Church interiors were utilized in these liturgical dramas: the central nave of the basilica held the spectators, while the columns along either side of the nave separated the different “stages,” or “mansions,” as they were called. The audience would walk from one mansion to the next, one mansion featuring the Garden of Eden, the next featuring Noah’s Ark, and so on. Interestingly, this staging technique—in which the spectator moves in and out of different acting areas—is still used today. It’s the same staging technique encountered in a “haunted house” attraction, and is also used by theme parks. Disney’s famous It’s a Small World and Pirates of the Caribbean rides use the same concept, although the audience is seated in a boat that moves through the attraction.

Disney’s Pirates of the Caribbean ride.

In the 13th century, Pope Innocent III forbade the clergy from participating in these liturgical plays. The performances were enormously popular but the Pope believed the scripture was becoming obscured by scenery, costumes, special effects, and scripts that were becoming increasingly colloquial.

The plays, known as “mysteries,” then moved outside the church. In some cases, the plays were held immediately outside the church—right on the church steps. Large stone basilicas with impressive steps and entryways could make for impressive theatrical backdrops. In some towns, performances moved to a town square. The image below from the Valenciennes Passion shows a town square that has several “mansions” built into its existing architecture for the performance of a 25-day long play depicting the life and death of Jesus Christ. The city’s own gates appear to be used as the gates to Jerusalem, with a setting constructed for the temple, another for the sea, and so on.

Setting for the Valenciennes Passion, 1547.

In other towns, the spectators were spread out, with groups seated on risers in different areas of the town. For these mystery plays, the sets, costumes, and actors moved from one audience to the next on pageant wagons. These impressive devices could “dock” with an existing stage platform, providing a custom backdrop for the performance. The wagon also carried all the props and costumes needed to tell the story. These pageant wagons are the forerunners of our modern parade floats; the audience sits or stands as the small staging area moves to them. The performance occurs; then the float moves along the designated route.

An English pageant wagon.

Spectacle was of tremendous importance during the mystery plays. Because the church was no longer the organizing force for the performances, the plays were presented by other groups in the community. In England, for instance, the responsibility was given to the local craft guilds. Typically, each guild was responsible for one story: finding the actors, building the scenery and costumes, and paying their share of the overall expenses for the event. The guilds were usually assigned a story related to their expertise: the shipwrights would present Noah; the goldsmiths would present the three kings, since they could supply gold crowns; the blacksmiths might be in charge of nailing Christ to the cross; and so on. The guilds took tremendous pride in their contributions and their efforts were dazzling: some wagons featured trap doors for surprising entrances and exits, others featured elaborate scenic detail that was stunningly lifelike, while some featured flying effects and other stage magic.

Indeed the same is true today at the Macy’s Thanksgiving Day Parade. New balloons and floats are added each year, and these new efforts are consistently focused on providing a new spectacle that has never been seen in a parade before. In this year’s parade, for example, Cirque du Soleil provided a new float that is the biggest in parade history, complete with acrobats and contortionists—it was like a little self-contained circus.

Cirque du Soleil’s “Dreamseeker” float in the 2013 Macy’s Parade.

In my BLS course Eye Appeal: Spectacle on Stage and in Life, we discuss modern-day spectacles like the Macy’s Thanksgiving Day Parade and historical spectacles like the mystery plays. We consider why we are so consistently impressed by the “wow factor” that spectacles offer, and wonder what draws each of us individually toward these spectacles. For me, the Macy’s Parade takes me back to my childhood home, watching television in my pajamas with the smell of sage in the air. What memories do you associate with the Macy’s Parade? Or do you have another favorite holiday spectacle?

On Sitting In, and Standing Up

by Jay Parr

I had a completely different blog entry ready to go this morning, but then I woke from a dream that got me thinking about something more important.

Woolworth's Sit-In

In the dream I was walking into a diner that was attached to a basic travel hotel. There were three or four young women — college athletes dressed in team sweatshirts or some such (you know how vague dreams can be) — sitting on the bench waiting to be seated. The host offered to seat me (and my companions?), but I pointed out that those young women had been there first.

That was when it came to my attention that the diner would not seat unaccompanied women.

I’m proud of my dream self, because I went ballistic. I started off ranting at the poor young host. He was, of course, just an employee, who could either do what he was told or find himself without even this subsistence-level job. In fact, as I pointed past him at the unoccupied counter seating, traditionally used by those who are eating “unaccompanied,” his face kind of looked like the counter clerk’s in that famous image at the top of this post: surely sympathetic (I mean, the guy in that picture couldn’t even eat at the counter where he worked), but in no position to even comment on the disparity, much less do anything about it.

Newt Gingrich being Very Important

After a vague dream-transition I found myself talking to the man in charge. And a police officer. Both were white men. The manager/owner was older, white-haired, and reeked of privilege. Actually, looking back at the dream, he kind of reminds me of Newt Gingrich. He was spewing some nonsense about the morality of allowing unaccompanied young women to come into a family establishment and distract the poor unsuspecting fathers from their families. Because that’s obviously what these college athletes were up to, in their team sweatshirts, with no makeup on, hair pulled up in practical athletic ties, ignoring everyone else and talking shop amongst themselves. Surely it was all a ruse, and they were really there to steal me from my wife and daughter. Oh, and somehow it was their fault that I just might be too weak-willed to control myself? And of course, were I to have such a moment of weakness it would be inconceivable that they might, you know, reject my advances or something.

The cop had been called because some hothead was making a scene.

That’s about all I remember of the dream. That and something about large vehicles getting tangled up at highway speeds (anxiety much?). But as I was setting the coffee to brew this morning I started wondering what I really would have done, had I found myself in a similar situation, say, perhaps at that Woolworth’s counter down on Elm Street on that Monday afternoon in the winter of ’60. I like to think I would have pointed out those four scared but stoic freshmen and politely said, “They were here before me; I’ll wait until they’ve been served.” I mean, I know I wouldn’t have been among the hecklers shouting racist epithets (I’ve always been a little too Quaker for that), but would I have just quietly gotten my order and gone on with my day? Would I have gone home and mentioned the incident to my wife? Would I have been among the Woman’s College (UNCG) or Guilford College students who came downtown to clog the counters with white “customers” insisting that the black protesters be served first? Or would I have been too busy supporting my family (or perhaps “too busy supporting my family”) to do much more than follow the articles in the newspaper?

The Pride Flag, because not all families are heteronormative.

I definitely connect that issue with North Carolina’s “Amendment One” vote last May. I was vocally against it, not just because I support same-sex marriage (which I do), but all the more so because its wording is so much broader and more insidious that it affects any unmarried couple in the state, gay or straight. Oh, and their children.

I learned of the bill’s introduction in the state legislature shortly after an old coworker of mine lost his partner of thirty years and had to endure absurd legal challenges because the state considered my marriage — my second marriage, mind you, which was less than three years old at the time and had been performed in another state — more valid than his decades-long partnership, which had begun before my wife was even born. She and I have been flying a pride flag on our house since the referendum bill passed in the legislature. It’s a small gesture, but it’s how we feel about the issue.

UNCG students having fun at a Muslim Student Association picnic.

The fact that those being denied service in my dream were women also points (albeit circuitously) to mainstream America’s complicated and uncomfortable relationship with Islamic nations, Muslim Americans, and Islam in general. I have a problem with any legal system or culture that limits the options of any group merely by virtue of their membership in that group. That goes for nations that curtail the rights of women — some of which do so on religious grounds, and some of which (not all the same ones) are Islamic nations — but it also goes for western nations and institutions that want to limit the rights of Muslim women to wear hijab, niqab, or even burqas. My wife has childhood friends, two sisters, who are Muslim. One of the sisters is divorced from an abusive husband — and the Muslim divorce was a lot simpler than the American legal divorce. The other sister once set aside the injunction against being alone with a man other than her husband, simply so that her sister’s childhood friend’s husband (i.e., yours truly) didn’t have to sit and wait alone. Brought me delicious cardamom tea and we had a delightful conversation amidst the din of playing children. Southern hospitality at its finest. These women are American born and raised. They are not oppressed by a misogynistic culture (well, that’s debatable, but that’s a whole different conversation). Their choice to wear hijab is not a symptom of their oppression, but an expression of their cultural identity. Yes, there are women who wear hijab (and niqab, and burqas) because they are legally bound to do so by oppressive theocratic legal systems. Yes, there are places in the world where unaccompanied women cannot be seated in a restaurant, or drive a car, or even walk down the street, because those in power have deemed it inappropriate. And yes, there are radical Muslim elements that view America(ns) as the godless enemy. 

But we can’t allow ourselves to conflate an expression of religious and cultural identity (wearing hijab) with sympathy for oppressive governments or violent radicals. Really. It makes as much sense as declaring anyone with a crucifix or a rosary to be in league with IRA bombers (and don’t get me started on how our media always point out the religious affiliation of “Islamic terrorists” but never that of Christian terrorists). But I digress.

I suppose this post could be an examination of my responsibilities as one who benefits from the privilege of the straight white male, or more broadly, the responsibilities of anyone who benefits from the privilege of majority status. Because I really do feel that whenever I encounter situations in which someone is being denied equal treatment or equal access to resources because of their gender — or their race, or their economic background, or their sexual identity, or their cultural identity, or their citizenship status — it is my responsibility to call attention to the disparity, to voice my opposition to it, and to subvert it in any way that I can. And I guess that’s why, even in that dream that got me started on this rambling post, I caused enough of a ruckus that someone called the cops. Because really, it’s what I think any of us should do.

What bothers me most, though, is that it never occurred to me to simply say of those unaccompanied girls, “Oh, they’re with me.”

“So, You Pastor a Church?”

by Matt McKinnon

I’m no pastor, and I don’t play one on TV.

It’s a question I used to get all the time, mostly from members of my family and my wife’s.  Good, God-fearing folks (for the most part) who simply assumed that devoting one’s professional life to the study of religion must mean being a pastor—since “religion” must be synonymous with “church.”  Why else would someone spend upwards of eight years in school (after undergrad?!) studying various religions and even languages few people on earth still use?

And while one of my three degrees in religious studies is from a non-denominational “divinity” school (Yale) and my doctorate is from a Roman Catholic university (Marquette), my degrees themselves are academic, preparations for scholarship in the academy and not the pulpit.  But that still hasn’t stopped folks from asking the above question, and has also led to invitations to offer prayer at family gatherings, read scripture at special events, and even give short homilies when the situation arises.

Now don’t get me wrong, there’s nothing wrong with being a pastor, or priest, or imam, or rabbi.  Plenty of good folks are in these lines of work, many of whom I studied alongside in pursuing my education.  My wife’s cousin, in fact, is a Baptist preacher—a wonderful man who is much more qualified to pray and preach and—God forbid—counsel folks than I am.  So the problem is not my disdain for this profession: the problem is that it is not my profession.

But the real issue here is not what I do but rather the underlying problem that most folks have in understanding exactly what “religious studies” does—and how it is different from “theology” and the practice of religion.

This was never as clear as in the recent Fox News interview of religious studies scholar Reza Aslan about his new book on Jesus, “Zealot: The Life and Times of Jesus of Nazareth.”

Lauren Green

Never mind that Fox religion correspondent Lauren Green gives a horrible interview, spending much more time on what critics have to say about Aslan’s book than on the book itself.  For while this may be bad, even worse is that it becomes painfully clear that she probably has not read the book—and may not have perused even the first two pages.  But what is most troubling here is that the RELIGION CORRESPONDENT for a major news network is working with the same misunderstandings and ignorance of what exactly religious studies is and what religious studies scholars do as regular folks who are not RELIGION CORRESPONDENTS.

Aslan’s Zealot

Her assumption is that the story here, the big scoop, the underlying issue with Aslan’s book about Jesus is that…the author is a Muslim.  And not just a Muslim, but one who used to be a Christian.  Despite Aslan’s continued attempts to point out that he has a PhD in religious studies, has been studying religions for over twenty years, and has written many books dealing with Christianity, Islam, Judaism, and even Hinduism, Ms. Green cannot get past what she—and many of his critics—see as the real issue: he is a Muslim writing a “controversial” book about Jesus—the “founder” of Christianity as she calls him.

Now I put “controversial” in quotations because, as anyone even remotely aware of scholarship on Christianity knows, the most “controversial” of his claims are nothing new: scholars since the 19th century have been coming to many of the same conclusions that Aslan has come to.  And I put “founder” in quotations as well, since anyone even tangentially aware of New Testament scholarship knows that Jesus himself lived and died a Jew, and never “founded” a new religion.

Dr. Reza Aslan

Not being aware of any of this is not really the problem, but rather a symptom of the bigger issue: Ms. Green, like many folks, simply does not understand what the discipline of religious studies is, or what religious studies scholars do.  So why would she be aware of information that is common knowledge for any undergrad who has sat through a survey course on the introduction to religion at a mainstream college or university?

Except that, uh, she is the RELIGION CORRESPONDENT for a major news network, and would thus benefit from knowing not just about the practice of religion, but about the way it is studied as well.

Now, my own mother has been guilty of this (though she’s no RELIGION CORRESPONDENT), one time explaining to me why she would rather have a class on Buddhism, for example, taught by a practicing Buddhist, or on Islam by a practicing Muslim.  And here we have the crux of the problem: for the role of a scholar is not simply to explain what folks believe or what a religion teaches, though that is part of it.  The role of a scholar is also to research and discover if what a religion says about something has any historical veracity or is problematic or even inconsistent.  Our role is to apply critical analysis to our subjects, the same way a scholar of English Literature or Russian History or Quantum Physics would.

Scholars of the Hebrew Scriptures, for example, have argued that there are two competing and contradictory creation stories in Genesis, that the book of Isaiah was composed by at least three authors, that the genealogical narratives in Matthew and Luke disagree, and that Paul only actually composed about half of the letters in the New Testament that bear his name.  And you will find all of these ideas routinely taught in secular state schools like UNCG as well as mainstream seminaries like Princeton and Wake Forest.

It just doesn’t matter what one’s religion is, or even whether one has one.  Some of the best and most reliable books on New Testament subjects have been written by Roman Catholics, Protestants, atheists, Jews, women, and yes, even Muslims.  One’s personal religion simply has no place in scholarship, any more than being a Christian or Jew or Muslim would affect the way that a biologist studies cells or an astronomer studies space.

Scholarly Books about Jesus

One’s religion, or lack thereof, may point scholars in certain directions and may inform what interests them—and may even make what they do a vocation or calling.  It may shape their training and influence their methodologies.  Or it may not.  But it doesn’t qualify them to study one religion or prevent them from studying another.  One’s training—including those degrees that Dr. Aslan pointed out—is what does that.

As my first religion professor Henry Levinson (a Festive-Naturalist Jew who didn’t hold the traditional concept of God adhered to by his religion) often put it: “It doesn’t take one to know one; it takes one to be one.”

Dr. Henry Levinson

Religious studies scholars are trying to “know” religions and religious people, not “be” them, for that is something tangential at best to our roles as scholars.

So this should be the official motto of all religious studies scholarship, where what one’s religion “is” has no bearing on the quality of the scholarship they do.

Anything less is not scholarship.

It’s simply propaganda.

Life Becomes Art: Modeling for Joel-Peter Witkin

by Ann Millett-Gallant

Joel-Peter Witkin, “Retablo (New Mexico)” (2007).

In 2010, I published my first book, The Disabled Body in Contemporary Art.  In it, I analyze the artworks of contemporary disabled artists, many of which are self-portraits and performance pieces, in comparison with images of disabled bodies by non-disabled contemporary artists.  I also compare such contemporary work with images from the history of body display in art and visual culture, such as fine art painting, medical photographs, freakshow displays, documentary photographs, and popular culture.  I was very proud when the book was called the first to cross the disciplines of art history and disability studies, and am happy that it has been adopted as required reading for courses on a variety of subjects related to visual culture, disability studies, and cultural studies.

Joel-Peter Witkin, “First Casting for Milo” (2005), as used for the cover of Ann Millett-Gallant’s book, The Disabled Body in Contemporary Art.

The book overlaps with subjects of many of my online courses at UNCG.  In it, I discuss the work of Frida Kahlo, which, although it precedes the time period on which the book focuses, set a strong precedent for the self-portrait and performative work of contemporary disabled, as well as many non-disabled, women artists.  We discuss such work in my online Art 100 course in a unit about feminist art and notions of arts and crafts.  Much of the artwork I analyze in my book is photography, which relates directly to my BLS course Photography: Contexts and Illusions.  I also discuss performance, which is a major subject of my BLS course Representing Women, as well as The Art of Life.  The Art of Life course focuses on the intersections between art and everyday life in a variety of ways, which is also a theme of this book.  In all three of these BLS classes, we debate the implications of self-display on the part of artists.  I delivered a talk about my book for the art department of UNCG in Fall of 2010 and again at the Multicultural Resource Center in Fall of 2012.  At both meetings I received a lot of interested feedback and compelling questions, as well as generous praise.  I am interested in teaching an online course centered on the subjects of my book in the future.

Frida Kahlo in 1931, six years after the bus accident that left her in lifelong pain.

The subject matter of this book has proved to be personal to me in more ways than one, and in some ways unexpected.  I have been physically disabled since birth, involved in studying and making art since childhood, and interested in bridging these subjects in my teaching and writing as an academic professional.  And there is more.  While researching the beginnings of this book in New York City in the Fall of 2004, I visited the Ricco Maresca Gallery for a Joel-Peter Witkin exhibit (examples of Witkin’s work may be viewed at the Catherine Edelman Gallery and the Etherton Gallery).

I viewed the gallery and met the photography curator, Sarah Hasted, who was as enthusiastic about Witkin’s controversial work as I was and was also a personal friend of his.  She thought that because of my interest in his work, knowledge of art history, experiences (personal and scholarly) with disability, and, above all, because of my body, Joel and I should meet and collaborate on a photograph.  I was eager to serve as his model.  I felt that while arguing that self-display for disabled people, as well as other individuals, can be a liberating personal and political act, I should have the experience myself, or in other words, I should put my body where my mouth was.  After much correspondence and many sketches, in the Spring of 2007, I traveled to Albuquerque, NM to meet him and to become a performing agent in one of his tableaux.

Witkin self portrait (1995).

I wrote about my many experiences in my journal and later in my book.  The long weekend is now a blur, but I recall specific details: visiting with Witkin’s horses and dogs earlier on the day of the shoot; befriending his wife, Barbara; taking off my prosthesis and my clothes, yet feeling no embarrassment; being painted white to replicate the color of marble sculpture; and posing beside another nude model for different shots.  Covered in body paint, I almost felt costumed, and as time passed and I posed with other models and in front of photography professionals, I felt less self-conscious.  Being posed as an eye-catching detail in the photograph, I felt picturesque.  I remember how Witkin would become animated: “That’s it!” he’d exclaim, with almost orgasmic excitement.  Yet it was all business for him.  He was creating his work, which was the source of his fiery pleasure, and we were actors playing roles.

The resulting photograph is titled Retablo (New Mexico) (2007), referencing Latin American, Catholic folk art traditions (and, for me, many self-portraits by Frida Kahlo).  The image was conceived when Witkin saw a retablo image featuring two lesbians embracing, wearing only thongs, and posing above the following retablo prayer:

San Sebastian, I offer you this retablo because Veronica agreed to come live with me. We are thankful to you for granting us this happiness without having to hide from society to have our relationship. Sylvia M. (translation)

Ann Millett-Gallant at her computer.

Witkin’s photograph also contains this prayer and, of course, fabulist imagery.  It is based on this and other similar retablos, printed in France, of homosexuals giving thanks to God and to saints for graces received in their lives. In Witkin’s version, Duccio’s Christ resists Lucifer’s temptations after viewing the future of the world, which includes the tragedy of 9/11.  Witkin’s composition features a triumphant female nude figure as Veronica, displaying her corporeal glory and gazing down at her lover, Sylvia, a seated nude figure (me), beside her.  We are staged on a pedestal covered in flowing drapery and in front of an elaborate backdrop, which includes a photograph of the same model in a characteristic St. Sebastian pose and a painted, shadowed, and winged form confronting a hand of salvation.  An iconographic reminder of death and a warning symbol of righteousness, a skeleton, lounges comically on the left side of the scene.  I cannot logically explain the photograph, as it defies a central narrative.  It is far more sensory than sensible.  I have my back to the camera and am seated on my two shortened legs (one congenitally amputated above the knee and one below), as I extend my “deformed,” or here fabulist/fabulous, arms.  The female figures are opposed in their positions: one flaunting the front of her nude body, the other much smaller and flaunting her back.  The two bodies complement one another and complete a disfigured, heavenly narrative. Witkin said he especially, aesthetically admired my back, which inspired the pose.  This seated figure that is me is magical and all-powerful; as viewers stare at my back, I stare back.  Like the other models in my book, I perform for my readers/viewers.  Life becomes art.  The photograph epitomizes the Art of Life for me.

Today, a print of the photograph hangs in my living room, another image of Witkin’s graces the cover of my book, I refer to the photographer as Joel, and Paul, my companion on the trip who served as Joel’s assistant, is now my husband.

Environmentalism and the Future

by Matt McKinnon

Let me begin by stating that I consider myself an environmentalist.  I recycle almost religiously.  I compost obsessively.  I keep the thermostat low in winter and high in summer.  I try to limit how much I drive, but as the chauffeur for my three school-age sons, I find this quite difficult.  I support environmental causes and organizations when I can, having been a member of the Sierra Club and the Audubon Society.

I find the arguments of the Climate Change deniers uninformed at best and disingenuous at worst.  Likewise, the idea of certain religious conservatives that it is hubris to believe that humans can have such a large effect on God’s creation strikes me as theologically silly and even dishonest.  And while I understand and even sympathize with the concerns of those folks whose businesses and livelihoods are tied to our current fossil-fuel addiction, I find their arguments that economic interests should override environmental concerns to be lacking in both ethics and basic forethought.

That being said, I have lately begun to ponder not just the ultimate intentions and goals of the environmental movement, but the very future of our planet.

Earth and atmospheric scientists tell us that the earth’s temperature is increasing, most probably as a result of human activity.  And that even if we severely limited that activity (which we are almost certainly not going to do anytime soon), the consequences are going to be dire: rising temperatures will lead to more severe storms, melting polar ice caps, melting permafrost (which in turn will lead to the release of even more carbon dioxide, increasing the warming), rising ocean levels, lowering of the oceans’ pH levels (resulting in the extinction of the coral reefs), devastating floods in some places along with crippling droughts in others.

And according to a 2007 report by the Intergovernmental Panel on Climate Change, by 2100 (less than 100 years from now) 25% of all species of plants and land animals may be extinct.

Basically, our not-too-distant future may be an earth that cannot support human life.

Now, in my more misanthropic moments, I have allowed myself to indulge in the idea that this is exactly what the earth needs.  That this in fact should be the goal of any true environmental concern: the extinction of humanity.  For only then does the earth as a planet capable of supporting other life stand a chance.  (After all, the “environment” will survive without life, though it won’t be an especially nice place to visit, much less inhabit, especially for a human.)

And a good case can be made that humans have been destroying the environment in asymmetrical and irrevocable ways since at least the Neolithic Age, when we moved from hunter-gatherer culture to the domestication of plants and animals along with sustained agriculture.  Humans have been damaging the environment ever since.  (Unlike the beaver, as only one example of a “keystone species,” whose effect on the environment in dam building has an overwhelmingly positive and beneficial impact on countless other species as well as the environment itself.)

So unless we’re seriously considering a conservation movement that takes us back to the Paleolithic Era instead of simply reducing our current use and misuse of the earth, we’re really just putting off the inevitable.

But all that being said, whatever the state of our not-too-distant future, the inevitability of the “distant future” is undeniable—for humans, as well as beavers and all plants and animals, and ultimately the earth itself.  For the earth, like all of its living inhabitants, has a finite future.

Around 7.5 billion years or so is a reasonable estimate.  And then the earth will most probably be absorbed by the sun, which will have swollen into a red giant.

(Unless, as some scientists predict, the Milky Way collides with the Andromeda galaxy, resulting in cataclysmic effects that cannot be predicted.)

At best, however, this future only includes the possibility of earth supporting life for another billion years or so.  For by then, the sun’s increasing brightness will have evaporated all of the oceans.

Of course, long before that, the level of carbon dioxide in the atmosphere (ironically enough) will have diminished well below the quantity needed to support plant life, destroying the food chain and causing the extinction of all animal species as well.

And while that’s not good news, the worse news is that humans will have been removed from the equation long before the last holdouts of carbon-based life-forms eventually capitulate.

(Ok, so some microbes may be able to withstand the dry inhospitable conditions of desert earth, but seriously, who cares about the survival of microbes?)

Now if we’re optimistic about all of this (irony intended), the best-case scenario is for an earth that is able to support life as we know it for at most another half billion years.  (Though this may be a stretch.)  And while that seems like a really long time, we should consider that the earth has already been inhabited for just over three and a half billion years.

So having only a half billion years left is sort of like trying to enjoy the last afternoon of a four-day vacation.

Enjoy the rest of your day.

All Hallows Eve…and Errors

by Matt McKinnon

All Hallows Eve, or Hallowe’en for short, is one of the most controversial and misunderstood holidays celebrated in the United States—its controversy owing in large part to its misunderstanding.  More so than the recent “War on Christmas” that may or may not be raging across the country, or the most important of all Christian holidays—Easter—blatantly named after the pagan goddess (Eostre), Halloween tends to separate Americans into those who enjoy it and find it harmless and those who disdain it and find it demonic.  Interestingly enough, both groups tend to base their ideas about Halloween on the same erroneous “facts” about its origins.

A quick perusal of the internet (once you have gotten by the commercialized sites selling costumes and the like) will offer the following generalizations about the holiday, taken for granted by most folks as historical truth.

Common ideas from a secular and/or Neopagan perspective:

  • Halloween developed from the pan-Celtic feast of Samhain (pronounced “sow-in”)
  • Samhain was the Celtic equivalent of New Year’s
  • This was a time when the veil between the living and dead was lifted
  • It becomes Christianized as “All Hallows Day” (All Saints Day)
  • The eve of this holy day remained essentially Pagan
  • Celebrating Halloween is innocent fun

Common ideas from an Evangelical Christian perspective (which would accept the first five of the above):

  • Halloween is Pagan in origin and outlook
  • It became intertwined with the “Catholic” All Saints Day
  • It celebrates evil and/or the Devil
  • It glorifies death and the macabre
  • Celebrating Halloween is blasphemous, idolatrous, and harmful

Even more “respectable” sites like those from History.com and the Library of Congress continue to perpetuate the Pagan-turned-Christian history of Halloween despite scarce evidence to support it, and considerable reason to be suspicious of it.

To be sure, like most legends, this “history” of Halloween contains some kernel of fact, though, again like most things, its true history is much more convoluted and complex.

The problem with Halloween and its Celtic origins is that the Celts were a semi-literate people who left only some inscriptions: all the writings we have about the pre-Christian Celts (the pagans) are the product of Christians, who may or may not have been completely faithful in their description and interpretation.  Indeed, all of the resources for ancient Irish mythology are medieval documents (the earliest being from the 11th century—some 600 years after Christianity had been introduced to Ireland).

It may be the case that Samhain indeed marked the Irish commemoration of the change in seasons “when the summer goes to its rest,” as the Medieval Irish tale “The Tain” records.  (Note, however, that our source here is only from the 12th century, and is specific to Ireland.)  The problem is that the historical evidence is not so neat.

A heavy migration of Irish to the Scottish Highlands and Islands in the early Middle Ages introduced the celebration of Samhain there, but the earliest Welsh (also Celtic) records afford no significance to the same dates.  Nor is there any indication that there was a counterpart to this celebration in Anglo-Saxon England from the same period.

So the best we can say is that, by the 10th century or so, Samhain was established as an Irish holiday denoting the end of summer and the beginning of winter, but that there is no evidence that November 1 was a major pan-Celtic festival, and that even where it was celebrated (Ireland, the Scottish Highlands and Islands), it does not appear to have had any religious significance or attributes.

As if the supposed Celtic origins of the holiday were not uncertain enough, its “Christianization” by a Roman Church determined to stamp out ties to a pagan past is even more problematic.

It is assumed that because the Western Christian churches now celebrate All Saints Day on November 1st—with the addition of the Roman Catholic All Souls Day on November 2nd—there must have been an attempt by the clergy of the new religion to co-opt and supplant the holy days of the old.  After all, the celebrations of the death of the saints and of all Christians seem to directly correlate with the accumulated medieval suggestions that Samhain celebrated the end and the beginning of all things, and recognized a lifting of the veil between the natural and supernatural worlds.

The problem is that All Saints Day was first established by Pope Boniface IV on 13 May, 609 (or 610) when he consecrated the Pantheon at Rome.  It continued to be celebrated in Rome on 13 May, but was also celebrated at various other times in other parts of the Western Church, according to local usage (the medieval Irish church celebrated All Saints Day on April 20th).

Its Roman celebration was moved to 1 November during the reign of Pope Gregory III (d. 741), though with no suggestion that this was an attempt to co-opt the pagan holiday of Samhain.  In fact, there is evidence that the November date was already being kept by some churches in England and Germany as well as the Frankish kingdom, and that the date itself is most probably of Northern German origin.

Thus the idea that the celebration of All Saints Day on November 1st had anything to do either with Celtic influence or Roman concern to supersede the pagan Samhain has no historical basis: instead, Roman and Celtic Christianity followed the lead of the Germanic tradition, the reason for which is lost to history.

The English historian Ronald Hutton concludes that, while there is no doubt that the beginning of November was the time of a major pagan festival that was celebrated in all of the pastoral areas of the British Isles, there is no evidence that it was connected with the dead, and no proof that it celebrated the new year.

By the end of the Middle Ages, however, Halloween—as a Christian festival of the dead—had developed into a major public holiday of revelry, drink, and frolicking, with food and bonfires, and the practice of “souling” (a precursor to our modern trick-or-treating?) culminating in the most important ritual of all: the ringing of church bells to comfort the souls of people in purgatory.

The antics and festivities that most resemble our modern Halloween celebrations come directly from this medieval Christian holiday: the mummers and guisers (performers in disguise) of the winter festivals also being active at this time, and the practice of “souling” where children would go around soliciting “soul cakes” representing souls being freed from purgatory.

The tricks and pranks and carrying of vegetables (originally turnips) carved with scary faces (our jack o’ lanterns) are not attested to until the nineteenth century, so their direct link with earlier pagan practices is sketchy at best.

While the Celtic origins of Samhain may have had some influence on the celebration of Halloween as it began to take shape during the Middle Ages, the Catholic Christian culture of the Middle Ages had a much more profound effect, in which the ancient notion of the spiritual quality of the dates October 31st/November 1st became specifically associated with death—and later with the macabre of more recent times.

Thus modern Halloween is more directly a product of the Christian Middle Ages than it is of Celtic Paganism.  That some deny its rootedness in Christianity while others deride its essence as pagan says more about how these groups feel about medieval Catholic Christianity than about Celtic Paganism (about which we know so very little).

And to the extent that we fail to realize just how Christian many of these practices and festivities were, we fail to see how successful the Reformation and the later movements of pietism and rationalism have been in redefining exactly what “Christian” is.

As such, Halloween is no less Christian and no more Pagan than either Christmas or Easter.

Happy trick-or-treating!