Tag Archives: Matt McKinnon

¿Habla American?: Why English as an Official Language is Blatantly Un-American

by Matt McKinnon

We the People...

Nosotros, el Pueblo…

I’m no fan of corporations.  In fact, I am often critical of them and the too-big-to-fail capitalism that has come to dominate global economics. But I am willing to congratulate them on the off-chance that they do something good or get something right.

Like the Cheerios commercials featuring a multi-ethnic girl with her family that prompted racist hate-speech from trolls everywhere. Or the recent revelation that multinational corporations are taking climate change seriously, since it poses a real economic threat to them.

Or when Coca-Cola broadcast this advertisement during the Super Bowl:

(Coke doubled-down amidst much criticism to play it again for the Winter Olympics.)

Now, I’m no dummy, and I’m certainly not naïve. I realize that the folks at Coca-Cola are first and foremost interested in their bottom line, and that means selling more Coke. And as we are all aware by now, the United States is undergoing a considerable demographic shift, so much so that white people will no longer be the majority by 2043. And more to the point: white kids will no longer make up a majority of youth in five or six years. Yes, five or six years! Which is why companies like Coca-Cola are so interested in multicultural approaches to advertising.

So yes, I know all this, and yet still find it laudable (1) that Coca-Cola produced the commercial, and (2) that they stood by it despite heavy criticism.

But enough about Coke. My real interest is the criticism that was generated by having non-white U.S. citizens sing a patriotic song in a language other than English. And the next logical step that many critics make: viz., that English should be the official language of the United States.

This impulse is nothing new. Nor is the fear and prejudice behind it.

Benjamin Franklin.

Our brilliant and esteemed Founding Father Benjamin Franklin railed, with surprisingly racist overtones, against the unwanted influence of what he called “swarthy German” immigrants:

“Why should Pennsylvania, founded by the English, become a Colony of Aliens, who will shortly be so numerous as to Germanize us instead of our Anglifying them, and will never adopt our Language or Customs, any more than they can acquire our Complexion.”

(Indeed: Who knew that only the English were truly white?)

Of course, Franklin was wrong then, as those who criticize the Coke ad and call for English as our official language are wrong now. They are wrong for a practical reason based in historical fact: the new German immigrants did not “Germanize” the English, despite the fact that more Americans now claim German ancestry than any other ethnic or national group. No, they learned English because it was practical to do so, though some retained the use of their native tongue well into the 20th century.

Likewise, studies show that recent immigrants are assimilating in similar fashion, just as immigrants have been doing since, well, the original English came over and ironically did not assimilate into existing native cultures.

And this means that they are learning English.

Loosing my Espanish, by H.G. Carrillo (2004).

A Pew study found that 91% of second-generation children from Hispanic families speak English “very well or pretty well” and that the number rises to 97% of third-generation children. Indeed, other studies show that not only are second and third generations learning English, they are more likely than not to learn only English—and not the language of their parents’ or grandparents’ homeland.

But there is another—deeper and more essential—reason why English is not and should not be the official language of our land. And while this argument could be made from the liberal and progressive “love-for-all-things-multicultural” perspective worthy of this “liberal studies” blog, the stronger argument is actually one more conservative in nature, rooted as it is in the very fabric of our democracy, in what it means to be American.

The argument is simple: making English, or any language, the Official Language of the United States is blatantly Un-American at its core.

In fact, the late conservative writer Joseph Sobran made a similar argument some thirty years or so ago, to the chagrin of some whose conservative principles only went as deep as their nationalism. (This was the same Joe Sobran whom Pat Buchanan called “perhaps the finest columnist of our generation” and Ann Coulter named “the world’s greatest writer” and the “G.K. Chesterton of our time.”)

Joseph Sobran.

The point is twofold: First, from a conservative perspective, government should be limited and should only be about the business of governing—not social engineering. Mandating that Americans learn and use English is as absurd from a conservative viewpoint as mandating that they learn and use French, or that they eat their vegetables and lay off the Supersized fries and soda. This, argues conservatism, is simply not the purview of government, and it doesn’t matter whether learning English or eating broccoli are good ideas or not (as I think they both are). What matters is that this is not the government’s responsibility to decide or dictate.

And second, a government “of the people, by the people, and for the people” should, as much as is possible, reflect the majority of the people, while safeguarding the rights of the minority. But such a reflection, like the people it reflects, is in a constant state of change.

So in this case, what could be more basic than the right to express oneself in the language of one’s choice? And what could be more democratic than a government committed to accommodating that language—those languages—and to understanding and communicating with its own citizens?

For what right is more basic than the choice of language? Freedom of speech? Freedom of the press? Freedom of religion? All of these are secondary, at least temporally, to the choice of language whereby one speaks, or publishes, or prays aloud to their God.

Indeed, the only act synchronous to that of speaking is the forming of one’s very thoughts. And yet, even here, do we really decide what language we use to form our thoughts? Or does our language shape our thoughts and even ourselves?

If so, what would it mean that for some U.S. citizens, their very thoughts are formed in an unofficial language?

Government should not be in the business of constraining either the free thought or the free expression of its citizens.

"We speak English."

“We speak English.”

Furthermore, the fact of English as our common language is an accident of history. Not only are we the largest English-speaking nation in the world, we are also the second largest Spanish-speaking nation (second only to Mexico). And what is more democratic than a common language that actually resonates with the voice of the people? If Spanish, or French, or Chinese should one day become the preferred language of a majority of U.S. citizens, how undemocratic would it be that their free and common expression would be constrained by the short-sightedness of having made English the Official Language?

To extrapolate from James Madison’s argument against the state establishment of Christianity in his Memorial and Remonstrance: any government that can today make English the official language can tomorrow replace it with Spanish or Arabic.

***

This is what it means to be American: to have the right to express oneself in whatever language one chooses, be it for business, or entertainment, or religion, or school—ever mindful of the need to balance this with the necessity of being understood.

As Americans, we lack an ethnic identity. And we lack an established religion. And we lack an official language.

But we are united as a people precisely because we lack these. Since our ethnic identities and religions and languages are plural. As are we.

But in this plurality there is strength.

And from this plurality there is unity.

Or, as our Founding Fathers put it,

Dean Bryant Johnson, “E Pluribus Unum” (2012), detail.

E pluribus unum.

(And that ain’t English.)

HousingFest 2014: February 22 in Charlotte

by Matt McKinnon

Cold.

Colder than this.

It’s been cold this week in Peoria, where I currently reside.  Real cold.  Mind-numbing, face-hurting, frost-biting cold.  The high on Monday was 5 degrees Fahrenheit.  The low on Tuesday was 5 below.  And those were the actual temperatures.  The wind chills with the latest Arctic blast regularly approached 30 below.  The schools have yet to be closed for snow since we moved out here three years ago, but have now been closed three times due to cold temperatures.  Evidently, with wind chills so low, frostbite can occur within ten minutes or so.  And no one wants to be responsible for turning Peoria’s schoolchildren into a bunch of kidsicles.

I bring this up because it seems that these extremely low temperatures, along with Thanksgiving and Christmas, are the rare annual events that cause most folks to think about the homeless.  I mean, I almost froze to death running outside in my pajamas to set the garbage can back up—I can only imagine what folks who lack adequate housing have to put up with in these conditions.

Homeless.

Last January, the U.S. Government reported that some 633,782 people in the United States were homeless (62,619 of whom were veterans).  The good news is that this number has decreased since 2007: down 6.8% among individuals, down 3.7% among families, down 13.1% among unsheltered homeless, and down 19.3% among the chronically homeless.

And it is this last group I want to focus on.

According to the U.S. Government, a “chronically homeless” person is defined as “an unaccompanied homeless individual with a disabling condition who has either been continuously homeless for a year or more, or has had at least four episodes of homelessness in the past three years.”  Data suggests that the chronically homeless only make up some 10% of all homeless people, but may account for as much as 50% of the services provided (though how best to interpret these numbers is debated).  What can be said with certainty is that chronic homelessness is a major problem with complex causes that needs to be addressed on a number of fronts.

Luckily, there are organizations like the Urban Ministry Center in Charlotte, NC, attempting to do just that—offering food, shelter, treatment, and community to some five to six hundred area homeless people.  According to the National Alliance to End Homelessness, the best way to help those who are chronically homeless is a Housing First approach coupling permanent housing with supportive services (such as case management, mental health and substance abuse services, health care, and employment).

This makes sense: What a chronically homeless person needs most is a home, not just a temporary shelter.  They need the stability and security that permanent supportive housing provides—creating an environment in which the other contributing issues (those that exacerbated or caused their homelessness in the first place) can be addressed.

This is precisely what the Urban Ministry Center’s HousingWorks program does: “(G)ive chronically homeless individuals what they need most—a safe, stable, affordable home—and then provide the wrap-around support to help them remain housed and regain lives of wellness and dignity.”

This philosophy of “Housing First” recognizes the complexity of chronic homelessness and seeks a solution based on the premise that housing is a fundamental right, regardless of a person’s mental health, physical condition, or addiction.  It’s a simple idea: get folks into housing first—provide a safe and secure home—and then work on whatever issues need attention in order to ensure that the person remains in housing.

After all, their website asks: “Who among us could tackle sobriety while sleeping under a bridge?”

Housing First has become a major movement in the United States in the search for solutions to the problem of chronic homelessness, as both National Public Radio and the Wall Street Journal have reported.  Indeed, the WSJ article reveals that, according to a Chicago-based study, providing permanent housing to the chronically homeless improves (and saves) lives as well as saving taxpayer money.

In Charlotte alone, the cost of temporary shelter, emergency room treatment, and hospital and jail stays is over $39,000.00 a year—for each chronically homeless person.  That’s a cost that the community pays. In other words, it comes mostly out of our tax dollars.  Compare that with the $13,983 annual cost that Charlotte’s HousingWorks program spends to provide both permanent housing and case management (including services like medical, mental health, and addiction treatment) for the same homeless individual.  Less money to be sure, but the real savings is in the improvement of people’s quality of life and the success in providing meaningful treatment that saves and changes lives.
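
For those who like to see the arithmetic spelled out, here is a minimal back-of-the-envelope sketch using the two per-person figures cited above (both numbers are the approximate ones quoted in this post, and actual costs vary from person to person):

```python
# Rough per-person comparison using the figures quoted above.
# These are approximate, illustrative numbers, not official program data.
status_quo_cost = 39_000      # shelter, ER, hospital, and jail stays per person per year
housing_works_cost = 13_983   # permanent housing plus case management per person per year

savings = status_quo_cost - housing_works_cost
print(f"Estimated savings per chronically homeless person: ${savings:,} per year")
# -> Estimated savings per chronically homeless person: $25,017 per year
```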

It’s just this simple: “HousingWorks saves lives and saves our community money.”

The Blind Boys of Alabama.

To that end, the Urban Ministry Center is having a benefit concert to help end chronic homelessness in Charlotte—its first annual HousingFest.  On February 22, 2014, the Grammy Award-winning group The Blind Boys of Alabama and Grammy Award-winning singer-songwriter Jim Lauderdale will be performing at the Neighborhood Theater in Charlotte, NC.  Tickets for this worthwhile event are only $25.00 and can be purchased through the Neighborhood Theater’s website.

Jim Lauderdale.

If you’re in the Charlotte area, I urge you to support this important cause and enjoy some outstanding musical entertainment in the process. After all, homelessness is a chronic issue, regardless of the time of year and the weather.

Here’s a chance to help do something about it.

Editor’s note: A big, hearty welcome to those of you who are following us since last week’s post went viral (over half a million hits)! Our contributors are the faculty and (occasionally) students and alumni of the BLS Program at UNCG, which is an online multidisciplinary program for nontraditional students. With people from so many different disciplines coming together to write about whatever inspires them, this blog is rather eclectic. I think Matt’s post this week makes it pretty clear that we’re not just a parenting blog, though we do have quite a few parents and they do write about the topic from time to time. We may have a post on co-sleeping for you in a couple of weeks.

Feel free to browse back through our back posts. I’ve been working on the tags (which are inconsistent at best); you can now click on an author’s name in the tags to see all that author’s posts. I just looked at the “parenting” tag and realized it needs a lot of work, so don’t rely on that one just now. Enjoy! -JP

Santa, Jesus, and Race

by Matt McKinnon

So, as I’m sure most of you have heard, Santa Claus and Jesus are both White.  They just are.  And these are historical facts.

Megyn Kelly.

Or so proclaimed Fox television personality Megyn Kelly on her show The Kelly File, while discussing an article by Slate contributor Aisha Harris proposing that the beloved American icon be “made over” in a more culturally neutral manner. (Harris’s suggestion is a cartoon gift-bearing penguin complete with red suit and obviously fake beard.)

What ensued was a predictable lampooning by late-night comedians and news commentators, followed by Fox News’ just as predictable and recalcitrant dismissal of a “liberal media’s” obsession with race and race-baiting.  For her part, Ms. Kelly refused to apologize for her position, calling the entire episode “tongue-in-cheek,” though she did acknowledge that the issue of Jesus’ ethnicity is “unsettled.”

Harris’ Santa (illustration by Mark Stamaty).

There are many issues here that deserve commentary and discussion, not the least of which is Ms. Harris’ initial suggestion that an ethnically-neutral Santa Claus is much better suited for a multi-ethnic culture like that of the contemporary United States than the image of a fat white man landing on rooftops and essentially performing what would be interpreted as “breaking and entering” in most legal jurisdictions.  (And at least one comedian has suggested the ramifications for a black counterpart, given the Stand Your Ground laws in places like Florida.)

But I, like most in the entertainment and media industries, am more interested in what Kelly said, and how she said it, than in what precipitated it.

Kelly, of course, was simply pointing out what most Americans probably assume: Santa Claus as a fictionalized character is based on the historical fourth century Christian bishop Saint Nicholas of Myra.  Santa has always been portrayed as a white man because, duh, his likeness is based on a real white man.  The notion that he can be portrayed any other way than that of mainstream American culture and its hegemonic ethnicity (white) is practically unthinkable—at least by white folks in the mainstream.

In fact, when watching the video from the show, it is more Kelly’s insistence on the brute factuality of Santa’s (and Jesus’s) ethnicity that is so problematic—not necessarily her position itself.

And on this subject, a few ideas deserve discussion.

First and most obvious is that what the historical Saint Nicholas and Jesus both share is, if not a common ethnicity, then at least a common geography and culture—that of the Hellenistic Near East.  And while Nicholas was most probably a Greek from “Asia Minor,” and Jesus a Palestinian Jew, neither of them would have considered himself “white”—and would probably not be considered so by today’s use of the term.  So if Santa Claus is simply a reinterpretation of the Anatolian bishop Nicholas of Myra, then Ms. Kelly is mistaken: neither he (nor Jesus) is white.

They just aren’t.

A forensic reconstruction of St. Nicholas’ face, surrounded by his image in traditional icons.

But, without getting into the specifics, our Santa Claus’s development most probably owes more to Pagan characters, practices, and legends than he does to the Christian bishop of Myra.  And so, arguably, on this point, Kelly is correct: Santa Claus, inasmuch as he evolves from Germanic mythology, is “white” or Northern European (though the same cannot be said for Jesus).

A Medieval European “Wild Man.”

Of course, the real issue in all of this is the assumption that Santa is white—must be white, can only be white—whether rooted in history or mythology.  And that is the assumption of hegemony: Where whiteness is the presumed sole and legitimate norm.  Where the way things are is the way they have been and should be.

And pointing this out to someone who is oblivious to the ubiquity of whiteness in our culture can be awkward, unsettling, and even shocking.

Hence the insistence that Santa, like Jesus, is white—without any thought or consideration that such a proclamation may be controversial or that it could possibly be historically and culturally conditioned—tells us more about the speaker, her audience, and our culture than it does about either Santa or Jesus.

But this is not to belittle Ms. Kelly or chasten her about her firmly-held beliefs and ethnic identity.  For the real “White Man’s (sic) Burden” is not to rule over and “civilize” the non-White world, but rather to recognize and confront examples of “White Privilege” in our pluralistic and multi-ethnic culture.  And while there are much more important aspects of white privilege in present day America (like being arrested, prosecuted, convicted, and sentenced to jail far less often than non-whites for similar crimes), the ubiquity of whiteness as the aesthetical norm is not to be dismissed.

But this is easier said than done, since, while the ubiquity of whiteness is quite obvious to most non-white folks, it tends to be invisible to most whites.

And the same thing that happened to Ms. Kelly happened to me, though I was seven years old and in the second grade at the time.

I had the same reaction to the “Black Santa” that my African-American teacher Mrs. Watson had put up on her classroom wall: “Santa Claus is not black.  He’s white,” I thought to myself, aghast at the suggestion.  Luckily, I didn’t have a television show to proclaim this, and didn’t actually articulate it to any of my classmates either.

But the memory has stayed with me—as one where, for the first time, I was confronted with the hegemony of my own culture: me, a little white boy bused from my mostly white neighborhood to school in the projects: a minority among minorities, but whose whiteness was still the norm.

Or should be, as I mistakenly assumed then.

Still from the Good Times episode “Black Jesus” (1974).

And around the same time (1974) and in much the same way, I was confronted by the “Black Jesus” episode on Good Times, where JJ had used Ned the Wino as the model for his painting of the Christ, since—being passed out in the gutter—he was the only one who could hold the pose long enough for JJ to paint.

Much later, of course, I was confronted by a long history of Jesus being portrayed in the varying ethnicity of the portrayers—across the centuries, from Asia to Africa to Europe and the Americas.

Jesus in Ethiopian and Chinese depictions.

And then by the African-American Liberation Theologian James Cone’s insistence that “Jesus is a black man,” in that, according to Christian theology, God comes to his people in the guise of the most repressed and outcast among us.  And in the United States, the reasoning goes, who has been more marginalized than the African slave?

But, arguably, I may not have been as sympathetic and understanding of art history or liberation theology had I not first been confronted with the privileged place of my whiteness in American culture by folks like Mrs. Watson, and the producers, writers, and cast of Good Times.

So what is troubling in Ms. Kelly’s remarks is not her assumption that both Santa and Jesus are white—but her insistence that they are, an insistence that suggests that she has probably never been confronted by the hegemony of her whiteness or the problems that such an unyielding hegemony can produce, at least in a multi-ethnic culture like ours where the power to portray is the power to influence.

Troubling, too, is the catering to the very understandable perspective of a certain portion of the American public that times are changing, have changed, and that “their America,” for better or worse, is gone and not coming back.  And while such a perspective is frightening to those who hold it and should be met with sympathy and sensitivity in order to soothe and defuse it, it is usually met with a demagoguery of entrenchment from the one side and outright scorn from the other.

Where are Mrs. Watson and JJ when we need them?

(Of course, I must admit, I don’t particularly care for the Penguin Santa myself, whatever ethnicity it is.)

Who Am I? (On Genealogy and Genetic Ancestry)

by Matt McKinnon

Who am I?

I have long been pestered by this question, seeking the answer not in a litany of likes and dislikes or the self-obsessed perspective that modern Western consumerist culture offers me.  But neither in the personal history of myself—where I’m from, where I’ve been, and so on.  Or even less in my career, my “profession,” what I do to make enough money to live comfortably and raise a family.

No, my interest in identity has been more in my genealogy, my distant past, and what we now call “deep genealogy”—the history of my DNA, that mysterious code I have no control over but that dictates much of who I am.

The more I have sought answers in these two areas, the more I have come to realize that they are decidedly different—that my genealogy (the relatively recent history of my family and ancestors) and my “deep genealogy” (the origins and history of my DNA) offer two quite different portraits—even though the latter, after tens of thousands of years, ultimately leads to the former.

But that’s the key: after tens of thousands of years.

I remember my first dabbling in genealogy when I was in high school: I had always known that my name, McKinnon—or rather MacKinnon—was Scottish in origin.  I had been told by my family that we were mostly “Scots-Irish,” a term which, I came to find out later, is basically an American invention used rarely if ever in either Scotland or Ireland.  It can denote the Ulster Scots whom the English used to colonize Northern Ireland in the 17th century (and are thus not “genetically” Irish at all), or Lowland Scots and those of the Borderlands between Scotland and England.

But a little research soon proved that the MacKinnon name is Highland, not Lowland or Border, and certainly not “Scots-Irish.”  The Highlands and Islands of Scotland are mostly Gaelic, and hence Celtic in origin, while the Scots of the Lowlands are a mix of Celtic, Roman, German, English, Scandinavian, Irish, and Scottish in varying amounts.  And since our most recent Scottish ancestor was a MacKinnon who left the Isle of Skye sometime in the late 18th or early 19th century, my Highland ancestry was confirmed.

So I spent the rest of my high school days donning plaid scarves, Shetland wool sweaters, and Harris Tweed caps and playing records of bagpipe music at home that frightened the cat and annoyed my parents and siblings to no end.

But deep down, I knew that this was not answer enough.  Indeed, ethnic identity continued to elude me and offer more questions than answers.

And it still does, even after countless hours spent researching family history and genealogy, and hundreds of dollars spent on research and DNA analysis.  Perhaps my developing awareness of the fragmentary and somewhat arbitrary nature of what we call “history” has made my search one of exponential questions instead of hard and fast answers.

For what we call “Celtic” is in fact a linguistic designation, like (and related to) “Germanic” or “Balto-Slavic.”  These are first and foremost language identifiers and not “genetic” ones.

So MacKinnon, being a Highland name, at least designates my ethnic identity as Celtic, right?

Perhaps.  At least to some extent.  But what does that really mean?

After all, these groups—Celtic, Germanic, Balto-Slavic, Italic—are only Indo-European linguistic identifiers with origins in a shared Proto-Indo-European population of tribes who inhabited Europe most probably during the late Neolithic Age (circa 4000 BCE).  Only then did these peoples begin their various migrations north and west as they differentiated into the more well-known (if often mistakenly applied) names like the Celts, Germans, Slavs, Romans, etc…

The point being that any location of one’s ancestry as “Scottish,” or “Highland,” or “Gaelic,” or “Celtic,” or, for that matter, “Germanic” or “Balto-Slavic” is rather arbitrary, in that it assigns prominence to one moment in a wave of modern human migration that began in Africa some 70,000 years ago and arrived on the Pontic-Caspian steppe in what is today Eastern Europe about 30,000 years later.  From there, these various groups migrated in all directions, as wave after wave of tribes populated Europe, developing different cultures and languages, though all sharing the same not-too-distant Indo-European past.

(It is interesting to note as well that these folks only started to look “European,” i.e., “white” around 11,000 BCE.)

So that Highland MacKinnon ancestry I was so sure about?  Well, it turns out that a deep DNA analysis confirms my paternal lineage (the Y-chromosome of my father’s father’s father’s father’s father…all the way back to its beginning) to be that of Haplogroup (I won’t even get into it) I2, subgroup a2.

Haplogroup I began 30,000-40,000 years ago in Eastern Europe, with I1 and I2 diverging about 6,000 years later.  I2a arose about 11,000 years ago in the Balkans and is still today concentrated in Eastern Europe and Russia.  I2a2, that of my Highland Scots paternal DNA, only emerged some 7800 years ago, also in the Balkans, before starting its migration north into Central and Eastern Europe as well as Russia.

And, at some point, it became the DNA of a male member of a Celtic or perhaps Germanic tribe who ultimately made his way to Scotland.  And then passed it along to me.

So my Highland Scots DNA is actually Balkan in origin, and is shared by more Serbs and Croats and possibly even Russians than it is by my “fellow” Highlanders.

But if that’s not confusing enough, this only represents one line of grandfathers on my father’s side, going back roughly 8,000 years.  If we consider that there are approximately 400 generations between me and my Neolithic “European” ancestors, then the number of my direct relatives from present day all the way back to the New Stone Age is considerably large [nerdy editor’s note: large enough to need scientific notation: 2.58 x 10^120].
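
For anyone who wants to check the nerdy editor’s math, here is a minimal sketch of that calculation, assuming roughly 400 generations with an idealized two parents per generation (and ignoring the “pedigree collapse” that makes real family trees fold back on themselves):

```python
# Idealized count of ancestral "slots" going back ~400 generations:
# 2 parents, 4 grandparents, 8 great-grandparents, and so on.
# Real ancestries overlap heavily, which is exactly the point:
# the theoretical count dwarfs the number of humans who have ever lived.
generations = 400
ancestral_slots = 2 ** generations
print(f"2^{generations} is about {ancestral_slots:.2e}")
# -> 2^400 is about 2.58e+120
```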

But we need not go back that far to make my point: much of an individual’s “ethnic identity” is relatively arbitrary and tells precious little about their deep genetic makeup.

In calculating the rather complex mathematics of our ancestry, scientists have concluded that all modern humans are related to each other in the not too distant past—within a few hundred years in fact.  Steve Olson, writing in The Atlantic in 2002, reported that

  1. Everyone in the world is descended from Nefertiti and Confucius, and
  2. Everyone of European ancestry is descended from Muhammad and Charlemagne.

Grandma?

That would be everyone.

Which means that all modern humans alive today are related to each other—and related to each other rather recently, considering that modern humans have been in existence for about 100,000 years.

Indeed, everyone reading this post is probably at most my 20th cousin.

But you’re not invited over for Thanksgiving.

And I’m guessing you’re not all Highland Scots.

Shut Down

by Matt McKinnon

About a month and a half ago, I agreed—as part of my job—to write a contribution for the BLS blog, due by October 6th, and to be published shortly thereafter.  I agreed to this based on my understanding of what my job is, what it entails, the compensation I receive as a BLS instructor, and my belief that a community only works when its members participate in just that: a “communio” or sharing, from the Latin “union with.”  I made this agreement in good faith and free from constraint.  And, though some might argue this point, I made it being in sound mind and body.

But the situation has changed.

(The first image would be here if I were not shut down.)

I am not happy with the present way in which the elected officials of the State for whom I work have conducted business regarding the educational system within which I work.  In short, I disapprove of the massive cuts to higher education that the North Carolina State Legislature has made over the past several years.

Never mind that these folks have been duly elected by a legal process and have conducted this business in a manner consistent with the Constitutions of both the State and the Nation.

Never mind that “legal” does not necessarily mean “fair.”

Never mind that there are regular procedures in place to check the manner in which they do this business—that there is constitutional recourse to persuade, recall, impeach, or merely vote them out of office at the next election.

Never mind that what they have done is now “law”—and has become “law” in a legal and constitutional manner.

Never mind all of this because…well, I just do not agree with them or their “law.”

(The second image would be here if I were not shut down.)

And while I adhere to the principle that writing a blog entry is part  of my job, and that I have a duty to myself, to my institution, and to my students to faithfully execute the duties of my job, I have another principle that outweighs all of these:

If I do not get what I want, then I shut down.

(The third image would be here if I were not shut down.)

At this point, I am not even sure what would make me not shut down.  Or stop shutting down.  Or start back up.

At this point, I am not even sure what I hope to get out of shutting down.  Other than the shut down itself.

But none of that matters.

Because I have shut down.

So, until further notice—until an agreement can be reached that satisfies the righteousness of my indignation at the manner in which duly-elected officials representing the State by whom I am employed have conducted business in a lawful and constitutional and regular manner—until then, there will be no blog contribution.

I will not fulfill this part of my job.  I have deemed it “non-essential.”

There will be no witticisms or anecdotes about me, my classes, my life, or my family.

There will be no funny or interesting or bizarre pictures to punctuate my points.

There will be no weblinks to follow for more information—at least none supplied by me.

There will be none of this.

Because I am shut down.

(The fourth image would be here if I were not shut down.)

Of course, by shutting down and writing about how I am shutting down, I am still, technically, fulfilling some of my responsibilities and thus doing my job.  Therefore, I will continue to be paid and will continue to accept and spend my paycheck.

After all, shutting down is hard work.

The First Day of School

by Matt McKinnon

Well, it ain’t what it used to be.  The first day of school, I mean.

And I don’t just mean the back-to-school shopping, though that has changed a lot, to be sure.

We did most of ours online this year, since navigating Walmart.com is a LOT more appealing than navigating an actual Walmart.

And since many public schools have gone to uniforms, there’s not really much fun in back-to-school clothing shopping with the kids:

“How about the khaki pants and red polo shirt?”

“No, I won’t be caught dead in those.”

“Okay, then there’s always the red polo shirt and khaki pants.”

McKinnon Boys on First Day of School

Gone are the days, at least for those of us with kids in uniform schools, when back-to-school shopping was a creative endeavor to get the coolest outfits possible, actually enjoying the prospect of new clothes.

Toughskins jeans.  Converse Chuck Taylor hightops (later surpassed by real leather offerings from Adidas and Nike).  Cool football jerseys.  A new jean jacket.

Toughskins

Man, those were the days.

And it didn’t cost $250.00 to fully outfit two kids for the entire school year.  (Or at least until they get stains all over their shirts and wear holes in the knees of their pants.  Do kids not wear patches on pants anymore?)

And picking out your clothes for the first day of school was just as exciting, and became even more important the older you got.  After all, I had to make a nice impression on those 10-year-old girls I was not going to talk to.  Or even look at.

But now the shopping carts are virtual and the clothing is all the same: red polo shirts and khaki pants.  Maybe shorts.  If you’re feeling crazy…navy blue.

Of course, school supply shopping is still best done at an actual store, especially since the local Walmart and OfficeMax and Staples all have lists sent to them by the school district and even the local schools.  And then there’s the additional list that the teacher sends out.

Back to School Supplies

The cumulative effect of all this is that there are three lists for each of our two elementary-age kids that my wife and I have to carry around with a real shopping cart (the one with the wheel that won’t swivel right), juggling from one list to the other, trying to mark off what we have while we search for what we still need, all the while trying unsuccessfully to keep items not on the list out of the basket.  (How we ended up with a “Duck Dynasty” pillow in the cart I will never know.)

Not to mention that our high school junior is too cool even to shop with everybody else, so we had to make a special late-night black-ops trip, just he and I, outfitted in dark clothing and sunglasses, so no one he knows will see him…with his dad…shopping at Walmart of all places.

And not to mention that the entire school supply deal set us back about $150.00.  A hundred and fifty dollars?!  For notebooks and paper and pencils?

Yes.  And pens, and erasers, and binders in every size and color imaginable.  And glue and glue sticks.  And highlighters, and rulers, and note cards, and composition books.  And more binders.  And pencil boxes, no wait, they have to be bags with three holes to fit in the binder.  And lunch boxes.  And Clorox Wipes and Kleenex (are those really our responsibility?  Whatever happened to that green stuff the janitor would just spread around on the floor when some kid threw up?)  And we still can’t find any graph paper.  Does Walmart have something against graph paper?  Are American kids just not expected to plot graphs anymore?  No wonder we’re falling behind the rest of the developed world.  I bet they have graph paper in Sweden.

But I digress.

I’m not talking about any of that.

No, what I mean when I say that the first day of school ain’t what it used to be is that, as someone who taught mainly face-to-face classes for years but who now teaches entirely online, the first day of school just isn’t quite the same.

Now, don’t get me wrong: I am NOT complaining.

Just observing.  (I tell my wife this all the time.)

First Day of Class

There used to be a nervous energy about the first day of class—when that meant standing in front of a theatre-size room of 100 students or so.  There was electricity in seeing the fresh faces of students experiencing their very first day of college, or even in the nonchalant smoothness of seniors who had waited until the very last moment to complete their GEC credit.

There was magic in the anticipation of how hard the course might be, or how boring the professor was, or how anything I was saying would have any bearing on anyone’s intended career.

I used to enjoy coming up with new ways to start the first day: by proclaiming to the class, for example, that the only thing I hated more than the first day of class was…the next day of class.  Or by moving everybody outside to enjoy the weather.  Or even sitting at a desk like everybody else: just sitting, waiting, and watching as the time for class to start came and went, and still no teacher.  And then getting up abruptly, as if annoyed, audibly mumbling something to the effect that if nobody else is going to teach the damn course, then I might as well.

Yes, those were the days.

But those days are gone.

And again, don’t get me wrong: I am not complaining.  Only observing.

I love teaching online, and have come to see what we do in the BLS program as not just a service to the University, but more importantly, as a service to students—some of whom may not be able to take classes or finish their degree any other way.

And my students, overall, tend to be older, more mature, more driven, and actually interested in what is being taught.

And there is certainly energy and magic in the first day, though clicking on a link to make the course available doesn’t quite compare to bounding around a lecture hall like Phil Donahue in his prime.

No; it’s just not quite the same.

Even though this year I tried.

Fresh Shave and a Haircut

I got a haircut.  I took a shower.  Heck, I even shaved, and thought about adding some color to my graying beard before deciding against it.

And then I sat down, clicked on “Make Course Available,” and…

Well, nothing happened.  At least nothing spectacular.

For that, I’ll have to wait for the next 48 days—or however many are in this first session.

But of course, it’s not that bad…

After all, other than strippers, “escorts,” and the occasional politician, who else do you know can go to work not wearing pants?

Comforts of Home

Yes, there’s something to be said for the comforts of home.

“So, You Pastor a Church?”

by Matt McKinnon

I’m no pastor, and I don’t play one on TV.

It’s a question I used to get all the time, mostly from my family members and my wife’s.  Good, God-fearing folks (for the most part) who simply assumed that devoting one’s professional life to the study of religion must mean being a pastor—since “religion” must be synonymous with “church.”  Why else would someone spend upwards of eight years in school (after undergrad?!) studying various religions and even languages few people on earth still use?

And while one of my three degrees in religious studies is from a non-denominational “divinity” school (Yale) and my doctorate from a Roman Catholic university (Marquette), my degrees themselves are academic, preparations for scholarship in the academy and not the pulpit.  But that still hasn’t stopped folks from asking the above question, and has also led to invitations to offer prayer at family gatherings, read scripture at special events, and even give short homilies when the situation arises.

Now don’t get me wrong, there’s nothing wrong with being a pastor, or priest, or imam, or rabbi.  Plenty of good folks are in these lines of work, many of whom I have studied alongside in pursuing my education.  My wife’s cousin, in fact, is a Baptist preacher—a wonderful man who is much more qualified to pray and preach and—God forbid—counsel folks than I am.  So the problem is not my disdain for this profession: the problem is that it is not my profession.

But the real issue here is not what I do but rather the underlying problem that most folks have in understanding exactly what “religious studies” does—and how it is different from “theology” and the practice of religion.

This was never as clear as in the recent Fox News interview of religious studies scholar Reza Aslan about his new book on Jesus, “Zealot: The Life and Times of Jesus of Nazareth.”

Lauren Green

Never mind that Fox religion correspondent Lauren Green gives a horrible interview, spending much more time on what critics have to say about Aslan’s book than on the book itself.  For while this may be bad, even worse is that it becomes painfully clear that she probably has not read the book—and may not have perused even the first two pages.  But what is most troubling here is that the RELIGION CORRESPONDENT for a major news network is working with the same misunderstandings and ignorance of what exactly religious studies is and what religious studies scholars do as regular folks who are not RELIGION CORRESPONDENTS.

Aslan’s Zealot

Her assumption is that the story here, the big scoop, the underlying issue with Aslan’s book about Jesus is that…the author is a Muslim.  And not just a Muslim, but one who used to be a Christian.  Despite Aslan’s continued attempts to point out that he has a PhD in religious studies, has been studying religions for over twenty years, and has written many books dealing with Christianity, Islam, Judaism, and even Hinduism, Ms. Green cannot get past what she—and many of his critics—see as the real issue: he is a Muslim writing a “controversial” book about Jesus—the “founder” of Christianity as she calls him.

Now I put “controversial” in quotations because, as anyone even remotely aware of scholarship on Christianity knows, the most “controversial” of his claims are nothing new: scholars since the 19th century have been coming to many of the same conclusions that Aslan has come to.  And I put “founder” in quotations as well, since these same folks even tangentially aware of New Testament scholarship know that Jesus himself lived and died a Jew, and never “founded” a new religion.

Dr. Reza Aslan

Not being aware of any of this is not really the problem, but rather a symptom of the bigger issue: Ms. Green, like many folks, simply does not understand what the discipline of religious studies is, or what religious studies scholars do.  So why would she be aware of information that is common knowledge for any undergrad who has sat through a survey course on the introduction to religion at a mainstream college or university?

Except that, uh, she is the RELIGION CORRESPONDENT for a major news network, and would thus benefit from knowing not just about the practice of religion, but about the way it is studied as well.

Now, my own mother has been guilty of this (though she’s no RELIGION CORRESPONDENT), one time explaining to me why she would rather have a class on Buddhism, for example, taught by a practicing Buddhist, or on Islam by a practicing Muslim.  And here we have the crux of the problem: for the role of a scholar is not simply to explain what folks believe or what a religion teaches, though that is part of it.  The role of a scholar is also to research and discover if what a religion says about something has any historical veracity or is problematic or even inconsistent.  Our role is to apply critical analysis to our subjects, the same way a scholar of English Literature or Russian History or Quantum Physics would.

Scholars of the Hebrew Scriptures, for example, have argued that there are two competing and contradictory creation stories in Genesis, that the book of Isaiah was composed by at least three authors, that the genealogical narratives in Matthew and Luke disagree, and that Paul only actually composed about half of the letters in the New Testament that bear his name.  And you will find all of these ideas routinely taught in secular state schools like UNCG as well as mainstream seminaries like Princeton and Wake Forest.

It just doesn’t matter what one’s religion is, or even if they have one.  Some of the best and most reliable books on New Testament subjects have been written by Roman Catholics, Protestants, atheists, Jews, women, and yes, even Muslims.  One’s personal religion simply has no place in scholarship, any more than being a Christian or Jew or Muslim would affect the way that a biologist studies cells or an astronomer studies space.

Scholarly Books about Jesus

One’s religion, or lack thereof, may point someone in certain directions and may inform what interests him or her—and may even make what they do a vocation or calling.  It may inform their training and influence their methodologies.  Or it may not.  But it doesn’t make them qualified to study one religion or prevent them from studying another.  One’s training—including those degrees that Dr. Aslan pointed out—is what does that.

As my first religion professor Henry Levinson (a Festive-Naturalist Jew who didn’t hold the traditional concept of God adhered to by his religion) often put it: “It doesn’t take one to know one; it takes one to be one.”

Dr. Henry Levinson

Religious studies scholars are trying to “know” religions and religious people, not “be” them, for that is something tangential at best to our roles as scholars.

So this should be the official motto of all religious studies scholarship, where what one’s religion “is” has no bearing on the quality of the scholarship they do.

Anything less is not scholarship.

It’s simply propaganda.

Adventures at Camp Dad

by Matt McKinnon

Against the advice of my wife (a licensed educator), I am attempting to keep my three sons—ages four, six, and fifteen—home with me this summer.  (I use the present progressive tense along with the verb “to attempt” because, while the action is ongoing, it may not be ongoing for long, as my wife keeps threatening to close “Camp Dad” for the season.)

This really shouldn’t be a problem, or so my thinking goes.  After all, I work from home and, owing to my wife’s hectic work schedule, do much of the kid-watching during the school year, taking them to and from school and shuttling them to various soccer practices and games throughout central Illinois.

“Why should we spend the money,” I argued, “when I can take care of them here at home for free?”

Why indeed?

Why oh why?!

We are only three weeks into the official “Camp Dad: Summer 2013” season and already I have lost two of the kids, though for only a brief period of time.  The first, my four-year-old, I feared had chased after the dog, who had escaped into the woods behind our house.  Once I had corralled the mutt (the dog, not my son) with the help of a neighbor, I noticed that the littlest was missing.

As I ran through the neighborhood frantically calling his name, gathering neighbors for an impromptu search party, I heard the car alarm going off and knew immediately where he was: locked in the SUV and trying to open the door with the alarm set.

And there he was, safe and sound, and had been—playing a stupid video game, oblivious to the cries of his father—sitting peacefully and calmly while I was screaming my head off, calling his name, and making a spectacle of myself to the neighbors.

And casting real suspicion over this whole “Camp Dad” idea.

As if that wasn’t bad enough, I lost the middle son the next week.  Though again, only for a brief period of time.

We were all going to the local “Safety Town,” inappropriately named, it would seem, where kids can ride their bikes and parents can sit in the shade and watch.  Since my two oldest are proficient at bike-riding and my youngest is not, my fifteen-year-old rode ahead along the trail with my six-year-old.  They would go on to the park while my four-year-old and I walked.

The oldest had his phone.  He is marginally reliable.  What could go wrong?

What could go wrong indeed.

They arrived at the park and my oldest son called: all was well.

My youngest and I arrived on foot some ten minutes later, and I took my place under the shade of a tree, eyes peeled for my six-year-old in his new bike helmet, conspicuous enough with a red plastic Mohawk down the center.

There were a lot of kids, so the first five minutes of not being able to locate him seemed normal.  But five became ten and so I asked my oldest where he was.

Ten became fifteen as my oldest son rode around the park looking for him and I, once again, resorted to the parenting tool I know best and use most often: screaming his name at the top of my lungs.

But to no avail.  No relieving sounds of a car alarm to signal where he was.  Nothing.  Just a park full of kids on bikes, but none of them my now-missing six-year-old.

As my eldest continued his futile search of the park on bike, I grabbed my youngest and bolted out the back towards the trail—the Rock Island trail to be exact, a 26-mile rail-to-trail project that runs from Peoria to Only-God-Knows-Where, Illinois.  If he was on that trail, he could be halfway to Rock Island for all I knew.

And then my cell phone rang.  There was a lady on the other end.  “We have Lucas,” she said calmly and sweetly.

“Oh thank you!” I exclaimed, not even contemplating that she could have meant something more sinister by her words, something out of Liam Neeson’s “Taken” perhaps.  A rather sweet ransom call.

Luckily, she was no Albanian gangster, but rather a mother out for a walk with her own mother and her seven-year-old son, who was now comforting my lost child.

Evidently, not seeing his older brother, my son decided to ride back to find us, going out the back of the park as we were coming in.  Needless to say, this was not in the itinerary for Camp Dad.

We soon reunited and all was well again.

BUT THIS WAS ONLY WEEK TWO!

I shudder to think what can, or rather will happen next at Camp Dad, where our official motto has become: “Has Anyone Seen My Kids?”

And yet I am steadfast in my resolve, determined that Camp Dad will go on.

My wife, on the other hand, threatens that Camp Dad is about to close, pending legal action.

Environmentalism and the Future

by Matt McKinnon

Let me begin by stating that I consider myself an environmentalist.  I recycle almost religiously.  I compost obsessively.  I keep the thermostat low in winter and high in summer.  I try to limit how much I drive, but as the chauffeur for my three school-age sons, this is quite difficult.  I support environmental causes and organizations when I can, having been a member of the Sierra Club and the Audubon Society.

I find the arguments of the Climate Change deniers uninformed at best and disingenuous at worst.  Likewise, the idea of certain religious conservatives that it is hubris to believe that humans can have such a large effect on God’s creation strikes me as theologically silly and even dishonest.  And while I understand and even sympathize with the concerns of those folks whose businesses and livelihoods are tied to our current fossil-fuel addiction, I find their arguments that economic interests should override environmental concerns to be lacking in both ethics and basic forethought.

That being said, I have lately begun to ponder not just the ultimate intentions and goals of the environmental movement, but the very future of our planet.

Earth and atmospheric scientists tell us that the earth’s temperature is increasing, most probably as a result of human activity.  And that even if we severely limited that activity (which we are almost certainly not going to do anytime soon), the consequences are going to be dire: rising temperatures will lead to more severe storms, melting polar ice caps, melting permafrost (which in turn will lead to the release of even more carbon dioxide, increasing the warming), rising ocean levels, lowering of the oceans’ ph levels (resulting in the extinction of the coral reefs), devastating floods in some places along with crippling droughts in others.

And according to a 2007 report by the Intergovernmental Panel on Climate Change, by 2100 (less than 100 years from now) 25% of all species of plants and land animals may be extinct.

Basically, our not-too-distant future may be an earth that cannot support human life.

Now, in my more misanthropic moments, I have allowed myself to indulge in the idea that this is exactly what the earth needs.  That this in fact should be the goal of any true environmental concern: the extinction of humanity.  For only then does the earth as a planet capable of supporting other life stand a chance.  (After all, the “environment” will survive without life, though it won’t be an especially nice place to visit, much less inhabit, especially for a human.)

And a good case can be made that humans have been destroying the environment in asymmetrical and irrevocable ways since at least the Neolithic Age, when we moved from hunter-gatherer culture to the domestication of plants and animals along with sustained agriculture.  Humans have been damaging the environment ever since.  (Unlike the beaver, to take only one example of a “keystone species,” whose dam building has an overwhelmingly positive and beneficial impact on countless other species as well as the environment itself.)

So unless we’re seriously considering a conservation movement that takes us back to the Paleolithic Era instead of simply reducing our current use and misuse of the earth, we’re really just putting off the inevitable.

But all that being said, whatever the state of our not-too-distant future, the inevitability of the “distant future” is undeniable—for humans, as well as beavers and all plants and animals, and ultimately the earth itself.  For the earth, like all of its living inhabitants, has a finite future.

Around 7.5 billion years or so is a reasonable estimate.  And then it will most probably be absorbed in the sun, which will have swollen into a red giant.

(Unless, as some scientists predict, the Milky Way collides with the Andromeda galaxy, resulting in cataclysmic effects that cannot be predicted.)

At best, however, this future only includes the possibility of earth supporting life for another billion years or so.  For by then, the increase in the sun’s brightening will have evaporated all of the oceans.

Of course, long before that, the level of carbon dioxide in the atmosphere (ironically enough) will have diminished well below the quantity needed to support plant life, destroying the food chain and causing the extinction of all animal species as well.

And while that’s not good news, the worse news is that humans will have been removed from the equation long before the last holdouts of carbon-based life-forms eventually capitulate.

(Ok, so some microbes may be able to withstand the dry inhospitable conditions of desert earth, but seriously, who cares about the survival of microbes?)

Now if we’re optimistic about all of this (irony intended), the best-case scenario is for an earth that is able to support life as we know it for at most another half billion years.  (Though this may be a stretch.)  And while that seems like a really long time, we should consider that the earth has already been inhabited for just over three and a half billion years.

So having only a half billion years left is sort of like trying to enjoy the last afternoon of a four-day vacation.
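To spell out the arithmetic behind that analogy (a rough sketch using the essay’s own round figures of about 3.5 billion years already inhabited and another half billion to go), the remaining share of the earth’s habitable span works out to roughly one eighth, which is about the slice of a four-day vacation taken up by the back half of its final day:

\[
\frac{0.5}{3.5 + 0.5} = \frac{1}{8},
\qquad
\frac{1}{8} \times 4\ \text{days} \times 24\ \tfrac{\text{hours}}{\text{day}} = 12\ \text{hours}
\]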

Enjoy the rest of your day.

All Hallows Eve…and Errors

by Matt McKinnon

All Hallows Eve, or Hallowe’en for short, is one of the most controversial and misunderstood holidays celebrated in the United States—its controversy owing in large part to its misunderstanding.  More so than the recent “War on Christmas” that may or may not be raging across the country, or than Easter (the most important of all Christian holidays, blatantly named after the pagan goddess Eostre), Halloween tends to separate Americans into those who enjoy it and find it harmless and those who disdain it and find it demonic.  Interestingly enough, both groups tend to base their ideas about Halloween on the same erroneous “facts” about its origins.

A quick perusal of the internet (once you have gotten past the commercialized sites selling costumes and the like) will offer the following generalizations about the holiday, taken for granted by most folks as historical truth.

Common ideas from a secular and/or Neopagan perspective:

  • Halloween developed from the pan-Celtic feast of Samhain (pronounced “sow-in”)
  • Samhain was the Celtic equivalent of New Year’s
  • This was a time when the veil between the living and dead was lifted
  • It becomes Christianized as “All Hallows Day” (All Saints Day)
  • The eve of this holy day remained essentially Pagan
  • Celebrating Halloween is innocent fun

Common ideas from an Evangelical Christian perspective (which would accept the first five of the above):

  • Halloween is Pagan in origin and outlook
  • It became intertwined with the “Catholic” All Saints Day
  • It celebrates evil and/or the Devil
  • It glorifies death and the macabre
  • Celebrating Halloween is blasphemous, idolatrous, and harmful

Even more “respectable” sites like those from History.com and the Library of Congress continue to perpetuate the Pagan-turned-Christian history of Halloween despite scarce evidence to support it, and considerable reason to be suspicious of it.

To be sure, like most legends, this “history” of Halloween contains some kernel of fact, though, again like most things, its true history is much more convoluted and complex.

The problem with Halloween and its Celtic origins is that the Celts were a semi-literate people who left only some inscriptions: all the writings we have about the pre-Christian Celts (the pagans) are the product of Christians, who may or may not have been completely faithful in their description and interpretation.  Indeed, all of the sources for ancient Irish mythology are medieval documents (the earliest being from the 11th century—some 600 years after Christianity had been introduced to Ireland).

It may be the case that Samhain indeed marked the Irish commemoration of the change in seasons “when the summer goes to its rest,” as the Medieval Irish tale “The Tain” records.  (Note, however, that our source here is only from the 12th century, and is specific to Ireland.)  The problem is that the historical evidence is not so neat.

A heavy migration of Irish to the Scottish Highlands and Islands in the early Middle Ages introduced the celebration of Samhain there, but the earliest Welsh (also Celtic) records afford no significance to the same dates.  Nor is there any indication that there was a counterpart to this celebration in Anglo-Saxon England from the same period.

So the best we can say is that, by the 10th century or so, Samhain was established as an Irish holiday denoting the end of summer and the beginning of winter, but that there is no evidence that November 1 was a major pan-Celtic festival, and that even where it was celebrated (Ireland, the Scottish Highlands and Islands), it did not have any religious significance or attributes.

As if the supposed Celtic origins of the holiday were not uncertain enough, its supposed “Christianization” by a Roman Church determined to stamp out ties to a pagan past is even more problematic.

It is assumed that because the Western Christian churches now celebrate All Saints Day on November 1st—with the addition of the Roman Catholic All Souls Day on November 2nd—there must have been an attempt by the clergy of the new religion to co-opt and supplant the holy days of the old.  After all, the celebrations of the death of the saints and of all Christians seem to directly correlate with the accumulated medieval suggestions that Samhain celebrated the end and the beginning of all things, and recognized a lifting of the veil between the natural and supernatural worlds.

The problem is that All Saints Day was first established by Pope Boniface IV on 13 May 609 (or 610), when he consecrated the Pantheon at Rome.  It continued to be celebrated in Rome on 13 May, but was also celebrated at various other times in other parts of the Western Church, according to local usage (the medieval Irish church celebrated All Saints Day on April 20th).

Its Roman celebration was moved to 1 November during the reign of Pope Gregory III (d. 741), though with no suggestion that this was an attempt to co-opt the pagan holiday of Samhain.  In fact, there is evidence that the November date was already being kept by some churches in England and Germany as well as the Frankish kingdom, and that the date itself is most probably of Northern German origin.

Thus the idea that the celebration of All Saints Day on November 1st had anything to do either with Celtic influence or Roman concern to supersede the pagan Samhain has no historical basis: instead, Roman and Celtic Christianity followed the lead of the Germanic tradition, the reason for which is lost to history.

The English historian Ronald Hutton concludes that, while there is no doubt that the beginning of November was the time of a major pagan festival that was celebrated in all of the pastoral areas of the British Isles, there is no evidence that it was connected with the dead, and no proof that it celebrated the new year.

By the end of the Middle Ages, however, Halloween—as a Christian festival of the dead—had developed into a major public holiday of revelry, drink, and frolicking, with food and bonfires, and the practice of “souling” (a precursor to our modern trick-or-treating?) culminating in the most important ritual of all: the ringing of church bells to comfort the souls of people in purgatory.

The antics and festivities that most resemble our modern Halloween celebrations come directly from this medieval Christian holiday: the mummers and guisers (performers in disguise) of the winter festivals were also active at this time, and in the practice of “souling,” children went from house to house soliciting “soul cakes” that represented souls being freed from purgatory.

The tricks and pranks, and the carrying of vegetables (originally turnips) carved with scary faces (our jack o’ lanterns), are not attested until the nineteenth century, so their direct link with earlier pagan practices is sketchy at best.

While the Celtic origins of Samhain may have had some influence on the celebration of Halloween as it began to take shape during the Middle Ages, the Catholic Christian culture of the Middle Ages had a much more profound effect: the ancient notion of the spiritual quality of the dates October 31st/November 1st became specifically associated with death, and later with the macabre.

Thus modern Halloween is more directly a product of the Christian Middle Ages than it is of Celtic Paganism.  That some deny its rootedness in Christianity or deride its essence as pagan is more an indication of how these groups feel about medieval Catholic Christianity than about Celtic Paganism (about which we know so very little).

And to the extent that we fail to realize just how Christian many of these practices and festivities were, we fail to see how successful the Reformation and the later movements of pietism and rationalism have been in redefining exactly what “Christian” is.

As such, Halloween is no less Christian and no more Pagan than either Christmas or Easter.

Happy trick-or-treating!