Tag Archives: Matt McKinnon

Why All Babies Deserve to Die: Science and Theology in the Abortion Debate

by Matt McKinnon

The debate rages on…

Just a few of the headlines on the abortion debate from the last few weeks:

I would say that the abortion issue has once again taken center stage in the culture wars, but it never really left. Unlike homosexual marriage, which seems to be making steady progress toward resolution as a majority of Americans come to accept that the freedom of consenting adults to marry is a basic civil right, the abortion debate continues to divide a populace torn over how to weigh the basic rights of the mother against those of the “potential” child.

I say “potential” child because herein lies the real debate: exactly when does a fertilized human egg, a zygote, become a “person,” endowed with certain human, if not specifically civil, rights?

Is it a person yet?

Dougherty’s main point in his article on liberal denial focuses on the “fact” of the beginnings of human life. He claims that liberals tend to make one of two types of arguments where science and human life are concerned: either they take the unresolved legal issue regarding the idea of personhood and transfer it back to the “facts” of biology, concluding that we cannot really know what human life is or when it begins, or they acknowledge the biological fact of the beginning of human life but claim that this has no bearing on how we should think about the legality of abortion.

Both sorts of arguments, he claims, are obscurantist, and fail to actually take into account the full weight of science on the issue.

But the problem, I contend, isn’t one of science: it’s one of theology—or philosophy for those less religiously inclined.

The problem is not the question of “what” human life is or “when” it begins. Dougherty points out:

After the fusion of sperm and egg, the resulting zygote has unique human DNA from which we can deduce the identity of its biological parents. It begins the process of cell division, and it has a metabolic action that will not end until it dies, whether that is in a few days because it never implants on the uterine wall, or years later in a gruesome fishing accident, or a century later in a hospital room filled with beloved grandchildren.

Two-cell zygote. Is this a person?

So basically, human life begins at conception because at that point science can locate a grouping of cells whose DNA tells us all sorts of things, and this grouping of cells, if everything goes nicely, will result in the birth, life, and ultimate death of a human being.

He even gets close to the heart of the problem when, in arguing against an article by Ryan Cooper, he claims that many people are not fine with the idea that an abortion represents the end of a life, nor are they comfortable with having a category of human life that is not granted the status of “humanity”—and thus not afforded basic human rights.

The problem with all of these discussions is that they dance around the real issue here—the issue not of “human life” and its definition and beginning, but rather the philosophical and often theological question of the human “person.”

If we look closely at Dougherty’s remarks above, we note two distinct examples of why the generation of human life is a “fact”: (1) we can locate DNA that tells us all sorts of things about the parents (and other ancestors) of the fetus and (2) this fetus, if everything works properly, will develop into a human being, or rather, I would argue, a human “person.”

For there’s the distinction that makes the difference.

After all, analyze any one of my many bodily fluids and a capable technician would be able to locate the exact same information that Mr. Dougherty points out is right there from the first moments of a zygote’s existence. But no one claims that any of these bodily fluids or the cells my body regularly casts off are likewise deserving of being labeled “human life,” though the sperm in my semen and the cells in my saliva are just as much “alive” as any zygote (believe me, I’ve looked).

No, the distinction and the difference is in the second example: The development of this zygote into a human person. My sperm, without an egg and the right environment, will never develop into a human being. The cells in my saliva have no chance at all—even with an egg and the right conditions.

Nope, not people.

So the real force of Dougherty’s argument lies in the “potential” of the zygote to develop into what he and anti-abortion folks would claim is already there in the “reality” of a human person.

The debate thus centers on the question of human personhood, what we call theological or philosophical anthropology. For one side, this personhood is the result of a development and is achieved sometime during the embryonic stage (like “viability”) or even upon birth. For others, it is there at conception. For some in both camps it would include a “soul.” For others it would not.

So the reason the abortion debate is sui generis, or “of its own kind,” is that here the issue is not the rights of a minority versus the rights of a majority, as it is in the debate about homosexual marriage, or even the rights of the mother versus the rights of the child. Rather, the real debate is about when “human life” is also a human “person” (note that this also informs the debate over whether or not to end the life of someone in a vegetative state).

Fetus at four weeks. Is this a person?

To this end, Mr. Dougherty is correct: We can and do know what human life is and when it begins. And he is correct that many are uncomfortable with the idea that abortion means the death of a human life. But he fails to recognize that the reason this is the case is that while those on one side regard this “life” as a human person, others do not. Potentially, perhaps, but not a “person” yet. And certainly not one whose “right to life” (if there even is such a thing: nature says otherwise—but that’s another blog post) trumps the rights of the mother.

So what does all of this have to do with all babies deserving to die? It’s simple: this is what the (necessary?) intrusion of theology into public policy debates entails. Once theological ideas are inserted (and note that I am not arguing that they should or shouldn’t be), how do we adjudicate between their competing claims or limit the extent that they go?

For the two great Protestant Reformers Martin Luther and John Calvin, representing the two dominant trajectories of traditional Protestant Christianity, humans are, by nature, sinful. We are conceived in sin and born into sin, and this “Original Sin” is only removed in Baptism (here the Roman Catholic Church would agree). Furthermore, we are prone to keep sinning due to the concupiscence of our sinful nature (here is where the Roman Church would disagree). The point is that, for Protestants, all people are not only sinful, but are also deserving of the one chief effect of sin: Death.


“For the wages of sin is death.” — Romans 6:23

 

Calvin was most explicit in Book 2, Chapter 1 of his famous Institutes:

Even babies bring their condemnation with them from their mother’s wombs: they suffer for their own imperfections and no one else’s. Although they have not yet produced the fruits of sin, they have the seed within. Their whole nature is like a seedbed of sin and so must be hateful and repugnant to God.

Since babies, like all of us, are sinful in their very nature, and since they will necessarily continually bear the fruits of those sins (anyone who’s ever tried to calm a screaming infant can attest to this), and since the wages of those sins is death, then it’s not a far-fetched theological conclusion that all babies deserve to die. And remember: “they suffer for their own imperfections.”

But they don’t just deserve to die—they deserve to go to hell as well (but that’s also another blog post). And this, not from the fringes of some degenerate religious thinker, but from the theology of one of Protestant Christianity’s most influential thinkers.

A sinner in the eyes of God (according to John Calvin, anyway).

Of course, it should be noted that Calvin does not imply that we should kill babies, or even that their death at human hands would be morally justifiable: though he does argue (and here all Christian theology would agree) that their death at the hand of God is not just morally justifiable, it is also deserved. It should also be noted that the Roman Catholic theology behind the idea that children cannot sin until they reach the age of reason is predicated on the notion that this is only the case once their Original Sin has been removed in Baptism (so Jewish, Muslim, and Hindu kids would be sinful, unlike their Christian counterparts).

Again, this is not to argue that philosophical and theological principles should not be employed in the abortion debate, or in any debate over public policy. Only that (1) this is what is occurring when pro-choice and anti-abortion folks debate abortion and (2) it is fraught with complexities and difficulties that few on either side seem to recognize.

And contrary to Mr. Dougherty, this is beyond the realm of science, which at best tells us only about states of nature.

But the only way we have a “prayer” of real sustained dialogue—as opposed to debates that ignore our competing fundamental positions—is to take seriously the philosophical and theological issues that frame the question (even if my own example is less than serious).

But I’m not holding my breath. I would most certainly die if I did.

Earth Day Is a Sham

by Matt McKinnon


I am very fond of the earth. I live here, and have now for almost five decades. It’s the only home I have ever known, and I plan on retiring here and someday giving back to the earth by, well, decomposing and becoming dirt.

Ashes to ashes and all that.

I also love to garden. I love the feel of dirt between my fingers: the rich, dark stardust that collected after the Big Bang and has nourished the origin and descent of our species, of all species, since the beginning of life.

In fact, my favorite part of gardening is not the planting, which is a close second. Or the harvesting, though I enjoy the fruits of my garden immeasurably. No, my favorite part is composting: meticulously collecting all the bits and scraps from the kitchen as well as the garden to supply generous amounts of “greens” for nitrogen, shredding junk mail (and, when I taught face-to-face, unclaimed papers) to add the proper amount of “browns” for carbon, assembling them all in my composter, and religiously turning and stirring to get the desired result of rich, black humus.


The good stuff.

(The sweet smell of a properly-proportioned compost pile is actually quite intoxicating.)

So my favorite part of gardening is not just sticking my hands in the earth, but making it.

I have always loved the earth, literally, for as long as I can remember. One of my first memories is getting home from church on Easter Sunday, brightly arrayed in my new pastel-colored Easter suit, and making a mad dash for the dirt, new plastic bulldozer in hand to play in my beloved earth.

I must have been maybe five years old.

And all through my childhood the place my friends and I played most regularly was a lower, barren part of my neighbor’s backyard that we endearingly called “down in the dirt.” As in: “I’ll be back later mom; we’re going down in the dirt.”

And when my wife was a teacher, I would happily assist her in making the annual “Earth Day Cake,” complete with crushed Oreos and gummy worms. Not too dissimilar from the mud pies I used to make in my own backyard.

So it is with much pain and anguish that I proclaim Earth Day to be a sham. A fraud. A ruse. Perpetrated by both well-meaning environmentalists (like myself) and corporate interests with ulterior motives.


The problem, of course, is not the idea or intent: Celebrating the earth that sustains all that we are, as well as raising awareness of exactly what we humans are doing to our planet.

No, the problem is that Earth Day, far from being a rousing success, has actually been an abject failure.

Though this, of course, depends on how you look at it.

From a PR perspective (is there any other where public policy is concerned?), Earth Day has been wildly successful. First proposed in 1969 by peace activist John McConnell to honor the earth as well as peace, and celebrated annually on April 22nd, Earth Day has grown from its initial celebration mostly in schools and colleges across the United States to become the largest secular holiday in the world, celebrated by some one billion people in more than 192 countries.

But from a practical perspective, the movement has not had the desired effect of meaningfully arresting our destruction of the earth, which continues at an even greater pace than in 1970. Heck, it hasn’t even managed to convince most Americans that we are experiencing an ecological crisis.

Though perhaps it makes us feel better about it, at least one day a year.

And therein is the problem. Couched in terminology of honoring the earth, and even cleaning it up a bit, Earth Day domesticates what is arguably the greatest catastrophe to ever befall humanity: the impending collapse of an environment that is hospitable to human survival.

There have, of course, been other extinction events before—five in fact, with the largest being the “Great Dying” (or Permian-Triassic extinction event for all those biogeeks out there), some 252 million years ago, which resulted in the extinction of an estimated 90% of all species. The most famous, arguably, is the last major extinction, the Cretaceous-Paleogene extinction event around 66 million years ago that resulted in the loss of 75% of all species, including everyone’s favorite—all those non-avian dinosaurs. This of course was followed by the rise of mammals (and birds) as the dominant land vertebrates. Which has ultimately led us to the precipice of a sixth extinction event.


Many scientists (PBS reports 70% of all biologists) predict that we are now in the beginning of another extinction event, the first (and probably last) ever to be caused by humans. (The same humans, incidentally, who celebrate Earth Day every year.) This current extinction may compete in magnitude with the Great Dying, wiping out nearly 90% of all living species, and potentially doing so far more quickly than the previous five.

Of course, the data is not conclusive and the consensus is not unanimous, as it rarely is in science, or anything else for that matter.

But what is clear is that, regardless of what the population believes about “climate change” or “global warming,” we humans have polluted and destroyed parts of the earth to the extent that they may never recover—at least not in terms of being able to support life as we know it. (And by that I mean human life as well as those things that support human life.)

More so than the recent coal ash spills in our own neighborhood or the release of toxic chemicals in West Virginia, the oceans are perhaps the best example of how much humans have destroyed and are continuing to destroy the earth’s environment.


Floating islands of trash in the Pacific Gyre.

So let’s be clear in a manner that climate change or global warming cannot: the oceans are dying at an alarming rate. And by “dying” I don’t mean metaphorically. I mean literally. As in, studies suggest that all of the world’s corals may be extinct by the end of this century due to the acidification of the oceans caused mostly by the carbon dioxide emissions from human activity. And once the oceans die, well, human survival becomes more than a little tenuous.

And yet instead of debating how best to address the great damage we have already caused to the earth, we are debating how to regulate fracking (if at all), whether to institute a “carbon tax,” and whether or not to build a pipeline from the oil sands in Canada to refineries in the United States. Rest assured: such debates are moot. For if we succeed in burning all of the oil available in those sands as well as the natural gas and coal we extract from the ground here in the US, then our fate is sealed. Along with that of anywhere upwards of 90% of the species who inhabit earth along with us.

Oh, I almost forgot:

Have a Happy Earth Day.

 

¿Habla American?: Why English as an Official Language is Blatantly Un-American

by Matt McKinnon

We the People...

Nosotros, el Pueblo…

I’m no fan of corporations.  In fact, I am often critical of them and the too-big-to-fail capitalism that has come to dominate global economics. But I am willing to congratulate them on the off-chance that they do something good or get something right.

Like the Cheerios commercials featuring a multi-ethnic girl with her family that prompted racist hate-speech from trolls everywhere. Or the recent revelation that multinational corporations are taking climate change seriously, since it poses a real economic threat to them.

Or when Coca-Cola broadcast this advertisement during the Super Bowl:

(Coke doubled-down amidst much criticism to play it again for the Winter Olympics.)

Now, I’m no dummy, and I’m certainly not naïve. I realize that the folks at Coca-Cola are first and foremost interested in their bottom line, and that means selling more Coke. And as we are all aware by now, the United States is undergoing a considerable demographic shift, so much so that white people will no longer be the majority by 2043. And more to the point: white kids will no longer make up a majority of youth in five or six years. Yes, five or six years! Which is why companies like Coca-Cola are so interested in multicultural approaches to advertising.

So yes, I know all this, and yet still find it laudable (1) that Coca-Cola produced the commercial, and (2) that they stood by it despite heavy criticism.

But enough about Coke. My real interest is the criticism that was generated by having non-white U.S. citizens sing a patriotic song in a language other than English. And the next logical step that many critics make: viz., that English should be the official language of the United States.

This impulse is nothing new. Nor is the fear and prejudice behind it.

Benjamin Franklin.

Our brilliant and esteemed Founding Father Benjamin Franklin railed, with surprisingly racist overtones, against the unwanted influence of what he called “swarthy” German immigrants:

“Why should Pennsylvania, founded by the English, become a Colony of Aliens, who will shortly be so numerous as to Germanize us instead of our Anglifying them, and will never adopt our Language or Customs, any more than they can acquire our Complexion.”

(Indeed: Who knew that only the English were truly white?)

Of course, Franklin was wrong then, as those who criticize the Coke ad and call for English as our official language are wrong now. They are wrong for a practical reason based in historical fact: the new German immigrants did not “Germanize” the English, despite the fact that more Americans now claim German ancestry than any other ethnic or national group. No, they learned English because it was practical to do so, though some retained the use of their native tongue well into the 20th century.

Likewise, studies show that recent immigrants are assimilating in similar fashion, just as immigrants have been doing since, well, the original English came over and ironically did not assimilate into existing native cultures.

And this means that they are learning English.

Loosing my Espanish, by H.G. Carrillo (2004).

A Pew study found that 91% of second-generation children from Hispanic families speak English “very well or pretty well” and that the number rises to 97% of third-generation children. Indeed, other studies show that not only are second and third generations learning English, they are more likely than not to learn only English—and not the language of their parents’ or grandparents’ homeland.

But there is another—deeper and more essential—reason why English is not and should not be the official language of our land. And while this argument could be made from the liberal and progressive “love-for-all-things-multicultural” perspective worthy of this “liberal studies” blog, the stronger argument is actually one more conservative in nature, rooted as it is in the very fabric of our democracy, in what it means to be American.

The argument is simple: making English, or any language, the Official Language of the United States is blatantly Un-American at its core.

In fact, the late conservative writer Joseph Sobran made a similar argument some thirty years or so ago, to the chagrin of some whose conservative principles only went as deep as their nationalism. (This was the same Joe Sobran whom Pat Buchanan called “perhaps the finest columnist of our generation” and Ann Coulter named “the world’s greatest writer” and the “G.K. Chesterton of our time.”)

Joseph Sobran.

The point is twofold: First, from a conservative perspective, government should be limited and should only be about the business of governing—not social engineering. Mandating that Americans learn and use English is as absurd from a conservative viewpoint as mandating that they learn and use French, or that they eat their vegetables and lay off the Supersized fries and soda. This, argues conservatism, is simply not the purview of government, and it doesn’t matter whether learning English or eating broccoli are good ideas or not (as I think they both are). What matters is that this is not the government’s responsibility to decide or dictate.

And second, a government “of the people, by the people, and for the people” should, as much as is possible, reflect the majority of the people, while safeguarding the rights of the minority. But such a reflection, like the people it reflects, is in a constant state of change.

So in this case, what could be more basic than the right to express oneself in the language of one’s choice? And what could be more democratic than a government committed to accommodating that language—those languages—and to understanding and communicating with its own citizens?

For what right is more basic than the choice of language? Freedom of speech? Freedom of the press? Freedom of religion? All of these are secondary, at least temporally, to the choice of language whereby one speaks, or publishes, or prays aloud to their God.

Indeed, the only act synchronous to that of speaking is the forming of one’s very thoughts. And yet, even here, do we really decide what language we use to form our thoughts? Or does our language shape our thoughts and even ourselves?

If so, what would it mean that for some U.S. citizens, their very thoughts are formed in an unofficial language?

Government should not be in the business of constraining either the free thought or the free expression of its citizens.

"We speak English."

“We speak English.”

Furthermore, the fact of English as our common language is an accident of history. Not only are we the largest English-speaking nation in the world, we are also the second largest Spanish-speaking nation (second only to Mexico). And what is more democratic than a common language that actually resonates with the voice of the people? If Spanish, or French, or Chinese should one day become the preferred language of a majority of U.S. citizens, how undemocratic would it be that their free and common expression would be constrained by the short-sightedness of having made English the Official Language?

To extrapolate from James Madison’s argument against the state establishment of Christianity in his Memorial and Remonstrance: any government that can today make English the official language can tomorrow replace it with Spanish or Arabic.

***

This is what it means to be American: to have the right to choose whatever language one wishes to express oneself, be it for business, or entertainment, or religion, or school—ever mindful of the need to balance this with the necessity of being understood.

As Americans, we lack an ethnic identity. And we lack an established religion. And we lack an official language.

But we are united as a people precisely because we lack these. Since our ethnic identities and religions and languages are plural. As are we.

But in this plurality there is strength.

And from this plurality there is unity.

Or, as our Founding Fathers put it,

Dean Bryant Johnson, “E Pluribus Unum” (2012), detail.

E pluribus unum.

(And that ain’t English.)

HousingFest 2014: February 22 in Charlotte

by Matt McKinnon


Cold.

Colder than this.

It’s been cold this week in Peoria, where I currently reside.  Real cold.  Mind-numbing, face-hurting, frost-biting cold.  The high on Monday was 5 degrees Fahrenheit.  The low on Tuesday was 5 below.  And those were the actual temperatures.  The wind chills with the latest Arctic blast regularly approached 30 below.  The schools have yet to be closed for snow since we moved out here three years ago, but have now been closed three times due to cold temperatures.  Evidently, with wind chills so low, frostbite can occur within ten minutes or so.  And no one wants to be responsible for turning Peoria’s schoolchildren into a bunch of kidsicles.

I bring this up because it seems that these extremely low temperatures, along with Thanksgiving and Christmas, are the rare annual events that cause most folks to think about the homeless.  I mean, I almost froze to death running outside in my pajamas to set the garbage can back up—I can only imagine what folks who lack adequate housing have to put up with in these conditions.


Homeless.

Last January, the U.S. Government reported that some 633,782 people in the United States were homeless (62,619 of whom were veterans).  The good news is that this number has decreased since 2007: down 6.8% among individuals, down 3.7% among families, down 13.1% among unsheltered homeless, and down 19.3% among the chronically homeless.

And it is this last group I want to focus on.

According to the U.S. Government, a “chronically homeless” person is defined as “an unaccompanied homeless individual with a disabling condition who has either been continuously homeless for a year or more, or has had at least four episodes of homelessness in the past three years.”  Data suggests that the chronically homeless only make up some 10% of all homeless people, but may account for as much as 50% of the services provided (though how best to interpret these numbers is debated).  What can be said with certainty is that chronic homelessness is a major problem with complex causes that needs to be addressed on a number of fronts.

Luckily, there are organizations like the Urban Ministry Center in Charlotte, NC, attempting to do just that—offering food, shelter, treatment, and community to some five to six hundred area homeless people.  According to the National Alliance to End Homelessness, the best way to help those who are chronically homeless is a Housing First approach coupling permanent housing with supportive services (such as case management, mental health and substance abuse services, health care, and employment).

This makes sense: What a chronically homeless person needs most is a home, not just a temporary shelter.  They need the stability and security that permanent supportive housing provides—creating an environment in which the other contributing issues (which exacerbate, and often cause, homelessness in the first place) can be addressed.


This is precisely what the Urban Ministry Center’s HousingWorks program does: “(G)ive chronically homeless individuals what they need most—a safe, stable, affordable home—and then provide the wrap-around support to help them remain housed and regain lives of wellness and dignity.”

This philosophy of “Housing First” recognizes the complexity of chronic homelessness and seeks a solution based on the premise that housing is a fundamental right, regardless of a person’s mental health, physical condition, or addiction.  It’s a simple idea: get folks into housing first—provide a safe and secure home—and then work on whatever issues need attention in order to ensure that the person remains in housing.

After all, their website asks: “Who among us could tackle sobriety while sleeping under a bridge?”


Housing First has become a major movement in the United States in the search for solutions to the problem of chronic homelessness, as both National Public Radio and the Wall Street Journal have reported.  Indeed, the WSJ article reveals that, according to a Chicago-based study, providing permanent housing to the chronically homeless improves (and saves) lives as well as saving taxpayer money.

In Charlotte alone, the cost of temporary shelter, emergency room treatment, and hospital and jail stays is over $39,000.00 a year—for each chronically homeless person.  That’s a cost that the community pays. In other words, it comes mostly out of our tax dollars.  Compare that with the $13,983 a year that Charlotte’s HousingWorks program spends to provide both permanent housing and case management (including services like medical, mental health, and addiction treatment) for the same homeless individual.  Less money to be sure, but the real savings is in the improvement of people’s quality of life and the success in providing meaningful treatment that saves and changes lives.

It’s just this simple: “HousingWorks saves lives and saves our community money.”


The Blind Boys of Alabama.

To that end, the Urban Ministry Center is having a benefit concert to help end chronic homelessness in Charlotte—its first annual HousingFest.  On February 22, 2014, the Grammy-winning group The Blind Boys of Alabama and Grammy-winning singer-songwriter Jim Lauderdale will be performing at the Neighborhood Theater in Charlotte, NC.  Tickets for this worthwhile event are only $25.00 and can be purchased through the Neighborhood Theater’s website.


Jim Lauderdale.

If you’re in the Charlotte area, I urge you to support this important cause and enjoy some outstanding musical entertainment in the process. After all, homelessness is a chronic issue, regardless of the time of year and the weather.

Here’s a chance to help do something about it.

Editor’s note: A big, hearty welcome to those of you who are following us since last week’s post went viral (over half a million hits)! Our contributors are the faculty and (occasionally) students and alumni of the BLS Program at UNCG, which is an online multidisciplinary program for nontraditional students. With people from so many different disciplines coming together to write about whatever inspires them, this blog is rather eclectic. I think Matt’s post this week makes it pretty clear that we’re not just a parenting blog, though we do have quite a few parents and they do write about the topic from time to time. We may have a post on co-sleeping for you in a couple of weeks.

Feel free to browse through our back posts. I’ve been working on the tags (which are inconsistent at best); you can now click on an author’s name in the tags to see all that author’s posts. I just looked at the “parenting” tag and realized it needs a lot of work, so don’t rely on that one just now. Enjoy! -JP

Santa, Jesus, and Race

by Matt McKinnon

So, as I’m sure most of you have heard, Santa Claus and Jesus are both White.  They just are.  And these are historical facts.


Megyn Kelly.

Or so proclaimed Fox television personality Megyn Kelly on her show The Kelly File, while discussing an article by Slate contributor Aisha Harris proposing that the beloved American icon be “made over” in a more culturally neutral manner. (Harris’s suggestion is a cartoon gift-bearing penguin complete with red suit and obviously fake beard.)

What ensued was a predictable lampooning by late-night comedians and news commentators, followed by Fox News’ just-as-predictable and recalcitrant dismissal of a “liberal media’s” obsession with race and race-baiting.  For her part, Ms. Kelly refused to apologize for her position, calling the entire episode “tongue-in-cheek,” though she did acknowledge that the issue of Jesus’ ethnicity is “unsettled.”


Harris’ Santa (illustration by Mark Stamaty).

There are many issues here that deserve commentary and discussion, not the least of which is Ms. Harris’ initial suggestion that an ethnically-neutral Santa Claus is much better suited for a multi-ethnic culture like that of the contemporary United States than the image of a fat white man landing on rooftops and essentially performing what would be interpreted as “breaking and entering” in most legal jurisdictions.  (And at least one comedian has suggested the ramifications for a black counterpart, given the Stand Your Ground laws in places like Florida.)

But I, like most in the entertainment and media industries, am more interested in what Kelly said, and how she said it, than in what precipitated it.

Kelly, of course, was simply pointing out what most Americans probably assume: Santa Claus as a fictionalized character is based on the historical fourth century Christian bishop Saint Nicholas of Myra.  Santa has always been portrayed as a white man because, duh, his likeness is based on a real white man.  The notion that he can be portrayed any other way than that of mainstream American culture and its hegemonic ethnicity (white) is practically unthinkable—at least by white folks in the mainstream.

In fact, when watching the video from the show, it is more Kelly’s insistence on the brute factuality of Santa’s (and Jesus’s) ethnicity that is so problematic—and not necessarily her position.

And on this subject, a few ideas deserve discussion.

First and most obvious is that what the historical Saint Nicholas and Jesus both share is, if not a common ethnicity, then at least a common geography and culture—that of the Hellenistic Near East.  And while Nicholas was most probably a Greek from “Asia Minor,” and Jesus a Palestinian Jew, neither of them would have considered himself “white”—and would probably not be considered so by today’s use of the term.  So if Santa Claus is simply a reinterpretation of the Anatolian bishop Nicholas of Myra, then Ms. Kelly is mistaken: neither he (nor Jesus) is white.

They just aren’t.

St. Nicholas’ face, in a forensic reconstruction and traditional icons.

But, without getting into the specifics, our Santa Claus’s development most probably owes more to Pagan characters, practices, and legends than he does to the Christian bishop of Myra.  And so, arguably, on this point, Kelly is correct: Santa Claus, inasmuch as he evolves from Germanic mythology, is “white” or Northern European (though the same cannot be said for Jesus).


A Medieval European “Wild Man.”

Of course, the real issue in all of this is the assumption that Santa is white—must be white, can only be white—whether rooted in history or mythology.  And that is the assumption of hegemony: Where whiteness is the presumed sole and legitimate norm.  Where the way things are is the way they have been and should be.

And pointing this out to someone who is oblivious to the ubiquity of whiteness in our culture can be awkward, unsettling, and even shocking.

Hence the insistence that Santa, like Jesus, is white—without any thought or consideration that such a proclamation may be controversial or that it could possibly be historically and culturally conditioned—tells us more about the speaker, her audience, and our culture than it does about either Santa or Jesus.

But this is not to belittle Ms. Kelly or chasten her about her firmly-held beliefs and ethnic identity.  For the real “White Man’s (sic) Burden” is not to rule over and “civilize” the non-White world, but rather to recognize and confront examples of “White Privilege” in our pluralistic and multi-ethnic culture.  And while there are much more important aspects of white privilege in present day America (like being arrested, prosecuted, convicted, and sentenced to jail far less often than non-whites for similar crimes), the ubiquity of whiteness as the aesthetic norm is not to be dismissed.

But this is easier said than done, since, while the ubiquity of whiteness is quite obvious to most non-white folks, it tends to be invisible to most whites.

And the same thing that happened to Ms. Kelly happened to me, though I was seven years old and in the second grade at the time.

I had the same reaction to the “Black Santa” that my African-American teacher Mrs. Watson had put up on her classroom wall: “Santa Claus is not black.  He’s white,” I thought to myself, aghast at the suggestion.  Luckily, I didn’t have a television show to proclaim this, and didn’t actually articulate it to any of my classmates either.

But the memory has stayed with me—as one where, for the first time, I was confronted with the hegemony of my own culture: me, a little white boy bused from my mostly white neighborhood to school in the projects: a minority among minorities, but whose whiteness was still the norm.

Or should be, as I mistakenly assumed then.


Still from the Good Times episode “Black Jesus” (1974).

And around the same time (1974) and in much the same way, I was confronted by the “Black Jesus” episode on Good Times, where JJ had used Ned the Wino as the model for his painting of the Christ, since—being passed out in the gutter—he was the only one who could hold the pose long enough for JJ to paint.

Much later, of course, I was confronted by a long history of Jesus being portrayed in the varying ethnicity of the portrayers—across the centuries, from Asia to Africa to Europe and the Americas.

Jesus in Ethiopian and Chinese depictions.

And then by the African-American Liberation Theologian James Cone’s insistence that “Jesus is a black man,” in that, according to Christian theology, God comes to his people in the guise of the most repressed and outcast among us.  And in the United States, the reasoning goes, who has been more marginalized than the African slave?

But, arguably, I may not have been as sympathetic and understanding of art history or liberation theology had I not first been confronted in the privileged place of my whiteness in American culture by folks like Mrs. Watson, and the producers, writers, and cast of Good Times.

So what is troubling in Ms. Kelly’s remarks is not her assumption that both Santa and Jesus are white—but her insistence that they are, an insistence that suggests that she has probably never been confronted by the hegemony of her whiteness or the problems that such an unyielding hegemony can produce, at least in a multi-ethnic culture like ours where the power to portray is the power to influence.

What is also troubling is the catering to the very understandable perspective of a certain portion of the American public: that times are changing, have changed, and that “their America,” for better or worse, is gone and not coming back.  And while such a perspective is frightening to those who hold it and should be met with sympathy and sensitivity in order to soothe and defuse it, it is usually met with a demagoguery of entrenchment from the one side and outright scorn from the other.

Where are Mrs. Watson and JJ when we need them?

(Of course, I must admit, I don’t particularly care for the Penguin Santa myself, whatever ethnicity it is.)

Who Am I? (On Genealogy and Genetic Ancestry)

by Matt McKinnon

Who am I?

I have long been pestered by this question, seeking the answer not in a litany of likes and dislikes or the self-obsessed perspective that modern Western consumerist culture offers me.  But neither in my own personal history—where I’m from, where I’ve been, and so on.  And even less in my career, my “profession,” what I do to make enough money to live comfortably and raise a family.


No, my interest in identity has been more in my genealogy, my distant past, and what we now call “deep genealogy”—the history of my DNA, that mysterious code I have no control over but that dictates much of who I am.

The more I have sought answers in these two areas, the more I have come to realize that they are decidedly different—that my genealogy (the relatively recent history of my family and ancestors) and my “deep genealogy” (the origins and history of my DNA) offer two quite different portraits—even though the latter, after tens of thousands of years, ultimately leads to the former.

But that’s the key: after tens of thousands of years.

I remember my first dabbling in genealogy when I was in high school: I had always known that my name, McKinnon—or rather MacKinnon—was Scottish in origin.  I had been told by my family that we were mostly “Scots-Irish,” a term which, I came to find out later, is basically an American invention used rarely if ever in either Scotland or Ireland.  It can denote the Ulster Scots whom the English used to colonize Northern Ireland in the 17th century (and who are thus not “genetically” Irish at all), or Lowland Scots and those of the Borderlands between Scotland and England.

But a little research soon proved that the MacKinnon name is Highland, not Lowland or Border, and certainly not “Scots-Irish.”  The Highlands and Islands of Scotland are mostly Gaelic, and hence Celtic in origin, while the Scots of the Lowlands are a mix of Celtic, Roman, German, English, Scandinavian, Irish, and Scottish in varying amounts.  And since our most recent Scottish ancestor was a MacKinnon who left the Isle of Skye sometime in the late 18th or early 19th century, my Highland ancestry was confirmed.

So I spent the rest of my high school days donning plaid scarves, Shetland wool sweaters, and Harris Tweed caps and playing records of bagpipe music at home that frightened the cat and annoyed my parents and siblings to no end.

But deep down, I knew that this was not answer enough.  Indeed, ethnic identity continued to elude me and offer more questions than answers.

And it still does, even after countless hours spent researching family history and genealogy, and hundreds of dollars spent on research and DNA analysis.  Perhaps my developing awareness of the fragmentary and somewhat arbitrary nature of what we call “history” has made my search one of exponential questions instead of hard and fast answers.

For what we call “Celtic” is in fact a linguistic designation, like (and related to) “Germanic” or “Balto-Slavic.”  These are first and foremost language identifiers and not “genetic” ones.

So MacKinnon, being a Highland name, at least designates my ethnic identity as Celtic, right?

Perhaps.  At least to some extent.  But what does that really mean?

After all, these groups—Celtic, Germanic, Balto-Slavic, Italic—are only Indo-European linguistic identifiers with origins in a shared Proto-Indo-European population of tribes who inhabited Europe most probably during the late Neolithic Age (circa 4000 BCE).  Only then did these peoples begin their various migrations north and west as they differentiated into the more well-known (if often mistakenly applied) names like the Celts, Germans, Slavs, Romans, etc…

The point being that any location of one’s ancestry as “Scottish,” or “Highland,” or “Gaelic,” or “Celtic,” or, for that matter, “Germanic” or “Balto-Slavic” is rather arbitrary in that it assigns prominence to one moment in a wave of modern human migration that began in Africa some 70,000 years ago and arrived on the Pontic-Caspian steppe in what is today Eastern Europe about 30,000 years later.  From there, these various groups migrated in all directions, as wave after wave of tribes populated Europe, developing different cultures and languages, though all sharing the same not-too-distant Indo-European past.

(It is interesting to note as well that these folks only started to look “European,” i.e., “white” around 11,000 BCE.)

So that Highland MacKinnon ancestry I was so sure about?  Well, it turns out that a deep DNA analysis confirms my paternal lineage (the Y-chromosome of my father’s father’s father’s father’s father…all the way back to its beginning) to be that of Haplogroup (I won’t even get into it) I2, subgroup a2.

Haplogroup I began 30,000-40,000 years ago in Eastern Europe, with I1 and I2 diverging about 6,000 years later.  I2a arose about 11,000 years ago in the Balkans and is still today concentrated in Eastern Europe and Russia.  I2a2, that of my Highland Scots paternal DNA, only emerged some 7800 years ago, also in the Balkans, before starting its migration north into Central and Eastern Europe as well as Russia.

And, at some point, it became the DNA of a male member of a Celtic or perhaps Germanic tribe who ultimately made his way to Scotland.  And then passed it along to me.

So my Highland Scots DNA is actually Balkan in origin, and is shared by more Serbs and Croats and possibly even Russians than it is by my “fellow” Highlanders.

But if that’s not confusing enough, this only represents one line of grandfathers on my father’s side, going back roughly 8,000 years.  If we consider that there are approximately 400 generations between me and my Neolithic “European” ancestors, then the number of my direct relatives from present day all the way back to the New Stone Age is considerably large [nerdy editor's note: large enough to need scientific notation: 2.58 x 10^120].
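
For the curious, the arithmetic behind that editor’s note is simple enough to sketch (assuming, hypothetically, a full binary pedigree in which no ancestor ever appears twice, something no real family tree manages): each generation back doubles the number of ancestral slots, so 400 generations back yields

2 × 2 × 2 × … (400 times) = 2^400 ≈ 2.58 × 10^120

ancestral lines, vastly more than the number of humans who have ever lived. That impossibility is precisely why all of our family trees must fold back into one another.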

But we need not go back that far to make my point: much of an individual’s “ethnic identity” is relatively arbitrary and tells precious little about their deep genetic makeup.

In calculating the rather complex mathematics of our ancestry, scientists have concluded that all modern humans are related to each other in the not too distant past—within a few hundred years in fact.  Steve Olson, writing in The Atlantic in 2002, reported that

  1. Everyone in the world is descended from Nefertiti and Confucius, and
  2. Everyone of European ancestry is descended from Muhammad and Charlemagne.

Grandma?

That would be everyone.

Which means that all modern humans alive today are related to each other—and related to each other rather recently, considering that modern humans have been in existence for about 100,000 years.


Indeed, everyone reading this post is probably at most my 20th cousin.

But you’re not invited over for Thanksgiving.

And I’m guessing you’re not all Highland Scots.

Shut Down

by Matt McKinnon

About a month and a half ago, I agreed—as part of my job—to write a contribution for the BLS blog, due by October 6th, and to be published shortly thereafter.  I agreed to this based on my understanding of what my job is, what it entails, the compensation I receive as a BLS instructor, and my belief that a community only works when its members participate in just that: a “communio” or sharing, from the Latin “union with.”  I made this agreement in good faith and free from constraint.  And, though some might argue this point, I made it being in sound mind and body.

But the situation has changed.


(The first image would be here if I were not shut down.)

I am not happy with the present way in which the elected officials of the State for whom I work have conducted business regarding the educational system within which I work.  In short, I disapprove of the massive cuts to higher education that the North Carolina State Legislature has made over the past several years.

Never mind that these folks have been duly elected by a legal process and have conducted this business in a manner consistent with the Constitutions of both the State and the Nation.

Never mind that “legal” does not necessarily mean “fair.”

Never mind that there are regular procedures in place to check the manner in which they do this business—that there is constitutional recourse to persuade, recall, impeach, or merely vote them out of office at the next election.

Never mind that what they have done is now “law”—and has become “law” in a legal and constitutional manner.

Never mind all of this because…well, I just do not agree with them or their “law.”

(The second image would be here if I were not shut down.)

And while I adhere to the principle that writing a blog entry is part of my job, and that I have a duty to myself, to my institution, and to my students to faithfully execute the duties of my job, I have another principle that outweighs all of these:

If I do not get what I want, then I shut down.

(The third image would be here if I were not shut down.)

At this point, I am not even sure what would make me not shut down.  Or stop shutting down.  Or start back up.

At this point, I am not even sure what I hope to get out of shutting down.  Other than the shut down itself.

But none of that matters.

Because I have shut down.

So, until further notice—until an agreement can be reached that satisfies the righteousness of my indignation at the manner in which duly-elected officials representing the State by whom I am employed have conducted business in a lawful and constitutional and regular manner—until then, there will be no blog contribution.

I will not fulfill this part of my job.  I have deemed it “non-essential.”

There will be no witticisms or anecdotes about me, my classes, my life, or my family.

There will be no funny or interesting or bizarre pictures to punctuate my points.

There will be no weblinks to follow for more information—at least none supplied by me.

There will be none of this.

Because I am shut down.

(The fourth image would be here if I were not shut down.)

Of course, by shutting down and writing about how I am shutting down, I am still, technically, fulfilling some of my responsibilities and thus doing my job.  Therefore, I will continue to be paid and will continue to accept and spend my paycheck.

After all, shutting down is hard work.

The First Day of School

by Matt McKinnon

Well, it ain’t what it used to be.  The first day of school, I mean.


And I don’t just mean the back-to-school shopping, though that has changed a lot, to be sure.

We did most of ours online this year, since navigating Walmart.com is a LOT more appealing than navigating an actual Walmart.

And since many public schools have gone to uniforms, there’s not really much fun in back-to-school clothing shopping with the kids:

“How about the khaki pants and red polo shirt?”

“No, I won’t be caught dead in those.”

“Okay, then there’s always the red polo shirt and khaki pants.”

McKinnon Boys on First Day of School

Gone are the days, at least for those of us with kids in uniform schools, when back-to-school clothes shopping was a creative endeavor to get the coolest outfits possible, actually enjoying the prospect of new clothes.

Toughskins jeans.  Converse Chuck Taylor hightops (later surpassed by real leather offerings from Adidas and Nike).  Cool football jerseys.  A new jean jacket.

Toughskins

Man, those were the days.

And it didn’t cost $250.00 to fully outfit two kids for the entire school year.  (Or at least until they get stains all over their shirts and wear holes in the knees of their pants.  Do kids not wear patches on pants anymore?)

And picking out your clothes for the first day of school was just as exciting, and became even more important the older you got.  After all, I had to make a nice impression on those 10-year-old girls I was not going to talk to.  Or even look at.

But now the shopping carts are virtual and the clothing is all the same: red polo shirts and khaki pants.  Maybe shorts.  If you’re feeling crazy…navy blue.

Of course, school supply shopping is still best done at an actual store, especially since the local Walmart and OfficeMax and Staples all have lists sent to them by the school district and even the local schools.  And then there’s the additional list that the teacher sends out.

Back to School Supplies

The cumulative effect of all this is that there are three lists for each of our two elementary-age kids that my wife and I have to carry around with a real shopping cart (the one with the wheel that won’t swivel right), juggling from one list to the other, trying to mark off what we have while we search for what we still need, all the while trying unsuccessfully to keep items not on the list out of the basket.  (How we ended up with a “Duck Dynasty” pillow in the cart I will never know.)

Not to mention that our high school junior is too cool even to shop with everybody else, so we had to make a special late-night black-ops trip, just he and I, outfitted in dark clothing and sunglasses, so no one he knows will see him…with his dad…shopping at Walmart of all places.

And not to mention that the entire school supply deal set us back about $150.00.  A hundred and fifty dollars?!  For notebooks and paper and pencils?

Yes.  And pens, and erasers, and binders in every size and color imaginable.  And glue and glue sticks.  And highlighters, and rulers, and note cards, and composition books.  And more binders.  And pencil boxes, no wait, they have to be bags with three holes to fit in the binder.  And lunch boxes.  And Clorox Wipes and Kleenex (are those really our responsibility?  Whatever happened to that green stuff the janitor would just spread around on the floor when some kid threw up?)  And we still can’t find any graph paper.  Does Walmart have something against graph paper?  Are American kids just not expected to plot graphs anymore?  No wonder we’re falling behind the rest of the developed world.  I bet they have graph paper in Sweden.

But I digress.

I’m not talking about any of that.

No, what I mean when I say that the first day of school ain’t what it used to be is that, as someone who taught mainly face-to-face classes for years but who now teaches entirely online, the first day of school just isn’t quite the same.

Now, don’t get me wrong: I am NOT complaining.

Just observing.  (I tell my wife this all the time.)

First Day of Class

There used to be a nervous energy about the first day of class—when that meant standing in front of a theatre-size room of 100 students or so.  There was electricity in seeing the fresh faces of students experiencing their very first day of college, or even in the nonchalant smoothness of seniors who had waited until the very last moment to complete their GEC credit.

There was magic in the anticipation of how hard the course might be, or how boring the professor was, or how anything I was saying would have any bearing on anyone’s intended career.

I used to enjoy coming up with new ways to start the first day: by proclaiming to the class, for example, that the only thing I hated more than the first day of class was…the next day of class.  Or by moving everybody outside to enjoy the weather.  Or even sitting at a desk like everybody else: just sitting, waiting, and watching as the time for class to start came and went, and still no teacher.  And then getting up abruptly, as if annoyed, audibly mumbling something to the effect that if nobody else is going to teach the damn course, then I might as well.

Yes, those were the days.

But those days are gone.

And again, don’t get me wrong: I am not complaining.  Only observing.

I love teaching online, and have come to see what we do in the BLS program as not just a service to the University, but more importantly, as a service to students—some of whom may not be able to take classes or finish their degree any other way.

And my students, overall, tend to be older, more mature, more driven, and actually interested in what is being taught.

And there is certainly energy and magic in the first day, though clicking on a link to make the course available doesn’t quite compare to bounding around a lecture hall like Phil Donahue in his prime.

No; it’s just not quite the same.

Even though this year I tried.

Fresh Shave and a Haircut

I got a haircut.  I took a shower.  Heck, I even shaved, and thought about adding some color to my graying beard before deciding against it.

And then I sat down, clicked on “Make Course Available,” and…

Well, nothing happened.  At least nothing spectacular.

For that, I’ll have to wait for the next 48 days—or however many are in this first session.

But of course, it’s not that bad…

After all, other than strippers, “escorts,” and the occasional politician, who else do you know who can go to work not wearing pants?

Comforts of Home

Yes, there’s something to be said for the comforts of home.

“So, You Pastor a Church?”

by Matt McKinnon

I’m no pastor, and I don’t play one on TV.

It’s a question I used to get all the time, mostly from my wife’s and my family members.  Good, God-fearing folks (for the most part) who simply assumed that devoting one’s professional life to the study of religion must mean being a pastor—since “religion” must be synonymous with “church.”  Why else would someone spend upwards of eight years in school (after undergrad?!) studying various religions and even languages few people on earth still use?

And while one of my three degrees in religious studies is from a non-denominational “divinity” school (Yale) and my doctorate from a Roman Catholic university (Marquette), my degrees themselves are academic, preparations for scholarship in the academy and not the pulpit.  But that still hasn’t stopped folks from asking the above question, and has also led to invitations to offer prayer at family gatherings, read scripture at special events, and even give short homilies when the situation arises.

Now don’t get me wrong, there’s nothing wrong with being a pastor, or priest, or imam, or rabbi.  Plenty of good folks are in these lines of work, many of whom I have studied alongside in pursuing my education.  My wife’s cousin, in fact, is a Baptist preacher—a wonderful man who is much more qualified to pray and preach and—God forbid—counsel folks than I am.  So the problem is not my disdain for this profession: the problem is that it is not my profession.

But the real issue here is not what I do but rather the underlying problem that most folks have in understanding exactly what “religious studies” does—and how it is different from “theology” and the practice of religion.

This was never as clear as in the recent Fox News interview of religious studies scholar Reza Aslan about his new book on Jesus, “Zealot: The Life and Times of Jesus of Nazareth.”

Lauren Green

Never mind that Fox religion correspondent Lauren Green gives a horrible interview, spending much more time on what critics have to say about Aslan’s book than on the book itself.  For while this may be bad, even worse is that it becomes painfully clear that she probably has not read the book—and may not even have perused the first two pages.  But what is most troubling here is that the RELIGION CORRESPONDENT for a major news network is working with the same misunderstandings and ignorance of what exactly religious studies is and what religious studies scholars do as regular folks who are not RELIGION CORRESPONDENTS.

Aslan’s Zealot

Her assumption is that the story here, the big scoop, the underlying issue with Aslan’s book about Jesus is that…the author is a Muslim.  And not just a Muslim, but one who used to be a Christian.  Despite Aslan’s continued attempts to point out that he has a PhD in religious studies, has been studying religions for over twenty years, and has written many books dealing with Christianity, Islam, Judaism, and even Hinduism, Ms. Green cannot get past what she—and many of his critics—see as the real issue: he is a Muslim writing a “controversial” book about Jesus—the “founder” of Christianity as she calls him.

Now I put “controversial” in quotations because, as anyone even remotely aware of scholarship on Christianity knows, the most “controversial” of his claims are nothing new: scholars since the 19th century have been coming to many of the same conclusions that Aslan has come to.  And I put “founder” in quotations as well, since anyone even tangentially aware of New Testament scholarship knows that Jesus himself lived and died a Jew, and never “founded” a new religion.

Dr. Reza Aslan

Not being aware of any of this is not really the problem, but rather a symptom of the bigger issue: Ms. Green, like many folks, simply does not understand what the discipline of religious studies is, or what religious studies scholars do.  So why would she be aware of information that is common knowledge for any undergrad who has sat through a survey course on the introduction to religion at a mainstream college or university?

Except that, uh, she is the RELIGION CORRESPONDENT for a major news network, and would thus benefit from knowing not just about the practice of religion, but about the way it is studied as well.

Now, my own mother has been guilty of this (though she’s no RELIGION CORRESPONDENT), one time explaining to me why she would rather have a class on Buddhism, for example, taught by a practicing Buddhist, or on Islam by a practicing Muslim.  And here we have the crux of the problem: for the role of a scholar is not simply to explain what folks believe or what a religion teaches, though that is part of it.  The role of a scholar is also to research and discover if what a religion says about something has any historical veracity or is problematic or even inconsistent.  Our role is to apply critical analysis to our subjects, the same way a scholar of English Literature or Russian History or Quantum Physics would.

Scholars of the Hebrew Scriptures, for example, have argued that there are two competing and contradictory creation stories in Genesis, that the book of Isaiah was composed by at least three authors, that the genealogical narratives in Matthew and Luke disagree, and that Paul only actually composed about half of the letters in the New Testament that bear his name.  And you will find all of these ideas routinely taught in secular state schools like UNCG as well as mainstream seminaries like Princeton and Wake Forest.

It just doesn’t matter what a scholar’s religion is, or even whether they have one.  Some of the best and most reliable books on New Testament subjects have been written by Roman Catholics, Protestants, atheists, Jews, women, and yes, even Muslims.  One’s personal religion simply has no place in scholarship, any more than being a Christian or Jew or Muslim would affect the way that a biologist studies cells or an astronomer studies space.

Scholarly Books about Jesus

One’s religion, or lack thereof, may point someone in certain directions and may inform what interests them—and may even make what they do a vocation or calling.  It may inform their training and influence their methodologies.  Or it may not.  But it doesn’t make them qualified to study one religion or prevent them from studying another.  One’s training—including those degrees that Dr. Aslan pointed out—is what does that.

As my first religion professor Henry Levinson (a Festive-Naturalist Jew who didn’t hold the traditional concept of God adhered to by his religion) often put it: “It doesn’t take one to know one; it takes one to be one.”

Dr. Henry Levinson

Religious studies scholars are trying to “know” religions and religious people, not “be” them, for that is something tangential at best to our roles as scholars.

So this should be the official motto of all religious studies scholarship, where what one’s religion “is” has no bearing on the quality of the scholarship they do.

Anything less is not scholarship.

It’s simply propaganda.

Adventures at Camp Dad

by Matt McKinnon

Against the advice of my wife (a licensed educator), I am attempting to keep my three sons—ages four, six, and fifteen—home with me this summer.  (I use the present progressive tense along with the verb “to attempt” because, while the action is ongoing, it may not be ongoing for long, as my wife keeps threatening to close “Camp Dad” for the season.)

This really shouldn’t be a problem, or so my thinking goes.  After all, I work from home and, owing to my wife’s hectic work schedule, do much of the kid-watching during the school year, taking them to and from school and shuttling them to various soccer practices and games throughout central Illinois.

“Why should we spend the money,” I argued, “when I can take care of them here at home for free?”

Why indeed?

Why oh why?!

We are only three weeks into the official “Camp Dad: Summer 2013” season and already I have lost two of the kids, though only for a brief period of time.  The first, my four-year-old, I feared had chased after the dog, who had escaped into the woods behind our house.  Once I had corralled the mutt (the dog, not my son) with the help of a neighbor, I noticed that the littlest was missing.

As I ran through the neighborhood frantically calling his name, gathering neighbors for an impromptu search party, I heard the car alarm going off and knew immediately where he was: locked in the SUV and trying to open the door with the alarm set.

And there he was, safe and sound—and had been the whole time—playing a stupid video game, oblivious to the cries of his father, sitting peacefully and calmly while I was screaming my head off, calling his name, and making a spectacle of myself to the neighbors.

And casting real suspicion over this whole “Camp Dad” idea.

As if that wasn’t bad enough, I lost the middle son the next week.  Though again, only for a brief period of time.

We were all going to the local “Safety Town” (inappropriately named, it would seem), where kids can ride their bikes and parents can sit in the shade and watch.  Since my two oldest are proficient at bike-riding and my youngest is not, my fifteen-year-old rode ahead along the trail with my six-year-old.  They would go on to the park while my four-year-old and I walked.

The oldest had his phone.  He is marginally reliable.  What could go wrong?

What could go wrong indeed.

They arrived at the park and my oldest son called: all was well.

My youngest and I arrived by foot some ten minutes later, and I took my place under the shade of a tree, eyes peeled for my six-year-old in his new bike helmet, conspicuous enough with a red plastic Mohawk down the center.

There were a lot of kids, so the first five minutes of not being able to locate him seemed normal.  But five became ten and so I asked my oldest where he was.

Ten became fifteen as my oldest son rode around the park looking for him and I, once again, resorted to the parenting tool I know best and use most often: screaming his name at the top of my lungs.

But to no avail.  No relieving sounds of a car alarm to signal where he was.  Nothing.  Just a park full of kids on bikes, but none of them my now-missing six-year-old.

As my eldest continued his futile search of the park on bike, I grabbed my youngest and bolted out the back towards the trail—the Rock Island Trail, to be exact, a 26-mile rail-to-trail project that runs from Peoria to Only-God-Knows-Where, Illinois.  If he was on that trail, he could be halfway to Rock Island for all I knew.

And then my cell phone rang.  There was a lady on the other end.  “We have Lucas,” she said calmly and sweetly.

“Oh thank you!” I exclaimed, not even contemplating that she could have meant something more sinister by her words, something out of Liam Neeson’s “Taken” perhaps.  A rather sweet ransom call.

Luckily, she was no Albanian gangster, but rather a mother out for a walk with her own mother and her seven-year-old son, who was now comforting my lost child.

Evidently, not seeing his older brother, my son decided to ride back to find us, going out the back of the park as we were coming in.  Needless to say, this was not in the itinerary for Camp Dad.

We soon reunited and all was well again.

BUT THIS WAS ONLY WEEK TWO!

I shudder to think what can, or rather will, happen next at Camp Dad, where our official motto has become: “Has Anyone Seen My Kids?”

And yet I am steadfast in my resolve, determined that Camp Dad will go on.

My wife, on the other hand, threatens that Camp Dad is about to close, pending legal action.