
HB2: Legislating Bigotry

by Jay Parr

Last Wednesday, March 23, the North Carolina General Assembly convened in its Second Extra Session of the 2016 legislative year, an “emergency” session, with both the request for that session and the proclamation that it would be held filed by the clerk only one day before. The session convened at 10:00 AM, and a new bill was introduced in the state house of representatives. It was debated, amended, and passed in the span of five hours, with the final vote taking place at 3:04 PM. From there it was passed on to the state senate, where it passed its final vote a little over three hours later, at 6:29 PM. Forty-five minutes after that, at 7:14 PM, Governor Pat McCrory tweeted that he had signed it into law.


That law takes effect today, April 1, 2016. April Fools’ Day. There’s probably some joke about putting such misguided legislation into effect on this, of all days, but you can rest assured that this post is not an April Fools’ Day prank.


I’m the first to admit that I understand very little of what Governor McCrory or the NC General Assembly has done in recent years, so it was no surprise to learn that they had done something else I found totally baffling. I was, however, a little surprised that they had convened an emergency session over something that was so far from an emergency. McCrory’s next tweet, two minutes later, purported to provide something of a justification.


The “Ordinance” to which McCrory refers here is a nondiscrimination ordinance that was set to go into effect in Charlotte today, which would have added “marital status, familial status, sexual orientation, gender identity, [and] gender expression” to the list of protected statuses in such areas as housing and employment, and would have implicitly allowed transgender people to use the restroom facilities best corresponding to their gender identity. That is, it removed the old verbiage more or less requiring this transgendered woman to apply her lipstick in the bathroom with the urinals behind her.


As an aside, I found it interesting that both of McCrory’s tweets used precious characters to invoke the word “bipartisan.” That emphasis prompted me to go look. What I found was far from anything I would describe as bipartisan. The representatives calling for the special session were all Republican, with every Republican representative except one (Chuck McGrady of Henderson) joining the call. No Democrat called for it, nor did NC’s one unaffiliated representative. The thirty-six sponsors of the bill, including the four primary sponsors, were all Republican. In the House vote, every Republican representative got in line with an aye vote. Most of the Democrats and that one unaffiliated representative voted nay. When the bill came to a vote in the senate, the entire Democratic side of the aisle walked out in protest. That bears repeating: Every single Democratic state senator walked out of the senate vote in protest. There were, however, eleven Democratic representatives back in the house, mostly from relatively conservative rural districts, who for some reason or another voted aye. I guess those eleven votes are where McCrory gets his claim that it was “bipartisan.”


While we’re unpacking those tweets, let’s take a look at McCrory’s phrase about the Charlotte ordinance, “allowing men to use women’s bathroom/locker room.” If you read the ordinance deemed so objectionable as to warrant an emergency session of the state legislature, the only relevant language (on p.4, under Section 3) is as follows:

“It shall be unlawful to deny any person the full and equal enjoyment of the goods, services, facilities, privileges, advantages, and accommodations of a place of public accommodation because of race, color, religion, sex, marital status, familial status, sexual orientation, gender identity, gender expression, or national origin.” (PDF)

That language does replace some language specifically excluding “[r]estrooms, shower rooms, bathhouses and similar facilities which are in their nature distinctly private” (the struck-through language on the PDF), but it’s a bit of a stretch to portray it as opening the door for me, as a cisgendered male, to pull on a dress and go lurking about in the ladies’ room.

But that’s the bogeyman that was invoked. This guy. Lurking in the bathroom. Waiting for your wife and daughter.


For the record, that guy’s at a movie with his young niece, who wanted to wear her Cinderella dress but was worried about being teased, so he dressed up in a Cinderella dress along with her. That guy has more cojones than the entire NC General Assembly combined. But I digress.

McCrory’s tweet only works if you define a transgendered woman as a “man.” The only way to define a transgendered woman as a man is to completely ignore the complexity of sex, assigned sex, gender, gender identity, and gender expression. That is, to define a trans woman as a man, you have to insist that one’s gender expression is always dictated entirely, and solely, by the contents of his or her first diaper. You have to insist that sex=gender, always and without exception, and you basically have to insist that your [sex=gender] equation is always binary, male or female, and deny the existence of intersex people. It’s a slippery slope, even if you dictate your definitions entirely by biology. I give you Pidgeon Pagonis, one of the hundreds of thousands of Americans born neither entirely male nor entirely female, but basically a little bit of both.


Of course, we as a culture have a history of being threatened by exceptions to binary gender. We revile people who do not conform to the gender norms of their assigned sex, and we take it so far as to view a stay-at-home dad as a worthless freeloader and the career-oriented mom who supports him as a heartless, distant, and probably unfit mother. And that’s a couple that is entirely heteronormative. A woman born male, or as she is more commonly described, a “man who wants to be a woman,” just gives Americans the willies. We are so attached to the notion of binary gender that when a baby is born intersex, our first cultural and medical impulse is to subject that baby to “corrective” surgery, to “fix” those nonconforming genitals, and we continue to do so despite the fact that those surgeries are literally a form of genital mutilation, and despite overwhelming evidence that it is both medically and psychologically damaging to the child, and to the adult that child will become. With a background like that, it’s no wonder that certain segments of our population panic at the notion of “penises in women’s rooms.”


But let’s talk for a moment about the utter paucity of evidence indicating that any transgendered person anywhere in the United States has engaged in sexual misconduct in a public bathroom, let alone sexual harassment or predatory misconduct toward a cisgendered victim. You have most likely shared a public restroom with a transgendered person on at least one occasion without ever knowing it. In fact, despite there being some seven thousand transgendered people for every US senator in the country, you’re more likely to be groped by a senator.


There is, on the other hand, ample evidence of transgender people being harassed, assaulted, and even killed for using public restrooms. Reliable statistics are hard to find, because many law enforcement agencies have only recently begun tracking gender nonconformity as an impetus for hate crimes, but the vast majority of transgender people report having been harassed and bullied, often in bathrooms, usually beginning as early as elementary school. Many have feared for their lives. Many have been physically assaulted. Too many have been killed. To quote an article that appeared in the scholarly journal Aggression and Violent Behavior a few years back:

“[S]ources indicate that violence against transgender people starts early in life, that transgender people are at risk for multiple types and incidences of violence, and that this threat lasts throughout their lives. In addition, transgender people seem to have particularly high risk for sexual violence.” (vol. 14, no. 3, pp. 170–179)


According to FBI hate-crime statistics for 2014 (which was only the second year gender identity was tracked), “the number of violent crimes motivated by the victim’s gender identity tripled from the year before” (ThinkProgress). Now, we can attribute that jump to a system that’s just starting to track those statistics, but if we go over to the US Department of Justice’s Office for Victims of Crime, we find this little gem about how safe any trans woman really is anywhere:

“50 percent of people who died in violent hate crimes against lesbian, gay, bisexual, transgender, and queer (LGBTQ) people were transgender women[…]. Sexual assault and/or genital mutilation before or after their murders was a frequent occurrence.” (ovc.gov)

Let’s break that quote down a little: Transgender women, who make up maybe five percent of the LGBTQ population, account for half of those killed in hate crimes. Oh, and they’re likely to get raped and/or mutilated in the process. No wonder Madeline Goss doesn’t want to go in the men’s room.


Meanwhile, over in the ladies’ room, the women who are supposed to be “protected” by this law are now legally required to share it with this guy.


Sheffield’s tweet went viral, and while he admits that “It’s super funny to think about some bearded hillbilly in a stall next to the governor’s wife while she clutches her pearls,” the reality of the situation is actually a lot darker, and a whole lot more dangerous for the trans person.

“I can follow the law and go into the women’s room in a state that’s a Stand Your Ground state with a very liberal open carry law, and if I do that, are women gonna stop and ask me if I’m trans? Or are they just going to shoot me because they think I really am a predator because all they see is some bearded guy walking into the women’s room?” (Mic)

Now, this is a guy who can pass comfortably as a cisgendered man, so in reality, he can most likely continue to use the men’s room (in a closed stall, of course) and no one will be the wiser. But what about all the transgendered folks who are early in transition and don’t pass comfortably as either binary gender? What about the genderqueer folk who aren’t comfortable on either side of the gender binary, or the intersex people who don’t biologically fit into either side of the gender binary? Heck, what about the men who just plain have a really feminine physicality? Or the women who just have a really masculine one? Is it justice to force these people into an artificially imposed binary? Is it justice to force them into the room where they are exponentially more likely to be harassed, bullied, assaulted, and even murdered? All so we heteronormative cisgendered folk can avoid maybe being a little uncomfortable? I mean seriously, which bathroom would you have this person use?


Let us not forget that transgender people are already a threatened demographic. Reliable statistics are hard to nail down: the population has been largely ignored by law-enforcement agencies and social-science researchers alike, so the numbers that are available are usually self-reported, drawn from relatively small samples, and subject to wide margins of error. But what they do tell us, without a doubt, is that most transgender people experience harassment and bullying, usually beginning at a young age and often coming from figures of authority. They tell us that somewhere around half of transgendered people are rejected by their own families. They tell us transgendered people are orders of magnitude more likely to be homeless, or to be denied basic services, or to have significant mental health issues such as major clinical depression (gee, I wonder why), and that somewhere between one third and one half of all transgender people have attempted suicide at some point in the past.


If you need something a little more personal than statistics, and if you feel like watching a brilliant movie that’s admittedly a little hard to watch, pop over and check out Boys Don’t Cry (1999), starring Hilary Swank in what is arguably her best acting turn ever, as 21-year-old trans man Brandon Teena, who was raped and murdered in 1993 when his cover was blown (Wikipedia). Here’s a trailer that links right to the full-length film.

The McCrory administration and the General Assembly don’t seem to have sought out any of these statistics, or to have considered the impact their actions would have on an already-marginalized and endangered population, before springing into action. Instead, they seem to have done just as they did with Amendment One a few years ago. In yet another decidedly anti-intellectual action, they seem to have acted on ignorance, out of irrational fear of an unsubstantiated bogeyman, to protect a privileged class from having to potentially step outside of their comfort zone a little, and in the process to have thrown an already-underprivileged class (folks who are already marginalized by society and by the legal system) under the bus.


As for McCrory’s rhetoric of the Charlotte ordinance “putting our women and children at risk,” that sounds to me like a thinly-veiled version of Hermann Goering’s “[T]ell them they are being attacked” tactic. After reading some other analyses of HB2, I’m also not entirely certain to what extent the whole mishegas was about some of the other powers that were quietly wrested from the municipalities and consolidated at the state level, such as the authority to determine the terms of public-bidding contracts or to set a local minimum wage. We don’t have space to explore those details here.


Looking at this issue from a more global perspective, I have to point out the fact that all-gender toilet and bathing practices have been common throughout much of the world and throughout much of history, and at levels of social organization ranging from nomadic bands to advanced state-level societies. In much of the world men and women and children, young and old alike, have bathed and do bathe in common, communal spaces, and have and do use common, communal toilet facilities. In some cases those are little more than latrines. In some cases they are advanced bathroom facilities that are designed from the ground up to be shared by members of either (or any) gender. They are almost universally a safe space, policed by the guidelines of community etiquette and often by an additional subset of bath- or toilet-specific etiquette, and they are almost never a space marked by heightened sexual energy, harassment, or bullying.


If we could adopt attitudes more like that in the US, it would certainly take some of the angst out of this who-uses-which-bathroom issue. Unfortunately, I don’t see that happening anytime soon. We’re too steeped in our puritanical taboos about any sort of bodily functions, and our insistence on equating any level of nudity with sex, and our amazingly strong cultural taboos about sexuality and sexual expression outside of a very narrow set of parameters driven mostly by, interestingly enough, the marketing industry.

Maybe if we could get the marketing industry to normalize nonbinary gender, then we wouldn’t have laws that force someone (who just needs to pee) into a situation where (s)he is quite so likely to encounter violence just for existing. Maybe we could create a culture where someone doesn’t have to carry these cards around in his pockets to try to defuse the situation that is sure to arise.


The silver lining to all this just may be that it seems to have opened up a new conversation about trans issues. Maybe, just as Amendment One did, it will help raise enough awareness to tip the balance of public opinion. That’s the best possible outcome I can think of. But until this situation is solved and trans folk can safely use the bathroom that corresponds to their gender identity, the bathroom in my office is open to anyone who needs it. It’s the least I can do.


The Devout Agnostic

by Jay Parr

Sunrise as seen from orbit. Taken by Chris Hadfield aboard the International Space Station.

I am a devout agnostic. No, that is not an oxymoron.

After considerable searching, study, and introspection (and, having been raised in the Protestant Christian tradition, no small amount of internal conflict), I have come to rest in the belief that any entity we might reasonably call God would be so alien to our limited human perceptions as to be utterly, and irreconcilably, beyond human comprehension.

Gah. So convoluted. Even after something like a dozen revisions.

Let me try to strip that down. To wit: Humankind cannot understand God. We cannot remotely define God. We wouldn’t know God if it/he/she/they slapped us square in the face. In the end, we cannot say with any certainty that anything we might reasonably call God actually exists. Nor can we say with any certainty that something we might reasonably call God does not exist.

Splash text: I don't know, and you don't either.

To horribly misquote some theologian (or philosopher?) I seem to remember encountering somewhere along the way, humankind can no more understand God than a grasshopper can understand number theory.

I mean, we can’t even wrap our puny little heads around the immensity of the known physical realm (or Creation, if you prefer) without creating incredibly simplistic and only vaguely representative models.

Let’s look at some of the things we do know. With only a handful of notable exceptions, the entirety of human history has happened on, or very near to, the fragile skin of a tiny drop of semi-molten slag just under 8,000 miles across. That’s just under 25,000 miles around, or a little more than two weeks of non-stop driving at 70 mph, with no breaks for meals or potty stops.
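If you’d like to check that arithmetic, here’s a quick back-of-the-envelope sketch in Python (the 7,918-mile diameter is a standard reference figure, my addition rather than the post’s):

```python
import math

diameter_mi = 7_918   # Earth's mean diameter in miles (reference value)
speed_mph = 70        # highway cruising speed

circumference_mi = math.pi * diameter_mi   # ~24,900 miles
hours = circumference_mi / speed_mph       # ~355 hours of driving

print(f"circumference: {circumference_mi:,.0f} miles")
print(f"driving time:  {hours:,.0f} hours (~{hours / 24:.1f} days)")
# -> roughly 355 hours, or just under 15 days: a little more than two weeks
```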

Freight train in the American west, looking dwarfed by the landscape, with mountains visible in the far-off distance.

Even that tiny drop of slag can feel pretty vast to our little human perceptions, as anyone who has been on a highway in the American West can tell you: you look out at what seems to be a little N-scale model train over there and realize that, no, it’s actually a full-sized freight train, with engines sixteen feet tall and seventy feet long and as heavy as five loaded-down tractor-trailers. And even though you can plainly see the entire length of that little train, it’s actually over a mile long, and creeping along at seventy-five miles per hour. Oh, and that mountain range just over there in the background? Yeah, it’s three hours away.

If we can’t comprehend the majesty of our own landscape, on this thin skin on this tiny droplet of molten slag we call home, how can we imagine the distance even to our own moon?

To-scale image of Earth and the Moon, with the Moon represented by a single pixel.

If you look at this image, in which the moon is depicted as a single pixel, it is 110 pixels to the earth (which itself is only three pixels wide, partially occupying nine pixels). At this scale it would be about eighty-five times the width of that image before you got to the Sun. If you’re bored, click on the image and it will take you to what the author only-half-jokingly calls “a tediously accurate scale model of the solar system,” where you can scroll through endless screens of nothing as you make your way from the Sun to Pluto.

Beyond the Moon, we’re best off talking about distances in terms of the speed of light: as in, how long it takes a ray of light to travel there, cruising along at about 186,000 miles per second, or 670 million miles per hour.

On the scale of our drop of molten slag (er, Earth), light travels pretty fast. A beam of light can travel around to the opposite side of the Earth in about a fifteenth of a second. That’s why we can call that toll-free customer-service number and suddenly find ourselves talking to some poor soul who’s working through the night somewhere in Indonesia (which, for the record, is about as close as you can get to the exact opposite point on the planet without hiring a more expensive employee down in Perth).


That capacity for real-time communication just starts to break down when you get to the Moon. At that distance a beam of light, or a radio transmission, takes a little more than a second (about 1.28 seconds, to be more accurate). So the net result is about a two-and-a-half-second lag round-trip. Enough to be noticeable, but it has rarely been a problem, as (in all of human history) only two dozen people have ever been that far away from the Earth (all of them white American men, by the way), and no one has been any further. By the way, that image of the Earthrise up there? That was taken with a very long lens, and then I cropped the image even more for this post, so it looks a lot closer than it really is.

Beyond the Moon, the distances get noticeable even at the speed of light, as the Sun is about four hundred times further away than the Moon. Going back up to that scale model in which the Earth is three pixels wide, if the Earth and Moon are about an inch and a half apart on your typical computer screen, the Sun would be about the size of a softball and fifty feet away (so for a handy visual, the Sun is a softball at the front of a semi trailer and the Earth is a grain of sand back by the doors). Traveling at 186,000 miles per second, light from the Sun makes the 93-million-mile trip to Earth in about eight minutes and twenty seconds.
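If you want to verify those light-travel times yourself, here’s a minimal Python sketch using the round-number distances above (the Neptune figure is an approximate mean distance, my assumption rather than the post’s):

```python
C_MI_PER_S = 186_000  # speed of light, miles per second (rounded)

# Distances in miles: round numbers from the text, or common reference values.
legs = {
    "halfway around Earth": 12_450,
    "to the Moon": 239_000,
    "to the Sun": 93_000_000,
    "to Neptune": 2_800_000_000,
}

for name, miles in legs.items():
    s = miles / C_MI_PER_S
    if s < 1:
        print(f"{name}: about 1/{round(1 / s)} of a second")
    elif s < 60:
        print(f"{name}: about {s:.2f} seconds")
    elif s < 3600:
        print(f"{name}: about {s / 60:.1f} minutes")
    else:
        print(f"{name}: about {s / 3600:.1f} hours")
# -> ~1/15 s, ~1.28 s, ~8.3 minutes, ~4.2 hours, matching the figures above
```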


Even with all that empty space, our three pixels against the fifty feet to the Sun, we’re still right next door. The same sunlight that reaches us in eight minutes takes four hours and ten minutes to reach Neptune, the outermost planet of our solar system since poor Pluto got demoted. If you’re still looking at that scale model, where we’re three pixels wide and the sun is a softball fifty feet away, that puts Neptune about a quarter of a mile away and the size of a small bead. And that’s still within our home solar system. Well within our solar system if you include all the smaller dwarf planets, asteroids, and rubble of the Kuiper Belt (including Pluto, which we now call a dwarf planet).

To get to our next stellar neighbor at this scale, we start out at Ocean Isle Beach, find the grain of sand that is Earth (and the grain of very fine sand an inch and a half away that is the Moon), drop that softball fifty feet away to represent the Sun, lay out a few more grains of sand and a few little beads between the Atlantic Ocean and the first dune to represent the rest of the major bodies in our solar system, and then we drive all the way across the United States, the entire length of I-40 and beyond, jogging down the I-15 (“the” because we’re on the west coast now) to pick up the I-10 through Los Angeles and over to the Pacific Ocean at Santa Monica, where we walk out to the end of the Santa Monica Pier and set down a golf ball to represent Proxima Centauri. And that’s just the star that’s right next door.

See what I’m getting at?

What’s even more mind-bending than the vast distances and vast emptiness of outer space, is that our universe is every bit as vast at the opposite end of the size spectrum. The screen you’re reading this on, the hand you’re scrolling with—even something as dense as a solid ingot of gold bullion—is something like 99.999999999% empty space (and that’s a conservative estimate). Take a glance at this comparison of our solar system against a gold atom, if both the Sun and the gold nucleus had a radius of one foot. You’ll see that the outermost electron in the gold atom would be more than twice the distance of Pluto.
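Here’s a rough sketch of that emptiness claim for a gold atom, assuming common reference values (a nuclear radius of about 7 femtometers and an atomic radius of about 135 picometers; neither figure appears in the post) and comparing volumes, which scale with the cube of the radius:

```python
# Rough volume comparison for a gold atom (assumed reference radii).
nucleus_r = 7e-15   # nuclear radius in meters (~7 femtometers)
atom_r = 135e-12    # atomic radius in meters (~135 picometers)

filled = (nucleus_r / atom_r) ** 3   # fraction of the atom's volume occupied
print(f"empty space: {100 * (1 - filled):.12f}%")
# -> 99.999999999986...%, so eleven nines really is a conservative estimate
```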


And even though that nucleus looks kind of like a mulberry in this illustration, we now know that those protons and neutrons are, once again, something like solar systems of their own compared to the quarks that constitute them. There’s enough wiggle room in there that at the density of a neutron star, our entire planet would be condensed to a ball only a few hundred meters across. And for all we know, those quarks are made up of still tinier particles. We’re not even sure if they’re actually anything we would call solid matter or if they’re just some kind of highly-organized energy waves. In experiments, they kind of act like both.

This is not mysticism, folks. This is just physics.

The crux of all this is that, with our limited perception and our limited ability to comprehend vast scales, the universe is both orders of magnitude larger and orders of magnitude smaller than we can even begin to wrap our minds around. We live our lives at a very fixed scale, unable to even think about that which is much larger or much smaller than miles, feet, or fractions of an inch (say, within six or seven zeroes).

Those same limitations of scale apply in a very literal sense when we start talking about our perception of such things as the electromagnetic spectrum and the acoustic spectrum. Here’s an old chart of the electromagnetic spectrum from back in the mid-’40s. You can click on the image to expand it in a new tab.


If you look at about the two-thirds point on that spectrum you can see the narrow band that is visible light. We can see wavelengths from about 750 nanometers (400 terahertz) at the red end, to 380 nm (800 THz) at the blue end. In other words, the longest wavelength we can see is right at twice the length, or half the frequency, of the shortest wavelength we can see. If our hearing were so limited, we would only be able to hear one octave. Literally. One single octave.
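Those frequency figures follow from f = c/λ. A tiny sketch, using the wavelengths given above, confirms the one-octave comparison:

```python
C = 299_792_458  # speed of light in m/s

red_nm, blue_nm = 750, 380   # rough edges of human vision, per the text

red_THz = C / (red_nm * 1e-9) / 1e12    # ~400 THz
blue_THz = C / (blue_nm * 1e-9) / 1e12  # ~789 THz

print(f"red edge:  {red_THz:.0f} THz")
print(f"blue edge: {blue_THz:.0f} THz")
print(f"ratio: {blue_THz / red_THz:.2f}  (a ratio of 2 is one octave)")
# -> the blue edge is almost exactly twice the frequency of the red edge
```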

We can feel some of the longer wavelengths as radiant heat, and some of the shorter wavelengths (or their aftereffects) as sunburn, but even all that spans only three or four orders of magnitude (three or four zeroes), and if you look at that chart, you’ll see that it’s a logarithmic scale spanning twenty-seven orders of magnitude.

If we could see the longer wavelengths, our car engines would glow and our brake rotors would glow and our bodies would glow, and trees and plants would glow blazing white in the sunlight. A little longer and all the radio towers would be bright lights from top to bottom, and the cell phone towers would have bright bars like fluorescent tubes at the tops of them, and there would be laser-bright satellites in the sky, and our cell phones would flicker and glow, and our computers, and our remotes, and our wireless ear buds, and all the ubiquitous little radios that are in almost everything anymore. It would look like some kind of surreal Christmas.


If we could see shorter wavelengths, our clothing would be transparent, and our bodies would be translucent, and the night sky would look totally different. Shorter still and we could see bright quasi-stellar objects straight through the Earth. It would all be very disorienting.

Of course, the ability to perceive such a range of wavelengths would require different organs, once you got beyond the near-ultraviolet that some insects can see and the near-infrared that some snakes can see. And in the end, one might argue that our limited perception of the electromagnetic spectrum is just exactly what we’ve needed to survive this far.

I was going to do the same thing with the vastness of the acoustic spectrum against the limitations of human hearing, but I won’t get into it because acoustics is basically just a subset of fluid dynamics. What we hear as sound is things moving (pressure waves against our eardrums, to be precise), but similar theories can be applied to everything from the gravitational interaction of galaxy clusters (on a time scale of eons) to the motion of molecules bumping into one another (on the order of microseconds), and you start getting into math that looks like this…


…and I’m an English major with a graduate degree in creative writing. That image could just as easily be a hoax, and I would be none the wiser. So let’s just leave it at this: There’s a whole lot we can’t hear, either.

We also know for a fact that time is not quite as linear as we would like to think. Einstein first theorized that space and time were related, and that movement through space would affect movement through time (though gravity also plays in there, just to complicate matters). We do just begin to see it on a practical level with our orbiting spacecraft. It’s not very big (the International Space Station will observe a differential of about one second over its decades-long lifespan), but our navigational satellites do have to adjust for it so your GPS doesn’t drive you to the wrong Starbucks.

Physicists theorize that time does much stranger things on the scale of the universe, and in some of the bizarre conditions that can be found. Time almost breaks down completely in a black hole, for instance. Stephen Hawking has posited (and other theoretical astrophysicists agree) that even if the expanding universe were to reverse course and start contracting, which has not been ruled out as a possibility, it would still be an expanding universe because at that point time would have also reversed itself. Or something like that; this is probably a hugely oversimplified layman’s reading of it. But still, to jump over to popular culture, specifically a television series floating somewhere between science fiction and fantasy, the Tenth Doctor probably said it best:


So far we’ve been talking about physical facts. When we get into how our brains process those facts, things become even more uncertain. We do know that of the information transmitted to our brains via the optic and auditory nerves, the vast majority of it is summarily thrown out without getting any cognitive attention at all. What our brains do process is, from the very beginning, distorted by filters and prejudices that we usually don’t even notice. It’s called conceptually-driven processing, and it has been a fundamental concept in both cognitive psychology and consumer-marketing research for decades (why yes, you should be afraid). Our perceptual set can heavily influence how we interpret what we see—and even what information we throw away to support our assumptions. I’m reminded of that old selective-attention test from a few years back:

There are other fun videos by the same folks on The Invisible Gorilla, but this is a pretty in-your-face example of how we can tune out things that our prejudices have deemed irrelevant, even if it’s a costumed gorilla beating its chest right in the middle of the scene. As it turns out, we can only process a limited amount of sensory information in a given time (a small percentage of what’s coming in), so the very first thing our brains do is throw out most of it, before filling in the gaps with our own assumptions about how things should be.

As full of holes as our perception is, our memory process is even worse. We know that memory goes through several phases, from the most ephemeral, sensory memory, which is on the order of fractions of a second, to active memory, on the order of tens of seconds, to various iterations of long-term memory. At each stage, only a tiny portion of the information is selected and passed on to the next. And once something makes it through all those rounds of selection to make it into long-term memory, there is evidence in cognitive neuroscience that in order to retrieve those memories, we have to destroy them first. That’s right; the act of recalling a long-term memory back into active memory physically destroys it. That means that when you think about that dim memory from way back in your childhood (I’m lying on the living-room rug leafing through a volume of our off-brand encyclopedia while my mother works in the kitchen), you’re actually remembering the last time you remembered it. Because the last time you remembered it, you obliterated that memory in the process, and had to remember it all over again.

I’ve heard it said that if scientists ran the criminal-justice system, eyewitness testimony would be inadmissible in court. Given the things we know about perception and memory (especially in traumatic situations), that might not be such a bad idea.



So far I have avoided the topic of religion itself. I’m about to change course, and I know that this is where I might write something that offends someone. So I want to start out with the disclaimer that what I’m writing here is only my opinion (only my experience), and I recognize that everyone’s religious journey is individual, unique, and deeply personal. I’m not here to convert anyone, and I’m not here to pooh-pooh anyone’s religious convictions. Neither am I here to be converted. I respect your right to believe what you believe and to practice your religion as you see fit, provided you respect my right to do the same. Having stated that…

Most of the world’s older religions started out as oral traditions. Long before being written down, they had been handed down in storytelling, generation after generation after generation, mutating along the way, until what ended up inscribed in the sacred texts might have been completely unrecognizable to the scribes’ great-great-grandparents. Written traditions are somewhat more stable, but until the advent of printing, every copy was still transcribed by hand, and subject to the interpretations, misinterpretations, and agendas of the scribes doing the copying.

Acts of translation are even worse. Translation is, by its very nature, an act of deciding what to privilege and what to sacrifice in the source text. I have experienced that process first-hand in my attempts to translate 14th-century English into 21st-century English. Same language, only 600 years later.


Every word is a decision: Do I try to preserve a particular nuance at the expense of the poetic meter of the phrase? Do I use two hundred words to convey the meaning that is packed into these twenty? How do I explain a cultural reference that is meaningless to us, but that would have been as familiar to the intended audience as a Seinfeld reference is to us? Can I go back to my translation ten years after the fact and change that word that seemed perfect at the time but has since proven a nagging source of misinterpretation? Especially in the translation of sacred texts, people will hang upon the interpretation of a single word, forgetting entirely that it’s just some translator’s best approximation. Wars have been fought over such things.

The Muslim world might have the best idea here, encouraging its faithful to learn and study their scriptures in Arabic rather than rely on hundreds of conflicting translations in different languages. Added bonus: You get a common language everyone can use.


But the thing is, even without the vagaries of translation, human language is, at best, a horribly imprecise tool. One person starts out with an idea in mind. That person approximates that idea as closely as they can manage, using the clumsy symbols that make up any given language (usually composing on the fly), and transmits that language to its intended recipient through some method, be it speech or writing or gestural sign language. The recipient listens to that sequence of sounds, or looks at that sequence of marks or gestures, and interprets them back into a series of symbolic ideas, assembling those ideas back together with the help of sundry contextual clues to approximate (hopefully) something resembling what the speaker had in mind.

It’s all fantastically imprecise (wristwatch repair with a sledgehammer), and when you add in the limitations of the listener’s perceptual set, it’s obvious how a rhinoceros becomes a unicorn. I say “tree,” thinking of the huge oak in my neighbor’s back yard, but one reader pictures a spruce, another a dogwood, another a magnolia. My daughter points to the rosemary tree in our dining room, decorated with tinsel for the holidays. The mathematician who works in logic all day imagines data nodes arranged in a branching series of nonrecursive decisions. The genealogist sees a family history.

Humans are also infamously prone to hyperbole. Just ask your second cousin about that bass he had halfway in the boat last summer before it wriggled off the hook. They’re called fish stories for a reason. As an armchair scholar of medieval English literature, I can tell you that a lot of texts presented, with a straight face, as history bear reading with a healthy dose of skepticism. According to the 12th-century History of the Kings of Britain, that nation was founded when some guy named Brutus, who gets his authority by being the great-grandson of Aeneas (yeah, the one from Greek mythology), sailed up the Thames, defeated the handful of giants who were the sole inhabitants of the whole island, named the island after himself (i.e., Britain), and established the capital city he called New Troy, which would later be renamed London. Sounds legit.


In the beginning of Sir Gawain and the Green Knight, Gawain beheads the huge green man who has challenged him to a one-blow-for-one-blow duel, right there in front of the whole Arthurian court, but the man picks up his head, laughs at Gawain, hops back on his horse, and rides off. Granted, Gawain is presented as allegory rather than fact, but Beowulf is presented as fact, and he battles a monster underwater for hours, then kills a dragon when he’s in his seventies.

Heck, go back to ancient Greek literature and the humans and the gods routinely get into each other’s business, helping each other out, meddling in each other’s affairs, deceiving and coercing each other into doing things, getting caught up in petty jealousies, and launching wars out of spite or for personal gain. Sound familiar?

As for creation stories, there are almost as many of those as there are human civilizations. We have an entire three-credit course focused on creation stories, and even that only has space to address a small sampling of them.


Likewise, there are almost as many major religious texts as there are major civilizations. The Abrahamic traditions have their Bible and their Torah and their Qur’an and Hadith, and their various apocryphal texts, all of which are deemed sacrosanct and infallible by at least a portion of their adherents. The Buddhists have their Sutras. The Hindus have their Vedas, Upanishads, and Bhagavad Gita. The Shinto have their Kojiki. The Taoists have their Tao Te Ching. Dozens of other major world religions have their own texts, read and regarded as sacred by millions. The countless folk religions around the world have their countless oral traditions, some of which have been recorded and some of which have not.

Likewise, there are any number of religions that have arisen out of personality cults, sometimes following spiritual leaders of good faith, sometimes following con artists and charlatans. Sometimes those cults implode early. Sometimes they endure. Sometimes they become major world religions.


At certain levels of civilization, it is useful to have explanations for the unexplainable, symbolic interpretations of the natural world, narratives of origin and identity, even absolute codes of conduct. Religious traditions provide their adherents with comfort, moral guidance, a sense of belonging, and the foundations of strong communities.

However, religion has also been abused throughout much of recorded history, to justify keeping the wealthy and powerful in positions of wealth and power, to justify keeping major segments of society in positions of abject oppression, to justify vast wars, profitable to the most powerful and the least at risk, at the expense of the lives and livelihoods of countless less-powerful innocents.

A lot of good has been done in the name of religion. So has a lot of evil. And before we start talking about Islamist violence, let us remember that millions have been slaughtered in the name of Christianity. Almost every religion has caused bloodshed in its history, and every major religion has caused major bloodshed at some point in its history. Even the Buddhists. And there’s almost always some element of we’re-right-and-you’re-wrong very close to the center of that bloodshed.


But what if we’re all wrong?

If we can’t begin to comprehend the vastness of the universe or the emptiness of what we consider solid, if we can only sense a tiny portion of what is going on around us (and through us), and if we don’t even know for sure what we have actually seen with our own eyes or heard with our own ears, how can we even pretend to have any handle on an intelligence that might have designed all this? How can we even pretend to comprehend an intelligence that might even be all of this? I mean seriously, is there any way for us to empirically rule out the possibility that our entire known universe is part of some greater intelligence too vast for us to begin to comprehend? That in effect we are, and our entire reality is, a minuscule part of God itself?

In short, the more convinced you are that you understand the true nature of anything we might reasonably call God, the more convinced I am that you are probably mistaken.


I’m reminded of the bumper sticker I’ve seen: “If you’re living like there’s no God, you’d better be right!” (usually with too many exclamation points). And the debate I had with a street evangelist in which he tried to convince me that it was safer to believe in Jesus if there is no Christian God, than to be a non-believer if he does exist. Nothing like the threat of hell to bring ’em to Jesus. But to me, that kind of thinking is somewhere between a con job and extortion. You’re either asking me to believe you because you’re telling me bad things will happen to me if I don’t believe you, which is circular logic, or you’re threatening me. Either way, I’m not buying. I don’t believe my immortal soul will be either rewarded or punished in the afterlife, because when it comes right down to it, even if something we might reasonably call God does exist, I still don’t think we will experience anything we would recognize as an afterlife. Or that we possess anything we would recognize as an immortal soul.

To answer the incredulous question of a shocked high-school classmate, yes, I do believe that when we die, we more or less just wink out of existence. And no, I’m not particularly worried about that. I don’t think any of us is aware of it when it happens.

But if there’s no recognizable afterlife, no Heaven or Hell, no divine judgment, what’s to keep us from abandoning all morality and doing as we please: killing, raping, looting, destroying property and lives with impunity, without fear of divine retribution? Well, if there is no afterlife, if, upon our deaths, we cease to exist as an individual, a consciousness, an immortal soul, or anything we would recognize as an entity (which, as I have established here, I believe is likely the case), then it logically follows that this life, this flicker of a few years between the development of consciousness in the womb and the disintegration of that consciousness at death, well, to put it bluntly, this is all we get. This life, and then we’re gone. There is no better life beyond. You can call it nihilism, but I think it’s quite the opposite.

Because if this one life here on Earth is all we get, ever, that means each life is unique, and finite, and precious, and irreplaceable, and in a very real sense, sacred. Belief in an idealized afterlife can be used (twisted, rather) to justify the killing of innocents. Kill ’em all and let God sort ’em out. The implication being that if the slaughtered were in fact good people, they’re now in a better place. But if there is no afterlife, no divine judgment, no eternal reward or punishment, then the slaughtered innocent are nothing more than that: Slaughtered. Wiped out. Obliterated. Robbed of their one chance at this beautiful, awesome, awful, and by turns astounding and terrifying experience we call life.

Likewise, if this one life is all we get and someone is deliberately maimed (whether physically or emotionally, with human atrocities inflicted upon them or those they love), they don’t get some blissful afterlife to compensate for it. They spend the rest of their existence missing that hand, or having been raped, or knowing that their parents or siblings or children were killed because they happened to have been born in a certain place, or raised with a certain set of religious traditions, or have a certain color of skin or speak a certain language.

In other words, if this one life is all we get? We had damned well better use it wisely. Because we only get this one chance to sow as much beauty, as much joy, as much nurturing, and peace, and friendliness, and harmony as possible. We only get this one chance to embrace the new ideas and the new experiences. We only get this one chance to welcome the stranger, and to see the world through their eyes, if only for a moment. We only get this one chance to feed that hungry person, or to give our old coat to that person who is cold, or to offer compassion and solace and aid to that person who has seen their home, family, livelihood, and community destroyed by some impersonal natural disaster or some human evil such as war.


If I’m living like there’s no (recognizable) God, I’d better be doing all I can manage to make this world a more beautiful place, a happier place, a more peaceful place, a better place. For everyone.

As for a God who would see someone living like that, or at least giving it their best shot, and then condemn them to eternal damnation because they failed to do something like accept Jesus Christ as their personal lord and savior? I’m sorry, but I cannot believe in a God like that. I might go so far as to say I flat-out refuse to believe in a God like that. I won’t go so far as to say that no God exists, because as I have said, I believe that we literally have no way of knowing, but I’m pretty sure any God that does exist isn’t that small-minded.


So anyway, happy holidays.

This is an examination of my own considered beliefs, and nothing more. I won’t try to convert you. I will thank you to extend me the same courtesy. You believe what you believe and I believe what I believe, and in all likelihood there is some point at which each of us believes the other is wrong. And that’s okay. If after reading this you find yourself compelled to pray for my salvation, I won’t be offended.

If you celebrate Christmas, I wish you a merry Christmas. If you celebrate the Solstice, I wish you a blessed Solstice. If you celebrate Hanukkah, I wish you (belatedly) a happy Hanukkah. If you celebrate Milad un Nabi, I wish you Eid Mubarak. If some sense of tradition and no small amount of marketing has led you to celebrate the celebratory season beyond any sense of religious conviction, you seem to be in good company. If you celebrate some parody of a holiday such as Giftmas, I wish you the love of family and friends, and some cool stuff to unwrap. If you celebrate Festivus, I wish you a productive airing of grievances. If you’re Dudeist, I abide. If you’re Pastafarian, I wish you noodly appendage and all that. If you don’t celebrate anything? We’re cool.

And if you’re still offended because I don’t happen to believe exactly the same thing you believe? Seriously? You need to get over it.


Science in a Postmodern Age

by Matt McKinnon


I am not a scientist.

Just like many prominent, mostly Republican, politicians responding to the issue of climate change—trying their best to refrain from losing votes from their conservative constituencies while not coming across as being completely out of touch with the modern world—I am not a scientist.

Of course, if you ask most people who are in fact scientists, then somewhere around 87% of them agree that climate change is real and that it is mostly due to human activity (or at least if you ask those scientists who are members of the American Association for the Advancement of Science, as reported by the Pew Research Center).


Then again, if you ask average Americans (the ones who are not scientists), then only about 50% think that human activity is the largest cause of climate change.

That’s quite a disparity (37 points), especially since getting 87% of scientists to agree on anything is not all that easy and arguably represents what we could call a scientific consensus.

This, of course, provides much fodder for comedians like Bill Maher and Jon Stewart as well as many liberals and progressives, who have come to see the problem of science and a skeptical public as a characteristic of contemporary American conservatism.


And this characterization is buttressed by the even more overwhelming discrepancy between the public and scientists on the question of evolution. A 2009 study by Pew found that only 54% of the public believed in evolution (including 22% who believed it was guided by a supreme being) versus 95% of scientists (including only 8% who believed it to be guided by a supernatural power). And that more recent 2014 Pew study bumped the public percentage up to 65% and the scientific consensus up to 98%.

That’s a gap of 33 points, a bit less than the 37 points on the issue of climate change. Sure, there’s something to be said for the idea that contemporary conservatism is at odds with science on some fundamental issues.

But not so fast.

For while there is a large discrepancy between scientists and the American public on these core conservative questions, there is also a large and seemingly growing discrepancy between the public and science on issues that cross political lines, or that could even be considered liberal issues.


Take the recent controversy about immunizations.

Just as with climate change and evolution, a large majority of scientists not only think that they are safe and effective, but also think that certain immunizations should be mandatory for participation in the wider society. That same 2014 Pew study found that 86% of scientists think immunizations should be mandatory, compared to 68% of the public.

And the very liberal left is often just as vocal as the conservative right on this issue, with folks like Jenny McCarthy, who has claimed that her son’s autism was the result of immunizations despite clear scientific evidence debunking any link. At least one study, by Yale law professor Dan Kahan, shows that those who fear childhood immunizations are pretty much split between liberals and conservatives.


Still, with an 18-point gap between scientists and the public on this issue, that leaves a lot of progressives seemingly in the same position as those conservatives denying the role of human activity in climate change.

Just as interesting, however, is the discrepancy between scientists and the public on building more nuclear power plants—a gap that is greater (20 points) though scientific opinion is less certain. Pew found that 45% of the public favors more nuclear power compared to 65% of scientists.

But what is even more intriguing is that all of these gaps between scientific consensus and public opinion are far less than the discrepancy that exists on the issue of biomedical science, from the use of pesticides to animal testing and the most controversial: genetically modified organisms (GMOs).


That same Pew study found that a whopping 88% of scientists believe that it is safe to eat genetically modified foods (a larger consensus than the one on human activity and climate change), compared to public opinion, which languishes very far back at 37% (a disparity of 51 points!).

And 68% of those scientists agree that it is safe to eat foods grown with pesticides, compared to 28% of the public (a gap of 40 points).

But you won’t find many liberal politicians wading publicly into this issue, championing the views of science over a skeptical public. Nor will you find much sympathy from those comedians either.


It seems that when the proverbial shoe is on the other foot, then it is either not problematic that so many plain old folks diverge from scientific opinion, or there is in fact good reason for their skepticism.

Which brings me to my point about science in a postmodern age. For while it is true that there are good reasons to be skeptical of the science on the use of pesticides and GMOs, as well as some of these other issues, the problem is: who decides when to be skeptical and how skeptical we should be?


That is the problem of postmodernism, which strives for a leveling of discourse and has more than a bit of anti-clerical skepticism about it. For if postmodernism teaches us anything, it’s that the certitude of reason in the modern age is anything but certain. And while this makes for fun philosophical frolicking by folks like Heidegger, Foucault, and Habermas, it is problematic for science, which relies completely on the intuition that reason and observation are the only certain means of discovery we have.

But in a postmodern age, nothing is certain, and nothing is beyond reproach—not the government, or business, or think tanks, or even institutions of higher learning. Not scientific studies or scientists or even science itself. Indeed, not even reason for that matter.


The modern era’s moorings in reason have come loose to some extent in our postmodern culture. And this, more than anything else, explains the large gaps on many issues between scientific opinion and that of the public.

And in the interest of full disclosure: I believe human activity is causing climate change and that immunizations are safe and should be required, but I am very skeptical of the use of pesticides and eating GMOs.

But what do I know? I’m not a scientist.

Jumping the Shark

By Marc Williams

The Fonz

In 1977, the television sitcom Happy Days began its fifth season with an audacious episode so different in tone from its first four seasons that it took viewers by surprise. Happy Days’ appeal had always been its nostalgic attitude toward the 1950s and the likable, down-to-earth characters on whom each episode focused. The motorcycle-riding, leather-jacket-wearing heartthrob Arthur “the Fonz” Fonzarelli–played by Henry Winkler–was the epitome of cool and remains an icon of coolness today.

Unexpectedly, in the fifth season’s premiere episode, the Fonz decided to prove his bravery by jumping over a shark while water skiing. It was a baffling moment in television history. The first four years of the show had nothing to do with water skiing, and the Fonz had never been the kind of character who needed to “prove himself” to anyone. More superficially, viewers weren’t accustomed to seeing the Fonz in swimming trunks. It was an odd episode after which the series could never be the same–a point of no return. This moment gave birth to the phrase “jumping the shark,” a term coined by Jon Hein to describe the moment when a television show betrays its origins–perhaps suggesting that the writers have run out of ideas. Often, shows are thought to have jumped the shark when a key character leaves the show, or when an important new character is introduced. Hein started a website dedicated to the phenomenon, where readers can debate the moment at which their favorite television shows jumped the shark. Hein sold the website in 2006, but the site is still active.

Here’s a clip (via YouTube) of Fonzie’s famous shark jump:

The website argues that virtually any creative endeavor can jump the shark: musical groups, movies, advertising and political campaigns, and so on. But can an educational television program jump the shark? Some have argued that Discovery Channel’s Shark Week has done so.

For years, Shark Week has provided viewers with fascinating documentaries about recent shark research and has captured some truly eye-popping footage. For example, a 2001 Shark Week episode entitled “Air Jaws” included some of the most stunning nature footage I’ve ever seen: images of enormous great white sharks leaping completely out of the water, attacking seals and seal decoys. Like nearly everything one expects to see on The Discovery Channel, Shark Week is usually both entertaining and educational.


Click to view a clip from “Air Jaws”

When watching that spectacular 2001 episode, I wondered to myself: how will Discovery Channel ever top this? What shark footage could possibly compete with these amazing images? How will they possibly attract viewers next year? Not surprisingly, Discovery Channel has dedicated many of its subsequent Shark Week shows over the past twelve years to more footage of jumping great whites–and not much else. Perhaps the producers acknowledged that, indeed, they simply couldn’t surpass the spectacle of “Air Jaws.” Until the 2013 installment of Shark Week, that is.

The centerpiece of Shark Week 2013 was "Megalodon," a documentary about the prehistoric shark that paleontologists believe grew to lengths of 60 feet or more. I've always been fascinated by megalodon; my older brother was a shark enthusiast when we were young, and I vividly recall him showing me a photo in a book of fossilized megalodon jaws. I couldn't believe that such an enormous creature ever lived; I was awed by the thought of it. Naturally, when I read that Discovery Channel was featuring megalodon in its 2013 Shark Week series, I set my DVR.

The episode begins with some amateur video footage from a fishing party aboard a boat off the coast of Cape Town, South Africa. The amateur footage ends with some fearsome crashes, and the viewer then learns that the footage was recovered from the boat's wreckage, and that none of the young passengers survived. When my wife and I watched the episode, we both thought the footage looked a little too polished to be amateur work. My wife didn't remember hearing anything on the news about a horrible boating accident, and neither did I.

Viewers were then introduced to a self-proclaimed expert in mysterious oceanic events: a dubious specialty, held by a man who was perhaps a bit too comfortable in front of the documentarian’s camera.

As the program continues, viewers learn that megalodon may not be extinct after all! And of course, in true Shark Week fashion, there was some stunning footage that offered tantalizing glimpses of what might be a live megalodon in the ocean. The ocean is a huge place, we're reminded, and new species are discovered every year. The coelacanth, for instance, was thought to have been extinct for over 60 million years until a live specimen was discovered in 1938. Even very large animals like the giant squid and the megamouth shark have only recently been captured on film, so the evidence supporting a modern-day megalodon, we're assured, simply can't be dismissed.

Click to view a clip from “Megalodon”

The program was extremely entertaining and easily the most exciting Shark Week show I've seen since "Air Jaws." Not surprisingly, "Megalodon" received the highest ratings in the history of Shark Week. Unfortunately, as you may have guessed, it was all a hoax. As with Animal Planet's 2012 documentary on mermaids, all of the nature footage and expert testimony was fabricated. My wife and I hadn't heard about the vanished boating party on the news because, of course, there never was a boating party. There was virtually nothing true about Discovery Channel's "Megalodon." But many viewers were fooled, and they subsequently criticized the network for misleading and humiliating its audience.

What do you think? By airing a work of fiction–and presenting it as truth–did Shark Week jump the shark? Have the producers run out of ideas? Have they abandoned Shark Week’s reputation? Or were Shark Week viewers naive all along for seeking education through commercial television?

“So, You Pastor a Church?”

by Matt McKinnon

I’m no pastor, and I don’t play one on TV.

It’s a question I used to get all the time, mostly from my family members and my wife’s. Good, God-fearing folks (for the most part) who simply assumed that devoting one’s professional life to the study of religion must mean being a pastor—since “religion” must be synonymous with “church.” Why else would someone spend upwards of eight years in school (after undergrad?!) studying various religions, and even languages few people on earth still use?

And while one of my three degrees in religious studies is from a non-denominational “divinity” school (Yale) and my doctorate is from a Roman Catholic university (Marquette), the degrees themselves are academic: preparations for scholarship in the academy, not the pulpit. But that still hasn’t stopped folks from asking the above question, and it has also led to invitations to offer prayer at family gatherings, read scripture at special events, and even give short homilies when the situation arises.

Now don’t get me wrong: there’s nothing wrong with being a pastor, or priest, or imam, or rabbi. Plenty of good folks are in these lines of work, many of whom I studied alongside in pursuing my education. My wife’s cousin, in fact, is a Baptist preacher—a wonderful man who is much more qualified to pray and preach and (God forbid) counsel folks than I am. So the problem is not my disdain for this profession: the problem is that it is not my profession.

But the real issue here is not what I do but rather the underlying problem that most folks have in understanding exactly what “religious studies” does—and how it is different from “theology” and the practice of religion.

This was never clearer than in the recent Fox News interview with religious studies scholar Reza Aslan about his new book on Jesus, “Zealot: The Life and Times of Jesus of Nazareth.”


Lauren Green

Never mind that Fox religion correspondent Lauren Green gives a horrible interview, spending much more time on what critics have to say about Aslan’s book than on the book itself. Bad as that is, even worse is that it becomes painfully clear she probably has not read the book, and may not have perused even the first two pages. But what is most troubling here is that the RELIGION CORRESPONDENT for a major news network is working with the same misunderstandings and ignorance of what exactly religious studies is, and what religious studies scholars do, as regular folks who are not RELIGION CORRESPONDENTS.


Aslan’s Zealot

Her assumption is that the story here, the big scoop, the underlying issue with Aslan’s book about Jesus is that…the author is a Muslim.  And not just a Muslim, but one who used to be a Christian.  Despite Aslan’s continued attempts to point out that he has a PhD in religious studies, has been studying religions for over twenty years, and has written many books dealing with Christianity, Islam, Judaism, and even Hinduism, Ms. Green cannot get past what she—and many of his critics—see as the real issue: he is a Muslim writing a “controversial” book about Jesus—the “founder” of Christianity as she calls him.

Now I put “controversial” in quotation marks because, as anyone even remotely aware of scholarship on Christianity knows, the most “controversial” of his claims are nothing new: scholars since the 19th century have been coming to many of the same conclusions that Aslan has come to. And I put “founder” in quotation marks as well, since anyone even tangentially aware of New Testament scholarship knows that Jesus himself lived and died a Jew, and never “founded” a new religion.

Dr. Reza Aslan

Not being aware of any of this is not really the problem, but rather a symptom of the bigger issue: Ms. Green, like many folks, simply does not understand what the discipline of religious studies is, or what religious studies scholars do. So why would she be aware of information that is common knowledge for any undergrad who has sat through an introductory survey course on religion at a mainstream college or university?

Except that, uh, she is the RELIGION CORRESPONDENT for a major news network, and would thus benefit from knowing not just about the practice of religion, but about the way it is studied as well.

Now, my own mother has been guilty of this (though she’s no RELIGION CORRESPONDENT), one time explaining to me why she would rather have a class on Buddhism, for example, taught by a practicing Buddhist, or on Islam by a practicing Muslim.  And here we have the crux of the problem: for the role of a scholar is not simply to explain what folks believe or what a religion teaches, though that is part of it.  The role of a scholar is also to research and discover if what a religion says about something has any historical veracity or is problematic or even inconsistent.  Our role is to apply critical analysis to our subjects, the same way a scholar of English Literature or Russian History or Quantum Physics would.

Scholars of the Hebrew Scriptures, for example, have argued that there are two competing and contradictory creation stories in Genesis, that the book of Isaiah was composed by at least three authors, that the genealogical narratives in Matthew and Luke disagree, and that Paul only actually composed about half of the letters in the New Testament that bear his name.  And you will find all of these ideas routinely taught in secular state schools like UNCG as well as mainstream seminaries like Princeton and Wake Forest.

It just doesn’t matter what one’s religion is, or even whether one has one. Some of the best and most reliable books on New Testament subjects have been written by Roman Catholics, Protestants, atheists, Jews, women, and yes, even Muslims. One’s personal religion simply has no place in scholarship, any more than being a Christian or Jew or Muslim would affect the way a biologist studies cells or an astronomer studies space.

Scholarly Books about Jesus

One’s religion, or lack thereof, may point someone in certain directions and inform what interests them—and may even make what they do a vocation or calling. It may shape their training and influence their methodologies. Or it may not. But it doesn’t qualify them to study one religion or prevent them from studying another. One’s training—including those degrees that Dr. Aslan pointed out—is what does that.

As my first religion professor Henry Levinson (a Festive-Naturalist Jew who didn’t hold the traditional concept of God adhered to by his religion) often put it: “It doesn’t take one to know one; it takes one to be one.”

Dr. Henry Levinson

Religious studies scholars are trying to “know” religions and religious people, not “be” them, for that is something tangential at best to our roles as scholars.

So that should be the official motto of all religious studies scholarship: what one’s religion “is” has no bearing on the quality of the scholarship one does.

Anything less is not scholarship.

It’s simply propaganda.

What Should We Learn in College? (Part II)

by Wade Maki

In my last post I discussed comments made by our Governor on what sorts of things we should, and shouldn’t, be learning in college. This is a conversation going on across higher education. Of course, in an ideal world we would learn everything in college, but that goal is not practical: our time and funds are limited. We are left, then, to prioritize what things to require of our students, what things will be electives, and what things not to offer at all.

One area where we do this prioritizing is “general education” (GE), which is the biggest factor in determining what we learn in college. Some institutions have a very broad model for GE that covers classic literature, history, philosophy, and the “things an educated person should know.” Exactly what appears on this list varies by institution, with some being more focused on the arts, some on the humanities, and others on the social sciences. The point is that the institution itself decides on a very small core for GE.

The drawback to a prescribed model for GE is that it doesn’t allow for much student choice. The desire for more choice led to another very common GE system, often referred to as “the cafeteria model,” whereby many courses are offered as satisfying GE requirements and each student picks from the options within each category. This system is good for student choice about what to learn, but it isn’t good if you want a connected “core” of courses.

In recent years there has been a move toward a “common core,” in which all universities within a state would have the same GE requirements. This makes transfers easier, since all schools share the same core. However, it also tends to limit choice by reducing the options to only those courses offered at every school. In addition, it eliminates the local character of an institution’s GE (by making them all the same), which also forfeits the improvements that come from competing systems: when every campus does GE its own way, good ideas tend to be replicated. If we don’t try different GE systems on our campuses, innovation slows.


No matter which direction we move GE, we still have to address the central question of “what should we learn?” For example, should students learn a foreign language? Of course they should, in an ideal world, but consider that a foreign language requirement typically runs two years. We must weigh the opportunity costs of that four-course requirement: what else could we have learned from four other courses in, say, economics, psychology, science, or communications? This is just one example of how complicated GE decisions can be. Every course we require limits choice and makes it less likely that other (non-required) subjects will be learned.

As many states look at a “common core” model, there is an additional consideration that is often overlooked. Suppose we move to a common core of general education in which most students learn the same sorts of things. Now imagine a business or workplace where most of your coworkers learned the same types of things, and where other areas of knowledge were learned by none of them. Is this preferable to an organization whose employees learned very little in common but have more diverse educational backgrounds? I suspect an organization with more diversely educated employees will be more adaptable than one where there are a few things everyone knows and a lot of things no one knows.


This is my worry about the way we are looking to answer the question of what we should learn in college. In the search for an efficient, easy-to-transfer common core we may end up:

  1. Having graduates with more similar educations and the same gaps in their educations.
  2. Losing the unique educational cultures of our institutions.
  3. Missing out on the long-term advantage of experimentation across our institutions by imposing one model for everyone.

Not having a common core doesn’t solve all of the problems, but promoting experimentation through diverse and unique educational requirements is worth keeping. There is another problem with GE that I can’t resolve, which is that most of us in college answer the question this way: “Everyone should learn what I did or what I’m teaching.” But that is a problem to be addressed in another posting. So, what should we learn in college?

Environmentalism and the Future

by Matt McKinnon

Let me begin by stating that I consider myself an environmentalist.  I recycle almost religiously.  I compost obsessively.  I keep the thermostat low in winter and high in summer.  I try to limit how much I drive, though as the chauffeur for my three school-age sons, I find this quite difficult.  I support environmental causes and organizations when I can, having been a member of the Sierra Club and the Audubon Society.

I find the arguments of the Climate Change deniers uninformed at best and disingenuous at worst.  Likewise, the idea of certain religious conservatives that it is hubris to believe that humans can have such a large effect on God’s creation strikes me as theologically silly and even dishonest.  And while I understand and even sympathize with the concerns of those folks whose businesses and livelihoods are tied to our current fossil-fuel addiction, I find their arguments that economic interests should override environmental concerns to be lacking in both ethics and basic forethought.

That being said, I have lately begun to ponder not just the ultimate intentions and goals of the environmental movement, but the very future of our planet.

Earth and atmospheric scientists tell us that the earth’s temperature is increasing, most probably as a result of human activity.  And that even if we severely limited that activity (which we are almost certainly not going to do anytime soon), the consequences are going to be dire: rising temperatures will lead to more severe storms, melting polar ice caps, melting permafrost (which in turn will lead to the release of even more carbon dioxide, increasing the warming), rising ocean levels, lowering of the oceans’ pH levels (resulting in the extinction of the coral reefs), and devastating floods in some places along with crippling droughts in others.

And according to a 2007 report by the Intergovernmental Panel on Climate Change, by 2100 (less than 100 years from now) 25% of all species of plants and land animals may be extinct.

Basically, our not-too-distant future may be an earth that cannot support human life.

Now, in my more misanthropic moments, I have allowed myself to indulge in the idea that this is exactly what the earth needs.  That this in fact should be the goal of any true environmental concern: the extinction of humanity.  For only then does the earth as a planet capable of supporting other life stand a chance.  (After all, the “environment” will survive without life, though it won’t be an especially nice place to visit, much less inhabit, especially for a human.)

And a good case can be made that humans have been destroying the environment in asymmetrical and irrevocable ways since at least the Neolithic Age, when we moved from hunter-gatherer culture to the domestication of plants and animals along with sustained agriculture—and we have been damaging it ever since.  (Unlike the beaver, to take one example of a “keystone species,” whose dam building has an overwhelmingly positive impact on countless other species as well as on the environment itself.)

So unless we’re seriously considering a conservation movement that takes us back to the Paleolithic Era, instead of simply reducing our current use and misuse of the earth, we’re really just putting off the inevitable.

But all that being said, whatever the state of our not-too-distant future, the inevitability of the “distant future” is undeniable—for humans, as well as beavers and all plants and animals, and ultimately the earth itself.  For the earth, like all of its living inhabitants, has a finite future.

Around 7.5 billion years or so is a reasonable estimate.  By then the earth will most probably have been absorbed into the sun, which will have swollen into a red giant.

(Unless, as some scientists anticipate, the Milky Way collides with the Andromeda galaxy first, with cataclysmic effects that cannot be predicted.)

At best, however, this future only includes the possibility of earth supporting life for another billion years or so.  For by then, the sun’s increasing brightness will have evaporated all of the oceans.

Of course, long before that, the level of carbon dioxide in the atmosphere (ironically enough) will have diminished well below the quantity needed to support plant life, destroying the food chain and causing the extinction of all animal species as well.

And while that’s not good news, the worse news is that humans will have been removed from the equation long before the last holdouts of carbon-based life-forms eventually capitulate.

(Ok, so some microbes may be able to withstand the dry inhospitable conditions of desert earth, but seriously, who cares about the survival of microbes?)

Now if we’re optimistic about all of this (irony intended), the best-case scenario is an earth able to support life as we know it for at most another half billion years.  (Though this may be a stretch.)  And while that seems like a really long time, we should consider that the earth has already been inhabited for just over three and a half billion years.

So having only a half billion years left is sort of like trying to enjoy the last afternoon of a four-day vacation.
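If it helps to see the arithmetic behind that analogy, here is a rough back-of-the-envelope sketch (my own, not the scientists’), assuming the “last afternoon” counts as about half a day:

\[
\frac{0.5\ \text{billion years left}}{(3.5 + 0.5)\ \text{billion years total}} \;=\; \frac{1}{8} \;=\; \frac{0.5\ \text{day}}{4\ \text{days}}
\]

Either way you slice it, about an eighth of the whole span remains.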


Enjoy the rest of your day.