Tag Archives: critical thinking

Jumping the Shark

By Marc Williams

The Fonz

In 1977, the television sitcom Happy Days began its fifth season with an audacious episode so different in tone from its first four seasons that it took viewers by surprise. Happy Days’ appeal had always been its nostalgic attitude toward the 1950s and the likable, down-to-earth characters on whom each episode focused. The motorcycle-riding, leather-jacket-wearing heartthrob, Arthur “the Fonz” Fonzarelli–played by Henry Winkler–was the epitome of cool and remains an icon of coolness today.

Unexpectedly, in the fifth season’s premiere episode, the Fonz decided to prove his bravery by jumping over a shark while water skiing. It was a baffling moment in television history. The first four years of the show had nothing to do with water skiing, and the Fonz had never been the kind of character who needed to “prove himself” to anyone. More superficially, viewers weren’t accustomed to seeing the Fonz in swimming trunks. It was an odd episode after which the series could never be the same–a point of no return. This moment gave birth to the phrase “jumping the shark,” a term coined by Jon Hein to describe the moment when a television show betrays its origins–perhaps suggesting that the writers have run out of ideas. Often, shows are thought to have jumped the shark when a key character leaves the show, or when an important new character is introduced. Hein started a website dedicated to the phenomenon, where readers can debate the moment at which their favorite television shows jumped the shark. Hein sold the website in 2006, but the site is still active.

Here’s a clip (via YouTube) of Fonzie’s famous shark jump:

The website argues that virtually any creative endeavor can jump the shark: musical groups, movies, advertising and political campaigns, and so on. But can an educational television program jump the shark? Some have argued that Discovery Channel’s Shark Week has done so.

For years, Shark Week has provided viewers with fascinating documentaries about recent shark research and has captured some truly eye-popping footage. For example, a 2001 Shark Week episode entitled “Air Jaws” contained some of the most stunning nature footage I’ve ever seen–images of enormous great white sharks leaping completely out of the water, attacking seals and seal decoys. Like nearly everything one expects to see on The Discovery Channel, Shark Week is usually both entertaining and educational.

Click to view a clip from “Air Jaws”

When watching that spectacular 2001 episode, I wondered to myself: how will Discovery Channel ever top this? What shark footage could possibly compete with these amazing images? How will they possibly attract viewers next year? Not surprisingly, Discovery Channel dedicated many of its subsequent Shark Week shows over the past twelve years to more footage of jumping great whites–and not much else. Perhaps the producers acknowledged that indeed, they simply couldn’t surpass the spectacle of “Air Jaws.” Until the 2013 installment of Shark Week, that is.

The centerpiece of Shark Week 2013 was “Megalodon,” a documentary about the prehistoric shark that paleontologists believe grew to lengths of 60 feet or more. I’ve always been fascinated by megalodon; my older brother was a shark enthusiast when we were young, and I vividly recall him showing me a photo of fossilized megalodon jaws in one of his books–I couldn’t believe that such an enormous creature ever lived. I was awed by the thought of it. Naturally, when I read that Discovery Channel was featuring megalodon in its 2013 Shark Week series, I set my DVR.

The episode begins with some amateur video footage from a fishing party aboard a boat off the coast of Cape Town, South Africa. The amateur footage ends with some fearsome crashes and the viewer then learns that the footage was recovered from the boat’s wreckage–and that none of the young passengers survived. When my wife and I watched the episode, we both thought the footage looked a little too polished to be amateur footage. My wife said she didn’t remember hearing anything on the news of a horrible boating accident and I didn’t remember such a story either.

Viewers are then introduced to a self-proclaimed expert in mysterious oceanic events: a dubious specialty, held by a man who is perhaps a bit too comfortable in front of the documentarian’s camera.

As the program continues, viewers learn that megalodon may not be extinct after all! And of course, in true Shark Week fashion, there is some stunning footage offering tantalizing glimpses of what might be a live megalodon in the ocean. The ocean is a huge place, we’re reminded, and new species are discovered every year. The coelacanth, for instance, was thought to have been extinct for over 60 million years until a live specimen was discovered in 1938. Even very large animals like the giant squid and the megamouth shark have only recently been captured on film, so the evidence supporting a modern-day megalodon simply can’t be dismissed.

Click to view a clip from “Megalodon”

The program was extremely entertaining and was easily the most exciting Shark Week show I’ve seen since “Air Jaws.” And not surprisingly, “Megalodon” received the highest ratings in the history of Shark Week. Unfortunately, as you may have guessed, it was all a hoax. As with Animal Planet’s 2012 documentary on mermaids, all of the nature footage and expert testimony was fabricated. My wife and I hadn’t heard about the vanished boating party on the news because, of course, there never was a boating party. There was virtually nothing true about Discovery Channel’s “Megalodon.” But many viewers were fooled, and subsequently criticized the network for misleading and humiliating the audience.

What do you think? By airing a work of fiction–and presenting it as truth–did Shark Week jump the shark? Have the producers run out of ideas? Have they abandoned Shark Week’s reputation? Or were Shark Week viewers naive all along for seeking education through commercial television?

“So, You Pastor a Church?”

by Matt McKinnon

I’m no pastor, and I don’t play one on TV.

It’s a question I used to get all the time, mostly from my and my wife’s family members.  Good, God-fearing folks (for the most part) who simply assumed that devoting one’s professional life to the study of religion must mean being a pastor—since “religion” must be synonymous with “church.”  Why else would someone spend upwards of eight years in school (after undergrad?!) studying various religions and even languages few people on earth still use?

And while one of my three degrees in religious studies is from a non-denominational “divinity” school (Yale) and my doctorate is from a Roman Catholic university (Marquette), my degrees themselves are academic: preparation for scholarship in the academy, not the pulpit.  But that still hasn’t stopped folks from asking the above question, or from inviting me to offer prayer at family gatherings, read scripture at special events, and even give short homilies when the situation arises.

Now don’t get me wrong, there’s nothing wrong with being a pastor, or priest, or imam, or rabbi.  Plenty of good folks are in these lines of work, many of whom I have studied alongside in pursuing my education.  My wife’s cousin, in fact, is a Baptist preacher—a wonderful man who is much more qualified to pray and preach and—God forbid—counsel folks than I am.  So the problem is not that I disdain the profession: the problem is that it is not my profession.

But the real issue here is not what I do but rather the underlying problem that most folks have in understanding exactly what “religious studies” does—and how it is different from “theology” and the practice of religion.

This was never clearer than in the recent Fox News interview of religious studies scholar Reza Aslan about his new book on Jesus, “Zealot: The Life and Times of Jesus of Nazareth.”

Lauren Green

Never mind that Fox religion correspondent Lauren Green gives a horrible interview, spending much more time on what critics have to say about Aslan’s book than on the book itself.  For while this may be bad, even worse is that it becomes painfully clear that she probably has not read the book—and may not have even skimmed the first two pages.  But what is most troubling here is that the RELIGION CORRESPONDENT for a major news network is working with the same misunderstandings and ignorance of what exactly religious studies is and what religious studies scholars do as regular folks who are not RELIGION CORRESPONDENTS.

Aslan’s Zealot

Her assumption is that the story here, the big scoop, the underlying issue with Aslan’s book about Jesus is that…the author is a Muslim.  And not just a Muslim, but one who used to be a Christian.  Despite Aslan’s continued attempts to point out that he has a PhD in religious studies, has been studying religions for over twenty years, and has written many books dealing with Christianity, Islam, Judaism, and even Hinduism, Ms. Green cannot get past what she—and many of his critics—see as the real issue: he is a Muslim writing a “controversial” book about Jesus—the “founder” of Christianity as she calls him.

Now I put “controversial” in quotations because, as anyone even remotely aware of scholarship on Christianity knows, the most “controversial” of his claims are nothing new: scholars since the 19th century have been coming to many of the same conclusions that Aslan has come to.  And I put “founder” in quotations as well, since anyone even tangentially aware of New Testament scholarship knows that Jesus himself lived and died a Jew, and never “founded” a new religion.

Dr. Reza Aslan

Not being aware of any of this is not really the problem, but rather a symptom of the bigger issue: Ms. Green, like many folks, simply does not understand what the discipline of religious studies is, or what religious studies scholars do.  So why would she be aware of information that is common knowledge for any undergrad who has sat through an introductory survey course on religion at a mainstream college or university?

Except that, uh, she is the RELIGION CORRESPONDENT for a major news network, and would thus benefit from knowing not just about the practice of religion, but about the way it is studied as well.

Now, my own mother has been guilty of this (though she’s no RELIGION CORRESPONDENT), one time explaining to me why she would rather have a class on Buddhism, for example, taught by a practicing Buddhist, or on Islam by a practicing Muslim.  And here we have the crux of the problem: for the role of a scholar is not simply to explain what folks believe or what a religion teaches, though that is part of it.  The role of a scholar is also to research and discover if what a religion says about something has any historical veracity or is problematic or even inconsistent.  Our role is to apply critical analysis to our subjects, the same way a scholar of English Literature or Russian History or Quantum Physics would.

Scholars of the Hebrew Scriptures, for example, have argued that there are two competing and contradictory creation stories in Genesis, that the book of Isaiah was composed by at least three authors, that the genealogical narratives in Matthew and Luke disagree, and that Paul only actually composed about half of the letters in the New Testament that bear his name.  And you will find all of these ideas routinely taught in secular state schools like UNCG as well as mainstream seminaries like Princeton and Wake Forest.

It just doesn’t matter what one’s religion is, or even whether one has one.  Some of the best and most reliable books on New Testament subjects have been written by Roman Catholics, Protestants, atheists, Jews, women, and yes, even Muslims.  One’s personal religion simply has no place in scholarship, any more than being a Christian or Jew or Muslim would affect the way that a biologist studies cells or an astronomer studies space.

Scholarly Books about Jesus

One’s religion, or lack thereof, may point a scholar in certain directions and may inform what interests him or her—it may even make the work a vocation or calling.  It may inform one’s training and influence one’s methodologies.  Or it may not.  But it neither qualifies someone to study one religion nor prevents them from studying another.  One’s training—including those degrees that Dr. Aslan pointed out—is what does that.

As my first religion professor Henry Levinson (a Festive-Naturalist Jew who didn’t hold the traditional concept of God adhered to by his religion) often put it: “It doesn’t take one to know one; it takes one to be one.”

Dr. Henry Levinson

Religious studies scholars are trying to “know” religions and religious people, not “be” them, for that is something tangential at best to our roles as scholars.

So this should be the official motto of all religious studies scholarship, in which what one’s religion “is” has no bearing on the quality of the scholarship one does.

Anything less is not scholarship.

It’s simply propaganda.

What Should We Learn in College? (Part II)

by Wade Maki

In my last post I discussed comments made by our Governor on what sorts of things we should, and shouldn’t, be learning in college. This is a conversation going on across higher education. Of course we should learn everything in college, but this goal is not practical as our time and funds are limited. We are left then to prioritize what things to require of our students, what things will be electives, and what things not to offer at all.

One area where we do this prioritization is “general education” (GE), which is the largest factor in determining what we learn in college. Some institutions have a very broad model for GE that covers classic literature, history, philosophy, and the “things an educated person should know.” Exactly what appears on this list will vary by institution, with some being more focused on the arts, some on the humanities, and others on the social sciences. The point is that the institution, rather than the student, decides the core of GE.

The drawback to such a prescribed model for GE is that it doesn’t allow for much student choice. The desire for more choice led to another very common GE system, often referred to as “the cafeteria model,” whereby many courses are offered as satisfying GE requirements and each student picks from among them within each category. This system is good for student choice of what to learn, but it isn’t good if you want a connected “core” of courses.

In recent years there has been a move toward a “common core” in which all universities within a state would have the same GE requirements. This makes transfers easier, since all schools have the same core. However, it also tends to limit choice by reducing the options to only those courses offered at every school. In addition, it eliminates the local character of an institution’s GE (by making them all the same), which also forfeits the improvements that come from having competing systems (when everyone does GE their own way, good ideas tend to be replicated). If we don’t try different GE systems on campuses, then innovation slows.

No matter which direction we move GE, we still have to address the central question of “what should we learn?” For example, should students learn a foreign language? Of course they should in an ideal world, but consider that a foreign language requirement typically runs two years.  We must weigh the opportunity cost of that four-course requirement (what else could we have learned from four other courses in, say, economics, psychology, science, or communications?). This is just one example of how complicated GE decisions can be. Every course we require is a limitation on choice and makes it less likely that other (non-required) subjects will be learned.

As many states look at a “common core” model there is an additional consideration which is often overlooked.  Suppose we move to a common core of general education in which most students learn the same sorts of things.  Now imagine your business or work environment if most of your coworkers had learned the same types of things, while other areas of knowledge had been learned by none of them. Is this preferable to an organization whose employees learned very little in common but have more diverse educational backgrounds? I suspect an organization with more diversely educated employees will be more adaptable than one where there are a few things everyone knows and a lot of things no one knows.

This is my worry about the way we are looking to answer the question of what we should learn in college. In the search for an efficient, easy to transfer, common core we may end up:

  1. Having graduates with more similar educations and the same gaps in their educations.
  2. Losing the unique educational cultures of our institutions.
  3. Missing out on the long term advantage of experimentation across our institutions by imposing one model for everyone.

Not having a common core doesn’t solve all of the problems, but promoting experiments through diverse and unique educational requirements is worth keeping. There is another problem with GE that I can’t resolve, which is that most of us in college answer the question this way: “Everyone should learn what I did or what I’m teaching.” But that is a problem to be addressed in another posting. So, what should we learn in college?

Environmentalism and the Future

by Matt McKinnon

Let me begin by stating that I consider myself an environmentalist.  I recycle almost religiously.  I compost obsessively.  I keep the thermostat low in winter and high in summer.  I try to limit how much I drive, but as the chauffeur for my three school-age sons, this is quite difficult.  I support environmental causes and organizations when I can, having been a member of the Sierra Club and the Audubon Society.

I find the arguments of the Climate Change deniers uninformed at best and disingenuous at worst.  Likewise, the idea of certain religious conservatives that it is hubris to believe that humans can have such a large effect on God’s creation strikes me as theologically silly and even dishonest.  And while I understand and even sympathize with the concerns of those folks whose businesses and livelihoods are tied to our current fossil-fuel addiction, I find their arguments that economic interests should override environmental concerns to be lacking in both ethics and basic forethought.

That being said, I have lately begun to ponder not just the ultimate intentions and goals of the environmental movement, but the very future of our planet.

Earth and atmospheric scientists tell us that the earth’s temperature is increasing, most probably as a result of human activity.  And even if we severely limited that activity (which we are almost certainly not going to do anytime soon), the consequences are going to be dire: rising temperatures will lead to more severe storms, melting polar ice caps, melting permafrost (which in turn will lead to the release of even more carbon dioxide, increasing the warming), rising ocean levels, lowering of the oceans’ pH levels (resulting in the death of the coral reefs), and devastating floods in some places along with crippling droughts in others.

And according to a 2007 report by the Intergovernmental Panel on Climate Change, by 2100 (less than 100 years from now) 25% of all species of plants and land animals may be extinct.

Basically, our not-too-distant future may be an earth that cannot support human life.

Now, in my more misanthropic moments, I have allowed myself to indulge in the idea that this is exactly what the earth needs.  That this in fact should be the goal of any true environmental concern: the extinction of humanity.  For only then does the earth as a planet capable of supporting other life stand a chance.  (After all, the “environment” will survive without life, though it won’t be an especially nice place to visit, much less inhabit, especially for a human.)

And a good case can be made that humans have been damaging the environment in asymmetrical and irrevocable ways since at least the Neolithic Age, when we moved from hunter-gatherer culture to the domestication of plants and animals and sustained agriculture.  (Unlike the beaver, to take just one example of a “keystone species,” whose dam building has an overwhelmingly positive and beneficial impact on countless other species as well as on the environment itself.)

So unless we’re seriously considering a conservation movement that takes us back to the Paleolithic Era instead of simply reducing our current use and misuse of the earth, then we’re really just putting off the inevitable.

But all that being said, whatever the state of our not-too-distant future, the inevitability of the “distant future” is undeniable—for humans, as well as beavers and all plants and animals, and ultimately the earth itself.  For the earth, like all of its living inhabitants, has a finite future.

Around 7.5 billion years or so is a reasonable estimate.  And then the earth will most probably be absorbed by the sun, which will have swollen into a red giant.

(Unless, as some scientists predict, the Milky Way collides with the Andromeda galaxy, resulting in cataclysmic effects that cannot be predicted.)

At best, however, this future only includes the possibility of earth supporting life for another billion years or so.  For by then, the sun’s increasing brightness will have evaporated all of the oceans.

Of course, long before that, the level of carbon dioxide in the atmosphere (ironically enough) will have diminished well below the quantity needed to support plant life, destroying the food chain and causing the extinction of all animal species as well.

And while that’s not good news, the worse news is that humans will have been removed from the equation long before the last holdouts of carbon-based life-forms eventually capitulate.

(Ok, so some microbes may be able to withstand the dry inhospitable conditions of desert earth, but seriously, who cares about the survival of microbes?)

Now if we’re optimistic about all of this (irony intended), the best-case scenario is an earth that is able to support life as we know it for at most another half billion years.  (Though this may be a stretch.)  And while that seems like a really long time, we should consider that the earth has already been inhabited for just over three and a half billion years.

So having only a half billion years left is sort of like trying to enjoy the last afternoon of a four-day vacation.

Enjoy the rest of your day.

What Should we Learn in College? (Part I)

by Wade Maki

Recently Governor McCrory made some comments on William Bennett’s radio show about higher education. These comments got a lot of people’s attention, and not necessarily the good kind. Before reading any commentary on what someone else has said, it is best to check out the original source. To that end, I suggest listening to the entire segment of the Governor on the show (which you can download as an MP3 here).

Governor Pat McCrory

Several comments were made regarding higher education, including the importance an education has in getting a job, the shortage of certain kinds of training (welding), and the surplus of workers with other kinds of education (including gender studies, philosophy, and Swahili). While there are a lot of things worth responding to in the radio segment, I will address only one issue: why disciplinary training in philosophy is valuable. Philosophy is, after all, my field, and it is wise to restrict one’s public claims to what one knows.

What does philosophy teach us? Common answers include increased critical thinking, argumentation skills, and clarity of communication. In practice this includes a bundle of skills such as: seeing the logical implications of proposed ideas or courses of action; the ability to identify the relevant issue under discussion and separate out the “red herrings”, unsupported arguments, or fallacious reasoning; being able to break down complex ideas, issues, or communications and explain them in a logically organized fashion, etc. I could go on, but these are a sampling of the real skills learned from an education in philosophy.

What the governor and Dr. Bennett (who holds a Ph.D. in Philosophy) said gives the impression that a philosophy education doesn’t help students get jobs. This has been a takeaway message in the media. Since others have made the case that a job isn’t the goal of an education, I leave it to the reader to examine that argument. There are two points about the discussion that should be noted. First, Dr. Bennett was suggesting that we have too many Ph.D.’s in philosophy, which is a separate claim from the claim that philosophy lacks educational value. It may be true that we have an oversupply of Ph.D.’s in many disciplines (and a shortage in others). The causes of this are many and include the free choice of students as to what to study, the impetus for universities to create graduate programs to enhance their reputations, and the ability to reduce teaching costs by putting graduate students in the classroom. Again, I leave it to others to examine these causes. Nothing Dr. Bennett said indicated that undergraduates shouldn’t learn philosophy.

Dr. William “Bill” Bennett

This leads me to the second point—Dr. Bennett is himself an example of the value philosophy adds to an education. What do you do with a philosophy education? Dr. Bennett parlayed his philosophical training, along with legal training (a common combination), into a career as Secretary of Education, political commentator, author, and talk radio host. His logical argumentation skills and his knowledge of Aristotle and virtue ethics are evident throughout his work. The very skills described above as benefits of a philosophical education are the skills his career represents.

There are very good reasons to include philosophy as part of our higher education curricula. Unfortunately, philosophy becomes an easy target for disparagement in public discourse, for at least two reasons. First, most people don’t have an understanding of what philosophy is and how it develops numerous valuable skills. Second, philosophy teaches transferable skills that enhance many careers without having a single career associated solely with it (besides teaching). In other words, one studies nursing to become a nurse, but one does not study philosophy simply to become a philosopher. The value of philosophy is found in the skills it develops, which can be applied to all sorts of jobs. I suspect Dr. Bennett would agree, and I hope Governor McCrory will as well.

Pride and Prejudice

by Ann Millett-Gallant

From Wednesday, Sept 26 – Sunday, Sept 30, Durham hosted the 28th annual Pride Weekend.  This festival, which began in 1981 and is the largest LGBT event in North Carolina, featured a number of colorful performances, including music, dance, karaoke, DJs, and comedy (notably a headline set by Joan Rivers), as well as parties and get-togethers, lunches and dinners, meetings over coffee, walks and runs, church services, vendors, and a lavish and lively parade.  According to the organizers’ website, the mission of these events is:

  • to promote unity and visibility among lesbians, gay men, bisexual and transgendered people
  • to promote a positive image through programs and public activities that foster an awareness of our past struggles
  • to be recognized as an important and talented sector of our diverse state.
  • to support and encourage HIV/AIDS education, breast cancer awareness and basic health education

Although I am in complete support of these missions and always love a good party, I have only attended the parade twice, with a friend of mine who is a lesbian.  I was thrilled when my new friend, Jay O’Berski, invited me to be a part of the float hosted this year by his Durham-based theater company, The Little Green Pig.  We all wore t-shirts in support of Pussy Riot, a Russian feminist punk collective who stage activist guerrilla performances all over Moscow and who were recently incarcerated (for more information, see this interview).

This is a photo of me in my Pussy Riot t-shirt in the café of the Durham Whole Foods before the parade.  Unfortunately, pouring rain prevented me from marching, or “scooting” in the parade, so I modeled my shirt where other marchers were gathered.  Although the parade was inaccessible to me this year, the spirit of the event inspired me.

The Pussy Riot acts relate to Unit 6 of my course BLS 348: Representing Women, “Performance as Resistance,” and most specifically to the activist work of the Guerrilla Girls.

The Guerrilla Girls are a performance team whose work includes live actions as well as posters and printed projects that critique the masculine biases of art history. The assigned reading for this class, the Introduction and Conclusion to The Guerrilla Girls’ Bedside Companion to the History of Western Art, presents a selection of their written projects, many of which employ irony, satire, and a witty sense of humor. The Guerrilla Girls call for change and invite others to partake in their protests.

In 1989, the Guerrilla Girls challenged the Metropolitan Museum of Art over its lack of representation of female artists. Almost 85% of the Met’s nudes were female, while works by female artists made up only about 5% of its collection.  The resulting ad appeared on New York City buses.

Representing Women also includes an assigned reading on homosexual artists:  Harmony Hammond, “Lesbian Artists,” in Amelia Jones, ed. The Feminism and Visual Culture Reader, 2nd edition (London, New York: Routledge, 2010), p. 128-129.

After the parade, and after conducting research for this blog post, I became aware that one lesson might not be enough.  The Bachelor of Arts in Liberal Studies program emphasizes diversity and the breadth and wealth of differing human experiences.

Jay Parr raised similar points in his blog post of 9/27/11.  In “The Significance of a Simple Ring,” he discussed his discomfort at seeing a non-married, homosexual man wearing a ring.  Parr analyzed his negative reaction, given his full support of and numerous friendships with the LGBT community.   In the specific context of UNCG, Parr stated: “The irony is that the training seminar I was attending was so that I could become a certified Safe Zone ally, so that I could advertise to the university that, hey, if you’re an LGBTQ member of our community and you need someone to talk with about that, I’m here for you.”

Parr then focused on the significance of the ring as a symbol of one’s commitment to their spouse, as well as of the legal and social status of marriage.  He advocated that all couples should have the right to the ring and all the significance and rights surrounding it.

Parr’s post predated passage of the marriage amendment to the state constitution in May 2012, the so-called “Defense of Marriage” amendment, which solidified the ban on same-sex marriage in North Carolina.  I felt disappointed and defeated by this law, but maybe, at least, it will motivate those who are against such legislation to speak out.  Not long after the amendment passed, President Obama “came out” with his support of same-sex marriage, bringing the discussion to national attention.

Opponents of same-sex marriage say it’s an affront to traditional marriage.  Yet my husband and I, although we are heterosexual, do not have a traditional marriage: we lived together for three years before becoming engaged, I proposed to him, and we have no plans, nor desire, to have children.  Further, I was born without fingers, so I literally can’t wear a ring.  Nonetheless, we were allowed to get married, and the minister I found online was, I’m pretty sure, a lesbian.  She was ordained, but she would not have been legally able to marry a loving partner herself.  In my opinion, bans on same-sex marriage are an affront to civil rights.  Interracial marriage was not legalized in all states until 1967, and 45 years later we are debating similar issues.  I hope that events like the Pride Parade and public support of same-sex marriage will lead toward positive change.

I feel hopeful this fall, as new television shows such as The New Normal and Couples feature strong, openly homosexual characters, adding to the presence of happy same-sex couples on television, in examples such as Modern Family (winner of the most 2012 Emmy awards), Glee, The Ellen DeGeneres Show, and Grey’s Anatomy, as well as popular shows that ended in the past few years, like Ugly Betty and Brothers and Sisters.  While I hesitate to wish that reality would mirror television in general, this is evidence that perhaps American culture is beginning to have more exposure to and familiarity with so-called “alternative” lifestyles.

__________

Editor’s note: Ann Millett-Gallant will be giving a book talk about her book, The Disabled Body in Contemporary Art, on Tuesday, November 13, at 3:00 PM, in the Multicultural Resource Center, on the ground floor of the Elliott University Center.

Nimrod: What’s In a Name?

by Matt McKinnon

My teenage son is a nimrod. Or so I thought.

And if you have teenagers, and were born in the second half of the twentieth century, you have probably thought at one time or another that your son (or daughter) was a nimrod too, and would not require any specific evidence to explain why.

Of course this is the case only if you are of a certain age: namely, a Baby Boomer or Gen Xer (like myself).

For if you are any older, and if you are rather literate, then you would be perplexed as to why I would think that my son was a nimrod, and why I was not capitalizing Nimrod as it should be.  Since it is, after all, a proper noun.

It is?

Yes, it is.  Or rather it was.  Let me explain.

It turns out, the word “nimrod” (or more properly “Nimrod”) has a fascinating history in which it undergoes a substantial reinterpretation.  (Any nimrod can find this out by searching the web, though there is precious little explanation there.)  This, by itself, isn’t surprising, as many words that make their way through the ages transform as well.  But the transformation of “Nimrod” to “nimrod” is particularly interesting in what it tells us about ourselves and our culture.

Nimrod, you see, was a character from the Hebrew Scriptures, or as Christians know it, the Old Testament:

“Cush became the father of Nimrod; he was the first on earth to become a mighty warrior. He was a mighty hunter before the LORD; therefore it is said, ‘Like Nimrod a mighty hunter before the LORD.’”  (Genesis 10:8-9 NRSV)

This is the manner in which older biblically literate folks will understand the term: “as a mighty hunter.”

But there’s more here, for these folks also might understand the term as referencing a tyrannical ruler.

Why?  Well, the etymology of the word links it to the Hebrew “to rebel,” not for anything that Nimrod actually does in the Old Testament, but because, as many scholars attest, it is probably a distortion of the name for the Mesopotamian war-god Ninurta.  And the later chroniclers of Israelite religion didn’t have much sympathy for the polytheism of their Mesopotamian neighbors—especially when it so obviously informed their own religious mythology.

So the word, when it very early on enters the biblical narrative, already shows signs of transformation and tension as referencing both a mighty hunter as well as someone rebellious against the Israelite god.

In fact, Jewish and Christian tradition names Nimrod as the leader of the folks who built the Tower of Babel, though this is not found anywhere in the scriptures.  This, then, is how Nimrod is portrayed in more conservative circles today, despite the lack of biblical attestation.

And as the word is already attested in Middle English, by the 16th century it was clearly being used in both senses in the English language: as a tyrant and as a great warrior or hunter.

Now I can assure you, neither of these describes my teenage son.  So what gives?

Well, “Nimrod” shows up in a 1932 Broadway play (which had only 11 performances) about two lovesick youngsters:

“He’s in love with her. That makes about the tenth. The same old Nimrod. Won’t let her alone for a second.”

Here, however, the emphasis is still on the term’s former meaning as a hunter, though its use in the play to describe a somewhat frivolous and hapless fellow who moves from one true love to the next points us in the right direction.

And in the 1934 film You’re Telling Me, W.C. Fields’ character, a bit of a buffoon himself (and a drunkard), takes a few swings with a limp golf club and hands it back to his dim-witted caddy, saying in a way only W.C. Fields could:

“Little too much whip in that club, nimrod.”

So here we have the first recorded instance of the word’s transformation from a great hunter or tyrant to a stupid person or jerk.

But that’s not the end of the story.  After all, how many of us have seen You’re Telling Me?  (I hadn’t, at least not until I did the research.)

So the last, and arguably most important piece to the puzzle is not the origination of the word or its transformation, but rather the dissemination of it.

And that, as I’m sure many of you are aware, is none other than TV Guide’s greatest cartoon character of all time: Bugs Bunny, who debuted in the 1940s, not that long after You’re Telling Me premiered.

In this context, the one most folks born after World War II are familiar with, Bugs Bunny refers to the inept hunter Elmer Fudd as a “little nimrod.”  And the rest, as they say, is history.

For what emerges from Bugs’ usage is not the traditional reference to Fudd as a hunter (though this is the obvious, albeit ironic, intention), but rather Fudd’s more enduring (and endearing?) quality of ineptitude and buffoonery.

And anyone who has (or knows) a teenager can certainly attest to the applicability of this use of the term in describing him or her.

But the important thing is what this says about literacy and our contemporary culture.

For whereas my parents’ generation and earlier were more likely than not to receive their cultural education from classical stories, the great literature of Europe, and the Bible, those of us born in the latter half of the 20th century and later are much more likely to receive our cultural education from popular culture.

I have seen this firsthand when teaching, for example, Nietzsche’s “Thus Spake Zarathustra,” which offers a critique of Judaism and Christianity by parodying scripture.  The trouble is, when students don’t know the referent, they can’t fully understand or appreciate the allusion.  And this is as true of Shakespeare and Milton as it is of Nietzsche…or Bugs Bunny for that matter.

And the ramifications of this are far greater than my choosing the proper term to criticize my teenage son.

(Though ya gotta admit, “nimrod” sounds pretty apropos.)