
What Should We Learn in College? (Part II)

by Wade Maki

In my last post I discussed comments made by our Governor on what sorts of things we should, and shouldn't, be learning in college. This is a conversation going on across higher education. In an ideal world, of course, we would learn everything in college, but that goal is not practical: our time and funds are limited. We are left, then, to prioritize what things to require of our students, what things will be electives, and what things not to offer at all.

One area where we do this prioritizing is "general education" (GE), which is the largest issue in determining what we learn in college. Some institutions have a very broad model for GE that covers classic literature, history, philosophy, and the "things an educated person should know." Exactly what appears on this list will vary by institution, with some being more focused on the arts, some on the humanities, and others on the social sciences. The point is that the institution decides on a small core for GE.

The drawback to a prescribed model for GE is that it doesn't allow for much student choice. The desire for more choice led to another very common GE system, often referred to as "the cafeteria model," whereby many courses are offered as satisfying GE requirements and each student picks preferred courses within each category. This system is good for student choice of what to learn, but it isn't good if you want a connected "core" of courses.

In recent years there has been a move to have a “common core” in which all universities within a state would have the same GE requirements. This makes transfers easier since all schools have the same core. However, it also tends to limit the amount of choice by reducing the options to only those courses offered at every school. In addition, it eliminates the local character of an institution’s GE (by making them all the same), which also reduces improvements from having competing systems (when everyone does it their own way, good ideas tend to be replicated). If we don’t try different GE systems on campuses then innovation slows.


No matter which direction we move GE, we still have to address the central question of "what should we learn?" For example, should students learn a foreign language? Of course they should in an ideal world, but consider that foreign language requirements typically run two years, or four courses. We must weigh the opportunity cost of that requirement: what else could we have learned from four other courses in, say, economics, psychology, science, or communications? This is just one example of how complicated GE decisions can be. Every course we require is a limitation on choice and makes it less likely that other (non-required) subjects will be learned.

As many states look at a "common core" model, there is an additional consideration which is often overlooked. Suppose we move to a common core of general education in which most students learn the same sorts of things. Now imagine your business or work environment if most of your coworkers had learned the same types of things, while other areas of knowledge had been learned by none of them. Is this preferable to an organization whose educated members learned very little in common but have more diverse educational backgrounds? I suspect an organization with more diversely educated employees will be more adaptable than one where there are a few things everyone knows and a lot of things no one knows.


This is my worry about the way we are looking to answer the question of what we should learn in college. In the search for an efficient, easy-to-transfer common core, we may end up:

  1. Having graduates with more similar educations and the same gaps in their educations.
  2. Losing the unique educational cultures of our institutions.
  3. Missing out on the long-term advantage of experimentation across our institutions by imposing one model for everyone.

Not having a common core doesn't solve all of the problems, but promoting experiments through diverse and unique educational requirements is worth keeping. There is another problem with GE that I can't resolve, which is that most of us in college answer the question this way: "Everyone should learn what I did or what I'm teaching." But that is a problem to be addressed in another posting. So, what should we learn in college?

All Hallows Eve…and Errors

by Matt McKinnon

All Hallows Eve, or Hallowe’en for short, is one of the most controversial and misunderstood holidays celebrated in the United States—its controversy owing in large part to its misunderstanding.  More so than the recent “War on Christmas” that may or may not be raging across the country, or the most important of all Christian holidays—Easter—blatantly named after the pagan goddess (Eostre), Halloween tends to separate Americans into those who enjoy it and find it harmless and those who disdain it and find it demonic.  Interestingly enough, both groups tend to base their ideas about Halloween on the same erroneous “facts” about its origins.

A quick perusal of the internet (once you have gotten past the commercialized sites selling costumes and the like) will offer the following generalizations about the holiday, taken for granted by most folks as historical truth.

Common ideas from a secular and/or Neopagan perspective:

  • Halloween developed from the pan-Celtic feast of Samhain (pronounced "sow-in")
  • Samhain was the Celtic equivalent of New Year's
  • This was a time when the veil between the living and dead was lifted
  • It becomes Christianized as “All Hallows Day” (All Saints Day)
  • The eve of this holy day remained essentially Pagan
  • Celebrating Halloween is innocent fun

Common ideas from an Evangelical Christian perspective (which would accept the first five of the above):

  • Halloween is Pagan in origin and outlook
  • It became intertwined with the “Catholic” All Saints Day
  • It celebrates evil and/or the Devil
  • It glorifies death and the macabre
  • Celebrating Halloween is blasphemous, idolatrous, and harmful

Even more “respectable” sites like those from History.com and the Library of Congress continue to perpetuate the Pagan-turned-Christian history of Halloween despite scarce evidence to support it, and considerable reason to be suspicious of it.

To be sure, like most legends, this “history” of Halloween contains some kernel of fact, though, again like most things, its true history is much more convoluted and complex.

The problem with Halloween and its Celtic origins is that the Celts were a semi-literate people who left only some inscriptions: all the writings we have about the pre-Christian Celts (the pagans) are the product of Christians, who may or may not have been completely faithful in their description and interpretation.  Indeed, all of the resources for ancient Irish mythology are medieval documents (the earliest being from the 11th century—some 600 years after Christianity had been introduced to Ireland).

It may be the case that Samhain indeed marked the Irish commemoration of the change in seasons “when the summer goes to its rest,” as the Medieval Irish tale “The Tain” records.  (Note, however, that our source here is only from the 12th century, and is specific to Ireland.)  The problem is that the historical evidence is not so neat.

A heavy migration of Irish to the Scottish Highlands and Islands in the early Middle Ages introduced the celebration of Samhain there, but the earliest Welsh (also Celtic) records afford no significance to the same dates.  Nor is there any indication that there was a counterpart to this celebration in Anglo-Saxon England from the same period.

So the best we can say is that, by the 10th century or so, Samhain was established as an Irish holiday denoting the end of summer and the beginning of winter, but that there is no evidence that November 1 was a major pan-Celtic festival, and that even where it was celebrated (Ireland, the Scottish Highlands and Islands), it did not have any religious significance or attributes.

As if the supposed Celtic origins of the holiday were not uncertain enough, its "Christianization" by a Roman Church determined to stomp out ties to a pagan past is even more problematic.

It is assumed that because the Western Christian churches now celebrate All Saints Day on November 1st—with the addition of the Roman Catholic All Souls Day on November 2nd—there must have been an attempt by the clergy of the new religion to co-opt and supplant the holy days of the old.  After all, the celebrations of the death of the saints and of all Christians seem to directly correlate with the accumulated medieval suggestions that Samhain celebrated the end and the beginning of all things, and recognized a lifting of the veil between the natural and supernatural worlds.

The problem is that All Saints Day was first established by Pope Boniface IV on 13 May, 609 (or 610) when he consecrated the Pantheon at Rome.  It continued to be celebrated in Rome on 13 May, but was also celebrated at various other times in other parts of the Western Church, according to local usage (the medieval Irish church celebrated All Saints Day on April 20th).

Its Roman celebration was moved to 1 November during the reign of Pope Gregory III (d. 741), though with no suggestion that this was an attempt to co-opt the pagan holiday of Samhain.  In fact, there is evidence that the November date was already being kept by some churches in England and Germany as well as the Frankish kingdom, and that the date itself is most probably of Northern German origin.

Thus the idea that the celebration of All Saints Day on November 1st had anything to do either with Celtic influence or Roman concern to supersede the pagan Samhain has no historical basis: instead, Roman and Celtic Christianity followed the lead of the Germanic tradition, the reason for which is lost to history.

The English historian Ronald Hutton concludes that, while there is no doubt that the beginning of November was the time of a major pagan festival that was celebrated in all of the pastoral areas of the British Isles, there is no evidence that it was connected with the dead, and no proof that it celebrated the new year.

By the end of the Middle Ages, however, Halloween—as a Christian festival of the dead—had developed into a major public holiday of revelry, drink, and frolicking, with food and bonfires, and the practice of “souling” (a precursor to our modern trick-or-treating?) culminating in the most important ritual of all: the ringing of church bells to comfort the souls of people in purgatory.

The antics and festivities that most resemble our modern Halloween celebrations come directly from this medieval Christian holiday: the mummers and guisers (performers in disguise) of the winter festivals were also active at this time, and in the practice of "souling," children would go around soliciting "soul cakes" representing souls being freed from purgatory.

The tricks and pranks and carrying of vegetables (originally turnips) carved with scary faces (our jack o’ lanterns) are not attested to until the nineteenth century, so their direct link with earlier pagan practices is sketchy at best.

While the Celtic origins of Samhain may have had some influence on the celebration of Halloween as it began to take shape during the Middle Ages, the Catholic Christian culture of the Middle Ages had a much more profound effect: the ancient notion of the spiritual quality of the dates October 31st/November 1st became specifically associated with death, and later with the macabre of more recent times.

Thus modern Halloween is more directly a product of the Christian Middle Ages than of Celtic Paganism. That some deny its rootedness in Christianity while others deride its essence as pagan is more an indication of how these groups feel about medieval Catholic Christianity than about Celtic Paganism (about which we know so very little).

And to the extent that we fail to realize just how Christian many of these practices and festivities were, we fail to see how successful the Reformation and the later movements of pietism and rationalism have been in redefining exactly what "Christian" is.

As such, Halloween is no less Christian and no more Pagan than either Christmas or Easter.

Happy trick-or-treating!

Spiders and Toads

By Marc Williams

Laurence Olivier as Richard III.

“Now is the winter of our discontent
Made glorious summer by this sun of York.”
~Richard III (Act 1, scene 1).

King Richard III is among Shakespeare's greatest villains. Based on the real-life Richard of Gloucester, Shakespeare's title character murders his way to the throne, bragging about his deeds and ambitions to the audience in some of Shakespeare's most delightful soliloquies. Shakespeare's Richard is famously depicted as a hunchback, and uses his physical deformity as justification for his evil ambitions:

Why, I, in this weak piping time of peace,
Have no delight to pass away the time,
Unless to spy my shadow in the sun
And descant on mine own deformity:
And therefore, since I cannot prove a lover,
To entertain these fair well-spoken days,
I am determined to prove a villain
And hate the idle pleasures of these days.

For stage actors, Richard III is a tremendously challenging role. On one hand, he is pure evil—but he must also be charming and likeable. If you aren't familiar with the play, its second scene features Richard successfully wooing Lady Anne as she grieves beside the corpse of her father-in-law, King Henry VI. And Richard is the man who killed both Henry and her husband! Shakespeare's Richard is both evil and smooth.

Simon Russell Beale as Richard III.

Actors must also deal with the issue of Richard’s physical disability. For instance, Richard is described as a “poisonous bunch-back’d toad,” an image that inspired Simon Russell Beale’s 1992 performance at the Royal Shakespeare Company, while Antony Sher’s iconic 1984 interpretation was inspired by the phrase “bottled spider,” an insult hurled at Richard in Act I.

Antony Sher’s “bottled spider” interpretation of Richard III.

While much of the historical record disputes Shakespeare’s portrayal of Richard as a maniacal mass-murderer, relatively little is known about Richard’s disability. According to the play, Richard is a hunchback with a shriveled arm. However, there is little evidence to support these claims.

This uncertainty may soon change. Archaeologists in Leicester, England, have uncovered the remnants of a chapel that was demolished in the 16th century. That chapel, according to historical accounts of Richard's death at the Battle of Bosworth, was Richard's burial site. Not only have researchers found the church, but they have also located the choir area, where Richard's body was allegedly interred. And indeed, last week, the archaeologists uncovered bones in the choir area.

If the archaeologists have indeed found the remains of Richard III, the famous king was definitely not a hunchback. It appears he suffered from scoliosis—a lateral curve or twist of the spine—but not from kyphosis, a different kind of spinal curvature that leads to a pronounced forward-leaning posture. As Dr. Richard Taylor explains in the video, the excavated remains suggest this person would have appeared to have one shoulder slightly higher than the other as a result of scoliosis.

Interestingly, Ian McKellen’s performance as Richard III, captured in Richard Loncraine’s 1996 film, seems to capture the kind of physical condition described by Dr. Taylor, with one shoulder slightly higher than the other. At the 6:45 mark in this video, one can see how McKellen dealt with Richard’s condition.

So it appears Shakespeare not only distorted historical details in Richard III, he also apparently distorted the title character’s shape. Of Shakespeare’s Richard, McKellen wrote:

Shakespeare’s stage version of Richard has erased the history of the real king, who was, by comparison, a model of probity. Canny Shakespeare may well have conformed to the propaganda of the Tudor Dynasty, Queen Elizabeth I’s grandfather having slain Richard III at the Battle of Bosworth. Shakespeare was not writing nor rewriting history. He was building on his success as the young playwright of the Henry VI trilogy, some of whose monstrously self-willed men and women recur in Richard III.

It seems likely that Shakespeare wanted Richard to seem as evil as possible in order to flatter Queen Elizabeth I, depicting her grandfather as England’s conquering hero. But why distort Richard’s physical disability as well?

It is difficult to ascertain what Shakespeare's motives might have been in describing Richard's body as he did, and perhaps even more difficult to assess his attitudes toward physical difference in general. For example, in my "Big Plays, Big Ideas" class in the BLS program, we discuss the issue of race in Othello, even though we don't know much about what Shakespeare thought about race. Many scholars have investigated the subject of physical difference in Shakespeare, of course: there are papers on Richard's spine, naturally, but also on Othello's seizures, Lavinia's marginalization in Titus Andronicus after her hands and tongue are severed, the depiction of blindness in King Lear, and even Hermia's height in A Midsummer Night's Dream. And just as one must ask, "Is Othello about race?", we might also ask, "Is Richard III about shape?" I doubt many would argue that physical difference is the primary focus of Shakespeare's Richard III, but it will be interesting to observe how the apparent discovery of Richard's body affects future performances of the play. Will actors continue to twist their bodies into "bottled spiders," or will they focus on the historical Richard's scoliosis—and perhaps ask why such vicious language is used to describe such a minor difference?

Resurgence of the American Right

By Claude Tate

In my BLS class, "Self, Society, and Salvation," we devote a unit to problems of society. In that unit we look at three different approaches to organizing society, each designed to promote what it believes to be the best society. The focus of aristocratic theory is to create an orderly society. We next look at liberalism, which seeks to create a society that is as free as possible. Our final lesson is on socialism, which strives to create a society that is fair and just. In each discussion we examine how those three approaches are manifested in America. In our lessons on liberalism and socialism in particular, I try to emphasize how we have struggled from the start over how to create a society that is not only as free as possible but also fair and just, and how we see that struggle played out in the news every day. This post concerns that struggle, a struggle that will only grow more intense in the coming months as the 2012 elections near. And that intensity will be due in large part to the resurgence of the American Right.

In 2008 our economy suffered, for want of a better term, a melt-down that impacted every area of the economy. The specifics of what caused the "Great Recession" will be argued over for years to come, much like the specifics of what brought on the Great Depression. But it is safe to say that the 2008 collapse was ideologically driven. The drive for lower taxes and less regulation of business, which began to gain traction in the '70s and steadily gained ground through the '80s and '90s, dominated policy under the administration of George W. Bush. Taxes were reduced. Regulations that could be eliminated were; areas where new regulations were needed were ignored; and people who had spent much of their careers fighting our regulatory agencies were put in charge of the very agencies they now led. Our yearly budget surplus was quickly turned into a yearly deficit as revenues coming into the government failed to keep up with our spending. And businesses from Main Street to Wall Street were allowed to conduct business as they saw fit. The gap between the rich and the middle class, which had begun growing decades earlier, widened at an increasing rate. And both the government and individual citizens were going deeper into debt. But the stock market was soaring. All was well. And if we had any doubts, President Bush and Treasury Secretary Hank Paulson assured us the economy's fundamentals were strong. In retrospect their statements sounded much like those the leaders of business were making in September and October of 1929, weeks before the crash that signaled the beginning of the Great Depression.

But just as in 1929, statements of reassurance could not stop what happened to the economy in the fall of 2008. Capitalism depends on a sound banking system, and our major banks were failing. Both Democrats and Republicans agreed the government had to help them survive. The economy as a whole was beginning to go into a tailspin, one that many feared would end in another Depression if the banks failed. Thus TARP was passed under President Bush. Taxpayer money flowed to the major banks to save them from a catastrophe of their own making. And it should be noted that our actions were not unique; other Western nations took similar actions. The American public did not rise up in protest. In fact, Democrats dominated the election of 2008: they not only won the Presidency but also increased their majority in the House and clearly took control of the Senate. (Prior to the election, Democrats had relied upon two independents with whom they caucused to control the Senate by a 51 to 49 margin.) Virtually everyone agreed: the American Right was dead.

But the banking situation had not stabilized when President Obama took office in January of 2009, so he continued TARP. And within weeks the Right started to show signs of life. The defibrillator was the set of policies enacted to help the nation recover. TARP led the way. We began to hear of banks that were being given taxpayer money to survive handing out large bonuses to many top executives. Their justification was that they needed to hold onto their talent. This was the same talent that had almost destroyed them. The American public began to grow restless. A movement was born, the Tea Party, which embraced the very ideology that had caused the collapse. Anger that one would think should have been directed at Wall Street and at the business practices that caused the collapse was instead redirected at the government.

A stimulus bill, the American Recovery and Reinvestment Act of 2009, was passed, but due to Republican opposition the final bill was too small to have a substantial impact on recovery. It mitigated the impact of the economic downturn and saved many jobs, but it was not enough to pull us out. The Dodd-Frank Wall Street Reform and Consumer Protection Act, designed to tighten regulations on banks and close loopholes that had allowed for dangerous speculation, passed, but with opposition and resentment. And to save the automobile industry, the government loaned money to General Motors and Chrysler. What in the past would have been hailed as necessary was now condemned as government interference in the private sector. Both are paying their loans back, and GM is once again number one in sales. Thousands of jobs were saved, and both are hiring back workers. (But still, Mitt Romney maintains we should have let them fail.) And finally, the Patient Protection and Affordable Care Act, an act very similar to a Republican proposal from the 1990s, was vehemently opposed by Republicans and derided by the Tea Party and others as a government takeover of healthcare. Misinformation was rampant. The government was not going to kill granny. (I loved a sign I saw a woman holding up at one of the "town meetings": "Government, keep your hands off of my Medicare.")

The anger against the government accelerated in the fall of 2010, leading to a Republican victory in the mid-term elections, with Republicans not only taking over the House of Representatives but also many governorships and state legislatures. And many of those newly elected Republican officials vowed to reinstitute the policies that led to the collapse. That move to the right in the Republican Party has continued. Today we are even hearing of Social Security as we know it being dismantled, Medicare being fundamentally altered, funding for our public schools and universities being cut, and our universities being called places where the "liberal elite" indoctrinate our young. (I refer to our indoctrination methods as "guerilla teaching," but evidently the right has seen through us.) And that is the tip of the iceberg. How did this happen? How did an economic collapse that one would think should have opened a window of opportunity for the left instead lead to a resurgence of the right?

On January 9th my wife and I were coming home from the mountains, and as we normally do we were listening to NPR (otherwise known as a propaganda tool for the liberal elite that should not be supported by taxpayers). The guest on "The Diane Rehm Show" that day was Thomas Frank. Mr. Frank, a writer and former opinion columnist for "The Wall Street Journal," had just published a new book, "Pity the Billionaire," in which he explored the birth of the Tea Party phenomenon. I purchased the book as soon as we got home and read it immediately. His is not the final word on the Tea Party and the rise of the Right, but the book does provide insight and, in my view, is well worth reading.

From the bestselling author of What’s the Matter with Kansas?, a wonderfully insightful and sardonic look at why the worst economy since the 1930s has brought about the revival of conservatism

–From book description on Amazon.com

Economic catastrophe usually brings social protest and demands for change—or at least it’s supposed to. But when Thomas Frank set out in 2009 to look for expressions of American discontent, all he could find were loud demands that the economic system be made even harsher on the recession’s victims and that society’s traditional winners receive even grander prizes. The American Right, which had seemed moribund after the election of 2008, was strangely reinvigorated by the arrival of hard times. The Tea Party movement demanded not that we question the failed system but that we reaffirm our commitment to it. Republicans in Congress embarked on a bold strategy of total opposition to the liberal state. And TV phenom Glenn Beck demonstrated the commercial potential of heroic paranoia and the purest libertarian economics.
In Pity the Billionaire, Frank, the great chronicler of American paradox, examines the peculiar mechanism by which dire economic circumstances have delivered wildly unexpected political results. Using firsthand reporting, a deep knowledge of the American Right, and a wicked sense of humor, he gives us the first full diagnosis of the cultural malady that has transformed collapse into profit, reconceived the Founding Fathers as heroes from an Ayn Rand novel, and enlisted the powerless in a fan club for the prosperous. The understanding Frank reaches is at once startling, original, and profound.

Click here to listen to the Diane Rehm Show interview with Mr. Frank that my wife and I heard.

New Year’s Resolutions

By Carrie Levesque

Is it too early to start thinking about New Year’s Resolutions?  It’s not something I usually get to until the holiday insanity is over and the next wave of media bombardment starts pushing gym memberships and Nutrisystem packages.   Though I filter out most of the blah blah blah, I do get to thinking about how to best use this gift of a new year.

While procrastinating online a few days ago, I came across an article, "Five lessons learned from living in Paris." I was struck by its epigraph, a quote from Hemingway's letters: "Paris is so very beautiful that it satisfies something in you that is always hungry in America." Hemingway's quote and this article on Jennifer L. Scott's new book, "Lessons from Madame Chic: The Top 20 Things I Learned While Living in Paris," are both indirectly about Americans' hunger for beauty and grace in a commerce-driven culture that sometimes doesn't seem to have a lot of use for either.

Two of Scott’s ‘lessons’ resonate with me as I reevaluate my life and habits at year’s end.  First was her observation that “Parisians often turned mundane aspects of everyday life into something special.”  She recalls how her host father made an event of savoring a bit of his favorite cheese every night at the end of dinner; it was just routine enjoyment of a cheese course, but he was so passionate about it that it became a special ritual whose memory Scott savored after she returned to American life.  Meals tend to be something we rush through in the US, a pit-stop refueling on our way to getting done more of whatever it is we are always rushing to get done.  It seems if we’re looking for anywhere in our lives to slow down and satisfy a spiritual hunger along with our physical hunger, mealtime is a worthy candidate.

She also comments on the European tendency to take more pride and care in one's dress while at the same time owning fewer clothes and accessories than Americans tend to, which simplifies the process of making oneself presentable each day. This calls to mind web projects like Project 333, one of many movements today exploring the benefits of "voluntary simplicity." Owning fewer things doesn't only save you money; it also frees you from having to care for and about heaps of stuff, so that you have more time and energy to indulge in things like Scott's first point: really enjoying and being present for life's smaller, even routine, pleasures.

OK, so none of this is anything new.  Oceans of ink have already been spilled on these topics (sorry, Ms. Scott).  And yet we still make New Year’s resolutions, and we still hunger and struggle, in all sorts of ways, to be better (whatever each of us decides that means), to improve the world around us.  Which is why at New Year’s I find myself looking to accounts I’ve come across of people closer to home who decided to devote their year to some sort of radical reevaluation of the way we live (like a New Year’s Resolution on steroids), and the lessons they learned from it.

Two of my favorites in the "less is more" genre are Judith Levine's "Not Buying It: My Year Without Shopping" (fairly self-explanatory) and Colin Beavan's "No Impact Man" (which became a movie in 2009; he and his family sought to live in NYC, for a whole year, in a way that made no environmental impact). For giving more consideration to the sacred place of food in our lives, I think of Barbara Kingsolver's wonderful "Animal, Vegetable, Miracle," her account of eating, for one year, only in-season food produced on or near her Virginia mountain farm. Two other works I might get to this year are A.J. Jacobs' "The Year of Living Biblically" (in which Jacobs, an agnostic, tries to take literal direction from the Good Book and considers the place of the sacred in our lives) and Sara Bongiorni's "A Year Without 'Made in China': One Family's True Life Adventure in the Global Economy" (in which she walks the walk of the 'Buy American' talk).

While I don't see myself devoting my year to anything requiring quite this level of discipline (baby steps!), it is inspiring to live vicariously through these authors and reflect on what they gained from their projects. They and their families forged meaningful new traditions and found the new sense of community that arises when one disconnects a bit from the mainstream economy (potlucks instead of takeout, preparing a fresh meal as a family instead of popping a frozen meal in the microwave, sharing communal resources instead of everyone needing to own and maintain their own whoziwhatsits). If you've got the money, it's easy to escape to an exotic foreign locale to feed your need for beauty. But it's possible that, if we take them more to heart, these authors' efforts can challenge us to find a beauty in our own communities that leaves us all a little less hungry in America.

My Experience in the BLS Program at UNCG

By Julia Burns, BLS Class of 2012

I woke up this morning and went into the bathroom to do my daily ritual as usual. The only problem I have is looking in the mirror at a very scary soon-to-be-52-year-old! Then I realized that today is the 11th of December, and in 4 more days I will officially be graduated. I did it! I worked hard to get my Bachelor of Arts degree while earning money in a reputable job. It took me 2 years in person and a year online to complete in 3½ years what would normally have taken 4 to 5 years. How did I do it? The Bachelor of Arts in Liberal Studies program at the University of North Carolina at Greensboro.

When I first enrolled, I thought this was going to be a breeze – a piece of cake. I couldn’t have been more wrong. I never worked so hard in my life. A traditional student can walk to class, take notes, study, test, and interact readily with other students; an online student does not have that luxury. An online article by Terence Loose points out the following seven myths about  online learning:

  • Online courses are easier than in-class courses.
  • You have to be tech-savvy to take an online class.
  • You don’t receive personal attention in online education.
  • You can “hide” in an online course and never participate.
  • You don’t learn as much when you pursue an online degree.
  • Respected schools don’t offer online degrees.
  • Networking opportunities aren’t available through online education.

I compared these seven myths to my experience with online classes. I am technologically illiterate, yet I managed. I received a lot of personal attention in online education. I couldn't hide in an online course and not participate if I expected to receive a grade and keep my financial aid. I learned more from studying online than I did from attending in person. The University of North Carolina at Greensboro is a well-respected, fully accredited state university. I made some wonderful contacts online–not just on the North Carolina campus but from all across the country as well as all over the world. Through online classes, I have learned the art of self-discipline, how to prioritize better, and how to write for specific disciplines; I have also developed a stronger interest in all types of literature and gained a great appreciation for all types of anthropology.

Many classes featured heated debates, such as the mock trials in “Great Trials in American History.” This was done live online and all students had to participate. It was a difficult night because in some parts of the country there were terrible thunderstorms and a lot of tornado activity going on. The thrill of the storms and the debate combined was really exciting!

What do I intend to do with this online Bachelor of Arts degree? I would like to be a lawyer or a teacher, but in the meanwhile I have chosen neither. I am currently refreshing my algebra skills to take the GRE and get my Master of Arts in Liberal Studies. The law has always fascinated me, and teaching would be a great challenge, but becoming better educated is where I am headed. Who knows–maybe I will get my PhD?

Confederates in the Attic

By Carrie Levesque

It’s an experience not uncommon to people who love books: every so often, you stumble across one that changes your life.

Years ago, I was catsitting for a neighbor, a Duke administrator with an extensive personal library of contemporary world literature. It was two weeks before I was to leave for Prague to begin research on one dissertation topic when I found myself plunging headlong into an entirely new topic, blown away by a book I had randomly selected from his shelves. I didn't know then, as I pored over Dubravka Ugresic's The Culture of Lies, that I had found not only a new dissertation topic but also a timely new teaching field that would inspire me for many years to come. I'm very thankful today that this book and I crossed paths (and especially that my advisor was willing to go along with that abrupt change of plans). I don't know that "Czech Women's Symbolist Literature" would have found the same broad appeal as "Women, War and Terror" when I joined the BLS program a few years later.

So this summer, it happened again. A recommended title I jotted down from a newspaper article on Southern culture turned up a few weeks later at my county's Friends of the Library sale, and bam: I had a live one.

I devoured the first 300 or so pages of Tony Horwitz's wildly thought-provoking and entertaining Confederates in the Attic and then dawdled through the last 90 because I could not bear for it to end. In his quest to understand "how the Lost Cause still resonates in the memory and rituals of the South" (book jacket), journalist Horwitz interviews some often colorful, always impassioned figures, white and black, from urban eccentrics to quiet, small-town folks, struggling to preserve or just make sense of what it means to be Southern roughly 130 years (at the time of writing) after the War Between the States. He probes the relationship between the Civil War and the South through its symbols (the rebel flag controversy), its branding and marketing (lunch with Scarlett O'Hara, anyone?), and, most fascinating, its reenactors, the most hardcore of whom spend many damp, cold nights in the fields of Virginia and Pennsylvania, huddled, hungry, and stinky, trying to get close to the experience of poorly equipped soldier-ancestors on the march into battle.

Prior to reading this book, I confess I had an attitude toward the Civil War common to many of us born Yankee: I knew just what I needed to know, namely that the North whupped the South and put an end to a certain deplorable "peculiar institution." It was, to me, in my shamefully glossed-over understanding, not much more than a tragic means to a long-overdue end. But, of course, it was that and infinitely more than that. It was the most horrific 4 years this country has ever seen, with over 620,000 dead and unspeakable suffering on the front lines and on the home front. In some of the fiercest battles, a soldier died nearly every second. It was a conflict started in part by tensions between two culturally and economically very different parts of this country, and both its liberating and shattering effects continue to shape this enduring tension 150 years later.

Since I finished Confederates, the pile of Civil-War-related books at my bedside grows almost daily.  With over 70,000 books published on the Civil War as of 2002 (Library of Congress), I’ve got my work cut out for me in pursuing this new personal and academic obsession.

My daughter at the burial site of Stonewall Jackson's arm.

Though my husband and I aren’t yet outfitting ourselves for a march to Gettysburg, we have begun dragging our daughters and any other willing family members to battlefields and other related sites (though not yet on quite as intensive a schedule as the “Civil Wargasm” tours of Horwitz’s star reenactor, Robert Lee Hodge).  A pilgrimage to Lexington and Appomattox is planned for spring.

And of course I'm intrigued by the possibility of bringing this interest into an updated version of Women, War and Terror. There is a considerable body of exciting research on the role of women in the Civil War. Mary Chesnut's famous diary is cited in every source I come across. I'm about halfway through historian Carol Berkin's Civil War Wives, which has been a riveting read. And I'm dying to know more about Jennie Hodgers, aka Albert Cashier, one of 400 documented cases of women who disguised themselves as men in order to fight as soldiers in the Civil War (http://www.civilwar.org/education/history/biographies/jennie-hodgers.html).

So what book has changed your life?  Or another question I’d love a response to: what is your relationship to the Civil War?  If you’ve got family stories to tell, you’ve got an eager ear right here!

The Threepenny Opera

By Marc Williams

The Threepenny Opera at UNCG

UNCG Theatre began performances of The Threepenny Opera Wednesday evening.  While it is best known for the song “Mack the Knife,” The Threepenny Opera is a tremendously important piece of 20th century dramatic literature and is certainly among my favorite plays.

While the play does indeed feature lots of music, it isn’t an opera in the traditional sense–it isn’t “sung through,” so most of the text is spoken.  This balance of spoken dialogue and song may seem akin to musical theatre–but The Threepenny Opera doesn’t really fall into that category either.  In most works of musical theatre, singing emerges from intense dramatic situations; a character may need a song to express an idea that words alone cannot capture.  In The Threepenny Opera, songs sometimes relate to the dramatic situation but just as often, the songs are only tangentially related to what is happening between the characters.  And in some cases, the song is a complete interruption of the action.

The creators of this piece, Bertolt Brecht and Kurt Weill, were artist-activists and viewed every piece of theatre as a political statement. To them, a work of art either supports or challenges the status quo. To encourage their audience to think objectively and politically about the situations in their plays, they worked to prevent the audience from developing an emotional connection with the play's characters. This approach became known as the "alienation effect," and it resulted in a style of performance that keeps the audience at arm's length from the play's action. To achieve the alienation effect, characters might directly address the audience, reminding them they are only watching a play. Design elements might be merely suggested rather than rendered in a realistic, lifelike manner. Performers might deliver lines sarcastically or without realistic expressiveness, ensuring that audience members won't empathize with the character. And in the case of The Threepenny Opera, like many other Brecht plays, songs are used to interrupt or comment upon the action.

In my Big Plays, Big Ideas class in the BLS program, we read another Brecht play, The Life of Galileo, and discuss the ways in which Brecht alienates the audience to ensure they observe the sociopolitical circumstances that inform the characters’ behavior.  In studying other works of dramatic literature from a variety of historical periods, we find that the alienation effect was not invented by Brecht and Weill at all.

Projections and suggested scenery demonstrate the alienation effect in The Glass Menagerie.

Theatre artists in Ancient Greece employed such alienating effects in their work, and many artists in the decades since Brecht have incorporated alienating effects into their work as well.  Even Tennessee Williams’ The Glass Menagerie, often considered a work of 20th century realism, is in fact inspired by Brecht and contains a variety of alienating devices.

Have you encountered the alienation effect in the theatre, or on television or in a movie?

Bobby Darin sings “Mack the Knife.”

Alan Cumming and Cyndi Lauper starred in a 2006 revival of The Threepenny Opera.  This performance is from the Tony Awards broadcast.

A Thousand Faces

By Marc Williams

I teach a number of courses involving dramatic literature, including Big Plays, Big Ideas in the BLS program at UNCG.  In most of these classes, I discuss dramatic structure—the way that incidents are arranged into a plot.  Whenever I teach dramatic structure, I always turn to Sophocles’ Oedipus Rex to serve as an example.  Aristotle believed this play to be tragedy “in its ideal state,” partially because the incidents are arranged in a clear cause-and-effect manner.  One incident logically follows the next and although there are some surprises, none of the events are random, accidental, or tangential.

The story of Oedipus is an ancient myth. The 20th-century mythology scholar Joseph Campbell wrote about Oedipus frequently, including in his seminal book, The Hero with a Thousand Faces. In this book, Campbell outlines the "monomyth," a dramatic structure to which many, if not most, stories seem to adhere in one manner or another. The monomyth consists of several stages of the hero's journey: a call to adventure, a refusal of that call, followed by aid from a supernatural entity, crossing a threshold into unfamiliar territory, entering/escaping the belly of the whale, traveling a road of trials, and so on, all the way through the hero's return. Oedipus' journey follows Campbell's pattern almost perfectly. The pattern applies not only to Ancient Greek myths but to stories from virtually every culture across the globe.

Campbell describes the stages of the hero's journey at length in The Hero with a Thousand Faces, and he also diagrams the journey as a circle.

I was instantly reminded of Joseph Campbell and his diagram today when I came across this:

1.  A character is in a zone of comfort
2.  But they want something
3.  They enter an unfamiliar situation
4.  Adapt to it
5.  Get what they wanted
6.  Pay a heavy price for it
7.  Then return to their familiar situation
8.  Having changed

This diagram was developed by Dan Harmon, creator of the NBC sitcom Community, and according to this article on Wired.com, is apparently the inspiration for every episode of the show:

Dan Harmon

[Harmon] began doodling the circles in the late '90s, while stuck on a screenplay. He wanted to codify the storytelling process—to find the hidden structure powering the movies and TV shows, even songs, he'd been absorbing since he was a kid. "I was thinking, there must be some symmetry to this," he says of how stories are told. "Some simplicity." So he watched a lot of Die Hard, boiled down a lot of Joseph Campbell, and came up with the circle, an algorithm that distills a narrative into eight steps:

Harmon calls his circles embryos—they contain all the elements needed for a satisfying story—and he uses them to map out nearly every turn on Community, from throwaway gags to entire seasons. If a plot doesn't follow these steps, the embryo is invalid, and he starts over. To this day, Harmon still studies each film and TV show he watches, searching for his algorithm underneath, checking to see if the theory is airtight. "I can't not see that circle," he says. "It's tattooed on my brain."
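Harmon's circle is, in effect, a checklist, which makes it fun to play with in code. Purely as an illustration (nothing Harmon himself published), here is a minimal Python sketch that treats the eight steps as a template and checks whether a plot outline hits them in order; the step keywords and the Die Hard beats are my own paraphrases:

```python
# Toy "story circle" checker, loosely based on Dan Harmon's eight steps.
# The step keywords and the sample outline are illustrative paraphrases.

HARMON_STEPS = [
    "comfort",     # 1. A character is in a zone of comfort
    "desire",      # 2. But they want something
    "unfamiliar",  # 3. They enter an unfamiliar situation
    "adapt",       # 4. Adapt to it
    "get",         # 5. Get what they wanted
    "price",       # 6. Pay a heavy price for it
    "return",      # 7. Then return to their familiar situation
    "changed",     # 8. Having changed
]

def is_valid_embryo(outline):
    """Return True if the outline's beats cover all eight steps, in order."""
    return [step for step, _ in outline] == HARMON_STEPS

# Die Hard (the film Harmon says he studied), reduced to eight beats:
die_hard = [
    ("comfort",    "New York cop John McClane flies to LA for Christmas"),
    ("desire",     "He wants to reconcile with his wife, Holly"),
    ("unfamiliar", "Thieves seize Nakatomi Plaza during the office party"),
    ("adapt",      "Barefoot and alone, he improvises and picks them off"),
    ("get",        "He defeats Hans Gruber and reaches Holly"),
    ("price",      "He is shot, cut, and battered along the way"),
    ("return",     "He walks out of the tower into the ordinary world"),
    ("changed",    "The McClanes leave together, the marriage renewed"),
]

print(is_valid_embryo(die_hard))  # True: all eight beats, in Harmon's order
```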

The eight-step Harmon embryo is simpler than Campbell's monomyth, which contains seventeen structural units, and because it is simpler, it is probably also more universal. And indeed, Harmon uses the embryo as a litmus test to determine whether an episode of Community is structurally sound. It is, after all, a tried and true formula for great storytelling. So where else can this structure be seen?

The Wizard of Oz and Star Wars come to mind.   Have you encountered a monomyth on television or in a movie theatre recently?  Or a story that follows Harmon’s embryo model?

Are You Ready for Some Football?

By Marc Williams

I’ll begin by confessing that I am among America’s truly die-hard football fans.  I follow football throughout the year, even though the season only lasts about four months.  Serious fans like me are thrilled this morning: the NFL’s 130+ day lockout appears to be ending today following months of intense negotiations.

During the past few months, analysts have criticized both owners and players in news articles and fans have sounded off on sports talk radio.  Given America’s economic struggles, how could these sides complain about having to share $9 billion in revenues?  While the owners initiated the lockout, most of the criticism I heard seemed to be directed at the “overpaid” NFL players.

NFL Commissioner Roger Goodell, left, and NFL Players Association Director DeMaurice Smith, right.

Listening to talk radio over the past 130 days, I often heard fans suggesting, “the players should be more grateful–I’ll go play football for a fraction of what those guys make.”  Every time I listened to the radio, I heard someone cite his $30K salary, how hard he works, and how happy he’d be to play football for the minimum NFL rookie salary, which was $325,000 in 2010.  What many fans fail to realize is that these players earn a minimum of $325,000 because they have specific skills and physical attributes that are exceedingly rare and have found a way to capitalize on those traits.

Consider one of my favorite players, wide receiver Calvin Johnson, as an example. In the 2008 Beijing Olympics, Jamaican sprinter Usain Bolt covered the first 40m of his record-breaking 100m run at a speed of 8.6m/second. When Calvin Johnson was entering the 2007 NFL draft, he was timed over a similar distance at about 8.4m/second, only slightly behind Bolt's pace. These numbers are especially notable since Bolt, the 100-meter world record holder, weighs only 198 pounds while Johnson weighs almost 240 pounds. Further, Johnson is 6’5” and has a 42” vertical leap (four inches better than NBA star Kobe Bryant’s). Johnson, like most NFL athletes, possesses not only exceptional football skills but also a rare combination of size and athleticism.
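For the curious, those per-second figures are easy to reconstruct as a rough back-of-the-envelope check. I'm assuming the commonly reported ~4.65-second 40m split from Bolt's Beijing run and Johnson's widely reported 4.35-second 40-yard dash, with 40 yards ≈ 36.6m:

$$v_{\text{Bolt}} \approx \frac{40\,\text{m}}{4.65\,\text{s}} \approx 8.6\,\text{m/s}, \qquad v_{\text{Johnson}} \approx \frac{36.6\,\text{m}}{4.35\,\text{s}} \approx 8.4\,\text{m/s}$$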

While these physical attributes are indeed rare, many argue that twenty-year-olds have no business earning so much money for simply playing a game. After all, many of us go to school, earn degrees, and work from the bottom up in our chosen fields, taking years to earn promotions and raises, and never approach the $325,000 minimum NFL rookie salary. Is this evidence that something is out of whack?

Shakespeare

Perhaps what is truly out of whack is the notion that education, job skills, and a lifetime of service should entitle one to fame or a generous salary.  Consider Shakespeare as an example.  Unlike most of the successful poets and playwrights of the Elizabethan era, Shakespeare was not college educated.  His success generated enormous jealousy from writers who had “paid their dues” through university education.  To this day, there are scholars dedicated to attributing Shakespeare’s work to other individuals from Francis Bacon to Queen Elizabeth herself, arguing that an uneducated son of a glove maker couldn’t possibly be the author of Hamlet and King Lear.

In my BLS course on Shakespeare, we discuss this "authorship question," a premise I dismiss as snobbery. Perhaps Shakespeare was simply brilliant: an individual with exceptional skills who was creative enough to find a way to apply and capitalize on those skills. Mozart composed his first opera when he was twelve years old—there is no accounting for that kind of genius. Isn't the same true of professional athletes? When we express jealousy about the financial success of professional athletes, are we jealous of their unique gifts, or are we jealous because we haven't figured out what to do with our own talents?

Usain Bolt breaks the world record in the 100 meters, Beijing 2008:

Calvin Johnson in action: