
And the Oscar Goes To…

By Marc Williams

This Sunday, the Hollywood glitterati will turn out for their annual jubilee, the Academy Awards. While I’ve never been a fan of award shows (see my post from last summer regarding the Tony Awards), I certainly view an Oscar as the highest recognition in the entertainment industry. Lots of quality work goes unrecognized by the Academy each year, but I still regard an Oscar nomination as some validation of quality.

In the decade or so after I finished high school, I took that validation quite seriously. I made a point of seeing all of the Oscar-nominated films before the awards ceremony. I would definitely see all the Best Picture nominees, but I tried to see the documentaries and foreign films too. Indeed, I saw many terrific films I wouldn’t have otherwise seen. In 1998, everybody saw Titanic (the Best Picture winner, among many other wins), but I hadn’t seen L.A. Confidential until after it received nine Oscar nominations. In 2004, it was a Best Director Oscar nomination for Fernando Meirelles that prompted me to view City of God–which I now count among my favorite films of all time.

When I first started trying to see all the Oscar-nominated films, my motivation was largely snobbish. I felt I earned a certain cultural cachet from seeing all of the “great” films of the year, especially the obscure films my friends hadn’t heard of. Admittedly, there were many times I forced myself to sit through movies in which I wasn’t remotely interested. In earning my status as a highly cultured individual, I figured I had to pay the price of boredom. I suffered through The Red Violin, Gangs of New York, and many other Oscar-nominated films, hating every minute of them.

Naturally, circumstances change. I can’t fit self-imposed boredom into my schedule anymore. Nowadays I find it exceedingly difficult to go to the movies at all. My wife and I try to watch films at home, but that can be challenging with a two-year-old asleep down the hall. Not surprisingly, we’ve fallen behind on all the movies we want to see–we’ve learned from experience that Netflix only allows users to put 500 movies in the DVD queue. I doubt we’ll ever catch up. The result of our changing circumstances is a need to prioritize our film viewing and spend time only with stories we find truly fascinating–the Netflix queue is getting pared down to the essentials, and we make very careful choices when we are able to make a rare trip to the movie theatre.

Of the eight films nominated for Best Picture this year, I’ve seen half. In recent years, I’ve seen far fewer.

In 2009, I had only seen two of the eight nominated films at the time of the ceremony. This year, I’ve seen Hugo, The Artist, The Descendants, and Midnight in Paris. I doubt I will ever see Moneyball or Extremely Loud & Incredibly Close because I’m just not interested. And this, for me, is how the Oscars reflect my changing views on achieving “cultured” status: I’m no longer willing to endure boredom in exchange for it.

I’m convinced, however, that I’m not the only person who has consumed boring art for snobbish reasons. In fact, I believe many of us go to the theatre, museums, or obscure films with boredom as an objective: “If I can withstand this boredom for two hours, I’ve paid my cultural debt to society.” I agree that the arts are vital to communities, to self-awareness, and to communication, but if the work isn’t engaging, interesting, or in some way entertaining, how valuable can it be?

I’ve seen this phenomenon at work in some of my BLS courses. My Eye Appeal students, for instance, are required to attend a live performance in their community. My hope is that the assignment will be fun–and for most of my students, this assignment is the highlight of the course. But sometimes, students attend events in which they clearly aren’t interested. Perhaps they are trying to impress me with their sophistication, attending a ballet or opera that they secretly despise, hoping to manufacture some cultural credibility?

Have you ever suffered boredom for the sake of feeling cultured?

That’s Entertainment!?

By Ann Millett-Gallant

I will admit, I like to watch TV. I study and teach about mass media representations, for example in my BLS course, Representing Women, so it is partially a professional interest, but I also enjoy the entertainment. I have been watching all the new shows this fall and can say I like “2 Broke Girls” and “Pan Am” the best so far. I have also appreciated the new episodes of “The Closer” and “Grey’s Anatomy.” I am a little confused by “Once Upon a Time.” It is a fairy tale show, but somehow, the plot seems better suited to a movie.

But perhaps this is the direction television is taking – towards fantasy, or overly dramatic crime dramas. Shows set in the mid-20th century are also popular, such as “Pan Am” or “Mad Men.” Basically, viewers desire to be transported to another time or place. Or perhaps fictional television is trying to distinguish itself from “Reality” TV.

This morning, I was disgusted to hear about Kim Kardashian’s upcoming divorce, after a huge media-spectacle wedding and 72 days of marriage. I was not surprised and would not have taken such offense, except that the story was profiled on NBC’s “Today Show” as important national news. The show then featured a panel of legal “experts” to analyze whether the marriage was legitimate or rather a huge media ploy. One of these “experts” was Star Jones, who, after her scandalous exit from “The View,” her dramatic weight loss, and her own short marriage, redeemed her media status by appearing on “The Apprentice.” But I am getting off topic. Apparently, the Kardashian wedding cost $10 million, and the couple has accrued up to $20 million since then for appearances and publicity projects. And THIS is “reality” TV?

The obvious irony is that the “reality” TV on today is the farthest thing from the reality of its viewers. The economy is struggling and unemployment is near its highest level in decades. Reality TV seems less realistic and much more voyeuristic: viewers watch so they can ridicule the “cast” of “Jersey Shore” (I use the term “cast” cautiously) or revel in the gluttony and triviality of the wealthy Kardashians, Hiltons, or the bevy of Playboy bunnies. On the other hand, many “reality” shows are about competitions, specifically ones in which a struggling artist has a chance at stardom (“The X Factor,” “American Idol,” and even “Project Runway”).

Other competition shows, like the aforementioned “Apprentice” and “Dancing with the Stars,” seem like platforms for so-called “stars” to rehabilitate their reputations. It should then seem no coincidence that Dancing with the “Stars” is actually dancing with the castoffs of other “reality” TV shows. As television fiction, reality, and competition overlap, so do “news” and “entertainment.” Let’s be honest: real “reality” is depressing. The other news stories on “Today” were about the war, political debates, the failing economy, and random horror stories like medical mistakes. Maybe the news is just responding to its target audience: everyday people who feel powerless and economically, perhaps even personally, depressed. Maybe we prefer the “Reality” of TV to our own realities.

Recalculating

By Marc Williams

Anyone in a car with a GPS knows the phrase “recalculating.” Once you program your destination and begin your journey, the GPS expects you to follow the path with unwavering trust. The slightest turn from the designated route—a pit stop, a scenic detour, a bite to eat—will cause the GPS to recalculate the route. If I miss a turn and get frustrated, the Garmin’s voice is steady and confident, never losing sight of the path. It’s oddly comforting to know that someone in the car can keep their cool. Interestingly, on my Garmin system, and on all of the GPS devices I’ve encountered in other cars, the voice that calmly says, “recalculating” is always female. Is that merely a coincidence?

CNN.com’s Brandon Griggs recently wrote about the new iPhone 4S feature, Siri, and its distinctly female voice. Griggs notes that female voices are far more common in talking devices than male voices, and provides some interesting theories about why talking computers tend to be female. For instance, Griggs cites Clifford Nass of Stanford University:

iPhone's Siri

“It’s much easier to find a female voice that everyone likes than a male voice that everyone likes,” said Stanford University Professor Clifford Nass, author of “The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships.” “It’s a well-established phenomenon that the human brain is developed to like female voices.”

HAL 9000 (From Kubrick's 2001: A Space Odyssey)

Another theory suggests that Hollywood uses male computer voices in suspense and thriller movies to create a sound of “menace,” so perhaps we find the idea of female computers less menacing.

A third, historical theory is that WWII aviators relied on female voices for navigation, since they were easily distinguished from the male voices of the other pilots.

In many BLS courses, including my arts courses, gender roles and gender identity are discussed and debated at length. In fact, last week, my “Big Plays, Big Ideas” class was discussing gender attitudes on war in our examination of the Greek comedy Lysistrata.

So what does the trend of female computer voices say about gender attitudes toward technology today?


Beyond Apple

By Marc Williams

Following Steve Jobs’ passing on October 5, countless articles, blogs, and remembrances have paid tribute to Jobs’ contributions to the technology industry. I could certainly join that chorus, given that I do so much of my online teaching for the BLS program from my iMac, iPad, and iPhone. I’m a dedicated and unabashed Mac user and I thank Steve Jobs for helping make life a little simpler through these brilliant gadgets.

Somewhat overlooked in the tributes to Jobs are his contributions as the owner/CEO of Pixar. Jobs purchased Pixar from Lucasfilm in 1986; at the time, Pixar was developing 3D animation hardware for commercial use. The hardware development and sales business was not terribly successful, and the company began selling off divisions and laying off employees.

During this period of struggle, Jobs was willing to consider a proposal from one of his employees, John Lasseter. Lasseter, an animator who had been fired from Disney and subsequently hired by Lucasfilm, had been experimenting with short films and advertisements, all completely animated by computer. Lasseter pitched Jobs on the idea of creating a computer-animated feature film. Such an endeavor would be tremendously risky for a company like Pixar, which had not been organized as a film studio. Not only did Jobs sign off on the idea, but he was able to secure a three-picture deal with Walt Disney Feature Animation. Jobs shifted the company’s primary focus to making movies.

The feature film Lasseter pitched to Jobs became Toy Story, which Lasseter directed; he went on to direct A Bug’s Life, Toy Story 2, Cars, and Cars 2. Lasseter also served as executive producer for virtually every other Pixar film (Finding Nemo, The Incredibles, WALL-E, Up, Toy Story 3), and he is now the Chief Creative Officer for Walt Disney and Pixar Animation Studios.

There’s no denying Lasseter’s genius but Pixar as we know it today would not exist without Steve Jobs. Jobs took a tremendous leap of faith, entrusting the company’s future to the brilliance of his collaborators.

A leader doesn’t necessarily have to be the person with the best idea; sometimes the leader simply must recognize the best idea in the room—and get out of the way. Harry Truman said, “It is amazing how much you can accomplish in life if you don’t mind who gets the credit.” The story of Pixar demonstrates that Steve Jobs fully understood this simple truth.

John Lasseter

“Steve Jobs was an extraordinary visionary, our very dear friend and the guiding light of the Pixar family. He saw the potential of what Pixar could be before the rest of us, and beyond what anyone ever imagined. Steve took a chance on us and believed in our crazy dream of making computer animated films; the one thing he always said was to simply ‘make it great.’ He is why Pixar turned out the way we did and his strength, integrity and love of life has made us all better people. He will forever be a part of Pixar’s DNA. Our hearts go out to his wife Laurene and their children during this incredibly difficult time.”

– John Lasseter, Chief Creative Officer & Ed Catmull, President, Walt Disney and Pixar Animation Studios

Adults Say the Damnedest Things

By Ann Millett-Gallant

Ann at UNCG

I am a scholar who studies and teaches about representations of the body in visual culture. I am the designer and instructor for three BLS courses: Art of Life, Photography: Context and Illusions, and Representing Women. These classes include examples from fine art, film, television, advertisements, medical images, and so on, and in my own writing, I focus specifically on representations of the visibly disabled body. I also consider everyday social interaction as a form of visual culture.

As a woman with very noticeable physical disabilities, I know a lot about the stare. And often for me, daily life can become a theater for performance. At the Kroger near my house, I have been stared at, mainly by children; asked how long I have been disabled or what “happened” to me; told various stories about other disabled people; and told I am admired, because if the person speaking were “like” me, they wouldn’t want to leave the house. Now, it’s always good to be called amazing, but it’s not so great when that term is based on low expectations for what I can and should do. I always have to consider whether to use the moment as a teaching tool or whether to blow it off and not waste my time. I like to respond to children, for example, by telling them I was “born this way,” or that I don’t have legs because I don’t need them. I didn’t know how to respond when the cashier at CVS asked me how long I had been doing things for myself, as if implying the question “how long have you been out of the hospital?” I wish I had had a better comeback, but I just declared, “Since birth!” Sometimes these moments amuse me, but sometimes they make me angry and sad. Not so much sad about myself, but sad that disability has such a bad rap.

The other day, after I left the doctor’s office, I scooted across the parking lot to Rose’s discount department store. I was happy to find two pairs of cute, inexpensive sunglasses, since recently all my other cheap sunglasses seemed to have been breaking. I thought I should call my husband to come pick me up, but then I realized I had forgotten my cell phone. I asked the woman at the checkout counter if there was a phone nearby I could use, and I explained my predicament. She responded that I had better get a hold of him soon, because what if I were to get abducted? I found this to be an odd response, because we were in a public space in a nice neighborhood. I scowled and said in a joking tone that I wasn’t worried about that; I just wanted to get home. She then told me I should worry about being abducted, because “they go after all kinds of women.” I scowled again, in more anger. Was that supposed to be some sort of twisted compliment, that even though I was disabled, I was still attractive to abductors?! And in the first place, the fact that she was talking about me being abducted meant she was classifying me as helpless, dependent, childlike, or somehow already vulnerable. I don’t think people think before they speak, and they certainly don’t realize the assumptions and stereotypes that are communicated through their utterances. And I realize their comments say much more about their own ignorance and insecurities than about me.

Not all of these interactions are negative, though. In some ways, I feel like a celebrity. I stand out in a crowd, and I attract attention. I do exchange many smiles and hellos with people around my usual haunts. In some restaurants where my husband and I eat often, the wait staff may even remember what I want to order. Sometimes the camaraderie is nice, yet at times, I wish I could remain incognito. I’d love to hear from others who have these experiences.

A Thousand Faces

By Marc Williams

I teach a number of courses involving dramatic literature, including Big Plays, Big Ideas in the BLS program at UNCG.  In most of these classes, I discuss dramatic structure—the way that incidents are arranged into a plot.  Whenever I teach dramatic structure, I always turn to Sophocles’ Oedipus Rex to serve as an example.  Aristotle believed this play to be tragedy “in its ideal state,” partially because the incidents are arranged in a clear cause-and-effect manner.  One incident logically follows the next and although there are some surprises, none of the events are random, accidental, or tangential.

The story of Oedipus is an ancient myth. Joseph Campbell, the 20th-century mythology scholar, wrote about Oedipus frequently, including in his seminal book, The Hero with a Thousand Faces. In this book, Campbell outlines the “monomyth,” a dramatic structure that many, if not most, stories seem to adhere to in one manner or another. The monomyth consists of several stages of the hero’s journey: a call to adventure, a refusal of that call, followed by aid from a supernatural entity, crossing a threshold into unfamiliar territory, entering/escaping the belly of the whale, traveling a road of trials, and so on, all the way through the hero’s return. Oedipus’ journey follows Campbell’s pattern almost perfectly. The pattern applies not only to Ancient Greek myths but to stories from virtually every culture across the globe.

Campbell describes the stages of the hero’s journey at length in The Hero with a Thousand Faces and also diagrams the journey as a circle.

I was instantly reminded of Joseph Campbell and his diagram today when I came across this:

1.  A character is in a zone of comfort
2.  But they want something
3.  They enter an unfamiliar situation
4.  Adapt to it
5.  Get what they wanted
6.  Pay a heavy price for it
7.  Then return to their familiar situation
8.  Having changed

This eight-step model was developed by Dan Harmon, creator of the NBC sitcom Community, and, according to this article on Wired.com, it is apparently the inspiration for every episode of the show:

Dan Harmon

[Harmon] began doodling the circles in the late ’90s, while stuck on a screenplay. He wanted to codify the storytelling process—to find the hidden structure powering the movies and TV shows, even songs, he’d been absorbing since he was a kid. “I was thinking, there must be some symmetry to this,” he says of how stories are told. “Some simplicity.” So he watched a lot of Die Hard, boiled down a lot of Joseph Campbell, and came up with the circle, an algorithm that distills a narrative into eight steps:

Harmon calls his circles embryos—they contain all the elements needed for a satisfying story—and he uses them to map out nearly every turn on Community, from throwaway gags to entire seasons. If a plot doesn’t follow these steps, the embryo is invalid, and he starts over. To this day, Harmon still studies each film and TV show he watches, searching for his algorithm underneath, checking to see if the theory is airtight. “I can’t not see that circle,” he says. “It’s tattooed on my brain.”

The eight-step Harmon embryo is simpler than Campbell’s monomyth, which contains seventeen structural units, and because it is simpler, it is probably also more universal. Indeed, Harmon uses the embryo as a litmus test to determine whether an episode of Community is structurally sound. It is, after all, a tried-and-true formula for great storytelling. So where else can this structure be seen?

The Wizard of Oz and Star Wars come to mind.   Have you encountered a monomyth on television or in a movie theatre recently?  Or a story that follows Harmon’s embryo model?

To Tweet or Not to Tweet

By Marc Williams

In the first week of my Shakespeare Off the Page class in the BLS program, we discuss the development of the English language during Shakespeare’s era. English, which was developing and expanding as a colloquial language, was considered “lowbrow” by many 16th-century traditional academics. Shakespeare’s works, written in English of course, did much to legitimize the language in spite of resistance from these traditionalists. Shakespeare also coined many new words and phrases, contributing to the rapid expansion of English vocabulary that occurred during his lifetime. One of my students gave the example of “eyeball,” a word that hadn’t been written in English until Shakespeare included it in A Midsummer Night’s Dream. Every semester, I ask my students to consider the coinage of new words today. Students often cite technological advances as the source for new words: after all, who knew what a “blog” was twenty years ago?

This week’s discussion reminded me of a story that surfaced last year about the New York Times. Philip Corbett, the standards editor at the New York Times, issued a memo advising staff writers that the word “tweet” should not appear in the newspaper when describing a posting on Twitter. Corbett advised that “tweet” should appear only when referring to the sound a bird makes. It was widely reported that Corbett “banned” the word, but “strongly discouraged” is probably a more accurate assessment of his memo. Regardless of the phrasing, Corbett was widely criticized for the move; here is Corbett’s response to the criticism he received.

So here’s a major publisher–among the world’s most important authorities on trends and issues with American English–deliberately resisting the coinage and use of a new word.  I did a quick online search and found two dictionaries disagreeing with each other.  Here’s Webster, which does not provide a Twitter-based definition for “tweet.”

Noun 1. tweet – week [sic] chirping sound as of a small bird

Verb 1. tweet – make a weak, chirping sound; “the small bird was tweeting in the tree” Synonyms: twirp

Verb 2. tweet – squeeze tightly between the fingers; “He pinched her behind”; “She squeezed the bottle” Synonyms: nip, pinch, twinge, twitch, squeeze

Here’s Dictionary.com, complete with a reference to Twitter:

tweet – noun

1. a weak chirping sound, as of a young or small bird.

2. Digital Technology. a very short message posted on the Twitter Web site: the message may include text, keywords, mentions of specific users, links to Web sites, and links to images or videos on a Web site.

Corbett explains that if “tweet” ever becomes as common as “e-mail,” it will warrant reconsideration as a legitimate word.  But don’t we look to the New York Times, dictionaries, and other publications to confirm a word’s legitimacy and proper use?  If they won’t use the word in print, can it ever be legitimized?  Do publishers have an obligation to embrace and define new words?  Do they think of themselves as defenders of the English language?  What can be gained from refusing “tweet” and other new words admission into our vocabulary?