Tag Archives: Hollywood

It’s So Bad it’s Good.

By Claude Tate

I devoted one of my blog entries last fall to a movie, Raise the Red Lantern,  which I felt was not only an excellent movie in and of itself, but a movie of educational value in that it provided a window onto traditional Confucian society in early 20th century China.  In fact, I liked it so much I’ve used it a number of times in classes and recommended it on numerous occasions to my BLS students.

Gong Li in Raise the Red Lantern

This blog entry is also devoted to a movie, but a very different kind of movie. This one is far removed from excellent. It's a bad movie. A very bad movie. It is widely regarded as the worst movie ever made. In fact, it is so bad that even some critics see it as good. For example, the critic Phil Hall, in a review collected on Rotten Tomatoes, said the movie's exceptionally poor quality made him laugh so much that he could not put it at the top of his 'worst of' list. Another source, VideoHound's Complete Guide to Cult Flicks and Trash Pics, states that, "In fact, the film has become so famous for its own badness that it's now beyond criticism."

Also, this movie does not have the clear educational value that "Lantern" has. But if one defines educational value somewhat loosely—strike that, very loosely—it is not without some value. In fact, I used it on several occasions in a face-to-face class I used to teach entitled "US History Since 1945." Since we had time in that class to go beyond the high points that one is limited to in the broad surveys, I tried to include things that would allow students to re-imagine what everyday life was like. Toward that end, when I covered the 1950s, I brought in clips of some of the old classic TV shows as well as some movies.

The atomic bomb and the possibility of nuclear annihilation became a part of our lives during the '50s, so a number of movies were built around that theme. Some were good, while others were not so good. We also became very much aware of space during the '50s, so a number of movies were devoted to that theme as well. Again, some were good, while others were bad. While I mainly used clips, when there was time I would try to work in an entire movie using one or both of those themes. I tried a couple of the really good ones, but film-making has changed quite a bit since then, so students didn't seem to appreciate the quality of what they were seeing. So I decided to look for stinkers. Luckily, I not only found a stinker; the stench from this 'masterpiece' could encircle the earth several times over. Students generally really liked the movie, so I thought I would suggest it here.

Plan 9 From Outer Space original poster

It is (music please) Plan 9 From Outer Space (made in 1956, released in 1959). It is a horror movie that incorporates an invasion-from-outer-space theme, as aliens plan to conquer the earth by raising the dead against us. It was conceived, produced, written, and directed by the infamous Edward D. Wood, Jr. After years of criticism, in 1980 Michael Medved and Harry Medved named it the "worst movie ever made" and awarded it their Golden Turkey Award. That same year Ed Wood (who died in 1978) was also posthumously awarded a Golden Turkey Award as the worst director ever. I don't know whether any of the actors in the movie received any 'worst' awards for their performances, but a number of them should have. It also played at a Worst Films Festival in New Orleans, figured prominently in an episode of Seinfeld, and was the centerpiece of the movie Ed Wood (1994), which was directed and produced by Tim Burton and starred Johnny Depp.

Johnny Depp as Ed Wood

For those of you who haven't seen the movie, I don't want to spoil the beauty of it by saying too much about the number of 'gems' it contains. But a few hints will not hurt. Bela Lugosi was going to make a comeback with this movie, but he died after only a few test shots. Ed 'ingeniously' included those few shots at a number of points in the movie. For other shots, his wife's chiropractor, who looked nothing like Lugosi, was the stand-in. That is why the DVD released through Image Entertainment states "Almost Starring Bela Lugosi" on the cover. The special effects are also a thing of beauty, as the flying saucers are campy even for the '50s. And one just has to love how night turns to day and back to night in back-to-back scenes. But if you wish to know more, a plot summary can be found on "The Movie Club Annuals…" website.

I own the DVD. I just had to have a physical copy. But it is in the public domain, and can be accessed on YouTube here.   You should also be able to download it from other sources.

Ed Wood

By the way, Ed Wood made movies before and after this one. I have not seen them, so I cannot attest to their quality, but given the talent for movie making he showed in Plan 9, and given the titles of those other films, I would assume they are bad as well. But movies evidently weren't his only passion. He wrote a large number of books, which I have not read and do not intend to, but from the sampling of titles I've seen, he seems to have been just as good at writing books as he was at making movies. Ed also led an interesting personal life, which you get some hint of in the movie Ed Wood. If you want to find out more about Ed or his other "artistic" endeavors, you're on your own. I'm only recommending Plan 9 From Outer Space.

If you plan to watch Plan 9, I would suggest you watch Ed Wood first. I think it will help you understand and appreciate both Ed and Plan 9, since it focuses on Ed's early career and the making of this 'masterpiece.' By the way, Ed Wood is a very good movie. It won two Academy Awards: one for Best Supporting Actor (Martin Landau) and one for Best Makeup (Rick Baker, Ve Neill, and Yolanda Toussieng). Unfortunately, this movie is not in the public domain, so you will need to rent it.

Vampira (Maila Nurmi) in Plan 9 From Outer Space

I would also suggest you watch it with friends. It's always more fun to see an awful flick with friends, as they may see things to make fun of that you might miss. So I guess it has value beyond just its dubious educational value: it's a great excuse for friends to get together.

Spiders and Toads

By Marc Williams

Laurence Olivier as Richard III.

“Now is the winter of our discontent
Made glorious summer by this sun of York.”
~Richard III (Act 1, scene 1).

King Richard III is among Shakespeare's greatest villains. Based on the real-life Richard of Gloucester, Shakespeare's title character murders his way to the throne, bragging about his deeds and ambitions to the audience in some of Shakespeare's most delightful soliloquies. Shakespeare's Richard is famously depicted as a hunchback, and he uses his physical deformity as justification for his evil ambitions:

Why, I, in this weak piping time of peace,
Have no delight to pass away the time,
Unless to spy my shadow in the sun
And descant on mine own deformity:
And therefore, since I cannot prove a lover,
To entertain these fair well-spoken days,
I am determined to prove a villain
And hate the idle pleasures of these days.

For stage actors, Richard III is a tremendously challenging role. On one hand, he is pure evil—but he must also be charming and likeable. If you aren’t familiar with the play, its second scene features Richard successfully wooing Lady Anne as she grieves over her husband’s corpse! And Richard is her husband’s killer! Shakespeare’s Richard is both evil and smooth.

Simon Russell Beale as Richard III.

Actors must also deal with the issue of Richard’s physical disability. For instance, Richard is described as a “poisonous bunch-back’d toad,” an image that inspired Simon Russell Beale’s 1992 performance at the Royal Shakespeare Company, while Antony Sher’s iconic 1984 interpretation was inspired by the phrase “bottled spider,” an insult hurled at Richard in Act I.

Antony Sher's "bottled spider" interpretation of Richard III.

While much of the historical record disputes Shakespeare’s portrayal of Richard as a maniacal mass-murderer, relatively little is known about Richard’s disability. According to the play, Richard is a hunchback with a shriveled arm. However, there is little evidence to support these claims.

This uncertainty may soon change. Archaeologists in Leicester, England, have uncovered the remnants of a chapel that was demolished in the 16th century. That chapel, according to historical accounts of Richard's death at the Battle of Bosworth, was Richard's burial site. Not only have researchers found the church, but they have also located the choir area, where Richard's body was allegedly interred. And indeed, last week the archaeologists uncovered bones in the choir area:

If the archaeologists have indeed found the remains of Richard III, the famous king was definitely not a hunchback. It appears he suffered from scoliosis—a lateral curve or twist of the spine—but not from kyphosis, a different kind of spinal curvature that leads to a pronounced forward-leaning posture. As Dr. Richard Taylor explains in the video, the excavated remains suggest this person would have appeared to have one shoulder slightly higher than the other as a result of scoliosis.

Interestingly, Ian McKellen's performance as Richard III, preserved in Richard Loncraine's 1995 film, seems to capture the kind of physical condition described by Dr. Taylor, with one shoulder slightly higher than the other. At the 6:45 mark in this video, one can see how McKellen dealt with Richard's condition.

So it appears Shakespeare not only distorted historical details in Richard III, he also apparently distorted the title character’s shape. Of Shakespeare’s Richard, McKellen wrote:

Shakespeare's stage version of Richard has erased the history of the real king, who was, by comparison, a model of probity. Canny Shakespeare may well have conformed to the propaganda of the Tudor Dynasty, Queen Elizabeth I's grandfather having slain Richard III at the Battle of Bosworth. Shakespeare was not writing nor rewriting history. He was building on his success as the young playwright of the Henry VI trilogy, some of whose monstrously self-willed men and women recur in Richard III.

It seems likely that Shakespeare wanted Richard to seem as evil as possible in order to flatter Queen Elizabeth I, depicting her grandfather as England’s conquering hero. But why distort Richard’s physical disability as well?

In describing Richard's body shape, it is difficult to ascertain what Shakespeare's motives might have been, and perhaps even more difficult to assess his attitudes toward physical difference in general. For example, in my "Big Plays, Big Ideas" class in the BLS program, we discuss the issue of race in Othello, even though we don't know much about what Shakespeare thought about race. Many scholars have investigated the subject of physical difference in Shakespeare, of course: there are papers on Richard's spine, naturally, but also on Othello's seizures, Lavinia's marginalization in Titus Andronicus after her hands are severed and her tongue cut out, the depiction of blindness in King Lear, and even Hermia's height in A Midsummer Night's Dream. And just as one must ask, "Is Othello about race?" we might also ask, "Is Richard III about shape?" I doubt many would argue that physical difference is the primary focus of Shakespeare's Richard III, but it will be interesting to observe how the apparent discovery of Richard's body will affect future performances of the play. Will actors continue to twist their bodies into "bottled spiders," or will they focus on the historical Richard's scoliosis—and perhaps ask why such vicious language is used to describe such a minor difference?

Nimrod: What’s In a Name?

by Matt McKinnon

My teenage son is a nimrod. Or so I thought.

And if you have teenagers, and were born in the second half of the twentieth century, you have probably thought at one time or another that your son (or daughter) was a nimrod too, and would not require any specific evidence to explain why.

Of course this is the case only if you are of a certain age: namely, a Baby Boomer or Gen Xer (like myself).

For if you are any older, and if you are rather literate, then you would be perplexed as to why I would think that my son was a nimrod, and why I was not capitalizing Nimrod as it should be. It is, after all, a proper noun.

It is?

Yes, it is.  Or rather it was.  Let me explain.

It turns out the word "nimrod" (or more properly "Nimrod") has a fascinating history in which it undergoes a substantial reinterpretation. (Any nimrod can find this out by searching the web, though there is precious little explanation there.) This, by itself, isn't surprising, as many words that make their way through the ages transform as well. But the transformation of "Nimrod" to "nimrod" is particularly interesting in what it tells us about ourselves and our culture.

Nimrod, you see, was a character from the Hebrew Scriptures, or as Christians know it, the Old Testament:

“Cush became the father of Nimrod; he was the first on earth to become a mighty warrior. He was a mighty hunter before the LORD; therefore it is said, ʻLike Nimrod a mighty hunter before the LORD.ʼ”  (Genesis 10:8-9 NRSV)

This is the manner in which older biblically literate folks will understand the term: “as a mighty hunter.”

But there’s more here, for these folks also might understand the term as referencing a tyrannical ruler.

Why?  Well, the etymology of the word links it to the Hebrew “to rebel,” not for anything that Nimrod actually does in the Old Testament, but because, as many scholars attest, it is probably a distortion of the name for the Mesopotamian war-god Ninurta.  And the later chroniclers of Israelite religion didn’t have much sympathy for the polytheism of their Mesopotamian neighbors—especially when it so obviously informed their own religious mythology.

So the word, when it very early on enters the biblical narrative, already shows signs of transformation and tension as referencing both a mighty hunter as well as someone rebellious against the Israelite god.

In fact, Jewish and Christian tradition names Nimrod as the leader of the folks who built the Tower of Babel, though this is not found anywhere in the scriptures. This, then, is how Nimrod is now portrayed in more conservative circles, despite the lack of biblical attestation:

The word is already attested in Middle English, and by the 16th century it is clearly being used in both senses in the English language: as a tyrant and as a great warrior or hunter.

Now I can assure you, neither of these describes my teenage son.  So what gives?

Well, "Nimrod" shows up in a 1932 Broadway play (which had only 11 performances) about two lovesick youngsters:

“He’s in love with her. That makes about the tenth. The same old Nimrod. Won’t let her alone for a second.”

Here, however, the emphasis is still on the term’s former meaning as a hunter, though its use in the play to describe a somewhat frivolous and hapless fellow who moves from one true love to the next points us in the right direction.

And in the 1934 film You're Telling Me, W.C. Fields' character, a bit of a buffoon himself (and a drunkard), takes a few swings with a limp golf club and hands it back to his dim-witted caddy, saying in a way only W.C. Fields could:

“Little too much whip in that club, nimrod.”

So here we have the first recorded instance of the word’s transformation from a great hunter or tyrant to a stupid person or jerk.

But that's not the end of the story. After all, how many of us have seen You're Telling Me? (I hadn't, at least not until I did the research.)

So the last, and arguably most important, piece of the puzzle is not the origination of the word or its transformation, but rather its dissemination.

And that, as I'm sure many of you are aware, is none other than TV Guide's greatest cartoon character of all time: Bugs Bunny, who debuted in the 1940s, not that long after You're Telling Me premiered.

In this context, the one most folks born after World War II are familiar with, Bugs Bunny refers to the inept hunter Elmer Fudd as a “little nimrod.”  And the rest, as they say, is history.

For what emerges from Bugs’ usage is not the traditional reference to Fudd as a hunter (though this is the obvious, albeit ironic, intention), but rather Fudd’s more enduring (and endearing?) quality of ineptitude and buffoonery.

And anyone who has (or knows) a teenager can certainly attest to the applicability of this use of the term in describing him or her.

But the important thing is what this says about literacy and our contemporary culture.

For whereas my parents' generation and earlier were more likely than not to receive their cultural education from classical stories, the great literature of Europe, and the Bible, those of us born in the latter half of the 20th century and later are much more likely to receive our cultural education from popular culture.

I have seen this firsthand when teaching, for example, Nietzsche’s “Thus Spake Zarathustra,” which offers a critique of Judaism and Christianity by parodying scripture.  The trouble is, when students don’t know the referent, they can’t fully understand or appreciate the allusion.  And this is as true of Shakespeare and Milton as it is of Nietzsche…or Bugs Bunny for that matter.

And the ramifications of this are far greater than my choosing the proper term to criticize my teenage son.

(Though ya gotta admit, “nimrod” sounds pretty apropos.)

That’s Entertainment!?

By Ann Millett-Gallant

I will admit, I like to watch TV. I study and teach about mass media representations, for example in my BLS course Representing Women, so it is partially a professional interest, but I also enjoy the entertainment. I have been watching all the new shows this fall and can say I like "2 Broke Girls" and "Pan Am" the best so far. I have also appreciated the new episodes of "The Closer" and "Grey's Anatomy." I am a little confused by "Once Upon a Time." It is a fairy tale show, but somehow the plot seems better suited to a movie.

But perhaps this is the direction television is taking: towards fantasy, or overly dramatic crime dramas. Shows set in the mid-20th century are also popular, such as "Pan Am" or "Mad Men." Basically, viewers desire to be transported to another time or place. Or perhaps fictional television is trying to distinguish itself from "reality" TV.

This morning, I was disgusted to hear about Kim Kardashian's upcoming divorce, after a huge media-spectacle wedding and 72 days of marriage. I was not surprised and would not have taken such offense, except that the story was profiled on NBC's "Today Show" as important national news. The show then featured a panel of legal "experts" to analyze whether the marriage was legitimate or rather a huge media ploy. One of these "experts" was Star Jones, who, after her scandalous exit from "The View," dramatic weight loss, and own short marriage, redeemed her media status by appearing on "The Apprentice." But I am getting off topic. Apparently, the Kardashian wedding cost $10 million, and the couple has since earned up to $20 million for appearances and publicity projects. And THIS is "reality" TV?

The obvious irony is that all the "reality" TV on today is the farthest thing from the reality of the viewers. We are in an economic slump, and unemployment is at its highest level in decades. Reality TV seems less realistic and much more voyeuristic. Viewers watch so they can ridicule the "cast" of "Jersey Shore" (I use the term "cast" cautiously) or revel in the gluttony and triviality of the wealthy Kardashians, Hiltons, or the bevy of Playboy bunnies. On the other hand, many "reality" shows are about competitions, specifically ones in which the struggling artist has a chance at stardom ("The X Factor," "American Idol," and even "Project Runway").

Other competition shows, like the aforementioned "Apprentice" and "Dancing with the Stars," seem like platforms for so-called "stars" to rehabilitate their reputations. It should then seem no coincidence that Dancing with the "Stars" is often actually dancing with the cast-offs of other "reality" TV shows. As television fiction, reality, and competition overlap, so do "news" and "entertainment." Let's be honest: real "reality" is depressing. The other news stories on "Today" were about the war, political debates, the failing economy, and random horror stories like medical mistakes. Maybe the news is just responding to its target audience: everyday people who feel powerless and economically, and perhaps personally, depressed. Maybe we prefer the "reality" of TV to our own realities.

Recalculating

By Marc Williams

Anyone in a car with a GPS knows the phrase “recalculating.” Once you program your destination and begin your journey, the GPS expects you to follow the path with unwavering trust. The slightest turn from the designated route—a pit stop, a scenic detour, a bite to eat—will cause the GPS to recalculate the route. If I miss a turn and get frustrated, the Garmin’s voice is steady and confident, never losing sight of the path. It’s oddly comforting to know that someone in the car can keep their cool. Interestingly, on my Garmin system, and on all of the GPS devices I’ve encountered in other cars, the voice that calmly says, “recalculating” is always female. Is that merely a coincidence?

CNN.com's Brandon Griggs recently wrote about the new iPhone 4S feature, Siri, and its distinctly female voice. Griggs notes that female voices are far more common in talking devices than male voices, and he provides some interesting theories on the various reasons why talking computers tend to be female. For instance, Griggs cites Clifford Nass of Stanford University:

iPhone's Siri

“It’s much easier to find a female voice that everyone likes than a male voice that everyone likes,” said Stanford University Professor Clifford Nass, author of “The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships.” “It’s a well-established phenomenon that the human brain is developed to like female voices.”

HAL 9000 (From Kubrick's 2001: A Space Odyssey)

Another theory suggests that Hollywood uses male computer voices in suspense and thriller movies to create a sound of "menace," so perhaps we find the idea of female computers less menacing.

A historical theory holds that WWII aviators relied on female voices for navigation, since those voices were easily distinguished from the male voices of the other pilots.

In many BLS courses, including my arts courses, gender roles and gender identity are discussed and debated at length. In fact, last week, my “Big Plays, Big Ideas” class was discussing gender attitudes on war in our examination of the Greek comedy Lysistrata.

So what does the trend of female computers say about gender attitudes on technology today?

Siri in action:

Beyond Apple

By Marc Williams

Following Steve Jobs’ passing on October 5, countless articles, blogs, and remembrances have paid tribute to Jobs’ contributions to the technology industry. I could certainly join that chorus, given that I do so much of my online teaching for the BLS program from my iMac, iPad, and iPhone. I’m a dedicated and unabashed Mac user and I thank Steve Jobs for helping make life a little simpler through these brilliant gadgets.

Somewhat overlooked in the tributes to Jobs are his contributions as the owner/CEO of Pixar. Jobs purchased Pixar from Lucasfilm in 1986; at the time, the company was developing 3D animation hardware for commercial use. Pixar's hardware development and sales business was not terribly successful, and the company began selling off divisions and laying off employees.

During this period of struggle, Jobs was willing to consider a proposal from one of his employees, John Lasseter. Lasseter, an animator who had been fired from Disney and subsequently hired by Lucasfilm, had been experimenting with short films and advertisements, all completely animated by computer. Lasseter pitched Jobs on the idea of creating a computer-animated feature film. Such an endeavor would be tremendously risky for a company like Pixar, which had not been organized as a film studio. Not only did Jobs sign off on the idea, but he was able to secure a three-picture deal with Walt Disney Feature Animation. Jobs shifted the company’s primary focus to making movies.

The feature film Lasseter pitched to Jobs became Toy Story, and he went on to direct A Bug’s Life, Toy Story 2, Cars, and Cars 2. Lasseter also served as executive producer for virtually every other Pixar film (Finding Nemo, The Incredibles, WALL-E, Up, Toy Story 3) and he is now the Chief Creative Officer for Walt Disney and Pixar Animation Studios.

There's no denying Lasseter's genius, but Pixar as we know it today would not exist without Steve Jobs. Jobs took a tremendous leap of faith, entrusting the company's future to the brilliance of his collaborators.

A leader doesn't necessarily have to be the person with the best idea; sometimes the leader simply must recognize the best idea in the room—and get out of the way. Harry Truman is often credited with saying, "It is amazing how much you can accomplish in life if you don't mind who gets the credit." The story of Pixar demonstrates that Steve Jobs fully understood this simple truth.

John Lasseter

“Steve Jobs was an extraordinary visionary, our very dear friend and the guiding light of the Pixar family. He saw the potential of what Pixar could be before the rest of us, and beyond what anyone ever imagined. Steve took a chance on us and believed in our crazy dream of making computer animated films; the one thing he always said was to simply ‘make it great.’ He is why Pixar turned out the way we did and his strength, integrity and love of life has made us all better people. He will forever be a part of Pixar’s DNA. Our hearts go out to his wife Laurene and their children during this incredibly difficult time.”

- John Lasseter, Chief Creative Officer & Ed Catmull, President, Walt Disney and Pixar Animation Studios

The Rise and Fall of 3-D

By Marc Williams

I have always loved going to the movies. Now that my wife and I have a young child at home, we don't get out as much as we used to, so going to the movies is a particularly special treat. We subscribe to Netflix so we can watch movies at home, but actually going to a movie theatre for the big-screen experience is a rare thing these days.

Over the past year or two, most of the movies we've seen in the theatre were offered in 3-D. Moviegoers nowadays are often given a choice between traditional 2-D and 3-D; we typically choose the 2-D experience, but we have, of course, occasionally opted for a few 3-D titles. For example, we saw the final installment of the Harry Potter series, Harry Potter and the Deathly Hallows Part II, in both 2-D and 3-D.

How and why did this trend of 3-D films start? 3-D is not a new idea–filmmakers have been experimenting with the technology for nearly one hundred years, and feature films have been offered in 3-D for well over fifty years. But 3-D has become a craze. I think it began with Avatar, probably the most successful and critically acclaimed 3-D film of all time. Hollywood knows a good money-making machine when it sees one, so in the wake of Avatar's success, the major studios mobilized and started offering more and more 3-D titles. Theatre chains have also found a way to cash in on the phenomenon; ticket prices for 3-D films are significantly higher than prices for standard 2-D films.

While I admired Avatar for its technical achievement, I personally have not been able to embrace the 3-D craze. My personal distaste for 3-D primarily stems from the fact that most 3-D films I’ve seen use the 3-D technology as a cheap gimmick, not as a storytelling device. If there is an explosion in a 3-D film, is the story truly enhanced by making the viewers feel as if shrapnel is headed in their direction?  I don’t see much payoff for this use of 3-D, yet this is precisely how most films choose to employ the technology.

Avatar is a different kind of 3-D film for several reasons. First, the lush physical environment is given tremendous depth through the use of 3-D; this is important because the film is about the beauty and fragility of the environment.  In this regard, Avatar does not use 3-D as a cheap gimmick.  In fact, the film was shot in 3-D; it was part of the director’s plan for the film all along.  Most 3-D films today are not shot in 3-D–they are converted to 3-D from a 2-D format.  Disney’s re-release of The Lion King in 3-D is a great example of this; they’re simply capitalizing on the 3-D craze, offering viewers only a slightly different experience from the original 2-D film.  To me, that experience isn’t worth the extra $5.00 the movie theatre wants to charge for a 3-D ticket.

Another issue with 3-D is the amount of light on the screen. Between the projection process and the glasses, 3-D presentation darkens the picture by about 50%. Film critic Roger Ebert has been particularly critical of 3-D films specifically for this reason–the picture is simply too dark. I found this to be true in Harry Potter and the Deathly Hallows Part II when I saw it in 3-D. Images that are intensely white in the 2-D version (bright light, for instance) lost their luster in the 3-D version, seeming more gray than white. And given that the 2-D film was already very dark and shadowy, some of the picture was rendered incomprehensibly dark in the 3-D version. Some of the other technical concerns with 3-D are outlined in this letter to Roger Ebert from Walter Murch, arguably the most distinguished film editor in the industry.

Box office receipts are beginning to show that 3-D is becoming less and less palatable for moviegoers.  This could be a rejection of the extra $5.00 being charged by theatre chains, or perhaps a reaction to dim, dizzying images.  I was surprised to learn, for example, that the 3-D version of Harry Potter and the Deathly Hallows Part II generated only one-third of the revenue generated by the 2-D version of the same film.  Because profits are down, the 3-D craze is likely nearing its end.

The point is that a brilliant and promising technology can die if no one is willing to use it properly.  Director James Cameron made Avatar using processes no one had ever used before in a feature film, so he had to be willing to adjust his typical methods in order to maximize the 3-D technology’s potential. The question all of this raises for me is the employment of new technology in the classroom–especially the online classroom.  Given that the BLS degree program at UNCG is offered online, it seems we instructors and course developers run the risk of adopting technology without really using it to its fullest potential.  Or even misusing it.

I have certainly been guilty of using technology in ways that create obstacles to learning–how can this be avoided? Or better yet, how can technology be used to give students and faculty an advantage? In what ways can technology truly enhance the educational experience, especially online?