Why I Do My Job: A Letter From a Graduate

by Jay Parr

I was recently cleaning out a pile of old papers in my office—going through each one, because anything with FERPA-protected information must be shredded—when I stumbled across this old email sent by an alumna just after she graduated in August 2011. It reminded me of why I do this job.

Dawn Humphrey (right), serving as a marshal at the May 2011 commencement.

Dear Jay,

For decades I called myself a high school graduate. Today I call myself a graduate student. What a change the BLS program has made in my life!

Three years ago I made a courageous decision to complete my bachelor’s degree, although I was in what some would consider my “golden years.” I sought your advice, and you recommended I first complete my associate’s degree. I subsequently enrolled at a community college in the fall of 2009 and graduated with an AA degree in August 2010, earning a 4.0 GPA.

Last August, just one short year ago, I began my studies as a BLS student at UNCG while working full time. I managed to complete all the BLS requirements within one year, graduating on August 12, 2011, and again attaining a 4.0 GPA. I completed three hours more than necessary in order to qualify for Latin honors [summa cum laude] and the potential nod of Phi Beta Kappa.

As with most adult students, I was eager to complete the degree, yet I was also juggling a career and a household and struggling with finances. Fortunately, the academic community has begun recognizing the needs of online students, for whom time and convenience are paramount in maintaining a work-life balance.

While I certainly have no desire to become a poster child, future candidates are inspired when they realize their dreams are close to becoming a reality, so hearing my story may provide the motivation to pursue their goals. I also had the pleasure of serving as a University Marshal, a sign that BLS students are becoming involved in more traditional campus activities and honors.

While my time in the BLS program was swift, my educational experience was excellent, graced by exemplary professors and a robust curriculum. Hard work and late nights, blended with lively discussion boards and insightful professors, proved rewarding beyond all my expectations.

Just one month shy of my 55th birthday, I have fulfilled my dream thanks to the wonderful BLS program at UNCG and the guidance of their attentive staff. It is my hope that other potential students will see that via the BLS program, the end of the rainbow may be closer than they think.

On a closing note, please accept my sincere thanks for your advice and encouragement through the years. Our early conversations were the catalyst that sparked the inspiration and courage to return to UNCG after a 30 year hiatus.

Many thanks,
Dawn L. Humphrey
Master of Arts in Liberal Studies Candidate

Dawn Humphrey receiving her Master of Arts from the chancellor one year after this letter.

Ms. Humphrey finished her Master of Arts in the MALS program one year later—faster than any previous MALS student, and with yet another perfect 4.0—and she now serves as a teaching and research assistant for Dr. Stephen Ruzicka, one of the senior faculty in that program (also a committee member and occasional teacher in the BLS Program). She writes that the pay is negligible (she still has another career), but that “it is the delight of interacting with students that calls me back to the MALS table each semester.”

Thank you, Dawn!

Freedom of Speech in the Classroom

by Steve O’Boyle

I’m Feeling Lucky!

At no other time in history have people had access to more information than they do today. Within seconds we can become pseudo-experts on almost any topic, from Satanism to Zen, from the Kama Sutra to Lollapalooza, or even an upbeat biopic of Leonard Cohen (say it like “co-en,” then it works). This is not news to you (or at least I hope it’s not), but it is an important yet puzzling piece of the recent controversies concerning freedom of speech in the classroom.

In the past year, there have been several incidents in which university professors were sanctioned for the words they used in their classrooms while attempting to explain academic ideas. One incident that made national headlines involved a highly regarded sociologist, Patti Adler, a full professor at the University of Colorado.

Dr. Patti Adler.

In her intro-level Deviance in U.S. Society class, Dr. Adler spiced up her lecture on prostitution with “a skit in which many of Adler’s teaching assistants dress[ed] up as various types of prostitutes. The teaching assistants portrayed prostitutes ranging from sex slaves to escorts, and described their lifestyles and what led them to become prostitutes” (DailyCamera).

Adler is described in the article as having an unorthodox and engaging teaching style. “Students recounted how Adler showed up in class in a bikini to illustrate deviance or dressed as a homeless person to make the same point.” However, the prostitution lecture got—well, some negative attention—and at the time the article went to press, it looked like Dr. Adler was at risk of being forced into early retirement over the controversy. She was in jeopardy of losing her job for trying to teach her students in a way that was engaging, entertaining, and most of all, memorable. That is to say, for trying to do her job.

Prostitution skit in Adler’s Deviance class.

I do realize that some of you may not think this is a big deal, but as someone who teaches sociology at UNCG—a discipline that includes an entire area devoted to social deviance—well, as my old not-very-good mechanic used to say about my POS Jeep, “Man, this is troublematic…”

So if we offend a student in class—not directly of course, but by making them feel uncomfortable while trying to teach them important ideas—we might be severely sanctioned for this? Knowledge that is controversial, and can take a student out of their comfort zone, is off limits?

Do I have your attention yet?

Students are now exposed to more controversial envelope-pushing cultural ideas and images than ever before, and at much younger ages (scholars call this phenomenon “the internet”). So I find it a bit perplexing that these kids—who could never understand a teenager’s absolute thrill of finding their parents’ porno mags in the sock drawer, but (or perhaps because) they can now google any sex act and have a “how-to” video before their eyes in seconds (and long before their first real date)—these students are so much more savvy than I ever was at their age, but now I have to watch what I say more than ever in the classroom?!

And to complicate things further, because of the limitless access they have grown up with (and the seconds-long attention span that accompanies it), it takes more effort than ever to hold the attention of these Millennials without first grabbing it—with ideas and language that wake them the #@%$ up, and stop them from just sitting there in class half asleep, hoping whoever they’re trying to hook up with will respond to their inane text with a “k”…

“n class. bored. wnt 2 hookup l8r?”

So what to do? I’m going to follow the advice university counsel Skip Capone gave a few years back, after some legal challenges at other institutions—some of them blatantly political (here’s a link to the slide show, which is clearly dated).

My CYA strategy? Define “germane” for the class early on; then, when the time comes to talk about the touchy stuff, refer them back to that term. Then show them the link from the controversial stuff (i.e., the fun stuff) directly to how it relates—or is germane—to the academic topic. Finally, ask the class, “So do you see the connection here?” When they say “yes,” you’re covered.

Why All Babies Deserve to Die: Science and Theology in the Abortion Debate

by Matt McKinnon

The debate rages on…

Just a few of the headlines on the abortion debate from the last few weeks:

I would say that the abortion issue has once again taken center stage in the culture wars, but it never really left. Unlike homosexual marriage, which seems to be making steady progress toward resolution as a majority of Americans conclude that the freedom of consenting adults to marry is a basic civil right, the abortion debate continues to divide a populace torn over how to weigh the basic rights of the mother against those of the “potential” child.

I say “potential” child because herein is where the real debate lies: exactly when does a fertilized human egg, a zygote, become a “person,” endowed with certain human if not specifically civil rights?

Is it a person yet?

Dougherty’s main point in his article on liberal denial focuses on the “fact” of the beginnings of human life. He claims that liberals tend to make one of two types of arguments where science and human life are concerned: either they take the unresolved legal issue regarding the idea of personhood and transfer it back to the “facts” of biology, concluding that we cannot really know what human life is or when it begins, or they acknowledge the biological fact of the beginning of human life but claim that this has no bearing on how we should think about the legality of abortion.

Both sorts of arguments, he claims, are obscurantist, and fail to actually take into account the full weight of science on the issue.

But the problem, I contend, isn’t one of science: it’s one of theology—or philosophy for those less religiously inclined.

The problem is not the question of “what” human life is or “when” it begins. Dougherty points out:

After the fusion of sperm and egg, the resulting zygote has unique human DNA from which we can deduce the identity of its biological parents. It begins the process of cell division, and it has a metabolic action that will not end until it dies, whether that is in a few days because it never implants on the uterine wall, or years later in a gruesome fishing accident, or a century later in a hospital room filled with beloved grandchildren.

Two-cell zygote. Is this a person?

So basically, human life begins at conception because at that point science can locate a grouping of cells from which it can deduce all sorts of things from its DNA, and this grouping of cells, if everything goes nicely, will result in the birth, life, and ultimate death of a human being.

He even gets close to the heart of the problem when, in arguing against an article by Ryan Cooper, he claims that many people are not fine with the idea that an abortion represents the end of a life, nor are they comfortable with having a category of human life that is not granted the status of “humanity”—and thus not afforded basic human rights.

The problem with all of these discussions is that they dance around the real issue here—the issue not of “human life” and its definition and beginning, but rather the philosophical and often theological question of the human “person.”

If we look closely at Dougherty’s remarks above, we note two distinct examples of why the generation of human life is a “fact”: (1) we can locate DNA that tells us all sorts of things about the parents (and other ancestors) of the fetus and (2) this fetus, if everything works properly, will develop into a human being, or rather, I would argue, a human “person.”

For there’s the distinction that makes the difference.

After all, analyze any one of my many bodily fluids and a capable technician would be able to locate the exact same information that Mr. Dougherty points out is right there from the first moments of a zygote’s existence. But no one claims that any of these bodily fluids or the cells my body regularly casts off are likewise deserving of being labeled “human life,” though the sperm in my semen and the cells in my saliva are just as much “alive” as any zygote (believe me, I’ve looked).

No, the distinction and the difference is in the second example: The development of this zygote into a human person. My sperm, without an egg and the right environment, will never develop into a human being. The cells in my saliva have no chance at all—even with an egg and the right conditions.

Nope, not people.

Nope, not people.

So the real force of Dougherty’s argument lies in the “potential” of the zygote to develop into what he and anti-abortion folks would claim is already there in the “reality” of a human person.

The debate thus centers on the question of human personhood, what we call theological or philosophical anthropology. For one side, this personhood is the result of a development and is achieved sometime during the embryonic stage (like “viability”) or even upon birth. For others, it is there at conception. For some in both camps it would include a “soul.” For others it would not.

So the reason the abortion debate is sui generis, or “of its own kind,” is that here the issue is not the rights of a minority versus the rights of a majority, as it is in the debate about homosexual marriage, or even the rights of the mother versus the rights of the child. Rather, the real debate is about when “human life” is also a human “person” (note that this also informs the debate over whether or not to end the life of someone in a vegetative state).

Fetus at four weeks. Is this a person?

To this end, Mr. Dougherty is correct: We can and do know what human life is and when it begins. And he is correct that many are uncomfortable with the idea that abortion means the death of a human life. But he fails to recognize that the reason this is the case is that while those on one side regard this “life” as a human person, others do not. Potentially, perhaps, but not a “person” yet. And certainly not one whose “right to life” (if there even is such a thing: nature says otherwise—but that’s another blog post) trumps the rights of the mother.

So what does all of this have to do with all babies deserving to die? It’s simple: this is what the (necessary?) intrusion of theology into public policy debates entails. Once theological ideas are inserted (and note that I am not arguing whether they should or shouldn’t be), how do we adjudicate between their competing claims, or limit how far they go?

For the two great Protestant Reformers Martin Luther and John Calvin, representing the two dominant trajectories of traditional Protestant Christianity, humans are, by nature, sinful. We are conceived in sin and born into sin, and this “Original Sin” is only removed in Baptism (here the Roman Catholic Church would agree). Furthermore, we are prone to keep sinning due to the concupiscence of our sinful nature (here is where the Roman Church would disagree). The point is that, for Protestants, all people are not only sinful, but are also deserving of the one chief effect of sin: Death.

“For the wages of sin is death.” — Romans 6:23

Calvin was most explicit in Book 2, Chapter 1 of his famous Institutes:

Even babies bring their condemnation with them from their mother’s wombs: they suffer for their own imperfections and no one else’s. Although they have not yet produced the fruits of sin, they have the seed within. Their whole nature is like a seedbed of sin and so must be hateful and repugnant to God.

Since babies, like all of us, are sinful in their very nature, and since they will necessarily continually bear the fruits of those sins (anyone who’s ever tried to calm a screaming infant can attest to this), and since the wages of those sins is death, then it’s not a far-fetched theological conclusion that all babies deserve to die. And remember: “they suffer for their own imperfections.”

But they don’t just deserve to die—they deserve to go to hell as well (but that’s also another blog post). And this, not from the fringes of some degenerate religious thinker, but from the theology of one of Protestant Christianity’s most influential thinkers.

A sinner in the eyes of God (according to John Calvin, anyway).

Of course, it should be noted that Calvin does not imply that we should kill babies, or even that their death at human hands would be morally justifiable: though he does argue (and here all Christian theology would agree) that their death at the hand of God is not just morally justifiable but deserved. It should also be noted that the Roman Catholic theology behind the idea that children cannot sin until they reach the age of reason is predicated on the notion that this is only the case once their Original Sin has been removed in Baptism (so Jewish, Muslim, and Hindu kids would be sinful, unlike their Christian counterparts).

Again, this is not to argue that philosophical and theological principles should not be employed in the abortion debate, or in any debate over public policy. Only that (1) this is what is occurring when pro-choice and anti-abortion folks debate abortion and (2) it is fraught with complexities and difficulties that few on either side seem to recognize.

And contrary to Mr. Dougherty, this is beyond the realm of science, which at best tells us only about states of nature.

But the only way we have a “prayer” of real sustained dialogue—as opposed to debates that ignore our competing fundamental positions—is to take seriously the philosophical and theological issues that frame the question (even if my own example is less than serious).

But I’m not holding my breath. I would most certainly die if I did.

Homeownership as Hazing

by Chris Metivier

Ah, the joys of homeownership.

I’m writing this as I wait for the air conditioning repair tech to call me back. He’s supposed to be here by now. I called the company office about half an hour ago to make sure he was still coming, and they said he would call when he’s on his way. I know that technicians get held up when they’re on call, but it’s late afternoon and if he can’t make it today, I don’t know when I’ll be able to reschedule. I’m about to start a new position at the university, a 9-to-5 office job, so today might be my only weekday off for a while. Luckily, it’s rainy today, and cool. So it’s comfortable in my house for now. But it’s only going to get hotter, and my air conditioning unit just runs, impotently, like a mouse on one of those wheels, endlessly turning but accomplishing nothing.

I don’t know anything about air conditioning, but I expect I’ll learn. Much the same way that I’ve learned about plumbing, wiring, and landscaping in the last 9 months or so that I’ve been a homeowner—by having it explained to me by a very capable and polite tradesperson as they repair it. Each time something goes wrong, I ask my veteran homeowner friends if there’s a repair person who they can recommend. They always offer suggestions, but with a tone of resignation that indicates they’ve been here too. Disappointed, financially stressed, unprepared, and even sort of victimized.

He’s happy because he knows the air conditioning works at his house.

The thing that gets me is, all these same people were so enthusiastic about my decision to buy a home a year ago. They went on and on about what a good investment it is, how I’d been “throwing money away on rent all these years,” and how “it’s the grown-up thing to do.” Now that the damage is done, they smile wistfully when I complain that everything that couldn’t be detected in a pre-closing inspection has gone wrong since I bought my house. “Yup,” they say, “get used to that.”

Where was their resignation before I saddled myself with a mortgage? Where was their jaded sincerity? My homeowner friends were all middle-class pride and upward mobility when I was in the market for a house, but now they scoff at my naivete when I complain that I no longer have any savings because every extra dollar goes into fixing my house. I feel like an inductee into some vaguely secretive and mildly abusive club. One that spends a lot of time bragging about the benefits of membership to outsiders, but conveniently neglects to mention the disadvantages.

I had friends in college who joined fraternities, and during their sort of probationary “pledging” period, they were made to carry awkward objects around campus, or recite obscure facts about the university at the command of the senior members. I never joined any such organization myself, but I imagine this hazing ritual is intended to inspire loyalty and demonstrate commitment. My experience as a homeowner feels a little bit like that. It’s not that other homeowners are intentionally abusing or embarrassing me. But they knew I would be abused and embarrassed, and yet they encouraged me to join their club.

Like this, except it only hurts your bank account.

It occurs to me that this behavior applies to a wider range of groups. Parents often encourage non-parents to have children. Religious folks often recommend spirituality to the non-religious. Even fans of a particular TV show will push their friends to watch it too. It seems people always think their choices are the best ones, and that others would be better off if they agreed. This, of course, is no surprise. How could anyone get through life thinking that all their decisions were poor ones and that they would have been better off doing otherwise? Perhaps there are people like this, but they would be miserable friends, so I suspect not many of us are receiving advice from people like that.

I think there is a range of interpretations of this behavior. The most cynical is that misery loves company—that people who regret their decisions are resentful of those who chose more wisely and strive to bring cosmic justice into balance by luring the lucky or clever into ruin. Perhaps, again, there are some people like this, but it seems pretty unlikely that there are very many.

Slightly less cynical is the possibility that people want to feel justified in their decisions, regardless of whether they are actually good ones, and so they put the most positive spin on their choice to buy a house, have children, get married, etc., which has the dual effect of both convincing themselves that their decision was justified in retrospect, and possibly convincing others to join them, further validating the decision.

A “Shellback” initiation (as sailors cross the Equator for the first time).

A sort of value-neutral psychological explanation is that people simply don’t think about the bad parts of their experience when they make recommendations. They don’t have any real agenda when they tell you that their lives have improved since they started watching Game of Thrones or became gluten-free. They really believe, at least in that moment, that their lives have improved, and their recommendation is more of a description of the benefits they have actually enjoyed. They simply are forgetting about the costs. I think psychologists call this confirmation bias. Or maybe I’m thinking of a different thing, but I’m confident there is a name for it.

To be more optimistic, one more possibility is that people really do, on the whole, assess their lives as better in the light of the change they recommend. They judge themselves to be sincerely happier and they want you to be happier too. They reflect on the quality of their lives before and since their decision, they evaluate the impact of the change, and they believe that it has caused a net improvement. Of course, they may be mistaken in their conclusion that there is a causal link between their decision and their happiness (I’m referring to the work of Dan Gilbert here), but their motivations are benevolent.

While I lean toward the cynical in my explanations of human behavior in most cases, I suspect that my enthusiastic homeowner friends were not actually using me to justify their own bad choices or mollify their regrets. Probably they were reflecting on their current lives in a positive light, and perhaps mistaking correlation for causation.

xkcd #552: “Correlation.”

In any case, I don’t know if they really meant to initiate me into the exclusive club of homeownership through ritual hazing. But I have learned, since I started writing this a couple of weeks ago, that there is nothing wrong with my air conditioning that is detectable by any test a professional technician is likely to perform. So now I just seem delusional. To the technician, like a bourgeois academic, mystified by the workings of the machinery that makes my comfortable life possible. And to you, like a neurotic blogger spouting cautionary tales as though they were profound. Both you and he are probably right.

Whatchoo Talkin’ ‘Bout, Ivar Aasen?!

by Carrie Levesque

I’ve always loved to study languages. I grew up in a bilingual area close to the Quebec border where a French dialect nearly unrecognizable to the French (and sometimes to the Quebecois from whom it derived) is widely spoken. In college I continued to study French, majored in Russian, and took a few semesters of Spanish because I had a crush on the professor. As a graduate student in Slavic literatures, I studied Croatian for a summer in ultra-Catholic Zagreb, where the prize for the best language student was a large coin with a fetus on it (it was OK with me that I didn’t win). While I’ve met with many linguistic frustrations over the years (the Russian case system, French verb tenses), little in these experiences prepared me for the hot mess that is learning Norwegian as a foreigner in Norway.

A Norwegian-to-Norwegian translating dictionary. … No, really.

Because, you see, in Norway there is no standard spoken language. Norwegian literally has dialects without number, and there is no favored dialect. Your dialect is as good as mine. Mixing dialects: Also kosher. The Norwegian you learn in your Norwegian as a Second Language class is not the same Norwegian spoken on the street, and the Norwegian spoken on your street is different from the Norwegian spoken 50 miles down the road. In whatever Norwegian family you marry into, your spouse may speak a different dialect from his mother, who may speak a different dialect from her spouse. This is, of course, just a little inconvenient for non-native Norwegian speakers.

Back in the day, Danish was the standard written language of Norway. While the urban elite also spoke the same Danish they wrote, or a Norwegianized form of it, the isolated rural populations spoke dialects that evolved only gradually from Old Norse (Vikingspeak) to something more closely resembling the Dano-Norwegian spoken in the cities.

Ivar Aasen, who died 118 years ago, is sometimes the bane of my existence.

Norway was liberated from Danish rule in 1814, and as this was the era of Romantic Nationalism, establishing one’s own national language and culture was a primary concern. Norwegian linguist Knud Knudsen began to standardize a more fully Norwegianized form of written Danish into what is today called bokmål (literally, ‘book language’). Meanwhile, a self-taught country boy named Ivar Aasen (who, in my opinion, could not just leave well enough alone) traveled throughout Norway’s far-flung rural settlements, compiling many different spoken dialects into Norway’s other official written language, nynorsk, or “new Norwegian” (sometimes called ‘spynorsk’ (or ‘Pukewegian’) by bokmål devotees).

Although only about 15% of Norwegian schoolchildren opt for nynorsk as their language of primary instruction today, all schoolchildren learn both languages in school since nynorsk is the official written language of many counties, especially in Western Norway. So while Bergen is a bokmål city and all of its official written business is done in bokmål, it is located in a nynorsk county, and so all institutions administered by the county (the hospitals, universities and high schools) issue communications in nynorsk. And this doesn’t cover what is spoken in Bergen, which is its own crazy something else. The shock and despair I felt when I learned all of this has since diminished, but it may be years before it ever fully leaves.

From the start, people in Norway have been deeply, personally invested in whichever regional dialect of Norwegian they speak or write, so much so that to affect an easier-to-understand dialect—for the sake of, say, helping a foreign student of Norwegian understand them—feels so wrong that in such circumstances they prefer to speak English. While in Norway, unlike in the US, there is by and large no stigma attached to speaking a dialect, there is a stigma attached to speaking a dialect that is not your own.

Early on it was decided that since it was its dialects that kept Norwegian distinct from Danish, the equal status of all dialects must be preserved. This is not to say that people don’t make fun of each other’s dialects; there is lots of good-natured discussion around whose dialect is the ugliest. But, to give an example, teachers in school cannot correct their students for speaking a different dialect.

So linguistically egalitarian are the Norwegians that Norway’s major public television networks also produce programming in the language of the Sami, the indigenous population living mostly in the north of Norway. Though there are only 40,000–50,000 Sami in a total population of 5 million, the networks broadcast children’s shows and the evening news in Samisk every day (with Norwegian subtitles). That would be like American public television networks providing daily programming in Navajo or Cherokee. Hard to imagine that happening.

Sami in traditional dress for a cultural event.

It is interesting to imagine what things would be like in the US if we regarded all dialects equally, as the Norwegians do. While I don’t know a lot about the history of our most prominent dialects’ development, I think a lot of the stigma directed in some regions of the country toward certain dialects comes from the complicated and often ugly history framing, for example, Northerners’ prejudice toward Southern dialects, or whites’ prejudice toward Ebonics. How might it challenge us to do some serious thinking about these parts of our history if we learned to view as equally valid the different language patterns that grew out of this history?

“Dialects are not necessarily positively or negatively valued; their social values are derived strictly from the social position of their community of speakers,” as W. Wolfram and N. Schilling-Estes explain in American English: Dialects and Variation. What does the way a nation treats its speakers of certain dialects say about the values of that society? Because even if Ivar Aasen made my life more difficult by preserving and legitimizing the dialects spoken by many of Norway’s most disenfranchised citizens, I see the great value of his larger project and its enduring message. Everyone matters. And the tool we use to express our worth—our language—matters, too.

Why Writing Matters, And Why You Should Care

by Erin Poythress

3:00 A.M. Still up writing that essay.

You are working on your final essay, preparing to turn in 35% of your grade, and the universe hears you thinking out loud, hears your curses at the screen. It hears your exhaustion, and perhaps the slightest temptation to lift a paragraph or an idea from a source you’re reading. You know, since you can’t say it any better than its author did… and anyway, it’s 3:00 AM. Maybe this isn’t you. I hope it isn’t you. But if you’re human, you’ve probably at least thought about it. We all have.

This New York Times article describes how plagiarism is on the rise on college campuses all over the country, and any student would be wise to read it. It isn’t very long, and it takes a thoughtful approach to a topic that is typically discussed unthoughtfully in class: academic integrity and intellectual property.

Many instructors don’t want to have to spend time discussing plagiarism, and I’ll admit I have felt like students should know this by now. I have also felt that I am only preaching to the choir, since someone lazy and irresponsible enough to cheat clearly isn’t going to bother to read or listen. Often the discussions of cheating that occur the first day of class are like bad sex-ed talks from the 1950s—”don’t ever do it; bad things happen if you do”—without ever talking about what “it” is.

But while notions of authorship and intellectual property have changed in the digital age, the rules have not changed at UNCG or on any other college campus, and you need to know how that affects you.

"Do you think I haven't read that article myself?"

“Do you think I haven’t read that book myself?”

If you use someone else’s words or ideas and do not give them credit, it is plagiarism, which is just a fancy word for stealing. In an age when you can illegally download music, books, and movies, and when websites routinely steal passages from one another uncredited, this may seem like an antiquated notion. It’s not. Not only is that how the university’s Academic Integrity policy specifically defines plagiarism, but to cut and paste, or in any other way claim another’s thoughts as your own, does not prepare you for the kind of synthesis and analysis that intelligent people must do to be successful and productive members of society. The short-term result of plagiarizing any part of your essay in one of my classes is, of course, failing the class. But that concerns me less than the broader implications, which should concern you, too.

When you graduate from college, because you will have more education than many of your peers, you will have opportunities not only to be more financially secure in this world, but to shape this world. I would argue that all of us, whether we have a Ph.D. or a third-grade education, have an obligation to be a force of positive change in our communities. As you join the ranks of those with the most education, you have the opportunity to be more visible and more convincing, since you’ve spent all those years learning to think logically and argue persuasively. But this means you also have an obligation to do your thinking and arguing ethically. It’s not difficult to find examples of unethical people who have preyed upon the innocent and even profited by it. Bernie Madoff comes to mind, but he is only one of the more egregious examples of lapses in ethics that occur on smaller scales every day. His crimes had victims with names and bank accounts. You may think intellectual property heists have no such victims, but they do, as the linked article from The Crimson attests. They hurt the people who actually did the hard work of composing their thoughts, and they also hurt the people who steal them, by sustaining the lie that those who steal can generate meaningful, coherent thought. What do you think the world would look like if our country’s great thinkers resorted to cut and paste instead of doing the difficult work of trying to solve our world’s most pressing problems?

Slave labor, anyone?

This may seem like a strong reaction to a problem you view as minor, but I ask you, if, from here on out, all we do is copy/paste/recycle/reuse all the thoughts that came before without improving them, challenging them, overturning them, how will we solve problems we have never faced? What will be the fate of human innovation if all our thoughts are merely mashups of someone else’s deliberation?

Perhaps original thought is overrated, but I don’t think so. And the university doesn’t think so. And original thought is exactly what is expected in your essays. That doesn’t mean you can’t learn from other people’s ideas, but you must give them credit for lighting your path. Don’t denigrate your own talents by lifting their words verbatim without quotation marks and a citation—you’re all intelligent enough to discuss a topic without resorting to stealing.

After Jamaica Kincaid’s “Girl”

by Jessie Lane, BLS student

Jessie Lane.

The assignment, for Debby Seabrooke’s Contemporary Short Stories class, was to create a personal adaptation of “Girl” by Jamaica Kincaid. “Girl” was first published in the June 26, 1978, issue of The New Yorker, and has since been widely anthologized. It is frequently studied in literature and writing classes.

Girl
(after Jamaica Kincaid)

Jamaica Kincaid, c. 1978.

Remember that we all are always alone. This is not bad news. This news will lead you to the truth quickly. You can rest assured that there is nothing else in this world to do other than honor your family and to work. Do not worry about forming connections beyond that, this will only waste your time and cause you to suffer. Being alone is honorable—not lonely and isolationist. It is a sign of strength. The only company you need is that of your children. You will understand when you become a mother. But, I’m not sure that I want to become a mother. All other relationships are luxuries, and luxuries make you lazy. The world will make you think that you need people; they will constantly be at battle with your wits on this one. They want you to believe that relationships are a testament to your worth. Beat ’em! Make sure that you prove your worth through production. You must make progress, always and continually. This will help you to continue living life. Money is the key to success. Not riches, mind you, but the constant flow of steady money. Find that and do not waste your time on other pursuits. Well—you can—you are a free person, but if you do you will know great sorrow and depression. Not only, but mostly. Oh, and don’t let your softness show, it is unbecoming. Girls are really only anything these days if they act like boys. Don’t look like a boy, please…God…look like a pretty, lean girl. But, act like you can kick everyone’s ass because you know how to make it on your own. Be tough, dirty, fierce, and blood hungry on the inside and look physically accommodating on the outside. Being one of the guys without them knowing is key to this fight. But I don’t know much about fighting. Learn everything you can learn about fighting. This is important. You have got to fight in the war to win. Makeup is your war paint; wear it often, and wear it right. But I can’t seem to get used to the feeling of makeup on my face.
To be a proper girl, a proper daughter, you must also take care of us parents. There is a special reason that you are the only girl. Let the boys be single-minded. You can do it all. You must, really, if you want to live a life of meaning. You will understand when you have your children. But, don’t have children because you don’t want to be alone or because you think that you have an unnerving urge to give life and love. These are biological tricks that nature plays on all of us, and it’s important to remember that you can always control nature. Have your children because you will need someone to provide for you in old age. You don’t even need a man. But, I’m not even sure what I want. You will see, child. You will see.

———

Jessie Lane is a 28-year-old senior in the BLS Humanities concentration and lives in the mountains of Asheville, North Carolina. She started her college education at age 15 after dropping out of high school in Phoenix, Arizona, running away to Asheville, and enrolling in classes at Asheville-Buncombe Technical Community College. She has lived all over the United States and traveled worldwide, including to Mexico, South Africa, France, the Netherlands, and Spain. She is a passionate percussionist and plays with groups in various genres around the Asheville area. She loves dancing and writing.