Author Archives: The BLS Program at UNCG

BLS Student Featured on UNCG Home Page

by Jay Parr

Nargiza Kiger featured on the UNCG home page. Photo: Brian Kiger

I generally like to keep this blog about things other than the BLS Program, lest we be accused of navel-gazing. This is going to be one of those exceptions.

If you open the UNCG Home Page in the next two weeks, the first thing you’re going to see is our very own BLS student Nargiza Kiger smiling at you from a field in West Africa. Though she’s technically an in-state student (she and her husband live here in the Triad), I know of no other student who brings a more international perspective to the BLS Program. A native of Uzbekistan in central Asia, where relatively few women manage to achieve higher education, Nargiza traveled to neighboring Kyrgyzstan to attend a university. It was there that she met her husband Brian, and after finishing her Associate of Arts at Forsyth Tech, she had to reconcile her desire to continue her own education with Brian’s career in international development. The BLS Program allowed her to do just that, continuing her education at an American university while stationed with him in Nigeria and then in Ghana. She’s on track to graduate in December.

Nargiza greeting an elephant in Ghana.

Nargiza came to my attention last fall, shortly after she had moved to Ghana (one downside of my mostly-administrative role is that I’m not as in touch with all our students as I was when I was their academic advisor). I think it may have been infrastructure issues—unreliable power and internet connections—that first put her on my radar. Always on the lookout for BLS students who lead interesting lives, I asked her if she would be interested in writing a post for our blog. Given her history, which you can read in her cover story, I expected her to write about her own experiences. Boy, did she ever turn that on its head.

The post she gave me starts out on the frustrations of being an online student in an African city with tentative infrastructure—with the nerve-wracking image of taking an online test with a glitchy internet connection and having the power go out (yet again) in the middle of it. But then, after getting the reader sucked into her frustrating circumstances, she immediately turns around and points out that in Ghana, she is the privileged one. In a country with a per-capita income of roughly $2.00 a day, where education beyond 9th grade costs real money, and where placement into professional programs is rife with corruption, she can afford tuition at an American institution that costs more than most of her neighbors will make in a year. And yet, despite all these challenges—her own and others’—the post she gave me is ultimately the inspirational story of a security guard who is paying for his siblings to go to school, and who aspires to become a nurse so he can help others.

Nargiza and Ibrahim, the security guard, under the mango tree where Ibrahim likes to read.

I feel like our little online program is all grown up, out there on the front page of the university’s website. And I can’t think of many people to better represent us than Nargiza, wearing her UNCG colors in Tamale, Ghana, and constantly doing the little things she can do to make the world a better place.

Why I Do My Job: A Letter From a Graduate

by Jay Parr

I was recently cleaning out a pile of old papers in my office—going through each one, because anything with FERPA-protected information must be shredded—when I stumbled across this old email sent by an alumna just after she graduated in August 2011. It reminded me of why I do this job.

Dawn Humphrey (right), serving as a marshal at the May 2011 commencement.

Dear Jay,

For decades I called myself a high school graduate. Today I call myself a graduate student. What a change the BLS program has made in my life!

Three years ago I made a courageous decision to complete my bachelor’s degree, although I was in what some would consider my “golden years.” I sought your advice and you recommended I complete my associate’s degree. I subsequently enrolled at a community college in the fall of 2009 and graduated with an AA degree in August of 2010, earning a 4.0 GPA.

Last August, just one short year ago, I began my studies as a BLS student at UNCG while working full time. I managed to complete all the BLS requirements within one year, graduating on August 12, 2011, and again attaining a GPA of 4.0. I completed 3 hours more than was necessary in order to qualify for Latin Honors [summa cum laude] and the potential nod of Phi Beta Kappa.

As with most adult students, I was eager to complete the degree, yet I also juggled a career and a household and struggled with finances. Fortunately, the academic community has begun recognizing the needs of the online student, with time and convenience being paramount to address a work-life balance.

While I certainly have no desire to become a poster child, future candidates are inspired when they realize their dreams are so close to becoming a reality, thus hearing my story may provide the motivation to pursue their goal. I also had the pleasure of serving as a University Marshal, indicative of the BLS students who are becoming involved in more traditional campus activities and honors.

While my time in the BLS program was swift, my educational experience was excellent, graced by exemplary professors and a robust curriculum. Hard work and late nights, blended with lively discussion boards and insightful professors, proved rewarding beyond all my expectations.

Just one month shy of my 55th birthday, I have fulfilled my dream thanks to the wonderful BLS program at UNCG and the guidance of their attentive staff. It is my hope that other potential students will see that via the BLS program, the end of the rainbow may be closer than they think.

On a closing note, please accept my sincere thanks for your advice and encouragement through the years. Our early conversations were the catalyst that sparked the inspiration and courage to return to UNCG after a 30 year hiatus.

Many thanks,
Dawn L. Humphrey
Master of Arts in Liberal Studies Candidate

Dawn Humphrey receiving her Master of Arts from the chancellor one year after this letter.

Ms. Humphrey finished her Master of Arts in the MALS program one year later—faster than any previous MALS student, and with yet another perfect 4.0—and she now serves as a teaching and research assistant for Dr. Stephen Ruzicka, one of the senior faculty in that program (also a committee member and occasional teacher in the BLS Program). She writes that the pay is negligible (she still has another career), but that “it is the delight of interacting with students that calls me back to the MALS table each semester.”

Thank you, Dawn!

Freedom of Speech in the Classroom

by Steve O’Boyle

I’m Feeling Lucky!

At no other time in history have people had access to more information than in the current era. Within seconds we can become pseudo-experts on most any topic, from Satanism to Zen, from the Kama Sutra to Lollapalooza, or even an upbeat biopic of Leonard Cohen (say it like co-en, then it works). This is not news to you (or at least I hope it’s not), but it is an important yet puzzling piece of the recent controversies concerning freedom of speech in the classroom.

In the past year, there have been several incidents where university professors have been sanctioned for the words that they used in their classrooms while attempting to explain academic ideas. One incident that made national headlines involved a highly regarded sociologist named Patti Adler, a full Professor at the University of Colorado.

Dr. Patti Adler.

In her intro-level Deviance in U.S. Society class, Dr. Adler spiced up her lecture on prostitution with “a skit in which many of Adler’s teaching assistants dress[ed] up as various types of prostitutes. The teaching assistants portrayed prostitutes ranging from sex slaves to escorts, and described their lifestyles and what led them to become prostitutes” (DailyCamera).

Adler is described in the article as having an unorthodox and engaging teaching style. “Students recounted how Adler showed up in class in a bikini to illustrate deviance or dressed as a homeless person to make the same point.” However, the prostitution lecture got—well, some negative attention—and at the time the article went to press, it looked like Dr. Adler was at risk of being forced into early retirement over the controversy. She was in jeopardy of losing her job for trying to teach her students in a way that was engaging, entertaining, and most of all, memorable. That is to say, for trying to do her job.

Prostitution skit in Adler’s Deviance class.

I do realize that some of you may not think this is a big deal, but as someone who teaches sociology at UNCG—a discipline that includes an entire area devoted to social deviance—well, as my old not-very-good mechanic used to say about my POS Jeep, “Man, this is troublematic…”

So if we offend a student in class—not directly of course, but by making them feel uncomfortable while trying to teach them important ideas—we might be severely sanctioned for this? Knowledge that is controversial, and can take a student out of their comfort zone, is off limits?

Do I have your attention yet?

Students are now exposed to more controversial envelope-pushing cultural ideas and images than ever before, and at much younger ages (scholars call this phenomenon “the internet”). So I find it a bit perplexing that these kids—who could never understand a teenager’s absolute thrill of finding their parents’ porno mags in the sock drawer, but (or perhaps because) they can now google any sex act and have a “how-to” video before their eyes in seconds (and long before their first real date)—these students are so much more savvy than I ever was at their age, but now I have to watch what I say more than ever in the classroom?!

And to complicate things further, because of the limitless access they have grown up with (and the seconds-long attention span that accompanies it), it takes more effort than ever to hold the attention of these Millennials without grabbing it—with ideas and language that wake them the #@%$ up, and stop them from just sitting there in class half asleep, hoping whoever they’re trying to hook up with will respond to their inane text with a “k”…

"wnt 2 hookup l8r?"

“n class. bored. wnt 2 hookup l8r?”

So what to do? I’m going to follow the advice university counsel Skip Capone gave a few years back, after some legal challenges at other institutions—some of them blatantly political (here’s a link to the slide show, which is clearly dated).

My CYA strategy? Define “germane” for the class early on; then, when the time comes to talk about the touchy stuff, refer them back to that term. Then show them the link between the controversial stuff (i.e., the fun stuff) and the academic topic—how it relates, or is germane, to the ideas at hand. Finally, address the class with “so do you see the connection here?” When they say “yes,” you’re covered.

Why All Babies Deserve to Die: Science and Theology in the Abortion Debate

by Matt McKinnon

The debate rages on…

Just a few of the headlines on the abortion debate from the last few weeks:

I would say that the abortion issue has once again taken center stage in the culture wars, but it never really left. Unlike homosexual marriage, which seems to be making steady progress toward resolution as a majority of Americans come to see the freedom of consenting adults to marry as a basic civil right, the abortion debate continues to divide a populace torn over how to weigh the basic rights of the mother against those of the “potential” child.

I say “potential” child because herein lies the real debate: exactly when does a fertilized human egg, a zygote, become a “person,” endowed with certain human if not specifically civil rights?

Is it a person yet?

Dougherty’s main point in his article on liberal denial focuses on the “fact” of the beginnings of human life. He claims that liberals tend to make one of two types of arguments where science and human life are concerned: either they take the unresolved legal issue regarding the idea of personhood and transfer it back to the “facts” of biology, concluding that we cannot really know what human life is or when it begins, or they acknowledge the biological fact of the beginning of human life but claim that this has no bearing on how we should think about the legality of abortion.

Both sorts of arguments, he claims, are obscurantist, and fail to actually take into account the full weight of science on the issue.

But the problem, I contend, isn’t one of science: it’s one of theology—or philosophy for those less religiously inclined.

The problem is not the question of “what” human life is or “when” it begins. Dougherty points out:

After the fusion of sperm and egg, the resulting zygote has unique human DNA from which we can deduce the identity of its biological parents. It begins the process of cell division, and it has a metabolic action that will not end until it dies, whether that is in a few days because it never implants on the uterine wall, or years later in a gruesome fishing accident, or a century later in a hospital room filled with beloved grandchildren.

Two-cell zygote. Is this a person?

So basically, human life begins at conception because at that point science can locate a grouping of cells from which it can deduce all sorts of things from its DNA, and this grouping of cells, if everything goes nicely, will result in the birth, life, and ultimate death of a human being.

He even gets close to the heart of the problem when, in arguing against an article by Ryan Cooper, he claims that many people are not fine with the idea that an abortion represents the end of a life, nor are they comfortable with having a category of human life that is not granted the status of “humanity”—and thus not afforded basic human rights.

The problem with all of these discussions is that they dance around the real issue here—the issue not of “human life” and its definition and beginning, but rather the philosophical and often theological question of the human “person.”

If we look closely at Dougherty’s remarks above, we note two distinct examples of why the generation of human life is a “fact”: (1) we can locate DNA that tells us all sorts of things about the parents (and other ancestors) of the fetus and (2) this fetus, if everything works properly, will develop into a human being, or rather, I would argue, a human “person.”

For there’s the distinction that makes the difference.

After all, analyze any one of my many bodily fluids and a capable technician would be able to locate the exact same information that Mr. Dougherty points out is right there from the first moments of a zygote’s existence. But no one claims that any of these bodily fluids or the cells my body regularly casts off are likewise deserving of being labeled “human life,” though the sperm in my semen and the cells in my saliva are just as much “alive” as any zygote (believe me, I’ve looked).

No, the distinction and the difference is in the second example: The development of this zygote into a human person. My sperm, without an egg and the right environment, will never develop into a human being. The cells in my saliva have no chance at all—even with an egg and the right conditions.

Nope, not people.

So the real force of Dougherty’s argument lies in the “potential” of the zygote to develop into what he and anti-abortion folks would claim is already there in the “reality” of a human person.

The debate thus centers on the question of human personhood, what we call theological or philosophical anthropology. For one side, this personhood is the result of a development and is achieved sometime during the embryonic stage (like “viability”) or even upon birth. For others, it is there at conception. For some in both camps it would include a “soul.” For others it would not.

So the reason that the abortion debate is sui generis or “of its own kind” is because here the issue is not the rights of a minority versus the rights of a majority, as it is in the debate about homosexual marriage, or even the rights of the mother versus the rights of the child. Rather the real debate is about when “human life” is also a human “person” (note that this also informs the debate over whether or not to end the life of someone in a vegetative state).

Fetus at four weeks. Is this a person?

To this end, Mr. Dougherty is correct: We can and do know what human life is and when it begins. And he is correct that many are uncomfortable with the idea that abortion means the death of a human life. But he fails to recognize that the reason this is the case is that while those on one side regard this “life” as a human person, others do not. Potentially, perhaps, but not a “person” yet. And certainly not one whose “right to life” (if there even is such a thing: nature says otherwise—but that’s another blog post) trumps the rights of the mother.

So what does all of this have to do with all babies deserving to die? It’s simple: this is what the (necessary?) intrusion of theology into public policy debates entails. Once theological ideas are inserted (and note that I am not arguing that they should or shouldn’t be), how do we adjudicate between their competing claims or limit the extent that they go?

For the two great Protestant Reformers Martin Luther and John Calvin, representing the two dominant trajectories of traditional Protestant Christianity, humans are, by nature, sinful. We are conceived in sin and born into sin, and this “Original Sin” is only removed in Baptism (here the Roman Catholic Church would agree). Furthermore, we are prone to keep sinning due to the concupiscence of our sinful nature (here is where the Roman Church would disagree). The point is that, for Protestants, all people are not only sinful, but are also deserving of the one chief effect of sin: Death.

“For the wages of sin is death.” — Romans 6:23

Calvin was most explicit in Book 2, Chapter 1 of his famous Institutes:

Even babies bring their condemnation with them from their mother’s wombs: they suffer for their own imperfections and no one else’s. Although they have not yet produced the fruits of sin, they have the seed within. Their whole nature is like a seedbed of sin and so must be hateful and repugnant to God.

Since babies, like all of us, are sinful in their very nature, and since they will necessarily continually bear the fruits of those sins (anyone who’s ever tried to calm a screaming infant can attest to this), and since the wages of those sins is death, then it’s not a far-fetched theological conclusion that all babies deserve to die. And remember: “they suffer for their own imperfections.”

But they don’t just deserve to die—they deserve to go to hell as well (but that’s also another blog post). And this, not from the fringes of some degenerate religious thinker, but from the theology of one of Protestant Christianity’s most influential thinkers.

A sinner in the eyes of God (according to John Calvin, anyway).

Of course, it should be noted that Calvin does not imply that we should kill babies, or even that their death at human hands would be morally justifiable: though he does argue (and here all Christian theology would agree) that their death at the hand of God is not just morally justifiable, it is also deserved. It should also be noted that the Roman Catholic theology behind the idea that children cannot sin until they reach the age of reason is predicated on the notion that this is only the case once their Original Sin has been removed in Baptism (so Jewish, Muslim, and Hindu kids would be sinful, unlike their Christian counterparts).

Again, this is not to argue that philosophical and theological principles should not be employed in the abortion debate, or in any debate over public policy. Only that (1) this is what is occurring when pro-choice and anti-abortion folks debate abortion and (2) it is fraught with complexities and difficulties that few on either side seem to recognize.

And contrary to Mr. Dougherty, this is beyond the realm of science, which at best tells us only about states of nature.

But the only way we have a “prayer” of real sustained dialogue—as opposed to debates that ignore our competing fundamental positions—is to take seriously the philosophical and theological issues that frame the question (even if my own example is less than serious).

But I’m not holding my breath. I would most certainly die if I did.

Homeownership as Hazing

by Chris Metivier

Ah, the joys of homeownership.

I’m writing this as I wait for the air conditioning repair tech to call me back. He’s supposed to be here by now. I called the company office about half an hour ago to make sure he was still coming, and they said he would call when he’s on his way. I know that technicians get held up when they’re on call, but it’s late afternoon and if he can’t make it today, I don’t know when I’ll be able to reschedule. I’m about to start a new position at the university, a 9-to-5 office job, so today might be my only weekday off for a while. Luckily, it’s rainy today, and cool. So it’s comfortable in my house for now. But it’s only going to get hotter, and my air conditioning unit just runs, impotently, like a mouse on one of those wheels, endlessly turning but accomplishing nothing.

I don’t know anything about air conditioning, but I expect I’ll learn. Much the same way that I’ve learned about plumbing, wiring, and landscaping in the last 9 months or so that I’ve been a homeowner—by having it explained to me by a very capable and polite tradesperson as they repair it. Each time something goes wrong, I ask my veteran homeowner friends if there’s a repair person who they can recommend. They always offer suggestions, but with a tone of resignation that indicates they’ve been here too. Disappointed, financially stressed, unprepared, and even sort of victimized.

He’s happy because he knows the air conditioning works at his house.

The thing that gets me is, all these same people were so enthusiastic about my decision to buy a home a year ago. They went on and on about how good of an investment it is and how I’ve been “throwing money away on rent all these years” and “it’s the grown-up thing to do”. Now that the damage is done, they smile wistfully when I complain that everything that can’t be detected in a pre-closing inspection has gone wrong since I’ve bought my house. “Yup”, they say, “get used to that”.

Where was their resignation before I saddled myself with a mortgage? Where was their jaded sincerity? My homeowner friends were all middle-class pride and upward mobility when I was in the market for a house, but now they scoff at my naivete when I complain that I no longer have any savings because every extra dollar goes into fixing my house. I feel like an inductee into some vaguely secretive and mildly abusive club. One that spends a lot of time bragging about the benefits of membership to outsiders, but conveniently neglects to mention the disadvantages.

I had friends in college who joined fraternities, and during their sort of probationary “pledging” period, they were made to carry awkward objects around campus, or recite obscure facts about the university at the command of the senior members. I never joined any such organization myself, but I imagine this hazing ritual is intended to inspire loyalty and demonstrate commitment. My experience as a homeowner feels a little bit like that. It’s not that other homeowners are intentionally abusing or embarrassing me. But they knew I would be abused and embarrassed, and yet they encouraged me to join their club.

Like this, except it only hurts your bank account.

It occurs to me that this behavior applies to a wider range of groups. Parents often encourage non-parents to have children. Religious folks often recommend spirituality to the non-religious. Even fans of a particular TV show will push their friends to watch the show too. It seems like people always think that their choices are the best ones, and that others would be better off if they agreed. This, of course, is no surprise. How could anyone get through life thinking that all their decisions were poor ones and they would have been better off doing otherwise? Perhaps there are people like this, but they would be miserable friends, and so I suspect not many of us are receiving advice from people like that.

I think there is a range of interpretations of this behavior. The most cynical is that misery loves company—that people who regret their decisions are resentful of those who chose more wisely and strive to bring cosmic justice into balance by luring the lucky or clever into ruin. Perhaps, again, there are some people like this, but it seems pretty unlikely that there are very many.

Slightly less cynical is the possibility that people want to feel justified in their decisions, regardless of whether they are actually good ones, and so they put the most positive spin on their choice to buy a house, have children, get married, etc., which has the dual effect of both convincing themselves that their decision was justified in retrospect, and possibly convincing others to join them, further validating the decision.

A "Shellback" initiation (as sailors cross the Equator for the first time).

A “Shellback” initiation (as sailors cross the Equator for the first time).

A sort of value-neutral psychological explanation is that people simply don’t think about the bad parts of their experience when they make recommendations. They don’t have any real agenda when they tell you that their lives have improved since they started watching Game of Thrones or became gluten-free. They really believe, at least in that moment, that their lives have improved, and their recommendation is more of a description of the benefits they have actually enjoyed. They simply are forgetting about the costs. I think psychologists call this confirmation bias. Or maybe I’m thinking of a different thing, but I’m confident there is a name for it.

To be more optimistic, one more possibility is that people really do, on the whole, assess their lives as better in the light of the change they recommend. They judge themselves to be sincerely happier and they want you to be happier too. They reflect on the quality of their lives before and since their decision, they evaluate the impact of the change, and they believe that it has caused a net improvement. Of course, they may be mistaken in their conclusion that there is a causal link between their decision and their happiness (I’m referring to the work of Dan Gilbert here), but their motivations are benevolent.

While I lean toward the cynical in my explanations of human behavior in most cases, I suspect that my enthusiastic homeowner friends were not actually using me to justify their own bad choices or mollify their regrets. Probably they were reflecting on their current lives in a positive light, and perhaps mistaking correlation for causation.

xkcd #552: “Correlation.”

In any case, I don’t know if they really meant to initiate me into the exclusive club of homeownership through ritual hazing. But I have learned, since I started writing this a couple weeks ago, that there is nothing wrong with my air conditioning that is detectable by any tests that a professional technician is likely to perform. So now I just seem delusional. To this technician, like a bourgeois academic, mystified by the workings of the machinery that makes my comfortable life possible. And to you, like a neurotic blogger spouting cautionary tales as though they are profound. Both you and he are probably right.

Whatchoo Talkin’ ‘Bout, Ivar Aasen?!

by Carrie Levesque

I’ve always loved to study languages. I grew up in a bilingual area close to the Quebec border where a French dialect nearly unrecognizable to the French (and sometimes to the Quebecois from whom it derived) is widely spoken. In college I continued to study French, majored in Russian, and took a few semesters of Spanish because I had a crush on the professor. As a graduate student in Slavic literatures, I studied Croatian for a summer in ultra-Catholic Zagreb, where the prize for the best language student was a large coin with a fetus on it (it was OK with me that I didn’t win). While I’ve met with many linguistic frustrations over the years (the Russian case system, French verb tenses), little in these experiences prepared me for the hot mess that is learning Norwegian as a foreigner in Norway.

A Norwegian-to-Norwegian translating dictionary. … No, really.

Because, you see, in Norway there is no standard spoken language. Norwegian literally has dialects without number, and there is no favored dialect. Your dialect is as good as mine. Mixing dialects: Also kosher. The Norwegian you learn in your Norwegian as a Second Language class is not the same Norwegian spoken on the street, and the Norwegian spoken on your street is different from the Norwegian spoken 50 miles down the road. In whatever Norwegian family you marry into, your spouse may speak a different dialect from his mother, who may speak a different dialect from her spouse. This is, of course, just a little inconvenient for non-native Norwegian speakers.

Back in the day, Danish was the standard written language of Norway. While the urban elite also spoke the same Danish they wrote, or a Norwegianized form of it, the isolated rural populations spoke dialects that evolved only gradually from Old Norse (Vikingspeak) to something more closely resembling the Dano-Norwegian spoken in the cities.

Ivar Aasen, who died 118 years ago, is sometimes the bane of my existence.

Norway was liberated from Danish rule in 1814, and as this was the era of Romantic Nationalism, establishing one’s own national language and culture was a primary concern. Norwegian linguist Knud Knudsen began to standardize a more fully Norwegianized form of written Danish into what is today called bokmål (literally, ‘book language’). Meanwhile, a self-taught country boy named Ivar Aasen (who, in my opinion, could not just leave well enough alone) traveled throughout Norway’s far-flung rural settlements, compiling many different spoken dialects into Norway’s other official written language, nynorsk, or “new Norwegian” (sometimes called ‘spynorsk’ (or ‘Pukewegian’) by bokmål devotees).

Although only about 15% of Norwegian schoolchildren opt for nynorsk as their language of primary instruction today, all schoolchildren learn both languages in school since nynorsk is the official written language of many counties, especially in Western Norway. So while Bergen is a bokmål city and all of its official written business is done in bokmål, it is located in a nynorsk county, and so all institutions administered by the county (the hospitals, universities and high schools) issue communications in nynorsk. And this doesn’t cover what is spoken in Bergen, which is its own crazy something else. The shock and despair I felt when I learned all of this has since diminished, but it may be years before it ever fully leaves.

From the start, people in Norway have been deeply, personally invested in whichever regional dialect of Norwegian they speak or write, so much so that to affect an easier-to-understand dialect—for the sake of, say, helping a foreign student of Norwegian understand them—feels so wrong that in such circumstances they prefer to speak English. While in Norway, unlike in the US, there is by and large no stigma attached to speaking a dialect, there is a stigma attached to speaking a dialect that is not your own.

Early on it was decided that since it was its dialects that kept Norwegian distinct from Danish, the equal status of all dialects must be preserved. This is not to say that people don’t make fun of each other’s dialects; there is lots of good-natured discussion around whose dialect is the ugliest. But, to give an example, teachers in school cannot correct their students for speaking a different dialect.

So linguistically egalitarian are the Norwegians that Norway’s major public television networks also produce programming in the language of the Sami, the indigenous population living mostly in the north of Norway. Though there are only 40-50,000 Sami in a total population of 5 million, the networks broadcast children’s shows and the evening news in Samisk every day (with Norwegian subtitles). That would be like American public television networks providing daily programming in Navajo or Cherokee. Hard to imagine that happening.

Sami in traditional dress for a cultural event.

It is interesting to imagine what things would be like in the US if we regarded all dialects equally, as the Norwegians do. While I don’t know a lot about the history of our most prominent dialects’ development, I think a lot of the stigma directed in some regions of the country toward certain dialects comes from the complicated and often ugly history framing, for example, Northerners’ prejudice toward Southern dialects, or whites’ prejudice toward Ebonics. How might it challenge us to do some serious thinking about these parts of our history if we learned to view as equally valid the different language patterns that grew out of this history?

“Dialects are not necessarily positively or negatively valued; their social values are derived strictly from the social position of their community of speakers[,]” as W. Wolfram and N. Schilling-Estes explain in American English: Dialects and Variation. What does the way a nation treats its speakers of certain dialects say about the values of that society? Because even if Ivar Aasen made my life more difficult by preserving and legitimizing the dialects spoken by many of Norway’s most disenfranchised citizens, I see the great value of his larger project and its enduring message. Everyone matters. And the tool we use to express our worth—our language—matters, too.

Why Writing Matters, And Why You Should Care

by Erin Poythress

3:00 A.M. Still up writing that essay.

You are working on your final essay and preparing to turn in 35% of your grade, and the universe hears you thinking out loud, your curses at the screen. It hears your exhaustion, and perhaps, just the slightest temptation to lift a paragraph or idea from a source you’re reading. You know, since you can’t say it any better than its author did… and anyway it’s 3:00 AM. Maybe this isn’t you. I hope it isn’t you. But if you’re human, you’ve probably at least thought about it. We all have.

This New York Times article describes how plagiarism is on the rise on college campuses all over the country. Any student would be most wise to read this. It isn’t very long, and is a thoughtful approach to a topic that is typically unthoughtfully discussed in class: academic integrity and intellectual property.

Many instructors don’t want to have to spend time discussing plagiarism, and I’ll admit I have felt like students should know this by now. I have also felt that I am only preaching to the choir, since someone lazy and irresponsible enough to cheat clearly isn’t going to bother to read or listen. Often the discussions of cheating that occur the first day of class are like bad sex-ed talks from the 1950s—”don’t ever do it; bad things happen if you do”—without ever talking about what “it” is.

But the notions of authorship and intellectual property have changed in the digital age, and you need to know how this will affect you, because they haven’t changed at UNCG or any other college campus.

"Do you think I haven't read that article myself?"

“Do you think I haven’t read that book myself?”

If you use someone else’s words or ideas and do not give them credit, it is plagiarism, which is just a fancy word for stealing. In an age where you can illegally download music, books, movies, and where websites routinely steal passages from each other uncredited, this may seem like an antiquated notion. It’s not. Not only is that how the university’s Academic Integrity policy specifically defines plagiarism, but to cut and paste or in any other way claim another’s thoughts as your own does not prepare you for the kind of synthesis and analysis that intelligent people must do to be a successful and productive part of society. The short-term result of plagiarizing any part of your essay in one of my classes is, of course, failing the class. But that concerns me less than its broader implications. And it should concern any student, too.

When you graduate from college, because you will have more education than many of your peers, you will have opportunities not only to be more financially secure in this world, but to shape this world. I would argue that all of us—whether we have a Ph.D. or a third-grade education—have an obligation to be a force of positive change in our communities, and as you join the ranks of those with the most education, you have the opportunity to be more visible and more convincing, since you’ve spent all those years learning to think logically and argue convincingly. But this means you also have an obligation to do your thinking and arguing ethically. It’s not difficult at all to find examples of unethical people who have preyed upon innocent people and even profited. Bernie Madoff comes to mind, but he is one of the more egregious examples of lapses in ethics that occur on smaller scales every day. His crimes had victims with names and bank accounts. You may think intellectual property heists have no such victims, but they do, as the linked article from The Crimson attests. They not only hurt the people who actually did the hard work of composing their thoughts, but they hurt the people who steal them, because they help sustain the lie that those who steal can generate meaningful, coherent thought. What do you think the world would look like if our country’s great thinkers resorted to cut and paste instead of doing the difficult work of trying to solve our world’s most pressing problems?

Slave labor, anyone?

This may seem like a strong reaction to a problem you view as minor, but I ask you, if, from here on out, all we do is copy/paste/recycle/reuse all the thoughts that came before without improving them, challenging them, overturning them, how will we solve problems we have never faced? What will be the fate of human innovation if all our thoughts are merely mashups of someone else’s deliberation?

Perhaps original thought is overrated, but I don’t think so. And the university doesn’t think so. And original thought is exactly what is expected in your essays. That doesn’t mean you can’t learn from other people’s ideas, but you must give them credit for lighting your path. Don’t denigrate your own talents by lifting their words verbatim without quotation marks and a citation—you’re all intelligent enough to discuss a topic without resorting to stealing.

After Jamaica Kincaid’s “Girl”

by Jessie Lane, BLS student

Jessie Lane.

The assignment, for Debby Seabrooke’s Contemporary Short Stories class, was to create a personal adaptation of “Girl” by Jamaica Kincaid. “Girl” was first published in the June 26, 1978 edition of the New Yorker, and has since been widely anthologized. It is frequently studied in literature and writing classes.

Girl
(after Jamaica Kincaid)

Jamaica Kincaid, c. 1978.

Remember that we all are always alone. This is not bad news. This news will lead you to the truth quickly. You can rest assured that there is nothing else in this world to do other than honor your family and to work. Do not worry about forming connections beyond that, this will only waste your time and cause you to suffer. Being alone is honorable—not lonely and isolationist. It is a sign of strength. The only company you need is that of your children. You will understand when you become a mother. But, I’m not sure that I want to become a mother. All other relationships are luxuries, and luxuries make you lazy. The world will make you think that you need people; they will constantly be at battle with your wits on this one. They want you to believe that relationships are a testament to your worth. Beat ’em! Make sure that you prove your worth through production. You must make progress, always and continually. This will help you to continue living life. Money is the key to success. Not riches, mind you, but the constant flow of steady money. Find that and do not waste your time on other pursuits. Well—you can—you are a free person, but if you do you will know great sorrow and depression. Not only, but mostly. Oh, and don’t let your softness show, it is unbecoming. Girls are really only anything these days if they act like boys. Don’t look like a boy, please…God…look like a pretty, lean girl. But, act like you can kick everyone’s ass because you know how to make it on your own. Be tough, dirty, fierce, and blood hungry on the inside and look physically accommodating on the outside. Being one of the guys without them knowing is key to this fight. But I don’t know much about fighting. Learn everything you can learn about fighting. This is important. You have got to fight in the war to win. Makeup is your war paint; wear it often, and wear it right. But I can’t seem to get used to the feeling of makeup on my face. To be a proper girl, a proper daughter, you must also take care of us parents. There is a special reason that you are the only girl. Let the boys be single-minded. You can do it all. You must, really, if you want to live a life of meaning. You will understand when you have your children. But, don’t have children because you don’t want to be alone or because you think that you have an unnerving urge to give life and love. These are biological tricks that nature plays on all of us, and it’s important to remember that you can always control nature. Have your children because you will need someone to provide for you in old age. You don’t even need a man. But, I’m not even sure what I want. You will see, child. You will see.

———

Jessie Lane is a 28-year-old senior in the BLS Humanities concentration and lives in the mountains of Asheville, North Carolina. She started her college education at age 15 after dropping out of high school in Phoenix, Arizona, running away to Asheville, and enrolling in classes at Asheville-Buncombe Technical Community College. She has lived all over the United States and traveled worldwide, including to Mexico, South Africa, France, the Netherlands, and Spain. She is a passionate percussionist, and plays with groups in various genres all around the Asheville area. She loves dancing and writing.

Brand Loyalty and Personal Identity

by Chris Metivier

This is not Chris’ head.

I’m a traitor, a turncoat, a fence-jumper, and possibly having an identity crisis. Some of my friends feel betrayed, others feel like they’ve gained an ally, still others feel like they don’t know me anymore. I have mixed feelings of guilt, pride, defensiveness, and confusion. I don’t know if many of the people in my life will ever look at me the same way again. Certainly they won’t if they are looking at me in a recent selfie, since one of the advantages of my new life is that the front-facing camera on my new Android phone is much better than the one on my old iPhone.

I am a brand-betrayer. I’ve been “that Apple guy” for most of the last decade, an Apple evangelist even. I was an early adopter of the iPhone, and an apologist for it ever since. But no more. Now I’m an “Android phone guy”. Sure, I still use Apple computers (as well as Windows computers, because they both have their advantages), but everyone knows that when you get a text message and pull out your handset that is simultaneously your connection to the rest of the world and a distraction from it, it’s either going to be an iPhone, or something else. People can tell a lot about you from your phone. They know how you operate, what’s important to you, what kind of person you are. They can tell whether you are a sophisticated, modern aficionado of contemporary industrial design, or a utilitarian, no-nonsense, all-business power-user.

Battle of the Brands.

People judge you on what kind of phone you use. I had no idea how much until I made the switch. As a member of team iPhone, I never noticed how much people believed in their iPhones. It seemed normal to me. Obviously iPhone is the superior device. I’m no fool, why would I have ever bought an inferior product? As one of these people, I had never been on the receiving end of the nose-wrinkling, smug disgust toward anything non-iPhone. Since becoming an iPhone outsider, I have been forced to wonder, have I been behaving that way all these years?

Of course I know, academically, that all this talk about judgment and character evaluation is superficial and not to be taken seriously. I even teach my business ethics students this very lesson. I use “consumerism” to indicate this kind of social behavior. I know “consumerism” is used lots of different ways in different contexts, but this is how I use it in that course. I’m covering that unit now, and just last week established the term. It works like this. Advertising comes in two flavors: transactional and branding. Transactional ads are ones that give you some information about some product or service. Branding ads don’t give you any information, but instead aim to change the way you feel about a brand. The danger of branding is that it asks you to identify with the values that a brand (ostensibly) represents. When lots of people internalize these brand values, they begin to understand themselves, their personal identity, through the brands that they buy. Their self-identity depends on their consumption. Hence, consumerism.

Here is an example that is both particularly easy to analyze and particularly relevant to my case.

PC vs. Mac.

I’m sure you’ve seen one of the ads from this campaign, perhaps even a parody of it. It was, in terms of recognition, very successful. I often use it as an example in business ethics class. In it we see two characters who figuratively represent not just products, but brands. It’s important to notice that the two characters don’t represent specific products. The two options are Mac and PC. Not Mac and Windows, Mac and Dell, or something else. Apple has set up a dichotomy here between Mac and everything else. So you, the consumer, have exactly two options. Do you want to be like the hip, young, creative, relaxed, attractive Mac guy, or the stiff, nerdy, uptight, boring PC guy? The ad implies that there are “Mac people” and “PC people” and that they have personalities that can be identified by the products they use (or more importantly, buy).

When our culture becomes one (and it has) where people make purchasing decisions based not on the qualities of the products but on whether or not they believe those products reflect the identity that they want to project, then it becomes a consumerist culture.

Let’s face it, all computers (and all smartphones) do pretty much the same stuff. When we decide which one to buy, it’s rarely based on the properties of the device itself. They just aren’t very different. Sure, you might be used to doing things one way or another, and it would be inconvenient to have to learn new methods for getting things done (another strategy companies use to create an artificial barrier to switching once you’re in), but certainly you don’t think so little of yourself that you believe you couldn’t learn. When you decide to buy a Mac instead of a PC, Apple hopes that it’s at least in part because you think of yourself as the kind of person who uses a Mac. That’s where brand loyalty comes from.

Coke or Pepsi?

Sure you might come to believe that the product you chose is objectively superior, or that it will provide tangible benefits to you over the alternative. But those are ad hoc justifications for a decision you made based on how you felt about the product and what kind of person you want to be. Cognitive psychologists call it “confirmation bias” when we cherry-pick evidence to support the position that we’ve already adopted. As consumers, we don’t want to feel foolish for having bought an inferior product (I’m going to stick with the example of computers or smartphones), so we insist that our choice was the best in the face of criticism from those who happen to have bought a competing product for the same feelingsy reasons. And because none of us want to admit (or are maybe not even aware of) our underlying motivations, we get into esoteric fights about megapixels and gigahertz, or when these measurements are not on our side, we can use more abstract metrics like “user experience” or “software ecosystems.”

…which is a croc.

I’ve taught this lesson for years, and it never occurred to me until now that I was as guilty as anyone else. So I decided to put my money where my mouth is and turn in my iPhone, to abandon the comfortable hegemony of Apple’s walled garden for the untamed, shifting Otherness of Google’s Android platform. I admit, I had a period of homesickness when I discovered that some of the conveniences I had enjoyed would take some work to reestablish.

"I've made a huge mistake."

“I’ve made a huge mistake.”

But on the whole, as I should have expected, my life has continued largely unchanged. I don’t regret my decision. And as I write this, it occurs to me that this shouldn’t even be a big deal. It wouldn’t be if consumerism wasn’t such a strong force, both internally and externally, in my life.

So I guess that’s the lesson, which I’m still learning. Consumerism is ubiquitous, insidious, and powerful. I’ve resisted the desire to detail the arguments I’ve heard in just the last few days for why my decision to change camps was either foolish or inspired, and to analyze all the ways in which these arguments are misguided, selfishly motivated, or just mean. I know my decision was neither a blunder nor an epiphany. It was an experiment. And its value in self-reflection alone was worth the price.

Brand-logo alphabet. How many do you recognize?

Earth Day Is a Sham

by Matt McKinnon

I am very fond of the earth. I live here, and have now for almost five decades. It’s the only home I have ever known, and I plan on retiring here and someday giving back to the earth by, well, decomposing and becoming dirt.

Ashes to ashes and all that.

I also love to garden. I love the feel of dirt between my fingers: the rich, dark stardust that collected after the Big Bang and has nourished the origin and descent of our species, of all species, since the beginning of life.

In fact, my favorite part of gardening is not the planting, which is a close second. Or the harvesting, though I enjoy the fruits of my garden immeasurably. No, my favorite part is composting: Meticulously collecting all the bits and scraps from the kitchen as well as the garden to supply generous amounts of “greens” for nitrogen, shredding junk mail (and when I taught face-to-face, unclaimed papers) to add the proper amount of “browns” for carbon, assembling them all in my composter, and religiously turning and stirring to get the desired result of rich, black humus.

The good stuff.

(The sweet smell of a properly-proportioned compost pile is actually quite intoxicating.)

So my favorite part of gardening is not just sticking my hands in the earth, but making it.

I have always loved the earth, literally, for as long as I can remember. One of my first memories is getting home from church on Easter Sunday, brightly arrayed in my new pastel-colored Easter suit, and making a mad dash for the dirt, new plastic bulldozer in hand to play in my beloved earth.

I must have been maybe five years old.

And all through my childhood the place my friends and I played most regularly was a lower, barren part of my neighbor’s backyard that we endearingly called “down in the dirt.” As in: “I’ll be back later, Mom; we’re going down in the dirt.”

And when my wife was a teacher, I would happily assist her in making the annual “Earth Day Cake,” complete with crushed Oreos and gummy worms. Not too dissimilar from the mud pies I used to make in my own backyard.

So it is with much pain and anguish that I proclaim Earth Day to be a sham. A fraud. A ruse. Perpetrated by both well-meaning environmentalists (like myself) and corporate interests with ulterior motives.

The problem, of course, is not the idea or intent: Celebrating the earth that sustains all that we are, as well as raising awareness of exactly what we humans are doing to our planet.

No, the problem is that Earth Day, far from being a rousing success, has actually been an abject failure.

Though this, of course, depends on how you look at it.

From a PR perspective (is there any other where public policy is concerned?), Earth Day has been wildly successful. First proposed in 1969 by peace activist John McConnell to honor the earth as well as peace, and celebrated annually on April 22nd, Earth Day has grown from its initial celebration mostly in schools and colleges across the United States to become the largest secular holiday in the world, celebrated by some one billion people in over 192 countries.

But from a practical perspective, the movement has not had the desired effect of meaningfully arresting the manner in which we are still destroying the earth. Even more so than in 1970. Heck, it hasn’t even managed to convince most Americans that we are experiencing an ecological crisis.

Though perhaps it makes us feel better about it, at least one day a year.

And therein is the problem. Couched in terminology of honoring the earth, and even cleaning it up a bit, Earth Day domesticates what is arguably the greatest catastrophe to ever befall humanity: the impending collapse of an environment that is hospitable to human survival.

There have, of course, been other extinction events before—five in fact, with the largest being the “Great Dying” (or Permian-Triassic extinction event for all those biogeeks out there), some 252 million years ago, which resulted in the extinction of an estimated 90% of all species. The most famous, arguably, is the last major extinction, the Cretaceous-Paleogene extinction event around 66 million years ago that resulted in the loss of 75% of all species, including everyone’s favorite—all those non-avian dinosaurs. This of course was followed by the rise of mammals (and birds) as the dominant land vertebrates. Which has ultimately led us to the precipice of a sixth extinction event.

Many scientists (PBS reports 70% of all biologists) predict that we are now in the beginning of another extinction event, the first (and probably last) ever to be caused by humans. (The same humans, incidentally, who celebrate Earth Day every year.) The result of this current extinction may compete in magnitude with the Great Dying, resulting in the extinction of nearly 90% of all living species. And potentially much more quickly than the previous five extinction events.

Of course, the data is not conclusive and the consensus is not unanimous, as it rarely is in science, or anything else for that matter.

But what is clear is that, regardless of what the population believes about “climate change” or “global warming,” we humans have polluted and destroyed parts of the earth to the extent that they may never recover—at least not in terms of being able to support life as we know it. (And by that I mean human life as well as those things that support human life.)

More so than the recent coal ash spills in our own neighborhood or the release of toxic chemicals in West Virginia, the oceans are perhaps the best example of how much humans have destroyed and are continuing to destroy the earth’s environment.

Floating islands of trash in the Pacific Gyre.

So let’s be clear in a manner that climate change or global warming cannot: the oceans are dying at an alarming rate. And by “dying” I don’t mean metaphorically. I mean literally. As in, studies suggest that all of the world’s corals may be extinct by the end of this century due to the acidification of the oceans caused mostly by the carbon dioxide emissions from human activity. And once the oceans die, well, human survival becomes more than a little tenuous.

And yet instead of debating what best to do about the great damage we have already caused to the earth, we are instead debating how to regulate fracking (if at all), whether to institute a “carbon tax,” and whether or not to build a pipeline from the oil sands in Canada to refineries in the United States. Rest assured: such debates are moot. For if we succeed in burning all of the oil available in those sands as well as the natural gas and coal we extract from the ground here in the US, then our fate is sealed. Along with that of anywhere upwards of 90% of the species who inhabit earth along with us.

Oh, I almost forgot:

Have a Happy Earth Day.