Category Archives: Science and Technology

Why All Babies Deserve to Die: Science and Theology in the Abortion Debate

by Matt McKinnon

The debate rages on…

Abortion has been all over the headlines these last few weeks.

I would say that the abortion issue has once again taken center stage in the culture wars, but it never really left. Unlike homosexual marriage, which seems to be making steady progress toward resolution as a majority of Americans conclude that the freedom of consenting adults to marry is a basic civil right, the abortion debate continues to divide a populace torn between adjudicating the competing basic rights of the mother and the “potential” child.

I say “potential” child because herein lies the real debate: exactly when does a fertilized human egg, a zygote, become a “person,” endowed with certain human, if not specifically civil, rights?

Is it a person yet?

Dougherty’s main point in his article on liberal denial focuses on the “fact” of the beginnings of human life. He claims that liberals tend to make one of two types of arguments where science and human life are concerned: either they take the unresolved legal issue regarding the idea of personhood and transfer it back to the “facts” of biology, concluding that we cannot really know what human life is or when it begins, or they acknowledge the biological fact of the beginning of human life but claim that this has no bearing on how we should think about the legality of abortion.

Both sorts of arguments, he claims, are obscurantist, and fail to actually take into account the full weight of science on the issue.

But the problem, I contend, isn’t one of science: it’s one of theology—or philosophy for those less religiously inclined.

The problem is not the question of “what” human life is or “when” it begins. Dougherty points out:

After the fusion of sperm and egg, the resulting zygote has unique human DNA from which we can deduce the identity of its biological parents. It begins the process of cell division, and it has a metabolic action that will not end until it dies, whether that is in a few days because it never implants on the uterine wall, or years later in a gruesome fishing accident, or a century later in a hospital room filled with beloved grandchildren.

Two-cell zygote. Is this a person?

So basically, human life begins at conception because at that point science can locate a grouping of cells from which it can deduce all sorts of things from its DNA, and this grouping of cells, if everything goes nicely, will result in the birth, life, and ultimate death of a human being.

He even gets close to the heart of the problem when, in arguing against an article by Ryan Cooper, he claims that many people are not fine with the idea that an abortion represents the end of a life, nor are they comfortable with having a category of human life that is not granted the status of “humanity”—and thus not afforded basic human rights.

The problem with all of these discussions is that they dance around the real issue here—the issue not of “human life” and its definition and beginning, but rather the philosophical and often theological question of the human “person.”

If we look closely at Dougherty’s remarks above, we note two distinct examples of why the generation of human life is a “fact”: (1) we can locate DNA that tells us all sorts of things about the parents (and other ancestors) of the fetus and (2) this fetus, if everything works properly, will develop into a human being, or rather, I would argue, a human “person.”

For there’s the distinction that makes the difference.

After all, a capable technician could analyze any one of my many bodily fluids and locate the exact same information that Mr. Dougherty points out is right there from the first moments of a zygote’s existence. But no one claims that any of these bodily fluids, or the cells my body regularly casts off, are likewise deserving of being labeled “human life,” though the sperm in my semen and the cells in my saliva are just as much “alive” as any zygote (believe me, I’ve looked).

No, the distinction and the difference is in the second example: The development of this zygote into a human person. My sperm, without an egg and the right environment, will never develop into a human being. The cells in my saliva have no chance at all—even with an egg and the right conditions.

Nope, not people.

So the real force of Dougherty’s argument lies in the “potential” of the zygote to develop into what he and anti-abortion folks would claim is already there in the “reality” of a human person.

The debate thus centers on the question of human personhood, what we call theological or philosophical anthropology. For one side, this personhood is the result of a development and is achieved sometime during gestation (at “viability,” say) or even at birth. For others, it is there at conception. For some in both camps it would include a “soul.” For others it would not.

So the reason that the abortion debate is sui generis or “of its own kind” is that here the issue is not the rights of a minority versus the rights of a majority, as it is in the debate about homosexual marriage, or even the rights of the mother versus the rights of the child. Rather, the real debate is about when “human life” is also a human “person” (note that this also informs the debate over whether or not to end the life of someone in a vegetative state).

Fetus at four weeks. Is this a person?

To this end, Mr. Dougherty is correct: We can and do know what human life is and when it begins. And he is correct that many are uncomfortable with the idea that abortion means the death of a human life. But he fails to recognize that the reason this is the case is that while those on one side regard this “life” as a human person, others do not. Potentially, perhaps, but not a “person” yet. And certainly not one whose “right to life” (if there even is such a thing: nature says otherwise—but that’s another blog post) trumps the rights of the mother.

So what does all of this have to do with all babies deserving to die? It’s simple: this is what the (necessary?) intrusion of theology into public policy debates entails. Once theological ideas are inserted (and note that I am not arguing that they should or shouldn’t be), how do we adjudicate between their competing claims or limit how far they go?

For the two great Protestant Reformers Martin Luther and John Calvin, representing the two dominant trajectories of traditional Protestant Christianity, humans are, by nature, sinful. We are conceived in sin and born into sin, and this “Original Sin” is only removed in Baptism (here the Roman Catholic Church would agree). Furthermore, we are prone to keep sinning due to the concupiscence of our sinful nature (here is where the Roman Church would disagree). The point is that, for Protestants, all people are not only sinful, but are also deserving of the one chief effect of sin: Death.

“For the wages of sin is death.” — Romans 6:23

Calvin was most explicit in Book 2, Chapter 1 of his famous Institutes:

Even babies bring their condemnation with them from their mother’s wombs: they suffer for their own imperfections and no one else’s. Although they have not yet produced the fruits of sin, they have the seed within. Their whole nature is like a seedbed of sin and so must be hateful and repugnant to God.

Since babies, like all of us, are sinful in their very nature, and since they will necessarily continually bear the fruits of those sins (anyone who’s ever tried to calm a screaming infant can attest to this), and since the wages of those sins is death, then it’s not a far-fetched theological conclusion that all babies deserve to die. And remember: “they suffer for their own imperfections.”

But they don’t just deserve to die—they deserve to go to hell as well (but that’s also another blog post). And this, not from the fringes of some degenerate religious thinker, but from the theology of one of Protestant Christianity’s most influential thinkers.

A sinner in the eyes of God (according to John Calvin, anyway).

Of course, it should be noted that Calvin does not imply that we should kill babies, or even that their death at human hands would be morally justifiable: though he does argue (and here all Christian theology would agree) that their death at the hand of God is not just morally justifiable, it is also deserved. It should also be noted that the Roman Catholic theology behind the idea that children cannot sin until they reach the age of reason is predicated on the notion that this is only the case once their Original Sin has been removed in Baptism (so Jewish, Muslim, and Hindu kids would be sinful, unlike their Christian counterparts).

Again, this is not to argue that philosophical and theological principles should not be employed in the abortion debate, or in any debate over public policy. Only that (1) this is what is occurring when pro-choice and anti-abortion folks debate abortion and (2) it is fraught with complexities and difficulties that few on either side seem to recognize.

And contrary to Mr. Dougherty, this is beyond the realm of science, which at best tells us only about states of nature.

But the only way we have a “prayer” of real sustained dialogue—as opposed to debates that ignore our competing fundamental positions—is to take seriously the philosophical and theological issues that frame the question (even if my own example is less than serious).

But I’m not holding my breath. I would most certainly die if I did.

Earth Day Is a Sham

by Matt McKinnon


I am very fond of the earth. I live here, and have now for almost five decades. It’s the only home I have ever known, and I plan on retiring here and someday giving back to the earth by, well, decomposing and becoming dirt.

Ashes to ashes and all that.

I also love to garden. I love the feel of dirt between my fingers: the rich, dark stardust that collected after the Big Bang and has nourished the origin and descent of our species, of all species, since the beginning of life.

In fact, my favorite part of gardening is not the planting, which is a close second. Or the harvesting, though I enjoy the fruits of my garden immeasurably. No, my favorite part is composting: meticulously collecting all the bits and scraps from the kitchen as well as the garden to supply generous amounts of “greens” for nitrogen, shredding junk mail (and, when I taught face-to-face, unclaimed papers) to add the proper amount of “browns” for carbon, assembling them all in my composter, and religiously turning and stirring to get the desired result: rich, black humus.


The good stuff.

(The sweet smell of a properly-proportioned compost pile is actually quite intoxicating.)

So my favorite part of gardening is not just sticking my hands in the earth, but making it.

I have always loved the earth, literally, for as long as I can remember. One of my first memories is getting home from church on Easter Sunday, brightly arrayed in my new pastel-colored Easter suit, and making a mad dash for the dirt, new plastic bulldozer in hand to play in my beloved earth.

I must have been maybe five years old.

And all through my childhood the place my friends and I played most regularly was a lower, barren part of my neighbor’s backyard that we endearingly called “down in the dirt.” As in: “I’ll be back later mom; we’re going down in the dirt.”

And when my wife was a teacher, I would happily assist her in making the annual “Earth Day Cake,” complete with crushed Oreos and gummy worms. Not too dissimilar from the mud pies I used to make in my own backyard.

So it is with much pain and anguish that I proclaim Earth Day to be a sham. A fraud. A ruse. Perpetrated by both well-meaning environmentalists (like myself) and corporate interests with ulterior motives.


The problem, of course, is not the idea or intent: Celebrating the earth that sustains all that we are, as well as raising awareness of exactly what we humans are doing to our planet.

No, the problem is that Earth Day, far from being a rousing success, has actually been an abject failure.

Though this, of course, depends on how you look at it.

From a PR perspective (is there any other where public policy is concerned?), Earth Day has been wildly successful. First proposed in 1969 by peace activist John McConnell to honor the earth as well as peace, and celebrated annually on April 22nd, Earth Day has grown from its initial celebration mostly in schools and colleges across the United States to become the largest secular holiday in the world, celebrated by some one billion people in over 192 countries.

But from a practical perspective, the movement has not had its desired effect: we are still destroying the earth, even more so than in 1970. Heck, it hasn’t even managed to convince most Americans that we are experiencing an ecological crisis.

Though perhaps it makes us feel better about it, at least one day a year.

And therein is the problem. Couched in terminology of honoring the earth, and even cleaning it up a bit, Earth Day domesticates what is arguably the greatest catastrophe to ever befall humanity: the impending collapse of an environment that is hospitable to human survival.

There have, of course, been other extinction events before—five in fact, the largest being the “Great Dying” (or Permian-Triassic extinction event, for all those biogeeks out there) some 252 million years ago, which resulted in the extinction of an estimated 90% of all species. The most famous, arguably, is the last major extinction, the Cretaceous-Paleogene extinction event around 66 million years ago, which resulted in the loss of 75% of all species, including everyone’s favorite—all those non-avian dinosaurs. This of course was followed by the rise of mammals (and birds) as the dominant land vertebrates, which has ultimately led us to the precipice of a sixth extinction event.


Many scientists (PBS reports 70% of all biologists) predict that we are now at the beginning of another extinction event, the first (and probably last) ever to be caused by humans. (The same humans, incidentally, who celebrate Earth Day every year.) The result of this current extinction may compete in magnitude with the Great Dying, resulting in the extinction of nearly 90% of all living species. And potentially far more quickly than the previous five.

Of course, the data is not conclusive and the consensus is not unanimous, as it rarely is in science, or anything else for that matter.

But what is clear is that, regardless of what the population believes about “climate change” or “global warming,” we humans have polluted and destroyed parts of the earth to the extent that they may never recover—at least not in terms of being able to support life as we know it. (And by that I mean human life as well as those things that support human life.)

More so than the recent coal ash spills in our own neighborhood or the release of toxic chemicals in West Virginia, the oceans are perhaps the best example of how much humans have destroyed and are continuing to destroy the earth’s environment.


Floating islands of trash in the Pacific Gyre.

So let’s be clear in a manner that climate change or global warming cannot: the oceans are dying at an alarming rate. And by “dying” I don’t mean metaphorically. I mean literally. As in, studies suggest that all of the world’s corals may be extinct by the end of this century due to the acidification of the oceans caused mostly by the carbon dioxide emissions from human activity. And once the oceans die, well, human survival becomes more than a little tenuous.

And yet instead of debating what best to do about the great damage we have already caused to the earth, we are instead debating how to regulate fracking (if at all), whether to institute a “carbon tax,” and whether or not to build a pipeline from the oil sands in Canada to refineries in the United States. Rest assured: such debates are moot. For if we succeed in burning all of the oil available in those sands, as well as the natural gas and coal we extract from the ground here in the US, then our fate is sealed. Along with that of upwards of 90% of the species that inhabit the earth with us.

Oh, I almost forgot:

Have a Happy Earth Day.

 

Eastern US Bitten by Its Own Global Warming Tail

by Joel Gunn

Digging out of the snow in Boston.

I have sympathy for the high-profile scientists who are struggling with the politics of global warming: Did this or that weather event occur because of global warming? See for example this New York Times post (and its several updates) in which climate scientists address the effects of global warming on the Winter Olympics.

However, if you believe in thermodynamics, in the interlinking of atmosphere-ocean circulation and life processes, and that the world is warming, there is only one conclusion you can come to: everything that is happening is, to a greater or lesser degree, a product of global warming. There is a legitimate question as to what degree individual events are caused by that trend and what part is attributable to chaotic forces. The tendency, perhaps wisely, is for scientists to hide behind chaotic forces when the politics gets too hot. But the only real question is how the events and the trend fit together, and that is admittedly complicated. Since we have not lived with this particular change before, observation and model building have to take the lead. You can see the scientists struggling with these questions in the later parts of that post, and we as citizens of the globe should also participate in thinking this through.

Jet stream and polar vortex.

A good example is our current discussion of the wandering fragments of the polar vortex and the effects they have had on our weather. When I first started seeing weather reports about the polar vortex, I asked myself: how can this be? Probably the first really interesting article I read on the behavior of the atmosphere was in the ’70s, by Angell and Korshover, researchers at a government agency who were measuring the atmosphere in many very interesting ways. Their article was about the behavior of the polar vortex. In fact, they were discussing the reason for the cold weather in the mid ’70s. Who recalls that in the winter of ’76-’77 it was so cold the Ohio River froze over? I was sure that the Ice Ages were coming back.

Chicago seen over frozen Lake Michigan.

Our present-day relationship to the polar vortex has crept out less directly but more interestingly. After the initial reports appeared, the pieces of the global weather puzzle began to leak out. Someone came on television and claimed that the cause was a weakening of the circumpolar jet that ordinarily pins the polar vortex in place. That helped some. Then a friend sent me a newsletter about a warm pool of water in the Gulf of Alaska. This was also something I recall reading about in the ’70s, so I was familiar with how that pool turns the jet stream up into Canada, from where it zooms down on the eastern United States, bringing chilly air with it. I have no problem seeing that an arm of the tropical jet thundering around the warm pool and into Canada in the winter would throw the Arctic jet into disarray, freeing the polar vortex to wreak havoc.

NASA image of this winter’s first polar vortex event.

The real question is why that pool is so persistent. Usually it goes away in the fall, and the jet stream and the airmass that goes with it assume a cozier route across the US. A possibility is the current solar maximum, weak though it is, but what is the mechanism to warm that pool? The Gulf of Alaska is a long way north for it to be warmed much directly by the sun this time of year.

Then came another startling observation from my friend. Scientists were finding that strong westward passage of air along the equator was sinking atmospheric heat into the Pacific along the equator, within ±10 degrees of latitude. This seems to be a new phenomenon, and it worries me. Does it mean that the atmosphere is so hot from global warming that processes are active to cool it by sinking the excess heat we caused into the ocean? Can salmon and other Pacific fish survive such a thing?

Extreme drought in the West (those are mature trees above the waterline).

Since then I have been thinking that this could be the warming mechanism for the Gulf of Alaska pool. If the equatorial Pacific is being warmed by winds that sink atmospheric heat into the ocean, then the Japanese Current could be drawing that warm water around the Pacific, producing the warm pool in the Gulf of Alaska. The next bit of news I will be watching for is for someone who studies Pacific temperatures to confirm that this last link is in fact the case.

If all of that is true, then there is little mystery about why the winter is cold in the eastern US. It also might suggest that the condition will persist until the thermodynamic balances between atmosphere and ocean, equator and pole, are redressed. How long could that take: ten years? A thousand years? Perhaps a simulation might be able to work out the ten-versus-a-thousand-years question, providing a bit of a window on the future.

Seaside Heights, NJ after Sandy.

I have always thought it one of the great ironies of the current age that the people of the eastern US, perhaps the greatest perpetrators of global warming in the world, are being cooled by global warming. This too has been going on for a long time, maybe since the ’80s. It is interesting that the cooling process has turned mean this winter and bitten us with stinging cold. This is not the type of lesson in global warming we are used to getting from the weather.

Who Am I? (On Genealogy and Genetic Ancestry)

by Matt McKinnon

Who am I?

I have long been pestered by this question, seeking the answer not in a litany of likes and dislikes or the self-obsessed perspective that modern Western consumerist culture offers me. But neither in my own personal history—where I’m from, where I’ve been, and so on—nor, even less, in my career, my “profession,” what I do to make enough money to live comfortably and raise a family.


No, my interest in identity has been more in my genealogy, my distant past, and what we now call “deep genealogy”—the history of my DNA, that mysterious code I have no control over but that dictates much of who I am.

The more I have sought answers in these two areas, the more I have come to realize that they are decidedly different—that my genealogy (the relatively recent history of my family and ancestors) and my “deep genealogy” (the origins and history of my DNA) offer two quite different portraits—even though the latter, after tens of thousands of years, ultimately leads to the former.

But that’s the key: after tens of thousands of years.

I remember my first dabbling in genealogy when I was in high school: I had always known that my name, McKinnon—or rather MacKinnon—was Scottish in origin.  I had been told by my family that we were mostly “Scots-Irish,” a term which, I came to find out later, is basically an American invention used rarely if ever in either Scotland or Ireland.  It can denote the Ulster Scots whom the English used to colonize Northern Ireland in the 17th century (and are thus not “genetically” Irish at all), or Lowland Scots and those of the Borderlands between Scotland and England.

But a little research soon proved that the MacKinnon name is Highland, not Lowland or Border, and certainly not “Scots-Irish.” The Highlands and Islands of Scotland are mostly Gaelic, and hence Celtic in origin, while the Scots of the Lowlands are a mix of Celtic, Roman, German, English, Scandinavian, Irish, and Scottish in varying amounts. And since our most recent Scottish ancestor was a MacKinnon who left the Isle of Skye sometime in the late 18th or early 19th century, my Highland ancestry was confirmed.

So I spent the rest of my high school days donning plaid scarves, Shetland wool sweaters, and Harris Tweed caps and playing records of bagpipe music at home that frightened the cat and annoyed my parents and siblings to no end.

But deep down, I knew that this was not answer enough.  Indeed, ethnic identity continued to elude me and offer more questions than answers.

And it still does, even after countless hours spent researching family history and genealogy, and hundreds of dollars spent on research and DNA analysis.  Perhaps my developing awareness of the fragmentary and somewhat arbitrary nature of what we call “history” has made my search one of exponential questions instead of hard and fast answers.

For what we call “Celtic” is in fact a linguistic designation, like (and related to) “Germanic” or “Balto-Slavic.”  These are first and foremost language identifiers and not “genetic” ones.

So MacKinnon, being a Highland name, at least designates my ethnic identity as Celtic, right?

Perhaps.  At least to some extent.  But what does that really mean?

After all, these groups—Celtic, Germanic, Balto-Slavic, Italic—are only Indo-European linguistic identifiers with origins in a shared Proto-Indo-European population of tribes who inhabited Europe most probably during the late Neolithic Age (circa 4000 BCE).  Only then did these peoples begin their various migrations north and west as they differentiated into the more well-known (if often mistakenly applied) names like the Celts, Germans, Slavs, Romans, etc…

The point is that any location of one’s ancestry as “Scottish,” or “Highland,” or “Gaelic,” or “Celtic,” or, for that matter, “Germanic” or “Balto-Slavic,” is rather arbitrary, in that it assigns prominence to one moment in a wave of modern human migration that began in Africa some 70,000 years ago and arrived on the Pontic-Caspian steppe in what is today Eastern Europe about 30,000 years later. From there, these various groups migrated in all directions, as wave after wave of tribes populated Europe, developing different cultures and languages, though all sharing the same not-too-distant Indo-European past.

(It is interesting to note as well that these folks only started to look “European,” i.e., “white” around 11,000 BCE.)

So that Highland MacKinnon ancestry I was so sure about? Well, it turns out that a deep DNA analysis confirms my paternal lineage (the Y-chromosome of my father’s father’s father’s father’s father…all the way back to its beginning) to be that of Haplogroup (I won’t even get into it) I2, subgroup a2.

Haplogroup I began 30,000-40,000 years ago in Eastern Europe, with I1 and I2 diverging about 6,000 years later. I2a arose about 11,000 years ago in the Balkans and is still today concentrated in Eastern Europe and Russia. I2a2, that of my Highland Scots paternal DNA, only emerged some 7,800 years ago, also in the Balkans, before starting its migration north into Central and Eastern Europe as well as Russia.

And, at some point, it became the DNA of a male member of a Celtic or perhaps Germanic tribe who ultimately made his way to Scotland. And then passed it along to me.

So my Highland Scots DNA is actually Balkan in origin, and is shared by more Serbs and Croats and possibly even Russians than by my “fellow” Highlanders.

But if that’s not confusing enough, this only represents one line of grandfathers on my father’s side, going back roughly 8,000 years. If we consider that there are approximately 400 generations between me and my Neolithic “European” ancestors, then the number of my direct ancestors from the present day all the way back to the New Stone Age is staggeringly large [nerdy editor's note: large enough to need scientific notation: 2.58 x 10^120].
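If you want to check the nerdy editor’s math, a few lines of Python will do it (my own sketch, assuming a naive family tree with two parents per person and 400 generations):

    # Each generation back doubles the number of ancestor "slots,"
    # so 400 generations back there are 2**400 slots in that generation alone.
    # (Summing every generation back, 2 + 4 + ... + 2**400, gives 2**401 - 2.)
    generations = 400
    slots = 2 ** generations
    print(f"{slots:.2e}")  # -> 2.58e+120

That is vastly more slots than the number of humans who have ever lived, which is the point: real family trees fold back on themselves, with the same ancestors filling millions of slots apiece.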

But we need not go back that far to make my point: much of an individual’s “ethnic identity” is relatively arbitrary and tells precious little about their deep genetic makeup.

In calculating the rather complex mathematics of our ancestry, scientists have concluded that all modern humans are related to each other in the not too distant past—within a few hundred years in fact.  Steve Olson, writing in The Atlantic in 2002, reported that

  1. Everyone in the world is descended from Nefertiti and Confucius, and
  2. Everyone of European ancestry is descended from Muhammad and Charlemagne.

Grandma?

That would be everyone.

Which means that all modern humans alive today are related to each other—and related to each other rather recently, considering that modern humans have been in existence for about 100,000 years.


Indeed, everyone reading this post is probably at most my 20th cousin.

But you’re not invited over for Thanksgiving.

And I’m guessing you’re not all Highland Scots.

How Self-Driving Cars Could Change Our Lives in Unexpected Ways

by Chris Metivier

Who’s there?!

A couple of weeks ago an unusual thing happened to me. One of my neighbors knocked on my door. I haven’t lived in my current house long, and he just wanted to tell me that his lawn mowing had blown some grass clippings in my driveway and to assure me that he intended to clean it up. I certainly wasn’t worried about grass clippings in my driveway, but it was a polite way for my neighbor to introduce himself, and I was glad that he did. He’s a nice guy.

Afterward, my housemate remarked that it was strange for someone to just knock on the door. Typically when that happens it’s some sort of solicitor, and I’d just as soon pretend I’m not home as answer it. He said that he’s actually annoyed when someone knocks on the door unexpectedly. If it had been a friend, he said, we’d have known they were coming.

We considered how we came to have this attitude of preferring to avoid unannounced visitors. It occurred to us that this is the same way we feel about receiving calls from unrecognized phone numbers. I don’t know about you, but I usually ignore those calls, because again, it’s typically a solicitor. The unexpectedness of a visit or phone call implies that it’s likely to be unwanted. That small bit of information is what we’re acting on when we decide to ignore a knock on the door or a ringing phone.

Just dropping by. Remember when that used to happen?

He mentioned that when he was young, it was common for family friends to drop by unannounced. It was a normal social practice. But it doesn’t happen any more. We concluded that the ubiquity of cell phones is at the root of this change. We know now when someone is on their way over because they—knowing that I always have my phone with me and they theirs—will call (or more typically text) to let me know they’re coming. I usually don’t even need to answer my door. If I’m expecting someone, I just make sure the door is unlocked so they can just walk in.

This change in attitude toward visiting someone’s home is sort of an unexpected side-effect of cell phone technology. I wouldn’t have guessed that the portability of telephones would lead to a shift in attitude toward something that isn’t obviously related to telephones. It made me think: what other technologies might have unexpected effects on our attitudes toward common social practices?

Google Self-Driving Car.

Some days later I was driving on the highway. There was some traffic and I got to thinking, if only everyone would drive exactly the same speed, there would be no traffic. Then I thought, those Google self-driving cars could do that. In fact, those Google self-driving cars could probably do it much more safely and efficiently, with smaller gaps between cars even at high speeds. I suspect they could merge in and out of traffic with flawless precision. There would be no traffic bottlenecks at major junctions. Maybe I’m expressing more confidence than the technology merits, or maybe anyone who doesn’t is a technophobe. (There. I said it.)

Then I got to thinking about how different traveling by road would be if every car on the road was self-driving. Not only would you probably get where you’re going faster, because of the elimination of most traffic problems (like some idiot doing 55 in the fast lane), you’d also know exactly and reliably when you would arrive at your destination. No longer would “traffic” be an excuse for showing up late. And you couldn’t fudge it. If you left behind schedule, there would be no making up the time by driving extra fast. The self-driving cars would always move at the same speed. Robot cars have no sense of urgency.
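Here’s a toy sketch of that intuition in Python (my own illustration, not anything from Google’s engineers): cars that move at identical speeds keep their spacing forever, while mixed speeds make the gaps drift.

    # A toy kinematics sketch, not a validated traffic model: each car
    # simply advances by its own speed, with no car-following or collision logic.
    def headways_after(speeds, steps=100, spacing=30.0):
        positions = [i * spacing for i in range(len(speeds))]
        for _ in range(steps):
            positions = [p + v for p, v in zip(positions, speeds)]
        # gaps between consecutive cars after the run
        return [b - a for a, b in zip(positions, positions[1:])]

    print(headways_after([30.0] * 5))            # uniform speed: every gap stays 30.0
    print(headways_after([28, 30, 29, 31, 30]))  # mixed speeds: gaps balloon or go negative
    # (a negative gap means the trailing car has caught the one ahead:
    # in real traffic, that is a jam or a crash)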

Nissan Autonomous Drive Vehicle.

Furthermore, since there would be no driving extra fast, there would be no breaking the speed limit. Maybe the notion of a speed limit would become nonsensical, since the self-driving cars wouldn’t travel at some range of speeds with an upper limit; they would travel (except while merging) at exactly one speed. Probably other kinds of traffic laws would become obsolete as well.

Think about how common it is to exceed the speed limit, and think about how you feel when you suddenly notice a police cruiser on the road behind you, or on the side of the highway. Maybe you don’t do this, but I immediately feel tense, nervous, guilty. Maybe your stomach lurches a little. Maybe you reflexively take your foot off the gas. I do all those things, even if I’m driving below the posted speed limit. I feel like a criminal who will suffer or avoid punishment only on the whim of some guy in a uniform. (Who does he think he is?! Ugh! Cops!)

Well … rats.

Now think about how you react to seeing a police officer while you’re both on foot. You don’t feel guilty. You don’t feel nervous. You probably feel safe. You might even smile or nod. You don’t have to avert your eyes for fear he’ll memorize your shifty face so as to apprehend you later. When you’re on foot, police are there to protect you, not to persecute you. The only trouble is that this doesn’t happen all that much. The bulk of a typical (non-criminal) person’s experience with police is on the road, where we’re suspicious of them, the threat of punishment implicit in their mere presence.

It’s a beautiful evening to be on a foot beat at the park!

I suspect this is a common attitude toward police. When you’re on foot, they’re your allies. When you’re behind the wheel, they are symbols of authority who are only out to enforce laws against you. My prediction is that if self-driving cars were common, people’s attitudes toward law enforcement would change dramatically. Most people’s negative experiences with the police are over speeding tickets. Since self-driving cars would make the very concept of speeding obsolete, the negative experiences (including anxiety) surrounding contact with police would be eliminated (except maybe for parking tickets).

Sure, there are other reasons people might have negative attitudes toward law enforcement. But I’m willing to bet that most folks are like me. If I didn’t think that all cops were out to give me speeding tickets (because of some insidious quota program that the state will obviously deny, but I know better), I’d be likely to have a much more positive attitude toward them.

I know this isn’t really a big deal. It’s not going to change our lives in any really significant way (more than self-driving cars already would). I know attitudes toward police aren’t a serious societal problem that many people give much thought to. And I know that self-driving cars are technologically possible now, but economically a long way off. This is just an exercise in thinking about the ways technology might change our lives in ways we don’t expect at first. It’s an exercise in excitement for the ways in which our lives might be different, and better, due to minor, indirect effects of new technologies. Sure, I’m being optimistic, and sure, you can be a future-dreading techno-pessimist if you want. But the future will arrive in spite of pessimists, and there will be myriad benefits for those of us who embrace it.

Wait … who’s driving this thing?!

Jumping the Shark

By Marc Williams

The Fonz

In 1977, the television sitcom Happy Days began its fifth season with an audacious episode so different in tone from its first four seasons that it took viewers by surprise. Happy Days’ appeal had always been its nostalgic attitude toward the 1950s and the likable, down-to-earth characters on whom each episode focused. The motorcycle-riding, leather-jacket-wearing heartthrob Arthur “the Fonz” Fonzarelli–played by Henry Winkler–was the epitome of cool and remains an icon of coolness today.

Unexpectedly, in the fifth season’s premiere episode, the Fonz decided to prove his bravery by jumping over a shark while water skiing. It was a baffling moment in television history. The first four years of the show had nothing to do with water skiing, and the Fonz had never been the kind of character who needed to “prove himself” to anyone. More superficially, viewers weren’t accustomed to seeing the Fonz in swimming trunks. It was an odd episode after which the series could never be the same–a point of no return. This moment gave birth to the phrase “jumping the shark,” a term coined by Jon Hein to describe the moment when a television show betrays its origins–perhaps suggesting that the writers have run out of ideas. Often, shows are thought to have jumped the shark when a key character leaves the show, or if an important new character is introduced. Hein started a website dedicated to the phenomenon, where readers can debate the moment at which their favorite television shows jumped the shark. Hein sold the website in 2006, but the site is still active.

Here’s a clip (via YouTube) of Fonzie’s famous shark jump:

The website argues that virtually any creative endeavor can jump the shark: musical groups, movies, advertising and political campaigns, and so on. But can an educational television program jump the shark? Some have argued that Discovery Channel’s Shark Week has done so.

For years, Shark Week has provided viewers with fascinating documentaries about recent shark research and has captured some truly eye-popping footage. For example, a 2001 Shark Week episode entitled “Air Jaws” contained some of the most stunning nature film I’ve ever seen–images of enormous great white sharks leaping completely out of the water, attacking seals and seal decoys. Like nearly everything one expects to see on The Discovery Channel, Shark Week is usually both entertaining and educational.

Click to view a clip from “Air Jaws”

When watching that spectacular 2001 episode, I wondered to myself: how will Discovery Channel ever top this? What shark footage could possibly compete with these amazing images? How will they possibly attract viewers next year? Not surprisingly, Discovery Channel dedicated many of its subsequent Shark Week shows over the past twelve years to more footage of jumping great whites–and not much else. Perhaps the producers acknowledged that they simply couldn’t surpass the spectacle of “Air Jaws.” Until the 2013 installment of Shark Week, that is.

The centerpiece of Shark Week 2013 was “Megalodon,” a documentary about the prehistoric shark that paleontologists believe grew to lengths of 60 feet or more. I’ve always been fascinated by megalodon; my older brother was a shark enthusiast when we were young, and I vividly recall him showing me a photo of fossilized megalodon jaws he found in a book–I couldn’t believe that such an enormous creature ever lived. I was awed by the thought of it. Naturally, when I read that Discovery Channel was featuring megalodon in its 2013 Shark Week series, I set my DVR.

The episode begins with some amateur video footage from a fishing party aboard a boat off the coast of Cape Town, South Africa. The amateur footage ends with some fearsome crashes and the viewer then learns that the footage was recovered from the boat’s wreckage–and that none of the young passengers survived. When my wife and I watched the episode, we both thought the footage looked a little too polished to be amateur footage. My wife said she didn’t remember hearing anything on the news of a horrible boating accident and I didn’t remember such a story either.

Viewers were then introduced to a self-proclaimed expert in mysterious oceanic events: a dubious specialty, held by a man who was perhaps a bit too comfortable in front of the documentarian’s camera.

As the program continues, viewers learn that megalodon may not be extinct after all! And of course, in true Shark Week fashion, there is some stunning footage offering tantalizing glimpses of what might be a live megalodon in the ocean. The ocean is a huge place, we’re reminded, and new species are discovered every year. The coelacanth, for instance, was thought to have been extinct for over 60 million years until a live specimen was discovered in 1938. Even very large animals like the giant squid and the megamouth shark have only recently been captured on film, so the evidence supporting a modern-day megalodon simply can’t be dismissed.

Click to view a clip from “Megalodon”

The program was extremely entertaining and was easily the most exciting Shark Week show I’ve seen since “Air Jaws.” And not surprisingly, “Megalodon” received the highest ratings in the history of Shark Week. Unfortunately, as you may have guessed, it was all a hoax. Like Animal Planet’s 2012 documentary on mermaids, all of the nature footage and expert testimonies were fabrications. My wife and I hadn’t heard about the vanished boating party on the news because, of course, there never was a boating party. There was virtually nothing true about Discovery Channel’s “Megalodon.” But many viewers were fooled, and subsequently criticized the network for misleading and humiliating the audience.

What do you think? By airing a work of fiction–and presenting it as truth–did Shark Week jump the shark? Have the producers run out of ideas? Have they abandoned Shark Week’s reputation? Or were Shark Week viewers naive all along for seeking education through commercial television?

News From the Final Frontier

by Claude Tate

This is not something I normally put out for public consumption, but maybe the time has come.  I’m a space nerd, and have been since I first became aware of rockets.

Neil Armstrong on the Moon

I can’t remember when I was first introduced to rockets, but I do remember seeing the launches of the first manned flights in the auditorium of my elementary school.  I remember our teacher taking us to the auditorium where a single TV was placed on a stand. It was a small country school with few resources, so while I don’t remember whether it was the only TV in the school or not, it may well have been. At least it was the only one I was aware of. Our class and a number of others would sit there staring at the rocket sitting there on the pad on that small, grainy, black and white television way down there on the stage. The early Mercury flights always had delays, so often it would take some time before the big moment happened. But it always happened.  The rocket would come alive and lift majestically for the heavens. It only lasted for ten seconds or so, but what a magnificent ten seconds. I was hooked. A fire was lit that still burns today.

Last year, politics dominated our news. John King is probably doing something to torture that touch-screen election map he stood in front of every day, day in and day out, week in and week out, month in and month out. I hope he got extra pay for that. I was listening to NPR one day when they had a story about a 5-year-old being shown a picture of President Obama. When they asked him if he knew who that person was, he said, “I’m Barack Obama and I sponsored this message.”

But while everyone seemed to be focused on every word that was uttered in the political arena, there were some significant things happening on the final frontier, some of which received attention and some of which did not. While the following is not a comprehensive list of everything that happened, these four events stood out for me. One was a milestone, two signified the passing of an era, and the other was a WOW! event for NASA.

The Voyager Interstellar Mission

First, a milestone was reached as the two Voyager spacecraft began leaving the solar system.

Voyager Spacecraft (Both Voyagers were identical)

Their mission can be broken down into two parts. The first part was to increase our knowledge of the solar system. Voyager 1, launched in September of 1977, did flybys of Jupiter and Saturn. Voyager 2 was launched in August of 1977 and, in addition to flybys of Jupiter and Saturn, also flew by Uranus and Neptune. The second part, the Voyager Interstellar Mission or VIM, began after Voyager 2 passed by Neptune. The VIM consists of three phases: the termination shock phase, the heliosheath exploration phase, and the interstellar exploration phase. Both spacecraft are now in the heliosheath exploration phase. We do not know how thick this environment is, so we cannot determine exactly how long they will be in this phase, but it will probably be several years. After that, it will be interstellar space. They are still operating like champs and have enough power to last until around 2020. After that they will drift. And provided neither is hit by anything, Voyager 1 will come within 1.6 light years of a star called AC+79 3888, and Voyager 2 will pass within 4.3 light years of Sirius. And then, who knows.

The Retirement of the Space Shuttle Fleet

A Typical Space Shuttle Launch

The next two events signaled the passing of an era.  First was the retirement of the Space Shuttles and their final trips to their respective exhibition sites.

The Space Shuttle flew 135 missions and was the face of the American space program for 30 years, from 1981 to 2011. The accomplishments of Columbia, Challenger, Discovery, Atlantis, and Endeavour are far too numerous to list here. But two things really stand out to me: the contributions the Shuttle made to the construction of the International Space Station, which would probably have been impossible without it, and the placing into orbit of the Hubble Space Telescope.

Enterprise Being Flown Over New York To Her New Home

Enterprise was the first orbiter built, and while it never flew in space, it was essential to refining the technology and design for the other Shuttles. It has been moved from the Smithsonian’s National Air and Space Museum to the Intrepid Sea, Air & Space Museum in New York City.

Endeavour at home in Los Angeles (shuttles are big)

In October, Endeavour was moved to the California Science Center in Los Angeles. Shuttle Atlantis has been moved to the Kennedy Space Center Visitor Complex in Florida and will be placed on display in 2013. Discovery replaces Enterprise at the Smithsonian. It was heart-warming to see so many people turn out to watch the shuttles make their final voyages to their respective retirement destinations.

The Death of Neil Armstrong

Neil Armstrong

The other event that signified the passing of an era was the passing of the first man on the moon, Neil Armstrong, at the age of 82, from complications following bypass surgery.

Armstrong, a Navy pilot during the Korean War, served as a civilian test pilot until being selected as part of the second ‘class’ of astronauts in September of 1962. He was one of two civilian astronauts (the preference was for military test pilots) and the first American civilian to go into space when he commanded Gemini 8 in March of 1966. He was selected as commander of the Apollo 11 crew in December of 1968. The other members of the crew were Michael Collins and Buzz Aldrin.

Apollo 11 left on its voyage on July 16, 1969. Armstrong’s coolness under pressure was legendary, a trait that would serve him well in those last few seconds over the moon. In the final seconds, as the lunar module, Eagle, descended to the surface of the moon, the landing computers became overloaded. When Armstrong saw they were headed for an unsafe landing site, he took over and manually flew the Eagle to a safe touchdown some distance away. The folks at NASA were worried, but they need not have been. As it turned out, while estimates of the amount of fuel left have varied over the years, the number most often cited is that they had under 20 seconds of fuel remaining when they landed.

The Apollo 11 crew: Neil Armstrong, Michael Collins, and Buzz Aldrin.

The landing took place on July 20, 1969. I was among those millions around the world who were glued to the television and listened to the NASA audio and animations as the Eagle approached the moon; heard Armstrong describe the descent to Mission Control in Houston; and felt that feeling of pride as he said, “Houston, Tranquility Base here. The Eagle has landed.” I was still there when, after several hours of rest and preparation, Armstrong opened the hatch of the Eagle, attached a TV camera to its leg, and descended the ladder to the moon’s surface. And I watched and listened on live TV as his first boot touched the moon and he uttered those famous words: “That’s one small step for (a) man, one giant leap for mankind.” In 1960 our rockets were blowing up on the pad, and just nine years later we were walking on the moon. Words simply cannot fully capture what America accomplished in July of 1969. The closest I can come is to say it was beyond extraordinary.

After the moon landing, Armstrong could have cashed in and made untold millions of dollars. Instead, he chose to return to Ohio and lead a quiet life. He taught aerospace engineering at the University of Cincinnati from 1971 to 1979, worked on his farm, served on several commissions including the investigation of the Challenger explosion, accepted membership on several boards, and took a few jobs as a spokesman for companies he believed in. But until the end, he insisted he was no hero. He was only doing his job and was one of many who were responsible for the moon landing. NBC had an excellent story on his death and on those first steps on another world.

His family released a statement after his death responding to the many who had asked what they could do to honor Neil. It said that, in addition to honoring his service, accomplishment, and modesty, the next time you look at the moon, think of Neil Armstrong and give him a wink.

The Curiosity Mission

The other event may not equal the moon landing, but I would still classify it as a WOW! event: the landing of Curiosity on Mars.

Curiosity, which was launched from Cape Canaveral on November 26, 2011, made a powered soft landing on Mars on August 6, 2012. The landing used a technique never before attempted and was nothing short of amazing. The following description is taken from the Mars Science Laboratory/Curiosity Fact Sheet.

Engineers designed the spacecraft to steer itself during descent through Mars’ atmosphere with a series of S-curve maneuvers similar to those used by astronauts piloting NASA space shuttles. During the three minutes before touchdown, the spacecraft slowed its descent with a parachute, then used retrorockets mounted around the rim of an upper stage. In the final seconds, the upper stage acted as a sky crane, lowering the upright rover on a tether to the surface.

And it all worked perfectly. We have sent a number of unmanned missions to Mars, missions that have yielded a great deal of information about the planet. But Curiosity is by far the most sophisticated unmanned probe we have ever launched, and it has the potential to advance our knowledge of Mars exponentially. For more information on this amazing mission, go to NASA’s homepage for the Mars Science Laboratory Mission. A good place to start when you reach the page is the Fact Sheet I referenced above. It is under Mission Resources, located on the right side, and it provides an excellent overview of the mission. The scope of Curiosity’s abilities is nothing short of amazing.

Self Portrait by Curiosity on Mars

We will not be sending men back to the moon anytime soon, and I was disappointed when our new moon program was cancelled. At present we have no vehicle of our own to send astronauts to the International Space Station. But we are moving forward into that final frontier. Private U.S. companies are developing the vehicles that will be sending Americans back into space in a few years. SpaceX has already developed a rocket and capsule that have begun making supply runs to the space station and will soon have the capability to send men into space. NASA, in addition to continuing to send men to the space station and someday to an asteroid and Mars, will be undertaking missions that will increase our knowledge of the earth and unlock the secrets of the solar system. And hopefully in the near future the James Webb Space Telescope will be ready for launch. It may not sound as exciting as some of the other missions, but it should extend our vision to the edge of the universe. And of course, Hubble continues to make discoveries that prove the universe is far more magical and wonderful than we ever imagined.

Our future in the final frontier is bright. And for this old space nerd, it’s going to be exciting.

Environmentalism and the Future

by Matt McKinnon

Let me begin by stating that I consider myself an environmentalist.  I recycle almost religiously.  I compost obsessively.  I keep the thermostat low in winter and high in summer.  I try to limit how much I drive, but as the chauffeur for my three school-age sons, this is quite difficult.  I support environmental causes and organizations when I can, having been a member of the Sierra Club and the Audubon Society.

I find the arguments of the Climate Change deniers uninformed at best and disingenuous at worst. Likewise, the idea of certain religious conservatives that it is hubris to believe that humans can have such a large effect on God’s creation strikes me as theologically silly and even dishonest. And while I understand and even sympathize with the concerns of those folks whose businesses and livelihoods are tied to our current fossil-fuel addiction, I find their arguments that economic interests should override environmental concerns to be lacking in both ethics and basic forethought.

That being said, I have lately begun to ponder not just the ultimate intentions and goals of the environmental movement, but the very future of our planet.

Earth and atmospheric scientists tell us that the earth’s temperature is increasing, most probably as a result of human activity. And even if we severely limited that activity (which we are almost certainly not going to do anytime soon), the consequences are going to be dire: rising temperatures will lead to more severe storms, melting polar ice caps, melting permafrost (which in turn will lead to the release of even more carbon dioxide, increasing the warming), rising ocean levels, and lowering of the oceans’ pH levels (resulting in the extinction of the coral reefs), with devastating floods in some places and crippling droughts in others.

And according to a 2007 report by the Intergovernmental Panel on Climate Change, by 2100 (less than 100 years from now) 25% of all species of plants and land animals may be extinct.

Basically, our not-too-distant future may be an earth that cannot support human life.

Now, in my more misanthropic moments, I have allowed myself to indulge in the idea that this is exactly what the earth needs.  That this in fact should be the goal of any true environmental concern: the extinction of humanity.  For only then does the earth as a planet capable of supporting other life stand a chance.  (After all, the “environment” will survive without life, though it won’t be an especially nice place to visit, much less inhabit, especially for a human.)

And a good case can be made that humans have been damaging the environment in asymmetrical and irrevocable ways since at least the Neolithic Age, when we moved from hunter-gatherer culture to the domestication of plants and animals along with sustained agriculture.  (Unlike the beaver, to take just one example of a “keystone species,” whose dam building has an overwhelmingly positive and beneficial impact on countless other species as well as the environment itself.)

So unless we’re seriously considering a conservation movement that takes us back to the Paleolithic Era instead of simply reducing our current use and misuse of the earth, we’re really just putting off the inevitable.

But all that being said, whatever the state of our not-too-distant future, the inevitability of the “distant future” is undeniable—for humans, as well as beavers and all plants and animals, and ultimately the earth itself.  For the earth, like all of its living inhabitants, has a finite future.

Around 7.5 billion years is a reasonable estimate.  At that point the earth will most probably be absorbed by the sun, which will have swollen into a red giant.

(Unless, as some scientists expect, the Milky Way collides with the Andromeda galaxy first, with cataclysmic effects that cannot be predicted.)

At best, however, this future only includes the possibility of earth supporting life for another billion years or so.  For by then, the sun’s increasing brightness will have evaporated all of the oceans.

Of course, long before that, the level of carbon dioxide in the atmosphere (ironically enough) will have diminished well below the quantity needed to support plant life, destroying the food chain and causing the extinction of all animal species as well.

And while that’s not good news, the worse news is that humans will have been removed from the equation long before the last holdouts of carbon-based life-forms eventually capitulate.

(Ok, so some microbes may be able to withstand the dry inhospitable conditions of desert earth, but seriously, who cares about the survival of microbes?)

Now if we’re optimistic about all of this (irony intended), the best-case scenario is for an earth that is able to support life as we know it for at most another half billion years.  (Though this may be a stretch.)  And while that seems like a really long time, we should consider that the earth has already been inhabited for just over three and a half billion years.

So having only a half billion years left is sort of like trying to enjoy the last afternoon of a four-day vacation.
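The arithmetic behind the analogy actually checks out: three and a half billion years of life so far plus a half billion still to come makes four billion in all, and an afternoon is roughly half of one day out of four.

\[
\frac{0.5\text{ billion years left}}{4\text{ billion years total}} \;=\; \frac{1}{8} \;=\; \frac{0.5\text{ day (one afternoon)}}{4\text{ days of vacation}}
\]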


Enjoy the rest of your day.

Deleting the “Human” Factor

by Wade Maki

History is often divided into ages based upon a particular trend. The age of reason, age of invention, enlightenment, information and industrialization are but a few examples. Some ages are known for conflicts, others for prosperity. As we are 12 years into the 21st century, I’m noticing a trend that may make this the century we delete the human factor from decisions.


“I’m still here! Turn the lights back on!”

This summer my department was moved into new offices. Of course they are not new so much as new to us, with some recent updates and fresh paint. One of the first things we noticed was that the traditional light switches had been replaced with motion sensors that automatically turn the lights on when we enter and off when we are not around. One might be tempted to see this as motivated by convenience, but one would be wrong. The idea here is to save energy (ergo money) by removing the human factor from the equation. Humans tend to leave lights on, and so the automatic sensor is there to handle things without having to rely on flawed human judgment. Even though the motion sensor must use some additional power, it has been determined that the sensor will be more efficient than people. As with anything new, the bugs haven’t been worked out, such as the sensor’s daily tendency to turn off my light when I read, work on my computer, or just sit mostly still for a while. Thus, I must pause in the dark and wave my arms in the air to get the sensor to turn my light back on. True to the trend of deleting the human factor, there is no way for me to override the sensor.
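Presumably someone ran the numbers before ripping out the switches. A minimal back-of-envelope sketch of that calculation might look like the following bit of Python; every figure in it is my own illustrative assumption, not anything from facilities management:

# Rough comparison: the continuous draw of an occupancy sensor versus the
# energy wasted by lights left burning in an empty office.
# All figures are illustrative assumptions, not measurements.

SENSOR_STANDBY_WATTS = 0.5        # assumed constant draw of the sensor
LIGHTING_WATTS = 64.0             # assumed office lighting load (two 32 W tubes)
FORGOTTEN_HOURS_PER_WEEK = 10.0   # assumed hours the lights burn with no one there

HOURS_PER_YEAR = 24 * 365
WEEKS_PER_YEAR = 52

# Convert each side to kilowatt-hours per year for comparison.
sensor_kwh = SENSOR_STANDBY_WATTS * HOURS_PER_YEAR / 1000
wasted_kwh = LIGHTING_WATTS * FORGOTTEN_HOURS_PER_WEEK * WEEKS_PER_YEAR / 1000

print(f"Sensor overhead:     {sensor_kwh:.1f} kWh per year")
print(f"Lights left burning: {wasted_kwh:.1f} kWh per year")
print("The sensor wins." if sensor_kwh < wasted_kwh else "The humans win.")

Under these made-up numbers the sensor’s overhead is about four kilowatt-hours a year against thirty-some wasted by forgetful humans, which is no doubt the sort of result the committee had in front of it. Note what the sketch leaves out: the cost of sitting in the dark waving one’s arms.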


We’ve all been here.

This summer I made several airline flights to conferences and events. At every airport where I used the restroom, I found a similar trend of removing the human factor from the equation. Want soap? Put your hand here and the sensor will decide how much to give you. Want water? Hold your hands here and a pre-determined amount of water will flow. Want a paper towel? Wave your hands and a predetermined (always too small) amount of paper will be dispensed. The goal, as with my light sensor, is to remove my decision from the equation in the name of the efficient use of energy, water, paper, and soap. This became interesting when, in one of the airports, most of the sensors had apparently stopped working, leaving only a couple of sinks operational. If you’ve ever seen a busy airport restroom, this was quite a sight to watch, as dozens of people were dumbfounded (they waved their hands in vain, but no soap or water came). You see, as with my light switch, the ability for an actual human being to turn on the water or pump the soap had been removed, rendering the sinks non-functional.

The trend, then, is for small groups of humans (committees, I’m guessing) to decide that in the name of efficiency, safety, or some other good purpose, systems should be designed to remove human decision-making entirely. Lest you think that it stops at switches and faucets, please know that Google is close to perfecting the self-driving car. As anyone who drives around others knows, the machines can’t possibly do it worse… can they? I suspect this is only the beginning of a century-long trend of deleting the human factor.


Self-driving Google car

If this trend continues, what will life in 2099 look like? My life will be managed by devices that wake me up, remind me where to go and what to do, perhaps even tell me when to shower (and for how long). My car will drive me where I need to go. Manual driving will be too risky to allow, but drinking and riding while on your phone will be perfectly okay. My refrigerator will know what’s inside and order anything I need from the store, which may deliver it (or have my car add a grocery stop to my commute). In addition to managing my life, my devices will track and report my activities to ensure public safety (this is already occurring and will continue to expand). I could go on, but it is enough to note that some of these things we can see coming by 2099, while others are almost here already.

Most will ask the wrong questions about this trend. Most will ask, “How does this make life more efficient and convenient?” Some will ask, “What does this loss of control cost us?” In both cases there will be important points on each side to be weighed. However, perhaps the most important question we should ask is, “How does removing humans from decisions change us?” Who will we become when we make fewer decisions and cede more control to machines (and to those who program them)? By comparison, we know how using cell phones has produced a generation that no longer remembers phone numbers and has forgotten many social graces. What will life be like in 2099? I can give you a rough sketch of that future. Who will people be in 2099? That may be a much more disconcerting question.

Choose Your Own Adventure

by Carrie Levesque

Recently in the Russian Novel of Conscience course we have been discussing Yevgeny Zamyatin’s 1921 dystopian novel We, about a highly mechanized and regimented totalitarian society (the One State) hundreds of years in the future, whose citizens have achieved the ultimate happiness: unfreedom.  Taking as our starting point Marx’s claim that machines, invented to help man, have become the symbols of his servitude, we debate the extent to which machines and technology have enslaved or liberated men in today’s world.

As a class, we’ve compiled a pretty good list of technology’s benefits (efficiency, convenience, online degree programs!) and costs (myriad media addictions, a privileging of online relationships at the expense of face-to-face ones).  At midterm, several students have written excellent papers on how we have created a sort of One State within the United States through certain government policies and technologies which reduce rather than foster our individuality and humanity.

Many of these discussions have stirred up nostalgia for simpler times, when it seems people had different values and a different relationship to one another.  They’ve made me think about a book I read recently on a more extreme response to this question: the back-to-the-land movement (which is a great deal more complex than just ‘living simply,’ but I’m limited to 800 words…).

I grew up in a remote area of northern Maine that has always attracted back-to-the-landers.  What possesses these diehards (who apparently find the southern Maine homesteads of Scott and Helen Nearing’s followers not austere or isolated enough) to haul their few remaining possessions to the place where the logging roads end and call it home, I can’t say for sure.

But I’ll admit to having a touch of that idealism myself: to unplug, to live off the land, to disconnect from our nonstop media and the rampant consumerism of all the latest technology (though you’d have to be insane to choose the wilds of northern Maine, a place with two seasons: Brutal Winter and Rainy Black Fly Infestation).  Though I now prefer a more comfortable climate, I understand the appeal of living in a beautiful natural setting, devoting most of one’s time to work in the outdoors without a care for whatever new technology or entertainment the rest of the world is enthralled with.

Coleman Family

And yet, through a closer examination of life ‘off the grid,’ I’ve also come to a greater appreciation of many benefits of our modern life.  I recently read Melissa Coleman’s memoir This Life Is in Your Hands, about growing up the daughter of famous homesteader and Nearing mentee Eliot Coleman.  She chronicles the great strain that the demands of homesteading put on her family, resulting in her father’s ill health, her baby sister’s tragic death, and her parents’ divorce.  (On a lighter note, she also reveals some of the purist Nearings’ well-kept secrets: Helen’s love of ice cream, mail-order fruit, and other delicacies.  Even the folks who wrote the sometimes-a-tad-righteous book on living local and off the grid indulged a little on occasion.)  Though there were certainly aspects of their lives on the homestead that were richly satisfying, some readers may come away wondering if their chosen cure for the ills of modern life wasn’t in some ways as harmful as the disease, physically as well as spiritually.

Another interesting look at the real-life struggles of those who lived in those idealized ‘simpler times’ is the PBS reality series Frontier House.  In 2001, three families (selected from among some 5,000 applicants!) lived off the land for six months on the simulated frontier of 1880s Montana.  The success of their venture was assessed by historians based on whether each family had put by enough food and fuel over the summer and fall to survive a Montana winter.  Though they labored admirably through all sorts of drama, if memory serves, it was decided that all would have perished.  The simpler times were never as simple as they seem.  (Frontier House is available in UNCG’s Instructional Film Collection, but sadly, not on Netflix.)

There are no easy answers to the question of man’s relationship to technology.  Most people I know lament their dependency on smartphones, social media, and a food supply so highly engineered that many of us have no idea what we’re really eating half the time (pink slime, anyone?).  Yet we have so much to be grateful for.  We live in a time of amazing medical advances.  Whatever may plague or disappoint us in our lives, we have the freedom and resources at our fingertips to research alternatives and connect with like-minded people to find a solution.  For all our similarities, thankfully these United States are not the One State.  Our ultimate happiness is not to be found in our unfreedom, but in our freedom to negotiate these complex choices and relationships, to choose our own adventure.