Tag Archives: science

The Devout Agnostic

by Jay Parr

Sunrise as seen from orbit. Taken by Chris Hadfield aboard the International Space Station.

I am a devout agnostic. No, that is not an oxymoron.

After considerable searching, study, and introspection (and, having been raised in the Protestant Christian tradition, no small amount of internal conflict), I have come to rest in the belief that any entity we might reasonably call God would be so alien to our limited human perceptions as to be utterly, and irreconcilably, beyond human comprehension.

Gah. So convoluted. Even after something like a dozen revisions.

Let me try to strip that down. To wit: Humankind cannot understand God. We cannot remotely define God. We wouldn’t know God if it/he/she/they slapped us square in the face. In the end, we cannot say with any certainty that anything we might reasonably call God actually exists. Nor can we say with any certainty that something we might reasonably call God does not exist.

Splash text: I don't know, and you don't either.

To horribly misquote some theologian (or philosopher?) I seem to remember encountering somewhere along the way, humankind can no more understand God than a grasshopper can understand number theory.

I mean, we can’t even wrap our puny little heads around the immensity of the known physical realm (or Creation, if you prefer) without creating incredibly simplistic, and only vaguely representative, models.

Let’s look at some of the things we do know. With only a handful of notable exceptions, the entirety of human history has happened on, or very near to, the fragile skin of a tiny drop of semi-molten slag just under 8,000 miles across. That’s just under 25,000 miles around, or a little more than two weeks’ driving at 70 mph, non-stop, without so much as a meal or potty break.

Freight train in the American west, looking dwarfed by the landscape, with mountains visible in the far-off distance.

Even that tiny drop of slag can feel pretty vast to our little human perceptions, as anyone who has been on a highway in the American West can tell you: you look out at that little N-scale model train over there and realize that, no, it’s actually a full-sized freight train, with engines sixteen feet tall and seventy feet long and as heavy as five loaded-down tractor-trailers. And even though you can plainly see the entire length of that little train, it’s actually over a mile long, and creeping along at seventy-five miles per hour. Oh, and that mountain range just over there in the background? Yeah, it’s three hours away.

If we can’t comprehend the majesty of our own landscape, on this thin skin on this tiny droplet of molten slag we call home, how can we imagine the distance even to our own moon?

To-scale image of Earth and the Moon, with the Moon represented by a single pixel.

If you look at this image, in which the moon is depicted as a single pixel, it is 110 pixels to the earth (which itself is only three pixels wide, partially occupying nine pixels). At this scale it would be about eighty-five times the width of that image before you got to the Sun. If you’re bored, click on the image and it will take you to what the author only-half-jokingly calls “a tediously accurate scale model of the solar system,” where you can scroll through endless screens of nothing as you make your way from the Sun to Pluto.

Beyond the Moon, we’re best off talking about distances in terms of the speed of light: as in, how long it takes a ray of light to travel there, cruising along at about 186,000 miles per second, or 670 million miles per hour.

On the scale of our drop of molten slag (er, Earth), light travels pretty fast. A beam of light can travel around to the opposite side of the Earth in about a fifteenth of a second. That’s why we can call that toll-free customer-service number and suddenly find ourselves talking to some poor soul who’s working through the night somewhere in Indonesia, which, for the record, is about as close as you can get to the exact opposite point on the planet without hiring a more expensive employee down in Perth.

Earthrise, photographed from Apollo 8, December 24, 1968. (NASA; cropped here.)

That capacity for real-time communication just starts to break down when you get to the Moon. At that distance a beam of light, or a radio transmission, takes a little more than a second (about 1.28 seconds, to be more accurate). So the net result is about a two-and-a-half-second lag round-trip. Enough to be noticeable, but it has rarely been a problem, as, in all of human history, only two dozen people have ever been that far away from the Earth (all of them white American men, by the way), and no one has been any farther. By the way, that image of the Earthrise up there? That was taken with a very long lens, and then I cropped the image even more for this post, so it looks a lot closer than it really is.

Beyond the Moon, the distances get noticeable even at the speed of light, as the Sun is about four hundred times further away than the Moon. Going back up to that scale model in which the Earth is three pixels wide, if the Earth and Moon are about an inch and a half apart on your typical computer screen, the Sun would be about the size of a softball and fifty feet away (so for a handy visual, the Sun is a softball at the front of a semi trailer and the Earth is a grain of sand back by the doors). Traveling at 186,000 miles per second, light from the Sun makes the 93-million-mile trip to Earth in about eight minutes and twenty seconds.
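If you want to check the arithmetic on those travel times, here’s a quick back-of-the-envelope sketch in Python, using the same round figures as this post (the Moon’s average distance, about 239,000 miles, is the one number I’m adding), so expect ballpark answers rather than ephemeris-grade precision:

```python
# Light-travel times, using round numbers.
C = 186_000  # speed of light, in miles per second (roughly)

earth_circumference = 25_000   # miles around the equator, give or take
moon_distance = 239_000        # miles from Earth to the Moon, on average
sun_distance = 93_000_000      # miles from Earth to the Sun, on average

# Halfway around the planet (your call to Indonesia):
print(earth_circumference / 2 / C)  # ~0.067 s, about a fifteenth of a second
# One-way radio signal to the Moon:
print(moon_distance / C)            # ~1.28 s, hence the ~2.5-second round trip
# Sunlight's trip to Earth:
print(sun_distance / C / 60)        # ~8.33 minutes, i.e., 8 minutes 20 seconds
```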

The Sun over the Earth, as seen from the International Space Station.

Even with all that empty space, our three pixels against the fifty feet to the Sun, we’re still right next door. The same sunlight that reaches us in eight minutes takes four hours and ten minutes to reach Neptune, the outermost planet of our solar system since poor Pluto got demoted. If you’re still looking at that scale model, where we’re three pixels wide and the sun is a softball fifty feet away, that puts Neptune about a quarter of a mile away and the size of a small bead. And that’s still within our home solar system. Well within it, if you include the smaller dwarf planets, asteroids, and rubble of the Kuiper Belt (Pluto among them).

To get to our next stellar neighbor at this scale, we start out at Ocean Isle Beach, find the grain of sand that is Earth (and the grain of very fine sand an inch and a half away that is the Moon), drop that softball fifty feet away to represent the Sun, lay out a few more grains of sand and a few little beads between the Atlantic Ocean and the first dune to represent the rest of the major bodies in our solar system, and then we drive all the way across the United States, the entire length of I-40 and beyond, jogging down the I-15 (“the” because we’re on the west coast now) to pick up the I-10 through Los Angeles and over to the Pacific Ocean at Santa Monica, where we walk out to the end of the Santa Monica Pier and set down a golf ball to represent Proxima Centauri. And that’s just the star that’s right next door.
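For the skeptical (or the bored), here’s that same scale model in a few lines of Python. The one assumption I’m adding is that the model maps one Earth-Sun distance to fifty feet, which is what the softball-at-fifty-feet image implies; everything else is round figures:

```python
# The "softball at fifty feet" scale model, sanity-checked.
AU = 93_000_000                    # Earth-Sun distance, in miles
scale = 50 / AU                    # model feet per real mile (assumed: 1 AU -> 50 ft)

earth_diameter = 7_918             # miles
sun_diameter = 864_000             # miles
neptune_distance = 2_790_000_000   # miles from the Sun (about 30 AU)
proxima_distance = 4.24 * 5.88e12  # 4.24 light-years, converted to miles

print(earth_diameter * scale * 12)      # ~0.05 inches: a grain of sand
print(sun_diameter * scale * 12)        # ~5.6 inches: softball-ish
print(neptune_distance * scale / 5280)  # ~0.28 miles: the quarter-mile bead
print(proxima_distance * scale / 5280)  # ~2,500 miles: coast to coast
```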

See what I’m getting at?

What’s even more mind-bending than the vast distances and vast emptiness of outer space is that our universe is every bit as vast at the opposite end of the size spectrum. The screen you’re reading this on, the hand you’re scrolling with—even something as dense as a solid ingot of gold bullion—is something like 99.999999999% empty space (and that’s a conservative estimate). Take a glance at this comparison of our solar system against a gold atom, scaled so that both the Sun and the gold nucleus have a radius of one foot. You’ll see that the outermost electron in the gold atom would sit at more than twice Pluto’s distance.
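The arithmetic behind that claim is simple, if you’ll grant me two rough numbers: a gold nucleus is about 7 femtometers in radius (the standard r ≈ 1.2 fm × A^(1/3) rule of thumb, with A = 197 for gold), and the atom itself is about 144 picometers. Volume scales with the cube of the radius, so:

```python
# How much of a "solid" gold atom is actually stuff, by volume.
# Both radii are rough assumed values, and this ignores the fuzzy
# quantum reality of the electron cloud -- it's a cartoon, not a model.
nuclear_radius = 7e-15   # meters (assumed)
atomic_radius = 144e-12  # meters (assumed, metallic radius of gold)

filled = (nuclear_radius / atomic_radius) ** 3
print(filled)      # ~1.1e-13 of the atom's volume is nucleus
print(1 - filled)  # ~0.99999999999989 -- so "99.999999999%" really is conservative
```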

A gold atom and the solar system, compared at the same scale.

And even though that nucleus looks kind of like a mulberry in this illustration, we now know that those protons and neutrons are, once again, something on the order of being their own solar systems compared to the quarks that constitute them. There’s enough wiggle room in there that at the density of a neutron star, our entire planet would be condensed to a ball a few hundred meters across (collapse it all the way into a black hole, and it would fit in a child’s marble). And for all we know, those quarks are made up of still tinier particles. We’re not even sure if they’re actually anything we would call solid matter or if they’re just some kind of highly-organized energy waves. In experiments, they kind of act like both.

This is not mysticism, folks. This is just physics.

The crux of all this is that, with our limited perception and our limited ability to comprehend vast scales, the universe is both orders of magnitude larger and orders of magnitude smaller than we can even begin to wrap our minds around. We live our lives at a very fixed scale, unable to even think about that which is much larger or much smaller than miles, feet, or fractions of an inch (say, within six or seven zeroes).

Those same limitations of scale apply in a very literal sense when we start talking about our perception of such things as the electromagnetic spectrum and the acoustic spectrum. Here’s an old chart of the electromagnetic spectrum from back in the mid-’40s. You can click on the image to expand it in a new tab.

A 1944 chart of the electromagnetic spectrum.

If you look at about the two-thirds point on that spectrum you can see the narrow band that is visible light. We can see wavelengths from about 750 nanometers (400 terahertz) at the red end, to 380 nm (800 THz) at the blue end. In other words, the longest wavelength we can see is right at twice the length, or half the frequency, of the shortest wavelength we can see. If our hearing were so limited, we would only be able to hear one octave. Literally. One single octave.
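That “one octave” claim is easy to verify: an octave is just a doubling of frequency, so the number of octaves between two frequencies is the base-2 logarithm of their ratio. A two-line check in Python (the ten-octave figure for hearing assumes the usual 20 Hz to 20 kHz range):

```python
from math import log2

# Octaves = log2(high / low), since each octave is a doubling of frequency.
print(log2(800e12 / 400e12))  # visible light, 400 THz red to 800 THz blue: 1.0 octave
print(log2(20_000 / 20))      # human hearing, ~20 Hz to ~20 kHz: ~10 octaves
```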

We can feel some of the longer wavelengths as radiant heat, and some of the shorter wavelengths (or their aftereffects) as sunburn, but even all that is only three or four orders of magnitude, a few zeroes, and if you look at that chart, you’ll see that it’s a logarithmic scale that spans twenty-seven orders of magnitude.

If we could see the longer wavelengths our car engines would glow and our brake rotors would glow and our bodies would glow, and trees and plants would glow blazing white in the sunlight. A little longer and all the radio towers would be bright lights from top to bottom, and the cell phone towers would have bright bars like fluorescent tubes at the tops of them, and there would be laser-bright satellites in the sky, and our cell phones would flicker and glow, and our computers, and our remotes, and our wireless ear buds, and all the ubiquitous little radios that are in almost everything anymore. It would look like some kind of surreal Christmas.

The same scene in visible light and in infrared.

If we could see shorter wavelengths our clothing would be transparent, and our bodies would be translucent, and the night sky would look totally different. Shorter still and we could see bright quasi-stellar objects straight through the Earth. It would all be very disorienting.

Of course, the ability to perceive such a range of wavelengths would require different organs, once you got beyond the near-ultraviolet that some insects can see and the near-infrared that some snakes can see. And in the end, one might argue that our limited perception of the electromagnetic spectrum is just exactly what we’ve needed to survive this far.

I was going to do the same thing with the vastness of the acoustic spectrum against the limitations of human hearing, but I won’t get into it, because acoustics is basically just a subset of fluid dynamics. What we hear as sound is things moving (pressure waves against our eardrums, to be precise), but similar theories can be applied from the gravitational interaction of galaxy clusters (on a time scale of eons) to the motion of molecules bumping into one another (on the order of microseconds), and you start getting into math that looks like this…

Equations from acoustic theory.

…and I’m an English major with a graduate degree in creative writing. That image could just as easily be a hoax, and I would be none the wiser. So let’s just leave it at this: There’s a whole lot we can’t hear, either.

We also know for a fact that time is not quite as linear as we would like to think. Einstein first theorized that space and time were related, and that movement through space would affect movement through time (though gravity also plays in there, just to complicate matters). We do just begin to see it on a practical level with our orbiting spacecraft. It’s not very big (the clocks aboard the International Space Station will drift by only a fraction of a second over its decades-long lifespan), but our navigational satellites do have to adjust for it so your GPS doesn’t drive you to the wrong Starbucks.
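That navigational-satellite correction is a famous, and computable, example. Here’s a rough sketch of the standard textbook estimate for GPS clocks: special relativity slows a fast-moving clock, general relativity speeds up a clock sitting higher in Earth’s gravity well, and for GPS orbits the gravitational effect wins. The numbers are rounded and the model ignores Earth’s rotation and the orbit’s eccentricity, so treat it as an approximation, but it lands on the well-known figure of roughly +38 microseconds per day:

```python
# Relativistic clock drift for a GPS satellite, back-of-the-envelope style.
GM = 3.986e14      # Earth's gravitational parameter, m^3/s^2
C = 2.998e8        # speed of light, m/s
R_EARTH = 6.371e6  # Earth's radius, m
R_GPS = 2.657e7    # GPS orbital radius, m (about 20,200 km altitude)
DAY = 86_400       # seconds per day

v2 = GM / R_GPS    # orbital speed squared, for a circular orbit

sr_slow = v2 / (2 * C**2) * DAY                        # runs slow: ~7 us/day
gr_fast = GM * (1/R_EARTH - 1/R_GPS) / C**2 * DAY      # runs fast: ~46 us/day

print(f"net drift: +{(gr_fast - sr_slow) * 1e6:.0f} microseconds per day")  # ~ +38
```

Uncorrected, a drift of tens of microseconds per day translates into kilometers of position error, since light covers about 300 meters per microsecond. Hence the wrong Starbucks.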

Physicists theorize that time does much stranger things on the scale of the universe, and in some of the bizarre conditions that can be found. Time almost breaks down completely in a black hole, for instance. Stephen Hawking has posited (and other theoretical astrophysicists agree) that even if the expanding universe were to reverse course and start contracting, which has not been ruled out as a possibility, it would still be an expanding universe because at that point time would have also reversed itself. Or something like that; this is probably a hugely oversimplified layman’s reading of it. But still, to jump over to popular culture, specifically a television series floating somewhere between science fiction and fantasy, the Tenth Doctor probably said it best:

The Tenth Doctor’s famous answer: “People assume that time is a strict progression of cause to effect, but actually, from a non-linear, non-subjective viewpoint, it’s more like a big ball of wibbly wobbly, timey wimey… stuff.”

So far we’ve been talking about physical facts. When we get into how our brains process those facts, things become even more uncertain. We do know that of the information transmitted to our brains via the optic and auditory nerves, the vast majority of it is summarily thrown out without getting any cognitive attention at all. What our brains do process is, from the very beginning, distorted by filters and prejudices that we usually don’t even notice. It’s called conceptually-driven processing, and it has been a fundamental concept in both cognitive psychology and consumer-marketing research for decades (why yes, you should be afraid). Our perceptual set can heavily influence how we interpret what we see—and even what information we throw away to support our assumptions. I’m reminded of that old selective-attention test from a few years back:

There are other fun videos by the same folks on The Invisible Gorilla, but this is a pretty in-your-face example of how we can tune out things that our prejudices have deemed irrelevant, even if it’s a costume gorilla beating its chest right in the middle of the scene. As it turns out, we can only process a limited amount of sensory information in a given time (a small percentage of what’s coming in), so the very first thing our brains do is throw out most of it, before filling in the gaps with our own assumptions about how things should be.

As full of holes as our perception is, our memory process is even worse. We know that memory goes through several phases, from the most ephemeral, sensory memory, which is on the order of fractions of a second, to active memory, on the order of tens of seconds, to various iterations of long-term memory. At each stage, only a tiny portion of the information is selected and passed on to the next. And once something makes it through all those rounds of selection to make it into long-term memory, there is evidence in cognitive neuroscience that in order to retrieve those memories, we have to destroy them first. That’s right; the act of recalling a long-term memory back into active memory physically destroys it. That means that when you think about that dim memory from way back in your childhood (I’m lying on the living-room rug leafing through a volume of our off-brand encyclopedia while my mother works in the kitchen), you’re actually remembering the last time you remembered it. Because the last time you remembered it, you obliterated that memory in the process, and had to remember it all over again.

I’ve heard it said that if scientists ran the criminal-justice system, eyewitness testimony would be inadmissible in court. Given the things we know about perception and memory (especially in traumatic situations), that might not be such a bad idea.


Okay.

So far I have avoided the topic of religion itself. I’m about to change course, and I know that this is where I might write something that offends someone. So I want to start out with the disclaimer that what I’m writing here is only my opinion, only my experience, and I recognize that everyone’s religious journey is individual, unique, and deeply personal. I’m not here to convert anyone, and I’m not here to pooh-pooh anyone’s religious convictions. Neither am I here to be converted. I respect your right to believe what you believe and to practice your religion as you see fit, provided you respect my right to do the same. Having stated that…

Most of the world’s older religions started out as oral traditions. Long before being written down they had been handed down in storytelling, generation after generation after generation, mutating along the way, until what ends up inscribed in the sacred texts might be completely unrecognizable to the scribes’ great-great-grandparents. Written traditions are somewhat more stable, but until the advent of the printing press, every copy was still transcribed by hand, and subject to the interpretations, misinterpretations, and agendas of the scribes doing the copying.

Acts of translation are even worse. Translation is, by its very nature, an act of deciding what to privilege and what to sacrifice in the source text. I have experienced that process first-hand in my attempts to translate 14th-century English into 21st-century English. Same language, only 600 years later.

A facsimile page of Sir Gawain and the Green Knight.

Every word is a decision: Do I try to preserve a particular nuance at the expense of the poetic meter of the phrase? Do I use two hundred words to convey the meaning that is packed into these twenty? How do I explain this cultural reference that is meaningless to us, but would have been as familiar to the intended audience as a Seinfeld reference is to us? Can I go back to my translation ten years after the fact and change that word that seemed perfect at the time but has since proven a nagging source of misinterpretation? These decisions matter especially in the translation of sacred texts, where people will hang upon the interpretation of a single word, forgetting entirely that it’s just some translator’s best approximation. Wars have been fought over such things.

The Muslim world might have the best idea here, encouraging its faithful to learn and study their scriptures in Arabic rather than rely on hundreds of conflicting translations in different languages. Added bonus: You get a common language everyone can use.

The Qur’an.

But the thing is, even without the vagaries of translation, human language is, at best, a horribly imprecise tool. One person starts out with an idea in mind. That person approximates that idea as closely as they can manage, using the clumsy symbols that make up any given language (usually composing on the fly), and transmits that language to its intended recipient through some method, be it speech or writing or gestural sign language. The recipient listens to that sequence of sounds, or looks at that sequence of marks or gestures, and interprets them back into a series of symbolic ideas, assembling those ideas back together with the help of sundry contextual clues to approximate, hopefully, something resembling what the speaker had in mind.

It’s all fantastically imprecise (wristwatch repair with a sledgehammer), and when you add in the limitations of the listener’s perceptual set, it’s obvious how a rhinoceros becomes a unicorn. I say “tree,” thinking of the huge oak in my neighbor’s back yard, but one reader pictures a spruce, another a dogwood, another a magnolia. My daughter points to the rosemary tree in our dining room, decorated with tinsel for the holidays. The mathematician who works in logic all day imagines data nodes arranged in a branching series of nonrecursive decisions. The genealogist sees a family history.

Humans are also infamously prone to hyperbole. Just ask your second cousin about that bass he had halfway in the boat last summer before it wriggled off the hook. They’re called fish stories for a reason. As an armchair scholar of medieval English literature, I can tell you that a lot of texts presented, with a straight face, as history bear reading with a healthy dose of skepticism. According to the 12th-century History of the Kings of Britain, that nation was founded when some guy named Brutus, who gets his authority by being the great-grandson of Aeneas (yeah, the one from Greek mythology), sailed up the Thames, defeated the handful of giants who were the sole inhabitants of the whole island, named the island after himself (i.e., Britain), and established the capital city he called New Troy, which would later be renamed London. Sounds legit.

From Sir Gawain and the Green Knight.

In the beginning of Sir Gawain and the Green Knight, Gawain beheads the huge green man who has challenged him to a one-blow-for-one-blow duel, right there in front of the whole Arthurian court, but the man picks up his head, laughs at Gawain, hops back on his horse, and rides off. Granted, Gawain is presented as allegory rather than fact, but Beowulf is presented as fact, and he battles a monster underwater for hours, then kills a dragon when he’s in his seventies.

Heck, go back to ancient Greek literature and the humans and the gods routinely get into each other’s business, helping each other out, meddling in each other’s affairs, deceiving and coercing each other into doing things, getting caught up in petty jealousies, and launching wars out of spite or for personal gain. Sound familiar?

As for creation stories, there are almost as many of those as there are human civilizations. We have an entire three-credit course focused on creation stories, and even that only has space to address a small sampling of them.

BLS 300: Visions.

Likewise, there are almost as many major religious texts as there are major civilizations. The Abrahamic traditions have their Bible and their Torah and their Qur’an and Hadith, and their various apocryphal texts, all of which are deemed sacrosanct and infallible by at least a portion of their adherents. The Buddhists have their Sutras. The Hindus have their Vedas, Upanishads, and Bhagavad Gita. The Shinto have their Kojiki. The Taoists have their Tao Te Ching. Dozens of other major world religions have their own texts, read and regarded as sacred by millions. The countless folk religions around the world have their countless oral traditions, some of which have been recorded and some of which have not.

Likewise, there are any number of religions that have arisen out of personality cults, sometimes following spiritual leaders of good faith, sometimes following con artists and charlatans. Sometimes those cults implode early. Sometimes they endure. Sometimes they become major world religions.

Jim Jones.

At certain levels of civilization, it is useful to have explanations for the unexplainable, symbolic interpretations of the natural world, narratives of origin and identity, even absolute codes of conduct. Religious traditions provide their adherents with comfort, moral guidance, a sense of belonging, and the foundations of strong communities.

However, religion has also been abused throughout much of recorded history, to justify keeping the wealthy and powerful in positions of wealth and power, to justify keeping major segments of society in positions of abject oppression, to justify vast wars, profitable to the most powerful and the least at risk, at the expense of the lives and livelihoods of countless less-powerful innocents.

A lot of good has been done in the name of religion. So has a lot of evil. And before we start talking about Islamist violence, let us remember that millions have been slaughtered in the name of Christianity. Almost every religion has caused bloodshed in its history, and every major religion has caused major bloodshed at some point in its history. Even the Buddhists. And there’s almost always some element of we’re-right-and-you’re-wrong very close to the center of that bloodshed.

The Spanish Inquisition.

But what if we’re all wrong?

If we can’t begin to comprehend the vastness of the universe or the emptiness of what we consider solid, if we can only sense a tiny portion of what is going on around us (and through us), and if we don’t even know for sure what we have actually seen with our own eyes or heard with our own ears, how can we even pretend to have any handle on an intelligence that might have designed all this? How can we even pretend to comprehend an intelligence that might even be all of this? I mean seriously, is there any way for us to empirically rule out the possibility that our entire known universe is part of some greater intelligence too vast for us to begin to comprehend? That in effect we are, and our entire reality is, a minuscule part of God itself?

In short, the more convinced you are that you understand the true nature of anything we might reasonably call God, the more convinced I am that you are probably mistaken.


I’m reminded of the bumper sticker I’ve seen: “If you’re living like there’s no God, you’d better be right!” (usually with too many exclamation points). And the debate I had with a street evangelist in which he tried to convince me that it is safer to believe in Jesus and be wrong, if there is no Christian God, than to be a non-believer and be wrong, if he does exist. Nothing like the threat of hell to bring ’em to Jesus. But to me, that kind of thinking is somewhere between a con job and extortion. You’re either asking me to believe you because you’re telling me bad things will happen to me if I don’t believe you, which is circular logic, or you’re threatening me. Either way, I’m not buying. I don’t believe my immortal soul will be either rewarded or punished in the afterlife, because when it comes right down to it, even if something we might reasonably call God does exist, I still don’t think we will experience anything we would recognize as an afterlife. Or that we possess anything we would recognize as an immortal soul.

To answer the incredulous question of a shocked high-school classmate, yes, I do believe that when we die, we more or less just wink out of existence. And no, I’m not particularly worried about that. I don’t think any of us is aware of it when it happens.

But if there’s no recognizable afterlife, no Heaven or Hell, no divine judgment, what’s to keep us from abandoning all morality and doing as we please: killing, raping, looting, destroying property and lives with impunity, without fear of divine retribution? Well, if there is no afterlife, if, upon our deaths, we cease to exist as an individual, a consciousness, an immortal soul, or anything we would recognize as an entity (which, as I have established here, I believe is likely the case), then it logically follows that this life, this flicker of a few years between the development of consciousness in the womb and the disintegration of that consciousness at death, is, to put it bluntly, all we get. This life, and then we’re gone. There is no better life beyond. You can call it nihilism, but I think it’s quite the opposite.

Because if this one life here on Earth is all we get, ever, that means each life is unique, and finite, and precious, and irreplaceable, and in a very real sense, sacred. Belief in an idealized afterlife can be used (twisted, rather) to justify the killing of innocents. Kill ’em all and let God sort ’em out. The implication being that if the slaughtered were in fact good people, they’re now in a better place. But if there is no afterlife, no divine judgment, no eternal reward or punishment, then the slaughtered innocent are nothing more than that: Slaughtered. Wiped out. Obliterated. Robbed of their one chance at this beautiful, awesome, awful, and by turns astounding and terrifying experience we call life.

Likewise, if this one life is all we get and someone is deliberately maimed, whether physically or emotionally, with human atrocities inflicted upon them or those they love, they don’t get some blissful afterlife to compensate for it. They spend the rest of their existence missing that hand, or having been raped, or knowing that their parents or siblings or children were killed because they happened to have been born in a certain place, or raised with a certain set of religious traditions, or have a certain color of skin or speak a certain language.

In other words, if this one life is all we get? We had damned well better use it wisely. Because we only get this one chance to sow as much beauty, as much joy, as much nurturing, and peace, and friendliness, and harmony as possible. We only get this one chance to embrace the new ideas and the new experiences. We only get this one chance to welcome the stranger, and to see the world through their eyes, if only for a moment. We only get this one chance to feed that hungry person, or to give our old coat to that person who is cold, or to offer compassion and solace and aid to that person who has seen their home, family, livelihood, and community destroyed by some impersonal natural disaster or some human evil such as war.

Syrian refugees.

If I’m living like there’s no (recognizable) God, I’d better be doing all I can manage to make this world a more beautiful place, a happier place, a more peaceful place, a better place. For everyone.

As for a God who would see someone living like that, or at least giving it their best shot, and then condemn them to eternal damnation because they failed to do something like accept Jesus Christ as their personal lord and savior? I’m sorry, but I cannot believe in a God like that. I might go so far as to say I flat-out refuse to believe in a God like that. I won’t go so far as to say that no God exists, because as I have said, I believe that we literally have no way of knowing, but I’m pretty sure any God that does exist isn’t that small-minded.

Albert Einstein.

So anyway, happy holidays.

This is an examination of my own considered beliefs, and nothing more. I won’t try to convert you. I will thank you to extend me the same courtesy. You believe what you believe and I believe what I believe, and in all likelihood there is some point at which each of us believes the other is wrong. And that’s okay. If after reading this you find yourself compelled to pray for my salvation, I won’t be offended.

If you celebrate Christmas, I wish you a merry Christmas. If you celebrate the Solstice, I wish you a blessed Solstice. If you celebrate Hanukkah, I wish you (belatedly) a happy Hanukkah. If you celebrate Milad un Nabi, I wish you Eid Mubarak. If some sense of tradition and no small amount of marketing has led you to celebrate the celebratory season beyond any sense of religious conviction, you seem to be in good company. If you celebrate some parody of a holiday such as Giftmas, I wish you the love of family and friends, and some cool stuff to unwrap. If you celebrate Festivus, I wish you a productive airing of grievances. If you’re Dudeist, I abide. If you’re Pastafarian, I wish you noodly appendage and all that. If you don’t celebrate anything? We’re cool.

And if you’re still offended because I don’t happen to believe exactly the same thing you believe? Seriously? You need to get over it.


Science in a Postmodern Age

by Matt McKinnon


I am not a scientist.

Just like many prominent, mostly Republican, politicians responding to the issue of climate change—trying their best to refrain from losing votes from their conservative constituencies while not coming across as being completely out of touch with the modern world—I am not a scientist.

Of course, if you ask the people who are in fact scientists, then somewhere around 87% of them agree that climate change is real and that it is mostly due to human activity (or at least if you ask those scientists who are members of the American Association for the Advancement of Science, as reported by the Pew Research Center).

Smokestacks.

Then again, if you ask average Americans (the ones who are not scientists), then only about 50% think that human activity is the largest cause of climate change.

That’s quite a disparity (37 points), especially since getting 87% of scientists to agree on anything is not all that easy and arguably represents what we could call a scientific consensus.

This, of course, provides much fodder for comedians like Bill Maher and Jon Stewart as well as many liberals and progressives, who have come to see the problem of science and a skeptical public as a characteristic of contemporary American conservatism.

Bill Nye and Ken Ham.

And this characterization is buttressed by the even more overwhelming discrepancy between the public and scientists on the question of evolution. A 2009 study by Pew found that only 54% of the public believe in evolution (22% of whom believe that it was guided by a supreme being) versus 95% of scientists (where only 8% believe it to be guided by a supernatural power). And that more recent 2014 Pew study bumped the public percentage up to 65% and the scientific consensus up to 98%.

That’s a gap of 33 points, a bit less than the 37 points on the issue of climate change. Sure, there’s something to be said for the idea that contemporary conservatism is at odds with science on some fundamental issues.

But not so fast.

For while there is a large discrepancy between scientists and the American public on these core conservative questions, there is also a large and seemingly growing discrepancy between the public and science on issues that cross political lines, or that could even be considered liberal issues.


Take the recent controversy about immunizations.

Just as with climate change and evolution, a large majority of scientists not only think that they are safe and effective, but also think that certain immunizations should be mandatory for participation in the wider society. That same 2014 Pew study found that 86% of scientists think immunizations should be mandatory, compared to 68% of the public.

And the very liberal left is often just as vocal as the conservative right on this issue, with folks like Jenny McCarthy, who has claimed that her son’s autism was the result of immunizations, despite clear scientific evidence that has debunked any link. At least one study, by Yale law professor Dan Kahan, shows that those who fear childhood immunizations are pretty much split between liberals and conservatives.

Jenny McCarthy.

Still, with an 18-point gap between scientists and the public on this issue, that leaves a lot of progressives seemingly in the same position as those conservatives denying the role of human activity in climate change.

Just as interesting, however, is the discrepancy between scientists and the public on building more nuclear power plants—a gap that is greater (20 points) though scientific opinion is less certain. Pew found that 45% of the public favors more nuclear power compared to 65% of scientists.

But what is even more intriguing is that all of these gaps between scientific consensus and public opinion are far less than the discrepancy that exists on the issue of biomedical science, from the use of pesticides to animal testing and the most controversial: genetically modified organisms (GMOs).

Fruits and vegetables.

That same Pew study found that a whopping 88% of scientists believe that it is safe to eat genetically modified foods, a larger consensus than the one on human activity and climate change, compared to public opinion, which languishes very far back at 37% (a disparity of 51 points!).

And 68% of those scientists agree that it is safe to eat foods grown with pesticides, compared to 28% of the public (a gap of 40 points).

But you won’t find many liberal politicians wading publicly into this issue, championing the views of science over a skeptical public. Nor will you find much sympathy from those comedians either.

Jon Stewart.

It seems that when the proverbial shoe is on the other foot, then it is either not problematic that so many plain old folks diverge from scientific opinion, or there is in fact good reason for their skepticism.

Which brings me to my point about science in a postmodern age. For while it is true that there are good reasons to be skeptical of the science on the use of pesticides and GMOs, as well as some of these other issues, the problem is: who decides when to be skeptical and how skeptical we should be?

Michel Foucault.

That is the problem of postmodernism, which strives for a leveling of discourse and has more than a bit of anti-clerical skepticism about it. For if postmodernism teaches us anything, it’s that the certitude of reason in the modern age is anything but certain. And while this makes for fun philosophical frolicking by folks like Heidegger, Foucault, and Habermas, it is problematic for science, which relies completely on the intuition that reason and observation are the only certain means of discovery we have.

But in a postmodern age, nothing is certain, and nothing is beyond reproach—not the government, or business, or think tanks, or even institutions of higher learning. Not scientific studies or scientists or even science itself. Indeed, not even reason for that matter.

Rotwang, the mad scientist of Fritz Lang’s Metropolis.

The modern era’s moorings in reason have come loose, to some extent, in our postmodern culture. And this, more than anything else, explains the large gaps on many issues between scientific opinion and that of the public.

And in the interest of full disclosure: I believe human activity is causing climate change and that immunizations are safe and should be required, but I am very skeptical of the use of pesticides and of eating GMOs.

But what do I know? I’m not a scientist.

Tim’s Vermeer: The Science of Dutch Art

by Ann Millett-Gallant


Tim Jenison with his “Vermeer” and the equipment used to make it.

I love a good documentary film, especially one about art, so I was happy to receive Bob Hansen’s recommendation of Tim’s Vermeer. It is an eighty-minute film about one man’s quest for art, featuring Tim Jenison, an inventor, video-equipment specialist, and entrepreneur who is fascinated with the paintings of Johannes Vermeer. Called “the painter of light,” Vermeer was a Dutch painter of the seventeenth century, best known for his portraits, interior genre scenes, and inclusion of detail. For more information on the life and work of Vermeer, see the website Essential Vermeer.

The film is narrated by Penn Jillette and directed by Teller, of the famous team of magicians Penn and Teller. Penn and Teller were also part of the team that produced the film, and themes of documentation and magic pervade it. Other major themes include the nexus of art and technology, photography and illusion in art history, digital technology and technological imaging in art, and seeing through photographic reproduction versus human seeing. These themes relate to two of my BLS courses: BLS 345, Photography: Contexts and Illusions, and BLS 346, The Art of Life. Discussions of art history and the media and techniques of artmaking in the film are also relevant to my Art 100 course. Tim’s exposé of Vermeer may be interpreted as challenging the notions of artistic talent and exposing the myth of so-called “genius” painting. Yet, in the process, Tim discovers a newfound awe of Vermeer’s resources and artistic focus.

Tim is most interested in Vermeer’s possible use of early camera technology and reflective devices. Inspired by Vermeer’s Camera (2001), a book by Philip Steadman, professor at University College London, Jenison crosses continents and narratives of art history in pursuit of the truth behind Vermeer’s oil painting The Music Lesson (1662-1664), and eventually attempts to recreate it. Tim says in the film that he feels a kinship with Vermeer as an inventor and musician. He also explains why he decided to focus on The Music Lesson, stating that it is “so complete and self-contained,” compared with all other Vermeer paintings, and he declares it “a scientific experiment waiting to happen.” Through his research and art project, Tim aims to offer an “alternative narrative of Vermeer.”


Johannes Vermeer, The Music Lesson, 1662-1664.

While experimenting with various reflective devices, Tim paints his first portrait from the reflection of a photograph of his father-in-law. He is modest about his painted product and says emphatically that painting it was a decidedly objective experience, rather than a subjective, or personal, one. Tim then zooms in on studying image-making and methods of illusion specific to the seventeenth century, which include optical machines and—most prominently—the camera obscura, an ancestor of early 19th-century photography.


18th-century camera obscura.

Tim then compares the painted details of The Music Lesson to optical effects of photography, concluding that Vermeer depicted photographic seeing, rather than human sight. He states that the appearance of “absolute brightness” in the painting is proof that Vermeer painted from a photographic image, because such light is not visible to the naked eye.

To test his hypothesis, Tim first visits Delft, Holland, where he learns to speak Dutch, to grind pigments, and to mix oil paint. He also studies the light, furniture, and interior architecture. Finally, he hires artists to make exact replicas of the pottery found in the composition of The Music Lesson. Tim runs through the list of craftsmen and engineers he would need as “experts” in building a life-size model of the scene, concluding that, with a computer, he can attempt to complete all the work himself.


Johannes Vermeer, Girl with a Pearl Earring, 1665-1667.

Tim proves that he is a quintessential “Renaissance Man” (although Vermeer was post-Renaissance, historically): to recreate what his self-built camera would capture, Tim constructs a replica of the scene of The Music Lesson in a San Antonio warehouse, from wood, concrete, metal, and glass. Tim’s set is complete with furniture, woodwork, stained-glass windows, and musical instruments. Experimentation leads Tim to discover a system of lenses and mirrors (including a shaving mirror) which, joined with visual color-matching tricks, allows him to build a surprisingly accurate, three-dimensional reproduction of The Music Lesson. Tim also shows his musical skills as he plays the violin that will serve as his model in the composition.

Earlier in the film, as Tim and Philip Steadman each try their hands at copying portraits, the music and tempo of the shots slow down, but they build up again as Tim constructs the room and begins to paint the image from it. More dynamic camera work and background music set the stage for many scenes of Tim painting, in which he uses his daughter and her friends as live models. This pace holds for eight minutes. Time is marked by images of calendar dates in the lower corner of the screen, as if torn from a desktop calendar. Everything slows down significantly after forty days. At about fifty days, Tim makes a discovery: he finds curves in the painting where there should be straight lines. He explains that Vermeer’s so-called mistake in angles of perspective was a result of viewing and painting a photographic image.


Johannes Vermeer, The Art of Painting, 1662-1668

Following this discovery, time drags further as Tim experiences the physical pain of the repetitive work and his seated position, while the viewer watches him paint details such as violin strings, minute decorations on the virginal, and the individual threads of a draping, patterned tablecloth. Somewhere in this period, a threat of carbon monoxide poisoning arises in the studio. I must admit, I do not remember why. I may have zoned out. After eighty to ninety days, Tim becomes “repulsed” while painting a royal blue chair with bronze lion heads on the back and correcting his mistakes with a cotton swab. All in all, the film contains approximately thirteen minutes of footage of Tim painstakingly painting. It feels longer.

In the final scenes, Tim shows Steadman and David Hockney his painting. Hockney, whom Tim has met earlier in the film, is another artist interested in these reflective devices and technologies (see David Hockney’s website here).

Steadman and Hockney discuss Tim’s painting and determine that it is better than Vermeer’s. In the last shot, the humble Tim declares Vermeer an inspiring inventor and artist.


Tim Jenison, The Music Lesson, 2012.

The film chronicles important discoveries and historical revisions, but I am not sure the information alone carries the film. I was just so fascinated by Tim Jenison. He steals the show. He is obviously very smart and skillful, yet also witty, eccentric, and obsessive. It takes him one hundred and twenty days to paint a replica of a famous Vermeer painting, and the whole project, captured on film, took over five years (2008-2013). For that entire time, Tim’s life seems utterly driven by art and photography.


Why All Babies Deserve to Die: Science and Theology in the Abortion Debate

by Matt McKinnon


The debate rages on…

Just a few of the headlines on the abortion debate from the last few weeks:

I would say that the abortion issue has once again taken center stage in the culture wars, but it never really left. Unlike homosexual marriage, which seems to be making steady progress towards a resolution, with a majority of Americans agreeing that the freedom of consenting adults to marry is a basic civil right, the abortion debate continues to divide a populace torn over how to adjudicate the competing basic rights of the mother and the “potential” child.

I say “potential” child because herein is where the real debate lies: exactly when does a fertilized human egg, a zygote, become a “person,” endowed with certain human if not specifically civil rights?


Is it a person yet?

Dougherty’s main point in his article on liberal denial focuses on the “fact” of the beginnings of human life. He claims that liberals tend to make one of two types of arguments where science and human life are concerned: either they take the unresolved legal issue regarding the idea of personhood and transfer it back to the “facts” of biology, concluding that we cannot really know what human life is or when it begins, or they acknowledge the biological fact of the beginning of human life but claim that this has no bearing on how we should think about the legality of abortion.

Both sorts of arguments, he claims, are obscurantist, and fail to actually take into account the full weight of science on the issue.

But the problem, I contend, isn’t one of science: it’s one of theology—or philosophy for those less religiously inclined.

The problem is not the question of “what” human life is or “when” it begins. Dougherty points out:

After the fusion of sperm and egg, the resulting zygote has unique human DNA from which we can deduce the identity of its biological parents. It begins the process of cell division, and it has a metabolic action that will not end until it dies, whether that is in a few days because it never implants on the uterine wall, or years later in a gruesome fishing accident, or a century later in a hospital room filled with beloved grandchildren.


Two-cell zygote. Is this a person?

So basically, human life begins at conception because at that point science can locate a grouping of cells whose DNA can tell us all sorts of things, and this grouping of cells, if everything goes nicely, will result in the birth, life, and ultimate death of a human being.

He even gets close to the heart of the problem when, in arguing against an article by Ryan Cooper, he claims that many people are not fine with the idea that an abortion represents the end of a life, nor are they comfortable with having a category of human life that is not granted the status of “humanity”—and thus not afforded basic human rights.

The problem with all of these discussions is that they dance around the real issue here—the issue not of “human life” and its definition and beginning, but rather the philosophical and often theological question of the human “person.”

If we look closely at Dougherty’s remarks above, we note two distinct examples of why the generation of human life is a “fact”: (1) we can locate DNA that tells us all sorts of things about the parents (and other ancestors) of the fetus and (2) this fetus, if everything works properly, will develop into a human being, or rather, I would argue, a human “person.”

For there’s the distinction that makes the difference.

After all, analyze any one of my many bodily fluids, and a capable technician would be able to locate the exact same information that Mr. Dougherty points out is right there from the first moments of a zygote’s existence. But no one claims that any of these bodily fluids, or the cells my body regularly casts off, are likewise deserving of being labeled “human life,” though the sperm in my semen and the cells in my saliva are just as much “alive” as any zygote (believe me, I’ve looked).

No, the distinction and the difference is in the second example: The development of this zygote into a human person. My sperm, without an egg and the right environment, will never develop into a human being. The cells in my saliva have no chance at all—even with an egg and the right conditions.


Nope, not people.

So the real force of Dougherty’s argument lies in the “potential” of the zygote to develop into what he and anti-abortion folks would claim is already there in the “reality” of a human person.

The debate thus centers on the question of human personhood, what we call theological or philosophical anthropology. For one side, this personhood is the result of a development, achieved sometime during gestation (at “viability,” say) or even upon birth. For others, it is there at conception. For some in both camps it would include a “soul.” For others it would not.

So the reason that the abortion debate is sui generis, or “of its own kind,” is that here the issue is not the rights of a minority versus the rights of a majority, as it is in the debate about homosexual marriage, or even the rights of the mother versus the rights of the child. Rather, the real debate is about when “human life” is also a human “person” (note that this also informs the debate over whether or not to end the life of someone in a vegetative state).


Fetus at four weeks. Is this a person?

To this end, Mr. Dougherty is correct: We can and do know what human life is and when it begins. And he is correct that many are uncomfortable with the idea that abortion means the death of a human life. But he fails to recognize that the reason this is the case is that while those on one side regard this “life” as a human person, others do not. Potentially, perhaps, but not a “person” yet. And certainly not one whose “right to life” (if there even is such a thing: nature says otherwise—but that’s another blog post) trumps the rights of the mother.

So what does all of this have to do with all babies deserving to die? It’s simple: this is what the (necessary?) intrusion of theology into public policy debates entails. Once theological ideas are inserted (and note that I am not arguing that they should or shouldn’t be), how do we adjudicate between their competing claims or limit the extent that they go?

For the two great Protestant Reformers Martin Luther and John Calvin, representing the two dominant trajectories of traditional Protestant Christianity, humans are, by nature, sinful. We are conceived in sin and born into sin, and this “Original Sin” is only removed in Baptism (here the Roman Catholic Church would agree). Furthermore, we are prone to keep sinning due to the concupiscence of our sinful nature (here is where the Roman Church would disagree). The point is that, for Protestants, all people are not only sinful, but are also deserving of the one chief effect of sin: Death.


“For the wages of sin is death.” — Romans 6:23

 

Calvin was most explicit in Book 2, Chapter 1 of his famous Institutes:

Even babies bring their condemnation with them from their mother’s wombs: they suffer for their own imperfections and no one else’s. Although they have not yet produced the fruits of sin, they have the seed within. Their whole nature is like a seedbed of sin and so must be hateful and repugnant to God.

Since babies, like all of us, are sinful in their very nature, and since they will necessarily continually bear the fruits of those sins (anyone who’s ever tried to calm a screaming infant can attest to this), and since the wages of those sins is death, then it’s not a far-fetched theological conclusion that all babies deserve to die. And remember: “they suffer for their own imperfections.”

But they don’t just deserve to die—they deserve to go to hell as well (but that’s also another blog post). And this, not from the fringes of some degenerate religious thinker, but from the theology of one of Protestant Christianity’s most influential thinkers.


A sinner in the eyes of God (according to John Calvin, anyway).

Of course, it should be noted that Calvin does not imply that we should kill babies, or even that their death at human hands would be morally justifiable, though he does argue (and here all Christian theology would agree) that their death at the hand of God is not just morally justifiable, it is also deserved. It should also be noted that the Roman Catholic theology behind the idea that children cannot sin until they reach the age of reason is predicated on the notion that this is only the case once their Original Sin has been removed in Baptism (so Jewish, Muslim, and Hindu kids would be sinful, unlike their Christian counterparts).

Again, this is not to argue that philosophical and theological principles should not be employed in the abortion debate, or in any debate over public policy. Only that (1) this is what is occurring when pro-choice and anti-abortion folks debate abortion and (2) it is fraught with complexities and difficulties that few on either side seem to recognize.

And contrary to Mr. Dougherty, this is beyond the realm of science, which at best tells us only about states of nature.

But the only way we have a “prayer” of real sustained dialogue—as opposed to debates that ignore our competing fundamental positions—is to take seriously the philosophical and theological issues that frame the question (even if my own example is less than serious).

But I’m not holding my breath. I would most certainly die if I did.

Environmentalism and the Future

by Matt McKinnon

Let me begin by stating that I consider myself an environmentalist.  I recycle almost religiously.  I compost obsessively.  I keep the thermostat low in winter and high in summer.  I try to limit how much I drive, but as the chauffeur for my three school-age sons, this is quite difficult.  I support environmental causes and organizations when I can, having been a member of the Sierra Club and the Audubon Society.

I find the arguments of the climate-change deniers uninformed at best and disingenuous at worst.  Likewise, the idea of certain religious conservatives that it is hubris to believe that humans can have such a large effect on God’s creation strikes me as theologically silly and even dishonest.  And while I understand and even sympathize with the concerns of those folks whose businesses and livelihoods are tied to our current fossil-fuel addiction, I find their arguments that economic interests should override environmental concerns to be lacking in both ethics and basic forethought.

That being said, I have lately begun to ponder not just the ultimate intentions and goals of the environmental movement, but the very future of our planet.

Earth and atmospheric scientists tell us that the earth’s temperature is increasing, most probably as a result of human activity.  And that even if we severely limited that activity (which we are almost certainly not going to do anytime soon), the consequences are going to be dire: rising temperatures will lead to more severe storms, melting polar ice caps, melting permafrost (which in turn will lead to the release of even more carbon dioxide, increasing the warming), rising ocean levels, lowering of the oceans’ pH levels (resulting in the extinction of the coral reefs), devastating floods in some places, and crippling droughts in others.

And according to a 2007 report by the Intergovernmental Panel on Climate Change, by 2100 (less than 100 years from now) 25% of all species of plants and land animals may be extinct.

Basically, our not-too-distant future may be an earth that cannot support human life.

Now, in my more misanthropic moments, I have allowed myself to indulge in the idea that this is exactly what the earth needs.  That this in fact should be the goal of any true environmental concern: the extinction of humanity.  For only then does the earth as a planet capable of supporting other life stand a chance.  (After all, the “environment” will survive without life, though it won’t be an especially nice place to visit, much less inhabit, especially for a human.)

And a good case can be made that humans have been destroying the environment in asymmetrical and irrevocable ways since at least the Neolithic Age, when we moved from hunter-gatherer culture to the domestication of plants and animals along with sustained agriculture.  Humans have been damaging the environment ever since.  (Unlike the beaver, as only one example of a “keystone species,” whose dam-building has an overwhelmingly positive and beneficial impact on countless other species as well as the environment itself.)

So unless we’re seriously considering a conservation movement that takes us back to the Paleolithic Era, instead of simply reducing our current use and misuse of the earth, we’re really just putting off the inevitable.

But all that being said, whatever the state of our not-too-distant future, the inevitability of the “distant future” is undeniable—for humans, as well as beavers and all plants and animals, and ultimately the earth itself.  For the earth, like all of its living inhabitants, has a finite future.

Around 7.5 billion years or so is a reasonable estimate.  And then it will most probably be absorbed by the sun, which will have swollen into a red giant.

(Unless, as some scientists predict, the Milky Way collides with the Andromeda galaxy, resulting in cataclysmic effects that cannot be predicted.)

At best, however, this future only includes the possibility of earth supporting life for another billion years or so.  For by then, the increase in the sun’s brightening will have evaporated all of the oceans.

Of course, long before that, the level of carbon dioxide in the atmosphere (ironically enough) will have diminished well below the quantity needed to support plant life, destroying the food chain and causing the extinction of all animal species as well.

And while that’s not good news, the worse news is that humans will have been removed from the equation long before the last holdouts of carbon-based life-forms eventually capitulate.

(Ok, so some microbes may be able to withstand the dry inhospitable conditions of desert earth, but seriously, who cares about the survival of microbes?)

Now if we’re optimistic about all of this (irony intended), the best-case scenario is for an earth that is able to support life as we know it for at most another half billion years.  (Though this may be a stretch.)  And while that seems like a really long time, we should consider that the earth has already been inhabited for just over three and a half billion years.

So having only a half billion years left is sort of like trying to enjoy the last afternoon of a four-day vacation.


Enjoy the rest of your day.

The Good, the Bad, and the Caffeinated

By Marc Williams

This morning, as I sat with my oversized mug, finishing off the last of what had been nearly a full pot of coffee, I came across yet another article on the effects of coffee on one’s health.  My coffee mug is an extension of my arm: when I’m emailing students, preparing a new lesson, or grading papers, my coffee is always within reach.  As a major coffee drinker (and serious snob) I’ve spent a good deal of time trying to discover if my daily dose of caffeine, size extra grande, was actually doing harm.

Happily, I’ve found much research that suggests my habit is quite healthful: coffee is linked to reduced risk of certain cancers, Alzheimer’s, and diabetes, not to mention its ability to increase alertness.  However, sometimes my consumption borders on excess, and the ill effects of high coffee intake have been well-documented: increased risk of certain other cancers and acid reflux, plus caffeine addiction that can lead to chronic headaches, etc. etc. etc.

So is coffee good for me or bad for me?  I’m confused.

According to Christie Aschwanden of Slate.com, the confusion is widespread, and the uncertainty about coffee’s effects on health is nothing new.  She mentions Mark Pendergrast, author of Uncommon Grounds: The History of Coffee and How It Changed the World.

According to Pendergrast’s book, coffee has stimulated intellectual and often irreverent pursuits among users throughout the ages, often sparking backlash. One governor of Mecca banned the drink after discovering satirical musings about him coming from local coffeehouses. In 1674, a group of London women grew angry with their husbands for spending so much time at coffeehouses (often in an attempt to sober up after the pub), and published a pamphlet warning that the beverage would make them impotent. The men fought back with a competing pamphlet claiming that coffee actually added a “spiritualescency to the Sperme.” In 1679, French doctors blasted coffee, because it “disaccustom[ed] people from the enjoyment of wine.”

While the debate’s historical component is fascinating, I want answers.  According to Aschwanden’s article, University of Alabama physician Melissa Wellons compiled the various medical studies and concluded that most of the findings about caffeinated beverages are “observational,” meaning that causality has not been adequately demonstrated.  In comparing these observational effects side by side, Aschwanden concludes that the positive effects outweigh the negative.

So it appears, at least for now, I can slurp away.