I threatened that I would have more on this topic, and I don’t issue empty threats. Herewith, a little trivia about a curious structure: the Vehicle Assembly Building at Kennedy Space Center, Cape Canaveral, Florida.

With the race to the moon came, naturally enough, a significant infrastructure to support the endeavor, and the most visibly prominent part of it in this area of Florida is the building used for final assembly of the impressive Saturn V booster that carried Apollo on its way. Standing 110 meters (363 feet) tall, the Saturn V needed a very big building to be stacked together within, especially since it was built atop the Mobile Launch Platform and driven, upright, to the launch pad, and needed a launch gantry that stood even taller. So, 122 meters (400 feet) for the gantry and another 14 meters (45 feet) for the launch platform and crawler means you need one hell of a garage door. Then called the Vertical Assembly Building, this big plain block structure is visible for kilometers across Florida but is deceptive in appearance, because it stands alone and has absolutely no features that provide any sense of scale at all.

Shown here getting ready to receive an external tank for the space shuttle orbiters, the building doesn’t really give much of an idea of its size, and in fact even the external tank is tricky, but if you look at the yellow brace near the top end of the tank to the right, at the base of that brace is someone in blue standing right alongside the tank. Clicking on this image will open the full-resolution image in another window, by the way, if you would like to see the detail (photo courtesy of NASA.) And at the base of the VAB, that open door is wide enough for the external tank soon to pass through. The forced perspective is tricky, isn’t it?

But that’s not really enough either, so look at the crossbar throwing a shadow on the face of the building, about 3/4 of the way up in the center. Have a nice close look:

Those little pixels just barely visible now provide an impression of what they really are: a couple of men on a scaffolding on the side of the building (doing what, I’m not really sure, but better them than me.) Now do you get an idea of the scale? The blue starfield portion of the American flag painted on the side of the building is the size of a basketball court; each stripe is the width of an average road lane. The VAB stands 160 meters (526 feet) tall, not quite half the height of the Empire State Building, but four times the volume. Even more interesting, it is 55 meters (180 feet) higher than the highest ground elevation in Florida – yes, the state is very flat; swamps don’t tend to pile up very high.

Inside it’s just one big open space, with several powerful cranes to assemble the various launch vehicles – the Saturn V initially, and now the shuttle orbiters, an era that ends in a couple of months with the last space shuttle launch. It is highly likely that it will continue duty with the new line of space vehicles, whatever that turns out to be (the lack of contingency in the program is appalling, but I’m one of those who refuse to put the blame on NASA – Congress is the body that approves all plans and funding.) In the right conditions, clouds form inside the building due to the humidity and temperature differential, something that would probably not happen anyplace other than Florida, the nation’s sauna. Outside, you can see the square shape from quite a distance away, a child’s toy block in the middle of an empty carpet.

The northern part of the island that holds the Cape is Merritt Island National Wildlife Refuge, which I’ve mentioned before, and on more than one occasion I’ve had to change my shooting angle when photographing wildlife there to prevent the building from being in the frame (or, in this case, I simply decided to use it as a distinctive setting.)

When watching the night launch of Mission STS-113, I missed an opportunity I really didn’t know I’d have. Both the shuttle launch pad and the VAB are floodlit for night launches, and my viewing angle placed them only a short distance apart, easily seen together with even a moderate-angle lens. Rockets heading into orbit always seem to be falling back towards Earth as they transition gradually from vertical to horizontal and start following the curve of the planet, but that evening the orbital path lay almost directly away from me. As the shuttle dwindled to a point of light and discarded the Solid Rocket Boosters, only the main engines remained to be seen, and the arc was very tight, almost a straight up/straight down affair. It disappeared from view only a little above the horizon, and directly above the VAB. Had I known and been prepared, I could have had a spare camera set up with a wide angle lens, capturing the entire arc from launch to vanishing, pad to VAB. Nuts.

But I have to give James Vernatocola credit for composing this great shot of the arc with the foreground details.


Respect. Oh, sweet baby rhesus, how that word is abused! From my own warped point of view (or at least, from my perspective based on the media I choose to examine,) this is perhaps the key word to define the past decade – not because it was particularly respectful, but because that was what everyone thought they deserved and decried not receiving. The ’70s were famously dubbed the “Me” decade; the ’00s might be considered the “Respect Me!” decade. I would like to think that this will pass in the next decade, but we’ll just have to see.

People don’t seem to understand what the word actually means. They demand respect for their views, for their practices, for their lack-of-respect for others. But respect does not translate to “right,” as in, the rights someone may have as a human, as a citizen, whatever. In the US, for instance, we have the right to follow whatever religion we choose – and frankly, no one can enforce or deny what we personally believe, obviously. But this does not mean anyone must respect that belief. Anyone has the right to believe what they want, and everyone else has the right to believe they’re ignorant loons. That’s how rights work.

Respect, however, is a personal quality, an opinion, a value judgment. One does not demand that an opinion favor them, or that everyone agree to the same values. Respect is earned, despite the impression we might have culturally – for instance, the forced respect of military hierarchy (which isn’t actually respect, but discipline,) or the respect we are expected to have for community leaders or even the elderly. Respect isn’t even provided by laws – the best they can offer is protection, and at most they imply an attitude of respect.

Our culture is a bit confused over this issue, though. Still laboring under the supposed virtue of “political correctness,” we tend to hear people calling for respect and we pause or even give way, instead of the very simple and appropriate response, “You don’t demand that from me, buddy boy! Show me you’re worth it!” But we’ve gotten so far away from this now that people with some really whacked ideas and practices gain far too much attention for holding an opinion and thinking they’re special for that. It has really come down to the vain idea that one person holding an opinion supersedes anyone else holding an opposing one.

Such an attitude, however, destroys the very meaning of the word. Respect used to be something sought after precisely because it was a measure of accomplishment, of regard. You gained respect because you showed that your views were more appropriate, beneficial, or intelligent than average, because your skills exceeded most expectations, because you succeeded where others failed, or even because you demonstrated some self-improvement. It held importance because pleasing a majority of people meant you could provide the greatest benefit to society, or recognized that collective advances work better than individual competition. It was a measure of cultural selection, reinforcement of the benefits of cooperative society. We wanted it because we have internal drives to seek social elevation – that’s how our species works. To think that respect should be reduced to an automatic deference, to the mere recognition of individuality and opinion, actually denies that individuality in the first place.

This isn’t what those demanding respect actually want, though. They really do want to be elevated above others – they just don’t want to work for it.

This is a trap we can’t afford to fall for. No one has to respect another opinion; no one should be kept from disagreeing. Our ability to separate the bad from the good is the only thing that can possibly work to advance us, in either big or small ways. If someone has a dissenting opinion, it has as much right to be heard as any other.

Even more importantly, we often have a hard time speaking against the perceived majority – we don’t want to isolate ourselves among a group of adverse opinions. But think what happens if everyone feels that way – how do you know what majority opinion even is? If one person speaks their mind, and everyone else stays quiet because they don’t want to stand alone in dissent, you achieve a majority of one with all others abstaining. That’s ludicrous.

While it may sound hypothetical, this happens all the time. In discussions centered on fundamentalists and anti-social practices, I have seen an untold number of moderately religious folk take offense, avowing that they do not want to be lumped in with the fundies. And while I appreciate this sentiment, I find it particularly tiresome – because those same moderates are nowhere to be seen when fundamentalists, always regarded as an insignificant minority, define the path that religion takes. When Westboro Baptist Church parades around redefining both “intolerance” and “fucking asshole,” I have never seen any religious figure, no matter how prominent, speak against them. When some religious leader makes reprehensible opportunistic statements about disaster victims deserving their fate, in a crass attempt to capitalize on human suffering, I have never seen moderates lambaste the practice. When a politician stands up and blurts some pandering religious platitude, I have never seen any religious person of any level remind anyone that political office requires a neutral stance on religion. However, when treated with the lack of respect that necessarily follows from remaining silent in the face of religious impropriety, they cry that they did not support those actions, and apparently carried dissent in their hearts.

If I had more than four people reading this blog, I’d attempt to coin a term: “closet respectable,” referring to those who hold standards that they simply will not display or communicate. It reminds me of the “boyfriend in the next town” that high-school girls seem to have fairly frequently, the one no one ever gets to meet.

We cannot afford to treat respect as a right, as a bumper sticker rewarding non-accomplishment. Remaining silent in the face of what we disagree with produces nothing of any benefit. Being afraid to stand out merely lowers the standards of our society. Respect is earned, and it should be a challenge to meet its criteria. If we fail to seek honest respect, we’re not providing any benefit, to others or even ourselves. And if we do not hold that bar of respect high for everyone, we allow our society to sink further toward mediocrity, failing everyone including ourselves.

If someone thinks they have respect because of their title, such as “christian,” “Democrat,” “white,” “male,” “supervisor,” “owner,” “high-salaried individual,” “doctor,” “feminist,” and so on… they’re almost certainly not thinking of respect in its intended definition. If they feel they’re respected by others holding the same titles, they perhaps need to ask if this is truly respect, or simply the lip-service paid by others just to garner the same attitude back towards themselves, mutual self-congratulation. And, of course, if this “respect” within their title is enough.

If we want honest respect, we should be prepared to cultivate it, raise it, groom it, and nurture it – always being aware that it comes from other people. The secret is to make them happy, and proud to bestow it upon us. We do not steal it from them, or take it as taxes; we receive it in trade for being respectable. And if we are not receiving it, then what we are offering must not be worth it.

I hate it when I’m slow

A few years ago when living in Florida, I kept a journal about wildlife observations, which included no small amount of speculation on what I was seeing. It’s interesting to look back through it and see how certain things solidified as I found out more information or made subsequent observations, and I’ll probably feature some parts of it in posts later on.

On occasion, this blog will reflect it too, like the sudden dawning I had yesterday on a post from a few days back. At the end of that post, I surmised that the value that we place on tradition was so powerful, it seemed almost like an evolutionary trait. The dawning came when I realized that it was, and we’re already well aware of how it works. Kindly note that I have confirmed none of this, and will gladly (well, maybe not gladly, but willingly) retract it if someone comes along and tells me how I’m talking bollocks.

Understanding human behavior sometimes comes when you break it down into core actions, rather than the assigned properties with which we view such behavior from minds that enjoy dealing in abstract concepts – in other words, if you think of us as mere animals (which we are.) “Tradition” then becomes an instinct to follow past examples, or to reduce that even further, to copy our parents. That this is an evolved trait seems abundantly obvious – it’s how we learn to talk, and to parse the nuances and rules of language. It’s how we know what to eat. If we didn’t have this drive, we’d take forever to develop, or really, may not develop very well at all. Independence doesn’t work that well when you’re not very functional for the first stages of your life.

We can see this in other species, and this is the part that made it click in my head. Back in Florida, there were muscovy ducks that lived in the pond at the apartment complex, and I watched them raise a few broods there. Everyone knows the folklore about ducklings and the first thing that they see upon hatching, and following around some other animal they think is “mom,” but the reality is, birds do imprint on behavior all too easily, a trait that wildlife rehabilitators have to be aware of lest they raise a bird that does not know it’s a bird, and cannot cope on its own in the wild. Ducklings, like many other species, know instinctively to take their cue from momma, and will copy her behavior automatically. When she preens, they preen, all together.

It’s remarkable to observe, because the ducklings don’t appear to be watching their mother at all, and the sudden onset of preening seems almost simultaneous, but momma always starts first. And no, the two in the back aren’t lacking this behavior, but if you watch birds preen, they do brief sessions and pause, taking a moment to ensure that predators haven’t started closing in while their attention is elsewhere, another instinctual mechanism. I just happened to catch them during this pause.

Considered from this angle, it’s easy to see why “tradition” even became a concept in the first place – it puts a name to the instinct to follow behavior and learn from others. It’s another example of what interesting organs our brains really are. We have automatic functions, like breathing and pain response, and we have subconscious, instinctual functions, like being aware of danger and seeking mates, and then we have the deliberate functions like cognitive thought. But the cognition part relies on the other two, and we have a hard time distinguishing deliberate (“rational”) thought from the instincts that we have. In fact, we’re very often in denial of the parts played, since we tend to feel that only “animals” (meaning everything but us) rely on instincts, but we vaunted humans do everything deliberately – the whole “free will” concept. It’s total vanity, of course, as only brief reflection will demonstrate, but it’s an insidious belief.

It gets worse. When we fail to recognize that subconscious, inherited behavior plays a large part in our thinking processes, we fall into a trap of believing that everything we do is part of a rational process – we intended to do it, and will even make up excuses as to why we engage in such instinctual behavior: “rationalizing.” The ugly catch becomes that we purposefully avoid engaging the truly rational part of our brains to overcome instinctual behavior that may not apply to a particular situation, simply because we deny that we have such instincts. The failure to recognize it can lead to remaining a pawn of it.

Much of what we have built our culture around consists of extensions of such instinctual traits, the attempts to take vague urges and feelings and embellish them into important social structures. Tradition is of course one example, and much earlier I pointed out that space exploration might even be another. Facebook actually takes advantage of our desire to build a community of “friends” without any of the effort involved in actually maintaining what we once considered a friendship – it’s prompted by the very name used, “friend,” rather than, “someone I once knew, or maybe someone who knows someone I knew, that clicked on a link in hopes of reciprocation” (SIOKOMSWKSIKTCOALIHOR for short.) We have such a strong desire for social reciprocation and cooperation that we actually get frustrated when life isn’t fair, and think that if bad things happen to us, there must be a reason. Even the trait of curiosity, of determining how things work (which I’m engaging in right now, and hopefully you are too) leads us to believe, all too often, that the entire universe has a reason, when we don’t even have a reason for fire ants (rotten little bastards.) When we think that something has to be the case, perhaps we need to stop and think about whether there’s a distinct rationale behind such a standpoint, or if we favor it because, at some point in time, it helped us survive to think that way.

And now, I ask a sneaky little question: how many people stopped to read this post because of the picture of cute little ducklings? What might you suppose was at work in that case?

[I readily admit that this was not planned, and the duckling behavior memory really did lead me down this road, but I realized how it might work while writing the paragraph above.]

Breaking with tradition

[Originally, I wrote most of these thoughts as a separate article to try and get published, but since the prospect of actually getting paid to write has all but vanished (I knew I should have gone into throwing balls around,) I might as well at least make it public. Granted, a blog is a version of “public” much like the notice of intended demolition of Arthur Dent’s house, but anyway…]

Let’s talk about tradition. Such a simple word, but almost amazing in what it can convey. In virtually every usage, it conjures up an aura of respectability, of culture. Practices handed down through generations, techniques or languages or clothing or entertainment preserved, sometimes painstakingly, from older origins. Just uttering the word in response to a question is almost always a perfectly sufficient answer: “Why? It’s tradition, of course!” Even religion pales before the explanatory power of the word, and in many cases, relies on it. How many words can you think of that communicate so well and require no further support?

But here’s the funny part of it all: ask someone why. Why is “tradition” so complete an answer? Why do we hold the concept of tradition up so highly? And do you get slightly uncomfortable even asking that question? If you imagine asking that of some friend or family member, does their potential response make you cringe? I think most of us would have little difficulty finding someone who might respond rather sharply to such a question. And that, in and of itself, should make us more aware of the power of the word.

Merriam-Webster has this to offer as the primary definition of tradition:

…an inherited, established, or customary pattern of thought, action, or behavior (as a religious practice or a social custom).

That sounds almost too simple to invoke the response in ourselves that it usually does. Tradition is respect for our forebears, and recognition of our cultures. It is preservation of rituals, and continuation of the “line” (whatever that line may be). It is the bearing of the torch, the survival of something we identify with. Well, now, that’s all right then – survival is important, the prime goal of life itself. No wonder it’s such a powerful word.

Until, of course, you compare this concept against the things we normally associate with tradition. Turkey dinners for the holidays? Well, now, I suppose survival isn’t really in question there – soylent green could work as well (perhaps that’s a bad example when we’re talking about survival). Wedding ceremonies? But more and more people are participating in less traditional ceremonies these days, sometimes none at all. Cultural dress or dance? Can we honestly say dancing or neckties or frills have anything to do with survival? From a practical standpoint, is there much of anything in traditional practices that would be detrimental if we ceased to observe it?

Sure, there’s an argument for preserving a culture. Tradition is what keeps alive many of the facets that define a culture to begin with. But again, is this more the power of the word than the importance of the culture or practice? We know rain dances are just a reflection of culture rather than a method to ensure adequate sustenance for crops. If we’ve never seen a rain dance, are the chances high that we will be at a disadvantage because of it? If we no longer know how to properly dye the family colors, can we reasonably say that the world is poorer for it? Those colors could be considered a representation of the family heritage, a coat of arms if you will, or they could simply have been the hue of ochre that came from the local clay. Had the family been given the choices we have now, maybe they wouldn’t have chosen those colors at all.

Looking still deeper, in many cases tradition is a matter of belonging, of marking the distinction of a particular group of people. Our family, our tribe or village, our land, our country – sometimes these are kept alive simply through the traditions that have been passed along, and often these traditions are the last remaining distinctions long after the other boundaries have vanished – “this is the way we did it in the old country.”

But there are two interesting factors behind this idea. The first is that things change, for good or bad. It could be argued in many ways, but one is that change occurs because the “old ways” are no longer functional, needed, or wanted. Tradition, in such cases, is a resistance to change, but it may be against the tide. Respect for the old ways is not necessarily a bad thing, but perhaps respect for ways should be tempered by recognizing which ways are respectable in the first place. “Tradition” isn’t particularly meaningful in and of itself – there is a difference between a song that records the history of a culture, and a song that speaks simply of lost loves, or even holes in buckets.

The second interesting factor behind the community idea of traditions is that “community” not only speaks of togetherness, but of separation at the same time: those who are not part of the community. The second message behind, “We are the ones who wear the blue and black,” is, “…and you are the ones who do not.” This may seem to be a dramatic take on tradition, but family colors were exactly the way that clans told one another apart on the battlefield. Often, this idea has become lost in time, and the tradition does not recognizably reflect its bloody origins anymore. But in such a case, what is the tradition we’re keeping alive in the first place?

Right now, numerous cultures embrace traditions that, from an outsider’s standpoint, may be anything from ludicrous to abusive, even self-destructive. Respect for tradition, in such cases, may be radically misplaced – “tradition” is hardly an adequate argument for racism, mutilation, poverty, poor health, or countless other detrimental effects. Some cultural ideas do indeed deserve to die out and vanish in the mists of time – change can be for the better. But we can’t see this if we are swayed by the power of a word without wondering what lies behind it.

I had a little more to the article than this, but this point allows me to go on to the thought that stirred this post in the first place. Over at Choice in Dying, Eric MacDonald has a recent post regarding the definition of “New Atheism,” (well, kind of – Eric doesn’t stick to narrow topics,) and within, he talks about examining the histories of scripture and its foundations as divine inspiration:

And then he goes on to quote Irenaeus to the effect that the church did not create the canon; it was instead acknowledged, conserved, and received — as though, in other words, from the very hand of God himself.

But this, quite evidently, simply will not do. We still go back and back, and when we get to the end of a chain of traditions, we find someone with a pen! A human being, just like you and me! So the church, just like the Muslim authorities, took some human writings, no matter how fenced round with sanctity, and then elevated these writings to a stature they simply do not and cannot possess.

Which is where the two ideas came together. The original scribes almost certainly did not run out of their house waving a manuscript wildly and claiming god gave them this great idea for a book. Instead, older writings were selected by church authorities as reflecting divine inspiration (while, as Eric points out, others were not, in a rather arbitrary manner.) But the acceptance of such scripture by the general public, then as it certainly is now, relies on this value of tradition. The strong drive to elevate and indeed revere older sources of wisdom is precisely what gives them value and authenticity.

This idea is supported in three ways. The first is that this is exactly why religion remains active today. Virtually no one chooses their religion, or is ever convinced by reading scripture that it must be accurate – the number of excuses for the inaccuracies is evidence of that. Instead, people (usually in childhood) are told that scripture reflects the will of the supreme being, and of course, they get to see the elaborate support structure that has grown up around it, the reverence that others place upon it. With no small number of older artifacts and icons, as well. Which is more compelling and interesting: a nice new modern church, or an old church with ridiculously outdated architecture? You know what I mean: the traditional style.

The second way that this is supported is with the histories of the texts themselves. Most of the abrahamic scriptures consist of retelling – almost none of them are contemporary, and even those portions claimed to be from disciples, for instance, show signs of having been written long after the events they relate. The most powerful stories are all historical, in that they do not tell what happened “today,” but many years (centuries!) previously. In fact, the explanation for the age of these stories is often that they were retold with perfect accuracy as oral tradition. This is plainly ludicrous, but such is handwaved away by saying that this tradition was important (which somehow makes it superhuman, it seems.)

And finally, there’s this nasty little fact that many facets of religious scripture have close counterparts in previously existing religions, such as the moses and bulrushes story and several different versions of resurrections. The date of christmas and most of the traditional practices thereof predate christianity (scriptural details point to a spring birth for jesus,) but they were co-opted precisely because they were already traditional. It was easier to morph the whole belief structure into a characterization of previous beliefs than it was to instill a new structure against the power of tradition.

Isn’t that almost frightening? Tradition isn’t just a word, it’s a wickedly motivating force. It raises the question as to whether this is a powerful cultural thing, perhaps one of the most powerful considering how many cultures it spans, or if there’s some kind of internal drive to respect older knowledge over seeking newer knowledge. Is it possible (or even worth speculating on) that there’s some form of evolved mental trait that causes us to fall for the concept of tradition? Tradition itself is difficult to justify rationally, and in all of the history I just outlined above, cultures have changed drastically, but tradition itself remains. It’s something to think about.

[Update: I did, actually – see the expansion of this speculation in the next post.]

Hooray! I scored a “Not Negative!”

Update September 2012 – This was one of the sample posts chosen for the podcasting experiment; click below to listen, if you like (it is identical to the text):
Walkabout podcast – Hooray! I scored a “Not Negative!”

There’s a common argument style that crops up in defense of most of the topics that critical-thinking addresses, such as paranormal activity, alien visitation, religion, alternative medicine, psychic powers, crystal energy, divination, astrology, and the health benefits of smearing yourself with feces. And it’s a very simple one, but unfortunately many skeptics and critical-thinkers fall for it. Paraphrased, the argument is, “You cannot prove that [insert topic] doesn’t exist/work.” In other words, “You can’t prove that god doesn’t exist!” and “You can’t prove that the planets don’t influence our lives!”

Now, I’ve addressed this before, most especially with the direct fact that one cannot prove a negative – no one can demonstrate that god cannot exist in some realm we haven’t discovered, or that the alignment of planets does not exert a force we have not found a way to measure. Sure, I’ll openly admit it! You won that round!

Except… what was just won? The argument establishes that [insert topic] might be possible, simply because we cannot actually establish “impossible” as a distinct certainty? Think about this for a second: it literally applies to everything that can be imagined. The universe is infinite, by most accounts, but certainly far bigger than we are going to examine in the duration of our species’ existence. The only thing such an argument really accomplishes is the admission that human beings are not omniscient.

Congratulations on that astounding conclusion! I’d award a cookie for this, but only if the debater is less than six years old. To everyone else, this is hardly a stunning victory. As an argument in favor of any particular topic or concept, it’s remarkably pathetic. I’m trying right now, but I haven’t come up with any way that the bar can be set any lower.

You may have noticed that I accentuated might in the phrase, “might be possible,” two paragraphs up. Because even that is a condition of knowledge, not physics. Anything that we have not established as “impossible,” because of our abysmal lack of omniscience, might still be impossible in our universe, due to laws of physics for instance. So we haven’t even determined “possible” as a fixed property.

Alternately, if you avoid the simple two-choice argument of “possible/impossible” and substitute levels of probability, even that dubious victory almost always vanishes. Probability requires evidence of at least some factors within the proposed topic, so that something can actually be measured and statistically compared. You cannot rationally propose an order of probability such as “one chance in ten” without knowing how often certain results have actually occurred. Psychic powers, for instance, could potentially have an order of probability if we could measure electrical fields emanating from our brains (they’re actually there, but hundreds of times weaker than the fields emanating from the camera battery charging at my elbow,) and/or a way of detecting such fields by the brain itself, and/or some way in which future events produce or follow a force which is not constrained by time.

What we’re talking about with that is evidence – establishing positive support of a concept, rather than a lack of negative support. But hold on! This is using the concept of mathematics to apply to physical qualities. Does that even work? Is it rational to apply “positive” and “negative” aspects to the existence of phenomena or properties?

Technically, no. There really isn’t such a thing as a negative existence, and as we determined above, no way to prove non-existence. The best we can reasonably work with is “positive” (proven to exist) and “null” or “zero” (not positive.) Therefore, if you lack positive evidence, you’ve only established zilcho.

And that’s the argument. “You can’t determine a quality which doesn’t actually exist for [insert topic], so you have to admit to nothing!” Sure. Whatever makes you happy. Someone can even try this argument while taking tests, and not mark down any answers at all. “Ha! You can’t prove I had a wrong answer!” But I bet I can predict what the final score comes out to…

Too cool, part eight: It’s not the tool, it’s how you use it

Green herons (Butorides virescens) are cool birds. Small, subtle little guys, they tend to be pretty shy in these parts and not pose for photos all that eagerly – the shot above (and here) was taken at Wakodahatchee Wetlands in Florida, a manmade preserve smack in the heart of suburbia at Delray Beach that has to be seen to be believed. Cross the boardwalk bridge from the parking area into the wetland proper, and the cacophony of bird sounds is likely to hit you almost physically, sounding like an overdone jungle movie. And like many such areas in Florida, the normally shy birds are acclimated to people and allow much closer approaches. Yeah, you thought nature photography was all about careful stalking skill and sitting for days in blinds? It’s also about finding the places where you can get closer with less effort ;-)

Over at Why Evolution Is True, Jerry Coyne has a post about green herons and their use of tools, a very rare thing among birds. It seems some of them, without this being universal among the species, have taken to bait fishing by placing bread fragments on the surface of the water and waiting for them to attract fish, which is of course what green herons eat. Check it out, because he’s got video of this going on.

As I remarked upon over there, I’ve never seen this behavior myself, but I have seen white ibis, the birds also seen at the beginning of the video, performing this dunking maneuver – only to soften the bread, it would seem. And I’ve cheated a little (just destroying all of your cherished beliefs about nature photographers, aren’t I?) and tossed bread into a pond in front of my camera to get egrets to chase the fish it attracted. Learning this behavior is an interesting bit of cognition for birds.

Yes, that’s exactly what’s happening here…

It’s hard for me to say how this compares to other species and behaviors. Great blue herons (Ardea herodias) have apparently learned that, in Florida, five-gallon buckets often contain easy meals, since live-bait fishing is popular in that state and such buckets are used to house and carry fish like finger mullet. Fisherfolk learn to keep a lid on their buckets, because the herons can get pretty brazen about landing on fishing docks and helping themselves. I’ve also watched brown pelicans (Pelecanus occidentalis), who’ve learned that casting nets can suddenly come ashore bulging with effortless meals, mobbing the poor angler who was collecting bait for a subsequent fishing trip. Many years ago when milk in England was delivered to doorsteps daily, I remember reading about songbirds who would pierce the foil seals on the tops of the bottles to obtain a drink (which was bizarre in itself – milk?) None of these are tools, of course, but they all show a certain level of learning behavior, not very far removed from tossing bread into the water.

Alternately, while this isn’t directly related, I’ll include a link to show just how large a fish a great blue heron can manage to swallow whole.

The most important thing you’ll ever read

While I pick on religion a lot in this blog, this is reflecting what I see as a greater need at this point in time; in contrast, a few years back I was quite active on UFO and paranormal forums, and have dueled over topics such as health foods, astrology, and alternative medicine. They all fall under the big umbrella of critical thinking, or to be more precise, they’re all wet precisely because they don’t.

The thing is, we as a species are notoriously bad at rational thought, and fool ourselves in so many ways that at times it seems this defines us more than our intelligence does. Worse than this, however, is the open defiance of this concept, this curious failure of humans to recognize it when we are wrong, or to even consider the possibility. All of those topics I mentioned above, and many more besides (politics comes to mind,) are prey to this – it’s probably safe to say there isn’t a facet of human culture that is not. Which is why I promote critical thinking, and the foremost part of this is adopting the premise that we can always be wrong.

Looking back, one thing in particular helped this aspect along, for me. In the early 1980s there was a magazine called Science 80, except that the title reflected the year in which each issue actually appeared, thus “Science 81” and “Science 82” as it went along – not the best of naming moves. It’s defunct now, and I cannot locate this particular article to provide credit, but it dealt with suggestibility and implanted memories. It featured a college study of eyewitnesses to a supposed crime, actually staged, in which a stolen item was described by the “victim” and later recounted to security guards by the eyewitnesses. The details provided about the item, in some cases quite specific, didn’t relate to anything the eyewitnesses saw – they had not seen an item at all, because nothing was stolen. All details were supplied by the “victim,” or in some cases simply imagined.

There have been a lot of studies about this, really, and it boils down to one very simple idea: our memories are not like recordings, able to be played back with fidelity, but extremely malleable instead. We can actually respond to suggestibility on an astounding level, and worse, have no way of differentiating this. There’s at least one study now that indicates how memory may be a single-use kind of thing, and we retain memories because every time we play them back in our minds, we rebuild them into a new memory. That one, of course, may have attendant details from the rebuilding. A good example is how we remember movie quotes, and the number of them that are simply wrong (“Luke, I am your father,” and “Play it again, Sam,” being two of the most quoted that never actually appeared in the movies.)

We also have a nasty tendency to color our experiences in terms of expectations, assigning traits or categories that are not supported by what we’ve actually sensed. Sometimes, this is influenced by something that we’ve never encountered ourselves, but have only heard about. It becomes pathetically easy to obtain ghost encounters from virtually any building or locale, but the darker and older, the better. All you need is to create a reputation with a few stories. I don’t think I’m surprising anyone by saying that every odd sound or visual phenomenon instantly becomes a ghost in such circumstances, but perhaps many don’t realize this is not something experienced only by those obsessed with ghosts – it’s something we can all be haunted by (a ha ha.)

And then, there’s the common experience of déjà vu [or just deja vu if the accents didn’t render], the distinctive feeling that we’re actually encountering a repeat performance, or a precognitive memory of what’s happening to us at a particular moment. Those that have experienced it usually find it very compelling, and I can vouch for that, but notably, it hasn’t been shown to be actually precognitive in any way. In other words, no one seems to have ever documented it, providing a written account that existed before the experienced event. Instead, it always seems to be this odd feeling of memory just as the event occurs. What this suggests, and studies have supported this, is that the feeling of this being a memory is the defining trait, not the actual existence of the memory. We get a stray feeling of, “Oh, yeah, I remember this!” not because we actually do remember, but simply because the emotional response typically associated with memories was triggered improperly – a false alarm.

The same can be said for many of those great ideas we have just before awakening, which fade away too quickly for us to remember them. Some of the people who have successfully retained them find they’re total nonsense in the light of full consciousness – it wasn’t the idea, it was the eureka emotion all by itself. Who can’t remember a dream, perhaps a lot of them, where emotional properties were assigned to items or events that didn’t seem the least related?

If we consider our minds as this great device for thought and experience, and that memories are indelible records of experience, we’re quite simply mistaken – this has been evidenced and indeed proven, time and time again. But many people never realize this. In fact, in discussions of UFO and paranormal events, the biggest influence by far is not eyewitness accounts, but the weight given to them. Even raising the question of whether the witness actually saw what they believe they did is usually considered impugning the witness, and can immediately get someone labeled a debunker, or closed-minded. The irony, that this failure to recognize the possibility of human error is more closed-minded than considering the possibility, is not lost on those urging critical examination.

The aspect of suggestibility is not only known to courts of law, in some cases it is actively promoted. Attorneys have their clients rehearse their stories over and over again, and this is not because their memory is so indelible. As it says in the study linked above and again right here (at the very least, read the first few pages,) even the careful use of certain words can influence the impression people have of events – smashed instead of bumped being their example when referring to a car accident. Often, this isn’t even intended, but a by-product of both popular opinion and media influence. In a high profile story in The New Yorker magazine, a father of three was convicted of arson and first-degree murder, partially on the testimony of neighbor eyewitnesses to the fire. But the neighbors’ testimonies changed drastically after news reports that the investigators were considering arson as a cause – before that, they had maintained that the suspect showed no indications of unexpected or remorseless behavior. And while such effects are well-known to courts, eyewitness testimony is still treated as much more trustworthy than it should be, because humans relate to the emotional aspect of the witness’ account, with little recognition paid to the fallibility. Since attorneys can benefit from this, they’re not going to be the ones who draw attention to the undeniably damaging aspects of it.

Only a few decades ago, there began a dramatic upsurge in repressed memory therapy, the practice of interviewing and sometimes hypnotizing patients to discover memories, almost always of childhood sexual abuse, that the patient had supposedly suppressed in horror and loathing. Hyped by the media and promoted by all those people who delight in scandal, it became a highly-regarded practice until a few huge settlements on mistaken cases brought attention to the well-known fact that hypnosis actually increases suggestibility, and therapists can influence a patient’s story. Far from revealing hidden records of past events, such therapy can be a fantastic tool for implanting false memories. Is it any surprise that certain therapists were known for their specialization in repressed memories? Is it a greater surprise that a very large number of their patients demonstrated, to the therapists, unmistakable evidence of past abuse, so much so that one made the astounding claim that repressed memories were present in up to 60% of sexual abuse cases? The fields of psychiatry and psychology routinely deal with mental health issues from the inability to forget traumatic experiences, but somehow this trait seemed to reverse when it came to repressed memory therapies. Eventually, the practice started receiving the critical examination that it should have had in the first place, but not before tremendous amounts of damage were done in pursuit of ephemeral “memories” treated as if they had the strength of physical evidence.

Exactly the same thing was at work in alien “abduction” cases, with a few prominent therapists promoting the practice while – and I know you’re going to be shocked at this – offering their services to help reveal such repressed memories. In abduction cases, the repression was supposedly induced by the aliens rather than the patient, but the gist is the same. For both childhood abuse and alien abductions, however, one very distinctive trait is that corroboration, in the form of physical evidence, other witnesses, and such, is virtually impossible. One should certainly be suspicious of therapies touted as highly accurate when there is no possible way of determining such accuracy.

I want to take a moment here to point out something. In many of these cases, perhaps most, the therapists were not actually trying to create false memories, and honestly believed they were helping their patients. What might have been at work is a combination of things. One is simply pride, in that the therapists felt vindicated and supported by the “positive” results of their therapy, and stayed with methods that seemed to work most effectively. The other is related; as trained professionals (most of them, anyway,) they may have felt they were aware of, and thus immune to, the trap of leading the patient along. The possibility of the false positive wasn’t controlled for.

This really means that it’s up to us. We’re not perfect beings, and our senses and our minds are not infallible – in fact, they are so prone to errors that we cannot even be aware of all the places where we might be influenced. We need to recognize this, and in fact remain suspicious of our very abilities to experience what goes on around us. It sounds a bit like I’m following the old Descartes argument here: “How can we be sure of anything?” But that’s ridiculously extreme, and like much of philosophy actually leads nowhere – what can you do with that? What I’m saying is that we can be fooled in many ways, so making some effort to support our conclusions is not only useful, it’s practically a necessity. Along with always bearing in mind that we still might not be right after that. Being wrong is okay, and in fact unavoidable. Refusing to realize this and/or correct ourselves is not. There’s an old saying regarding scientific research, to wit, that one needs to be even more suspicious of findings that support a favored theory, because we want to see this too much, and can easily miss the findings that contradict it. In skeptical circles, this is called confirmation bias – counting the “hits,” the positive evidence, and ignoring the “misses.”

Worse, this isn’t helped by belief in religious creation – it can be actively harmed by it. Feeling that humans are “chosen” creatures designed by a perfect deity doesn’t leave a lot of room to feel that we can make mistakes, despite the glaringly obvious evidence that we can. But recognizing that evolution shapes life largely by trial-and-error, and that humans are a product of utilizing old functions in new ways (a work in progress, if you will) allows that we may not always operate the way we’d prefer. It is, perhaps, a nod to the functionality of this process that we can recognize fallibility for what it is, rather than either being oblivious of it, or denying it because we’d rather not believe it’s true.

This might be unnecessary to point out, but convincing people of this when engaging in critical examination of certain topics is remarkably hard to do. People don’t like admitting that they’re wrong. Amusingly, it could be an example of that imperfect job that evolution does. Being wrong is certainly a good thing to avoid, for obvious reasons. But the emotional reaction within us that helps us avoid this isn’t specific enough – it doesn’t differentiate enough between trying to be right and simply not admitting that we’re wrong. Too often, if no one actually catches us in a mistake, this is sufficient. It shouldn’t be, and we need to pay attention to those circumstances when we’re, in effect, in denial about actually being wrong, and concentrate instead on ensuring that we’re as correct as we can be. This is not the same as winning an argument, by the way, a mistake made far too often.

For skeptics active in debate, however, there’s another aspect to be considered right alongside this. That emotional drive against being wrong means that, even if we have produced an unshakable and irrefutable argument, our “opponent” (for want of a better word) is highly unlikely to concede – we’re not going to see a clear victory. The admission of being wrong is almost certainly going to be a private one, sparing the embarrassment of public recognition. We cannot, and should not, expect to see someone change their mind. Our only goal should be to present the most cogent arguments that we can, and leave it be. Let the seed grow. It can be frustrating, to be sure, but it also leads to a better process: simply presenting the case and stepping away, without the emotional investment of seeking a resolution or victory.

Of course, considering ourselves skeptics and critical-thinkers doesn’t absolve us of error-prone traits either, and like some of the therapists outlined above, it may cause us to drop our guard and feel we’re not likely to be caught. In fact, it’s probably safe to say that all of us are wrong, in one way or another, every day. It’s unavoidable. But the distaste we feel over this thought should be channeled towards correcting our mistakes, instead of avoiding recognition of them.

And by all means, don’t take my word for it. There are a lot of resources available to examine these topics in much more detail, and I encourage you to check them out on your own.

Just because, part four

I just wanted to throw this one out there, because I liked the effect. It was taken four years ago as an experiment, and came out differently than expected. Take a moment and see if you can figure out how it was produced.

I can provide a clue: Most times, TTL flashes operate by measuring the light that makes it back to the exposure meter within the camera; the flash gets shut off when enough is detected. This happens remarkably fast, in a few ten-thousandths of a second, so while the flash may look instantaneous to us, it is actually started, then halted by the camera when it determines enough light has been received by the film or (in this case) digital sensor. Except, when the subject is dark and insufficient light is being received, the flash can actually discharge the entire capacitor, which results in some light fading at the end of the exposure. Therefore, moving objects appear brightly lit at the beginning of the exposure, but get dimmer towards the end. This can result in streaking, with the apparent direction opposite of the actual.
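Purely as a toy illustration of my own (not anything from the post), a few lines of Python can model why the streak reads backwards: record a decaying flash intensity at each position a moving drop passes through, and the earliest position comes out brightest, so the streak fades along the direction of travel.

```python
import math

def streak(n_steps=10, decay=0.5):
    """Brightness recorded at each successive position of a drop
    crossing the frame while the flash decays exponentially.
    n_steps and decay are made-up illustrative values."""
    return [math.exp(-decay * t) for t in range(n_steps)]

b = streak()
# The earliest position (t=0) is the brightest, and each later
# position is dimmer, so the eye reads the bright end as the "head"
# of the motion -- pointing the wrong way.
assert b[0] == max(b)
assert all(b[i] > b[i + 1] for i in range(len(b) - 1))
```

The exponential decay is just a stand-in for whatever the real capacitor discharge curve looks like; the reversal effect only depends on the intensity falling over the exposure, not on the exact shape of the falloff.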

Not enough? How about if I tell you I was aiming straight up?

If you haven’t gotten it with those, I’ll simply tell you I stuck the camera out from under the edge of the roof during a downpour and fired off a frame with the flash, which illuminated the rain. The closest drops showed the greatest apparent motion, made to appear by the fading flash to be moving towards the center of the frame, when in actuality they were falling past the wide-angle lens. Some drops are well out of focus, others not so. The color effects, such as the red spheres to the left side, I haven’t fully explained, but they may be caused by light passing through water droplets on the flash head, refracting it into different colors of the spectrum. I would have expected a more uniform effect among close drops, though, so maybe it’s the flash light refracting through drops suspended in the air, illuminating close neighbors.

You can click on the image for a slightly larger version. I haven’t done any editing on this at all except for resizing – no color enhancements or contrast changes. If you noticed some “dead patches” in the frame, these were most likely caused by water drops already on the lens, blurring the drops beyond distinction. Just a neat effect, and I was impressed with the amount of color that showed up.

The growing threat to our nation’s parents

A series of Tweets from teenagers across the country is shedding light on what may be a serious menace to American parents: their growing inability to chill the fuck out.

Spurred on by books by “leading sociologists,” as they’re often referred to in parental circles, child-rearing adults may be falling victim to an insidious trend that might be bigger than most teenagers believe. “I’m kinda worried,” texts one adolescent, referring to her mother after an unexpected sit-down talk, “she thinks I’m sexting strangers. Srsly.”

The trend, brought to light on a Facebook page called, “My Parents Are ZOMGing,” shows disturbing tendencies for parents to treat changing social communication methods as indications that their children are in danger. Many of the comments on that site compare adult reactions to normal teen habits; some of these reactions are causing alarm and, most notably, exasperation. Brandon Ellerby sums it up in his status update, posted from his cellphone while on the schoolbus: “[My] Dad says Im spnding 2 mch time on MySpace!” The status earned 72 Likes from among Ellerby’s 214 Friends, and started a thread reaching 46 comments so far and growing – only a few of them offering possible explanations for his father’s erratic behavior. Most teens agree that this is a symptom of a much bigger concern.

Some young adults are placing the blame on the ever-present market of adult-help books, which exploit the typical concerns of parents by exaggerating, and in some cases inventing, extreme consequences of social media. Many of these books, which are not screened for content by teenagers or responsible editors, take isolated situations and trumpet them as likely outcomes. LaWanda Corbin, a nineteen-year-old college freshman studying statistics this semester, gave us more insight from her cellphone between classes. “There’s, you know, always weird things happening somewhere, because there’s a shitload of people in the country. So some book writers pick these stories out and make ’em sound like they’re happening all the time. Then Oprah gets the book on her show and everyone thinks it’s serious ’cause she says so.” Corbin has previous experience with such influences, having been regularly fed fish oil capsules a few years ago when her mother read about it in Martha Stewart Living magazine, a periodical aimed specifically at parents, especially the vulnerable mother market. “She goes for all that fad shit,” added Corbin. “She doesn’t know that most of those articles and books are based on reading studies wrong; they don’t say what the writers say they do.”

Others have had to cope with their parents trying to find evidence of threats where none exist. “My mom keeps looking through my texts when she gets ahold of my phone,” explains Cesily Andrews, a sophomore at River Valley High aiming to get into marketing in a few years. “She keeps asking me what ‘WTF’ and ‘BRB’ means, but she doesn’t ever believe me when I tell her. It’s like she gets mad ’cause I’m not dealing drugs over my phone or something. Then she starts telling me I gotta use proper words when texting! But she bitches when I spend too much time texting, too! I mean, make up your mind!”

Teens that have tried various methods of addressing this trend caution that it can often backfire. Dylan Mackie, who handled such situations for six years before attending a college out-of-state, tells of the hazards of trying to go it alone. “You think that your parents are just worried, you know, all parents do,” he says, shaking his head when we spoke to him at the coffee shop as he Tweeted from his laptop. “So you Friend them on Facebook, so they can see what you’re actually doing, and chill out a bit. But then you have them jumping into threads with your Friends, and reminding you of curfew, or asking who’s going to be at the party. They don’t relax at all – they just look for more shit to freak over.” After a brief pause to read a reply, Mackie went on, “But the worst is when they try to be cool. It’s totally gay. Take my word for it, just tell them you don’t even have an account.”

Most parents, when asked, see this as much more typical, and nothing to be alarmed over. “Parents should be concerned over what their children get into,” says Jim Therbutin, a longtime father. “It’s not like when we were kids, where the worst you could do was get a girl pregnant. Sometimes, that even turned out okay,” he added thoughtfully. “But now kids are having sex at seven years old when their wristbands get broken, and running up multi-thousand dollar phone bills, and getting trapped in balloons. Look at YouTube! We never did stupid stuff like that when we were that age! You’re not being a good parent if you simply give them some good guidelines and let them have a little responsibility and freedom. Kids just don’t have the sense we had when we were young – it’s all these violent video games and Lady Gaga, it means a parent has to keep their children from going wacko. It’d be irresponsible not to.”

Despite this viewpoint, however, the nation’s young adults remain concerned that their parents have lost their perspective and their grip on reality, and are considering drastic measures if it goes on much longer. “I just can’t wait until I move out,” we were told, reflecting the growing sentiment of many American teens.

*      *      *      *

Thanks to World of Weird Things for the idea, and Sherry Turkle for creating another exploitative book about nonsense fears. And I apologize for resorting to the stilted hackneyed tabloid style of writing…

Dealing with the real world

You know, it’s not too often that I select articles to respond to here, mostly because my readership is small and I’m fairly remote – I’d rather respond where the article appears, and reach the same audience. At the same time, I’m more often simply passing on thoughts that stand alone when I can. But this one not only deserved a response, it demonstrates some of the larger problems which bear their own examination.

BioLogos is an organization that seeks to rectify the “warfare” between science and religion, and get them to live in harmony – don’t take my interpretation of it, you can see their mission on the About page connected to that link. It’s funded by the Templeton Foundation, which has much the same goal. The problem is, there’s really no disharmony or warfare involved. Science, as I’ve said before, is merely a methodical process of learning. It examines nature and, through tests and measurements and lots of attention, finds out how things work. Scientific laws aren’t arbitrary creations of scientists, but merely expressions of natural constants – if you have no idea what they are, they still work exactly the same for you.

The conflict (which often gets treated as “warfare” and deliberate attacks upon religion if you ask the religious) comes in when nothing in the slightest supernatural, mystical, or religious is found when we do this – in other words, no evidence for religious “truths” is ever found. At all.

Normally, this shouldn’t be any big deal, right? Who cares if science can find god? Science is imperfect, but god is everywhere, and all that. Except, no one can seem to agree on what god is or does. As we get more detailed about god’s word and god’s functions and miracles and such, they aren’t actually leading us anywhere, providing any knowledge, provoking the human spirit to new heights, or, well, anything, really. Science, on the other hand, is responsible for massive gains in knowledge, ability, health, welfare, transportation, understanding, and so on. We use it, we rely on it, we’re incredibly happy with it, because it works. And after relying on religion to help us burn people at the stake, drive off demons causing headaches and convulsions, tell us who wasn’t “chosen,” and all that fun stuff from a loving god, science started giving us things that actually worked. So, we use it.

What this actually means is that religion is fading away and losing all the power it once had, and those that wielded this power are quite perturbed by this. Those that liked the idea that they were “chosen” and “saved” and guaranteed a fun time for the remainder of eternity are frustrated that this guarantee looks like it won’t stand up in court. And those that feel that the world shouldn’t be unfair don’t want to hear that “fairness” is only a human concept, and adversity just something to be dealt with. So, there’s resistance. That brings us to the article.

“One World: Science and Christianity in Respectful Dialogue, Part I” by Loren Wilkinson is what BioLogos seems to think is a reconciliation between science and religion. It’s unfortunate that Wilkinson doesn’t seem to even understand what science is, much less how it works, and indicative of a greater problem with BioLogos’ supposed mission that this article falls so wide of the mark. The first sentence crashes spectacularly not just once, but twice:

The BioLogos Foundation, with its commitment to the “integration of science and Christian faith” is one of many signs that the 150-year-old idea of a “warfare” between science and religion is ending.

Well, the attacks of religion against science have been going on for a lot longer than 150 years, because they certainly didn’t start (as I’m sure Wilkinson is implying) with Darwin’s Theory of Evolution by Natural Selection. Pretty much from the start of christianity, it’s been trashing the rights of human beings, torturing those that failed to avow their obeisance to the scripture, and pillaging those areas of the world considered not holy enough. Galileo was forced to renounce his work on planetary motion – you know, that the Earth revolves around the Sun? – and Giordano Bruno was burned at the stake for refusing to recant his theory that the sun was one of many such stars in the universe. You have heard of witch hunts, right? Wilkinson apparently hasn’t. So, either Wilkinson learned absolutely nothing from history, or wants to ignore it entirely in favor of minimizing the church’s warfare against people who dared examine the natural world and learn from it, as well as people who simply didn’t find scripture all that useful, who got to be tortured and killed too.

That’s part one. Part two, for the same sentence, is that BioLogos isn’t really a sign of the end of the warfare, because it doesn’t actually reconcile anything. All it’s doing is fighting to make religion seem relevant to a world that no longer needs it. Let’s keep going, because you may be thinking I’m saying that from bias or spite. On this “warfare”:

First, it obscures the recognition that science at its core, is a religious activity, in the deepest and most literal sense of “re-ligious”—that which links. Religion and science both come from the uniquely human passion to see the diverse pieces of our experience as one supple and coherent body of knowledge: thus its connection with a word like “ligament”, the tissue which holds the skeleton together.

Well, first of all, it’s one hell of a stretch to use the word in the form of its Latin roots and claim this is what it means – especially since dictionaries don’t even recognize this usage of the root, much less the word. Both “religion” and “ligament” have the same root meaning, but it isn’t “to link” – it’s “to bind.” That didn’t sell as well, I’m guessing.

But, he might actually be right that religion and science came from seeking knowledge… though he didn’t actually say this, did he? He’s trying to say they’re intended to be united, which is another stretch. I’ll let this one go, though – sure, we seek knowledge. Philosophy and astrology have the same root cause too. But one’s a semi-valid pursuit of understanding, and the other’s a pile of imagined horseshit. Examining roots doesn’t actually get us anywhere, does it? Nor does thinking that, as supposed forms of knowledge, they are united in goal and value.

There is no science without scientists, and scientists are always and only humans, probing and coming to know an inexhaustibly mysterious cosmos by means of their own passions, beliefs, hunches and theories.

No and no. Science is merely finding out how things work – it only needs scientists to express it to others. Natural laws work without human input at all. The structure of scientific investigation is in place solely because we’re fallible and prone to mistakes. And that’s where the second part of this sentence falls flat, in that science specifically weeds out the passions, beliefs, and hunches to reveal only what is. A scientist can be exceptionally passionate about the aether, for example, and believe in it fervently, but when the tests reveal it doesn’t exist, accepting that is science. I point this out because Wilkinson, having started to couch knowledge in terms of emotion, plans to use this shifty little trick later on.

Second, and more specifically, the warfare language hides the fact that the modern tradition of empirical science has deep roots in the Jewish and Christian tradition. The point was first made clearly in Michael Foster’s meticulously reasoned series of articles, “The Christian Doctrine of Creation and the Origins of Modern Science”, published in the resolutely positivist philosophical journal Mind in 1934.

Actually, empirical science (christians do so love that word, “empirical”) has roots that predate christianity by centuries. The Greeks knew the world was spherical, and had pretty damn good measurements of it, before the biblical stories calling it flat and rectangular made it out of the middle east. The Romans had remarkably long-lasting architecture, aqueducts, roads, monetary systems, and such before jesus was born. Ötzi the Iceman possessed smelted ores used for tools and weapons, a thousand years before the time of the great flood. So, uh, bullshit. The point was false long before Michael Foster came along.

The warfare language implies that there were two kinds of knowledge: “religious knowledge”, established only by emotion and authority, and scientific knowledge, established by experience, experiment and testing. If true this would be a disastrous situation, culturally and personally, since it would doom “religious” people to living in a pseudo-reality constructed from dogma and wishful thinking, and “scientific” people living in a meaningless world of emotion-free “facts” each of which they must establish for themselves.

I agree with Wilkinson, it would be disastrous for “religious” people (his quotes) to live in a pseudo-reality constructed from dogma and wishful thinking. Only, that’s exactly what they do. That’s why there’s a conflict with science – because the real world doesn’t demonstrate any of the properties they really want it to have. Boo hoo.

Meanwhile, people do live in a world of emotion-free facts, and despite Wilkinson’s placement of scare-quotes around the word, facts do indeed exist (for an organization that supposedly supports science alongside religion, this seems like a snarky thing to let slip by, doesn’t it? It’s the kind of thing that a godbot that denigrates science would do, actually.) And yes, these facts have no emotions – how stupid can your point be? Emotions are the traits of living species. Wilkinson wants to imply here that scientists do not feel emotion, but perhaps even he knew that this was one piece of bullshit that he couldn’t pull off. That doesn’t stop him from implying that science dismisses emotions further down in the article, though. This is a favorite within religious arguments, as if emotion is the key to knowledge or “truth.” You’ll see it a lot.

But meaning? Yes, there’s another of the favorite words of the religious, and I’ve addressed this too. Despite the fervent avowals of every religion that this is what they specifically provide, I haven’t actually seen it. Depending on who you ask, it’s to “spread the good word” (seems to be a lot of trouble to go through, floods and wars and all that, to let people know about an omnipotent being who could establish it in our minds in an eyeblink,) or it’s something mysterious we aren’t meant to know – this one invariably comes up when the religious get stumped. Neither one really does anything for me, sorry to say. The chair I’m sitting in is “meaningless,” as are the rocks out in the yard. But we can find a use for both, and even gain knowledge from them. In fact, the pursuit and dissemination of knowledge works pretty damn well as a meaning to me, but that’s simply because we evolved to seek it. So?

We live in one world, part of which we know on authority, part of which we know on experience.

Ah, we come to the common religious tactic of assertion! Just a little sentence fragment, intended to justify and legitimize scripture by saying we know something on “authority.” But, pray tell (yow I’m a hoot,) what is this knowledge that we gained on authority? Can we use it in any way? Is it the creation of the world in seven days? The garden of eden? Noah’s flood? Tower of babel, david & goliath, samson & delilah, jesus, peter? What bits have given us something that contributes to our lives and advancement as humans? Be careful now, because I don’t work from christianity being the only religion in the world, and you’ll have to prove that christians, with this special knowledge, are more advanced than others without it. I’ll be demanding real evidence.

Because, the real reason we have BioLogos and Wilkinson trying to become one with science is that science has that legitimacy they both want, and do not have from their scripture. The more we know, the sillier the biblical stories get, and that’s been going on for a few hundred years now.

From here, Wilkinson then starts mixing jargon-speak about scientific models with philosophical speculation about how adam & eve can fit into the facts of evolution – supposedly, you can take this inane concept about human progenitors and “model” it. No, you actually can’t, and the point he’s trying to hide in the smoke is that both human history and simple logic render the adam & eve story fatuous. Models give us the means to examine what effect such a premise would have on human development – just saying “we modeled it” is meaningless. What do the models measure and produce? Well, they don’t produce anything, because in order to even try to fit their model into what we know about evolutionary development, they have to change the criteria from the scripture to match the facts. That makes the scriptural accounts an admitted dead end. Even if the model produced some kind of results, the results couldn’t then tally back into scripture, because that’s not what they tested in the first place. In fact, they tested nothing at all, because this talk about models didn’t actually involve any method for which models are used. It’s simply a way to make mere idle speculation sound important.

I want you to notice something here, too, because it’s exceptionally common. Wilkinson doesn’t seem to have any issues with trashing “facts,” most especially when they don’t go the way he’d like them to, and his ignorance of church history is phenomenal. But then he openly tries to use science (or an abject bastardization of it, anyway) as a way to give scripture an improved stature. Notice that we’re not even finding biblical accounts that predicted scientific findings, a pretty solid method of saying that knowledge can come from scripture. We’re warping scripture to try and blend it into what science has already established, and in doing so, holding up scripture as “knowledge from authority.” Cute, isn’t it?

It gets better. On the detailed rebuttals from Jerry Coyne and Eric MacDonald:

Both write from within that familiar fog of confusion (typical of both a-theist and religious fundamentalisms) which arises whenever we assume that the only relationship between religion and science is one of warfare.

You are, of course, welcome to read their input and see if either of them seems to be in a confused fog, or treating the advancement of this model theory as “warfare.” Both, instead, know very well that reliable knowledge comes from evidence and support, function and testability. They address why the model theory fails. But it doesn’t work for Wilkinson’s agenda to have to deal with facts, so he openly resorts to demonizing the opposition, perhaps hoping (and probably being right in far too many cases) that his intended audience isn’t bright enough to see for themselves.

And it goes on, with Wilkinson attempting to draw a metaphorical use for genesis by claiming it defines man’s relation to the cosmos, the purpose of the cosmos, and the purpose of man. That it accomplishes none of these is simply ignored outright – again, once something has been asserted, there is no reason to support the assertion with anything so crass as evidence, is there? Welcome to what you receive when you seek real answers from religion.

Theologians throughout the entire history of christianity have spent untold years trying to reconcile scriptural accounts in some euphemistic or metaphorical way, but none of them have ever stood up to scrutiny, much less produced any kind of supporting evidence. Sooner or later, you have to come to the realization that the story is simply bullshit. The very fact that it’s taken this long (and will still continue at least a little bit longer) is evidence that the search for religious knowledge isn’t going to work until you treat it like science does, and accept the bad results with the good. It’s fine to start with a premise and see how well it fits, but when it doesn’t, even after all this time, you need to look for something that does. Oh, wait! We’ve already found that long ago! Thus, the “warfare” isn’t a duel between science and religion, but between emotional crutches and accepting reality. And if you want to challenge the idea of our current theories being “reality,” I’ll let you have that point – we can’t prove anything beyond a shadow of a doubt, we can only find what has the greatest evidenced support. That example of greatest evidenced support that Wilkinson alludes to in the beginning, mind you, forms the backbone of biology, anatomy, medicine, and other disciplines, and manages to both function and predict with incredible accuracy. You know, that’s kinda why we use it.

The article even goes so far as to call biblical scripture a “data set,” which is so astoundingly dishonest about science and data that it shows BioLogos as the lie that it is. No one intending to promote science in any way as a useful function could consider haphazard writings from several thousand years ago as “data,” except as evidence that humans indeed had a language then. And of course, to consider it useful information, you would then, if you had an honest bone in your body, have to consider all of the other scriptural “data” as well – you know, the ones from religions that don’t agree with christianity. BioLogos, naturally enough, cannot do that, because it says right there in their mission that christianity is the only standpoint they promote. You may notice that they do not establish why christianity wins this competition, but I imagine that if you ask, you’ll get answers much the same as Wilkinson’s wordy but vacuous article here.

And that’s really the lesson to be learned from all this. When you’re in pursuit of your agenda, you can resort to a lot of tricks to try and establish support for it. Putting together a long litany of impressive-sounding points only works if someone is concerned with the sounds, not the content. If the content won’t stand up against critical examination, you may actually get called on your blatant and biased attempts to skew the results in your favor – in which case, you can decry the language and the disrespectful tone, as Wilkinson does. This ignores the fact that nothing in his smokescreen of a treatise deserves respect; it only demonstrates a shameless manipulation of both facts and emotion. This is the legacy of religion – if you dare to examine the history, you’ll find that this hasn’t ever been different.

My question has always been, why do people find this so compelling? Isn’t the flow of bullshit from their most valued proponents embarrassing enough to make them avoid it? Doesn’t the lack of a decent argument indicate something?