So much for being nice

Atheists are often accused of not being nice, for a variety of reasons. One is, we have no outside moral guidance such as scripture, so we obviously have no morals – like morality is this unintuitive concept (hey, some people assume you are as godawful stupid as they are.) Mostly, however, it’s from the idea that we don’t let people slide on concepts like special pleading, arguments that are considered exempt from either support or logic. What’s funny is, in many ways we’ve been way too nice, and I propose that it should stop immediately. There’s such a thing as letting bullshit go on for far too long.

A prime example was recently discussed at EvolutionBlog and Why Evolution Is True. We have, yet again, some religious apologists making a case for biblical scripture being figurative or metaphorical, rather than literal, and chastising anyone for not taking this into account. And this is simply a variation of every theological argument proposed in the last century. What’s missing, and what is always missing from every assertion of this kind, is any reason whatsoever to believe it. This discussion shouldn’t even be happening, but it is, solely because we’re being too goddamn nice about it.

Let’s pin it down specifically: the bible is wrong. The earth is not flat, the sky is not a ceiling, light comes from stars, humans evolved and are much older than scripture relates, birds did not form from the air and mammals did not form from the dirt, and on and on and on. Every last thing it states that was not obvious to the people living at the time is wrong. That’s a hell of a lousy track record. We have tons of facts to support this rampant inaccuracy, too. Tellingly, this is pretty well established now even among most theologians. Which is where the special cases start rolling in.

“But,” the theologians trumpet, “the bible is not meant to be taken literally!” Which is a crying shame, I think, for everyone in the fifteen to twenty centuries before they arrived at this remarkable conclusion, because every religious person did believe it was literal before then. Absolutely no one, not one theologian, proposed that scripture was not a (divinely dictated, mind you) accurate historical document – it was only when we ran into distinct issues with what was related therein that someone suddenly announced the particular literary devices of “figurative” and “metaphor.”

Now, there are good reasons to use both figurative speech and metaphors, but only when the reader can actually see them for what they are. Mistaking them for fact means they haven’t been used effectively, which makes the bible the least successful piece of literature in history. Perhaps this means that god is indeed perfect, but his editors suck balls? [There’s a straw for the theologians to grasp desperately at, free of charge.] Moreover, figurative and metaphorical writing usually has a particular structure to it, something that easily denotes “fable” but allows the reader to make a comparison to real-world situations. In other words, it’s used for fiction. There aren’t too many historical documents written figuratively, nor any freaking reason to write one that way. In fact, it’s probably the last thing you would want to do when providing historical information, except for perhaps writing it in Pig Latin.

So, cards on the table and– oh, look! An elephant, right here in the room! Such claims for literary devices within scripture are simply desperate measures to try and rescue a belief system from its fatal flaws. You know it; I know it; they know it. It’s infantile and petty. Why should we have to provide some kind of respect to anyone who proposes such, as if the idea has the least little merit? Why, even, should we listen to any explanation from someone before they have managed to convince all of the other theologians, so there’s at least consistency in the approach? Does it serve some purpose to listen to every insipid guess at why such scripture appears grossly inaccurate, from someone who does not even have the basic honesty to consider that it appears inaccurate because it’s simply made up? Why, pray tell, should atheists feel obligated to be the only ones in the room with an open mind? Is this getting us anywhere?

Just blurting out some excuse isn’t enough. There has to be a reason why such a situation would be not only evident, but preferred. Seeing such aspects as “the fall” and “original sin” as only metaphors means that they do not have the properties they had when literal, which completely trashes their value in the first place. Why should anyone need a savior when the threat isn’t real? Large sections of scripture are intended, so we’re told, to be the operating manual of mankind, yet they’re wishy-washy and vague? Okay, someone may be vapid enough to believe such a premise (or, more likely, too lazy to ever examine it in the first place,) but it’s insulting to expect everyone else to be as stupid. Worse still is the implication that we’re being unfair by not considering it. I’m funny this way, but I think treating a stupid idea as stupid is the very definition of fair.

A theory is not composed of one stab in the dark. It must explain all of the evidence that we have, and logically produce the results. If that legwork hasn’t been done by the supposed masterbrain forwarding the proposal, there’s no point in wasting any time at all listening to them. We should feel completely free to tell them to go home, do the whole problem, show their work, and above all, convince the majority of chuckleheads who even want to believe scripture in the first place that this is a viable theory, before attempting to put it past those who really couldn’t care less and have absolutely no use for it. Because, and I know this comes as a shock, the world works just fine without mythology, and proving scripture provides value only to those who stand to gain some power or indulgence from it. Claims of moral guidance have had two thousand years to establish themselves as valid – that’s probably a sufficient length of time to see that they’re not working as planned.

Perhaps it’s time to stop being polite by letting every nitwit with a sudden idea blather about literary devices and special rules, and instead require some distinct benefit to be proposed, from the very start. Everything else works that way. I think it’s time for theologians to grow up and take responsibility.

It’s a head-scratcher

Richard Wiseman is very fond of conducting psychological research on his blog, and I have to appreciate his latest. He asks, very simply, that if you had the power to make a child either smart or pleasant (but not both,) which would you choose? I’m going to examine this a little after the jump, so if you prefer to participate unbiased by my thoughts, go there now before proceeding.

Now this is proof

I’ve had discussions about evidence with a lot of people, mostly in the effort to establish to them that what they were relying on as their own “proof” was questionable at best – more often simply wishful thinking. Confirmation bias is perhaps the worst trait that humans have, allowing us to assure ourselves that we’re right, without all the hassle of actually establishing it. Very efficient, perhaps, but not terribly useful.

Absolutely no one, for instance, has accomplished the level of proof that I have, just yesterday. Forget grilled cheese sandwiches and silly little shrouds; it’s really hard to argue against finding this when I was clearing out a planter from last year:

Shhhh! TV…

I know it’s short notice, but I just found out about it myself, courtesy of The Manatee. If you get Discovery Channel, there is a new show premiering tonight right after Mythbusters, going by the pseudonym of Penn & Teller Tell A Lie, and it sounds like it should be pretty cool. I’ll be recording it, so if you miss it, come on by and bring popcorn.

But not beer…

See? I knew guilt trips would work…

Cultural blind spot

People who pride themselves on skepticism and critical thinking sometimes get accused of being as guilty as anyone else of bias, and of favoring their existing viewpoint when examining the facts, with arguments such as, “atheism requires just as much faith as religion.” Such accusations are occasionally true (not as often as they’re used, mind you.) Being totally open-minded is hard, partially because it’s much easier to expend the time and effort just once to reach a conclusion, and thereafter we can rely on that conclusion – or so we’d like to think. It occasionally requires some effort to re-examine what we do just to be sure we’re being fair and open-minded.

Sometimes, however, we can fall into the same thoughtless traps, letting ourselves adopt cultural ideas without wondering why, without ever applying any skepticism to them in the first place. One in particular I have seen numerous times, and while the prevalence of it might be a little bit lower in the skeptical ‘community’ than elsewhere, it is by no means viewed with the same jaundiced eye as countless other topics that we consider every day. And that blind spot is alcohol.

I’ll be blunt: alcohol is really fucking stupid. We not only consider a substance that damages the body and alters the brain to be acceptable, we actually glorify it and consider it a necessity. Our culture, not just in the US but throughout most of the world, considers it a standard part of entertaining, the accepted way of winding down after work, the key thing to imbibe when viewing sporting events, or when hanging out with friends, or to toss back at picnics… the list goes on forever. It’s not like it even adds anything – it doesn’t have any real taste, and only contributes a burning sensation to anything it’s mixed into. Because of this, we try to disguise it with lots of other flavors, or occasionally engage in testosterone-fueled rituals like tossing back shots just to demonstrate that we won’t cough out caustic substances. We’d consider this stupid if it were something like paint thinner or drain cleaner (well, most of us) but it’s cool when it’s alcohol.

Why?

It helps me to relax. No, it’s not needed “to relax” – the most functional way of doing this is to sit back and, you know, fucking relax. The amount of alcohol in the average beer contributes nothing to this; in fact, alcohol initially acts as a stimulant, and people don’t drink caffeine to “relax,” so how does alcohol count? Yes, eventually it gets down to the depressant part, which is where all the really nasty issues come in. Relaxing is such an easy thing to do, with so many variations available, that alcohol is perhaps the stupidest method of achieving it, short of maybe copious bloodletting. Not to mention that it is occasionally useful to overcome relaxation when we need to be alert or responsive, which the human body can handle just fine when it doesn’t have to cope with depressants.

It lowers my inhibitions. This is said like it’s a good thing. For every time it makes someone less nervous about speaking to the opposite sex or getting up on stage, it’s twice as likely to make them less capable of knowing what is inappropriate, much less exhibiting any intelligence at all. The rational part of our brains is there for a reason, and quite frankly, far too many people don’t give it enough exercise in the first place. Taking steps to reduce its effect only results in even stupider actions. It’s hard to understand why anyone might think we have too much control over our actions.

It helps me have fun. Okay, sit back and think about what “fun” is supposed to mean. First off, if someone can’t have fun without altering their brain function, they need therapy. Second, laughing at something that one wouldn’t find amusing when sober hasn’t got anything to do with having fun – it’s simply triggering disconnected reactions. And it’s really, really hard to accept such an argument when a significant percentage of people who become intoxicated turn into total douchebags, and the only way to believe that they’re not is to be intoxicated yourself when viewing them. Maybe, just maybe, it might be a useful idea to raise the bar on what’s entertaining, rather than lowering one’s expectations to meet it. Just a crazy thought. In fact, it leads one to suspect that network programming in the US exists in its current state only because beer exists.

I need it to forget. I’ve known a few people who considered alcohol some kind of cheap therapy for something that they didn’t like about themselves. Unfortunately, I’ve never actually seen it work this way; they don’t forget, and in fact usually get affected even worse by that whole ‘reduced inhibition’ thing. Forgetting only takes place the next day, but here’s a clue: now everyone else knows, too. It’s one thing to wake up the next morning and not have any recollection of those regrettable things bothering you last night; it’s another entirely to not only inflict them on yourself with even more impact, but to inflict them on others as well, especially when you won’t be in any state where their input might have some positive effect.

And then, there are the arguments that I really don’t have to expand on:

I like throwing up, violently.

I like falling down without adequate reflexes to protect myself.

I like damaging my car for no good reason.

I like going to bed with someone without exercising any judgment.

I like waking up someplace I don’t recognize.

I like getting ridiculously belligerent, maudlin, or obnoxious.

I like not remembering what I did last night (especially when everyone else does.)

I like drug addiction.

I like jail.

I like vehicular manslaughter.

I like asphyxiating on my own vomit.

The really disturbing thing is, none of these are the slightest bit unknown to anyone – we are intimately familiar with just how many damaging effects alcohol has. We are also intimately familiar with the fact that alcohol lowers our ability to judge what constitutes too much alcohol. The inevitable rejoinder of, “There’s a difference between alcoholism and drinking socially,” completely ignores the fact that there is no actual purpose in drinking socially. The idea that someone can ‘drink responsibly’ is simply an advertising gimmick to hide the fact that alcohol itself is irresponsible. So, again, why?

Everyone else does it. That’s the whole thing right there – the only factor that anyone actually uses in the slightest, despite the rationales that they claim. That’s a really stupid argument for the average person, but amazingly lame for anyone who claims to engage in critical thinking. Yet, I still see it all the time. Skeptical meetups take place more often than not in a bar or pub. Post-convention and post-lecture practices always involve drinks afterwards. Forum posts are still peppered with references to alcohol, even as rewards (“I’ll buy you a beer when we meet.”)

Skeptics would never, ever accept “everyone else does it” as an explanation for anything. One of the salient features of such thinking is the willingness to buck the trends, and to demonstrate by example how much more useful this is, so that others can try it for themselves. But somehow, we let this one slide by unnoticed, even when it’s so prevalent.

We often talk about religious violence, and how often throughout history relying on faith encourages conflict. We rail against alternative medicine, and the stupidity of folk remedies when we have such an advanced medical system. We denigrate psychics and the emotional turmoils that they put people through. Yet how do these compare against the numbers of alcohol-related injuries and deaths? Well over 30% of fatal motor vehicle accidents in the US are alcohol-related, and the stats for alcohol abuse on campuses are staggering. The medical and liability costs alone present a significant economic impact across the world, and we all pay for those. Even just engaging in pointless rituals every weekend should be enough to embarrass us immeasurably.

We make it a point to try and move society towards better, more rational, more functional practices – we don’t have any issues whatsoever with speaking up. So how does this one get past us?

Also, hang up your cute little Star Trek communicator toy when driving. Seriously, no one wants to die because you can’t shut your fucking mouth for a few minutes.

In the interests of balance

Part of adopting a critical-thinking cap is being willing to look at all sides of an issue, and seriously consider the arguments counter to the views you hold. Anyone who’s poked around on this blog long enough knows that I have an interest in evolution, so I feel obligated to feature Bobbie-The-Jean’s post of 50 Reasons I Reject Evolution as an alternate viewpoint. I won’t even comment – I’ll let it stand on its own, and allow you to judge for yourself.

Thanks to Ed Yong at Not Exactly Rocket Science for the link. When you get to be popular, people send you stuff to feature. If you don’t like how rarely I post, it’s ’cause you’re not doing your part!

Oh, it’s okay. I like you anyway…

Apply directly to forehead

Philosophy is a very curious thing, and I’ve been working out my feelings towards it for the past couple of years now. What I think I’ve finally settled on is that it has its uses, but about 10% of what it is usually given credit for.

Most agree that it was born in Greece a few millennia ago – or at least, that this was the period in time that it was structured and recorded. If we accept the idea of philosophy as being defined by rational argument as opposed to, for instance, experimentation and empirical evidence, then it almost certainly has its origins back when we developed abstract thought as a species, quite a bit before the Greeks started messing with it. The ‘Classic Philosophers,’ however, recorded their pondering on many of the more interesting questions, so that’s where we set the first marker. The Grecian philosophers proceeded with the idea that the universe was orderly and logical, and that knowledge and Truth™ could be determined by finding the most logical set of premises. To no small extent, that is how it is often used today, and at the very least, it is considered to be a bastion of great thinkers.

The rot sets in when we recognize how often the facts get in the way. Countless things in nature continually showed the flaws in logical thought: the earth is not the center of the universe (so believed because it explained gravity, which must draw things to the center of the universe); planetary orbits are not perfect circles; matter is not made up of five basic geometric shapes; there are not four humors within the body; and on and on and on. Part of the problem is, we can only logically argue that which lies within our experience and knowledge to begin with, so there must be a starting point of evidence. If we do not have adequate knowledge of something, the process of logical thought can only proceed with pieces missing, as in most of the above examples.

The second difficulty with philosophy, seen too many times to enumerate, is that it is often based on unwarranted assumptions and posits. The philosophy of religion, usually going under its own title of ‘theology,’ suffers from this constantly, but it is far from alone. It is one thing to assume a posit temporarily to see where it leads, as a method of testing the posit (if this species of bird eats nuts, then we should expect to see greater numbers of the species around nut trees.) It is another thing entirely to assume the posit, then consider it proven by creating arguments that support it. If it were true that we could find knowledge through the application of logic alone, then there would be no such thing as logical debate, because there would be no possibility of an opposing viewpoint. In reality, it is exceptionally easy to find support for any premise you care to name, as long as you are careful to avoid and ignore anything running counter to the premise, and this is done constantly – it’s virtually an aspect of human thought processes.

People with philosophy degrees might be inclined to inform me that this is rare, or that philosophy by nature takes such factors into account, yet it is childishly easy to find aspects of philosophical thought that consumed millions of hours, years of someone’s life, that remained totally ignorant of these flaws. Even today, it is possible to find discussions and debates on such topics as free will and objective morality, as if these had distinct meanings, rather than having been adopted into common usage from long-dismissed religious concepts (hell, even ‘objective’ is subjectively defined.) It becomes circular, in that philosophy is wielded to even try and determine what ‘free will’ is, so that we can then see if we should be worried about having it or not. If we don’t have a definition, what is it we’re worried about having?

To return to theology for a moment, this mistake is seen in two of the strongest arguments for a deity, the Ontological and the Cosmological arguments (strongest, by the way, does not imply strong.) In attempting to explain why we should believe in a god, neither one comes anywhere close to defining the gods that we routinely imagine, instead proposing entities with vague abstract properties rather than ones with the distinctive human traits that we are provided by scripture. In these two cases, the initial posit was abandoned in favor of something that could be argued vaguely, but much more easily.

Philosophy is often considered an end unto itself, the process of producing thoughts more profound and of much deeper quality than the average human, which is why philosophers can get so defensive when challenged to provide something useful to mankind. Just going “Whoooaaa!” is a nice emotional rush, to be sure, but not exactly progressive. The common concept of a ‘brain in a jar’ – how can we be sure that we’re experiencing reality and not simply believing we’re experiencing it, like a brain in a jar being fed sensory input by scientists? – is certainly thought-provoking and very humbling, but ultimately pointless. Has thinking that we’re experiencing reality worked just fine up until now? Yes? Then why should anyone give a fuck? Another argument is how many grains of sand must be placed in one location before we have produced a “pile” or “hill?” There is no doubt that Zen has some profound answers to that one, almost certainly not as direct as, “It’s a pile when we refer to it as such.” The usage creates the definition, instead of the definition dictating the usage. That took much longer to type than it did to arrive at, and now we can spend our time on doing something that goes someplace. The error demonstrated here is that such questions are given weight by the very idea that they’re philosophical, rather than being functional in and of themselves.

Philosophy is often considered a fundamental aspect of scientific progress, and I’m willing to let this one slide, even though I think defining abstract visualization as ‘philosophy’ is broadening the term to make philosophy majors happy. Yes, most scientific thought and experimentation starts with logically determining if such a premise seems valid, but there are two very key aspects of that sentence: ‘starts’ and ‘seems.’ We don’t achieve anything of value to ourselves, nothing to add to our store of knowledge, until we test these assumptions against measurable effect – that’s empiricism, and specifically excludes philosophy. Albert Michelson and Albert Einstein both had distinct views of how light waves propagated, and both had logical arguments, without flaws, to support their cases. Indeed, Michelson’s (which was in common use at the time) made a lot more sense in that it matched how sound waves propagate, while Einstein’s required esoteric logical comparisons. Einstein’s was accepted partially because the experiments Michelson (and his homey Edward Morley) performed to prove their philosophically sound idea failed to do so. Yet Einstein’s Theory of Special Relativity remained a convincing yet unproven assumption until we had the ability to try it out with better experimental processes. Philosophy was the start, but could not carry the ball home in aether case.

As a quick aside: don’t feel bad for Michelson. He contributed successes to science as well, and even producing a carefully-controlled and documented failure is an advance for science. Speculation remains without value until it’s tested, even if the test proves it wrong. When your car doesn’t start, knowing that the fuel tank isn’t empty narrows down where you look to discover the solution.

The greatest strength of philosophy, as far as I have ever been able to determine, lies in raising introspection to a commendable pursuit. It can be used to question a thought process or belief system, and to challenge assumptions that we often take for granted (see, for instance, my post on tradition.) We, as a species, have a tendency to think quickly, and sometimes not think at all, but simply react to some previously-established criteria, totally ignoring the elaborate mechanisms for abstract thought within our brains. Sometimes we need to consciously put the mind in gear, to apply perspective and ‘think outside the box,’ to challenge the shortcuts and automatic processes that make up how we think about and react to events.

But these should always have a goal. Thinking for the sake of thinking doesn’t really accomplish anything – Zeno’s Paradox and Plato’s Cave don’t form the building blocks of any pursuit or magisteria, when it comes right down to it. Philosophy can become its own worst enemy in that it glorifies pondering esoteric matters, rather than subjects that can apply to knowledge and progress, where thought exercises and considering the possibility of wrong assumptions are necessary parts of discovering solutions. This is perhaps more of a personality quirk of mine, but I see pragmatism as getting lost in the shuffle too often, as people get wrapped up in the quest and lose sight of the goal – something at which philosophy excels. It also panders to those who want to convince everyone of how smart they really are, delving into long-winded minutiae and wielding their vocabulary like a club. Yet the first step in contemplating the nature of objective morality is to ask ourselves why we need to know.

I have yet to come across any facet of knowledge that we gained from philosophy alone – everything that we use bears the support of evidence and testing, of going beyond the logical argument to the real-world results. Our minds are really very cool, but far from perfect, and there’s such a thing as placing too much trust in them. It is not enough to use them – they need to be used efficiently, otherwise it becomes simply mental masturbation.

Progress report September 19: Ghosts!


Imagine looking down at your lap and being greeted by this. Freak you right the hell out, wouldn’t it? But no, we’re haunted by the cutest little wraiths any medium has ever seen. Noisy, though.

Since the last report, things have proceeded apace. While the fourth still remains very spooky for some reason, three have now gotten used to hand-feeding, and through some unabashed sneakiness, we now have them getting used to petting as well. Roast beef can accomplish a lot (and it also creates monsters, so be warned.)

We still haven’t named them, since we’re still resolute about placing them in other homes, as well as not really making the effort as yet. However, the calitabby-point we’ve started simply referring to as “Cali” for convenience, and since we’re ridiculously unoriginal. She is clearly older than the others, and based on some evidence discovered yesterday, we’re fairly sure all of them were abandoned by the neighbors when they were evicted (yes, we’re in a stellar neighborhood.) She’s pretty forthright, and my first attempts at petting were greeted with slaps, but she’s also far too curious and hyperactive, so people have simply been a fascination for her. When she was out exploring in the house yesterday evening, we were moving about as normal and pretending not to notice, to let them get used to our presence without feeling they were watched or threatened. Cali, however, repeatedly darted back and forth past us, as if trapped by our moving from room to room, yet she never went very far away and kept returning – we soon determined that she was actually enjoying this game of keepaway, and liked it even more when I reached down as she darted past and tried to touch her. Even when I was successful, she turned around and came back immediately.

Later on, as I ate my dinner of ravioli and meatballs, she came up onto the chair beside me and actually talked to me in a quiet voice. I thought she was getting a little more people-oriented, but this just goes to show that we tend to be too self-centered when observing: she was after the food, and soon slipped onto the table to help herself. Yes, we’re discouraging that. But she’s gotten used to petting enough now that when I tiptoe in during down time and visit them in their bed, she’ll actually start purring loudly before I’ve even gotten to her.

The bolder lynx point is also getting quite social, and frequently greets us with interest in the morning, though he’s not quite ready to come up for attention yet – give it a day or two. He’s extremely playful, as was seen in the last progress report, and at least twice a day starts tearing around the living room, whether he’s accompanied by playmates or not. Just minutes ago, he was involved in a three-way with Cali and the flame-point, thundering between rooms and trilling excitedly (if you’ve never had kittens, I can’t describe this sound in print adequately, but it’s an excited short burst of purring, comparable to a raccoon’s call – of course you know what that sounds like.) Because of his photo at top and the ability to produce more noise when playing than a cat should, I think I’m going to start calling him Marley, after Dickens’ Jacob Marley from A Christmas Carol, of course.

Earlier today, I decided to play hardball, and sat at my computer with the roast beef and made them come to me. Only Cali and Marley accepted this, but both were coaxed into my lap and received some petting between snacks. The weather has turned a bit chilly recently, a blessing in that I was wearing jeans for the first time since March; this gave them something to climb, a habit I’m hoping they’ll outgrow as they gain confidence and simply jump up when seeking attention. Cali soon lost interest once I put the food away, but Marley liked the petting and began to feel comfortable, so much so that when I inadvertently spooked him from my lap, he returned a moment later, then began a quick game of tail chasing. He tried a brief game of “Hang From The Knee and Bite The Denim,” something one of my previous cats used to do on the arm of the couch – I refer to it as a squirrel-killing routine, simply because they appear to like hanging upright and biting the hell out of something. Marley then explored the computer desk a little, but came back and actually began playing with my fingers. He even looked up at me and meowed for attention, the first time I’ve heard him make noise other than during Mortal Komcat. And as I type, he and the flame-point just thundered across the room to my feet, oblivious to this looming human presence. They’re coming along just fine.

By the way, the pic at top was produced when I failed to give the flash time to recharge. I’ve been using a strobe bounced from the ceiling for most of these shots, since it produces very natural-looking light without the dreaded redeye, but it means lots more light is needed than for direct flash, so the strobe takes a moment to recharge. Marley, however, refused to hold still during the longer exposure required by the dark corner where I sat.
