She blinded me with “Science!”

No apologies for putting that song in your head.

Others have approached this subject, but I felt the need to post about it because a) no one has covered all the details that I think need to be covered, and b) I don’t think that, in our culture right now, another voice chiming in is one too many.

In the United States at this time, there is this strange perception of science. It’s as if there’s a breed of people who call themselves “scientists,” and the strange rituals that they practice are called “science.” There is a distinct set of rules that everyone must obey, and pronouncements from scientists are considered beyond reproach. Naturally, those who seem to hold this perception are part of the resistance, referring to ivory towers and scientific dogma. Alongside this is the kneejerk reaction to a perceived elitism – in essence, people being defensive over their own level of education (and/or lack thereof,) and allowing as to how they never needed none of that book larnin’ and have been doing just fine. This is the “common man” that politicians and TV programmers like to pander to.

But in a far more subtle way, science and scientists are often held as a distinct subculture, one that borders on the fringes of proper society. Scientists, unable to integrate with or even understand common culture, involve themselves in their own little world and, should they dare to intrude, are treated with thinly disguised contempt.

To some, this might seem like hyperbole, overstating the case for dramatic effect. But it doesn’t take much effort at all to find real examples of such attitudes. Politicians are notorious for it, and there’s no need to point out how often this comes up in religious contexts. The amusing thing about this is, the devoutly religious are the ones who seem the most offended by the idea of scientific rules and dogma – the irony of this seems completely lost on them.

Even definitions of “science” seem to fall all over the place – it’s little better than trying to define “art.” But I’m not really sure why this has to be so hard. In a nutshell: Science is a methodical process of learning.

Really, nothing more than that. Just figuring things out. This is hardly insidious. And the key word in there, “methodical,” isn’t the booby trap that it’s sometimes made out to be. It doesn’t represent a process where religion is targeted for elimination, it isn’t a way of ensuring that specific laws are adhered to, and it doesn’t require initiation into a secret club (where you are given your glasses and white lab coat.) The methodology is simply the part that defines it as science to begin with. Without it, you can only produce random results.

So why a method? Well, if you want to be sure that the conclusion you are coming to is correct, and not a result of misinterpretation, misattribution, coincidence, or randomness, what other way is there to go about it? You have to eliminate all of these as possibilities before you can feel confident. And there are three key criteria that make this effective: testability, falsifiability, and replicability. To avoid the uncommon words, we can phrase these as questions: Does it do this every time I try it? Does it stop doing it when I remove the purported cause? Does it work the same for everyone else?

It’s kind of hard to argue with those, you know? They do a great job of telling you whether you’re heading in the right direction or not, without imposing any more restrictions than are necessary. And they go a long way toward weeding out the worst factor in the pursuit of knowledge: human fallibility. We’re not perfect, and cannot always be sure that we’re correct.

And yet, these are argued against all the time. No, not directly – it’s hard to make a case against someone else checking your figures. Instead, we get to see people avoid addressing the method altogether, dancing away from it. We’re bombarded with personal anecdotes, which somehow count for more than meticulous clinical trials. We get to hear people arguing against empiricism and materialism, and claiming that there are “other ways of knowing.” The claim is that science only tests what is physical and material, but does not account for spirituality, intuition, transcendence, and knowledge gained through non-physical means.

Is this possible? Can we, perhaps, learn from some process that cannot be directly measured or sampled? Well, it’s impossible to say, “No,” honestly, but you run into a specific problem: How do you know you have discovered something real? Let’s face it, our minds produce all sorts of illusions: dreams, hallucinations, chemical reactions (i.e. drug trips,) as well as tricks of sensory interpretation such as seeing things, smelling something funny, and feeling something walking on your leg (I do this for hours after I’ve found a tick on me, but that’s what I get for spending time in the woods.) We are overloaded with distinct impressions, and have no problem accepting that many of these are all in our imaginations. While, somewhere in there, there might be something that’s real, how do you tell it apart? Try to define a way that does not involve material measurements. I wish you the best of luck.

And then, of course, there’s this thing called philosophy, turning inward to puzzle out the fundamental truths. We like the idea that a great-sounding argument means we’re actually dealing with reality. But again, how do you test this? What about people who argue over which philosopher is best? Does this mean reality is a matter of opinion, or popular vote? Or does it mean philosophy is useless for garnering knowledge? Here’s a little exercise: Pick any five favorite and influential philosophers, and demonstrate how our present culture would be different had they never existed.

Our biggest problem lies in a simple human trait: we don’t like admitting we’re wrong. It’s far easier to call someone else into question than to accept that something we’ve believed (occasionally even cherished) is a mistake. This is, in a way, understandable – mistakes mean we’re flawed, and cannot compete as well. So yeah, try not to make mistakes. The issue, however, is the tendency to defend or ignore mistakes rather than build from them. We should be beyond such juvenile reactions by now.

There’s this insidious concept that if we like a particular idea, it must be true, as if truth is determined by how aesthetically pleasing a concept is. This is found most often in areas like alternative medicine and paranormal research – two phrases that abuse the second word in their descriptions so badly, it’s practically criminal. In cases like this, big bad ol’ science takes away something that seems mystical, magical, or wondrous and replaces it with boring formulas and clinical trials. Crystals are no longer lenses for focusing universal energies and simply become pretty rocks.

But “truth” is one of the most abused words in the language, and somehow seems to be differentiated from “fact” in common usage, though no one I’ve ever asked has been able to define how. And the merit of any idea or concept should never be determined by how much we personally like it – that’s a dangerous path. Looking at the history of mankind, we wasted a lot of time following preconceived notions when the ability to see that they were wrong was always right in front of us. Our planet is not the center of the universe, our health is not governed by four humors, and we cannot determine an individual’s criminal intent by how far apart their eyes are. We find these silly now, but they were just as silly then, too. The people who died over them would have liked us to find that out a lot sooner, I’m betting. Science may seem like an old fuddy-duddy at times, taking away something we really like to believe, but we need to ask ourselves, very carefully, whether it’s really better to deceive ourselves, and whether the path to “enlightenment” should involve living in a fantasy world.

What’s really funny about all this is, that’s just what science tries to do: avoid mistakes, or build from them. It serves as a counterweight to our human fallibility. We use it when we learn to ride a bike, and when we put too much oregano in the sauce. Trial, error, experimentation, comparison, conclusion. I call my barbecue rib recipe, “Eureka!” (okay, that’s not really true.)

Another frequent argument against that evil specter of science is that it changes all the time. You can find this specifically in regard to health, and things like the types of cholesterol and trans fats, though it’s also frequently used as an argument in favor of unchanging religious scripture. But let’s face it – we learn new ways of discovering and qualifying information about our world every day. Wouldn’t you like to correct any misconceptions from the past? Are you glad that your doctor no longer recommends leeches for fever? Change is not a bad thing, and is not an indication of weakness – it’s actually an indication of growing strength. Advancement is a form of change, after all.

So am I going anywhere with this rant? Yes, absolutely. It’s really up to us to change the bad perceptions and the negative connotations attached to that nasty word, “science.” We need to explain and demonstrate how it works, and how much it does for us. We need to do simple things, like looking at infant mortality rates from only a hundred years ago and realizing how much better we have it now. Or reading about the “tests” performed at witch trials (no, not Salem – Salem was small potatoes compared to Europe during that period) and recognizing how the culture of that time swallowed the baseless concept of magical beings. We like to think we were primitive beings at that time, and in a way, we were. We changed, thank Science.

We need to stop politicians who cut science funding dead in their tracks. Bailing out inept companies should never, ever come ahead of medical research. In 1953 Chevrolet introduced the Bel Air, which led to – well, not much, unless you’re a classic auto enthusiast. In the same year, Crick and Watson discovered the structure of DNA. I shouldn’t have to point out where that has led so far, and it’s safe to say it will continue for a long time. Knowledge stays around, and grows – that’s what makes it a sound investment.

We need to take kids aside (and adults, for that matter) and show them how things work. Sure, a prism splits light to show the spectrum, and certain elements absorb parts of that spectrum. We use this in everything from camera lenses (to make clearer photos) to figuring out how old stars are, and what has happened recently in their history.

We need to face up to the people we work with and ask, “Why do you think vaccinations are bad?” and be able to show how little evidence they have, or even that they haven’t the faintest idea what evidence really is. If you think that you’re being kind by not engaging in debate, or that it doesn’t hurt anyone to let certain people have their little beliefs, explain your standpoint to Toni & David McCaffrey.

It’s easy to think that un-demonizing science is up to someone else, someone with more resources, someone with more appeal or popularity or clout. It’s easy to dismiss a lot of things as “not our problem.” But, you know, we beat back ugly traits like racial prejudice and gender bias, not with laws, not with clout or celebrity endorsements, but by greater and greater numbers of the populace speaking out. Some forms of bad thinking remained as long as they did because they were accepted without questioning, as just, “what everyone else thinks,” until someone bucked the trend and said, “Hey, that doesn’t make sense.”

Psst. Science works. Pass it on.