Apply directly to forehead

Philosophy is a very curious thing, and I’ve been working out my feelings towards it for the past couple of years now. What I think I’ve finally settled on is that it has its uses, but only about 10% of those it’s usually given credit for.

Most agree that it was born in Greece a few millennia ago – or at least, that this was the period in which it was first structured and recorded. If we accept the idea of philosophy as being defined by rational argument as opposed to, for instance, experimentation and empirical evidence, then it almost certainly has its origins back when we developed abstract thought as a species, quite a bit before the Greeks started messing with it. The ‘Classic Philosophers,’ however, recorded their ponderings on many of the more interesting questions, so that’s where we set the first marker. The Greek philosophers proceeded with the idea that the universe was orderly and logical, and that knowledge and Truth™ could be determined by finding the most logical set of premises. To no small extent, that is how philosophy is often used today, and at the very least, it is considered to be a bastion of great thinkers.

The rot sets in when we recognize how often the facts get in the way. Countless things in nature continually showed the flaws in logical thought: the earth is not the center of the universe (so believed because it explained gravity, which must draw things to the center of the universe); planetary orbits are not perfect circles; matter is not made up of five basic geometric shapes; there are not four humors within the body; and on and on and on. Part of the problem is, we can only logically argue that which lies within our experience and knowledge to begin with, so there must be a starting point of evidence. If we do not have adequate knowledge of something, the process of logical thought can only proceed with pieces missing, as in most of the above examples.

The second difficulty with philosophy, seen too many times to enumerate, is that it is often based on unwarranted assumptions and posits. The philosophy of religion, usually going under its own title of ‘theology,’ suffers from this constantly, but it is far from alone. It is one thing to assume a posit temporarily to see where that would lead, as a method of testing the posit (if this species of bird eats nuts, then we should expect to see greater numbers of the species around nut trees.) It is another thing entirely to assume the posit, then consider it proven by creating arguments that support it. If it were true that we could find knowledge through the application of logic, then there would be no such thing as logical debate, because there would be no possibility of an opposing viewpoint. In reality, it is exceptionally easy to find support for any premise you care to name, as long as you are careful to avoid and ignore anything running counter to the premise, and this is done constantly – it’s virtually an aspect of human thought processes.

People with philosophy degrees might be inclined to inform me that this is rare, or that philosophy by nature takes such factors into account, yet it is childishly easy to find lines of philosophical thought that consumed millions of hours, years of someone’s life, while remaining totally ignorant of these flaws. Even today, it is possible to find discussions and debates on such topics as free will and objective morality, as if these had distinct meanings, rather than having been adopted into common usage from long-dismissed religious concepts (hell, even ‘objective’ is subjectively defined.) It becomes circular, in that philosophy is wielded to even try to determine what ‘free will’ is, so that we can then see whether we should be worried about having it or not. If we don’t have a definition, what is it we’re worried about having?

To return to theology for a moment, this mistake is seen in two of the strongest arguments for a deity, the Ontological and the Cosmological arguments (strongest, by the way, does not imply strong.) In attempting to explain why we should believe in a god, neither one comes anywhere close to defining the gods that we routinely imagine, instead proposing entities with vague abstract properties rather than ones with the distinctive human traits that scripture provides. In these two cases, the initial posit was abandoned in favor of something that could be argued vaguely, but much more easily.

Philosophy is often considered an end unto itself, the process of producing thoughts more profound and of much deeper quality than the average human can manage, which is why philosophers can get so defensive when challenged to provide something useful to mankind. Just going “Whoooaaa!” is a nice emotional rush, to be sure, but not exactly progressive. The common concept of a ‘brain in a jar’ – how can we be sure that we’re experiencing reality and not simply believing we’re experiencing it, like a brain in a jar being fed sensory input by scientists? – is certainly thought-provoking and very humbling, but ultimately pointless. Has thinking that we’re experiencing reality worked just fine up until now? Yes? Then why should anyone give a fuck? Another argument asks how many grains of sand must be placed in one location before we have produced a ‘pile’ or a ‘hill.’ There is no doubt that Zen has some profound answers to that one, almost certainly not as direct as, “It’s a pile when we refer to it as such.” The usage creates the definition, instead of the definition dictating the usage. That took much longer to type than it did to arrive at, and now we can spend our time doing something that goes someplace. The error demonstrated here is that such questions are given weight by the very idea that they’re philosophical, rather than being functional in and of themselves.

Philosophy is often considered a fundamental aspect of scientific progress, and I’m willing to let this one slide, even though I think defining abstract visualization as ‘philosophy’ is broadening the term to make philosophy majors happy. Yes, most scientific thought and experimentation starts with logically determining whether a premise seems valid, but there are two key words in that sentence: ‘starts’ and ‘seems.’ We don’t achieve anything of value to ourselves, nothing to add to our store of knowledge, until we test these assumptions against measurable effect – that’s empiricism, and it specifically excludes philosophy. Albert Michelson and Albert Einstein held distinct views of how light waves propagated, and both had logical arguments, without flaws, to support their cases. Indeed, Michelson’s (which was in common use at the time) made a lot more sense in that it matched how sound waves propagate, while Einstein’s required esoteric logical comparisons. Einstein’s was accepted partially because the experiments Michelson (and his homey Edward Morley) performed to prove their philosophically sound idea failed to do so. Yet Einstein’s Theory of Special Relativity remained a convincing but unproven assumption until we had the ability to try it out with better experimental processes. Philosophy was the start, but could not carry the ball home in aether case.

As a quick aside: don’t feel bad for Michelson. He contributed successes to science as well, but even producing a carefully-controlled and documented failure is an advance for science. Until it has been tested, even if only to be proven wrong, scientific speculation remains without value. When your car doesn’t start, knowing that the fuel tank isn’t empty narrows down where you look to discover the solution.

The greatest strength of philosophy, as far as I have ever been able to determine, lies in raising introspection to a commendable pursuit. It can be used to question a thought process or belief system, and to challenge assumptions that we often take for granted (see, for instance, my post on tradition.) We, as a species, have a tendency to think quickly, and sometimes not think at all, but simply react to some previously-established criteria, totally ignoring the elaborate mechanisms for abstract thought within our brains. Sometimes we need to consciously put the mind in gear, to apply perspective and ‘think outside the box,’ to challenge the shortcuts and automatic processes that make up how we think about and react to events.

But these should always have a goal. Thinking for the sake of thinking doesn’t really accomplish anything – Zeno’s Paradox and Plato’s Cave don’t form the building blocks of any pursuit or magisterium, when it comes right down to it. Philosophy can become its own worst enemy in that it glorifies pondering esoteric matters, rather than subjects that can apply to knowledge and progress, where thought exercises and considering the possibility of wrong assumptions are necessary parts of discovering solutions. This is perhaps more of a personality quirk of mine, but I see pragmatism as getting lost in the shuffle too often, as people get wrapped up in the quest and lose sight of the goal – a failing at which philosophy excels. It also panders to those who want to convince everyone of how smart they really are, delving into long-winded minutiae and wielding their vocabulary like a club. Yet the first step in contemplating the nature of objective morality is to ask ourselves why we need to know.

I have yet to come across any facet of knowledge that we gained from philosophy alone – everything that we use bears the support of evidence and testing, of going beyond the logical argument to the real-world results. Our minds are really very cool, but far from perfect, and there’s such a thing as placing too much trust in them. It is not enough to use them – they need to be used efficiently, otherwise the whole exercise becomes simply mental masturbation.
