I have very mixed feelings about emotions [now, there’s a sentence of remarkable profundity.] As manifestations of internal functions that provoke behavior benefiting our species, they are distinctly important, but too often they’re not specific enough, or they’re too easy to fool, producing behaviors that don’t really benefit us and are sometimes quite detrimental. I hate to imply that we should always try to override emotions, and we probably couldn’t do so if we tried, but there are plenty of cases where a good dose of critical examination works a whole lot better than trusting our ‘instincts.’
One interesting manifestation of this comes up fairly frequently in discussions of scientific topics. Sean Carroll talks about one particular example, where physicists were asked which interpretation of quantum physics they favored. As he points out, there are numerous problems with this, most especially the implication that science can be decided by voting or popularity, a notion also on display in how often debates about some topic are staged by our media outlets. Carl Sagan recounts in The Demon-Haunted World how he was often asked what he believed about extra-terrestrial life. Upon giving his answer based on what we could actually determine (which is not much), he would then be asked what his gut told him. “But I don’t think with my gut,” was his response.
Questions of this nature depart from attempts to obtain verifiable, dependable information and instead ask for someone’s emotional response. The answers to such questions, however, can hold little value, if any. Even being an experienced physicist doesn’t mean that a gut feeling is indicative of Truth™, or that anyone at all is free from the bias of past experience, personal benefit, or just liking “how it sounds.” In fact, we are far more likely to reject any given idea because we feel the person presenting it is an asshole than because it lacks substantiation. The feelings of personal attraction that influence our social interactions can easily get attached to something that has no social impact whatsoever, and which really should be viewed dispassionately.
To be sure, there are times when an ‘instinctive’ feeling actually indicates something recognized subconsciously, like when we realize some off-the-cuff hypothesis is flawed without initially seeing why – anyone well-versed in a profession has knowledge that is second-nature, immediately supporting or denying some particular perspective even when it isn’t articulated in detail. Yet without hearing the reasoning behind a standpoint, we have no way of knowing whether it originates with knowledge and rational processes, or is simply a manifestation of emotional bias.
This issue is openly recognized, at least among those familiar with common debating errors, as the appeal to authority fallacy. Very frequently, an argument is built on the opinion of some knowledgeable figure, the implication being that authority itself makes someone right, either in select areas or sometimes quite broadly; witness the number of times that someone’s PhD is announced, often without even specifying in what field. But science degrees are not a form of royalty, and determining what is correct is not done by decree or title, but by demonstrating accuracy with evidence. Even if Charles Darwin really had recanted his theory on his deathbed, no one in any biological field would care, because the theory still holds up incredibly well. In fact, it is the strength of the theory that would tell us Darwin’s recantation was mistaken, and not the other way around. This does, of course, make such claims by creationists all the more amusing, and demonstrates that they really don’t understand science in the slightest.
Some of the search for how others feel about something undoubtedly comes from our distaste for uncertainty. We really don’t like not knowing something for certain, which isn’t by itself a bad thing; it provokes us to search for knowledge. But it has its shortcomings, because the lack of viable answers leaves us unsatisfied, and in a ridiculous number of cases causes us to settle for whatever answer best fits our desires. Carl Sagan, obviously quite interested in the idea of extra-terrestrial life, was asked for a specific ruling on its existence by people hoping to find a kindred spirit, someone who shared their desire for such life to really exist.
This is stating the obvious, but most religions are built almost entirely on bold assertions, attempting to establish certainty through repetition and unwavering surety. Religious worldviews invariably come down to some statement to the effect of, “This is how it is” – almost always in the absence of any supporting evidence. It has nothing to do with how much sense any religious idea makes, its functionality, its usefulness or predictive power; what is most visibly present is the assurance, the emotional state of surety. As a species, we have a strong propensity to take cues from others, and will honestly believe, far too often, that if someone is very firm and confident in their beliefs, then by god they have good reason to be! [You didn’t miss that, did you? Not just the “by god” bit, but the fact that such a phrase directly relies on the concept of surety? I do have fun doing this.]
It’s funny, though, because people often realize the problems with this, to some extent anyway. Challenged to explain why they’re so sure of something, to provide reasoning behind their standpoint, they can resort to the most convoluted or insubstantial justifications. I have been assured that the bible is the perfectly accurate word of god, handed down through generations from long before writing existed, because oral storytelling was a precise act back then; no chance of any error, misunderstanding, memory loss, or editorializing. Scriptural historians would be rolling on the floor with laughter at this idea, not just because such fidelity has never been demonstrated in human history, but because of the literally thousands of versions of scripture that exist and the obvious adaptations from other cultures. It’s not about making sense; it’s about justifying a viewpoint with something that at least sounds like the rational portion of the brain was involved, bearing some recognition that mere assertion is a child’s game.
And then there are people with the opposite viewpoint, who feel that surety comes with the weight of the evidence, with how dependably something works or how well it predicts. Very often, there is the recognition that nothing is absolutely sure – there’s always the chance that some new evidence will pop up and make us rethink our understanding of how it all works. The transition from Newtonian to relativistic physics is one example; quantum physics is another. While this viewpoint is largely held by anyone sufficiently immersed in the sciences, it isn’t limited to them, and it poses a chicken-or-egg question: is it the knowledge of how science works, the process of examining evidence, that fosters this view of certainty through support; or is the idea that we can only be confident in something that demonstrates its reliability the very thing that gets people interested in science in the first place? Or, perhaps more accurately, how much of either is responsible for any given individual?
The ‘scientific’ acceptance of uncertainty rather obviously runs counter to the emotional desire for certainty – with some exceptions, as seen in Carroll’s post. But overall, no small number of people can regard something like dark energy as a present mystery and, even while yearning to find out what it’s all about, will not resort to some easy assertion just to appease the inner demon. The stronger emotion seems to be the confidence that comes from a viewpoint being supported by evidence, functionality, and simply making sense.
The truth is, until we become omniscient (and I wouldn’t suggest waiting on that event), we’re going to have uncertainty. This isn’t even an ugly truth, because it’s a reasonable expectation – we just don’t like it, solely because we have a drive to understand. Yet in some cases we’re not going to satisfy this drive, or at least not without substantial effort. It’s really not that big a deal, comparable to numerous desires that are not immediately fulfilled – emotions are, after all, just nudges in our thought processes, not any harder to ignore than the desire for a cooler phone [okay, maybe that was a bad example.] Uncertainty is not a failing, and with a little support from reason, it can serve as a motivator towards greater understanding.