On composition, part 22: Distortion

A fundamental part of photography is focusing light onto the recording medium, be it film, digital sensor, or even our own retinas. And the method used for doing this the vast majority of the time is a lens, a transparent substance with a certain index of refraction – the trait of bending light as it passes through the surface of the substance. Put the correct surface angle in the light’s path, and you can direct it the way you want. But know that there is no such thing as a perfect lens, and as a result all photographs demonstrate some form of distortion – some more than others. It can’t be avoided, so it might as well be put to use.

[A small side note here, expanding on a section above: the light bends only at the surface of the transparent substance, both on entering and exiting, which is why the curves of the front and rear surfaces of some lenses are dramatically different. Within the substance, however, it continues on a straight line.]

Far and away, the lenses that demonstrate the most distortion are the wide-angle lenses, the ones intended to capture the widest field of view and cram it all into the frame – the wider they go, the more distortion is present. This is inescapable, at least until we get media that wraps around our heads in order to mimic the scene we would be seeing normally. When you take a view that encompasses 140° horizontally and expect to see it on a monitor which takes up 35° to 55° of our field of view, something’s going to give, and that something is strict accuracy.
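To put rough numbers on that mismatch: for an ordinary rectilinear lens, the angle of view follows directly from the focal length and the width of the recording medium. Here’s a quick sketch (not from the original post – the 36mm sensor width and the focal lengths are just illustrative values):

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal angle of view, in degrees, for a rectilinear lens.
    Assumes a full-frame (36mm wide) sensor unless told otherwise."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 14mm ultra-wide takes in roughly 104 degrees across the frame,
# while a 50mm "normal" lens takes in only about 40 degrees:
print(horizontal_fov(14))  # about 104
print(horizontal_fov(50))  # about 39.6
```

All of that extra angle has to be squeezed into the same flat rectangle, which is exactly where the “something’s going to give” comes from.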

It used to be that all lenses were ground spherically – the surface described a portion of a perfect sphere, and only the size of this sphere varied. But the light was being thrown onto the flat plane of the recording medium, not a portion of a sphere like the lens, meaning the edges of the image were further from the point of bending than the center – the result was that the center of the photo was typically the sharpest, with the edges dropping off in both sharpness and accuracy of rendition. Now, with them newfangled computers and them fancy gadgets, it’s possible to make lenses that are aspherical, optimized for a flat focal plane. Such lenses reduce the distortion that used to be found, but still do not fully eradicate it. This means that older lenses, especially the shorter focal lengths that produced the widest angles, are much more likely to show distortion; the same can be said for the more inexpensive consumer lenses, intended to sell at a low price and thus not likely to receive the elaborate grinding.

[image: spherical distortion]

Distortion from a wide-angle lens is often called barrel distortion, but it might be more clear if we consider it fisheye or glass ornament distortion, the kind of effect you see if you lean close to a reflective sphere – your nose gets too big and your ears disappear around the bend of your head. The effect is rarely that pronounced, which makes it deceptive, because then it can sneak in when we’re not expecting it. The most noticeable effect is from vertical elements of the image that are close to the edge of the frame, which may either lean towards the top or bottom center or bow around the middle of the frame – this becomes even more pronounced when one portion of such a subject, like the top of a tall building, is significantly further from the camera than other portions. It can also appear in the horizon if it crosses too far from the middle of the frame, for instance when we aim higher to get more sky in the image and thus the horizon falls towards the bottom of the frame.

There’s not a lot that can be done about this, save for avoiding the circumstances where it’s most visible. Try not to have trees or columns near the edges of the frame when using such lenses; shoot as close to horizontal as possible, so the relative distances are comparable; keep the horizon in the center of the image (bear in mind you can always crop the image later to get the framing you prefer.) The closer a subject is to the camera, the more pronounced this effect will be, too – again, it’s that relative distance thing, and how much the subject intrudes into the more distorted areas near the edges of the frame.
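For those curious about the mechanics, barrel distortion is commonly modeled as a radial effect: points get shifted by an amount that grows with their distance from the center, which is exactly why edge elements bow while center elements stay put. A minimal sketch of the simplest such model follows (the coefficient is made up for illustration – real lens-correction software fits several such terms per lens):

```python
def apply_radial_distortion(x, y, k1):
    """Map an ideal image point to its distorted position.
    Coordinates are normalized so the frame edge sits near r = 1.
    A negative k1 pulls points inward with increasing strength
    toward the edges -- classic barrel distortion."""
    r_squared = x * x + y * y
    scale = 1 + k1 * r_squared
    return x * scale, y * scale

# The center is untouched; a point at the frame edge is pulled inward:
print(apply_radial_distortion(0.0, 0.0, -0.1))  # (0.0, 0.0)
print(apply_radial_distortion(1.0, 0.0, -0.1))  # (0.9, 0.0)
```

Undoing the distortion in software is just inverting this mapping, which is essentially what the lens-profile corrections in raw converters do.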

We go back to the lens shape again. Since they’re typically round, they throw a round image onto the focal plane – the rectangular frame of most cameras just cuts more off the top and bottom than the sides. So the sides of the frame, and most especially the corners, tend to get closer to the more distorted regions of the projected image and show the worst aspects.

This also applies to a trait called light falloff. If you look at the spot thrown by a flashlight beam, the edges are not sharp – the light fades at the edges, and the same is true for the image a lens produces. This means that the corners of your photo can go darker, and this is true with any lens, not just wide-angle versions. This is most visible when you have clear sky in your photos. It’s subtle, and many people miss it entirely, but it can have a noticeable effect on those scenic shots, especially when enlarged significantly. The nice thing is, this is very easy to get rid of. Simply use a smaller aperture when shooting; the effect usually vanishes when the lens is stopped down 2-3 stops from maximum aperture.
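The falloff has a well-known geometric floor: even an optically perfect lens dims the image by roughly the fourth power of the cosine of the field angle (the “cos⁴ law”); real lenses add mechanical vignetting on top of that, and it’s mostly the mechanical part that stopping down removes. A quick sketch of the geometric portion, purely for illustration:

```python
import math

def falloff_stops(field_angle_deg):
    """Natural (cos^4) illumination falloff at a given angle off the
    lens axis, expressed in photographic stops relative to the center."""
    ratio = math.cos(math.radians(field_angle_deg)) ** 4
    return -math.log2(ratio)

# At 30 degrees off-axis, the natural falloff alone costs
# about 0.8 of a stop versus the frame center:
print(round(falloff_stops(30), 2))  # 0.83
```

This is also why the effect is worst with wide-angle lenses: they are the ones whose corners sit at large field angles in the first place.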

[image: bokeh and macro doubling]

One of the more interesting terms you might hear, especially in regards to lens performance, is bokeh. What it refers to is the appearance of the out-of-focus portions of an image – occasionally, it is used to mean just the highlights in these areas. Ideally, bokeh should be nice and soft, appearing airbrushed and not blotchy as seen here, but it’s a lens trait and, as such, there’s not much you can do about it other than purchase another lens. However, if you already have a selection of lenses at your disposal and you know one is better than another, you can sometimes substitute the better lens in limited situations.

A closely related trait is something I’ve mostly seen from macro lenses, usually when used at or near maximum aperture: image doubling. Some aspherical lenses seem to do it too, and this is likely where it originates. Basically, distinctive portions of an image, like the insect legs seen here, can be doubled when well out-of-focus. This is one that you usually can control, in that it seems to disappear once the aperture is stopped down to f8 or more.

A trait of longer focal lengths is chromatic aberration, sometimes called color fringing. This occurs because different wavelengths of light get bent differently by the same lens surface, and is most visible with very bright objects bordered by darkness – the top of the object may have a blue fuzzy edge, while the bottom has a red one. The effect is often worse the farther the subject is from the optical center of the lens. This is usually fairly well controlled in newer lenses, and is the reasoning behind multi-coating (a term that isn’t seen much anymore since nearly all lenses feature it,) but the very expensive, high-end telephoto lenses also use additives to the glass to control it, and may advertise “extra low dispersion” or “fluorite” and similar terms. I’ve seen it so rarely, even with the large number of consumer lenses that I’ve used, that I find it much more prevalent in rumor and reputation than in actual appearances in the image.
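The cause is easy to demonstrate with Snell’s law: a lens surface bends short (blue) wavelengths a little more than long (red) ones, because the glass’s refractive index rises slightly toward the blue end of the spectrum. A sketch with illustrative index values in the neighborhood of common crown glass (the exact numbers vary by glass type, so treat them as assumptions):

```python
import math

def refraction_angle(incidence_deg, n_glass):
    """Snell's law: the angle of the light ray inside the glass, for
    light arriving from air (index ~ 1) at the given incidence angle."""
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n_glass))

# The index is a touch higher for blue light than for red, so blue
# ends up bent slightly more -- the seed of color fringing:
blue = refraction_angle(45, 1.522)  # assumed index for blue light
red = refraction_angle(45, 1.514)   # assumed index for red light
print(blue < red)  # True: blue is deflected further from its original path
```

The “extra low dispersion” glass in those expensive telephotos is glass whose index changes as little as possible across the visible spectrum, shrinking that gap.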

Color fringing, especially in a very distinctive shade of purple, also appears as a result of many different digital sensors, almost always around bright highlights. This is a camera trait, however, and is not affected by lenses, nor is there much you can do about it except avoid the circumstances where it occurs, touch it out afterwards with editing, or find a different camera body.

[image: lens flare]

A very common set of effects is glare, lens flare, and ghosting – all closely related and stemming from the same causes. Basically, any time bright light (like sunlight) hits the front surface of your lens directly, the light can bounce around and scatter within the lens assembly, perhaps reflecting off of each surface, maybe only reducing contrast and washing out your image. Note that it is not necessary for the light source to be visible in the frame or viewfinder, only that it is reaching the front surface. Seen here, it has produced red and green ‘ghosts’ on the base of the stump, optically opposite the sun peeking through the roots, and the first way to prevent it is to avoid doing what I did and having the bright sun in the image. But immediately behind that is to use a lens hood – generally, any time you’re outside in bright sunlight, but most especially when you’re aiming in a way where sunlight can reach the front of the lens. In some conditions the hood (especially those made for wide zooms) is inadequate to fully protect the lens, and you might use additional shading, such as your hand or your hat. You can often tell in the viewfinder when you’re successfully shading the lens, as the contrast will abruptly increase and/or the ghosts will vanish.

Also seen here, the sun is made distinct by the presence of the ‘starburst’ arms, which is a trait of lens flare attenuated by the aperture – to see it most distinctly, use a small aperture with a point light source. This is one distortion trait that can be used to great advantage, accentuating the light source and even adding some character to the image. Even the bad kinds of lens flare are often used in movies to drive home the idea of a brilliant, overbearing sun, and this is so much a known thing to audiences that it was even replicated in Toy Story, a computer-generated film that didn’t use lenses and so would never have a reason for the effect to occur – watch for the string of hexagons to appear, towards the end, as Woody sees the sunlight magnified by Buzz’s helmet (he uses this to ignite the fuse on the rocket and save the day.)

All forms of wide-angle distortion can be used to accentuate scale and distances, especially by getting very close to your subject. Any surface stretching away can have the distance exaggerated, but close objects (or portions thereof) can also be made to loom large in the frame. Dramatic, unreal perspectives can be used to give a different impression than what we might normally see, perhaps making a subtle subject leap out at the viewer.

The more an image is magnified, the shorter the depth-of-field becomes, and this applies to both telephoto and macro work. This can be used to increase your subject’s isolation, drawing attention directly to it since everything else in the frame is defocused. Alternately, wide-angle (short focal length) lenses increase depth-of-field and so can allow the entire frame to be in focus – useful for scenic and landscape shots of course, but also helpful in compressing two subjects together into the same apparent plane, like those images of tourists ‘holding up’ the Tower of Pisa.
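The magnification/depth-of-field relationship even has a handy close-up approximation: total depth of field is roughly 2Nc(m+1)/m², where N is the f-number, c the acceptable circle of confusion, and m the magnification. Here’s a sketch using an assumed full-frame circle of confusion of 0.03mm; it shows why life-size macro work leaves barely a millimeter in focus:

```python
def depth_of_field_mm(f_number, magnification, coc_mm=0.03):
    """Approximate total depth of field (mm) for close-range work:
    DOF ~ 2*N*c*(m+1)/m^2. Reasonable when the subject is much closer
    than the hyperfocal distance; not valid for distant scenes."""
    return 2 * f_number * coc_mm * (magnification + 1) / magnification ** 2

# Life-size macro (m = 1) at f/8: under a millimeter of sharp depth.
# A modest close-up (m = 0.1) at the same aperture: around 50mm.
print(round(depth_of_field_mm(8, 1.0), 2))  # 0.96
print(round(depth_of_field_mm(8, 0.1), 1))  # 52.8
```

Note how steeply the result climbs as magnification drops – the same reason a wide-angle landscape, at tiny magnification, can be sharp essentially throughout.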

While distortion alters reality (or at least, that version that we perceive with our own senses,) it isn’t always a shortcoming, and by knowing how to use it, a photographer can create more eye-catching compositions. It’s definitely worth knowing how it works with your own lenses.

Fighting with abstracts

This one’s going to be a little bit weird. I mean, more so than usual. It started as just an offhand comment, but grew into a strange bit of philosophical inspection.

I recently read, yet again, the journalistic cliché about someone “beating the odds.” Which is complete nonsense. No one ever beats the odds, though they might fall right in line with the odds in a favorable way – being the one in five million people who wins the lottery, for example. Even if, by some strange chance, they continually, repeatedly got results against all probabilistic expectations, they still didn’t beat the odds, the odds simply changed. Probability is based entirely on what actually happens; it’s not a law unto itself.

Yet, it gets even weirder. Why do we even say anything at all like, “she beat the odds”? As if she physically battled with some abstract concept, where victory could be distinct? If there’s anything that could not possibly be engaged in any form of combat, it would be an abstract idea. Our language has millions of these examples, both objectifying and personifying concepts that we have created entirely from our imaginations – we struggle to learn math, or win out over stubbornness; we beat the rain home, or even teach that squeaky door a lesson. We spend a ridiculous amount of time first assigning some form of agency or personality to objects and ideas, and then engaging in competition with them.

Steven Pinker observed this in How The Mind Works, giving it as evidence of the overriding functions that form human cognition. Despite the immense numbers of things that we encounter that we should be able to view in an entirely neutral manner, we much more often subject them to a ‘friend or foe’ perspective, something that either helps us along in our lives or forms an obstacle to our goals. Some of this really is true, to a certain extent; food might be on the other side of that river (or downtown traffic,) and so we must endure a certain level of hardship, expend more than average effort, in order to achieve the rewards. While it doesn’t seem a big jump to go from, “This is a little harder than I’d prefer,” to, “This is actively blocking me from my goal,” there yet remains no reason to become even frustrated over it, much less view it in terms of competition or adversity. Instead, the idea of competition is so ingrained within our minds that we use it everywhere – and, to the best of my knowledge, in every language and culture.

This is just one example of why an intelligent extra-terrestrial species might have a great deal of difficulty in translating and understanding our language – there is an unknown likelihood that they possess no such traits and wouldn’t understand why we do. But even sticking to this planet, it gives a faint indication why we have so much difficulty with conflict and warfare: we can’t actually get them out of our minds. Rather than living harmoniously with our surroundings, or treating random events as just things that happen, we consider our individual existence as a competition against forces trying to prevent us from our goals. That this is an evolved trait that perpetuated itself seems a given; it’s not hard to see that treating impediments to our survival as a challenge, a test of our very egos, probably produces a more immediate and stronger response than seeing the same thing as just ‘what happens.’ We’re more likely to persevere in any undertaking when we take it personally. Think about the difference between an accomplishment, such as completing a puzzle, and a challenge, like considering our inability to complete the puzzle (which is meaningless after all) to be an indication of failure on our part.

Certainly, it’s been a useful trait. The puzzle of preventing polio, for instance, was solved largely because it was egotistical, a challenge to our abilities; think instead if just the people who had polio, the ones who would benefit directly from its disappearance, were the only ones concerned about it. Even if this had been the case, what was produced was a preventive vaccine, not a cure, and those afflicted received no benefit from it. Don’t let me sell it short, because empathy towards others, including our own genetic line, played a very large part as well – we would prefer not to see anyone stricken. Some of it, too, was seeing children coping with the disease, since that fires up our protective responses. But then again, how much of a part do these play towards eradicating world hunger? This goal should be easily within our grasp, yet it falls well behind the quest for personal wealth and status, far too often. There’s no easy way to tell how much we’re motivated by social justice and protecting children, and how much we’re driven by a nonexistent competition with the world we live in, but we can’t deny that the latter remains a strong influence on our thinking.

It makes me wonder how much of a part it plays in another mystery that I ponder occasionally. While there are plenty of explanations about how humans could have created the concept of gods, it’s harder to justify why so many of these beings are considered kind and beneficent, especially in the face of both scriptural accounts of wrath, and the belief that gods are responsible for the cataclysmic events of the earth. But let’s face it: if a being is both omnipotent and antagonistic, well, game over, man – it’s not a competition that we’re ever going to win. However, let’s say we can win big if, and only if, we play by the rules… that pretty much describes most religions, doesn’t it? And it explains why religions get involved in so much competition and antagonism of their own. It certainly makes a lot more sense than a god, who created humans with certain tendencies, playing games with them that end in perpetual reward or punishment, like souls are poker chips.

So can we call this competitive viewpoint good, or bad? Well, neither, really – that’s just another trait coming into play, one of trying to slot things into distinct categories and make quick decisions. Like most things we might encounter in our world, it can have a lot of different effects on us. Instead, it just helps to know that it’s there, and can appear just about anytime, useful or not. Perhaps that’s enough to let us ignore it when it’s provoking (heh!) us towards an attitude or action that won’t really be beneficial.

Repost: You don’t look a day over eighty

This is cheating, I know, and especially lazy when the posts have been so thin lately. I could have just linked to it while providing new content, but I find the original from last year to be pretty complete. Plus I’m not sure who actually follows links…

* * * * *

So not only is today the summer solstice, but also World Humanist Day – which is, admittedly, an odd thing. Not in that I believe we shouldn’t bother with it, but in the implication that there’s only one day to consider or celebrate humanism. It’s like having a National Don’t Set Your Neighbor On Fire Day; it’s something that we shouldn’t need because it’s automatic. But perhaps the main idea is a day to promote the awareness of humanism, and if so, I can handle that.

Secular humanism is the ideology that we can determine effective moral, ethical, and social guidance without resorting to any religious, supernatural, or spiritual influence. While it is often confused with atheism, there’s a distinct difference: it’s possible to be an atheist and not give a damn about social welfare. This attitude is remarkably rare, however, so the crossover between the two is common, but this distinction still appears in odd ways. Religious folk desperate for a way not to lose another argument will often point to the dictionary definition of atheism, as if this renders all socially-based arguments from an atheist null and void – apparently there’s a belief that a label must be exact or it’s irrelevant. I’m quite direct in calling myself an atheist, even though ‘secular humanist’ is far more accurate (and ‘critical-thinker’ even more so); besides the fact that far fewer people even know what the term means, calling myself a secular humanist sounds both pretentious and like I’m shying away from the negativity of the word ‘atheist.’ It’s my small way of saying, “Fuck your feeble preconceptions.”

The first usage of the term, according to Merriam-Webster, appears to be from 1933 – surprisingly recent, given the long history of the overall concept, which can be traced back for thousands of years. In fact, the actual origins of it may predate every form of religion on the planet. But let’s start from the other end.

We are assured, so often that it’s practically a cultural assumption (at least in the US,) that religion is the source of all morality, and even forms the basis of all laws. When it is pointed out that scripture is remarkably weak on countless concepts of morality, and outright contradictory to others, it is usually asserted that the gist of scripture leads the way – laying the foundation, if you will. There are so many ways that this fails I can’t possibly enumerate them all, but I can provide a representative few. Not only do women’s rights lack the barest hints of either existing or being recommended, Abrahamic scripture is very distinct in considering women both chattel and unimportant beyond the baby-making angle, something that still exists in countless sects today. Such a basic thing as equality among humans is directly denied, from the sins and low origins of other ‘races’ (there’s just one race, which explains why we can interbreed) to structures like caste systems and chosen people. Followers are openly instructed to beat children and stone heathens and pillage wantonly among the lesser folk. And should anyone wish to claim that these were radical misinterpretations of what scripture really says, we need look only at the long history of holy wars, religious persecution, and declared privilege to determine that the true message was lost on so many people that it defines the most inept body of law in the universe – in fact, actively and repeatedly achieving exactly the opposite of moral guidance. Hiding behind “the fallibility of man” is a feeble excuse; this supposed guidance was directed at us, created to be this way. The message that can be taken from this is that we are intended to run rampant – if we bother to accept such nonsense assertions.

This also means (and this is not an atheist manipulation, but a direct assurance from countless religious folk) that without religious guidance we will descend into self-absorbed, nihilistic behavior, often compared to the “beasts” (another factor in decrying evolution, by the way.) I’ll take a moment to point out that the social structure of many “beasts” is superior to our own quite often, especially when it comes to slaughtering members of the same species. Yet what especially needs to be noted is that every culture developed their own moral guidelines, remarkably similar in more ways than religions have ever managed, and the further any culture gets from reliance on religious authority, the higher its social standards and general well-being. There’s even a study showing that religion and racism are closely tied. True, this does not mean that religion causes racism, and I’ll be direct: religion is very often just one manifestation of class consciousness, racism being another. But since we’re talking about the moral imperative of religion, we should expect to see much lower levels of racism and higher levels of social harmony. It becomes obvious when one bothers to check: religion isn’t providing much in the way of guidance.

The question of where moral behavior does come from was the topic of my first “But How?” post – we’ve always had it. It’s the benchmark of a social species, and as such, found in far more than just Homo sapiens. Species that gain a benefit from any kind of group behavior must have cooperative functions, and even see social interaction in a positive light. This is such a fundamental trait that even some species of insect, like ants and bees, possess it; it boggles the mind to think that we would need to learn such behavior.

[image: Squabble]

And yet, there’s the negative behavior above to consider – just as obviously, the internal guidance wasn’t working too well in the bad cases throughout history. Mostly, this is because it’s not the sole behavioral trait we possess; competition is also pretty strong, and since these are in conflict, there must be some ‘criteria’ for when one or the other is to take precedence. I put ‘criteria’ in quotes because the word implies a much more elaborate structure than what our brains would actually possess. We have strong familial bonds, protecting our spouse and offspring, and these get weaker with the ‘tribe’ and vanish entirely against any perceived threat. So there’s quite a bit of subjectivity about our in-groups and how we interpret anyone as “fer us or agin’ us.” We know that it’s good behavior to favor our in-group against any outsiders, but the method of determining where these lines are is vague. Most religions are remarkably adept at drawing lines, relying on such manipulative concepts as the perfection of the self (“saved,” “chosen,”) the idea of ultimate authority, and of course ideas such as there being One True Religion™ – no need to prove any value or superiority, just proclaim it. Very self-indulgent, but hardly a guideline for moral behavior. As a species, we’re not very good at distinguishing the desire for social cohesion and the desire to feed our egos (another nail in the coffin of the “designed” idea.) Our penchant for drug addiction makes it clear that it’s too often the good feelings that count, not necessarily how we achieve them.

Any immaterial justification for any behavior is going to fare as badly – it’s far too easy to create something that supports our pre-existing views without fulfilling any other function. See if you ever run across someone who announces a spiritual property or “way of knowing” that they themselves do not possess or that fails to boost their ego – good luck with that. But various scholars and philosophers throughout the ages have argued that morality really should be about more than indulgence, one set of guidelines able to apply to everyone without drawing lines. Crazy talk. Even a cursory examination of our Constitution reveals (to those not scared of the idea) a basic principle of equality, fairness, and the reduction of privilege, though it was soon realized that they didn’t specify the dismissal of religious authority within our government, correcting this with the First Amendment. And of course, this whole idea forms the backbone of secular humanism.

It’s not hard to find people claiming that the goal of secular humanism is to eradicate religion – only religious people though, imagine that; I always thought honesty was one of those important things to them, but whatever. Secular humanism, however, only affects religious privilege over others – it destroys pedestals to bring everyone to ground level. No secular humanist would be any more valued or privileged than any member of any religion, and no less answerable for their actions either. Decrying this can only come from someone with something to lose from it, angry that they would be considered as morally responsible as everyone else. A whole orchestra of the world’s tiniest violins has sprung to life…

It could be argued that, if we are born with a socio-moral objective, there is little point to humanism. It could also be argued that humanism is simply attempting to do the same thing as religion – there have even been attempts to have it declared a religion, though what purpose this would serve is unclear, except for leverage in the weird legal system we have over where religions can and cannot appear. So we’ll take a look at both of these, in reverse order.

The definition of religion has always been up for grabs, though legally it is specific enough to rule out humanism, and of course the bare meaning of the word “secular” also puts the kibosh on the religion angle. There seems to be some difficulty with telling the difference between a religion and an ideology; humanism is an ideology, which means it forms an underlying approach or attitude towards decisions and actions. Every form of government is an ideology, as are cultural standards for schooling.

Tackling the former argument about not needing an ideology for social and moral structure, the points above should have made it clear that we have tendencies towards social cohesion, but a hell of a lot of ways in which we get confused, sidetracked, or deeply involved with disguising indulgence as morality. While better than nothing (especially the nothing that the religious insist we would have without their stalwart help,) it’s still a lot worse than we can imagine. Evolution doesn’t always produce strict behavior, but nudges in useful directions – and there are a lot of nudges for a lot of different circumstances in our complicated lives. Not to mention, if we were as dependent on our rational minds as we like to think we are, drug addiction and sexual affairs and arguing over music wouldn’t actually occur at all, much less all the fun we tend to have over how to define and regulate moral behavior. We really do need something that we all find as agreeable and functional as possible, that we can resort to when there are doubts. Hold that thought, because we’re going to come back to it.

Secular humanism, for the most part, isn’t about creating rules, or dictating behavior. It’s about producing a perspective, an underlying concept of what a goal should be, that gives structure to rules and decisions. Someone driving in a residential area does not need a speed limit sign to infer that the limit is probably much lower than the freeway – traffic is thicker, more opportunities for people to pull slowly out into the road, and bicycles and children are far more prevalent. The structure behind all this is, “It’s far more hazardous so stopping distance and reaction time are far more important.” I feel safe in saying most drivers understand this perfectly well – but a few too many think that it somehow doesn’t apply to them, or that a temporary exemption just for them should exist because it’s inconvenient otherwise. Self-indulgence; it’s not that it makes sense, it’s that we’re a species that is adept at manipulating things to our individual advantage. This perspective, this glaring realization of how egotistical we can be, is but one aspect of humanism.

Most notably, secular humanism eliminates (or at least greatly reduces) any reliance on broad labels, pronouncements, or assertions. Good and evil are not properties, but indefinable abstracts; there is no action (much less person) that can be said to be universally good – there is always some way in which someone will fail to benefit from it. Decisions based on the promises of post-mortem states are ludicrous when we have real-world, demonstrable, and above all dependable consequences that are easy to see. Morality is solely about other people, how we interact and the importance of functioning socially – otherwise why would anyone care in the slightest? To make any claim (as many religious folk indeed do) that morality is only about how one appeases their deity not only makes it a pointless concept to promote, it inevitably produces exactly the hedonistic nihilism that is supposed to be so horrifying. “I am good, you are not; since you’re going to hell anyway, let me hasten the process.” Lest anyone think this is a straw man representation of religious viewpoints, let me remind people how often phrases such as, “Kill them all and let god sort it out,” are still heard, and how angry evangelists tend to be, and that religion has been used as a justification/motivation for war for nearly all of written history (how many religions specifically chronicle the wars they’re most proud of?) If we think this isn’t accurate anymore and such historical behavior is behind us, it is only because of the secular influences that our culture has been promoting, and increasing.

We can look at scriptural exhortations to stone women who talk without men’s permission and say, “Damn, that’s stupid!” – because it makes no sense. We can create laws against driving while intoxicated, not because there is the faintest religious backing for it in any way whatsoever, but because we know what the consequences are, and find it remarkably unfair that someone far removed from the complete idiocy of alcohol can still come to harm because of it. We can contemplate laws restricting same-sex marriage and say, “Hold on a second – isn’t this creating a double-standard, where a legal practice somehow becomes illegal based entirely on who engages in it?” (Note that I said we can, not necessarily that we do.) And this means we can pause for a second and realize that laws are to prevent harm, not to reinforce someone’s pointless prejudices. That’s secular humanism; guiding our decisions through the application of objective, rational perspective and observable consequences.

Yes, this does mean that secular humanism can actually be pursued by religious folk, as well – and it is, more often than we might think. The laws in this country regarding freedom of speech and freedom of religion, including the ones protecting religious observances as special cases (look up animal sacrifice under Santeria, and how kosher foods are classified,) demonstrate that secularity is not anti-religious. And I’ll openly admit that there is a difference between what someone wants to use as their personal worldview, and what they should be pursuing as standards for everyone. Religion is stupid, self-indulgent, petty, and dangerous – it is the dumbest thing any culture can ever embrace. For the record. Yet, making a law against it would be both pointless and oppressive. It’s up to people to make the decision on their own, and my part, placing value in fairness and reason, is to make the case about how stupid religion is, trusting in people to have working brains. If I cannot plead my case convincingly, perhaps it’s not strong enough.

[For anyone who reads that and smugly assures themselves that I haven't convinced them to give up religion, that's quite all right; the ball's now in their court to try and convince me to take it up ;-)]

Social: Above all, humanism recognizes that the primary focus is the human race, and not whatever subgroup anyone places themselves within. Distinctions about nationality, or skin tone, or sexual preferences, or what is eaten for breakfast, are only methods of feeding ego, of drawing lines that place us on the good side. Don’t get me wrong; lines are undoubtedly beneficial, when drawn in a functional way, such as between the greater populace and rapists. Humanism uses social interaction, and empathy, and a generous helping of demonstrable consequences as its primary guide. It helps reduce the emotional influences upon what we do, especially regarding others, and substitutes careful consideration instead. True, one can argue that empathy is an emotional influence (especially if their goal is to challenge anything they don’t like rather than fairly consider it) – but empathy, or what it produces, is also a considered response: we rely on social interaction as a species, which requires fairness, trust, and mutual benefit. That’s why we developed it.

The only people who argue against these standards, don’t; they inevitably misrepresent humanism in fatuous and wildly creative ways, mostly to (and I hate to sound like a broken record here) feed their own ego and maintain their own privilege. It’s pathetic, and ironically, it’s exactly what those standards are intended to move mankind away from, and in doing so, ahead. We already know selfishness isn’t very beneficial – we just need better skills at recognizing it.

And with that, we return to the comment made above about a system that we can resort to when there are doubts. Because another aspect of secular humanism, also represented quite well in critical thinking, is the ability to evaluate our social structure objectively – to actually have doubts. It’s very easy to think in terms paraphrased as, “I’m happy with it, so what more is needed?” Obviously, this is hardly a functioning method of defining morality – which makes it all the more astounding how often it appears in cultures. There remains no small number of people who believe that a majority vote defines the ‘best’ approach to laws and governing, never comprehending what laws and government are actually for. Or those who believe that a right is something that should apply to one group of people and not another. Both of these fall under a concept called, ‘the tyranny of the masses,’ other times simply referred to as, ‘mob rule.’ There are enough historical examples of how this leads to bigger and nastier mobs that it’s pointless to reiterate here, except to say that less attention should be paid to the culminating events and more to the underlying attitudes that fostered them – treat the illness before it irreparably damages the body.

It’s entirely possible there is, or will be, some better method of approaching social and moral structure – though it’s hard to imagine how, to be honest. What has been demonstrated throughout history is that secular humanism far exceeds any other approach we’ve tried, and provides the structure for the greatest benefit and the highest function. So on World Humanist Day let’s at least recognize what it’s accomplished for us so far, and consider what it may yet accomplish in the future.

The depths of your eyes

Yeah, that title’s fairly similar to a post from about a year ago, but the difference is significant. That one was about a fly with a maze-like pattern in its eyes (thus, “lost,” get it?) while this one really does involve depth. I spend hours on these titles…

Anyone who’s had a close enough encounter with a praying mantis knows about the false pupil, even if they haven’t discovered that it’s false, believing instead that it indicates where the mantis is actually looking, as our own eyes do. Mantids, though, have compound eyes like most arthropods, lots of simple optical mechanisms bundled together into a knobby group that provides a wide field of view. Even with this field of view, mantids have an optimum angle of sight, and so will still turn their heads to face potential prey or danger; when this happens, the false pupil may become minimized or disappear altogether, enhancing the illusion, but the bare truth is, the false pupil (when visible) always faces the viewer. The mantis may or may not be focusing its primary attention on us, but those little black spots give us the impression that it’s looking right at us.

two images showing false pupil depth
There is a particular trait that I’ve noticed before under high magnification, and managed to capture in images the other day: the false pupils are not on the surface of the eye, but actually down beneath. This makes sense when you know why they occur, but seeing it firsthand is pretty cool.

Notice how the false pupil isn’t visible in the top image where the mantis’ face is sharp, but is pretty distinct in the bottom image where the face is out of focus; also note the comparative focus on the shoulder. Working in natural light with the macro lens at its widest aperture of f4, the depth of sharp focus is incredibly short. The false pupils are actually there in the top image, but blurred into indistinction by being out of focus. A slight twitch closer in the bottom image brought them into focus. Using even a slightly smaller aperture would have increased the depth of field enough to have face and false pupil in focus simultaneously (especially for a subject this small.)
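To put some rough numbers on just how thin that zone of sharpness is, here's a quick sketch using the standard close-up depth-of-field approximation; the magnification and circle-of-confusion figures are assumptions I'm plugging in for illustration, not measurements from the actual session:

```python
def macro_dof_mm(f_number, magnification, coc_mm=0.03):
    """Approximate total depth of field (mm) at close focusing distances.

    Standard close-up approximation: DOF ~ 2 * N * c * (1 + m) / m^2,
    where N is the f-number, c the circle of confusion (0.03mm is a
    common assumption for a full 35mm frame), and m the magnification.
    """
    return 2 * f_number * coc_mm * (1 + magnification) / magnification**2

# Assuming roughly life-size (1:1) magnification on a mantis this small:
print(round(macro_dof_mm(4, 1.0), 2))   # f/4: roughly 0.48 mm of sharpness
print(round(macro_dof_mm(8, 1.0), 2))   # f/8: roughly 0.96 mm
```

Even stopping down two stops only gets the sharp zone up to about a millimeter under these assumptions, which is why a slight twitch of the mantis was enough to throw the false pupils in or out of focus.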

You see, the false pupil is an optical effect. The compound eyes of most arthropods aren’t little clusters of spheres, but something more like a globular flower blossom, originating deeper within the owner’s head. Each eye is a tapered tube, with a simple lens on top and an optic nerve at the bottom; this gives each eye a very specific direction that it sees. Most times, it is the walls of these tubes that give the eyes their collective colors, since we are seeing nearly all of them obliquely, at an angle. It is only when we can see directly down the tubes that the color vanishes, and we get darkness instead, perhaps even seeing the optic nerve at the bottom. So yes, it really is farther away than the surface of the compound eyes, with the possibility that the effect is enhanced by the lenses themselves.

It is believed that this is an evolved protective trait, much like the coloration resembling eyes that several different species possess. Something that is staring right at you is aware of your presence, perhaps ready to defend itself vigorously. Not only does this run against the hunting instincts of many species that want to capture their prey unawares, but even we feel it; mantids are routinely described as having an “evil stare.” They have no more stare than a housefly, but just saying that isn’t enough to dispel the feeling, is it?

Another interesting trait about mantis eyes can be seen in the last image in this post; at night, the camouflage coloration fades to black. Presumably, this provides some benefit to their night-vision capabilities, but as yet I cannot tell you how or why. It’s also a trait that has to develop. The Chinese mantises, at least, are born with darker eyes but they turn to much the same color as the body within hours (see also this post,) and for the next several weeks, the eyes remain that way day or night. At a certain age, perhaps following a molt, their eyes can become dark at night, as I found out the other evening, when the same model we see above posed for a tight closeup well after sundown.
tight portrait of juvenile Chinese mantis
I wasn’t around when these hatched, so I only have a guesstimate of how old they are – we’ll use the known date of the hatching I witnessed and consider these nine weeks old. What I do have is a measurement of the eyes, since this guy held still for a close pass of the calipers: it’s 4.5mm across the outside of the eyes.

I have no information or trivia to pass along regarding this next image – I’m just including it for variety, and because I obtained it during the same photo session. It appears we also have a resident grey treefrog (Hyla versicolor,) though it might actually be a Cope’s grey treefrog (Hyla chrysoscelis,) a rarer species – I have to record their mating call to be sure, and so far I haven’t heard a sound from this one. Since they are largely identical, call it either one for the sake of the image.
grey treefrog
The little bit of cottony fluff near the toe, by the way, is some form of leafhopper nymph, partially demonstrating the same trait as the mantis, only for this species, the eyes turn red at night; by day, they are very pale blue-white. I’ll come back later on with more detail pics of the species.

Other ways of getting the results you want

Every once in a while, you will get to hear the phrase, “other ways of knowing” – almost invariably, it will be in defense of some topic that is sorely lacking in demonstrable evidence or repeatable results. But this doesn’t matter, because science isn’t all it’s cracked up to be, since there are other ways of knowing. While, not surprisingly, it is used most often to defend religion (most especially religious revelation,) I have seen it also used to excuse astrology and psychic powers, and philosophers have even blurted it out as a rebuttal to the loathsome demon of empiricism. I’ve never been able to take it seriously, always considering it a blatant dodge, but I finally decided to see if there was a more rigorous definition than the common usage; to see if I was selling it short, in other words.

The first thing to note is, ‘knowing’ is clearly a wildly subjective term. While most people are likely to consider this to refer to information that we not only have confidence in, we can also use it to predict or explain something about our world, this is rarely what anyone is referring to when they use ‘other ways of knowing’ as their trump card. Like ‘Truth™,’ knowing seems to only refer to something self-validating, supportive of a pre-existing view. No one ever points to someone else holding a view counter to their own and concedes the argument to them because of other ways of knowing – it is, strangely enough, only used in a selfish way.

Which makes it a little surprising to me to find that there are courses that examine ‘other ways of knowing’ as a defined topic in the theory of knowledge. I suppose it shouldn’t be surprising; theology still exists after all, and some pretty esoteric and pointless schools of philosophy. But it does make it a little easier to post about a specific approach rather than anyone’s personal usage. It also demonstrates that people have tried too hard to justify it as a viable topic rather than consider whether it really is a viable topic.

Am I being harsh? Well, you can judge that for yourself, since four of the other ways of knowing are emotion, faith, intuition, and memory. Naturally, faith had to get in here, since that’s the prime thing that people try to justify in the face of stubbornly nonexistent evidence, but can we honestly consider it a way of knowing anything if whatever is ‘known’ is wildly different the world over? Is something known because it is defined by how someone grew up and who placed emphasis on it, or is it simply culturally defined as important? If we can consider faith as a way of knowing something, then knowing has virtually no meaning whatsoever – you might as well say I know I am a brilliant scientist. Should I list this on my résumé?

Even the other three – emotion, intuition, and memory – are well known as being ridiculously inaccurate. In fact, it is the very scientific method that these attempt to dodge which demonstrates this, as if the huge success of any gambling establishment wasn’t enough. Perhaps we’re not talking about gambler’s intuition, or failed relationships, or even the low accuracy of eyewitness testimony when we speak of ‘knowing.’ But then again, if we’re allowed to pick and choose only the bits that support the concept, are we establishing any value to other ways of knowing at all? The scientific method was created because of these, because what people ‘knew’ wasn’t really producing any accurate answers. Falsifiability and replicability are the foils of false confidence.

What about imagination, and the role it has played in theoretical sciences and sudden insights? Does this make it worthy of consideration? Certainly, it’s an important part of scientific endeavor, but again, let’s not count only the successes – for every breakthrough achieved by imagined scenarios, there are a few thousand failures, since we need to remember that every crackpot and garage inventor is also relying on imagination. As is every child when playing, and every creator of fiction or art, and so on. So, how much is this contributing to our base of knowledge, versus how much is going off on unrelated or unproductive tangents? And does it even count if every breakthrough that was achieved through imagination also had to be backed by solid evidence and repeatable results, the hallmark of science in the first place?

So we get the question of whether language is a ‘way of knowing,’ instead of considering the rather obvious influence it has on how we approach things. It only takes a moment’s thought to realize that culture, quite naturally, has an effect on how we learn, and what we consider important, but that’s a far cry from considering it a method of obtaining knowledge in the first place. And of course, since we’re purposefully avoiding the hoary old empirical methods in this pursuit, we must therefore ignore the rather telling evidence that those speaking Portuguese do not produce more, or less, knowledge or insights than those speaking Farsi.

We come to sense perception, and are now starting to delve into the realm of the ridiculous. Everything that we ever learn comes through our senses, so they cannot be considered any ‘other way of knowing,’ but the functional apparatus that permits us to do so in the first place. Even imagination is considered to be mere reconstruction of sensory experience; we are not believed to be able to imagine something that has not been experienced, and if you don’t believe that, imagine what it’s like to see in infrared without using any resemblance to any other form of vision that we have. Meanwhile, questions about whether our senses can be considered accurate or skewed are philosophical at best, and tackled long ago, with the utter lack of value established back then as well. Certainly, we do not perceive everything that exists, and almost certainly, much of what we do perceive is individually colored. But this is as valuable as whether a computer has produced the answer to a mathematical formula by using Windows or Mac OS as the operating system; who cares? Is the answer accurate? What more do you need?

Finally, we get to reason, and you might think I’d have a hard time arguing against this. Yet, reason is only as good as the information it uses as a base. A few hundred years ago, it was certainly reasonable to believe that lightning and volcanoes were evidence of a god’s wrath; they were impressive and violent and, of course, everyone knew gods existed. Look as hard as you like, and try to find the people who determined geothermal activity through reason, intuition, emotion, faith, imagination, or even sensory perception. Dig out the people (and, since other ways of knowing shouldn’t be sporadic or rare, there should be a lot of them) who announced the true nature of pathogen-borne illnesses before the age of microscopes and culture dishes.

In fact, if you’re looking at the info in those provided links (1, 2,) you might notice something: they’re not really demonstrating that any knowledge is being produced by these topics, but instead asking if we can consider these as contributing. This is not only philosophy, but weak philosophy at that; soliciting essays on opinions isn’t exactly establishing the viability of the approach, is it? Especially when ‘knowing’ isn’t even defined, nor any goal set. Despite the number of times I’ve heard the phrase, I have yet to see any example of knowledge gained in this manner, even when I’ve specifically inquired. One would think, if it were a recognized phenomenon, an example isn’t too much to ask – a lot of them isn’t too much to ask.

This has been tackling the defined, structured definitions of ‘other ways of knowing,’ which is saying nothing at all about revelation, or extra-sensory perception, or cosmic connections, or drug-induced insights, or all of the other aspects people seize onto when they feel there must be something else. Now, correct me if I’m wrong, but shouldn’t we expect knowledge gained through whatever means to be consistent, and extending beyond the personal experience? Shouldn’t the millions of psychedelic drug users who claim they have reached a different plane of consciousness be producing similar experiences? Shouldn’t religious revelation the world over be pointing to the same concept of gods, whatever they may be? Isn’t that how we actually define knowledge?

All of this has been ignoring a simple, yet wildly misunderstood fact: that the pursuit of science is not a structured ritual, but only a method to try and eliminate mistakes and human influence – exactly as noted above. There is nothing that prevents us from finding some previously unknown trait of humans, or clouds our judgment of such; if we can detect it ourselves, then ‘science’ can certainly find it. It’s not like it has to fit into a test tube or anything, and our methods have determined some pretty subtle and curious things. We discovered that numerous species can not only orient to the Earth’s magnetic field, they can read it to extremely fine degrees, something that we neither knew from experience nor expected. Many other species see portions of the electromagnetic spectrum (light) way outside of what humans can see, and possess abilities to detect distress in other species or the turbulence of the water. ‘Other ways of knowing’ are not, by any stretch of the imagination, ruled out by scientific investigation, or ignored, or even discouraged, and some of them have even been researched (and found lacking, imagine that.) It’s the scientific approach that lets us test the intuition, the imagination, the revelation or insight, to determine if they really are valuable. And, more often, shows us that they aren’t – for every right answer in science, numerous wrong answers have been ruled out by the same method. The ability to determine that something really is wrong, instead of just wondering or, even worse, ignoring the possibility wholesale, is also the strength of the scientific method.

Yet, there’s an even bigger disservice that ‘other ways of knowing’ inflicts upon us. As noted earlier, many of the potential other ways are known for their inaccuracy – something that is often poorly recognized by many people, when it’s not outright ignored. We have vast amounts of evidence that emotions, for instance, are simply mechanisms to provoke survival behavior – not at all a way of knowing, but a way of reacting, like the slap of a beaver’s tail onto the water when danger threatens. At times, we must ignore the emotional provocations, for the sake of polite company or traffic safety or avoiding a stay in prison. The supreme functionality of a brain that handles abstract thought and nuanced decisions is its ability to override emotions, to recognize that intuition is perhaps just wishful thinking, to see that faith is a cultural attempt to deny that evidence is thousands of times more dependable. Rather than finding facile, superficial ways to promote self-indulgence, we could be expending effort instead towards recognizing just how our thought processes work – and how they can go wrong. Might that be considered a bit more useful than self-gratification? I’d like to think so, anyway.

*    *    *    *    *

When looking up web resources for “other ways of knowing,” I came across this article. Lilian “Na’ia” Alessa has interpreted the phrase differently from the linked sources above, and indeed from most uses of it; her version, contrasting traditional Native American practices against the structure of “Western science,” is one of the few times I’ve seen the phrase used in a coherent and plausible manner. The point she makes is that her grandmother, lacking the benefit of any structured education, nonetheless possessed the skills to thrive in her environment.

I have no argument with this, but is this really another ‘way of knowing,’ or simply a culture clash? I haven’t run across anyone who’s ever said that people did not learn anything before the scientific method was adopted, or that current educational practices were the only ones that were effective. Alessa herself admits that her grandmother did not obtain her traditions through intuition or some kind of unknown ‘connection,’ but through the trial-and-error, long experience and observations that, in a more structured form, underlie ‘Western’ science itself (I perceive a certain snarkiness in her use of this compass distinction, but maybe I’m reading too much into it.)

Then, too, we must consider the other aspects found in the same culture, of personifying plants and the land and crediting Amotken with the creation, as well as their belief that they have occupied the land since the start of time. While some of the rituals are undeniably useful, what are we to make of the lack of belief in Amotken elsewhere in the world, or the significant evidence that the ancestors of the Salish entered this continent less than 20,000 years ago? How much accuracy is needed to consider something an effective ‘way of knowing?’ Because I have a special coin sitting on my desk that, for simple true/false questions, is correct 50% of the time.

But I can only determine this, of course, if I already ‘know’ what the correct answer should be through other means…

Blogging wasn’t in the cards

For anyone, should they actually exist, who has been stopping by and not finding any new posts, I apologize. On occasion, circumstances inhibit sitting down and working on posts, and this particular occasion was a move. We are now in a new house!

I take no credit for this whatsoever; it was all The Girlfriend’s accomplishment. Well, okay, that sounds like I didn’t even help with the move, which isn’t true at all, but what I mean by that is, it’s her house, and her finances that permitted it. She is quite pleased with it, and rightfully so. It’s in a considerably nicer neighborhood, not terribly far from the old place (which made moving a little easier,) but a lot more convenient to her work, and various useful shops. We will not be missing the old place, or the obnoxious neighbors, in any way at all.

And that goes for my own pursuits as well. I had actually planned to bring along a few of the mantises that had hatched back there, but found that a Japanese maple tree at the new place already plays host to a large number of Chinese mantises (got the Asian thing going on,) and I didn’t feel the need to introduce competition. The butterfly bush came along for transplant, along with the salvia plant and my almond tree, but the rosemary bush had grown too large to move, so we’ll have to start a new one here (and yes, we did get lots of cuttings to do this.)
Chinese mantis on Japanese maple
I was watching the almond tree with trepidation for the past few weeks, since the local white-tailed deer had a tendency to let it get fully leafed out before stripping more than half of the leaves away early in the morning; this is how they feed, browsing for tasty leaves or shoots but not killing off the provider, and then leaving it alone for a few weeks to replenish their food source before returning. My little tree, which had sprouted spontaneously from a discarded almond in the compost bin a few years back, had produced its first leaves in the spring and then been stripped several weeks ago. It had reproduced its foliage, and was due for a return visit; I figured it would get nailed right before the move, but the deer waited too long and I was able to transplant it intact.

The cats, it must be said, did not handle the move with feline grace; the more appropriate term is, “freaked out.” For a couple of days, they slunk around the new place like feral strays, jumping at every sound, and spending a lot of time deep in a closet. Eventually, they determined they were not intruding on someone else’s territory and could claim this as their own, and soon discovered the delight of stairs and a balcony overlooking the living room. A few days later when they were mellow, they were permitted to explore the screened-in back porch, which was all kinds of okay to them.
Little Girl, or is it Zoe? chilling in the window.
I, myself, am still recovering – my hands, feet, legs, and back took a beating, and of course I’m doing the typical post-move endeavor of trying to find where I packed this or that crucial thing. It doesn’t matter how organized you try to be, I think – Chaos will take over and make you dance to his discordant tune. I suspect I will get back into posting slowly, so for now I’ll just close with a small crab spider, genus Mecaphesa, that I shot during the final stages of packing. She measures 6mm across the tucked legs, so, not exactly an imposing specimen unless you’re tiny (or extremely arachnophobic.) I have spotted several interesting arthropods in the immediate vicinity, but so far have only taken the time to photograph the mantis above – I’ll try to amend that soon.
Mecaphesa crab spider in defensive posture

Near invisibility potion

Honeysuckle genitals
The other day I went out chasing pics again, and didn’t really snag much of merit. But while playing around with macro shots of honeysuckle flowers, I captured a few frames that illustrate a peculiar, and sometimes handy, photographic trait. It takes some explaining, so bear with me.

First, the illustration. These are two frames from almost exactly the same vantage point, with just a change of focus in between. The green stigma is in front of the yellow anthers bearing the pollen in both images. But as can be seen, it is rendered almost invisible on the right side, plainly semi-transparent.
composite image showing defocus transparency
How can this occur?

The first thing to remember is, when we look at something with our naked, or even demurely clad, eyes, we’re seeing through the tiny hole of our pupil. Photons that reflect from any surface all have to pass through this opening for us to see anything, and the size of the opening restricts both how many photons can come through, and from what angle. Objects reflect light not just towards us, but in all directions; most of it we simply do not see. And in the case of the stigma and anther, as illustrated in the top part of the image below, the stigma is sufficient to block most of our view of the anther.
Defocus transparency illustration
But camera lenses, and indeed many other lenses such as those in telescopes and binoculars, are different. They’re much larger than our pupils, so they capture a much greater percentage of the light reflecting from an object, everything that hits their front surface – properly focused, they take all of these photons, every path that meets the lens, and converge them back into a sharp image. The larger the lens surface (which usually means the ‘faster,’ or the greater the aperture,) the more light is gathered. This is why a 300mm f2.8 lens is so much larger than a 300mm f5.6.
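That 300mm comparison can be worked out directly, since the diameter of the effective opening (the entrance pupil) is simply the focal length divided by the f-number; a quick sketch:

```python
def entrance_pupil_mm(focal_length_mm, f_number):
    """Diameter of the effective light-gathering opening:
    focal length divided by f-number."""
    return focal_length_mm / f_number

d_fast = entrance_pupil_mm(300, 2.8)   # about 107 mm across
d_slow = entrance_pupil_mm(300, 5.6)   # about 54 mm across
# Light gathered scales with the area of the opening,
# i.e. with the diameter squared:
print(round((d_fast / d_slow) ** 2))   # → 4 (two full stops more light)
```

So the f2.8 version needs front glass roughly twice the diameter, with four times the area, of the f5.6 version, and the weight and price follow along.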

This means that the green stigma may not necessarily block the view of the yellow anther, because the lens can also see past the stigma, to either side, above and below. While a portion of the view is blocked, not all of it is, so some of the light from the anther comes past. This gets focused down onto the film/sensor plane (shown in deep green.)

Yet, what about the green stigma? It’s still there, and still sending its own reflected light to the lens, right? True enough, but it’s out of focus, so the light paths do not converge back down into a sharp image; instead, the light is somewhat scattered, diffused over a greater area, while the light from the yellow anther is concentrated tightly (this pretty much defines the difference between unfocused and focused.) Light from the green stigma hits the film/sensor in the same place as light from the yellow anther, but the anther’s light is more concentrated, and overpowers the stigma’s. It’s not exactly transparency, it’s just that the object with the most light takes precedence.

This method works best when there is a large difference in focus (which usually translates as distance) between two subjects, and can be used to blur out a fence that blocks our view, for example. The greater the depth of field, of course, the weaker the effect, because the light from the green stigma would become more focused and concentrated. Since lenses are usually at maximum aperture while we’re framing our subject, only closing down to the desired shooting aperture as we trip the shutter, this can sometimes play against the macro photographer: when chasing subjects down among the plants, a leaf or stem can actually be directly between the camera and the chosen subject, but so far out of focus that it is virtually transparent to us through the viewfinder – only to burst into sharper focus and obliterate the subject when the shutter is tripped and the aperture closes down, re-concentrating the light from the leaf/stem. If the object is bright enough, even well out of focus it can throw a color cast across the subject, exactly as seen above.
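For anyone who wants to see why stopping down ‘un-hides’ that leaf or stem, here's a rough thin-lens sketch of the blur disc an out-of-focus foreground object casts on the sensor; the lens and the distances are hypothetical numbers picked for illustration, not taken from the honeysuckle session:

```python
def blur_disc_mm(f_mm, f_number, focus_dist_mm, obj_dist_mm):
    """Approximate blur-circle diameter on the sensor for a point at
    obj_dist_mm when the lens is focused at focus_dist_mm.

    Thin-lens approximation, valid when both distances are much larger
    than the focal length: b ~ (f^2 / N) * |s - x| / (x * s).
    """
    return (f_mm ** 2 / f_number) * abs(focus_dist_mm - obj_dist_mm) \
        / (obj_dist_mm * focus_dist_mm)

# Hypothetical setup: 100mm lens, subject at 0.5m, stray leaf at 0.3m.
wide = blur_disc_mm(100, 4, 500, 300)      # wide open at f/4
stopped = blur_disc_mm(100, 16, 500, 300)  # stopped down to f/16
# The leaf's blur disc shrinks in direct proportion to aperture diameter:
print(round(wide / stopped))   # → 4
```

The blur disc scales directly with the aperture diameter, so an obstruction that was washed into invisibility at full aperture can firm right back up at the shooting aperture, exactly as described above.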

Our eyes have lenses too, and the effect is exactly the same, but since they’re much smaller it is not as pronounced, and we tend to ignore it when it occurs. However, you can close one eye and hold something narrow like a toothpick vertically in your vision path while focused on something well past the toothpick, and see the same effect, just probably not to as high a transparency as the photo shows. Also, since we have two eyes, the other one may have a clearer view, and our brains can select which eye to give its attention to, so issues only arise on those rare occasions when either eye has a radically different view from the other, such as when we try to see into a narrow gap (especially when we want the depth-perception that two eyes provide.)

I have a page dedicated to explaining how and why aperture affects focus, if you want further information – just click here. It also explains some of the weird things that might occur, and why there is a limit to closing down aperture to increase depth of field.

I’m a dude

I had to wash off some things outside a short while ago, and while draining the hose, I set the sprayer to ‘mist’ and applied a liberal coating to the grasses where I knew some of the praying mantises lived; I was rewarded with seeing one of them scamper up and begin drinking deeply from the water droplets adhering to the leaves. Of course, I myself trotted (it might have been a canter, come to think of it) inside and grabbed the camera. The recipient of my largesse, however, did not acquiesce to repeating this display as I loomed nearby with the softbox rig.

There are at least three mantids that have moved to the dog fennel plants, however – this does not seem to have been the most advantageous action as they remain smaller than their brethren; either that, or there was another hatching that I remained unaware of. But since the one on the grasses appreciated the moisture, I brought out the misting bottle and heavily doused the areas on the dog fennel where the other mantids were out foraging. They appreciated this as much as the first, and eagerly sucked up what adhered to the leaves before the sun (which is quite bright and hot today) evaporated this windfall. Perhaps ‘windfall’ is not the right word here…
juvenile Chinese mantis gathering water
Seen here, one that had borne the full effect of the misting draws up water from its forelegs, having swept its eyes clear. If you don’t have a little misting bottle to carry in the camera bag, get one. Mine is from a purse-sized Jheri Curl, after I used the product up keeping my ‘fro dashing. (The true story is, I went to the drugstore specifically to find a misting bottle for photographic purposes, but everything they had was too big for the camera bag – until I spotted a clearance bin on the way out with items for a buck; that was fifteen years ago, and I still use that mister.)

Lest you think the mantis might not have appreciated this soaking, I wish to point out that not only do they suffer much worse than this during downpours and even fog, any of them could have easily dodged deeper into the dog fennel had they felt the urge – they certainly do it often enough as I lean in for a nice portrait. They get most of their fluids from overnight dew, and it did hit the dewpoint last night, but they still took advantage of the misting I provided, so, cool!

Not deep

I’m still here, and still largely busy – it’s going to be a lean posting month, but I’ll still try to put something up from time to time.

["From time to time" - isn't that a stupid phrase? Who makes these things up, and did they have any think what word good is?]

A few days back we received torrential rain, which is not to say this is any more remarkable than the rest of the country, but only as a lead-in, since it spurred me to go down to the river. I’ve seen plenty of evidence that it rises dramatically, yet never been down there to witness it firsthand, so I stopped down briefly the morning after the deluge. Below is a comparison composite image, the left side taken a few years ago but representative of typical conditions, while the right is the level Friday morning.
river depth comparison
It’s the same boulder in the middle of both pics, except one was shot aiming downstream while the other was taken across from the small point visible in the former. I wasn’t going to go wading in the river that day.

You can see how the river is flooding the banks in the right image, and as I stood there, a largish snapping turtle appeared between the tree and me, struggling desperately to gain a foothold on the flooded bank before vanishing back into the torrent; I had barely raised the camera and hadn’t even locked focus before it was gone.

flooded footpath
This is part of the path that winds alongside the river, only about 20 cm under water at this point, so I was still able to follow it – I spend the non-winter months in waterproof sandals specifically for conditions like this, because I think it’s silly to let a little water block me from something interesting. My feet are so used to this that I rarely notice the water temperature at all, unless it gets really extreme. [An example of this was when my dad visited one winter and we went out to the Outer Banks. He snagged a favorite and expensive fishing lure on something not far offshore, while casting in Croatan Sound off Roanoke Island, and I waded in barefoot to try to retrieve it – the water temperature did not exceed 4°C (40°F.) It took two attempts and became pretty painful, turning my lower legs beet red, but I got the lure and recovered quickly. I still find the people who go swimming in freezing weather to be morons, though.]

New, untouched silt and debris were distributed onto the path at higher elevations than this as well, indicating that the water level had been at least a half-meter higher, and probably more like a meter, sometime the previous night. It’s difficult to predict how hard this is on the local animal life; I know beavers often live in hollows in the banks, and water snakes are common, but both of these need air and can be drowned in their dens if the water level traps them within. Animals such as deer that venture into the rapids can easily be swept away, and the debris that is carried can be pretty dangerous even to animals that can handle the turbulent water.

There’s not much else going on. I haven’t been tackling any philosophical ideas recently, and I have a number of posts in draft form but nothing I feel too motivated to finish. It goes that way sometimes, and while I occasionally feel bad for not putting up new content, I also made up my mind long ago not to post for the sake of posting (whether I’m succeeding in that resolve remains to be seen, I suppose.)

Even the arthropods have been fairly scarce. The mantids dispersed quickly throughout the yard, and I occasionally spot one but they’re still shy about close contact; at their size, anything that shows detail at all is close contact, so good shots are tricky.

One exception is a variety of treehopper that has descended on the erupting dog fennel plants. I had to check just now to determine what the difference was between treehopper and leafhopper; basically, treehoppers look like thorns while leafhoppers look like buds or seeds. I also had to add both words to my computer’s dictionary so it would stop highlighting them as misspelled; it seems to think either should be two separate words, or hyphenated at least.
Entylia carinata
Entylia carinata and stem damage
This is an Entylia carinata, no apparent common name, and a significant number appeared on the plants overnight, it seems. They run about 5mm in length, and have a tendency (like all ‘hoppers) to scuttle around to the back side of the stem when someone leans close, so it took quite a few tries to get a nice shot. The pic at right, in fact, is one I consider a ‘miss’ except for one thing: I’m pretty sure the discoloration of the plant stem beneath the treehopper is from the damage that they do while sucking out the sap. The amount of nutrients they extract from the sap is minimal, so they draw a lot and process it through their systems pretty quickly, excreting the rest as ‘dew’ that is often harvested by ants. No ants were taking advantage of these, however, even though the species is known as a favorite of theirs. I’ll keep my eyes open, since while I have a few images of ants farming leafhoppers and aphids, I’d still like some more detailed examples.

I’ll close, despite my disparaging comments above, with a mantis image. It might seem strange, but different individual insects can display different ‘personalities,’ or to be more accurate, varying responses to the same stimulus. What this means is some of the mantids from the same hatching are quite spooky and go for cover if I make the wrong move, while at least one other seems pretty tolerant of me leaning close with the camera and flash/softbox rig. Since this one has moved to a patch of some ornamental grass next to the rosemary bush, I am confident that I’m encountering the same individual, who has now taken on a more distinctive green hue. While larger, this one is still only 20 mm in overall length, so you can imagine how small the head is. In contrast, the sibling who has moved to the dog fennel plants is far more circumspect, and hasn’t allowed any decent images at all, much less a menacing portrait of this nature.
Chinese mantis portrait

Spoke too soon

I was looking at the nest box only minutes ago, and it appeared the parents were still trying to feed their nonexistent young, so I decided to see if the snake was still present. There was a small surprise waiting therein.
surviving bluebird fledgling
Go back down and look at that other image. Do you see any hint of this guy in there? Yeah, me neither, through multiple frames too. But it did explain why I thought I’d heard a peep while outside the nest box, shining my flashlight through the opening – after finding nothing but the snake, I put the sound down to the mother nearby. Somewhere under or among those coils, however, sat this little sprog, apparently a helping the snake couldn’t stomach. Either that or he fought his way out…