Home At Last

Featured Photo by Esther Merbt from Pixabay [edited]

Have you ever had this sensation?

You’re about to walk someplace, maybe as a young child, maybe with a parent or sibling, feeling absolutely glum, maybe even dismal, because “… it’s so far away!… we’ll have to walk so long!… it’s going to take forever!” The whole way there, it takes ages, like one long walking wait.

But once you finally get there and do your thing, and leave again for home, the walk back feels nowhere near as long, or as daunting. I remember one explanation was that we encounter all the same things on the way home in reverse order – a fence, a tree, a crack in the sidewalk. We reach the most recent of them sooner, obviously, but they also seem to arrive more quickly because they’re the freshest memories. Then, before you know it, there we are again, home at last.

Image by Dimitris Vetsikas from Pixabay

… yeah, who knows. But I do know I’ve had that sensation plenty of times, where the journey back felt way faster than when we first set out, and home nowhere near so forever-far away.

Another explanation is that, when we first set out, a whole adventure lies ahead, and our imaginations have room to breathe and explore the unknown. This one rings true in both directions, there and back, which is actually why I don’t buy it… if it’s such an adventure, then why all the dread and pre-walk fatigue and wishing we were already, finally, there? Why do things one way feel like forever, but the other way seem so quick?

I got to wondering about another idea… how the future, looking forward, can seem so far away, while the past, looking back, can feel like just yesterday.

Photo by Nati from Pexels

I know what you’re thinking, and you’re right: the past really is just yesterday. But I mean the distant past, which can feel so recent, particularly as we get older. There we are one day, when suddenly – whoosh – it’s all behind us. All sorts of thoughts arise, looking back… ‘it just goes so quickly’… ‘if I could do it all over again’… ‘I wish I knew then what I know now’… all those thoughts, and emotions too, which we sometimes call ‘regret’ and sometimes ‘wisdom’, and which only arrive as we look back from where we came.

As we look forward, “…ages from now” or “…in a few thousand years,” the future just seems so forever-far away, though there’s also a reverse effect… say, when some local business tries to invent tradition by leaning on – wow – a whole “quarter century.” No question, time scales in the hundreds and thousands consume lifetimes. Yet I’ve also had days as an adult when even “… next month” felt like the distant future. Days like that, looking back at fleeting life, I might happily wish I really was back walking to some faraway place with a parent or sibling – then, at least, I’d be looking forward not to the weight of ages but only to the walk back home.

Memories Go Fourth

Featured Image by PublicDomainPictures on Pixabay

Do you click hyperlinks? This post is full of them: click them, follow them, read them. Or else search the topics and find links of your own, but get started doing something to learn more.

A year ago, I was reminiscing… three memories: inflation, gold, and a profligate economy founded upon debt, which only sounds like an oxymoron.

… Grandpa here yet?

One more memory, along those lines. Mid ’70s, probably pre-Star Wars, probably a Saturday, and Grandpa’s over for dinner, always a minor occasion. With front-door greetings over, coats taken, dogs settled, and drinks well in hand, the family sits together around the living room, as ever talking and remembering and passing the time.

Grandpa could slice politicians like birthday sponge cake

This particular afternoon, though, the tone is vexed… that special incredulity-slash-resignation only politicians can inspire. This afternoon, the topic is the value of a dollar or, rather, value past now lost: how much a dollar no longer buys, and the daft decision-makers who don’t or won’t comprehend, much less accept, their own responsibility – not only for here-and-now but come the inevitable reckoning.

Bonus memory… remember this one? Probably from the ’90s… your RSP advisor has you signing a document that reads, Past performance is no indication of future performance, and while you scrawl, they finish their spiel by saying, “…but historically, the stock market has always tended to rise.” And insofar as that was accurate, it was also mere dissembling unless adjusted for inflation.
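To see why the adjustment matters – a minimal sketch with made-up numbers, nothing from any actual market – the real return backs inflation out of the nominal one:

```python
def real_return(nominal, inflation):
    """Fisher relation: strip inflation out of a nominal rate of return."""
    return (1 + nominal) / (1 + inflation) - 1

# Hypothetical figures: an 8% nominal market gain during a year of
# 6% inflation leaves well under 2% of purchasing power actually gained.
gain = real_return(0.08, 0.06)
print(f"{gain:.4f}")
```

Quote the nominal figure long enough against inflation like that, and “the market has always tended to rise” says rather less than it seems to.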

As snow on the ground is not the weather, ‘rising prices’ are not inflation. Insofar as inflation is the issue, prices don’t actually rise; rather, the currency’s purchasing power falls, and more dollars must accumulate to buy the same item as before – which looks like prices getting higher but is not the same as costs rising against a sound measure.

If today’s runaway inflation has not been headed our way the past fifteen years, then how about the past fifty? A few months after I was born, the 37th US President responded to runaway inflation, which by then of course had been politicised. Two features of his new economic policy were an end to the fixed-exchange currency arrangement, in existence since 1944, which by then of course had been politicised, and a shocking yet not shocking halt to the international convertibility of the US dollar into gold. Perhaps no shock, none of this is counted among the 37th Administration’s accomplishments, whether you look here, here, or even here, although it is briefly mentioned here.

At least grant the past two years, yet the ‘runaway’ bit, now as in the ’70s, is simply the amplified effect of machinations underway for the past 109.

M2 Money Supply still matters. Chart: https://fred.stlouisfed.org/series/M2 (although don’t bother checking on May 2–3, 2022)

Before the Federal Reserve Act of 1913, inflation meant an expansion of the money supply – these days, an increase in the supply of currency units: dollars, renminbi, rupees, whatever. One effect of inflation is a definite fall in purchasing power, each currency unit diluted by the creation of more and more and more of them. Prices ‘rise’ driven by currency debasement: they seem to rise because their numerical values get higher, so people spend more for the same, or spend less and have less. Another effect is the expectation everybody gathers that prices will get higher still. Prices ‘getting higher’ or prices ‘rising’ is my way to distinguish what economists call nominal from what they call real. Credit 20th-century Fed-era economists, all tied up in knots over the determinants of demand-pull and cost-push and all manner of academic-importance speech, which I guess is the mission-creep you’d need if there were a vacuum of cultural memory that needed filling – as easy as money from thin air, or playing god with people’s lives and livelihoods, or hubris.
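That nominal/real split can be sketched in a few lines – illustrative numbers only, no actual price or index data assumed:

```python
def real_price(nominal_price, index_base, index_now):
    """Deflate a nominal price back to base-period units via a price index."""
    return nominal_price * index_base / index_now

# A loaf at $1.00 when the index stood at 100, and $3.00 when it stands
# at 300: the nominal value has tripled, the real price hasn't budged.
loaf_then = real_price(1.00, 100, 100)
loaf_now = real_price(3.00, 100, 300)
print(loaf_then, loaf_now)
```

Nominal values ‘getting higher’ while the real measure sits still is the debasement effect in miniature.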

How little of this actually is a cultural memory anymore, much less a family one, might be telling. Something I remember my Dad used to say all the time – he said it that Saturday in the living room, and I remember my Grandpa agreed: “It isn’t that people don’t know; they simply don’t want to know.”

There’s a lot riding on partial perspectives – everything, you could say – so, what… let it ride?

Or look into things. More thoroughly, on your own terms, whether you’re curious or not. Suppose you even come to understand the world more than you thought you did, or more than you remembered.

Can you tell which one was the watchdog?

On Bias: I. Disparate Bias

Featured Image Credit: Photo by Obelixlatino on Pixabay

Various dictionaries define bias as a tendency or inclination: usually preconceived, sometimes unreasoned, and typically unfair; an inherent if not intentionally irrational preference. Partiality, expectancy, perception… such words may be synonymous, but if they meant exactly the same thing, wouldn’t we use exactly the same word? So what exactly is bias?

Sociologist Jim Mackenzie, ostensibly on behalf of teachers, introduces bias as “something we all deplore.” He associates bias with truth, in a negative way where bias leans toward its own ends, and with justice where bias unchecked is distinctly unfair. Still, his ensuing analysis of two “images” is instructive. First, bias indicates something to be gotten rid of… imagine an excess of prejudice, or a distortion or “impurity in a lens… that prevents us from seeing things as they really are” (p. 491) which, you’ll notice, leaves an alternative of being unable or unwilling to get rid of it. Second, as a void or deficit to be filled, or an insufficient consideration or partial blindness by which we’re unable to see what’s already there, or even see any alternatives, bias indicates something to be gained which, again, leaves open the inability or unwillingness to gain it.

Before going any further, I’d be remiss to overlook my own word choice, summarising Mackenzie: both times, you’ll see I wrote “… bias indicates, which leaves….” Something indicated is pointed out, presumably against criteria or else how would you know to single it out? And for leaving an open alternative from which to choose, such criteria would be definite, or else why not assert amidst ambiguity? Anyway, this is the reasoning for my word choice, and you can take it or leave it, as you will.

Considering his first image, an excess of prejudice, Mackenzie cites Edmund Husserl and the “full intellectual freedom” needed to “break down the mental barriers which [our habits of thought] have set along the horizons of our thinking” (p. 43) – by today’s messaging, ‘open-mindedness is a hard ask’. Maybe so, but set against this is Husserl’s steadfast urgency that “nothing less is required.”

Husserl’s prescription reminded me of Sir Francis Bacon, who would have us overcome what he called the four Idols of the Mind that “imbue and corrupt [our] understanding in innumerable and sometimes imperceptible ways”:

The human understanding is most excited by that which strikes and enters the mind at once and suddenly, and by which the imagination is immediately filled and inflated. It then begins almost imperceptibly to conceive and suppose that everything is similar to the few objects which have taken possession of the mind…

… although the greatest generalities in nature must be positive, just as they are found, and in fact not causable, yet the human understanding, incapable of resting, seeks for something more intelligible. Thus, however, while aiming at further progress, it falls back to what is actually less advanced, namely, final causes; for they are clearly more allied to man’s own nature, than the system of the universe, and from this source they have wonderfully corrupted philosophy.…

The human understanding resembles not a dry light, but admits a tincture of the will and passions, which generate their own system accordingly; for man always believes more readily that which he prefers. He, therefore, rejects difficulties for want of patience in investigation; sobriety, because it limits his hope; the depths of nature, from superstition; the light of experiment, from arrogance and pride, lest his mind should appear to be occupied with common and varying objects; paradoxes, from a fear of the opinion of the vulgar; in short, his feelings imbue and corrupt his understanding in innumerable and sometimes imperceptible ways. (pp. 24–26)

My guess would be no love lost for Bacon, these days, which is nothing if not ironic.

Husserl also reminded me of one of my professors, Dr. William Pinar, whose concept of “reactivation” would seem to grant history just that wee bit more consideration as we set about our ideas and memories, revisiting if not revising them with more intention and, presumably, greater awareness. All these together – Bacon, Husserl, Pinar – I see reflecting Mackenzie’s second image, a deficit of alternatives, not something to be corrected or removed but rather a void to be filled, an absence noted, an allusion by Joyce to the “gnomon in the Euclid” (p. 1). What luck for Mackenzie, being so well represented (thanks of course to me).

And lucky for all of us to be surrounded by the greatest unfilled void imaginable, a universe of limitless time and space: “… a pretty big place!” to quote Dr. Arroway, and certainly big enough to surpass any no-worries belief that, nah, we have it all well in-hand. Mackenzie’s implication is that everyone’s necessarily biased for being finite. Sure, we continually develop new understandings across spaces over time, but short of real omniscience, as if we might observe the Earth’s sphere from its surface – so, make that well short – who could possibly come to know all there is to know? Our limit is our bias although I find a lot of people seem to get this reversed. Then again, pride and prejudice pair up for box office mojo that wisdom and humility would hardly dare to dream.

Our limit is our bias although I find a lot of people seem to get this reversed.

Bias arises inevitably from… call it what you want: our nature, a state of being, Dasein. We’re inescapably subject to it, beset and enamoured by it. Its cumulative effects inflect our cultural systems and institutions while remaining, as Heikes says, invisible to everyone involved, like the water to those oblivious fish. This may be why Mackenzie sets the “onus of proof” for demonstrating bias, be it misunderstanding or insufficient consideration, upon “the person who claims that something is biased, for that is provable,” i.e. hey look! something more, something else, something different. As for demonstrating that someone has fully completely understood or utterly thoroughly considered a matter, and therefore is unbiased: this remains impossible although more and more our cultural infatuation is to start your impossible and put those pesky finite limitations to the sword. Time to fly on waxen wing and silence father’s voice.

Characterising limit as the antagonist is not our only option although for raising hackles, or for squeamish sentiments like Mackenzie’s introduction, I guess bias was inevitably doomed to be the dagger of our mind. Still, somewhat less drastic is MacMullen, who casts bias as more benign prejudice “that exhibits resistance to rational criticism” wherein, I suppose, the patient must minister to himself. As it happens, I have a teacher bias that’s comfortable with MacMullen’s perspective, particularly where he cuts to the core debate that has faced biased educators through the ages: what should comprise the curriculum? what should we teach because what is worth learning? what is school for? Whatever our response to any of this, surely it’s not to cap our limits but to stretch them.

“We must decide,” MacMullen counsels, “whether, when, and how to expose children to (what we take to be) the most powerful critiques of and significant alternatives to our existing political order.” Critical thinking powers activate – that goes for you too, Critical Theorists, so I hope you brought enough for everybody.

Hey, though, one look at the pantheon of scholars and heroes in this post should be all anyone needs, as compared to MacMullen’s thing about (what we take to be) critiques and alternatives to our existing political order, or any order really… since when did the root of all debate become a mere parenthetical?

Click here to read On Bias: II. Wrong Bias?
