What On Earth Were They Thinking?

How often have you heard somebody question people who lived “back then,” in the swirling historical mist, who somehow just didn’t have the knowledge, the nuance, or the capability that we so proudly wield today?

“Things back then were just a lot more simple.”

“They just weren’t as developed back then as we are today.”

“Society back then was a lot less informed than we are today.”

It’s difficult to even confront such statements out of their context, but where I’ve had all three of these spoken to me in (separate) conversations, I challenged each impression as an insinuation that we’re somehow smarter than all the peoples of history, more skilled, more sophisticated, more aware, more woke (as they say, these days), that we’re, in the main, altogether better than our elders merely for having lived later than they did. These days, apparently, “we’re more evolved” – ha! 🙂 more evolved, that’s always a good one, as if back then everyone was Australopithecus while here we are jetting across the oceans, toting our iPhones, and drinking fine wines. Well, sure, maybe things have changed since back then, whenever “then” was. But, more typically I’ve found, contemporary judgments levelled upon history are born of an unintended arrogance, a symptom of 20/20 hindsight and the self-righteous assurance that, today, we’ve finally seen the error – actually, make that the errors – of their ways.

Surely, these days, few – if any – would believe that we’re ignorant or unaccomplished or incapable, not the way someone might say of us while looking back from our future. At any given point on the historical timeline, I wonder whether a person at that point would feel as good about their era, looking back, as any person on some other point of the timeline would feel, also looking back, about theirs. Is it a common tendency, this judgment of contemporary superiority? These days, we might well feel superior to people who had no indoor plumbing or viral inoculations or Internet access, just as someone earlier would have appreciated, say, some technological tool, a hydraulic lift to raise heavy objects, or a set of pulleys, or a first-class lever – choose whatever you like! It really is arbitrary for a person, looking back at history, to feel better about contemporary life because contemporary life infuses all that person’s experience while history’s something that must be learnt.

I’d say that learning history means learning whatever has lasted and been passed on because what has lasted and been passed on was deemed to have merit. We’re taught the history that has been deemed worth remembering. What I’ve found to have been deemed worth remembering (i.e., the kinds of things I learned from History classes) is the general mood of the times, the disputes and fights that occurred (violent or academic), a select handful of the figures involved, and an inventory of whichever non-living innovations and technologies arose alongside it all. If, later, we demerit and no longer pass on what has lasted up until then, passing on instead some different history, then that’s entirely indicative of us, now, versus anyone who came before us, and it reflects changed but not necessarily smarter or better priorities and values.

For me, we shouldn’t be saying we’re any smarter or better, only different. So much literature has lasted, so much art. Commerce has lasted, institutions have lasted, so much has lasted. Civilization has lasted. Cleverness, ingenuity, shrewdness, wit, insight, intellect, cunning, wisdom, kindness, compassion, deceit, pretence, honesty, so many, many human traits – and they all transcend generations and eras. People vary, but human nature perdures. I’ll trust the experts, far more immersed in specific historical study than I am, to identify slow or subtle changes in our traits – hell, I’ll even grant we may have culturally evolved, after all – and I can only imagine in how many ways the world is different now as compared to before now. But what does it mean to be better? Better than other people? Really? And who decides, and what’s the measure?

We can measure efficiency, for instance, so to say technology has advanced and is better than before is, I think, fair. Even then, an individual will have a subjective opinion – yours, mine, anybody’s – making culture not proactive and definitive but rather reactive and variable, a reflection, the result of combined opinions that amplify what is shared and stifle what is not. As we’re taught merited history, we’re almost forced to concur, at least until we reconsider what has merit. That’s a sticking point because everyone will have an opinion on what is culturally better and what is culturally worse. Morality inevitably differs, and suddenly we have ethical debate, even disagreement, even discord. But to say people or culture are better, I think, is too subjective to rationalize and a questionable path to tread.

Consider this as well: we each know what we’ve learned, and as individuals, we’ve each learned what we value. But what we’re taught is what’s broadly valued and, thereby, prescribed for all. We’ve all heard that rather hackneyed epigram, that those who neglect history are doomed to repeat it. Well, maybe the ones screwing up just didn’t learn the right history to begin with. I tend to abide by another hackneyed epigram, that they are wisest who know how little they know. Real historical wisdom and real historical understanding would be seeing and thinking and understanding as people earlier did. But short of firing up the DeLorean for an extended visit some place, some time, it seems to me that judgments about history are made with an aplomb that might be better aimed at acknowledging our finite limitations. We’re no angels. If anything, this error of judgment speaks volumes about us. Condescension is what it is, but in my opinion, it’s no virtue.

We should hardly be judging the past as any less able or intelligent or kind or tolerant or virtuous than we are, especially not if we aim to live up to today’s woke cultural embrace of acceptance. Being different should never be something critiqued; it should be something understood. Conversely, in proportion to how little we know, passing judgment is assumptive, and we all know what Oscar Wilde had to say about assuming (at least, we know if we’ve studied that piece of history). At the very least, we ought to admit our own current assumptions, mistakes, errors, accidents, troubles, disputes, and wars before we pass any judgment on historical ones.

On that positive note, I will say that considering all this has prompted me to notice something maybe more constructive: so often, at least in my experience, what we seemed to study in History class were trouble-making causes and effects, bad decisions, and selfishly motivated behaviours. Far more rarely was History class ever the study of effective decision-making and constructive endeavour – maybe the odd time, but not usually. Maybe my History teachers were, themselves, stifled as products of the system that educated them. What could they do but pass it along to me and my peers? Considering that, I might more readily understand how people, alive today, could conclude that all who came before were simply not as enlightened, as sophisticated, or as adept as we are now.

Yet that merely implicates contemporary ignorance: assumptions and mistakes still happen, errors still occur, accidents – preventable or not – haven’t stopped, troubles and disputes and wars rage on. If the axiom of being doomed to repeat history were no longer valid, we wouldn’t still feel and accept its truthful description, and it would have long ago faded from meaning. All I can figure is that we’re still poor at learning from history – the collective “we,” I mean, not you in particular (in case this essay was getting too personal). We need learned people in the right positions at the right times, if we hope to prevent the mistakes of history. Not enough people, I guess, have bothered to study the branches of history with genuine interest. Or, no, maybe enough people have studied various branches of history, but they don’t remember lessons sharply enough to take them on board. Or, no no, maybe plenty of people remember history, but the circumstances they face are just different enough to tip the scales out of favour. Epigram time: history doesn’t repeat, but it does rhyme. Or maybe we’re just full of ourselves, thinking that we’ve got it all solved when, evidently, we don’t.

It also dawned on me, considering all this, that high school “History” influences what many people think about broader “history.” My high school experience, university too, was mostly a study of politics and geography, and toss in what would be considered very elementary anthropology – all this as compared to those other branches of historical study. Archaeology and palaeontology come to mind as detailed, more scientific branches of history, but there are so many – literary history, philosophical history, religious, environmental, military, economic, technological, socio-cultural as I’ve already indicated, on and on they go, so many categories of human endeavour. I’ve even come across a thoughtful paper contemplating history as a kind of science, although one that is normative and susceptible to generational caprice. One final epigram: history is what gets written by the winners, which some will rue, some will ridicule, and some will call “unfair,” but which I will simply acknowledge as what people evidently do.

And that’s really the point here: throughout what we call human history, where we’ve subdivided it so many ways – right down to the perspective of every single person who ever bothered to contribute, if you want to break it down that far – it’s people all the way back, so it’s all biased. So it’s neither complete nor even accurate until you’ve spent oodles of time and effort creating a more composite comprehension of the available historical records. And, dear lord, who has time for that! History, in that respect, is barely conceivable in its entirety and hardly a thing to grasp so readily as to say, simply, “Back then…” History is people, and lives, and belief inscribed for all time. To know it is to know who lived it as well as who recorded it. Knowing others is empathy, and empathy is a skill trained and fuelled by curiosity and diligence, not emotion or opinion. Emotion and opinion come naturally and without effort. For me, valid history derives from informed empathy, not the other way around.

As far as recording history for future study, ultimately, it will have been people again recording and studying all of it, “it” being whatever we care to remember and record about what somebody was doing, and “doing” being all of what people were doing to attract the attention of those doing the recording. It’s all a bit cyclical, in itself, and completely biased, and someone will always be left out. So people might be forgiven when shaking their heads in judgment of the past because, without empathy, what else could they possibly know besides themselves?

Catch-22: A Masterpiece by Joseph Heller

WARNING! This post is an analysis and celebration of Joseph Heller’s novel, Catch-22, and it DOES contain PLOT SPOILERS. If you wish to read the novel for the first time, do not read this post.

“Give the first twelve chapters a chance” has long been my advice to anyone who asks about Catch-22, Joseph Heller’s modernist masterpiece that critiques the absurdity of the military during wartime. If you haven’t read the book, I will hardly spoil things by explaining how eagerly we witless first-timers set out to read such a lauded modern classic, only to be confronted by what might be the most frustrating paragon of show-versus-tell in existence. (However, I will be discussing spoiler details from here on, so be warned.) From the seemingly disparate chapter titles to the disjointed narrative, which repeatedly folds back upon itself, from a maddeningly mirthful plot device, which tempts you to toss the book aside and deny its existence, to an irresolute closing – if you make it that far – the book continually challenges readers to deduce what’s happening and piece together what’s happened. Toss in what seems like an endless cadre of characters, ranging from odder to oddest to perhaps not so odd, and the book is a challenge, no question.

For seven years, I assigned this book as summer reading for returning seniors. Oh, how the students complained about those twelve chapters – excessive! pointless! irritating! – only to feel more aggrieved at hearing, “Exactly,” my necessary reply. Once the venting subsided – usually at least half the first lesson – we’d begin discussing why Heller’s book could only be written this way as compared to some more conventional, accessible way.

For one thing, we need to meet the protagonist, Yossarian, and understand his circumstances so that, at appropriate upcoming times, which of course will have already occurred, we won’t criticise but will instead favour him. To this end, the entire story is told out of sequence, opening apparently in medias res during Yossarian’s hospital stay. We have character introductions and letter censoring, foreshadowing how words and language will be manipulated while characters will be isolated, alienated, and demeaned. Subsequently, we learn the logic of Catch-22 from Doc Daneeka. And that Snowden dies. If we’ve navigated the twelve opening chapters and lived to tell about it, we learn that Yossarian, originally a young, excited airman, once needed two passes over a target in order to bomb it successfully, which gets his crewmember, Kraft, killed. Yossarian is further distressed upon returning when he receives a medal for the mission. Meanwhile, Milo opens his syndicate. The tension of tedium, the injustice of fortune. The folly of command, the depravity of humankind. Capping the story is the gruesome account of Snowden’s death, the key incident that incites Yossarian’s fear and lands him in hospital, where we first meet him – naturally, Heller waits until the end to tell us the beginning.

Heller writes with an absurd, illogical narrative style that characterises Yossarian’s internal, eternal predicament, wending its way through isolation, alienation, discord, misery, paranoia, fear, senselessness, deception, vice, cruelty, even rape and murder. Catch-22 being what it is, its victims have zero chance to overcome because the antagonists are permitted to do whatever the protagonists are unable to prevent. All along the way, Heller has Yossarian wanting out of the military (fly no missions = live), and he continually ups the ante between Yossarian and all the disturbing confrontations and contradictions that antagonise him, from his enemies and his commanders to his acquaintances and his comrades. But ultimately, and most potently, he has Yossarian suffering from his own self-interest. As the narrative flits and tumbles about, in its own progressive way, Yossarian’s self-interest evolves or, better to say, devolves. What does evolve, inversely to self-interest, is his compassion as he gradually grows more concerned for the men in his squadron, which by Chapter 40, “Catch-22,” has extended to all innocent people beset by oppression, prejudice, and exploitation. So when Colonel Cathcart’s promised deal to send him home safely, definitely, comes ironically (fittingly!) at the expense of the squadron, Yossarian ultimately recovers enough self-reliance to overcome his personal anguish but not enough to remand himself to the cycle of absurdity. Given Heller’s dispersed timeline, describing Yossarian’s character development as a narrative arc or an evolution is less accurate than the piecing together of a jigsaw or the unveiling of a secret.

Perhaps unsurprisingly, Yossarian’s instinct for self-preservation is the source of his personal torment. His despondency and disgust over the preponderance of human self-interest finally cement Yossarian’s decision to go AWOL, at criminal risk but toward personal safety. Such a climax works because readers – like Yossarian – are no longer fighting back but giving in, yet even then Heller offers no respite – the story ends ambiguously, leaving readers to satisfy their own vexation. Even so, I suspect that Heller appreciated John Chancellor’s life-imitating-art initiative as one inspired by more than a spirit of fandom. So where some characters have been subjects of compassion, others agents of absurdity, readers’ resultant responses have also undergone a perfectly natural evolution, mirroring Yossarian’s character development and culminating with his terrifying walk through Rome. The horrors of “The Eternal City,” in this light, are not only an essential but an inevitable piece in Heller’s plan.

Yossarian’s shall-we-say militant decision to desert is born of Snowden’s ugly death during the Avignon mission, only a week after the death of Kraft and the award of Yossarian’s medal. Seeing Snowden’s innards spilling rudely from his body nauseates Yossarian and haunts him throughout the entire story (or, from Yossarian’s perspective, for the rest of it). Yossarian, cast by Heller as a protagonist on behalf of soldiers, has no way of making things better. His futile effort at comfort, “There, there” (p. 166), is comically insincere for its honest helplessness, an understated shriek from all soldiers continually sent to face death – not death without context but without resonance. However, for Yossarian and his comrades, the context of sacrifice is all too irrationally clear: thanks very much. Catch-22. Soldiers face the dilemma of following orders that entirely devalue their very existence.

Participation as a soldier offends Yossarian to the core, yet it also helps him to reconcile his fear over death: “… man is matter,” finite, mortal and – without spirit – simply “garbage.” In fact, this sentence sums up human worth in a blunt statement: “The spirit gone, man is garbage” (p. 440). Six words of sad, harsh consequence: war, no longer wearing a comic mask. The absolute phrase, a terse syntactical effect, annuls man’s significance – spirited briefly, gone abruptly, an empty corporeal body left over, garbage. Garbage is a harsh image – rotting flesh, buzzing flies, scum, residue, stench. Pessimism, cynicism, worthlessness. On such terms, one wonders whether anyone might willingly die to save themselves, as it were, another troubling revelation engineered by a masterpiece of unprosaic illogic. Yet even on this point, Heller’s genius is flawless. Haunting though it is, Snowden’s death gradually reveals to Yossarian the very path to life and safety that he has pursued ever since the opening chapter in the hospital – which is to say, ever since Snowden’s death drove him there in the first place.

This is why Heller refers to Snowden’s death, specifically his entrails, as a “secret” because to reveal it any earlier would be to end the novel. And he calls it Snowden’s “grim secret” to illustrate Yossarian’s suppressed mental anguish. Heller has Yossarian recall Snowden a number of times, each admitting more detail, each growing more vivid, each driving him a little closer to his final resolution. Heller’s portrayal of Yossarian’s traumatised memories in this way suggests the nightmarish flashbacks that people, particularly soldiers, endure following the horrors of war. His final flashback in Chapter 41, “Snowden”, is prompted when Yossarian wards off the mysterious stranger in – where else? – the hospital. It’s most revelatory for Yossarian – and readers, by extension – because, here at the end of his patchy, appalling flashbacks, he is finally secure enough to divine for himself – or is it to admit to us? – the grim secret found in Snowden’s entrails. In the same way, the climax is most revelatory for readers who – at the mercy of Heller’s dispersed narrative structure – have been made to wait until the closing, when the time is finally ripe.

To get there, we are dragged unwittingly by Heller down a path of frustrating sympathy, illogical absurdity, and agonising anticipation. By the time Yossarian is introduced (in the opening chapter!) censoring letters and conniving a way to escape the war, he is that much nearer to desertion than we can yet know. Certainly, Snowden will convince us to desert as surely as he convinces Yossarian, but that will happen later, after Heller has aggravated our tolerance and mottled our innocence. Heller must drag us down Yossarian’s agonising path, or else he places us at risk of passing premature judgment upon not merely his protagonist but his entire message. Finally, when the moment arrives that we gather full appreciation of Snowden’s death, we have all we need to share in the vindication of Yossarian’s desertion.

So here is our way to grasp the grim secret behind the novel’s dissembling structure as restlessly and imperturbably as Yossarian does: the root of conflict, Snowden’s death, can only occur at the end of Heller’s narrative path, not Yossarian’s. The story simply works no other way.

Suppose Trivial Grammar Were Actually the Road Map

Satire compounds disagreement, and ridicule has a way of locking closed doors

Sometimes, the hardest part of teaching felt like finding a way to reach students when they just didn’t get it. But if there’s one thing I learned while teaching, it’s that it takes two. In fact, the hardest part of teaching was coming to realise it wasn’t them not getting it, it was me not getting them.

In my own defense, I think we just never can know what another person’s motive truly is. It was times like that when I realised the true constructive value of respect and a good rapport. To have any hope of being open-minded, I intentionally needed to respect my students’ dignity, and I needed to be more self-aware as to how open- or closed-minded I was being. Humility has that way of being, well, humbling. These days I’m still fallible but a lot better off for knowing it. And, yes, humility’s easier said than done.

Over sixteen marvellous years teaching secondary English in a high school classroom, I learned that teaching is a relationship. Better still, it’s a rapport. I learned that it takes two, not just hearing and talking but listening and speaking in turn, and willingly. And, because bias is inescapable, I learned to consider a constructive question: what motives and incentives are driving anyone to listen and speak to anyone else?

It’s a question with an admittedly unscrupulous undertone: what’s in it for me, what’s in it for them, who’s more likely to win out? The thought of incentives in high school likely evokes report cards, which is undeniable. But where listening (maybe speaking, too) to some degree means interpreting, what my students and I valued most was open-minded class discussion. With great respect for our rapport, we found the most positive approach was, “What’s in it for us?” The resulting back-and-forth was a continual quest for clarity, motivated on everyone’s behalf by incentives to want to understand – mutual trust and respect. Looking back, I’m pleased to say that tests and curricula seldom prevented us from pursuing what stimulated us most of all. We enjoyed very constructive lessons.

Of course, we studied through a lens of language and literature. Of particular interest to me was the construction of writing, by which I mean not just words but the grammar and punctuation that fit them together. My fascination with writing has been one of the best consequences of my own education, and I had encouraging writing teachers – and one very demanding one. In the classroom and on my own, I’ve always been drawn to structure as much as content, if not more so, which isn’t unorthodox although maybe not so common. The structure of writing gets me thinking on behalf of others: why has the writer phrased it this certain way? What other ways might be more or less well-suited for this audience? How might I have phrased something differently than this writer, and why? Most English teachers I know would agree that pondering such questions embodies a valuable constructive skill, these days trumpeted as critical thinking. I’d argue further that it’s even a pathway to virtue. Situated in context, such questions are inexhaustible, enabling a lifetime of learning, as literally every moment or utterance might be chosen for study.

In this respect, in my classroom, we loosely defined text beyond writing to include speech, body language, film, painting, music, architecture – literally any human interaction or endeavour. I’ll stick mostly with listening and speaking, reading and writing, just to simplify this discussion. The scope being so wide, really what our class sought to consider were aim and intention. So when students read a text for content, the WHAT, I’d ask them to consider choices made around vocabulary, syntax, arrangement, and so forth, the HOW. That inevitably posed further questions about occasion and motive, the WHY, which obliged varying degrees of empathy, humility, and discernment in reply: for a given writer, how best to write effectively on a topic while, for a given audience, what makes for skillful reading? What motives are inherent to each side of the dialogue? What incentives? These and others were the broader-based “BIG Question” objectives of my courses. They demanded detailed understanding of texts – heaven knows we did plenty of that. More importantly, the BIG Questions widened our context and appreciation even while they gave us focus. When times were frustrating, we had an answer for why studying texts mattered. Questions reflect motivation. Prior to exercising a constructive frame-of-mind, they help create one to be exercised.

Questions, like everything else, also occur in a particular context. “Context is everything,” I would famously say, to the point where one class had it stencilled for me on a T-Shirt. So much packed into those three plain words – everything, I suppose. And that’s really my thesis here: if we aim to be constructive, and somehow do justice to that over-taxed concept, critical thinking, then we need to be actively considering what we hear and say or read and write alongside other people, and what it all makes us think for ourselves – especially when we disagree. (Is active thinking the same as critical thinking? I’m sure the phrase is hardly original, but I’ll consider the two kinds of thinking synonymous.) During my last 3-4 years in the classroom, all this came to be known by the rallying cry, “Raise the level of discourse!” These days, however, the sentiment is proving far more serious than something emblazoned on a T-Shirt.

I’m referring, of course, to the debacle that has been the 2016 U.S. Presidential election and its aftermath. Specifically, I have in mind two individual remarks, classic teachable moments inspired by current events. The first remark, from an NPR article by Brian Naylor on the fallout over the executive order banning Muslim immigrants, is attributed to the President. The second remark is a response in the comment section that follows Naylor’s article, representative of many commenters’ opinions. To begin, I’ll explain how something as detailed as grammar and punctuation can help raise the level of discourse, especially with such a divisive topic. From there, I’ll consider more broadly how and why we must always accept responsibility for this active language – sometimes correct grammar should matter not just to nit-pickers but to everybody.

In the article (February 8, 2017), Brian Naylor writes:

“Trump read parts of the statute that he says gives him authority to issue the ban on travel from seven predominantly Muslim nations, as well as a temporary halt in refugee admissions. ‘A bad high school student would understand this; anybody would understand this,’ he said.”

We all know the 45th U.S. President can be brusque, even bellicose, besides his already being a belligerent blundering buffoon. This comment was received in that light by plenty, me included. For instance, by classifying “bad” (versus “good”), the President appeals at once to familiar opposites: insecurity and self-worth. We’ve all felt the highs and lows of being judged by others, so “bad” versus “good” is an easy comparison and, thereby, a rudimentary emotional appeal. However, more to my point, his choice to compare high school students with lawyers, hyperbole or not, was readily construed as belittling since, rationally, everyone knows the difference between adult judges and teenaged students. That his ire on this occasion was aimed at U.S. District Judge James Robart is not to be misunderstood. Ironically, though, the President invokes the support of minors in a situation where they have neither legal standing nor professional qualification, rendering his remark not just unnecessarily divisive but inappropriate, and ignorant besides – although he must have known kids aren’t judges, right?

To be fair, here’s a slightly longer quotation of the President’s first usage of “bad student”:

“I thought, before I spoke about what we’re really here to speak about, I would read something to you. Because you could be a lawyer– or you don’t have to be a lawyer: if you were a good student in high school or a bad student in high school, you can understand this.”

Notice, in the first place, that I’ve transcribed and punctuated his vocal statement, having watched and listened to video coverage. As a result, I have subtly yet inevitably interpreted his intended meaning, whatever it actually was. Yet my punctuation offers only what I believe the President meant since they’re my punctuation marks.

So here’s another way to punctuate it, for anyone who feels this is what the President said:

“Because you could be a lawyer, or you don’t have to be a lawyer – if you were a good student in high school or a bad student in high school, you can understand this.”

Here’s another:

“Because you could be a lawyer. Or you don’t have to be a lawyer. If you were a good student in high school or a bad student in high school, you can understand this.”

Finally, but not exhaustively, here’s another:

“Because you could be a lawyer… or you don’t have to be a lawyer; if you were a good student in high school or a bad student in high school, you can understand this.”

Other combinations are possible.

Rather than dismiss all this as pedantry, I’d encourage you to see where I’m coming from and consider the semantics of punctuation. I’m hardly the only one to make the claim, and I don’t just refer to Lynne Truss. Punctuation does affect meaning, both what was intended and what was perceived. To interpret the President’s tone-of-voice, or his self-interrupting stream-of-consciousness, or his jarring pattern-of-speech, or whatever else, is to partly infer what he had in mind while speaking. We interpret all the time, listening not only to words but tone and volume, and by watching body language and facial expression. None of that is typically written down as such, except perhaps as narrative prose in some novel. The point here is that, in writing, punctuation fills part of the interpretive gloss.

Note also that a number of news headlines have inserted the word “even” – an interpreted addition of a word the President did not actually say. Depending upon how we punctuate his statement, inclusive of everything from words to tone to gestures to previous behaviour, perhaps we can conclude that he did imply “even” or, more accurately, perhaps it’s okay to suggest that it’s what he intended to imply. But he didn’t say it.

If we’re going to raise the level of discourse to something constructive, we need to balance whatever the President intended to mean by his statement against what we’ve decided he intended to mean. In the classroom, I put it to students this way: “Ask yourself where his meaning ends and yours begins.” It’s something akin to the difference between assuming (based on out-and-out guesswork because, honestly, who besides himself could possibly know what the President is thinking) and presuming (based on some likelihood from the past because, heaven knows, this President has offered plenty to influence our expectations). Whatever he meant by referring to good and bad students might be enraging, humbling, enlightening – anything. But only if we consider the overlap, where his meaning ends and ours begins, are we any better off ourselves, as analysts. Effective communication, like teaching and learning, takes two sides, and critical thinking accounts for both of them.

Effective, though, is sometimes up for debate, not merely defining it but even deciding why it matters. Anyway, can’t we all generally figure out what somebody means? Isn’t fussing over details like grammar more about somebody’s need to be right? I’d argue that taking responsibility for our language includes details like grammar precisely so that an audience is not left to figure things out, or at least so they have as little ambiguity to figure out as possible. Anything less from a speaker or writer is lazy and irresponsible.

In the Comments section following Naylor’s article, a reader responds as follows:

“Precisely describing Trump’s base…bad high school students who’s [sic] level of education topped out in high school, and poorly at that. This is exactly what Trump and the GOP want, a poorly educated populous [sic] that they can control with lies and bigoted rhetoric.”

Substantively, the commenter – let’s call him Joe – uses words that (a) oversimplify, blanketing his fellow citizens, and (b) presume, placing Joe inside the President’s intentions. Who knows, maybe Joe’s correct, but I doubt he’s clairvoyant or part of the President’s inner circle. On the other hand, we’re all free to draw conclusions, to figure things out. So, on what basis has Joe made his claims? At a word count of 42, what was he aiming to contribute? Some of his diction is charged, yet at a mere two sentences, it’s chancy to discern his motives or incentives, lest we be as guilty as he is by characterising him as he characterises the President. Even if I’m supportive of Joe, it’s problematic defending his remarks, for the same reason: they leave such a gap to fill. At 42 words, where he ends is necessarily where the rest of us begin, and maybe I’m simply better off ignoring his comment and starting from scratch. Maybe that’s fine, too, since we should all have our own opinions. In any event, Joe has hardly lived up to any measure of responsibility to anybody, himself included – here I am parsing his words months later in another country. I’d even say Joe loses this fight since his inflammatory diction and sweeping fallacy play to his opponents, if they so choose. Unsurprisingly, Joe’s comment is not at all constructive.

For all its faults, his comment aptly demonstrates the two-way nature of dialogue. On the one side, responsibility falls to each reader or listener to bring their research and experience, then discern for themselves what was meant. In that regard, Joe has left us with a lot of work to do, if we’re motivated enough to bother. Yet I chose his particular comment as mere illustration – literally hundreds of others, just as brief and labour-intensive, scroll by below Naylor’s article… so much work for us to do, or else to dismiss, or perhaps to gainsay, if not insult. On that note, consider the other side: responsibility falls to the speaker or writer to offer substantive claims as well as the evidence that prompted them. In this instance, no matter the justification for offering something at all, what can a two-sentence comment add to issues as complex and long-standing as, say, Presidential politics? Whether or not on-line comments are democracy in action, certainly offering 42 words in two sentences struggles to promote a meaningful, substantive exchange of ideas.

I used to liken such on-line comments, for my students, to standing in line and debating with others while waiting for coffee, before returning to our cars or our lives, none the more informed except perhaps annoyed by some while appreciative of others. With the best intentions, we might excuse people, overlooking that we’re the ones who walked out and drove away – maybe we were late for work that day. We’ve been closed-minded to the degree that we haven’t sought to reach a thorough understanding, and certainly we’ve failed to raise the level of discourse. Would it have been better to just say nothing, grab our coffee, and leave?

Yes, I think so, which may not be easy to accept. Conversely, consider that reasoning from presumption and enthymeme is not reasoning at all. Further, consider that two sentences of 42 words or a few minutes spent chatting in the coffee line will barely scratch the surface. Who can say what motivates people to contribute so readily yet so sparsely? Recent times are emotional, growing more volatile, and potentially far more dangerous, as a result. We see in Joe’s comment, and so many others like it, that trust and respect are divisively encased in separate echo chambers. By virtue of us versus them, both sides are challenged to be open-minded.

Worse, the so-called era of “post-truth” impedes exactly the constructive dialogue we need right now, raising ire and diatribe in place of substance and equanimity. Satire compounds disagreement and grows that much more venomous, and ridicule has a way of locking closed doors. I don’t support proceeding from pretence or unfounded opinion – there’s nothing whatsoever to show for an exchange-of-opinion based on falsehood. The burden of post-truth is far too high. A bias and the truth can co-exist, and they do, guaranteed – one truth, objective, and one bias per person, subjective. Bias is an inevitable fact of existence. Left unchecked, bias obviates respect, which is why a constructive approach is so crucial. As I’ve said elsewhere, post-truth is anti-trust, at least for me, and, at its furthest extent, a threat to civil security, which sounds alarmist – good, let it. We need to attend to this. More than ever now, we need respect or, failing that, at least greater tolerance. That’s for starters.

Worse still, in this post-truth world, fictional claims face no arbiter but the other side so distrusted and maligned. The kind of polarised situation made infamous in Washington, DC is spreading, realised in a zillion on-line comments like Joe’s with every article published. Hopefully not this one, unless maybe someone hasn’t actually read this. On such a perilous path – facts in dispute, emotions enflamed – each side qualifies “open-minded” as unique to themselves and misappropriated by the rest. That’s significantly divisive and the recipe for unrest that I spy, and it sounds my alarm. In that divided state, in lieu of anything left to discuss, even as reality has its way of catching up, what damage might already be done? Especially when facing fellow citizens, whatever we choose now must accord with what we’re prepared to accept later. Let that sober thought sink to the core because the less we share common goals, the more we’re set to clash over unshared ones. But it’s within us to converse and to converge.

Let’s be willing to listen with empathy, understand with compassion, research with diligence, and respond with substance. Do your own investigation. Accept responsibility to inform yourself. Yes, take what you find with a grain of salt until you can believe to your own satisfaction what is right and trustworthy. Yet, even then, be tolerant if not respectful of others – too much salt is harmful. We all have our own motives and incentives for listening and participating, so let’s dig deeper than how pissed off we are with the other side: walking the high road with pride or smug assurance is really the low road and a path of hubris. It’s closed-minded, but not in the sense that we haven’t sought to reach a thorough understanding of the other side. It’s closed-minded to the degree that we haven’t sought to understand how and why the other side reached their position to begin with.

None of this is hard to understand. Once upon a time, we decided that education mattered, and it’s no accident that the trivium – grammar, rhetoric, dialectic – was given a central role. These days, its value in niche markets, notably private Christian education, is enough to switch some people off, which sadly exemplifies this entire discussion. I believe classical education is valuable for all. We’ve neglected it to our detriment, perhaps to our peril. We have a lot in common, more than we might credit, with our neighbours and fellow citizens. It’s not like they grew up on Mars. We’re not significantly different – hands up if you’re a human being. Start with that, some basic human dignity.

There’s a lot to be offered by rapport in our relationships, and little to expect without it. All we can do is understand the other person’s interpretation, and they ours, and go from there – or else not. And it’s easy to nod and say, “I already do that while others do not.” But reflect upon yourself anyway, in every conversation, debate, or exchange. Humility is a virtue, even when kept low-key. Everybody bears responsibility for their own participation. The more we live up to being respectful, even of those whom we oppose, the more progress we’re liable to make – however slowly it might happen.

As I said at the outset, yes, humility’s easier said than done. But by the same token, why write this essay if 42 words would do? We must neither hide ourselves away nor proceed prematurely. We must be able to discern flaws of reason, and we must be able to communicate with humility if we aim to deliver – and, more critically, if we hope to be received – from a place of thoughtfully considered understanding. Whether or not we truly trust one another, let’s help put the logos back in dialogue and accept our own responsibility to approach people with intentional self-awareness. Let’s seize the opportunity to be role-models – you just never know what somebody else is thinking. Let’s raise the level of discourse. And let’s remember that taking the high road must be open-hearted as well as open-minded.
