A needfully challenging appeal to raise the level of discourse, and a fitting inclusion in The Rhetorical WHY, from an Opinion piece in The New York Times (Feb 22, 2018) by Op-Ed columnist Bret Stephens:
“This is the text of a lecture delivered at the University of Michigan on Tuesday [Feb 20, 2018]. The speech was sponsored by Wallace House.
“I’d like to express my appreciation for Lynette Clemetson and her team at Knight-Wallace for hosting me in Ann Arbor today. It’s a great honor. I think of Knight-Wallace as a citadel of American journalism. And, Lord knows, we need a few citadels, because journalism today is a profession under several sieges.…” [continue reading]
Some thoughts of my own on the significance of a free press to our lives…
Arendt: “… how can you have an opinion if you are not informed?”
Everybody has opinions – our five senses give us opinions. In order to be “informed,” we need discernment enough to detect accurate information.
Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”
For me, continual lies ultimately yield zero trust, but again, how would I know who’s even lying, but for my own discernment and experience?
At the least, if I were aware that all around were lies, that much I’d know is true. It’s not that “nobody believes anything any longer,” so much as it’s “everybody goes about searching out truth on their own.” The downside is when those individual searches for truth become disrespectful, as we’ve seen lately, or worse, chaotic.
Nevertheless, investigate! Accept responsibility to inform yourself. Accept or believe all with a grain of salt until such time as you can prove to your own satisfaction who and what are trustworthy. And, at that point, be tolerant, if not respectful, of others – this applies to everybody, all sides, liberals and conservatives and all points between. Taking the high road is not to be done with pride or smug assurance. It’s easy to nod and say, “I already do while others do not,” but even so, reflect upon yourself with each conversation, each debate, each exchange.
Open-minded and open-hearted – both are virtues, but they don’t have to be the same thing.
Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”
On its face, this statement could only be accurate if you had some clairvoyance or a crystal ball.
By “everybody” doing their own investigation and accepting responsibility to inform themselves, I mean everybody. We’re able to trust news & media sources to the extent that they have lived up to their responsibility… to the extent we’re aware that they have. I support proper, professional investigative journalism and public intellectualism, both of which I gather to be in decline.
Finally, I offer two sets of remarks about journalism by two long-retired anchor-journalists of PBS fame, partners Robert MacNeil and Jim Lehrer. The first is transcribed from an exchange between them during a tribute to MacNeil upon his retirement in October 1995. The second – comprising two parts – is Lehrer’s closing words upon the “retirement” of his name from the title of the PBS NewsHour, on December 04, 2009. Following that, I’ve included a thoughtful follow-up by the PBS Ombudsman, Michael Getler, published the next week on December 11.
MacNeil’s remarks upon his retirement (October 20, 1995)…
MacNeil: You know, I’m constantly asked, and I know you are in interviews, and there have been a lot of them just now – I’m constantly asked, “But isn’t your program a little boring to some people?” and I find that amazing, because, well, sure, it probably is, but they’re people who don’t watch. The people who watch it all the time don’t find it boring, or they wouldn’t watch.
Lehrer: That’s right.
MacNeil: And it’s the strange idea that’s come out of this medium, because it’s become so much a captive of its tool – as its use as a sales tool that it’s driven increasingly, I think, by a tyranny of the popular. I mean, after all, you and I’ve said this to each other lots of times – might as well share it with the audience: what is the role of an editor? The role of an editor is to make– is to make judgments somewhere between what he thinks is important or what they think is important and what they think is interesting and entertaining.
Jim Lehrer’s guidelines of journalism (December 04, 2009)…
Lehrer: People often ask me if there are guidelines in our practice of what I like to call MacNeil/Lehrer journalism. Well, yes, there are. And here they are:
* Do nothing I cannot defend.
* Cover, write and present every story with the care I would want if the story were about me.
* Assume there is at least one other side or version to every story.
* Assume the viewer is as smart and as caring and as good a person as I am.
* Assume the same about all people on whom I report.
* Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.
* Carefully separate opinion and analysis from straight news stories, and clearly label everything.
* Do not use anonymous sources or blind quotes, except on rare and monumental occasions.
* No one should ever be allowed to attack another anonymously.
* And, finally, I am not in the entertainment business.
Here is how I closed a speech about our changes to our PBS stations family last spring:
‘We really are the fortunate ones in the current tumultuous world of journalism right now. When we wake up in the morning, we only have to decide what the news is and how we are going to cover it. We never have to decide who we are and why we are there.’
I am struck by the continuity of their respective final comments, about entertainment – each, in his own way, seeks to distance journalism from vagary, each thereby implying that we are susceptible to emotional or whimsical tendencies, which evidently seem capable of overtaking our focus to learn; otherwise, why mention the point at all?
Watch Lehrer’s remarks here, in a functional if awkward series of video archives of that 2009 broadcast.
In May 2011, upon Lehrer’s retirement, MacNeil returned to offer his own reflections on his friend and colleague, including some further worthwhile commentary on contemporary TV journalism.
I recall “Lehrer’s Rules,” as they were called, making a small stir – some of it substantive and meaningful, some of it the critical “woe-is-us” lament at the passing of favourite things. In amongst it all, as I mentioned, were the following comments from PBS Ombudsman Michael Getler, which I include here, at length, on account of PBS webpages’ tendency to disappear.
In fact, a number of the PBS pages where I found these articles are no longer active – where possible, I have checked, updated, and even added weblinks. But I believe Getler’s comments, like the rest, are worth preserving, on account of their potential to provoke us to think and learn more about a free press and its relation to ourselves.
A couple of people wrote to me in the aftermath of that Dec. 4 sign-off to say how much they liked Lehrer’s guidelines and asked how they could get a copy. That’s why they are reproduced above. A subscriber to the widely-read Romenesko media news site also posted them there on Dec. 6 and they also were posted on the campus site of the Society of Professional Journalists (SPJ). “Whether you agree with all of Lehrer’s guidelines, or not,” that posting read, “he has surely earned our attention.”
That’s certainly true in my case. I’ve also been a devoted watcher of the NewsHour in all of its evolutions during most of the past 30-plus years, long before I took on this job four years ago. Although segments of the program have been the subject of critical ombudsman columns on a number of occasions, I’ve also said many times that it remains the best and most informative hour of news anywhere on television, and it has never been more important. I follow the news closely but almost always learn something from this broadcast every night.
Boring, at Times, But a Luxury Always
Sometimes, of course, it can seem boring. Sometimes the devotion to balanced he said/she said panel discussions can leave you frustrated and angry and no smarter than you were 15 minutes earlier. Sometimes the interviewing is less challenging than one might hope. But the luxury of an uninterrupted hour of serious, straight-forward news and analysis is just that these days, a luxury. And, in today’s world of media where fact and fiction, news and opinion, too often seem hopelessly blurred, it is good to have Lehrer – clearly a person of trust – still at work.
I had the sense when he added his guidelines to that closing segment last Friday that the 75-year-old Lehrer was trying to re-plant the flag of traditional, verifiable journalism that he has carried so well all these years so that it grows well beyond his tenure – whatever that turns out to be – and spreads to all the new platforms and audiences that the contemporary media world now encompasses.
Oddly, I did not get any e-mail from viewers commenting on the new NewsHour format, other than one critical message that said “do not post.” Maybe that’s a good sign since people usually write to me to complain.
Make no mistake, the now defunct NewsHour with Jim Lehrer is still quite recognizable within the new PBS NewsHour. So those who wrote earlier and said they didn’t want any change won’t be terribly disappointed. I, personally, found the first few days of the new format and approach to be a distinct improvement. The program seemed to have more zip and energy, faster paced, with good interviews and without the always predictable language that introduced the show in the past. It presented its news judgments more quickly, benefitted from the early introduction of other top staff members as co-anchors, and from the introduction of a promising “new guy,” Hari Sreenivasan, a former CBS and ABC correspondent who presents a headline summary from the newsroom and is the liaison to an expanded NewsHour Web operation.
Now, just to keep this a respectable ombudsman’s column, let me add a few quibbles when it comes to Lehrer’s rules, as posted above.
First, one of the interesting things about American journalism is that there are no agreed-upon national standards, no journalistic equivalent of the Hippocratic Oath for physicians. There are, of course, many universal values and practices that vast numbers of journalists have voluntarily adhered to generally for many years, best exemplified by SPJ’s Code of Ethics. But the fact is that all major news organizations – from the Associated Press to the New York Times to PBS and CBS – have their own guidelines and standards that they try and live by. And they all have their differences.
Naturally, a Few Quibbles
Lehrer’s guidelines embody lots of the good, praiseworthy stuff, and we come out of the same journalistic generation and traditions. But I think on a couple of points they are actually too nice, too lofty, cruising somewhere above some of the grittier realities of journalism.
For example, “Assume the viewer is as smart and as caring and as good a person as I am. Assume the same about all people on whom I report.” Really? Bernard Madoff? Osama bin Laden?
Then there is: “Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.” I would argue, and have, that the NewsHour withheld from its viewers at the time a legitimate turn in a major story – reported by all other major news organizations – last year when it declined to inform them that a former senator and former candidate for the vice-presidency, John Edwards, issued a public statement and went on ABC Television to acknowledge that he had had an extra-marital affair with a woman who had been hired by his political action committee to make films for his campaign. That’s news.
Finally, there is, “Do not use anonymous sources or blind quotes, except on rare and monumental occasions.” I agree about the blind quotes when they are used to attack someone personally. But anonymous sources have often proved to be absolutely crucial to the public’s right to know what’s really going on in scores of major stories as they have unfolded from Watergate to secret CIA prisons overseas.
The most accurate and important pre-war stories challenging the Bush administration’s on-the-record but bogus case for Iraqi weapons of mass destruction were based on anonymous sources. Many of those stories, in part because they were based on anonymous sources, got buried or underplayed by newspapers at the time. Many of them never got reported at all on television, including the NewsHour. But there are times when there are mitigating circumstances – like internal threats within an administration or maybe jail time for leakers – when some sources must remain anonymous and when editors need to trust their reporters. And often you don’t know if the occasion is “rare and monumental” until it is too late. Pre-war Iraq, again, being Exhibit A.
As far as I understand Jacques Derrida’s différance, he observes that we understand our experiences as distinctive, but not exhaustive, communicated links or marks comprising an on-going decisive chain of experiential moments. As to the language we use to describe our experiences, a word has contextual meaning, both from its usage at any given time as well as from its etymology over the course of time. I tend to agree with this attendance to context as furnishing meaning, and I can also spot the rabbit hole that it poses. For example, to understand some word’s definition, I might look it up in the dictionary and be left to rely upon the definition of whoever decided what it meant while, at the same time, face all sorts of words in the definition that now need looking up, too – Sisyphean, indeed! Cruel but so usual. On the other hand, thanks to whoever compiled the dictionary, a pretty utile compendium, I have to say.
To be clear, I am not intending to invoke logocentrism, by which all our words are accorded a decided meaning from a cultural centre, which propagates existing biases or “privileges”; Derrida would roll over in his grave. Granted, I may already have laid grounds here to be accused of logocentrism, myself, by writing with words (and I confess to using English because I didn’t think anyone had the patience to muddle over Wingdings). My present aim is to suggest how we might address the afore-mentioned rabbit-hole dilemma by searching for or (… almost afraid to say it) by deciding upon some definitions of our own. Not like a dictionary, but more like– well yes, okay, like a dictionary, but one that we’ll fashion from the ground up, like when the light bulb would go on above Darla’s head, and Spanky would snap his fingers to say, “Hey, everyone! Maybe we can put on a play!” So, in the spirit of dissemination, hey everybody, maybe we can compile a dictionary! A real, deconstructive, crowd-sourced dictionary!
I’m not really compiling a dictionary. I’m just trying to make some sense of Derrida and différance. Let me try to illustrate what I mean from my own experience. Sometimes I play Walking Football, a version of the game where players are not permitted to run. Naturally, the debate is over what differentiates walking from running. We’ve agreed that walking means “always having at least one foot in contact with the ground during the striding motion.” Running means “having both feet leave the ground at some point during the striding motion.” This makes for certainty, so long as our eyes are trained enough to spot feet in motion, which I can spot sometimes so clearly, with such immediacy, that it’s more like I’m watching, not playing – I’m ghinding it tuff even now to ghet the right words, but trust me. And so long as each player is willing to obey the rules – and, ohh my, there’s always that one player who just won’t. You know who I mean… *sigh… Anyway, so long as they’re not just words uttered that then float away in the breeze, our definitions of the rules for walking and running are useful.
Luckily, too, I might add, when we clarify the rules, we do so out loud, together, and don’t whisper them around in a circle, like when my daughter plays Telephone at a birthday party – after all, we want everyone to be clear. Finally, even if we have trouble spotting feet in motion, because it all happens too quickly, or even if that one player is a cheater at heart, the definitions themselves remain clear, and usually at least one or two of us can remember them well enough to recite back, as needed, usually with a lot of finger-pointing and furrowed brows. One time we even wrote the no-running rule on the gym chalkboard, and even though no one challenged this, on the grounds that writing is secondary to speech, everyone still understood why it was scrawled there, by which I mean everyone knew exactly who should read it the most – wow, does every game have that player? Incorrigible.
Bottom line: accountability is down to the sincerity and respect offered to each player by every other player who decides to participate. As an aside, the need for a referee, an arbiter, is all the more clear when the stakes are as high as bragging rights and free beer. But, even as we play for fun, the rules exist or else the game, as such, does not. (On that note, I find a lot of players just don’t like Walking Football and would rather play with running, and that’s fine, too: it’s their decision, and plenty of other like-minded players keep both games afloat. I find the Walking game amplifies decision-making, so maybe this feature just appeals to me. And I still play traditional football, too.) My broader point is that any one person must decide to accept what has been defined and, likewise, any group of people must reach a consensus. Shared meaning matters because, otherwise, as I say, we don’t have a game, or else we have a very different one, or we just have anarchy. But whether that person, alone, or the group, altogether, searching for a way to decide upon meaning, has the patience to delve down the rabbit hole… well, yes, context does indeed matter – both usage and etymology. I’ve said and written as much, myself, for a long time. So, in light of all this, I hope I’ve gathered a little something of Derrida’s différance. I’m still learning.
Another illustration: in my teaching, I occasionally introduced this matter of contextual meaning by offering students a list of synonyms: “slim,” “slender,” “skinny,” “thin,” “narrow.” Each word, of course, has its own particular meaning. “If they meant the same thing,” I offered, “then we’d use the same word,” so just what explains the need for all these synonyms? Well, students would say, there are lots of different things out there that possess or demonstrate these various adjectives (my word, not theirs), so we’ve come up with words to describe them (and I think that’s a charitable “we,” like the Royal “We.”) As the discussion proceeded, I might ask which of these words typically describe human traits versus those – leaving aside metaphors – that typically do not. Next, which words typically possess positive connotations, and which negative, or neutral? And, as it pertains to the personification metaphors, which words are more easily envisioned versus those that really stretch the imagination, or even credibility?
Eventually, I would shift from ontology to epistemology, posing the questions at the heart of my intention: For any of the previous questions about these synonyms, how do you know what you’re talking about? For what each of these words could mean, where have your assurances come from? Of course, the most frequent reply to that question was “the dictionary,” followed by “my parents” or “books I’ve read,” or “just everyday experience, listening and talking to people.” Occasionally, the reply was something akin to “Who cares… it just means what it means, doesn’t it?” In every reply, though, one common thread was detectable: the involvement of other people as part of the meaning-making process. Fair enough, we can’t all be Thoreau.
One more example: when is “red” no longer red but perhaps orange or purple? Well, for one thing, if you’re colour blind, the question means something entirely different, which I say not flippantly but again to illustrate how important dialogue and community are to deciding what something means. For another thing, we might wish to ask, in keeping with context-dependency, “Why even ask?” Again, this is not flippant or dismissive but practical: when does it matter so that we distinctly need to identify the colour red? Where a group of people might face the question over what is red versus what is orange or purple, we might expect some kind of discussion to ensue. And, whether asking as part of such a group or as a hermit, alone, I submit that one person’s decision about what is “red” is ultimately down to one person to determine: “Red is this,” or “This is red,” or even, “Gosh, I still can’t really decide.” Even a coerced decision we can still attribute to the one who forces the issue – one person has decided on behalf of another, however benignly or violently: might makes right, or red, as it were.
Coercion introduces a political consideration about whose authority or power has influence, similar to needing referees on account of those players who decide to run. The point, for now, is simply that a decision over what something means to a person is ultimately made by a person, leaving others to deal with that decision on their own terms in whatever way. But other people are part of the meaning-making process, even passively, or else I wouldn’t need language to begin with since the rest of you wouldn’t trouble me by existing. Not to worry, by the way, I appreciate you reading this far. From what I understand (and I am convinced I must learn more, being no avid student of either postmodernism or Derrida), his observation of différance either discounts or else offers no account for the arbitrary decision-making that people might make when they decide they’ve had enough. People tend to land somewhere in a community, and it’s the rare person who lives and plays wholly and uncompromisingly by their own rules. However, the fact that he felt différance was worth the effort to publicise and explain to the rest of us does reflect an arbitrary decision on the part of Derrida and says something about him.
So this is where I have more fundamental trouble understanding Derrida and différance – the very notion of “different,” as in, in what world could there not be an arbiter? Even a life alone would face endless decisions: what to eat, where to go, when to sleep, and so forth. From such musing – speaking of rabbit holes – I was led to reading about another philosopher named Jacques, this one Rancière, and what he calls the axiom of equality. In pure nutshell form, I take this to mean that no (socio-political) inequality exists until it has been claimed to exist – and note that it’s claimed in a boat-rocking kind of way, what the kids these days are calling “disruptive.” The upshot is that equality, itself, can only ever be theoretical because someone somewhere inevitably is and always will be marginalised by the arbitrary decisions of cultural hegemony. Still learning.
Back to the Walking Football analogy: if the rabbit hole of defining a word in the context of those that surround it, and then having to define, even further, all those words, and on and on, and experience is inexhaustible, and what’s the point, and lift a glass to nihilism… if that kind of limitless indefinite deconstructive search-and-compare lies at the heart of what is different, then maybe Derrida just found it difficult to reach agreement with other people. It stands to reason that, if he played Walking Football, Derrida might be the worst cheater on the floor, continually running when he should be walking, then denying it just the same as he tried to gain advantage. Maybe, fed up being called a cheater, he would take his ball and go home to play by himself, where no one could say he was wrong. Being alone, who would be there, whether as an obedient player or as a sneaky one, to challenge him?
In fact, maybe that’s why he chose to return to play the next day – for all the arguing, he enjoyed the game, or the attention, or the camaraderie, or the exercise, or whatever, more than being accused of cheating. I wonder if, perhaps, in the great game of philosophy football, he would have been the only rival to strike real fear in Diogenes – I mean awe & respect kind of fear, just to clarify, and I mean if they had lived at the same time. It’s hard to know about Diogenes since nothing he wrote down ever survived, and these days, I doubt more than a few can recall any of whatever he said, besides that lamp-carrying honesty thing. (We should all have such good spirit when it comes to our first principles.) Anyway, I think Diogenes played for Wimbledon.
Well, I am being unkind to Derrida. Evidently, he was a kinder person by nature than I have let on, as well as an advocate for all voices, all people. And the professional care, the uncompromising expertise he took to convey his ideas, to trouble himself with delving down the rabbit hole so arbitrarily – to go down at all but, moreover, to go so far when he might, just the same, have decided to halt. Delve as far as you like, but accept responsibility for your decision, every time. In that respect, how does Derrida differ from any other person facing decisions? Did he have still other motivations? No player who kicks a football is deliberately playing to lose, not unless they have been coerced by someone else to do so. On the other hand, for all I know, maybe what Derrida called red I would call blue. Be careful not to pass the ball to the wrong team! (By the way, in sport, dynasties are remembered precisely because they eventually come to an end.)
Was Derrida no less accountable and open to scrutiny than you, or me, or anybody else? To suggest that a word only makes sense based on how it differs from those around it is no less arbitrary than its reciprocal suggestion: a word only makes sense based on how it describes only what it describes. Half-full / half-empty, six of one… Two sides of the same coin are still the same coin. Alternatively, who put him up to all this? Meanwhile, on his own, surely Derrida had it within himself, as people do when they reach a point, simply to say, “Here is enough. I decide to stop here. For me, [the item in question] means this.” If that doesn’t ring true and sound like him, well, I’d say that can be just as telling of his character; I heard it suggested, once, how we can be helped in knowing something by what it is not. So, fine – for Derrida to stake the claim called différance, I’m willing to concede him that moment. We all land somewhere, and we’re all hardly alike, even when we’re alike.
We are, each and every one of us, individual. But together we comprise something just as dynamic on a larger scale – one might construe us societally, or perhaps historically, anthropologically, or on and on, in whatever way through whichever lens. For me, différance appears an attempt to speak for all about all, prescriptively. A grand stab at philosophy, no question, and that’s the beauty of the equality of philosophy, with thanks to Rancière: we all have a part to play and a right to respond. For the time being, as I have understood Derrida and his thinking, and I willingly stand to be instructed further, différance strikes me as ironic, being an advocacy for the dynamic development of people and language and culture that self-assuredly asserts its own accuracy. That is not an uncommon indictment of postmodernists. What’s more, it is ohh, so human.
On-line comments are not guns, they don’t kill people. And the people who wield them, as in write them, are not having a stand-off at high noon. On-line comments are not deadly but, boy, can they be deadly stupid.
They’re so very often uninformed, superficial, and emotionally driven as well as – frankly – bloody lazy. Plenty of opinions from plenty of people carrying free-speech chips on self-righteous shoulders. On-line comments, these days, are just another sign of the times.
“Just how many people bother to research and draft for a ‘Comments’ section response, anyway?”
Does it show I’m fed up with people trying to win personal pissing matches in the “Comments” section? Does it show? …people clawing their way to the top of some imagined pile of respect, in a community comprising whomever read the article – unless of course they only read the headline. Does it show? …the invective, the insults, the one-liner spree? Commenters affirming, negating, defending, attacking. Pointing out who’s so obviously wrong, what’s so evidently right. Commenters commenting, exercising their democracy, one comment at a time? On-line comments are the Twitter of– er, hmm, I’ll need some time to work on that one.
Of course I’m unable to say on-line comments kill people, but that’s not because they actually don’t kill people. It’s because, in the analogy, on-line comments are just the bullets. Computer keyboards would be the guns. And it’s still people pulling the trigger by pressing send – there’s got to be a triggering joke in here somewhere, I’m sure of it. For now, enough to say that guns don’t post comments, people do.
Time was when a letter-to-the-editor was the main public recourse, but sending one to your chosen publication was no guarantee of being published, or at least not published in full. Then came the Internet, the great equalizer. I can only suspect that, way back when, when the first on-line article permitted readers to leave comments, the author or editor or publisher proudly lifted a glass of wine to celebrate the enabling of the public voice. One step forward for free speech. Here’s to democracy.
How often I’ve read an article, then followed up with the on-line comments, thinking, “I’d like a sense of the broader opinion out there, maybe encounter some different perspectives, pick up a hyperlink or two for this topic.” This does still happen, and it’s what makes on-line comments, for me, worthwhile. It also means I’m relying on the other commenters to offer anything of substance. But, obviously (… is it obvious?), substance doesn’t always just happen. Honestly, though, pretty naïve to expect that it would. And if you thought that, given the sheer number of comments on just one article, the law of averages would help, then you probably haven’t read too many on-line comments. They can far, far surpass the length of the article and illustrate far, far less than broader opinion or different perspectives or anything useful at all about the topic. Just as often they proliferate because somebody needed to win.
How often is someone’s on-line comment about the article as compared to that commenter seeking personal affirmation or recognition as some kind of uber-reliability source? How often does an on-line comment chain turn into a personal on-line shoving match? And how often has somebody replied along these lines: “You’re pretty tough when it’s not face-to-face…” ?
Nobody thinks they’re even beginning to solve the issue [whatever it is] in the on-line Comments section. Do they? At least, they couldn’t possibly think so when all they’ve written is a sentence or two, right? At least, when they’ve written sentences. But, unquestionably, essay-length on-line comments are the exception to the rule. Aren’t they? At least, they are in the Age of Twitter – wait, sorry, I already slammed Twitter. This time, I’ll go with Google making us stupid (not for the first time). By the way, even shortened attention spans have been called into question (have a look, neither’s a long read). My own sense, for what it’s worth, is that we attend to what stimulates us the most although – egregiously – I have no research to back my opinion, and if any of you trolls call me on that, I’ll comment you back. So just be warned. Gotta be almost time for that trigger joke.
Are people commenting when maybe they should be writing an article of their own? Would that be too much responsibility to bear? to ask? Would writing an article require too much effort? People seem to care enough to leave a comment yet not enough to offer something more substantive than a line or two, or a paragraph the odd time. Even a few paragraphs, that one time by that one person, but anything truly edited for cohesion – are you kidding, what are we, journalists? How many of us are writers, period, much less paid ones? Heaven forbid anyone be expected to offer more than a few lines of opinion masquerading as oh-such-obvious-fact, or a one-liner, or a dogmatic tirade! (Yes, I not only see the irony, I intended it.) Leave all that responsibility crap for whoever else. Whomever, actually, but that would mean caring.
Who are you, anyway, that you’d present yourself in so superficial a manner as on-line comments yet expect to be taken seriously? Who are you, that you’d conflate your real-life person with your on-line persona in such a way where one belies the other? Which one is demonstrating the true you? Who are you, to be taking this so personally right now when, in fact, right now I’m giving you the benefit of the doubt? Cynicism aside, everyone can think – hence my frustration. If on-line comments suggest anything, they suggest that emotion rules, not thinking.
Don’t misconstrue – thinking and emotion, I’d say, both occur, but by default (I’d say), emotion controls thinking more than the other way around. Far more rarely does rationality show up beyond the article itself, if even there.
Below is an edited chain of comments that I cut & pasted from an NPR article posted to Facebook, about the bombing of the Manchester arena following the Ariana Grande concert in 2017.
To be precise, these comments that I cut & pasted are from the Facebook post, not NPR’s website. I also present these comments as a single, focused discussion when, in fact, other people’s semi-related comments had appeared in between some of these, responding to still other people. But the way comments appear is evidently controlled more by their time stamp, when they were posted, than by which person is receiving a reply in the thread. So, in selecting only these comments here, I tried to maintain the direct discussion between particular people, back and forth. Finally, I’ve published their Facebook names with hyperlinks because all this is publicly published anyway, and nobody’s owed any shelter.
Rather than take sides, see if you can read this thread to understand my point, the futility of trying to solve such grand issues in a Comments section, the pointlessness of on-line comments in general. (Yes, I see the irony in having my own Comments section below. I even intended it.)
Ask of each comment, and each commenter…
what, really, is the motive behind this comment getting not just written but posted?
what, really, is the response that this person…
believes for themselves?
presumes from the other person?
seeks from anyone else (like us) who may be reading?
Read not only with self-awareness but with other-awareness, with empathy. But please resist taking sides on the issues, irrespective of your own feelings, because the point here is the comments having been crafted and shared, not the terror incident or the politics that are introduced. A tangential point is to acknowledge that it’s possible and sometimes productive to keep our feelings and our rationality separate.
James Alford What a great freedom festival! I just don’t know what we’d do without all the freedom that comes with unfettered access to semiautomatic weapons. Thanks for sharing this awesome display of our enviable freedom!
How do other nations cope without our awesome brand of freedom?! I mean, other than longer life expectancy, ultra low crime rates, drastically lower prison populations, and better overall quality of life.
Yep. Would never wanna swap my bullet-y freedom for any of that.
Scott Macleod How are the Ariana Grande concerts in places without freedom?
James Alford Scott, they do have WAY less freedom, don’t they?! They only have 0.23 gun homicides per 100,000. We have almost 11!!! Murkah!
Scott Macleod James, that is a meaningless statistic. Here I’ll show you:
Last year cars killed:
United States 36,166
Deaths from drowning, children under 14:
United States 548
Deaths from alcohol per year:
United States 88,000
The United States is an outlier on all of these. You can do the same breakdown with antibiotics. You can do it with hot water heaters. Or with deaths from bees. And the US will have higher death rates.
Jacqui Parker Percentages based on overall population would make more sense in your example.
Seth Martin Did anyone in this thread actually read the article?
Scott Macleod The numbers don’t change when broken down per capita. The US is still an outlier. Know why? Because deaths from X will always be higher in countries with more X. Determining causality is much more complicated. Would taking X away eliminate those deaths? Or would X just be substituted for something else and what would have been deaths from X become deaths from Y? This is what’s important.
Jim Chan Total death doesn’t equal to death rate. What are you a 2nd grader?
Scott Macleod Jim, see comment above. Per capita break down does not change the analysis.
Amon-Raa Valencia Scott Macleod the replacement theory can be checked by looking at life expectancy.
Do the countries you point out have higher life expectancy than the US?
James Alford I’m afraid you’re unacquainted with how percentages work, Scott.
If I have 10 tomatoes in my garden, and 2 of them are rotten, and you have 10,000 tomatoes, and 200 of them are rotten, then my problem is still 10 times bigger than yours, even though you have 100 times more rotten tomatoes.
Find a local 5th grader. He’ll be happy to provide more illustrations.
Wesley D. Stoner So Scott Macleod, in your example, if X = guns then the US logically has more gun deaths because there are more guns, right? Where do you think I am going next….
Scott Macleod So by that logic, James, those other countries have the same problem the US does from guns? Please respond without insults.
Scott Macleod <<Scott Macleod the replacement theory can be checked by looking at life expectancy.
Do the countries you point out have higher life expectancy than the US?>>
Other factors go into determining life expectancy. Access to healthcare for example.
James Alford Nevermind, Scott. The original stat said it all. The U.K. (who you brought up) has a gun homicide rate 44 times lower than ours. If you can’t grasp that incredibly straightforward piece of empirical data, then we don’t have a starting point.
Chris Toscano James Alford, you have unlocked Master Troll Level 99. Fine work sir! Look at all the ammosexuals that you have up in arms.
Margaret Moore Bennett Scott Macleod, I am a statistics teacher, you show a basic lack of understanding for how statistics work. You are a poster child for why the GOP is successful with the un and under-educated.
Scott Macleod <<Nevermind, Scott. The original stat said it all. The U.K. (who you brought up) has a gun homicide rate 44 times lower than ours. If you can’t grasp that incredibly straightforward piece of empirical data, then we don’t have a starting point.>>
A) Again, this stat is meaningless. It tells us nothing about causality or how public policy changes the death rates.
B) Those are GUN death rates. Of course a country with 330 million GUNS is going to have higher death rates from GUNS. Just like a country with greater access to antibiotics has more deaths from antibiotics. It tells us nothing about whether antibiotics or guns are good or bad for society.
Scott Macleod How about explaining it to me Margaret rather than resorting to ad hominem and appeals to authority?
Scott Macleod I am not uneducated. I am not a republican. There’s two misses. What are the odds your third claim that I demonstrate a lack of understanding for statistics is correct.
David Houghton Well, you led with raw numbers and not per capita numbers. Not exactly putting your best foot forward on the stats front.
Tandy Fitzgerald Scott Macleod does that mean a US citizen is more reckless when it comes to driving that the rest of the world and less aware when it comes to their children swimming or less aware of health issues and oblivious to the affects of alcohol? Man US citizens really do prefer to live on the edge far more than anyone else in the world…I guess freedom has more prices then just serving in the military.
Scott Macleod I led with what I had available to copy and paste to demonstrate what I was getting at. I agree it would have been better to break them down per capita. Alas, I’m on my phone and these comments move quickly.
Normally when you see this argument, though, it involves raw numbers. As I have said, what I was illustrating does not change when broken down per capita.
Nick Lucas Scott Macleod My favorite part about your posts is that you are trying to dismiss data because of your claims of causality but you make your first statement of the Ariana Grande concert without the same rule of thought.
What gun would have somehow stopped that bomb from exploding? Why didn’t a person with a gun stop the OK bombing or Boston bombing?
This is the problem with bias is we tend to not be able to apply the same logic to our own beliefs that we do others we disagree with.
Michael Dugger Scott Macleod no gun would’ve have stopped a silent bomb carrier Scott.
Scott Macleod Nick, my original comment was a quip. It was a snarky counter to the OP. I feel like you are reading too much into it.
Nevertheless, it does illustrate what I mentioned earlier. When X is not available, people will substitute with Y and nearly the same amount of people would likely die anyway. Why commit suicide with a $500 gun when you can do it with $3 of rope? Looking at guns only is a disingenuous way of looking at the problem. To be sincere, we would need to look at all homicides to determine causality.
I have not made the claim that access to guns will stop bombings.
Bill Melton “Would it make you happier, little girl, if they were pushed out of a seventh floor window?” Archie Bunker
Jenny Caldwell Scott Macleod Those aren’t death rates, those are simply the numbers of deaths. Death rates are population-based, i.e. # of deaths by drowning/1000. US death *rates* by gun violence are indeed much higher than other countries.
Paul Errman James Alford go cry yourself a river. When their violent crime rate drops and they actually have a population of over 300 million call us.
Onica Annika Scott Macleod you can’t take a gun into a concert permit or not. Stupid example.
Scott Macleod <<Scott Macleod you can’t take a gun into a concert permit or not. Stupid example.>>
I never said you could or that you should. Why are you bringing this irrelevant insight into the conversation?
Scott Macleod <<Scott Macleod Those aren’t death rates, those are simply the numbers of deaths. Death rates are population-based, i.e. # of deaths by drowning/1000. US death *rates* by gun violence are indeed much higher than other countries.>>
I know this. I never disagreed. US death rates by gun violence are higher. I never claimed otherwise. What I dispute is the significance of this information.
Scott Macleod We also have higher death RATES due to drowning, alcohol consumption, motor vehicle accidents, and a whole host of other phenomena. Why?
Looking at RATES and ignoring all other factors gives people a misleading glimpse into reality.
Onica Annika Scott Macleod YOU WROTR “How are the Ariana Grande concerts in places without freedom? 👌🏻”
By comparing the bombing in Manchester to carrying guns and implying people would be safer at concerts WITH GUNS is how THIS WAS BROUGHT UP.
You cannot shoot a suicide bomber without expecting to have an explosion. It would have made absolutely NO DIFFERENCE!
Next are comments following NPR’s report on two bounty hunters who engaged a fugitive at a car dealership in Texas, also in 2017 and also posted to Facebook.
The Next Thread
Jonathan Fitzgerald So I’m starting to get a little pissed by all this bounty hunter bashing. While a little rough around the edges. Boba Felt was a pretty decent guy. And could tell some great jokes, once he got a few drinks in him. Cad Bane was a generous and loving fellow. He was known to work at the soup kitchens all the time. So chill out. They aren’t ALL bad.
Candy Ellman Johannes That may be but when you see the video it’s quite clear that the two bounty hunters handled the situation very badly. Because they were like that doesn’t mean they were good at their JOB. It doesn’t mean they’re bad. Just that they shouldn’t have been handling this job.
Isaac Unson Wow, what a tragic and visceral story! Should I maybe post a comment to spur discussion about bounty hunting, or the lack of consideration for things going south very badly?
Nahhhh, that’s original content worthy of discussion. Why not just be cynical and predictable instead and make the usual jokes about guns?
Russell Good It’s fitting this happened in a death merchants offices. More people are killed with vehicles, than anything else, and yet anyone can buy a car without a background check. Unlicensed drivers and unregistered vehicles are hurtling past the innocent in their thousands at this very minute. When will we stop the insanity?
Candy Ellman Johannes A death merchant? No, they’re not. They can’t control how someone is going to drive the cars they buy. And a background check will not tell you how they will do that. Just drive around our community in Texas and you will see a lot of idiots on the road. Most of whom would clear any background check you might think they should conduct.
I hardly even know where to begin with either discussion. The responses, pardon the pun, speak for themselves, from the aggressors to the defenders to the cooler heads to the comic relief. I don’t even say “aggressors” and “defenders” with any political bent so much as simply noting name-calling and tone. The fact that one person or another, with [whichever] political beliefs, is the aggressor or the defender, here, is not my point, which is why I cautioned to read without taking sides – everybody can be mean-spirited or good-willed, aggressive or defensive. My point is that everybody can also think and listen and reflect, if only they wish to do so, which means more targeted effort and more controlled emotional reaction.
The futility of quoted statistics, which are then attacked and defended, as are the people themselves, in a forum that is informal and, for the most part, unmonitored (perhaps beyond hate speech or something that Facebook would moderate) …what’s the point of it all? Once these people close their browsers, what does each one feel he or she has accomplished? If little to nothing, then why even participate? If something more, then who besides themselves is measuring their effectiveness and, anyway, to what end? And who besides themselves even has a right to judge their effectiveness, especially since this cast of characters – evidently – would have plenty more to say about being judged, and we just kick-start another thread!
How many of you just now reading saw either of these comment threads before reading them here in my post? Which audience needs to see these threads, and why?
Are Comments sections some kind of exercise of free speech? If so, are they worth the trouble? On-line comments are not always anonymous, but they’re also not face-to-face, and that distance is perhaps most significant here, as it bears on being responsible and thorough before posting something regarding another person. However, it’s significant in both positive and negative ways – positive because we owe the other person enough dignity to offer them an intelligent reply that respects their point of view, and negative because we can insult the bastard without (likely) ever feeling some physical repercussion. At the opening, I called on-line commenters lazy. Maybe they’re cowardly, too.
Geez, how seriously am I taking this? They’re just blog comments, for goodness’ sake!
Oh please, it’s a comments section not a peer-reviewed journal.
Here’s another partial comment thread, cut & pasted from The Atlantic website, this time without any comments removed from in between – these are consecutive responses to an article about America’s intellectual decline – a topic not too dissimilar from this very post, even though I disagree in detail with a number of the writer’s claims. However, again, the point here is not to debate the issues. It’s to note the motives and tone behind the comments.
Start with Gutenberg. Then move on to education, art, medicine, culture, and philosophy. Don’t forget Martin Luther and Henry VIII.
Yes the Greeks and Arabs and others made their contributions. But how did those contributions find their way to becoming building blocks for Western Civ? Via the Romans (Christians by the end) and the Crusades (Christian holy wars). For centuries it was literate clergymen who preserved the ancient knowledge which would eventually set the stage for the Enlightenment.
Like it or not, Christianity is at least as inextricably entwined with the building of Western Civilization as any other influence one could name.
It’s truly odd that you find this overwhelmingly obvious fact truly odd.
In other news, if your Mom had chosen a different man to be your father, not only can we never know what you’d look like today–it wouldn’t in fact be you. That child might well not have even existed.
Europe’s faced many existential threats over the millennia. Change just a few events, and Western Civilization wouldn’t have survived. Subtract Christianity, and there’s a strong chance the region becomes conquered by neighboring civilizations, and never even develops the thing we now call Western Civilization.
And that’s as far as I need to go chasing after this particularly nonsensical counter-factual.
“exactly, it’s a counter factual, no need to chase it.”
Asking what might have happened if different decisions had been made is often vital to understanding historical events. Although the process is inherently fraught with ambiguity, it’s a valid exercise.
Your reply makes it seem like perhaps you don’t grasp the purpose of a counter-factual.
This counter-factual is nonsensical. Not all of them are.
yes I understand that as well. This is getting a little exasperating. My only point in this was that one cannot attribute western civilization’s existence to christianity. At most, one can say that christianity was instrumental in the history and current state of western civilization.
Had David simply crafted a more thorough reply to begin with, as he indicates in the final response of this chain, he might have pre-empted all these back-and-forth remarks. More complete remarks might have stirred new ideas, better avenues for discussion, alternatives for research, and just a more thorough model for others to consider. And, sure, while his exchange with Duncan Tweedy was essentially civil and pain-free, he still created the potential for a bunch of negative things to occur…
(i) people might have accepted his cursory remarks, thereby reinforcing (albeit superficially) their own beliefs in that echo-chamber kind of way
(ii) people might have rejected his cursory remarks, thereby reinforcing (albeit superficially) their own beliefs in that polarising kind of way
(iii) people might have misread, misunderstood, misconstrued, or otherwise missed the context of his remarks and, additionally, might have failed to follow up this thread as far as the point where I have cut & pasted it here
(iv) as a result of (ii) or (iii), people might have grown upset or angry with his cursory remarks and taken him on with vitriol, or worse, simply have begun insulting him outright, neither of which contributes to any constructive progress but, rather, destructive regress and both of which inspire ill feelings that those people now carry into everyday life, and which might later be echoed – on-line or off-line – by still others
(v) some other outcome I haven’t mentioned
Had David afforded more time and thought to (a) the kind of response that could adequately convey his thinking, and also (b) the kinds of responses he might elicit from people who read his remarks if he were to write them this way or that way, then he wouldn’t have responded the way we find here. Yet it hardly seems worth critiquing these, or any, on-line comments at all, they’re so ubiquitous! Yes, I see the irony; in fact, I intended it.
Just how many people bother to research and draft for a “Comments” section response, anyway? The whole concept of the on-line “Comments” section seems tailor-made to evade the vetting and sober second-thought of taking a breath and waiting the requisite 24 hours before responding to messages we don’t like. I’d pay $49 for the t-shirt that reads, “Who took the ‘Editor’ out of ‘Letter to the Editor’? Send me $50 and I’ll tell you.”
Obviously, the question is not how many bother to research and draft. It’s not even a question of whether to bother researching and drafting. It’s a question of whether to bother engaging in the on-line comments, to begin with. And I describe it as “bother” because research and drafting mean “work,” i.e. “What a blasted bother!” as in making a deliberate driven effort versus the blurt of perfunctory emotional reaction, which has always been a human foible and which, these days, seems even that much more common.
All that bother for… what, exactly? For someone to reply with one-line invectives? Who are we trying to reach, on-line? And, in light of that, who are we trying to be?
“Who are you on-line? The person versus the persona – it’s a concept worth considering.”
We’re all still responsible for the things we say, especially when they get published, and especially when “published” now means forever to be seen on-line (a consideration discussed here as well) – a newspaper or a book might at least fall out of print or get tossed in the trash. We might consider being responsible for the writing of an article, so why not for the offering of a comment? All the people I’ve quoted here have plenty to offer, I suspect, given their apparent literacy. But taking the time and resources at their disposal and using them in a more constructive way evidently hasn’t happened. Can that be changed?
What incentive would motivate these, or any, people to offer more when commenting on-line? Do people care about a growing reputation, however much or little it permeates the Cloud of e-culture? Who do they think they are? Who do they think that we think they are? Do they even care what others think they are? Do they care, themselves? There’s so little accountability, no formal editing or vetting as might be found in print publication, aside from moderation, as I’ve said, and that often simply automated. Sometimes there’s a log-in procedure via Facebook or Disqus, say, for whatever assurance that offers. It allowed me to publish selected commenters, here.
“Who are we on-line?” asks Flora the Explorer. It’s a question worth considering before we ever touch a keyboard. So, okay: who are you on-line? The person versus the persona – it’s a concept worth considering since I suspect 99% of on-line commenters will never meet their fellows face-to-face. Yet, precisely because of that physical separation, I suspect few care to consider (or even just bother to actively recognise) this concept. Yes, that’s ironic, and it’s a shame. If people did care more about (or at least actively acknowledge) the fact that dialogue comprises more than self, how much more might a conversation yield? As it is, on-line dynamics affect our selves so subtly yet profoundly that the Internet, the great democratic equalizer, is proving its ability to take us one step forward and two steps back.
In and of themselves, given their entire context including their culture, article & blog comments tend to run the risk of oversimplifying issues that warrant and deserve far greater diligence and time spent in meaningful appreciation. Issues that deserve… really? Why? Well, for starters, somebody published an article about [whatever it was], so now it’s out there for public consideration. Moreover, somebody decided that publishing [whatever it was] was worth the bother, and like you and me and everyone, that somebody deserves some basic dignity and respect, whether we ultimately agree with their published material or not. At a minimum, that requires reading the article, if not subsequently researching a bit more. Beyond that, it requires crafting responses of your own that do right by the author who invested the time and effort to create an article worthy of your comment – not “worthy” because you agree but “worthy” because you bothered to respond. Boy, all this bother! Why bother?
The Rhetorical WHY, itself, is a response not unlike what I suggest here – like anyone, I can’t cover it all in one go, but at the least, I can offer something more than a one-liner. The rest of you deserve that much, as I deserve likewise from the rest of you. So, in fact, it is about deserving: if one deserves, then all deserve – either no one is above any other, as far as it involves basic respect for human dignity, or we’re all of us bound for war, waged by all upon all.
For the record, this time I’m not trying to be ironic, though it might seem so now more than ever before. This time, it’s all too serious.
If people considered article & blog comments as I’ve tried to frame them here, as a matter of respect for human dignity, then comments – and public discourse, altogether – could be a whole lot different, and probably more constructive. Instead of comments, maybe people would compose entire articles of their own, which I remember is what Internet apologists used to boast: “The platform of the Internet is the great equalizer!” and “The Internet gives everyone a voice!” and “The Internet is democracy at its finest!” …that sort of thing. Shame that so many decide, instead, to use it superficially, far beneath both its potential as well as their own.
So, please, follow up on your own, comment and post, publish and be responsible. Contribute constructively. Most importantly, be thoughtful and thorough because that’s respectful of everybody else’s time and effort and bother. No one can cover every single detail, and every person has two cents to add of their own. But don’t be fooled by that minuscule metaphor – two cents refers to humility, so please make an effort to offer more than a reactive outburst.
Thanks, everybody, for leaving whatever considered comments you might have, and don’t let the end of this post be the end of your opinion.
How often have you heard somebody question people who lived “back then,” in the swirling historical mist, who somehow just didn’t have the knowledge, the nuance, or the capability that we so proudly wield today?
“Things back then were just a lot more simple.”
“They just weren’t as developed back then as we are today.”
“Society back then was a lot less informed than we are today.”
It’s difficult to even confront such statements out of their context, but where I’ve had all three of these spoken to me in (separate) conversations, I challenged each impression as an insinuation that we’re somehow smarter than all the peoples of history, more skilled, more sophisticated, more aware, more woke (as they say, these days), that we’re, in the main, altogether better than our elders merely for having lived later than they did. These days, apparently, “we’re more evolved” – ha! 🙂 more evolved, that’s always a good one, as if back then everyone was Australopithecus while here we are jetting across the oceans, toting our iPhones, and drinking fine wines. Well, sure, maybe things have changed since back then, whenever “then” was. But, more typically I’ve found, contemporary judgments levelled upon history are born of an unintended arrogance, a symptom of 20/20 hindsight and the self-righteous assurance that, today, we’ve finally seen the error – actually, make that the errors – of their ways.
Surely, these days, few – if any – would believe that we’re ignorant or unaccomplished or incapable, not the way someone might say of us while looking back from our future. At any given point on the historical timeline, I wonder whether a person at that point would feel as good about their era, looking back, as any person on some other point of the timeline would feel, also looking back, about theirs. Is it a common tendency, this judgment of contemporary superiority? These days, we might well feel superior to people who had no indoor plumbing or viral inoculations or Internet access, just as someone earlier would have appreciated, say, some technological tool, a hydraulic lift to raise heavy objects, or a set of pulleys, or a first-class lever – choose whatever you like! It really is arbitrary for a person, looking back at history, to feel better about contemporary life because contemporary life infuses all that person’s experience while history’s something that must be learnt.
I’d say that learning history means learning whatever has lasted and been passed on because what has lasted and been passed on was deemed to have merit. We’re taught the history that has been deemed worth remembering. What I’ve found to have been deemed worth remembering (i.e., the kinds of things I learned in History classes) are the general mood of the times, the disputes and fights that occurred (violent or academic), a select handful of the figures involved, and an inventory of whichever non-living innovations and technologies simultaneously arose alongside it all. If, later, we demerit and no longer pass on what has lasted up until then, passing on instead some different history, then that’s entirely indicative of us, now, versus anyone who came before us, and it reflects changed but not necessarily smarter or better priorities and values.
For me, we shouldn’t be saying we’re any smarter or better, only different. So much literature has lasted, so much art. Commerce has lasted, institutions have lasted, so much has lasted. Civilization has lasted. Cleverness, ingenuity, shrewdness, wit, insight, intellect, cunning, wisdom, kindness, compassion, deceit, pretence, honesty, so many many human traits – and they all transcend generations and eras. People vary, but human nature perdures. I’ll trust the experts, far more immersed in specific historical study than me, to identify slow or subtle changes in our traits – hell, I’ll even grant we may have culturally evolved, after all – and I can only imagine in how many ways the world is different now as compared to before now. But what does it mean to be better? Better than other people? Really? And who decides, and what’s the measure?
We can measure efficiency, for instance, so to say technology has advanced and is better than before is, I think, fair. Even then, an individual will have a subjective opinion – yours, mine, anybody’s – making culture not proactive and definitive but rather reactive and variable, a reflection, the result of combined opinions that amplify what is shared and stifle what is not. As we’re taught merited history, we’re almost forced to concur, at least until we reconsider what has merit. That’s a sticking point because everyone will have an opinion on what is culturally better and what is culturally worse. Morality inevitably differs, and suddenly we have ethical debate, even disagreement, even discord. But to say people or culture are better, I think, is too subjective to rationalize and a questionable path to tread.
Consider this as well: we each know what we’ve learned, and as individuals, we’ve each learnt what we value. But what we’re taught is what’s broadly valued and, thereby, prescribed for all. We’ve all heard that rather hackneyed epigram, that those who neglect history are doomed to repeat it. Well, maybe the ones screwing up just didn’t learn the right history to begin with. I tend to abide by another hackneyed epigram, that they are wisest who know how little they know. Real historical wisdom and real historical understanding would be seeing and thinking and understanding as people earlier did. But short of firing up the DeLorean for an extended visit some place, some time, it seems to me that judgments about history are made with an aplomb that might be better aimed at acknowledging our finite limitations. We’re no angels. If anything, this error of judgment speaks volumes about us. Condescension is what it is, and in my opinion, it’s no virtue.
We should hardly be judging the past as any less able or intelligent or kind or tolerant or virtuous than us, especially not if we aim to live up to today’s woke cultural embrace of acceptance. Being different should never be something critiqued; it should be something understood. Conversely, in proportion to how much we know, passing judgment is assumptive, and we all know what Oscar Wilde had to say about assuming (at least, we know if we’ve studied that piece of history). At the very least, we ought to admit our own current assumptions, mistakes, errors, accidents, troubles, disputes, and wars before we pass any judgment on historical ones. On that positive note, I will say that considering all this prompted me to notice something maybe more constructive – so often, at least in my experience, what we seemed to study in History class were troublemaking causes and effects, bad decisions, and selfishly motivated behaviours. Far more rarely was History class ever the study of effective decision-making and constructive endeavour – maybe the odd time, but not usually. Maybe my History teachers were, themselves, stifled as products of the system that educated them. What could they do but pass it along to me and my peers? Considering that, I might more readily understand how people, alive today, could conclude that all who came before were simply not as enlightened, as sophisticated, or as adept as we are now.
Yet that merely implicates contemporary ignorance: assumptions and mistakes still happen, errors still occur, accidents – preventable or not – haven’t stopped, troubles and disputes and wars rage on. If the axiom of being doomed to repeat history were no longer valid, we wouldn’t still feel and accept its truthful description, and it would have long ago faded from meaning. All I can figure is that we’re still poor at learning from history – the collective “we,” I mean, not you in particular (in case this essay was getting too personal). We need learned people in the right positions at the right times, if we hope to prevent the mistakes of history. Not enough people, I guess, have bothered to study the branches of history with genuine interest. Or, no, maybe enough people have studied various branches of history, but they don’t remember lessons sharply enough to take them on board. Or, no no, maybe plenty of people remember history, but the circumstances they face are just different enough to tip the scale out-of-favour. Epigram time: history doesn’t repeat, but it does rhyme. Or maybe we’re just full of ourselves, thinking that we’ve got it all solved when, evidently, we don’t.
It also dawned on me, considering all this, that high school “History” influences what many people think about broader “history.” My high school experience, university too, was mostly a study of politics and geography, plus what would be considered very elementary anthropology – all this as compared to the many other branches of historical study. Archaeology and palaeontology come to mind as detailed, more scientific branches of history, but there are so many – literary history, philosophical history, religious, environmental, military, economic, technological, socio-cultural as I’ve already indicated, on and on they go, so many categories of human endeavour. I’ve even come across a thoughtful paper contemplating history as a kind of science, although one that is normative and susceptible to generational caprice. One final epigram: history is what gets written by the winners.
And that’s really the point here: throughout what we call human history, where we’ve subdivided it so many ways (right down to the perspective of every single person who ever bothered to contribute, if you want to break it down that far), it’s people all the way back, so it’s all biased, so it’s neither complete nor even accurate until you’ve spent oodles of time and effort creating a more composite comprehension of the available historical records. And, dear lord, who has time for that! History, in that respect, is barely conceivable in its entirety and hardly a thing to grasp so readily as to say, simply, “Back then…” History is people, and lives, and belief inscribed for all time. To know it is to know who lived it as well as who recorded it. Knowing others is empathy, and empathy is a skill trained and fuelled by curiosity and diligence, not emotion or opinion. Emotion and opinion come naturally and without effort. For me, valid history derives from informed empathy, not the other way around.
As far as recording history for future study, ultimately, it will have been people again recording and studying all of it, “it” being all of what people were doing to attract attention, and “doing” being whatever we care to remember and record. It’s all a bit cyclical, in itself, and completely biased, and someone will always be left out. So people might be forgiven when shaking their heads in judgment of the past because, without empathy, what else could they possibly know besides themselves?
“…a picture’s usually worth even more than the thousand words we commonly ascribe.”
The word “text” derives from Latin, texere, meaning to weave or fit together. For me, text connotes far more than just the printed word – photography, movies, music, sculpture, architecture, the list goes on and on. The Visual WHY offers a specific look at paintings, texts with no less substance and arguably far more aesthetic appeal. But underpinning the textuality of art altogether is its human endeavour. And beyond weaving something together for the sake of weaving, weavers – artists, people – have a further end, communication. Artists across all media are still people with influences and motives for expressing themselves. Conjointly, texts of all kinds are plenty human: provocative and reflective. Whether rich and symbolic for a global audience, or doodled sketches for your own amusement, art is text, and text has purpose. As we try to understand it more thoroughly, we can’t help but raise the level of discourse. Who knows, someday maybe art will save the world…
WARNING! This post is an analysis and celebration of Joseph Heller’s novel, Catch-22, and it DOES contain PLOT SPOILERS. If you wish to read the novel for the first time, do not read this post.
“Give the first twelve chapters a chance” has long been my advice to anyone who asks about Catch-22, Joseph Heller’s modernist masterpiece that critiques the absurdity of the military during wartime. If you haven’t read the book, I will hardly spoil things by explaining how eagerly we witless first-timers set out to read such a lauded modern classic, only to be confronted by what might be the most frustrating paragon of show-versus-tell in existence. (However, I will be discussing spoiler details from here on, so be warned.) From the seemingly disparate chapter titles to the disjointed narrative, which repeatedly folds back upon itself, from a maddeningly mirthful plot device, which tempts you to toss the book aside and deny its existence, to an irresolute closing – if you make it that far – the book continually challenges readers to deduce what’s happening and piece together what’s happened. Toss in what seems like an endless cadre of characters, ranging from odder to oddest to perhaps not so odd, and the book is a challenge, no question.
For seven years, I assigned this book as summer reading for returning seniors. Oh, how the students complained about those twelve chapters – excessive! pointless! irritating! – only to feel more aggrieved at hearing, “Exactly,” my necessary reply. Once the venting subsided – usually at least half the first lesson – we’d begin discussing why Heller’s book could only be written this way as compared to some more conventional, accessible way.
For one thing, we need to meet the protagonist, Yossarian, and understand his circumstances so that, at appropriate upcoming times, which of course will have already occurred, we won’t criticise but will instead favour him. To this end, the entire story is told out-of-sequence, opening apparently in medias res during Yossarian’s hospital stay. We have character introductions and letter censoring, foreshadowing how words and language will be manipulated while characters will be isolated, alienated, and demeaned. Subsequently, we learn the logic of Catch-22 from Doc Daneeka. And that Snowden dies. If we’ve navigated the twelve opening chapters and lived to tell about it, we learn that Yossarian, originally a young, excited airman, once needed two passes over a target in order to bomb it successfully, which gets his crewmember, Kraft, killed. Yossarian is further distressed upon returning when he receives a medal for the mission. Meanwhile, Milo opens his syndicate. The tension of tedium, the injustice of fortune. The folly of command, the depravity of humankind. Capping the story is the gruesome account of Snowden’s death, the key incident that incites Yossarian’s fear and lands him in hospital, where we first meet him – naturally, Heller waits until the end to tell us the beginning.
Heller writes with an absurd, illogical narrative style that characterises Yossarian’s internal eternal predicament, wending its way through isolation, alienation, discord, misery, paranoia, fear, senselessness, deception, vice, cruelty, even rape and murder. Catch-22 being what it is, its victims have zero chance to overcome because the antagonists are permitted to do whatever the protagonists are unable to prevent. All along the way, Heller has Yossarian wanting out of the military (fly no missions = live), and he continually ups the ante between Yossarian and all the disturbing confrontations and contradictions that antagonise him, from his enemies and his commanders to his acquaintances and his comrades. But ultimately, and most potently, he has Yossarian suffering from his own self-interest. As the narrative flits and tumbles about, in its own progressive way, Yossarian’s self-interest evolves or, better to say, devolves. What does evolve, inversely to self-interest, is his compassion as he gradually grows more concerned for the men in his squadron, which by Chapter 40, “Catch-22,” has extended to all innocent people beset by oppression, prejudice, and exploitation. So when Colonel Cathcart’s promised deal to send him home safely, definitely, comes ironically (fittingly!) at the expense of the squadron, Yossarian ultimately recovers enough self-reliance to overcome his personal anguish but not enough to remand himself to the cycle of absurdity. Given Heller’s dispersed timeline, describing Yossarian’s character development as a narrative arc or an evolution is less accurate than the piecing together of a jigsaw or the unveiling of a secret.
Perhaps unsurprisingly, Yossarian’s instinct for self-preservation is the source of his personal torment. His despondency and disgust over the preponderance of all human self-interest finally turn him to his decision to go AWOL, accepting criminal risk in exchange for personal safety. Such a climax works because readers – like Yossarian – are no longer fighting back but giving in, yet even then Heller offers no respite – the story ends ambiguously, leaving readers to satisfy their own vexation. Even so, I suspect that Heller appreciated John Chancellor’s life-imitating-art initiative as one inspired by more than a spirit of fandom. So where some characters have been subjects of compassion, others agents of absurdity, readers’ resultant responses have also undergone a perfectly natural evolution, mirroring Yossarian’s character development and culminating with his terrifying walk through Rome. The horrors of “The Eternal City,” in this light, are not only an essential but an inevitable piece in Heller’s plan.
Yossarian’s shall-we-say militant decision to desert is born of Snowden’s ugly death during the Avignon mission, only a week after the death of Kraft and the award of Yossarian’s medal. Seeing Snowden’s innards spilling rudely from his body nauseates Yossarian and haunts him throughout the entire story (or, from Yossarian’s perspective, for the rest of it). Yossarian, set forth by Heller as a protagonist on behalf of soldiers, has no way of making things better. His futile effort at comfort, “There, there” (p. 166), is comically insincere for its honest helplessness, an understated shriek from all soldiers continually sent to face death – not death without context but death without resonance. However, for Yossarian and his comrades, the context of sacrifice is all too irrationally clear: thanks very much. Catch-22. Soldiers face the dilemma of following orders that entirely devalue their very existence.
Participation as a soldier offends Yossarian to the core, yet it also helps him to reconcile his fear over death: “… man is matter,” finite, mortal and – without spirit – simply “garbage.” In fact, this sentence sums up human worth as a blunt statement: “The spirit gone, man is garbage” (p. 440). Six words of sad, harsh consequence: war, no longer wearing a comic mask. The absolute phrase, a terse syntactical effect, annuls man’s significance – spirited briefly, gone abruptly, an empty corporeal body left over, garbage. Garbage is a harsh image – rotting flesh, buzzing flies, scum, residue, stench. Pessimism, cynicism, worthlessness. On such terms, one wonders whether anyone might willingly die to save themselves, as it were, another troubling revelation engineered by a masterpiece of unprosaic illogic. Yet even on this point, Heller’s genius is flawless. Haunting though it is, Snowden’s death gradually reveals to Yossarian the very path to life and safety that he has pursued ever since the opening chapter in the hospital – which is to say, ever since Snowden’s death drove him there in the first place.
Heller refers to Snowden’s death, specifically his entrails, as a “secret” because to reveal it any earlier would be to end the novel. And he calls it Snowden’s “grim secret” to illustrate Yossarian’s suppressed mental anguish. Heller has Yossarian recall Snowden a number of times, each admitting more detail, each growing more vivid, each driving him a little closer to his final resolution. Heller’s portrayal of Yossarian’s traumatised memories in this way suggests the nightmarish flashbacks that people, particularly soldiers, endure following the horrors of war. His final flashback in Chapter 41, “Snowden”, is prompted when Yossarian wards off the mysterious stranger in – where else? – the hospital. It’s most revelatory for Yossarian – and readers, by extension – because, here at the end of his patchy, appalling flashbacks, he is finally secure enough to divine for himself – or is it to admit to us? – the grim secret found in Snowden’s entrails. In the same way, the climax is most revelatory for readers who – at the mercy of Heller’s dispersed narrative structure – have been made to wait until the closing, when the time is finally ripe.
To get there, we are dragged unwittingly by Heller down a path of frustrating sympathy, illogical absurdity, and agonising anticipation. By the time Yossarian is introduced (in the opening chapter!) censoring letters and conniving a way to escape the war, he is that much nearer to desertion than we can yet know. Certainly, Snowden will convince us to desert as surely as he convinces Yossarian, but that will happen later, after Heller has aggravated our tolerance and mottled our innocence. Heller must drag us down Yossarian’s agonising path, or else he places us at risk of passing premature judgment upon not merely his protagonist but his entire message. Finally, when the moment arrives that we gather full appreciation of Snowden’s death, we have all we need to share in the vindication of Yossarian’s desertion.
So here is our way to grasp the grim secret behind the novel’s dissembling structure as restlessly and imperturbably as Yossarian does: the root of conflict, Snowden’s death, can only occur at the end of Heller’s narrative path, not Yossarian’s. The story simply works no other way.
Sometimes, the hardest part of teaching felt like finding a way to reach students when they just didn’t get it. But if there’s one thing I learned while teaching, it’s that it takes two. In fact, the hardest part of teaching was coming to realise it wasn’t them not getting it, it was me not getting them. In my own defence, I think we just never can know what another person’s motive truly is. It was times like that when I realised the true constructive value of respect and a good rapport. To have any hope of being open-minded, I intentionally needed to respect my students’ dignity, and I needed to be more self-aware as to how open- or closed-minded I was being. Humility has that way of being, well, humbling. These days I’m still fallible but a lot better off for knowing it. And, yes, humility’s easier said than done.
Over sixteen marvellous years teaching secondary English in a high school classroom, I learned that teaching is a relationship. Better still, it’s a rapport. I learned that it takes two, not just hearing and talking but listening and speaking in turn, and willingly. And, because bias is inescapable, I learned to consider a constructive question: what motives and incentives are driving anyone to listen and speak to anyone else? It has an admittedly unscrupulous undertone: what’s in it for me, what’s in it for them, who’s more likely to win out? The thought of incentives in high school likely evokes report cards, which is undeniable. But where listening (maybe speaking, too) to some degree means interpreting, what my students and I valued most was open-minded class discussion. With great respect for our rapport, we found the most positive approach was, “What’s in it for us?” The resulting back-and-forth was a continual quest for clarity, motivated on everyone’s behalf by incentives to want to understand – mutual trust and respect. Looking back, I’m pleased to say that tests and curricula seldom prevented us from pursuing what stimulated us most of all. We enjoyed very constructive lessons.
Of course, we studied through a lens of language and literature. Of particular interest to me was the construction of writing, by which I mean not just words but the grammar and punctuation that fit them together. My fascination with writing has been one of the best consequences of my own education, and I had encouraging – and one very demanding – writing teachers. In the classroom and on my own, I’ve always been drawn to structure as much as content, if not more so, which isn’t unorthodox although maybe not so common. The structure of writing gets me thinking on behalf of others: why has the writer phrased it this certain way? What other ways might be more or less well-suited for this audience? How might I have phrased something differently than this writer, and why? Most English teachers I know would agree that pondering such questions embodies a valuable constructive skill, these days trumpeted as critical thinking. I’d argue further that it’s even a pathway to virtue. Situated in context, such questions are inexhaustible, enabling a lifetime of learning, as literally every moment or utterance might be chosen for study.
In that respect, we loosely defined text beyond writing to include speech, body language, film, painting, music, architecture – literally any human interaction or endeavour. I’ll stick mostly with listening and speaking, reading and writing, just to simplify this discussion. The scope being so wide, really what our class sought to consider were aim and intention. So when students read a text for content, the WHAT, I’d ask them to consider choices made around vocabulary, syntax, arrangement, and so forth, the HOW. That inevitably posed further questions about occasion and motive, the WHY, which obliged varying degrees of empathy, humility, and discernment in reply: for a given writer, how best to write effectively on a topic while, for a given audience, what makes for skillful reading? What motives are inherent to each side of the dialogue? What incentives? These and others were the broader-based “BIG Question” objectives of my courses. They demanded detailed understanding of texts – heaven knows we did plenty of that. More importantly, the BIG Questions widened our context and appreciation even while they gave us focus. When times were frustrating, we had an answer for why studying texts mattered. Questions reflect motivation. Prior to exercising a constructive frame-of-mind, they help create one.
Questions, like everything else, also occur in a particular context. “Context is everything,” I would famously say, to the point where one class had it stencilled for me on a T-Shirt. So much packed into those three plain words – everything, I suppose. And that’s really my thesis here: if we aim to be constructive, and somehow do justice to that over-taxed concept, critical thinking, then we need to be actively considering what we hear and say or read and write alongside other people, and what it all makes us think for ourselves – especially when we disagree. (Is active thinking the same as critical thinking? I’m sure the phrase is hardly original, but I’ll consider the two kinds of thinking synonymous.) During my last 3-4 years in the classroom, all this came to be known by the rallying cry, “Raise the level of discourse!” These days, however, the sentiment is proving far more serious than something emblazoned on a T-Shirt.
I’m referring, of course, to the debacle that has been the 2016 U.S. Presidential election and its aftermath. Specifically, I have in mind two individual remarks, classic teachable moments inspired by current events. The first remark, from an NPR article by Brian Naylor on the fallout over the executive order banning Muslim immigrants, is attributed to the President. The second remark is a response in the comment section that follows Naylor’s article, representative of many commenters’ opinions. To begin, I’ll explain how something as detailed as grammar and punctuation can help raise the level of discourse, especially with such a divisive topic. From there, I’ll consider more broadly how and why we must always accept responsibility for our active use of language – sometimes correct grammar should matter not just to nit-pickers but to everybody.
In the article (February 8, 2017), Brian Naylor writes:
“Trump read parts of the statute that he says gives him authority to issue the ban on travel from seven predominantly Muslim nations, as well as a temporary halt in refugee admissions. ‘A bad high school student would understand this; anybody would understand this,’ he said.”
We all know the 45th U.S. President can be brusque, even bellicose, besides his already being a belligerent blundering buffoon. This comment was received in that light by plenty, me included. For instance, by classifying “bad” (versus “good”), the President appeals at once to familiar opposites: insecurity and self-worth. We’ve all felt the highs and lows of being judged by others, so “bad” versus “good” is an easy comparison and, thereby, a rudimentary emotional appeal. However, more to my point, his choice to compare high school students with lawyers, hyperbole or not, was readily construed as belittling since, rationally, everyone knows the difference between adult judges and teenaged students. That his ire on this occasion was aimed at U.S. District Judge James Robart is not to be misunderstood. Ironically, though, the President invokes the support of minors in a situation where they have neither legal standing nor professional qualification, rendering his remark not just unnecessarily divisive but inappropriate, and ignorant besides – although he must have known kids aren’t judges, right?
To be fair, here’s a slightly longer quotation of the President’s first usage of “bad student”:
“I thought, before I spoke about what we’re really here to speak about, I would read something to you. Because you could be a lawyer– or you don’t have to be a lawyer: if you were a good student in high school or a bad student in high school, you can understand this.”
Notice, in the first place, that I’ve transcribed and punctuated his vocal statement, having watched and listened to video coverage. As a result, I have subtly yet inevitably interpreted his intended meaning, whatever it actually was. Yet my punctuation offers only what I believe the President meant since they’re my punctuation marks.
So here are two other ways to punctuate it, for anyone who feels either is what the President said:
“Because you could be a lawyer, or you don’t have to be a lawyer – if you were a good student in high school or a bad student in high school, you can understand this.”
“Because you could be a lawyer. Or you don’t have to be a lawyer. If you were a good student in high school or a bad student in high school, you can understand this.”
Finally, but not exhaustively, here’s another:
“Because you could be a lawyer… or you don’t have to be a lawyer; if you were a good student in high school or a bad student in high school, you can understand this.”
Other combinations are possible.
Rather than dismiss all this as pedantry, I’d encourage you to see where I’m coming from and consider the semantics of punctuation. I’m hardly the only one to make the claim, and I don’t just refer to Lynne Truss. Punctuation does affect meaning, both what was intended and what was perceived. To interpret the President’s tone-of-voice, or his self-interrupting stream-of-consciousness, or his jarring pattern-of-speech, or whatever else, is to partly infer what he had in mind while speaking. We interpret all the time, listening not only to words but tone and volume, and by watching body language and facial expression. None of that is typically written down as such, except perhaps as narrative prose in some novel. The point here is that, in writing, punctuation fills part of the interpretive gloss.
Note also that a number of news headlines have used the word “even” as an interpreted addition of a word the President did not actually say. Depending upon how we punctuate his statement, inclusive of everything from words to tone to gestures to previous behaviour, perhaps we can conclude that he did imply “even” or, more accurately, perhaps it’s okay to suggest that it’s what he intended to imply. But he didn’t say it.
If we’re going to raise the level of discourse to something constructive, we need to balance whatever the President intended to mean by his statement with what we’ve decided he intended to mean. In the classroom, I put it to students as such: “Ask yourself where his meaning ends and yours begins.” It’s something akin to the difference between assuming (based on out-and-out guesswork because, honestly, who besides himself could possibly know what the President is thinking) and presuming (based on some likelihood from the past because, heaven knows, this President has offered plenty to influence our expectations). Whatever he meant by referring to good and bad students might be enraging, humbling, enlightening – anything. But only if we consider the overlap, where his meaning ends and ours begins, are we any better off ourselves, as analysts. Effective communication takes two sides, and critical thinking accounts for both of them.
Effective, though, is sometimes up for debate, not merely defining it but even deciding why it matters. Anyway, can’t we all generally figure out what somebody means? Isn’t fussing over details like grammar more about somebody’s need to be right? I’d argue that taking responsibility for our language includes details like grammar precisely so that an audience is not left to figure things out, or at least so they have as little ambiguity to figure out as possible. Anything less from a speaker or writer is lazy and irresponsible.
In the Comments section following Naylor’s article, a reader responds as follows:
“Precisely describing Trump’s base…bad high school students who’s [sic] level of education topped out in high school, and poorly at that. This is exactly what Trump and the GOP want, a poorly educated populous [sic] that they can control with lies and bigoted rhetoric.”
Substantively, the commenter – let’s call him Joe – uses words that (a) oversimplify, blanketing his fellow citizens, and (b) presume, placing Joe inside the President’s intentions. Who knows, maybe Joe’s correct, but I doubt he’s clairvoyant or part of the President’s inner circle. On the other hand, we’re all free to draw conclusions, to figure things out. So, on what basis has Joe made his claims? At a word count of 42, what was he aiming to contribute? Some of his diction is charged, yet at a mere two sentences, it’s chancy to discern his motives or incentives, lest we be as guilty as he is by characterising him as he characterises the President. Even if I’m supportive of Joe, it’s problematic defending his remarks for the same reason – they leave such a gap to fill. At 42 words, where he ends is necessarily where the rest of us begin, and maybe I’m simply better off ignoring his comment and starting from scratch. Maybe that’s fine, too, since we should all have our own opinions. In any event, Joe has hardly lived up to any measure of responsibility to anybody, himself included – here I am parsing his words months later in another country. I’d even say Joe loses this fight since his inflammatory diction and sweeping fallacy play to his opponents, if they so choose. Unsurprisingly, Joe’s comment is not at all constructive.
For all its faults, his comment aptly demonstrates the two-way nature of dialogue. On the one side, responsibility falls to each reader or listener to bring their research and experience, then discern for themselves what was meant. In that regard, Joe has left us with a lot of work to do, if we’re motivated enough to bother. Yet I chose his particular comment as mere illustration – literally hundreds of others, just as brief and labour-intensive, scroll by below Naylor’s article… so much work for us to do, or else to dismiss, or perhaps to gainsay, if not insult. On that note, consider the other side: responsibility falls to the speaker or writer to offer substantive claims as well as the evidence that prompted them. In this instance, no matter the justification for offering something at all, what can a two-sentence comment add to issues as complex and long-standing as, say, Presidential politics? Whether or not on-line comments are democracy in action, certainly offering 42 words in two sentences struggles to promote a meaningful, substantive exchange of ideas.
I used to liken such on-line comments to my students as standing in line, debating with others while waiting for coffee, before returning to our cars or our lives, none the more informed except perhaps annoyed by some while appreciative of others. With the best intentions, we might excuse people, overlooking that we’re the ones who walked out and drove away – maybe we were late for work that day. We’ve been closed-minded to the degree that we haven’t sought to reach a thorough understanding, and certainly we’ve failed to raise the level of discourse. Would it have been better to just say nothing, grab our coffee, and leave?
Yes, I think so, which may not be easy to accept. Conversely, consider that reasoning from presumption and enthymeme is not reasoning at all. Further, consider that two sentences of 42 words or a few minutes spent chatting in the coffee line will barely scratch the surface. Who can say what motivates people to contribute so readily yet so sparsely? Recent times are emotional, growing more volatile, and potentially far more dangerous, as a result. We see in Joe’s comment, and so many others like it, that trust and respect are divisively encased in separate echo chambers. By virtue of us versus them, both sides are challenged to be open-minded.
Worse, the so-called era of “post-truth” impedes exactly the constructive dialogue we need right now, raising ire and diatribe in place of substance and equanimity. Satire compounds disagreement and grows that much more venomous, and ridicule has a way of locking closed doors. I don’t support proceeding from pretence or unfounded opinion – there’s nothing whatsoever to show for an exchange-of-opinion based on falsehood. The burden of post-truth is far too high. A bias and the truth can co-exist, and they do, guaranteed – one truth, objective, and one bias per person, subjective. Bias is an inevitable fact of existence. Left unchecked, bias erodes respect, which is why a constructive approach is so crucial. As I’ve said elsewhere, post-truth is anti-trust, at least for me, and, at its furthest extent, a threat to civil security, which sounds alarmist – good, let it. We need to attend to this. More than ever now, we need respect or, failing that, at least greater tolerance. That’s for starters.
Worse still, in this post-truth world, fictional claims face no arbiter but the other side, so distrusted and maligned. The kind of polarised situation made infamous in Washington, DC is spreading, realised in a zillion on-line comments like Joe’s with every article published. Hopefully not this one – unless, maybe, someone hasn’t actually read this. On such a perilous path – facts in dispute, emotions inflamed – each side qualifies “open-minded” as unique to themselves and misappropriated by the rest. That is deeply divisive – the recipe for unrest that I spy, and it sounds my alarm. In that divided state, with nothing left to discuss, and even as reality has its way of catching up, what damage might already be done? Especially when facing fellow citizens, whatever we choose now must accord with what we’re prepared to accept later. Let that sober thought sink to the core, because the less we share common goals, the more we’re set to clash over unshared ones. But it’s within us to converse and to converge.
Let’s be willing to listen with empathy, understand with compassion, research with diligence, and respond with substance. Do your own investigation. Accept responsibility to inform yourself. Yes, take what you find with a grain of salt until you can believe to your own satisfaction what is right and trustworthy. Yet, even then, be tolerant if not respectful of others – too much salt is harmful. We all have our own motives and incentives for listening and participating, so let’s dig deeper than how pissed off we are with the other side: walking the high road with pride or smug assurance is really the low road and a path of hubris. It’s closed-minded, but not in the sense that we haven’t sought to reach a thorough understanding of the other side. It’s closed-minded to the degree that we haven’t sought to understand how and why the other side reached their position to begin with.
None of this is hard to understand. Once upon a time, we decided that education mattered, and it’s no accident that the trivium – grammar, rhetoric, dialectic – was given a central role. These days, its value in niche markets, notably private Christian education, is enough to switch some people off, which sadly exemplifies this entire discussion. I believe classical education is valuable for all. We’ve neglected it to our detriment, perhaps to our peril. We have a lot in common, more than we might credit, with our neighbours and fellow citizens. It’s not like they grew up on Mars. We’re not significantly different – hands up if you’re a human being. Start with that, some basic human dignity.
Rapport has much to offer our relationships, and there’s little to expect without it. All we can do is understand the other person’s interpretation, and they ours, and go from there – or else not. And it’s easy to nod and say, “I already do that while others do not.” But reflect upon yourself anyway, in every conversation, debate, or exchange. Humility is a virtue, even when kept low-key. Everybody bears responsibility for their own participation. The more we live up to being respectful, even of those whom we oppose, the more progress we’re liable to make – however slowly it might happen.
As I said at the outset, yes, humility’s easier said than done. But by the same token, why write this essay if 42 words would do? We must neither hide ourselves away nor proceed prematurely. We must be able to discern flaws of reason, and we must be able to communicate with humility if we aim to deliver – and, more critically, if we hope to be received – from a place of thoughtfully considered understanding. Whether or not we truly trust one another, let’s help put the logos back in dialogue and accept our own responsibility to approach people with intentional self-awareness. Let’s seize the opportunity to be role models – you just never know what somebody else is thinking. Let’s raise the level of discourse. And let’s remember that taking the high road must be open-hearted as well as open-minded.