Here’s Why. Here’s Precisely Why.

To all my students, and to whoever else wondered what and WHY I taught the stuff I did…

Has the unfolding story involving Cambridge Analytica and Facebook got you considering whether or not to keep your Facebook account?

The details of this story illustrate precisely why our coursework took direct aim at raising the level of discourse.

What is the story? In a nutshell, Facebook permitted C.A. wide access to its users’ private data, without their consent. Authorities now suspect that C.A. used the data to exert political influence in various countries around the world, specifically by way of on-line advertisements and news stories, both factual and contrived. People were fed information tailored to appeal to them and thereby challenged to discern the factual from the fictional.

Here is a sampling of reports about the story and its fallout, some of which has been severe.

 


The Story

  • The Guardian
  • Vox

The Global Significance

  • BBC News
  • Fox News
  • Fox News: Opinion
  • Al Jazeera: Opinion
  • CBC News: Opinion
  • CNN: Opinion


 

Raising the level of discourse is all about discerning and appreciating people’s motives, thereby helping to determine what people are after, in order to understand why they do what they do.

To my students, I am confident that our coursework has prepared you to face the kind of cognitive assault launched by C.A. on people across the world. For my part, I’m comfortable keeping my Facebook account open… for the time being, anyway!

To everybody else, yes, absolutely this post is a plug for raising the level of discourse, an approach I encourage all of you to consider.

Read more about my own general take on Facebook here.

From The New York Times – “Free Speech and the Necessity of Discomfort” and further Reflections on Journalism

A needfully challenging appeal to raise the level of discourse, and an appropriate inclusion to The Rhetorical WHY, from an Opinion piece in The New York Times (Feb 22, 2018) by Op-Ed columnist Bret Stephens:

This is the text of a lecture delivered at the University of Michigan on Tuesday [Feb 20, 2018]. The speech was sponsored by Wallace House.

“I’d like to express my appreciation for Lynette Clemetson and her team at Knight-Wallace for hosting me in Ann Arbor today. It’s a great honor. I think of Knight-Wallace as a citadel of American journalism. And, Lord knows, we need a few citadels, because journalism today is a profession under several sieges.…” [continue reading]

 

'Oddly enough, I feel offended...'
All That’s Fit to Print, Wiley Miller

Some thoughts of my own on the significance of a free press to our lives…

Next, I offer a series of responses I made to remarks by Hannah Arendt, published October 26, 1978 in The New York Review of Books (itself a report of her interview with Roger Errera). I encountered them in a Facebook post from The Bureau of Investigative Journalism.

 


Arendt: “… how can you have an opinion if you are not informed?”

Everybody has opinions – our five senses give us opinions. In order to be “informed,” we need discernment enough to detect accurate information.

Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”

For me, continual lies ultimately yield zero trust, but again, how would I know who’s even lying, but for my own discernment and experience?

At the least, if I were aware that all around were lies, that much I’d know is true. It’s not that “nobody believes anything any longer,” so much as it’s “everybody goes about searching out truth on their own.” The downside is when those individual searches for truth become disrespectful, as we’ve seen lately, or worse, chaotic.

Nevertheless, investigate! Accept responsibility to inform yourself. Accept or believe all with a grain of salt until such time as you can prove to your own satisfaction who and what are trustworthy. And, at that point, be tolerant, if not respectful, of others – this applies to everybody, all sides, liberals and conservatives and all points between. Taking the high road is not to be done with pride or smug assurance. It’s easy to nod and say, “I already do while others do not,” but even so, reflect upon yourself with each conversation, each debate, each exchange.

Open-minded and open-hearted – both are virtues, but they don’t have to be the same thing.

Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”

On its face, this statement could only be accurate if you had some clairvoyance or a crystal ball.

By “everybody” doing their own investigation and accepting responsibility to inform themselves, I mean everybody. We’re able to trust news & media sources to the extent that they have lived up to their responsibility… to the extent we’re aware that they have. I support proper, professional investigative journalism and public intellectualism, both of which I gather to be in decline.


'Well, apparently you haven't heard… personal opinions are the new facts.'
The New Facts, Chris Wildt

Finally, I offer two sets of remarks about journalism by two long-retired anchor-journalists of PBS fame, partners Robert MacNeil and Jim Lehrer. The first is transcribed from an exchange between them during a tribute to MacNeil upon his retirement in October 1995. The second – comprising two parts – is Lehrer’s closing words upon the “retirement” of his name from the title of the PBS NewsHour, on December 04, 2009. Following that, I’ve included a thoughtful follow-up by the PBS Ombudsman, Michael Getler, published the next week on December 11.

MacNeil’s remarks upon his retirement (October 20, 1995)…


MacNeil: You know, I’m constantly asked, and I know you are in interviews, and there have been a lot of them just now – I’m constantly asked, “But isn’t your program a little boring to some people?” and I find that amazing, because, well, sure, it probably is, but they’re people who don’t watch. The people who watch it all the time don’t find it boring, or they wouldn’t watch.

Lehrer: That’s right.

MacNeil: And it’s the strange idea that’s come out of this medium, because it’s become so much a captive of its tool – as its use as a sales tool that it’s driven increasingly, I think, by a tyranny of the popular. I mean, after all, you and I’ve said this to each other lots of times – might as well share it with the audience: what is the role of an editor? The role of an editor is to make– is to make judgments somewhere between what he thinks is important or what they think is important and what they think is interesting and entertaining.


Jim Lehrer’s guidelines of journalism (December 04, 2009)…


Lehrer: People often ask me if there are guidelines in our practice of what I like to call MacNeil/Lehrer journalism. Well, yes, there are. And here they are:

* Do nothing I cannot defend.

* Cover, write and present every story with the care I would want if the story were about me.

* Assume there is at least one other side or version to every story.

* Assume the viewer is as smart and as caring and as good a person as I am.

* Assume the same about all people on whom I report.

* Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.

* Carefully separate opinion and analysis from straight news stories, and clearly label everything.

* Do not use anonymous sources or blind quotes, except on rare and monumental occasions.

* No one should ever be allowed to attack another anonymously.

* And, finally, I am not in the entertainment business.

Here is how I closed a speech about our changes to our PBS stations family last spring:

‘We really are the fortunate ones in the current tumultuous world of journalism right now. When we wake up in the morning, we only have to decide what the news is and how we are going to cover it. We never have to decide who we are and why we are there.’


 

I am struck by the continuity of their respective final comments, about entertainment – each, in his own way, seeks to distance journalism from vagary, each thereby implying that we are susceptible to emotional or whimsical tendencies, which evidently seem capable of overtaking our focus on learning; otherwise, why mention the point at all?

 


  • Watch Lehrer’s remarks here, in a functional if awkward series of video archives of that 2009 broadcast.
  • In May 2011, upon Lehrer’s retirement, MacNeil returned to offer his own reflections upon his friend and colleague that include some further worthwhile commentary upon contemporary TV journalism.
  • Watch them during a more recent (October 25, 2016) retrospective interview from 92nd Street Y, a Jewish cultural and community centre in Manhattan.

 

I recall “Lehrer’s Rules,” as they were called, making a small stir – some of it substantive and meaningful, some of it the critical “woe-is-Us” lament at the passing of favourite things. In amongst it all, as I mentioned, were the following comments from PBS Ombudsman, Michael Getler, which I include here, at length, on account of PBS webpages’ tendency to disappear.

In fact, a number of the PBS pages where I found these articles are no longer active – where possible, I have checked, updated, and even added weblinks. But I believe Getler’s comments, like the rest, are worth preserving, on account of their potential to provoke us to think and learn more about a free press and its relation to ourselves.

 


“Lehrer’s Rules” by Michael Getler (December 11, 2009)

A couple of people wrote to me in the aftermath of that Dec. 4 sign-off to say how much they liked Lehrer’s guidelines and asked how they could get a copy. That’s why they are reproduced above. A subscriber to the widely-read Romenesko media news site also posted them there on Dec. 6 and they also were posted on the campus site of the Society of Professional Journalists (SPJ). “Whether you agree with all of Lehrer’s guidelines, or not,” that posting read, “he has surely earned our attention.”

That’s certainly true in my case. I’ve also been a devoted watcher of the NewsHour in all of its evolutions during most of the past 30-plus years, long before I took on this job four years ago. Although segments of the program have been the subject of critical ombudsman columns on a number of occasions, I’ve also said many times that it remains the best and most informative hour of news anywhere on television, and it has never been more important. I follow the news closely but almost always learn something from this broadcast every night.

Boring, at Times, But a Luxury Always

Sometimes, of course, it can seem boring. Sometimes the devotion to balanced he said/she said panel discussions can leave you frustrated and angry and no smarter than you were 15 minutes earlier. Sometimes the interviewing is less challenging than one might hope. But the luxury of an uninterrupted hour of serious, straight-forward news and analysis is just that these days, a luxury. And, in today’s world of media where fact and fiction, news and opinion, too often seem hopelessly blurred, it is good to have Lehrer – clearly a person of trust – still at work.

I had the sense when he added his guidelines to that closing segment last Friday that the 75-year-old Lehrer was trying to re-plant the flag of traditional, verifiable journalism that he has carried so well all these years so that it grows well beyond his tenure – whatever that turns out to be – and spreads to all the new platforms and audiences that the contemporary media world now encompasses.

Oddly, I did not get any e-mail from viewers commenting on the new NewsHour format, other than one critical message that said “do not post.” Maybe that’s a good sign since people usually write to me to complain.

Make no mistake, the now defunct NewsHour with Jim Lehrer is still quite recognizable within the new PBS NewsHour. So those who wrote earlier and said they didn’t want any change won’t be terribly disappointed. I, personally, found the first few days of the new format and approach to be a distinct improvement. The program seemed to have more zip and energy, faster paced, with good interviews and without the always predictable language that introduced the show in the past. It presented its news judgments more quickly, benefitted from the early introduction of other top staff members as co-anchors, and from the introduction of a promising “new guy,” Hari Sreenivasan, a former CBS and ABC correspondent who presents a headline summary from the newsroom and is the liaison to an expanded NewsHour Web operation.

Now, just to keep this a respectable ombudsman’s column, let me add a few quibbles when it comes to Lehrer’s rules, as posted above.

First, one of the interesting things about American journalism is that there are no agreed-upon national standards, no journalistic equivalent of the Hippocratic Oath for physicians. There are, of course, many universal values and practices that vast numbers of journalists have voluntarily adhered to generally for many years, best exemplified by SPJ’s Code of Ethics. But the fact is that all major news organizations – from the Associated Press to the New York Times to PBS and CBS – have their own guidelines and standards that they try and live by. And they all have their differences.

Naturally, a Few Quibbles

Lehrer’s guidelines embody lots of the good, praiseworthy stuff, and we come out of the same journalistic generation and traditions. But I think on a couple of points they are actually too nice, too lofty, cruising somewhere above some of the grittier realities of journalism.

For example, “Assume the viewer is as smart and as caring and as good a person as I am. Assume the same about all people on whom I report.” Really? Bernard Madoff? Osama bin Laden?

Then there is: “Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.” I would argue, and have, that the NewsHour withheld from its viewers at the time a legitimate turn in a major story – reported by all other major news organizations – last year when it declined to inform them that a former senator and former candidate for the vice-presidency, John Edwards, issued a public statement and went on ABC Television to acknowledge that he had had an extra-marital affair with a woman who had been hired by his political action committee to make films for his campaign. That’s news.

Finally, there is, “Do not use anonymous sources or blind quotes, except on rare and monumental occasions.” I agree about the blind quotes when they are used to attack someone personally. But anonymous sources have often proved to be absolutely crucial to the public’s right to know what’s really going on in scores of major stories as they have unfolded from Watergate to secret CIA prisons overseas.

The most accurate and important pre-war stories challenging the Bush administration’s on-the-record but bogus case for Iraqi weapons of mass destruction were based on anonymous sources. Many of those stories, in part because they were based on anonymous sources, got buried or underplayed by newspapers at the time. Many of them never got reported at all on television, including the NewsHour. But there are times when there are mitigating circumstances – like internal threats within an administration or maybe jail time for leakers – when some sources must remain anonymous and when editors need to trust their reporters. And often you don’t know if the occasion is “rare and monumental” until it is too late. Pre-war Iraq, again, being Exhibit A.


 

Freedom of the Press, Matteo Bertelli

Some other links…

World Association of Newspapers and News Publishers

World Press Freedom Index

Ryerson School of Journalism

Edelman Trust Barometer

Freedom House

Deciding over Derrida’s Différance

As far as I understand Jacques Derrida’s différance, he observes that we understand our experiences as distinctive, but not exhaustive, communicated links or marks comprising an on-going decisive chain of experiential moments. As to the language we use to describe our experiences, a word has contextual meaning, both from its usage at any given time as well as from its etymology over the course of time. I tend to agree with this attendance to context as furnishing meaning, and I can also spot the rabbit hole that it poses. For example, to understand some word’s definition, I might look it up in the dictionary and be left to rely upon the definition of whoever decided what it meant while, at the same time, face all sorts of words in the definition that now need looking up, too – Sisyphean, indeed! Cruel but so usual. On the other hand, thanks to whomever for compiling the dictionary, a pretty utile compendium, I have to say.

To be clear, I am not intending to invoke logocentrism, by which all our words are accorded a decided meaning from a cultural centre, which propagates existing biases or “privileges”; Derrida would roll over in his grave. Granted, I may already have laid grounds here to be accused of logocentrism, myself, by writing with words (and I confess to using English because I didn’t think anyone had the patience to muddle over Wingdings). My present aim is to suggest how we might address the afore-mentioned rabbit-hole dilemma by searching for or (… almost afraid to say it) by deciding upon some definitions of our own. Not like a dictionary, but more like– well yes, okay, like a dictionary, but one that we’ll fashion from the ground-up, like when the light bulb would go on above Darla’s head, and Spanky would snap his fingers to say, “Hey, everyone! Maybe we can put on a play!” So, in the spirit of dissemination, hey everybody, maybe we can compile a dictionary! A real, deconstructive, crowd-sourced dictionary!

I’m not really compiling a dictionary. I’m just trying to make some sense of Derrida and différance. Let me try to illustrate what I mean from my own experience. Sometimes I play Walking Football, a version of the game where players are not permitted to run. Naturally, the debate is over what differentiates walking from running. We’ve agreed that walking means “always having at least one foot in contact with the ground during the striding motion.” Running means “having both feet leave the ground at some point during the striding motion.” This makes for certainty, and so long as our eyes are trained enough to spot feet in motion, which I can spot sometimes so clearly, with such immediacy, that it’s more like I’m watching, not playing – I’m ghinding it tuff even now to ghet the right words, but trust me. And so long as each player is willing to obey the rules – and, ohh my, there’s always that one player who just won’t. You know who I mean… *sigh… Anyway, so long as they’re not just words uttered that then float away in the breeze, our definitions of the rules for walking and running are useful.

Luckily, too, I might add, when we clarify the rules, we do so out loud, together, and don’t whisper them around in a circle, like when my daughter plays Telephone at a birthday party – after all, we want everyone to be clear. Finally, even if we have trouble spotting feet in motion, because it all happens too quickly, or even if that one player is a cheater at heart, the definitions themselves remain clear, and usually at least one or two of us can remember them well enough to recite back, as needed, usually with a lot of finger-pointing and furrowed brows. One time we even wrote the no-running rule on the gym chalkboard, and even though no one challenged this, on the grounds that writing is secondary to speech, everyone still understood why it was scrawled there, by which I mean everyone knew exactly who should read it the most – wow, does every game have that player? Incorrigible.

Bottom line: accountability is down to the sincerity and respect offered to each player by every other player who decides to participate. As an aside, the need for a referee, an arbiter, is all the more clear when the stakes are as high as bragging rights and free beer. But, even as we play for fun, the rules exist or else the game, as such, does not. (On that note, I find a lot of players just don’t like Walking Football and would rather play with running, and that’s fine, too: it’s their decision, and plenty other like-minded players keep both games afloat. I find the Walking game amplifies decision-making, so maybe this feature just appeals to me. And I still play traditional football, too.) My broader point is that any one person must decide to accept what has been defined and, likewise, any group of people must reach a consensus. Shared meaning matters because, otherwise, as I say, we don’t have a game, or else we have a very different one, or we just have anarchy. But whether that person, alone, or the group, altogether, searching for a way to decide upon meaning, has the patience to delve down the rabbit hole… well, yes, context does indeed matter – both usage and etymology. I’ve said and written as much, myself, for a long time. So, in light of all this, I hope I’ve gathered a little something of Derrida’s différance. I’m still learning.

Another illustration: in my teaching, I occasionally introduced this matter of contextual meaning by offering students a list of synonyms: “slim,” “slender,” “skinny,” “thin,” “narrow.” Each word, of course, has its own particular meaning. “If they meant the same thing,” I offered, “then we’d use the same word,” so just what explains the need for all these synonyms? Well, students would say, there are lots of different things out there that possess or demonstrate these various adjectives (my word, not theirs), so we’ve come up with words to describe them (and I think that’s a charitable “we,” like the Royal “We.”) As the discussion proceeded, I might ask which of these words typically describe human traits versus those – leaving aside metaphors – that typically do not. Next, which words typically possess positive connotations, and which negative, or neutral? And, as it pertains to the personification metaphors, which words are more easily envisioned versus those that really stretch the imagination, or even credibility? Eventually, I would shift from ontology to epistemology, posing the questions at the heart of my intention: For any of the previous questions about these synonyms, how do you know what you’re talking about? For what each of these words could mean, where have your assurances come from? Of course, the most frequent reply to that question was “the dictionary,” followed by “my parents” or “books I’ve read,” or “just everyday experience, listening and talking to people.” Occasionally, the reply was something akin to “Who cares… it just means what it means, doesn’t it?” In every reply, though, one common thread was detectable: the involvement of other people as part of the meaning-making process. Fair enough, we can’t all be Thoreau.

One more example: when is “red” no longer red but perhaps orange or purple? Well, for one thing, if you’re colour blind, the question means something entirely different, which I say not flippantly but again to illustrate how important dialogue and community are to deciding what something means. For another thing, we might wish to ask, in keeping with context-dependency, “Why even ask?” Again, this is not flippant or dismissive but practical: when does it matter so that we distinctly need to identify the colour red? Where a group of people might face the question over what is red versus what is orange or purple, we might expect some kind of discussion to ensue. And, whether asking as part of such a group or as a hermit, alone, I submit that one person’s decision about what is “red” is ultimately down to one person to determine: “Red is this,” or “This is red,” or even, “Gosh, I still can’t really decide.” Even a coerced decision we can still attribute to the one who forces the issue – one person has decided on behalf of another, however benignly or violently: might makes right, or red, as it were.

Coercion introduces a political consideration about whose authority or power has influence, similar to needing referees on account of those players who decide to run. The point, for now, is simply that a decision over what something means to a person is ultimately made by a person, leaving others to deal with that decision on their own terms in whatever way. But other people are part of the meaning-making process, even passively, or else I wouldn’t need language to begin with since the rest of you wouldn’t trouble me by existing. Not to worry, by the way, I appreciate you reading this far. From what I understand (and I am convinced I must learn more, being no avid student of either postmodernism or Derrida), his observation of différance either discounts or else offers no account for the arbitrary decision-making that people might make when they decide they’ve had enough. People tend to land somewhere in a community, and it’s the rare person who lives and plays wholly and uncompromisingly by their own rules. However, the fact that he felt différance was worth the effort to publicise and explain to the rest of us does reflect an arbitrary decision on the part of Derrida and says something about him.

So this is where I have more fundamental trouble understanding Derrida and différance – the very notion of “different,” as in, in what world could there not be an arbiter? Even a life alone would face endless decisions: what to eat, where to go, when to sleep, and so forth. From such musing – speaking of rabbit holes – I was led to reading about another philosopher named Jacques, this one Rancière, and what he calls the axiom of equality. In pure nutshell form, I take this to mean that no (socio-political) inequality exists until it has been claimed to exist – and note that it’s claimed in a boat-rocking kind of way, what the kids these days are calling “disruptive.” The upshot is that equality, itself, can only ever be theoretical because someone somewhere inevitably is and always will be marginalised by the arbitrary decisions of cultural hegemony. Still learning.

Back to the Walking Football analogy: if the rabbit hole of defining a word in the context of those that surround it, and then having to define, even further, all those words, and on and on, and experience is inexhaustible, and what’s the point, and lift a glass to nihilism… if that kind of limitless indefinite deconstructive search-and-compare lies at the heart of what is different, then maybe Derrida just found it difficult to reach agreement with other people. It stands to reason that, if he played Walking Football, Derrida might be the worst cheater on the floor, continually running when he should be walking, then denying it just the same as he tried to gain advantage. Maybe, fed up being called a cheater, he would take his ball and go home to play by himself, where no one could say he was wrong. Being alone, who would be there, whether as an obedient player or as a sneaky one, to challenge him?

In fact, maybe that’s why he chose to return to play the next day – for all the arguing, he enjoyed the game, or the attention, or the camaraderie, or the exercise, or whatever, more than being accused of cheating. I wonder if, perhaps, in the great game of philosophy football, he would have been the only rival to strike real fear in Diogenes – I mean awe & respect kind of fear, just to clarify, and I mean if they had lived at the same time. It’s hard to know about Diogenes since nothing he wrote down ever survived, and these days, I doubt more than a few can recall any of whatever he said, besides that lamp-carrying honesty thing. (We should all have such good spirit when it comes to our first principles.) Anyway, I think Diogenes played for Wimbledon.

Well, I am being unkind to Derrida. Evidently, he was a kinder person by nature than I have let on, as well as an advocate for all voices, all people. And the professional care, the uncompromising expertise he took to convey his ideas, to trouble himself with delving down the rabbit hole so arbitrarily – to go down at all but, moreover, to go so far when he might, just the same, have decided to halt. Delve as far as you like, but accept responsibility for your decision, every time. In that respect, how does Derrida differ from any other person facing decisions? Did he have still other motivations? No player who kicks a football is deliberately playing to lose, not unless they have been coerced by someone else to do so. On the other hand, for all I know, maybe what Derrida called red I would call blue. Be careful not to pass the ball to the wrong team! (By the way, in sport, dynasties are remembered precisely because they eventually come to an end.)

Was Derrida no less accountable and open to scrutiny than you, or me, or anybody else? To suggest that a word only makes sense based on how it differs from those around it is no less arbitrary than its reciprocal suggestion, that a word only makes sense based on how it describes only what it describes. Half-full / half-empty, six of one… Two sides of the same coin are still the same coin. Alternatively, who put him up to all this? Meanwhile, on his own, surely Derrida had it within himself, as people do when they reach a point, simply to say, “Here is enough. I decide to stop here. For me, [the item in question] means this.” If that doesn’t ring true and sound like him, well, I’d say that can be just as telling of his character; I heard it suggested, once, how we can be helped in knowing something by what it is not. So, fine – for Derrida to stake the claim called différance, I’m willing to concede him that moment. We all land somewhere, and we’re all hardly alike, even when we’re alike.

We are, each and every one of us, individual. But together we comprise something just as dynamic on a larger scale – one might construe us societally, or perhaps historically, anthropologically, or on and on, in whatever way through whichever lens. For me, différance appears an attempt to speak for all about all, prescriptively. A grand stab at philosophy, no question, and that’s the beauty of the equality of philosophy, with thanks to Rancière: we all have a part to play and a right to respond. For the time being, as I have understood Derrida and his thinking, and I willingly stand to be instructed further, différance strikes me as ironic, being an advocacy for the dynamic development of people and language and culture that self-assuredly asserts its own accuracy. That is not an uncommon indictment of postmodernists. What’s more, it is ohh, so human.

Wanted on the Voyage: Professional Teachers are Experts in their Field

“The needs of the economy and our society are changing and therefore you need to have a learning system that fits the purpose, and that purpose is constantly shifting.”

So said Anthony Mackay, CEO of the Centre for Strategic Education (CSE) in Australia, during an interview with Tracy Sherlock from The Vancouver Sun. Mr Mackay was at SFU’s Wosk Centre for Dialogue in Vancouver on January 29, 2015, facilitating a forum about the changing face of education. Although links to the forum’s webcast archive and Sherlock’s interview are now inactive, I did save a copy of the interview text at the time, posted here beneath this essay. Tracy Sherlock has since told me that she doesn’t know why the interview’s links have been disconnected (e-mail communication, January 27, 2017). Nonetheless, there remains ample on-line and pdf-print promotion and coverage of the event.

The forum and the interview were first brought to my attention via e-mail, shared by an enthusiastic colleague who hoped to spur discussion, which is not an uncommon thing for teachers. Originally, I wrote distinct yet connected responses to a series of quotations from Mr Mackay’s interview. Here, some thirty-two months later, I’ve edited things into a more fluid essay although, substantively, my thoughts remain unchanged. Regrettably, so does the bigger picture.

For starters, Mr Mackay’s remark tips his hand – and that of the CSE – when he precedes “society” with “economy.” Spotting related news reports makes the idea somewhat more plausible, that of a new curriculum “…addressing a chronic skills shortage in one of the few areas of the Canadian economy that is doing well” (Silcoff)[1]. Meanwhile, in Sherlock’s interview [posted below this essay], Mr Mackay concludes by invoking “the business community,” “the economy of the future,” and employers’ confidence. Make no mistake, Mr Mackay is as ideological as anyone out there, including me and you and everybody, and I credit him for being obvious. On the other hand, he plays into the hands of the grand voice of public educators, perhaps willfully yet in a way that strikes me as disingenuous, couched in language so positive that you’re a sinner to challenge him. Very well, I accept the challenge.

Whatever “purpose” of education Mr Mackay has in mind, here, it’s necessarily more specific unto itself than to any single student’s interests or passions. In other words, as I take his portrayal, some student somewhere is a square peg about to be shown a round hole. Yet this so-called purpose is also “constantly shifting,” so perhaps these are triangular or star-shaped holes, or whatever, as time passes by.

Enter “discovery learning” – by the way, are we in classrooms, or out-and-about on some experiential trip? – and the teacher says only what the problem is, leaving the students to, well, discover the rest. I can see where it has a place; how it enables learning seems obvious enough since we learn by doing – teach someone to fish, and all. But when it comes to deciding which fish to throw back, or how many fish are enough when you don’t have a fridge to store them in before they rot and attract hungry bears… when it comes to deciding what’s more versus less important, those minutiae of mastery, it’s not always as easy as an aphorism or a live-stream video conference. Where it’s more hands-off from the teacher, in order to accommodate the student, discovery learning seems to me better suited to learners well past any novice stage. And if the teacher said, “Sorry, that’s not discovery learning,” would the students remain motivated? Some would; others most certainly would not: their problem, or the teacher’s? When both the teacher and the students say, “We really do need to follow my lead just now,” which party needs to compromise for the other, and to what extent? Teaching and learning ought to be a negotiation, yes, but never an adversarial one! In the case of “discovery learning,” I wonder whether “teacher” is even the right title anymore.

In any case, Mr Mackay appears guilty of placing the cart before the horse where it comes to educating students according to some systemic purpose. I’ve got more to say about this particular detail, what he calls “personalization.” For now, it’s worth setting some foundation: Ken Osborne wrote a book called Education, which I would recommend as a good basis for challenging Mr Mackay’s remarks from this interview.

That Osborne’s book was published in 1999 I think serves my point, which is to say that discernment, critical thinking, effective communication, and other such lauded 21st century skills were in style long before the impending obscurity of the new millennium. They have always offered that hedge against uncertainty. People always have and always will need to think and listen and speak and read, and teachers can rely on this. Let’s not ever lose sight of literacy of any sort, in any venue. Which reminds me…

“Isn’t that tough when we don’t know what the jobs of the future will be?”

I must be frank and admit… this notion of unimaginable jobs of the future never resonated with me. I don’t remember when I first heard it, or even who coined it, but way-back-whenever, it instantly struck me as either amazing clairvoyance or patent nonsense. I’ve heard it uttered umpteen times by local school administrators, and visiting Ministry staff, and various politicians promoting the latest new curriculum. The idea is widely familiar to most people in education these days: jobs of the future, a future we can’t even imagine! Wow!

Well, if the unimaginable future puzzles even the government, then good lord! What hope, the rest of us? And if the future is so unimaginable, how are we even certain to head any direction at all? When you’re lost in the wilderness, the advice is to stay put and wait for rescue. On the other hand, staying put doesn’t seem appropriate to this discussion; education does need to adapt and evolve, so we should periodically review and revise curricula. But what of this word unimaginable?

For months prior to its launch, proponents of BC’s new curriculum clarified – although, really, they admonished – that learning is, among other things, no longer about fingers quaintly turning the pages of outmoded textbooks. To paraphrase the cliché, that ship didn’t just sail, it sank. No need to worry, though. All aboard were saved thanks to new PDFs– er, I mean PFDs, personal floatation devices– er, um, that is to say personal floatation e-devices, the latest MOBI-equipped e-readers, to be precise. As for coming to know things (you know, the whole reason behind “reading” and all…), well, we have Google and the Internet for everything you ever did, or didn’t, need to know, not to mention a 24/7 news cycle, all available at the click of a trackpad. It’s the 21st century, and learning has reserved passage aboard a newer, better, uber-modern cruise ship where students recline in ergonomic deck chairs, their fingertips sliding across Smart screens like shuffleboard pucks. Welcome aboard! And did I mention? Technology is no mere Unsinkable Ship, it’s Sustainable too, saving forests of trees from the printing press (at a gigawatt-cost of electricity, mind, but let’s not pack too much baggage on this voyage).

Sorry, yes, that’s all a little facetious, and I confess to swiping as broadly and inaccurately as calling the future “unimaginable.” More to the point: for heaven’s sake, if we aren’t able to imagine the future, how on earth do we prepare anybody for it? Looking back, we should probably excuse Harland & Wolff, too – evidently, they knew nothing of icebergs. Except that they did know, just as Captain Smith was supposed to know how to avoid them.

But time and tide wait for no one which, as I gather, is how anything unimaginable arose in the first place. Very well, if we’re compelled toward the unknowable future, a cruise aboard the good ship Technology at least sounds pleasant. And if e-PFDs can save me weeks of exhausting time-consuming annoying life-skills practice – you know, like swimming lessons – so much the better. Who’s honestly got time for all that practical life-skills crap, anyway, particularly when technology can look after it for you – you know, like GPS.

If the 21st century tide is rising so rapidly that it’s literally unimaginable (I mean apart from being certain that we’re done with books), then I guess we’re wise to embrace this urgent… what is it, an alert? a prognostication? guesswork? Well, whatever it is, thank you, Whoever You Are, for such vivid foresight– hey, that’s another thing: who exactly receives the credit for guiding this voyage? Who’s our Captain aboard this cruise ship? Tech Departments might pilot the helm, or tend the engine room, but who’s the navigator charting our course to future ports of call? What’s our destination? Even the most desperate voyage has a destination; I wouldn’t even think a ship gets built unless it’s needed. Loosen your collars, everybody, it’s about to get teleological in here.

Q: What destination, good ship Technology?

A: The unknowable future…

Montague Dawson

Land?-ho! The Not-Quite-Yet-Discovered Country… hmm, would that be 21st century purgatory? Forgive my Hamlet reference – it’s from a mere book.

To comprehend the future, let’s consider the past. History can be instructive. Remember that apocryphal bit of historical nonsense, that Christopher Columbus “discovered America,” as if the entire North American continent lay indecipherably upon the planet, unbeknownst to Earthlings? (Or maybe you’re a 21st century zealot who only reads blogs and Twitter, I don’t know.) Faulty history aside, we can say that Columbus had an ambitious thesis, a western shipping route to Asia, without which he’d never have persuaded his political sponsors to back the attempt. You know what else we can say about Columbus, on top of his thesis? He also had navigation and seafaring skill, an established reputation that enabled him to approach his sponsors in the first place. As a man with a plan to chart the uncharted, even so Columbus possessed some means of measuring his progress and finding his way. In that respect, it might be more accurate to say he earned his small fleet of well-equipped ships. What history then unfolded tells its own tale, the point here simply that Columbus may not have had accurate charts, but he also didn’t set sail, clueless, to discover the unimaginable in a void of invisible nowhere.

But what void confronts us? Do we really have no clue what to expect? To hear the likes of Mackay tell it, with technological innovation this rapid, this influential, we’re going to need all hands on deck, all eyes trained forward, toward… what exactly? Why is the future so unimaginable? Here’s a theory of my very own: it’s not.

Snoopy the Blackeared Pirate

Discovering in the void might better describe Galileo, say, or Kepler, who against the mainstream recharted a mischarted solar system along with the physics that describe it. Where they disagreed over detail such as ocean tides (as I gather, Kepler was right), they each had pretty stable Copernican paradigms, mediated as much by their own empirical data as by education. Staring into the great void, perhaps these astronomers didn’t always recognise exactly what they saw, but they still had enough of the right stuff to interpret it. Again, the point here is not about reaching outcomes so much as holding a steady course. Galileo pilots himself against the political current and is historically vindicated on account of his curious mix of technological proficiency, field expertise, and persistent vision. For all that he was unable to predict or fully understand, Galileo still seemed to know where he was going.

Copernicus, Kepler, Galileo

I suppose if anyone might be accused of launching speculative missions into the great void of invisible nowhere, it would be NASA, but even there one finds clarity. Just to name a few: Pioneer, Apollo, Voyager, Hubble – missions with destinations, destinies, and legacies. Meanwhile, up in the middle of Nowhere, people now live in the International Space Station. NASA doesn’t launch people into space willy-nilly. It all happens, as it should, and as it must, in a context with articulated objectives. Such accomplishments do not arise because the future is unimaginable; on the contrary, they arise precisely because people are able to imagine the future.

Which brings me back to Mr Mackay and the government’s forum on education. It’s not accurate for me to pit one side against another when we all want students to succeed. If I’ve belaboured the point here, it’s because our task concerns young people, in loco parentis. Selling those efforts as some blind adventure seems, to me, the height of irresponsibility wrapped in an audacious marketing campaign disguised as an inevitable future, a ship setting sail so climb aboard, and hurry! Yes, I see where urgency is borne of rapid innovation, technological advancement made obsolete mere weeks or months later. For some, I know that’s thrilling. For me, it’s more like the America’s Cup race in a typhoon: thanks, but no thanks, I’ll tarry ashore a while longer, in no rush to head for open sea, not even aboard a vaunted ocean liner.

We simply mustn’t be so eager to journey into the unknown without objectives and a plan, not even accompanied as we are by machines that contain microprocessors, which is all “technology” seems to imply nowadays. There’s the respect that makes calamity of downloading the latest tablet apps, or what-have-you, just because the technology exists to make it available. How many times have teachers said the issue is not technology per se so much as knowing how best to use it? Teleology, remember? By the way, since we’re on the subject, what is the meaning of life? One theme seems consistent: the ambition of human endeavour. Sharpen weapon, kill beast. Discover fire, cook beast! Discover agriculture, domesticate beast. Realise surplus, and follows world-spanning conquest that eventually reaches stars.

Look, if learning is no longer about fingers quaintly turning the pages of outmoded textbooks, then fine. I still have my doubts – I’ve long said vinyl sounds better – but let that go. Can we please just drop the bandwagoning and sloganeering, and get more specific? By now, I’ve grown so weary of “the unimaginable future” as to give it the dreaded eye-roll. And if I’m a teenaged student, as much as I might be thrilled by inventing jobs of the future, I probably need to get to know me, too, what I’m all about.

In truth, educators do have one specific aim – personalized learning – which increasingly has come into curricular focus. Personalization raises some contentious issues, not least of which is sufficient funding, since individualized attention requires more time and resources per student. Nevertheless, it’s a strategy that I’ve found positive, and I agree it’s worth pursuing. That brings me back to Ken Osborne. One of the best lessons I gathered from his book was the practicality of meeting individuals wherever they reside as compared to determining broader needs and asking individuals to meet expectations.

Briefly, the debate presents itself as follows…

  • Side ‘A’ would determine communal needs and educate students to fill the roles

In my humble opinion, this is an eventual move toward social engineering and a return to unpleasant historical precedent. Know your history, everybody.

  • Side ‘B’ would assess an individual’s needs and educate a student to fulfil personal potential

In my humble opinion, this is a course that educators claim to follow every day, especially these days, and one that they would do well to continue pursuing in earnest.

Apollo 11

In my experience, students find collective learning models less relevant and less authentic than the inherent incentives found in personalized approaches that engender esteem and respect. Essentially, when we educate individuals, we leave them room to sort themselves out and accord them due respect for their ways and means along the way. In return, each person is able to grasp the value of personal responsibility. Just as importantly, the opportunity for self-actualisation is now not only unfettered but facilitated by school curricula, which I suspect is what was intended by all the “unimaginable” bluster. The communal roles from Osborne’s Side ‘A’ can still be filled, now by sheer numbers from the talent pool rather than by pre-conceived aims to sculpt square pegs for round holes.

Where I opened this essay with Anthony Mackay’s purposeful call to link business and education, I’ve been commenting as a professional educator because that is my field, so that is my purview. In fairness to government, I’ve found that more recent curricular promotion perhaps hints at reversing course from the murk of the “unimaginable” future by emphasizing, instead, more proactive talk of skills and empowerment. Even so, a different posture remains (touched upon in Katie Hyslop’s reporting of the forum and its participants, and a fairly common discursive thread in education in its own right) that implicitly conflates the aims of education and business, and even the arts. Curricular draft work distinguishes the “world of work” from details that otherwise describe British Columbia’s “educated citizen” (p. 2).[2] Both Ontario’s and Alberta’s curricular plans have developed comparably to BC’s, noting employers’ rising expectations that “a human capital plan” will address our ever-changing “world of work” (p. 5)[3] – it’s as if school’s industrial role were a given. Credit where it’s due, I suppose: they proceed from a vision towards a destination. And being neither an economist nor an industrialist, I don’t aim to question the broader need for business, entrepreneurship, or a healthy economy. Everybody needs to eat.

What I am is a professional educator, and that means I have been carefully and intentionally trained and accredited alongside my colleagues to envision, on behalf of all, what is best for students. So when I read a claim like Mr Mackay’s, that “what business wants in terms of the graduate is exactly what educators want in terms of the whole person,” I am wary that his educational vision and leadership are yielding our judgment to interests, such as commerce and industry, that lie beyond the immediately appropriate interests of students. Anthony Mackay demonstrates what is, for me, the greatest failing in education: leaders whose faulty vision makes impossible the very aims they set out to reach. (By the by, I’ve also watched such leadership condemn brilliant teaching that reaches those aims.) As much as a blanket statement, Mr Mackay’s is an unfounded one, and I could hardly do better to find an example of begging the question. If Mr Mackay is captain of the ship, then maybe responsible educators should be reading Herman Wouk – one last book, sorry, couldn’t resist.

Education is about empowering individuals to make their own decisions, and any way you slice it, individuals making decisions is how society diversifies itself. That includes diversifying the economy, as compared to the other way around (the economy differentiating individuals). Some people are inevitably more influential than others. All the more reason, then, for everybody, from captains of industry on down, to learn to accept responsibility for respecting an individual’s space, even while everybody learns to decide what course to ply for themselves. Personalized learning is the way to go as far as resources can be distributed, so leave that to the trained professional educators who are entrusted with the task, who are experts at reading the charts, spotting the hazards, and navigating the course, even through a void. Expertise is a headlight, or whatever those are called aboard ships, so where objectives require particular expertise, let us be led by qualified experts.

And stop with the nonsense. No unimaginable future “world of work” should be the aim of students to discover while their teachers tag along like tour guides. Anyway, I thought the whole Columbus “discovery” thing had helped us to amend that sort of thinking, or maybe I was wrong. Or maybe the wrong people decided to ignore history and spend their time, instead, staring at something they convinced themselves was impossible to see.


 

3 February 2015

Vancouver Sun Weekday Sample

Tracy Sherlock, Sun Education Reporter, VANCOUVER SUN (tsherlock@vancouversun.com). View a longer version of this interview at vancouversun.com.

Changing education to meet new needs

“The learning partnership has got to go beyond the partnership of young person and family, teacher and school, to the community and supportive agencies.”
TONY MACKAY, CEO, CENTRE FOR STRATEGIC EDUCATION IN AUSTRALIA

Tony Mackay, CEO at the Centre for Strategic Education in Australia, was in Vancouver recently, facilitating a forum about changing the education system to make it more flexible and personalized. He spoke about the rapidly changing world and what it means for education.

Q Why does the education system need to change?

A The needs of the economy and our society are changing and therefore you need to have a learning system that fits the purpose, and that purpose is constantly shifting. So it’s not just a matter of saying we can reach a particular level and we’ll be OK, because you’ve got such a dynamic global context that you have a compelling case that says we will never be able to ensure our ongoing level of economic and social prosperity unless we have a learning system that can deliver young people who are ready — ready for further education, ready for the workforce, ready for a global context. That’s the compelling case for change.

Q Isn’t that tough when we don’t know what the jobs of the future will be?

A In the past we knew what the skill set was and we could prepare young people for specialization in particular jobs. Now we’re talking about skill sets that include creativity, problem solving, collaboration, and the global competence to be flexible and to have cultural understanding. It’s not either or, it’s both and — you need fantastic learning and brilliant learning in the domains, which we know are fundamental, but you also need additional skills that increasingly focus on emotional and social, personal and inter-personal, and perseverance and enterprising spirit. And we’re not saying we just want that for some kids, we want to ensure that all young people graduate with that skill set. And we know they’re going to have to effectively “learn” a living — they’re going to have to keep on learning in order to have the kind of life that they want and that we’re going to need to have an economy that thrives. I believe that’s a pretty compelling case for change.

Q How do you teach flexibility?

A When I think about the conditions for quality learning, it’s pretty clear that you need to be in an environment where not only are you feeling emotionally positive, you are being challenged — there’s that sense that you are challenged to push yourself beyond a level of comfort, but not so much that it generates anxiety and it translates into a lack of success and a feeling of failure that creates blockages to learning. You need to be working with others at the same time — the social nature of learning is essential. When you’re working with others on a common problem that is real and you have to work as a team and be collaborative. You have to know how to show your levels of performance as an individual and as a group. You can’t do any of that sort of stuff as you are learning together without developing flexibility and being adaptive. If you don’t adapt to the kind of environment that is uncertain and volatile, then you’re not going to thrive.

Q What does the science of learning tell us?

A We now know more about the science of learning than ever before and the question is are we translating that into our teaching and learning programs? It’s not just deeper learning in the disciplines, but we want more powerful learning in those 21st-century skills we talked about. That means we have to know more than ever before about the emotions of learning and how to engage young people and how young people can encourage themselves to self-regulate their learning.

The truth is that education is increasingly about personalization. How do you make sure that an individual is being encouraged in their own learning path? How do we make sure we’re tapping into their strengths and their qualities? In the end, that passion and that success in whatever endeavour is what will make them more productive and frankly, happier.

Q But how do you change an entire education system?

A Once you learn what practice is done and is successful, how do you spread that practice in a school system so it’s not just pockets of excellence, but you’ve actually got an innovation strategy that helps you to spread new and emerging practice that’s powerful? You’re doing this all in the context of a rapidly changing environment, which is why you need those skills like flexibility and creativity. The learning partnership has got to go beyond the partnership of young person and family, teacher and school, to the community and supportive agencies. If we don’t get the business community into this call to action for lifelong learning even further, we are not going to be able to get there. In the end, we are all interdependent. The economy of the future — and we’re talking about tomorrow — is going to require young people with the knowledge, skills and dispositions that employers are confident about and can build on.

Previously available at http://www.pressreader.com/canada/the-vancouver-sun/20150203/282183649467268/TextView

and http://newsinvancouver.com/qa-changing-education-to-meet-new-needs/


From KQED News – How a Chilliwack School Ditched Awards and Assemblies to Refocus on Kids and Learning

What a terrific story from Chilliwack, BC, covered by Linda Flanagan for KQED, California’s Bay Area PBS news outlet.

Photograph credit: Flickr/Sarah

Agreed! Excellence is a culture.

Some leaders talk a great game, but no matter the words coming out of their mouths, people respond to the culture they’re part of, and within it, they respond in both overt and subtle ways.

By the way, leaders aren’t limited to those in the head office… leaders are people who take initiative, work to their strengths, and lift others to do the same… so pay attention to making your strengths and inspiration constructive instead of deflating or injurious.

If you aren’t getting the results you expected, then reflect and consider what reality people are experiencing and which messages they may be receiving around your place – maybe unintended ones – that might conflict with or work against your aims for attitude and behaviour.

It’s a shame when aims and culture contradict. It’s hypocrisy when aims are ignored or undermined by deliberately contradictory culture.

No shame in reflection. Reflection is learning, and learning’s a virtue.

Also agreed! School education should be “looking beyond the short term and thinking more about what kinds of adults they’re trying to develop.” That’s always been my approach.

Post-secondary, career, parenthood, civic involvement… all these and more will come about and, with guidance, each person will find their own way. But the adult human beings making their life decisions need a virtuous, thoughtful, positive foundation, and that’s what school education should always be about.

Click here to read Linda Flanagan’s story.

On-line Comments: Worthwhile, Bile, or Just Futile

Updated: Aug. 16, 2017

This is like the wild west, kind of lawless.

Colette A. Wilt

 

On-line comments are not guns; they don’t kill people. And the people who wield them – as in, write them – are not having a stand-off at high noon. On-line comments are not deadly but, boy, can they be deadly stupid.

They’re so very often uninformed, superficial, and emotionally driven as well as – frankly – bloody lazy. Plenty of opinions from plenty of people carrying free-speech chips on self-righteous shoulders. On-line comments, these days, are just another sign of the times.

“Just how many people bother to research and draft for a ‘Comments’ section response, anyway?”

Does it show I’m fed up with people trying to win personal pissing matches in the “Comments” section? Does it show? …people clawing their way to the top of some imagined pile of respect, in a community comprising whoever read the article – unless of course they only read the headline. Does it show? …the invective, the insults, the one-liner spree? Commenters affirming, negating, defending, attacking. Pointing out who’s so obviously wrong, what’s so evidently right. Commenters commenting, exercising their democracy, one comment at a time? On-line comments are the Twitter of– er, hmm, I’ll need some time to work on that one.

Of course I’m unable to say on-line comments kill people, but that’s not because they actually don’t kill people. It’s because, in the analogy, on-line comments are just the bullets. Computer keyboards would be the guns. And it’s still people pulling the trigger by pressing send – there’s got to be a triggering joke in here somewhere, I’m sure of it. For now, enough to say that guns don’t post comments, people do.

Time was when a letter-to-the-editor was the main public recourse, and sending one to your chosen publication was no guarantee of being published, or at least not published in full. But then came the Internet, the great equalizer. I can only suspect that, way back when, when the first on-line article permitted readers to leave comments, the author or editor or publisher proudly lifted a glass of wine to celebrate the enabling of the public voice. One step forward for free speech. Here’s to democracy.

How often I’ve read an article, then followed up with the on-line comments, thinking, “I’d like a sense of the broader opinion out there, maybe encounter some different perspectives, pick up a hyperlink or two for this topic.” This does still happen, and it’s what makes on-line comments, for me, worthwhile. It also means I’m relying on the other commenters to offer anything of substance. But, obviously (…is it obvious?), substance doesn’t always just happen. Honestly, though, it’s pretty naïve to expect that it would. And if you thought that, given the sheer number of comments on just one single article, the law of averages would help, then you probably haven’t read too many on-line comments. They can far, far surpass the length of the article and illustrate far, far less than broader opinion or different perspectives or anything useful at all about the topic. Just as often they proliferate because somebody needed to win.

How often is someone’s on-line comment about the article, as compared to that commenter seeking personal affirmation or recognition as some kind of uber-reliable source? How often does an on-line comment chain turn into a personal on-line shoving match? And how often has somebody replied along these lines: “You’re pretty tough when it’s not face-to-face…”?

Nobody thinks they’re even beginning to solve the issue [whatever it is] in the on-line Comments section. Do they? At least, they couldn’t possibly think so when all they’ve written is a sentence or two, right? At least, when they’ve written sentences. But, unquestionably, essay-length on-line comments are the exception to the rule. Aren’t they? At least, they are in the Age of Twitter – wait, sorry, I already slammed Twitter. This time, I’ll go with Google making us stupid (not for the first time). By the way, even shortened attention spans have been called into question (have a look, neither’s a long read). My own sense, for what it’s worth, is that we attend to what stimulates us the most although – egregiously – I have no research to back my opinion, and if any of you trolls call me on that, I’ll comment you back. So just be warned. Gotta be almost time for that trigger joke.

Are people commenting when maybe they should be writing an article of their own? Would that be too much responsibility to bear? to ask? Would writing an article require too much effort? People seem to care enough to leave a comment yet not enough to offer something more substantive than a line or two, or a paragraph the odd time. Even a few paragraphs, that one time by that one person, but anything truly edited for cohesion – are you kidding, what are we, journalists? How many of us are writers, period, much less paid ones? Heaven forbid anyone be expected to offer more than a few lines of opinion masquerading as oh-such-obvious-fact, or a one-liner, or a dogmatic tirade! (Yes, I not only see the irony, I intended it.) Leave all that responsibility crap for whoever else. Whomever, actually, but that would mean caring.

Who are you, anyway, that you’d present yourself in so superficial a manner as on-line comments yet expect to be taken seriously? Who are you, that you’d conflate your real-life person with your on-line persona in such a way that one belies the other? Which one is demonstrating the true you? Who are you, to be taking this so personally right now when, in fact, right now I’m giving you the benefit of the doubt? Cynicism aside, everyone can think – hence my frustration. If on-line comments suggest anything, they suggest that emotion rules, not thinking.

Don’t misconstrue – thinking and emotion both occur, I’d say, but by default, emotion controls thinking more than the other way around. Far more rarely does rationality show up beyond the article itself, if even there.

Below is an edited chain of comments that I cut & pasted from an NPR article posted to Facebook, about the 2017 bombing of Manchester Arena following the Ariana Grande concert.

To be precise, these comments that I cut & pasted are from the Facebook post, not NPR’s website. I also present these comments as a single, focused discussion when, in fact, other people’s semi-related comments, responding to still other people, had appeared in between some of these. But the order in which comments appear is evidently controlled more by their time stamp – when they were posted – than by which person is receiving a reply in the thread. So, in selecting only these comments here, I tried to maintain the direct discussion between particular people, back and forth. Finally, I’ve published their Facebook names with hyperlinks because all this is publicly published anyway, and nobody’s owed any shelter.

Rather than take sides, see if you can read this thread to understand my point: the futility of trying to solve such grand issues in a Comments section, the pointlessness of on-line comments in general. (Yes, I see the irony in having my own Comments section below. I even intended it.)

Ask of each comment, and each commenter…

  • what, really, is the motive behind this comment getting not just written but posted?
  • what, really, is the response that this person…
    • believes for themselves?
    • presumes from the other person?
    • seeks from anyone else (like us) who may be reading?

Read not only with self-awareness but with other-awareness, with empathy. But please resist taking sides on the issues, irrespective of your own feelings, because the point here is the comments having been crafted and shared, not the terror incident or the politics introduced along the way. A tangential point is to acknowledge that it’s possible and sometimes productive to keep our feelings and our rationality separate.

 


The Thread

James Alford What a great freedom festival! I just don’t know what we’d do without all the freedom that comes with unfettered access to semiautomatic weapons. Thanks for sharing this awesome display of our enviable freedom!

How do other nations cope without our awesome brand of freedom?! I mean, other than longer life expectancy, ultra low crime rates, drastically lower prison populations, and better overall quality of life.

Yep. Would never wanna swap my bullet-y freedom for any of that.

Scott Macleod How are the Ariana Grande concerts in places without freedom?

James Alford Scott, they do have WAY less freedom, don’t they?! They only have 0.23 gun homicides per 100,000. We have almost 11!!! Murkah!

Scott Macleod James, that is a meaningless statistic. Here I’ll show you:

Last year cars killed:
Switzerland 269
Israel 263
Sweden 285
Norway 145
Australia 1,299
Canada 2,075
United States 36,166

Deaths from drowning, children under 14:
Switzerland 3
Israel 8
Sweden 4
Norway 1
Australia 33
Canada 28
United States 548

Deaths from alcohol per year:
Switzerland 1,600
Israel n/a
Sweden 2,222
Norway 1,016
Australia 5,554
Canada 8,100
United States 88,000

The United States is an outlier on all of these. You can do the same breakdown with antibiotics. You can do it with hot water heaters. Or with deaths from bees. And the US will have higher death rates.

Jacqui Parker Percentages based on overall population would make more sense in your example.

Seth Martin Did anyone in this thread actually read the article?

Scott Macleod The numbers don’t change when broken down per capita. The US is still an outlier. Know why? Because deaths from X will always be higher in countries with more X. Determining causality is much more complicated. Would taking X away eliminate those deaths? Or would X just be substituted for something else and what would have been deaths from X become deaths from Y? This is what’s important.

Jim Chan Total death doesn’t equal to death rate. What are you a 2nd grader?

Scott Macleod Jim, see comment above. Per capita break down does not change the analysis.

Amon-Raa Valencia Scott Macleod the replacement theory can be checked by looking at life expectancy.

Do the countries you point out have higher life expectancy than the US?

James Alford I’m afraid you’re unacquainted with how percentages work, Scott.

If I have 10 tomatoes in my garden, and 2 of them are rotten, and you have 10,000 tomatoes, and 200 of them are rotten, then my problem is still 10 times bigger than yours, even though you have 100 times more rotten tomatoes.

Find a local 5th grader. He’ll be happy to provide more illustrations.

Wesley D. Stoner So Scott Macleod, in your example, if X = guns then the US logically has more gun deaths because there are more guns, right? Where do you think I am going next….

Scott Macleod So by that logic, James, those other countries have the same problem the US does from guns? Please respond without insults.

Scott Macleod <<Scott Macleod the replacement theory can be checked by looking at life expectancy.

Do the countries you point out have higher life expectancy than the US?>>

Other factors go into determining life expectancy. Access to healthcare for example.

James Alford Nevermind, Scott. The original stat said it all. The U.K. (who you brought up) has a gun homicide rate 44 times lower than ours. If you can’t grasp that incredibly straightforward piece of empirical data, then we don’t have a starting point.

Chris Toscano James Alford, you have unlocked Master Troll Level 99. Fine work sir! Look at all the ammosexuals that you have up in arms.

Carrie Anne 😂Ammosexuals

Margaret Moore Bennett Scott Macleod, I am a statistics teacher, you show a basic lack of understanding for how statistics work. You are a poster child for why the GOP is successful with the un and under-educated.

Scott Macleod <<Nevermind, Scott. The original stat said it all. The U.K. (who you brought up) has a gun homicide rate 44 times lower than ours. If you can’t grasp that incredibly straightforward piece of empirical data, then we don’t have a starting point.>>

A) Again, this stat is meaningless. It tells us nothing about causality or how public policy changes the death rates.

B) Those are GUN death rates. Of course a country with 330 million GUNS is going to have higher death rates from GUNS. Just like a country with greater access to antibiotics has more deaths from antibiotics. It tells us nothing about whether antibiotics or guns are good or bad for society.

Scott Macleod How about explaining it to me Margaret rather than resorting to ad hominem and appeals to authority?

Scott Macleod I am not uneducated. I am not a republican. There’s two misses. What are the odds your third claim that I demonstrate a lack of understanding for statistics is correct.

David Houghton Well, you led with raw numbers and not per capita numbers. Not exactly putting your best foot forward on the stats front.

Tandy Fitzgerald Scott Macleod does that mean a US citizen is more reckless when it comes to driving that the rest of the world and less aware when it comes to their children swimming or less aware of health issues and oblivious to the affects of alcohol? Man US citizens really do prefer to live on the edge far more than anyone else in the world…I guess freedom has more prices then just serving in the military.

Scott Macleod I led with what I had available to copy and paste to demonstrate what I was getting at. I agree it would have been better to break them down per capita. Alas, I’m on my phone and these comments move quickly.

Normally when you see this argument, though, it involves raw numbers. As I have said, what I was illustrating does not change when broken down per capita.

Nick Lucas Scott Macleod My favorite part about your posts is that you are trying to dismiss data because of your claims of causality but you make your first statement of the Ariana Grande concert without the same rule of thought.

What gun would have somehow stopped that bomb from exploding? Why didn’t a person with a gun stop the OK bombing or Boston bombing?

This is the problem with bias is we tend to not be able to apply the same logic to our own beliefs that we do others we disagree with.

Michael Dugger Scott Macleod no gun would’ve have stopped a silent bomb carrier Scott.

Scott Macleod Nick, my original comment was a quip. It was a snarky counter to the OP. I feel like you are reading too much into it.

Nevertheless, it does illustrate what I mentioned earlier. When X is not available, people will substitute with Y and nearly the same amount of people would likely die anyway. Why commit suicide with a $500 gun when you can do it with $3 of rope? Looking at guns only is a disingenuous way of looking at the problem. To be sincere, we would need to look at all homicides to determine causality.

I have not made the claim that access to guns will stop bombings.

Bill Melton “Would it make you happier, little girl, if they were pushed out of a seventh floor window?” Archie Bunker

Jenny Caldwell Scott Macleod Those aren’t death rates, those are simply the numbers of deaths. Death rates are population-based, i.e. # of deaths by drowning/1000. US death *rates* by gun violence are indeed much higher than other countries.

Paul Errman James Alford go cry yourself a river. When their violent crime rate drops and they actually have a population of over 300 million call us.

Onica Annika Scott Macleod you can’t take a gun into a concert permit or not. Stupid example.

Onica Annika Scott Macleod Snowflake are you triggered? 😂😂

Scott Macleod <<Scott Macleod you can’t take a gun into a concert permit or not. Stupid example.>>

I never said you could or that you should. Why are you bringing this irrelevant insight into the conversation?

Scott Macleod <<Scott Macleod Those aren’t death rates, those are simply the numbers of deaths. Death rates are population-based, i.e. # of deaths by drowning/1000. US death *rates* by gun violence are indeed much higher than other countries.>>

I know this. I never disagreed. US death rates by gun violence are higher. I never claimed otherwise. What I dispute is the significance of this information.

Scott Macleod We also have higher death RATES due to drowning, alcohol consumption, motor vehicle accidents, and a whole host of other phenomena. Why?

Looking at RATES and ignoring all other factors gives people a misleading glimpse into reality.

Andrew Scribner NUMBERS HURT BRAIN!!

Doug Bolantis https://www.youtube.com/watch?v=Ooa98FHuaU0

Choose Your Own Crime Stats

Onica Annika Scott Macleod YOU WROTR “How are the Ariana Grande concerts in places without freedom? 👌🏻”
By comparing the bombing in Manchester to carrying guns and implying people would be safer at concerts WITH GUNS is how THIS WAS BROUGHT UP.

You cannot shoot a suicide bomber without expecting to have an explosion. It would have made absolutely NO DIFFERENCE!


 

Next are comments following NPR’s report on two bounty hunters who engaged a fugitive at a car dealership in Texas, also in 2017 and also posted to Facebook.

 


The Next Thread

Jonathan Fitzgerald So I’m starting to get a little pissed by all this bounty hunter bashing. While a little rough around the edges. Boba Felt was a pretty decent guy. And could tell some great jokes, once he got a few drinks in him. Cad Bane was a generous and loving fellow. He was known to work at the soup kitchens all the time. So chill out. They aren’t ALL bad.

Candy Ellman Johannes That may be but when you see the video it’s quite clear that the two bounty hunters handled the situation very badly. Because they were like that doesn’t mean they were good at their JOB. It doesn’t mean they’re bad. Just that they shouldn’t have been handling this job.

Jonathan Fitzgerald I think you have never heard of Star Wars or a joke.

Isaac Unson Wow, what a tragic and visceral story! Should I maybe post a comment to spur discussion about bounty hunting, or the lack of consideration for things going south very badly?

Nahhhh, that’s original content worthy of discussion. Why not just be cynical and predictable instead and make the usual jokes about guns?

Russell Good It’s fitting this happened in a death merchants offices. More people are killed with vehicles, than anything else, and yet anyone can buy a car without a background check. Unlicensed drivers and unregistered vehicles are hurtling past the innocent in their thousands at this very minute. When will we stop the insanity?

Candy Ellman Johannes A death merchant? No, they’re not. They can’t control how someone is going to drive the cars they buy. And a background check will not tell you how they will do that. Just drive around our community in Texas and you will see a lot of idiots on the road. Most of whom would clear any background check you might think they should conduct.

Patrick Oleary Bounty hunters, we don’t need their scum!

Dan Varns Right, the bounty hunters are the bad guy, not the wanted criminal who assaulted a police officer. Got it. Brilliant, take you all day to come up with that??

Patrick Oleary Dan Varns whooooosh

Marc Brown Put Captain Solo in the cargo hold

Kristopher Danger Why read the article when I can just comment about how much I don’t like or understand firearms?

Isaac Unson Nah, that required critical thinking!

PaulandCarla Todaro Why would anyone want to watch that? Why would you tell us to? So sad that we can watch such violence and then continue to scroll.

Marcus Harrison You idiots this wasn’t about exercising freedom.You people are so stupid!

Christopher Hill How do libtards turn this into a gun control argument


 

I hardly even know where to begin with either discussion. The responses, pardon the pun, speak for themselves, from the aggressors to the defenders to the cooler heads to the comic relief. I don’t even say “aggressors” and “defenders” with any political bent so much as simply noting name-calling and tone. The fact that one person or another, with [whichever] political beliefs, is the aggressor or the defender, here, is not my point, which is why I cautioned to read without taking sides – everybody can be mean-spirited or good-willed, aggressive or defensive. My point is that everybody can also think and listen and reflect, if only they wish to do so, which means more targeted effort and more controlled emotional reaction.

The futility of quoted statistics, which are attacked and defended, as are the people themselves, in a forum that is informal and, for the most part, unmonitored (beyond hate speech or whatever else Facebook would moderate)… what’s the point of it all? Once these people close their browsers, what does each one feel he or she has accomplished? If little to nothing, then why even participate? If something more, then who besides themselves is measuring their effectiveness and, anyway, to what end? And who besides themselves even has a right to judge their effectiveness – especially since this cast of characters, evidently, would have plenty more to say about being judged, and we’d just kick-start another thread!

How many of you just now reading saw either of these comment threads before reading them here in my post? Which audience needs to see these threads, and why?

Are Comments sections some kind of exercise of free speech? If so, are they worth the trouble? On-line comments are not always anonymous, but they’re also not face-to-face, and that is perhaps what matters most to being responsible and thorough before posting something regarding another person. It matters in both positive and negative ways – positive because we owe the other person enough dignity to offer them an intelligent reply that respects their point of view, and negative because we can insult the bastard without (likely) ever feeling some physical repercussion. At the opening, I called on-line commenters lazy. Maybe they’re cowardly, too.

Geez, how seriously am I taking this? They’re just blog comments, for goodness’ sake!

 

Oh please, it’s a comments section not a peer-reviewed journal.

Christ, man

 

Here’s another partial comment thread, cut & pasted from The Atlantic website, this time without any comments removed from in between – these are consecutive responses to an article about America’s intellectual decline, a topic not too dissimilar from this very post, even though I disagree in detail with a number of the writer’s claims. However, again, the point here is not to debate the issues. It’s to note the motives and tone behind the comments.

 


The Last Thread

Assistantref

“but also because Christianity, for example, made Western civilization a possibility in the first place.”

What a truly odd assertion. What is your basis for it?

Duncan Tweedy

Start with Gutenberg. Then move on to education, art, medicine, culture, and philosophy. Don’t forget Martin Luther and Henry VIII.

Yes the Greeks and Arabs and others made their contributions. But how did those contributions find their way to becoming building blocks for Western Civ? Via the Romans (Christians by the end) and the Crusades (Christian holy wars). For centuries it was literate clergymen who preserved the ancient knowledge which would eventually set the stage for the Enlightenment.

Like it or not, Christianity is at least as inextricably entwined with the building of Western Civilization as any other influence one could name.

It’s truly odd that you find this overwhelmingly obvious fact truly odd.

David

No, Christianity made western civilization what it is today. It’s not what made it possible. However, we can never know what it would look like had Christianity not exerted the power it did.

Duncan Tweedy

In other news, if your Mom had chosen a different man to be your father, not only can we never know what you’d look like today–it wouldn’t in fact be you. That child might well not have even existed.

Europe’s faced many existential threats over the millennia. Change just a few events, and Western Civilization wouldn’t have survived. Subtract Christianity, and there’s a strong chance the region becomes conquered by neighboring civilizations, and never even develops the thing we now call Western Civilization.

And that’s as far as I need to go chasing after this particularly nonsensical counter-factual.

David

exactly, it’s a counter factual, no need to chase it.

Duncan Tweedy

“exactly, it’s a counter factual, no need to chase it.”

Asking what might have happened if different decisions had been made is often vital to understanding historical events. Although the process is inherently fraught with ambiguity, it’s a valid exercise.

Your reply makes it seem like perhaps you don’t grasp the purpose of a counter-factual.

This counter-factual is nonsensical. Not all of them are.

David

yes I understand that as well. This is getting a little exasperating. My only point in this was that one cannot attribute western civilization’s existence to christianity. At most, one can say that christianity was instrumental in the history and current state of western civilization.


 

Had David simply crafted a more thorough reply to begin with, as he indicates in the final response of this chain, he might have pre-empted all these back-and-forth remarks. More complete remarks might have stirred new ideas, better avenues for discussion, alternatives for research, and just a more thorough model for others to consider. And, sure, while his exchange with Duncan Tweedy was essentially civil and pain-free, he still created the potential for a bunch of negative things to occur…

(i) people might have accepted his cursory remarks, thereby reinforcing (albeit superficially) their own beliefs in that echo-chamber kind of way

(ii) people might have rejected his cursory remarks, thereby reinforcing (albeit superficially) their own beliefs in that polarising kind of way

(iii) people might have misread, misunderstood, misconstrued, or otherwise missed the context of his remarks and, additionally, might have failed to follow up this thread as far as the point where I have cut & pasted it here

(iv) as a result of (ii) or (iii), people might have grown upset or angry with his cursory remarks and taken him on with vitriol, or worse, simply have begun insulting him outright, neither of which contributes to any constructive progress but, rather, destructive regress and both of which inspire ill feelings that those people now carry into everyday life, and which might later be echoed – on-line or off-line – by still others

(v) some other outcome I haven’t mentioned

Had David afforded more time and thought to (a) the kind of response that could adequately convey his thinking, and also (b) the kinds of responses he might elicit from people who read his remarks if he were to write them this way or that way, then he wouldn’t have responded the way we find here. Yet it hardly seems worth critiquing these, or any, on-line comments at all – they’re so ubiquitous! Yes, I see the irony; in fact, I intended it.

Just how many people bother to research and draft for a “Comments” section response, anyway? The whole concept of the on-line “Comments” section seems tailor-made to evade the vetting and sober second-thought of taking a breath and waiting the requisite 24 hours before responding to messages we don’t like. I’d pay $49 for the t-shirt that reads, “Who took the ‘Editor’ out of ‘Letter to the Editor’? Send me $50 and I’ll tell you.”

Obviously, the question is not how many bother to research and draft. It’s not even a question of whether to bother researching and drafting. It’s a question of whether to bother engaging in the on-line comments to begin with. And I describe it as “bother” because research and drafting mean “work” – i.e. “What a blasted bother!” – a deliberate, driven effort versus the blurt of perfunctory emotional reaction, which has always been a human foible and which, these days, seems that much more common.

All that bother for… what, exactly? For someone to reply with one-line invectives? Who are we trying to reach, on-line? And, in light of that, who are we trying to be?

“Who are you on-line? The person versus the persona – it’s a concept worth considering.”

We’re all still responsible for the things we say, especially when they get published, and especially when “published” now means forever to be seen on-line (a consideration discussed here as well) – a newspaper or a book might at least fall out of print or get tossed in the trash. We hold an author responsible for the writing of an article, so why not a commenter for the offering of a comment? All the people I’ve quoted here have plenty to offer, I suspect, given their apparent literacy. But taking the time and resources at their disposal and using them in a more constructive way evidently hasn’t happened. Can that be changed?

What incentive would motivate these, or any, people to offer more when commenting on-line? Do people care about a growing reputation, however much or little it permeates the Cloud of e-culture? Who do they think they are? Who do they think that we think they are? Do they even care what others think they are? Do they care, themselves? There’s so little accountability: no formal editing or vetting as might be found in print publication, aside from moderation, as I’ve said, and that often simply automated. Sometimes there’s a log-in procedure via Facebook or Disqus, say, for whatever assurance that offers. It’s what allowed me to publish selected commenters here.

“Who are we on-line?” asks Flora the Explorer. It’s a question worth considering before we ever touch a keyboard. So, okay: who are you on-line? The person versus the persona – it’s a concept worth considering since I suspect 99% of on-line commenters will never meet their fellows face-to-face. Yet, precisely because of that physical separation, I suspect few care to consider (or even just bother to actively recognise) this concept. Yes, that’s ironic, and it’s a shame. If people did care more about (or at least actively acknowledge) the fact that dialogue comprises more than self, how much more might a conversation yield? As it is, on-line dynamics affect our selves so subtly yet profoundly that the Internet, the great democratic equalizer, is proving its ability to take us one step forward and two steps back.

Clint Eastwood, Eli Wallach, Lee Van Cleef
Who’s who? The more we conflate our Person with our Persona, the more care we must take to be honest about who we are presenting as ourselves. Having integrity has to mean more than having strong opinions, or talented chops. These three were willing to pose for a candid moment as something other-than-pretend (i.e. as themselves!) not merely because they could distinguish persona from person but, presumably, because they respected and even liked each other – go figure…

In and of themselves, given their entire context, culture included, article & blog comments run the risk of oversimplifying issues that warrant and deserve far greater diligence and time spent in meaningful appreciation. Issues that deserve… really? Why? Well, for starters, somebody published an article about [whatever it was], so now it’s out there for public consideration. Moreover, somebody decided that publishing [whatever it was] was worth the bother, and like you and me and everyone, that somebody deserves some basic dignity and respect, whether we ultimately agree with their published material or not. At a minimum, that requires reading the article, if not subsequently researching a bit more. Beyond that, it requires crafting responses of your own that do right by the author who invested the time and effort to create an article worthy of your comment – not “worthy” because you agree but “worthy” because you bothered to respond. Boy, all this bother! Why bother?

The Rhetorical WHY, itself, is a response not unlike what I suggest here – like anyone, I can’t cover it all in one go, but at the least, I can offer something more than a one-liner. The rest of you deserve that much, as I deserve likewise from the rest of you. So, in fact, it is about deserving: if one deserves, then all deserve – either no one is above any other, as far as it involves basic respect for human dignity, or we’re all of us bound for war, waged by all upon all.

For the record, this time I’m not trying to be ironic, though it might seem so now more than ever before. This time, it’s all too serious.

If people considered article & blog comments as I’ve tried to frame them here, as a matter of respect for human dignity, then comments – and public discourse, altogether – could be a whole lot different, and probably more constructive. Instead of comments, maybe people would compose entire articles of their own, which I remember is what Internet apologists used to boast: “The platform of the Internet is the great equalizer!” and “The Internet gives everyone a voice!” and “The Internet is democracy at its finest!” …that sort of thing. Shame that so many decide, instead, to use it superficially, far beneath both its potential and their own.

So, please, follow up on your own, comment and post, publish and be responsible. Contribute constructively. Most importantly, be thoughtful and thorough because that’s respectful of everybody else’s time and effort and bother. No one can cover every single detail, and every person has two cents of their own to add. But don’t be fooled by that minuscule metaphor – two cents refers to humility, so please make an effort to offer more than a reactive outburst.

Thanks, everybody, for leaving whatever considered comments you might have, and don’t let the end of this post be the end of your opinion.

What On Earth Were They Thinking?

How often have you heard somebody question people who lived “back then,” in the swirling historical mist, who somehow just didn’t have the knowledge, the nuance, or the capability that we so proudly wield today?

“Things back then were just a lot more simple.”

“They just weren’t as developed back then as we are today.”

“Society back then was a lot less informed than we are today.”

It’s difficult to even confront such statements out of their context, but since I’ve had all three of these spoken to me in (separate) conversations, I challenged each impression as an insinuation that we’re somehow smarter than all the peoples of history, more skilled, more sophisticated, more aware, more woke (as they say, these days) – that we’re, in the main, altogether better than our elders merely for having lived later than they did. These days, apparently, “we’re more evolved” – ha! 🙂  more evolved, that’s always a good one, as if back then everyone was Australopithecus while here we are jetting across the oceans, toting our iPhones, and drinking fine wines. Well, sure, maybe things have changed since back then, whenever “then” was. But, more typically I’ve found, contemporary judgments levelled upon history are born of an unintended arrogance, a symptom of 20/20 hindsight and the self-righteous assurance that, today, we’ve finally seen the error – actually, make that the errors – of their ways.

Surely, these days, few – if any – would believe that we’re ignorant or unaccomplished or incapable, not the way someone might say of us while looking back from our future. At any given point on the historical timeline, I wonder whether a person at that point would feel as good about their era, looking back, as any person at some other point of the timeline would feel, also looking back, about theirs. Is it a common tendency, this judgment of contemporary superiority? These days, we might well feel superior to people who had no indoor plumbing or viral inoculations or Internet access, just as someone earlier would have appreciated, say, some technological tool – a hydraulic lift to raise heavy objects, or a set of pulleys, or a first-class lever, choose whatever you like! It really is arbitrary for a person, looking back at history, to feel better about contemporary life because contemporary life infuses all that person’s experience while history’s something that must be learnt.

I’d say that learning history means learning whatever has lasted and been passed on, because what has lasted and been passed on was deemed to have merit. We’re taught the history that has been deemed worth remembering. What I’ve found has been deemed worth remembering (i.e., the kinds of things I learned from History classes) are the general mood of the times, the disputes and fights that occurred (violent or academic), a select handful of the figures involved, and an inventory of whichever non-living innovations and technologies arose alongside it all. If, later, we demerit and no longer pass on what has lasted up until then, passing on instead some different history, then that’s entirely indicative of us, now, versus anyone who came before us, and it reflects changed – but not necessarily smarter or better – priorities and values.

For me, we shouldn’t be saying we’re any smarter or better, only different. So much literature has lasted, so much art. Commerce has lasted, institutions have lasted, so much has lasted. Civilization has lasted. Cleverness, ingenuity, shrewdness, wit, insight, intellect, cunning, wisdom, kindness, compassion, deceit, pretence, honesty, so many, many human traits – and they all transcend generations and eras. People vary, but human nature perdures. I’ll trust the experts, far more immersed in specific historical study than I am, to identify slow or subtle changes in our traits – hell, I’ll even grant we may have culturally evolved, after all – and I can only imagine in how many ways the world is different now compared to before. But what does it mean to be better? Better than other people? Really? And who decides, and what’s the measure?

We can measure efficiency, for instance, so to say technology has advanced and is better than before is, I think, fair. Even then, an individual will have a subjective opinion – yours, mine, anybody’s – making culture not proactive and definitive but rather reactive and variable: a reflection, the result of the opinions comprising it, which amplify what is shared and stifle what is not. Since we’re taught merited history, we’re almost forced to concur, at least until we reconsider what has merit. That’s a sticking point because everyone will have an opinion on what is culturally better and what is culturally worse. Morality inevitably differs, and suddenly we have ethical debate, even disagreement, even discord. But to say people or culture are better is, I think, too subjective to rationalize and a questionable path to tread.

Consider this as well: we each know what we’ve learned, and as individuals, we’ve each learnt what we value. But what we’re taught is what’s broadly valued and, thereby, prescribed for all. We’ve all heard that rather hackneyed epigram, that those who neglect history are doomed to repeat it. Well, maybe the ones screwing up just didn’t learn the right history to begin with. I tend to abide by another hackneyed epigram, that they are wisest who know how little they know. Real historical wisdom and real historical understanding would mean seeing and thinking and understanding as people did back then. But short of firing up the DeLorean for an extended visit some place, some time, it seems to me that judgments about history are made with an aplomb that might be better aimed at acknowledging our finite limitations. We’re no angels. If anything, this error of judgment speaks volumes about us. Condescension is what it is, but in my opinion, it’s no virtue.

We should hardly be judging the past as any less able or intelligent or kind or tolerant or virtuous than us, especially not if we aim to live up to today’s woke cultural embrace of acceptance. Being different should never be something critiqued; it should be something understood. Conversely, in proportion to how much we know, passing judgment is assumptive, and we all know what Oscar Wilde had to say about assuming (at least, we know if we’ve studied that piece of history). At the very least, we ought to admit our own current assumptions, mistakes, errors, accidents, troubles, disputes, and wars before we pass any judgment on historical ones. On that positive note, I will say that considering all this prompted me to notice something maybe more constructive: so often, at least in my experience, what we seemed to study in History class were troublemaking causes and effects, bad decisions, and selfishly motivated behaviours. Far more rarely was History class ever the study of effective decision-making and constructive endeavour – maybe the odd time, but not usually. Maybe my History teachers were, themselves, stifled as products of the system that educated them. What could they do but pass it along to me and my peers? Considering that, I might more readily understand how people alive today could conclude that all who came before were simply not as enlightened, as sophisticated, or as adept as we are now.

Yet that merely implicates contemporary ignorance: assumptions and mistakes still happen, errors still occur, accidents – preventable or not – haven’t stopped, and troubles and disputes and wars rage on. If the axiom of being doomed to repeat history were no longer valid, we wouldn’t still feel and accept its truthful description, and it would long ago have faded from meaning. All I can figure is that we’re still poor at learning from history – the collective “we,” I mean, not you in particular (in case this essay was getting too personal). We need learned people in the right positions at the right times if we hope to prevent the mistakes of history. Not enough people, I guess, have bothered to study the branches of history with genuine interest. Or, no, maybe enough people have studied various branches of history, but they don’t remember the lessons sharply enough to take them on board. Or, no no, maybe plenty of people remember history, but the circumstances they face are just different enough to tip the scale out of favour. Epigram time: history doesn’t repeat, but it does rhyme. Or maybe we’re just full of ourselves, thinking that we’ve got it all solved when, evidently, we don’t.

It also dawned on me, considering all this, that high school “History” influences what many people think about broader “history.” My high school experience, university too, was mostly a study of politics and geography, plus what would be considered very elementary anthropology – all this as compared to the other branches of historical study. Archaeology and palaeontology come to mind as detailed, more scientific branches of history, but there are so many – literary history, philosophical history, religious, environmental, military, economic, technological, socio-cultural as I’ve already indicated, on and on they go, so many categories of human endeavour. I’ve even come across a thoughtful paper contemplating history as a kind of science, although one that is normative and susceptible to generational caprice. One final epigram: history is what gets written by the winners.

And that’s really the point here: throughout what we call human history, where we’ve subdivided it so many ways (right down to the perspective of every single person who ever bothered to contribute, if you want to break it down that far), it’s people all the way back, so it’s all biased, so it’s neither complete nor even accurate until you’ve spent oodles of time and effort creating a more composite comprehension of the available historical records. And, dear lord, who has time for that! History, in that respect, is barely conceivable in its entirety and hardly a thing to grasp so readily as to say, simply, “Back then…” History is people, and lives, and belief inscribed for all time. To know it is to know who lived it as well as who recorded it. Knowing others is empathy, and empathy is a skill trained and fuelled by curiosity and diligence, not emotion or opinion. Emotion and opinion come naturally and without effort. For me, valid history derives from informed empathy, not the other way around.

As far as recording history for future study, ultimately, it will have been people again recording and studying all of it, “it” being all of what people were doing to attract attention, and “doing” being whatever we care to remember and record. It’s all a bit cyclical, in itself, and completely biased, and someone will always be left out. So people might be forgiven when shaking their heads in judgment of the past because, without empathy, what else could they possibly know besides themselves?