Here’s Why. Here’s Precisely Why.

To all my students, and to whoever else wondered what and WHY I taught the stuff I did…

Has the unfolding story involving Cambridge Analytica and Facebook got you considering whether or not to keep your Facebook account?

The details of this story illustrate precisely why our coursework took direct aim at raising the level of discourse.

What is the story? In a nutshell, Facebook permitted C.A. wide access to its users’ private data without their consent. Authorities now suspect that C.A. used the data to exert political influence in various countries around the world, specifically by way of on-line advertisements and news stories, both factual and contrived. People were fed information tailored to appeal to them and challenged to discern the factual from the fictional.

Here is a sampling of reports about the story and its fallout, some of which has been severe.

 


The Story

The Guardian

Vox

The Global Significance

BBC News

Fox News

Fox News: Opinion

Al Jazeera: Opinion

CBC News: Opinion

CNN: Opinion


 

Raising the level of discourse is all about discerning and appreciating people’s motives, thereby helping to determine what people are after, in order to understand why they do what they do.

To my students, I am confident that our coursework has prepared you to face the kind of cognitive assault launched by C.A. on people across the world. For my part, I’m comfortable keeping my Facebook account open… for the time being, anyway!

To everybody else, yes, absolutely this post is a plug for raising the level of discourse, an approach I encourage all of you to consider.

Read more about my own general take on Facebook here.

From The New York Times – “Free Speech and the Necessity of Discomfort” and further Reflections on Journalism

A needfully challenging appeal to raise the level of discourse, and an appropriate inclusion in The Rhetorical WHY, from an Opinion piece in The New York Times (Feb 22, 2018) by Op-Ed columnist Bret Stephens:

This is the text of a lecture delivered at the University of Michigan on Tuesday [Feb 20, 2018]. The speech was sponsored by Wallace House.

“I’d like to express my appreciation for Lynette Clemetson and her team at Knight-Wallace for hosting me in Ann Arbor today. It’s a great honor. I think of Knight-Wallace as a citadel of American journalism. And, Lord knows, we need a few citadels, because journalism today is a profession under several sieges.…” [continue reading]

 

'Oddly enough, I feel offended...'
All That’s Fit to Print, Wiley Miller

Some thoughts of my own on the significance of a free press to our lives…

Next, I offer a series of responses I made to remarks by Hannah Arendt, published October 26, 1978 in The New York Review of Books (itself a report of her interview with Roger Errera). I encountered them in a Facebook post from The Bureau of Investigative Journalism.

 


Arendt: “… how can you have an opinion if you are not informed?”

Everybody has opinions – our five senses give us opinions. In order to be “informed,” we need discernment enough to detect accurate information.

Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”

For me, continual lies ultimately yield zero trust, but again, how would I know who’s even lying, but for my own discernment and experience?

At the least, if I were aware that all around were lies, that much I’d know is true. It’s not that “nobody believes anything any longer,” so much as it’s “everybody goes about searching out truth on their own.” The downside is when those individual searches for truth become disrespectful, as we’ve seen lately, or worse, chaotic.

Nevertheless, investigate! Accept responsibility to inform yourself. Accept or believe all with a grain of salt until such time as you can prove to your own satisfaction who and what are trustworthy. And, at that point, be tolerant, if not respectful, of others – this applies to everybody, all sides, liberals and conservatives and all points between. Taking the high road is not to be done with pride or smug assurance. It’s easy to nod and say, “I already do while others do not,” but even so, reflect upon yourself with each conversation, each debate, each exchange.

Open-minded and open-hearted – both are virtues, but they don’t have to be the same thing.

Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”

On its face, this statement could only be accurate if you had some clairvoyance or a crystal ball.

By “everybody” doing their own investigation and accepting responsibility to inform themselves, I mean everybody. We’re able to trust news & media sources to the extent that they have lived up to their responsibility… to the extent we’re aware that they have. I support proper, professional investigative journalism and public intellectualism, both of which I gather to be in decline.


'Well, apparently you haven't heard... personal opinions are the new facts.'
The New Facts, Chris Wildt

Finally, I offer two sets of remarks about journalism by two long-retired anchor-journalists of PBS fame, partners Robert MacNeil and Jim Lehrer. The first is transcribed from an exchange between them during a tribute to MacNeil upon his retirement in October 1995. The second – comprising two parts – is Lehrer’s closing words upon the “retirement” of his name from the title of the PBS NewsHour, on December 04, 2009. Following that, I’ve included a thoughtful follow-up by the PBS Ombudsman, Michael Getler, published the next week on December 11.

MacNeil’s remarks upon his retirement (October 20, 1995)…


MacNeil: You know, I’m constantly asked, and I know you are in interviews, and there have been a lot of them just now – I’m constantly asked, “But isn’t your program a little boring to some people?” and I find that amazing, because, well, sure, it probably is, but they’re people who don’t watch. The people who watch it all the time don’t find it boring, or they wouldn’t watch.

Lehrer: That’s right.

MacNeil: And it’s the strange idea that’s come out of this medium, because it’s become so much a captive of its tool – as its use as a sales tool that it’s driven increasingly, I think, by a tyranny of the popular. I mean, after all, you and I’ve said this to each other lots of times – might as well share it with the audience: what is the role of an editor? The role of an editor is to make– is to make judgments somewhere between what he thinks is important or what they think is important and what they think is interesting and entertaining.


Jim Lehrer’s guidelines of journalism (December 04, 2009)…


Lehrer: People often ask me if there are guidelines in our practice of what I like to call MacNeil/Lehrer journalism. Well, yes, there are. And here they are:

* Do nothing I cannot defend.

* Cover, write and present every story with the care I would want if the story were about me.

* Assume there is at least one other side or version to every story.

* Assume the viewer is as smart and as caring and as good a person as I am.

* Assume the same about all people on whom I report.

* Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.

* Carefully separate opinion and analysis from straight news stories, and clearly label everything.

* Do not use anonymous sources or blind quotes, except on rare and monumental occasions.

* No one should ever be allowed to attack another anonymously.

* And, finally, I am not in the entertainment business.

Here is how I closed a speech about our changes to our PBS stations family last spring:

‘We really are the fortunate ones in the current tumultuous world of journalism right now. When we wake up in the morning, we only have to decide what the news is and how we are going to cover it. We never have to decide who we are and why we are there.’


 

I am struck by the continuity of their respective final comments about entertainment – each, in his own way, seeks to distance journalism from vagary, each thereby implying that we are susceptible to emotional or whimsical tendencies, which evidently seem capable of overtaking our focus on learning; otherwise, why mention the point at all?

 


  • Watch Lehrer’s remarks here, in a functional if awkward series of video archives of that 2009 broadcast.
  • In May 2011, upon Lehrer’s retirement, MacNeil returned to offer his own reflections upon his friend and colleague that include some further worthwhile commentary upon contemporary TV journalism.
  • Watch them during a more recent (October 25, 2016) retrospective interview from 92nd Street Y, a Jewish cultural and community centre in Manhattan.

 

I recall “Lehrer’s Rules,” as they were called, making a small stir – some of it substantive and meaningful, some of it the critical “woe-is-Us” lament at the passing of favourite things. In amongst it all, as I mentioned, were the following comments from PBS Ombudsman, Michael Getler, which I include here, at length, on account of PBS webpages’ tendency to disappear.

In fact, a number of the PBS pages where I found these articles are no longer active – where possible, I have checked, updated, and even added weblinks. But I believe Getler’s comments, like the rest, are worth preserving, on account of their potential to provoke us to think and learn more about a free press and its relation to ourselves.

 


“Lehrer’s Rules” by Michael Getler (December 11, 2009)

A couple of people wrote to me in the aftermath of that Dec. 4 sign-off to say how much they liked Lehrer’s guidelines and asked how they could get a copy. That’s why they are reproduced above. A subscriber to the widely-read Romenesko media news site also posted them there on Dec. 6 and they also were posted on the campus site of the Society of Professional Journalists (SPJ). “Whether you agree with all of Lehrer’s guidelines, or not,” that posting read, “he has surely earned our attention.”

That’s certainly true in my case. I’ve also been a devoted watcher of the NewsHour in all of its evolutions during most of the past 30-plus years, long before I took on this job four years ago. Although segments of the program have been the subject of critical ombudsman columns on a number of occasions, I’ve also said many times that it remains the best and most informative hour of news anywhere on television, and it has never been more important. I follow the news closely but almost always learn something from this broadcast every night.

Boring, at Times, But a Luxury Always

Sometimes, of course, it can seem boring. Sometimes the devotion to balanced he said/she said panel discussions can leave you frustrated and angry and no smarter than you were 15 minutes earlier. Sometimes the interviewing is less challenging than one might hope. But the luxury of an uninterrupted hour of serious, straight-forward news and analysis is just that these days, a luxury. And, in today’s world of media where fact and fiction, news and opinion, too often seem hopelessly blurred, it is good to have Lehrer – clearly a person of trust – still at work.

I had the sense when he added his guidelines to that closing segment last Friday that the 75-year-old Lehrer was trying to re-plant the flag of traditional, verifiable journalism that he has carried so well all these years so that it grows well beyond his tenure – whatever that turns out to be – and spreads to all the new platforms and audiences that the contemporary media world now encompasses.

Oddly, I did not get any e-mail from viewers commenting on the new NewsHour format, other than one critical message that said “do not post.” Maybe that’s a good sign since people usually write to me to complain.

Make no mistake, the now defunct NewsHour with Jim Lehrer is still quite recognizable within the new PBS NewsHour. So those who wrote earlier and said they didn’t want any change won’t be terribly disappointed. I, personally, found the first few days of the new format and approach to be a distinct improvement. The program seemed to have more zip and energy, faster paced, with good interviews and without the always predictable language that introduced the show in the past. It presented its news judgments more quickly, benefitted from the early introduction of other top staff members as co-anchors, and from the introduction of a promising “new guy,” Hari Sreenivasan, a former CBS and ABC correspondent who presents a headline summary from the newsroom and is the liaison to an expanded NewsHour Web operation.

Now, just to keep this a respectable ombudsman’s column, let me add a few quibbles when it comes to Lehrer’s rules, as posted above.

First, one of the interesting things about American journalism is that there are no agreed-upon national standards, no journalistic equivalent of the Hippocratic Oath for physicians. There are, of course, many universal values and practices that vast numbers of journalists have voluntarily adhered to generally for many years, best exemplified by SPJ’s Code of Ethics. But the fact is that all major news organizations – from the Associated Press to the New York Times to PBS and CBS – have their own guidelines and standards that they try and live by. And they all have their differences.

Naturally, a Few Quibbles

Lehrer’s guidelines embody lots of the good, praiseworthy stuff, and we come out of the same journalistic generation and traditions. But I think on a couple of points they are actually too nice, too lofty, cruising somewhere above some of the grittier realities of journalism.

For example, “Assume the viewer is as smart and as caring and as good a person as I am. Assume the same about all people on whom I report.” Really? Bernard Madoff? Osama bin Laden?

Then there is: “Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.” I would argue, and have, that the NewsHour withheld from its viewers at the time a legitimate turn in a major story – reported by all other major news organizations – last year when it declined to inform them that a former senator and former candidate for the vice-presidency, John Edwards, issued a public statement and went on ABC Television to acknowledge that he had had an extra-marital affair with a woman who had been hired by his political action committee to make films for his campaign. That’s news.

Finally, there is, “Do not use anonymous sources or blind quotes, except on rare and monumental occasions.” I agree about the blind quotes when they are used to attack someone personally. But anonymous sources have often proved to be absolutely crucial to the public’s right to know what’s really going on in scores of major stories as they have unfolded from Watergate to secret CIA prisons overseas.

The most accurate and important pre-war stories challenging the Bush administration’s on-the-record but bogus case for Iraqi weapons of mass destruction were based on anonymous sources. Many of those stories, in part because they were based on anonymous sources, got buried or underplayed by newspapers at the time. Many of them never got reported at all on television, including the NewsHour. But there are times when there are mitigating circumstances – like internal threats within an administration or maybe jail time for leakers – when some sources must remain anonymous and when editors need to trust their reporters. And often you don’t know if the occasion is “rare and monumental” until it is too late. Pre-war Iraq, again, being Exhibit A.


 

Freedom of the Press, Matteo Bertelli

Some other links…

World Association of Newspapers and News Publishers

World Press Freedom Index

Ryerson School of Journalism

Edelman Trust Barometer

Freedom House

Deciding over Derrida’s Différance

As far as I understand Jacques Derrida’s différance, he observes that we understand our experiences as distinctive, but not exhaustive, communicated links or marks comprising an on-going decisive chain of experiential moments. As to the language we use to describe our experiences, a word has contextual meaning, both from its usage at any given time as well as from its etymology over the course of time. I tend to agree with this attendance to context as furnishing meaning, and I can also spot the rabbit hole that it poses. For example, to understand some word’s definition, I might look it up in the dictionary and be left to rely upon the definition of whoever decided what it meant while, at the same time, facing all sorts of words in the definition that now need looking up, too – Sisyphean, indeed! Cruel but so usual. On the other hand, thanks to whomever for compiling the dictionary, a pretty utile compendium, I have to say.

To be clear, I am not intending to invoke logocentrism, by which all our words are accorded a decided meaning from a cultural centre, which propagates existing biases or “privileges”; Derrida would roll over in his grave. Granted, I may already have laid grounds here to be accused of logocentrism, myself, by writing with words (and I confess to using English because I didn’t think anyone had the patience to muddle over Wingdings). My present aim is to suggest how we might address the afore-mentioned rabbit-hole dilemma by searching for or (… almost afraid to say it) by deciding upon some definitions of our own. Not like a dictionary, but more like– well yes, okay, like a dictionary, but one that we’ll fashion from the ground-up, like when the light bulb would go on above Darla’s head, and Spanky would snap his fingers to say, “Hey, everyone! Maybe we can put on a play!” So, in the spirit of dissemination, hey everybody, maybe we can compile a dictionary! A real, deconstructive, crowd-sourced dictionary!

I’m not really compiling a dictionary. I’m just trying to make some sense of Derrida and différance. Let me try to illustrate what I mean from my own experience. Sometimes I play Walking Football, a version of the game where players are not permitted to run. Naturally, the debate is over what differentiates walking from running. We’ve agreed that walking means “always having at least one foot in contact with the ground during the striding motion.” Running means “having both feet leave the ground at some point during the striding motion.” This makes for certainty, and so long as our eyes are trained enough to spot feet in motion, which I can spot sometimes so clearly, with such immediacy, that it’s more like I’m watching, not playing – I’m ghinding it tuff even now to ghet the right words, but trust me. And so long as each player is willing to obey the rules – and, ohh my, there’s always that one player who just won’t. You know who I mean… *sigh… Anyway, so long as they’re not just words uttered that then float away in the breeze, our definitions of the rules for walking and running are useful.

Luckily, too, I might add, when we clarify the rules, we do so out loud, together, and don’t whisper it around in a circle, like when my daughter plays Telephone at a birthday party – after all, we want everyone to be clear. Finally, even if we have trouble spotting feet in motion, because it all happens too quickly, or even if that one player is a cheater at heart, the definitions themselves remain clear, and usually at least one or two of us can remember them well enough to recite back, as needed, usually with a lot of finger-pointing and furrowed brows. One time we even wrote the no-running rule on the gym chalkboard, and even though no one challenged this, on the grounds that writing is secondary to speech, everyone still understood why it was scrawled there, by which I mean everyone knew exactly who should read it the most – wow, does every game have that player? Incorrigible.

Bottom line: accountability is down to the sincerity and respect offered to each player by every other player who decides to participate. As an aside, the need for a referee, an arbiter, is all the more clear when the stakes are as high as bragging rights and free beer. But, even as we play for fun, the rules exist or else the game, as such, does not. (On that note, I find a lot of players just don’t like Walking Football and would rather play with running, and that’s fine, too: it’s their decision, and plenty other like-minded players keep both games afloat. I find the Walking game amplifies decision-making, so maybe this feature just appeals to me. And I still play traditional football, too.) My broader point is that any one person must decide to accept what has been defined and, likewise, any group of people must reach a consensus. Shared meaning matters because, otherwise, as I say, we don’t have a game, or else we have a very different one, or we just have anarchy. But whether that person, alone, or the group, altogether, searching for a way to decide upon meaning, has the patience to delve down the rabbit hole… well, yes, context does indeed matter – both usage and etymology. I’ve said and written as much, myself, for a long time. So, in light of all this, I hope I’ve gathered a little something of Derrida’s différance. I’m still learning.

Another illustration: in my teaching, I occasionally introduced this matter of contextual meaning by offering students a list of synonyms: “slim,” “slender,” “skinny,” “thin,” “narrow.” Each word, of course, has its own particular meaning. “If they meant the same thing,” I offered, “then we’d use the same word,” so just what explains the need for all these synonyms? Well, students would say, there are lots of different things out there that possess or demonstrate these various adjectives (my word, not theirs), so we’ve come up with words to describe them (and I think that’s a charitable “we,” like the Royal “We.”) As the discussion proceeded, I might ask which of these words typically describe human traits versus those – leaving aside metaphors – that typically do not. Next, which words typically possess positive connotations, and which negative, or neutral? And, as it pertains to the personification metaphors, which words are more easily envisioned versus those that really stretch the imagination, or even credibility? Eventually, I would shift from ontology to epistemology, posing the questions at the heart of my intention: For any of the previous questions about these synonyms, how do you know what you’re talking about? For what each of these words could mean, where have your assurances come from? Of course, the most frequent reply to that question was “the dictionary,” followed by “my parents” or “books I’ve read,” or “just everyday experience, listening and talking to people.” Occasionally, the reply was something akin to “Who cares… it just means what it means, doesn’t it?” In every reply, though, one common thread was detectable: the involvement of other people as part of the meaning-making process. Fair enough, we can’t all be Thoreau.

One more example: when is “red” no longer red but perhaps orange or purple? Well, for one thing, if you’re colour blind, the question means something entirely different, which I say not flippantly but again to illustrate how important dialogue and community are to deciding what something means. For another thing, we might wish to ask, in keeping with context-dependency, “Why even ask?” Again, this is not flippant or dismissive but practical: when does it matter so that we distinctly need to identify the colour red? Where a group of people might face the question over what is red versus what is orange or purple, we might expect some kind of discussion to ensue. And, whether asking as part of such a group or as a hermit, alone, I submit that one person’s decision about what is “red” is ultimately down to one person to determine: “Red is this,” or “This is red,” or even, “Gosh, I still can’t really decide.” Even a coerced decision we can still attribute to the one who forces the issue – one person has decided on behalf of another, however benignly or violently: might makes right, or red, as it were.

Coercion introduces a political consideration about whose authority or power has influence, similar to needing referees on account of those players who decide to run. The point, for now, is simply that a decision over what something means to a person is ultimately made by a person, leaving others to deal with that decision on their own terms in whatever way. But other people are part of the meaning-making process, even passively, or else I wouldn’t need language to begin with since the rest of you wouldn’t trouble me by existing. Not to worry, by the way, I appreciate you reading this far. From what I understand (and I am convinced I must learn more, being no avid student of either postmodernism or Derrida), his observation of différance either discounts or else offers no account for the arbitrary decision-making that people might make when they decide they’ve had enough. People tend to land somewhere in a community, and it’s the rare person who lives and plays wholly and uncompromisingly by their own rules. However, the fact that he felt différance was worth the effort to publicise and explain to the rest of us does reflect an arbitrary decision on the part of Derrida and says something about him.

So this is where I have more fundamental trouble understanding Derrida and différance – the very notion of “different,” as in, in what world could there not be an arbiter? Even a life alone would face endless decisions: what to eat, where to go, when to sleep, and so forth. From such musing – speaking of rabbit holes – I was led to reading about another philosopher named Jacques, this one Rancière, and what he calls the axiom of equality. In pure nutshell form, I take this to mean that no (socio-political) inequality exists until it has been claimed to exist – and note that it’s claimed in a boat-rocking kind of way, what the kids these days are calling “disruptive.” The upshot is that equality, itself, can only ever be theoretical because someone somewhere inevitably is and always will be marginalised by the arbitrary decisions of cultural hegemony. Still learning.

Back to the Walking Football analogy: if the rabbit hole of defining a word in the context of those that surround it, and then having to define, even further, all those words, and on and on, and experience is inexhaustible, and what’s the point, and lift a glass to nihilism… if that kind of limitless indefinite deconstructive search-and-compare lies at the heart of what is different, then maybe Derrida just found it difficult to reach agreement with other people. It stands to reason that, if he played Walking Football, Derrida might be the worst cheater on the floor, continually running when he should be walking, then denying it just the same as he tried to gain advantage. Maybe, fed up being called a cheater, he would take his ball and go home to play by himself, where no one could say he was wrong. Being alone, who would be there, whether as an obedient player or as a sneaky one, to challenge him? In fact, maybe that’s why he chose to return to play the next day – for all the arguing, he enjoyed the game, or the attention, or the camaraderie, or the exercise, or whatever, more than being accused of cheating. I wonder if, perhaps, in the great game of philosophy football, he would have been the only rival to strike real fear in Diogenes – I mean awe & respect kind of fear, just to clarify, and I mean if they had lived at the same time. It’s hard to know about Diogenes since nothing he wrote down ever survived, and these days, I doubt more than a few can recall any of whatever he said, besides that lamp-carrying honesty thing. (We should all have such good spirit when it comes to our first principles.) Anyway, I think Diogenes played for Wimbledon.

Well, I am being unkind to Derrida. Evidently, he was a kinder person by nature than I have let on, as well as an advocate for all voices, all people. And the professional care, the uncompromising expertise he took to convey his ideas, to trouble himself with delving down the rabbit hole so arbitrarily – to go down at all but, moreover, to go so far when he might, just the same, have decided to halt. Delve as far as you like, but accept responsibility for your decision, every time. In that respect, how does Derrida differ from any other person facing decisions? Did he have still other motivations? No player who kicks a football is deliberately playing to lose, not unless they have been coerced by someone else to do so. On the other hand, for all I know, maybe what Derrida called red I would call blue. Be careful not to pass the ball to the wrong team! (By the way, in sport, dynasties are remembered precisely because they eventually come to an end.)

Was Derrida no less accountable and open to scrutiny than you, or me, or anybody else? To suggest that a word only makes sense based on how it differs from those around it is no less arbitrary than its reciprocal suggestion, that a word only makes sense based on how it describes only what it describes. Half-full / half-empty, six of one… Two sides of the same coin are still the same coin. Alternatively, who put him up to all this? Meanwhile, on his own, surely Derrida had it within himself, as people do when they reach a point, simply to say, “Here is enough. I decide to stop here. For me, [the item in question] means this.” If that doesn’t ring true and sound like him, well, I’d say that can be just as telling of his character; I heard it suggested, once, how we can be helped in knowing something by what it is not. So, fine – for Derrida to stake the claim called différance, I’m willing to concede him that moment. We all land somewhere, and we’re all hardly alike, even when we’re alike.

We are, each and every one of us, individual. But together we comprise something just as dynamic on a larger scale – one might construe us societally, or perhaps historically, anthropologically, or on and on, in whatever way through whichever lens. For me, différance appears an attempt to speak for all about all, prescriptively. A grand stab at philosophy, no question, and that’s the beauty of the equality of philosophy, with thanks to Rancière: we all have a part to play and a right to respond. For the time being, as I have understood Derrida and his thinking, and I willingly stand to be instructed further, différance strikes me as ironic, being an advocacy for the dynamic development of people and language and culture that self-assuredly asserts its own accuracy. That is not an uncommon indictment of postmodernists. What’s more, it is ohh, so human.

Lest We Forget

I am indebted to three of my students – Maddy, Kira, and Shannon – for collaborating to write this essay, which we formally read aloud during a school Remembrance Day ceremony in 2013. As I told them at the time, our planning sessions together were as good as any committee-style work I’ve ever done – everyone thoughtful, respectful, contributing, and focused – and I remain as proud of our group effort today as I was back then.

I have only slightly revised our essay, for fluidity, to suit a print format, but have endeavoured to avoid any substantive changes.

One hundred years ago, the Dominion of Canada’s soldiers fought in the Great War. By November 1917, the Canadian Expeditionary Force had been in Europe for over three years, staying one more year and sacrificing their safety and their lives for their country on behalf of the British Empire.

What is sacrifice? Sacrifice is soldiers seeing past terror on the battlefield, placing themselves into vulnerability, and giving themselves on our behalf. Each year on November 11, Remembrance Day in Canada, we recognize our soldiers by wearing a poppy over our hearts. Why we wear a poppy is the more well-known question; how the symbolic pin has remained potent since its adoption in 1921 is perhaps less well-considered. A century later, in such a different world, the relevance of the poppy as a way of honouring the sacrifice of wartime warrants reflection.

As time passes, the poppy’s symbolism, in and of itself, remains the same. We change – people, culture – and inevitably, as we change, our relationship with the poppy changes, too, however much or little. The poppy, the same symbol, is different for those who feel firsthand the costs of war, so many people separated, harmed, and displaced, so many lives lost. Pains of loss are felt most intensely when they occur, by those who are closest to the people involved. For those of us with no direct wartime experience, what we feel and know matters, yet it also differs. To activate a more complete appreciation, one meaningful place to which we might turn is poetry. During World War I, poetry was a common means for those with direct wartime experience to share, and to cope.

For the lover in the poem, “To His Love,” by Ivor Gurney, one particular soldier’s death has wiped out his lover’s dreams for a comfortable future. Experiencing the fresh pains of loss, she could not possibly forget her soldier or his sacrifice. The poppy we wear both honours his sacrifice and “[hides] that red wet thing,” her loss. But, because of our distance, our poppy does not hold the same raw pain as it does for her, for those who have so immediately lost their loved ones. So, if our poppies do not hold that same raw pain, why do we continue to wear them?

The poppy stays the same because each soldier’s death remains. Again, from “To His Love,” Gurney writes, “You would not know him now…” Generations removed, do we remember who this soldier was? Would we recognize him on the street? No. “But still he died.” We may find it difficult to assess the significance of his death, here in our world, far removed by time and distance. But let us appreciate, let us remember, in that moment of a soldier’s death, how he died: “with due regard for decent taste.” A soldier dies with dignity, for his own sake, because that is all he has. He is a small blip in the universe. “But still he died,” and that will forever be. And for all who loved him, and for all he loved, we remember.

We continue to honour our soldiers, and the sacrifices of all during wartime, because of the timelessness of that sacrifice, which each one makes. Even now, removed from war, we can find reasons to remember the deaths of soldiers because the memory that remains of each soldier embodies our definition of a hero: ordinary people facing extraordinary circumstances and giving themselves, perhaps giving their lives, on our behalf. When we wear their poppies, we let their deaths weigh on our present.

Let their deaths weigh on our present, and let their memories live in their stead.

“They shall grow not old, as we that are left grow old:

Age shall not weary them, nor the years condemn.

At the going down of the sun and in the morning

We will remember them.”

Wanted on the Voyage: Professional Teachers are Experts in their Field

“The needs of the economy and our society are changing and therefore you need to have a learning system that fits the purpose, and that purpose is constantly shifting.”

So said Anthony Mackay, CEO of the Centre for Strategic Education (CSE) in Australia, during an interview with Tracy Sherlock from The Vancouver Sun. Mr Mackay was at SFU’s Wosk Centre for Dialogue in Vancouver on January 29, 2015, facilitating a forum about the changing face of education. Although links to the forum’s webcast archive and Sherlock’s interview are now inactive, I did save a copy of the interview text at the time, posted here beneath this essay. Tracy Sherlock has since told me that she doesn’t know why the interview’s links have been disconnected (e-mail communication, January 27, 2017). Nonetheless, there remains ample on-line and pdf-print promotion and coverage of the event.

The forum and the interview were first brought to my attention via e-mail, shared by an enthusiastic colleague who hoped to spur discussion, which is not at all an uncommon thing for teachers. Originally, I wrote distinct yet connected responses to a series of quotations from Mr Mackay’s interview. Here, some thirty-two months later, I’ve edited things into a more fluid essay although, substantively, my thoughts remain unchanged. Regrettably, so does the bigger picture.

For starters, Mr Mackay’s remark tips his hand – and that of the CSE – when he places the economy ahead of society. Spotting related news reports makes the idea somewhat more plausible, that of a new curriculum “…addressing a chronic skills shortage in one of the few areas of the Canadian economy that is doing well” (Silcoff)[1]. Meanwhile, in Sherlock’s interview [posted below this essay], Mr Mackay concludes by invoking “the business community,” “the economy of the future,” and employers’ confidence. Make no mistake, Mr Mackay is as ideological as anyone out there, including me and you and everybody, and I credit him for being obvious. On the other hand, he plays into the hands of the grand voice of public educators, perhaps willfully yet in a way that strikes me as disingenuous, couched in language so positive that you’re a sinner to challenge him. Very well, I accept the challenge.

Whatever “purpose” of education Mr Mackay has in mind, here, it’s necessarily more specific unto itself than to any single student’s interests or passions. In other words, as I take his portrayal, some student somewhere is a square peg about to be shown a round hole. Yet this so-called purpose is also “constantly shifting,” so perhaps these are triangular or star-shaped holes, or whatever, as time passes by.

Enter “discovery learning” – by the way, are we in classrooms, or out-and-about on some experiential trip? – and the teacher says only what the problem is, leaving the students to, well, discover the rest. I can see where it has a place; how it enables learning seems obvious enough since we learn by doing – teach someone to fish, and all. But when it comes to deciding which fish to throw back, or how many fish are enough when you don’t have a fridge to store them in before they rot and attract hungry bears… when it comes to deciding what’s more versus less important, those minutiae of mastery, it’s not always as easy as an aphorism or a live-stream video conference. Where it’s more hands-off from the teacher, in order to accommodate the student, discovery learning seems to me better suited to learners well past any novice stage. And if the teacher said, “Sorry, that’s not discovery learning,” would the students remain motivated? Some would; others most certainly would not: their problem, or the teacher’s? When both the teacher and the students say, “We really do need to follow my lead just now,” which party needs to compromise for the other, and to what extent? Teaching and learning ought to be a negotiation, yes, but never an adversarial one! In the case of “discovery learning,” I wonder whether “teacher” is even the right title anymore.

In any case, Mr Mackay appears guilty of placing the cart before the horse when it comes to educating students according to some systemic purpose. I’ve got more to say about this particular detail, what he calls “personalization.” For now, it’s worth setting some foundation: Ken Osborne wrote a book called Education, which I would recommend as a good basis for challenging Mr Mackay’s remarks from this interview.

That Osborne’s book was published in 1999, I think, serves my point, which is to say that discernment, critical thinking, effective communication, and other such lauded 21st century skills were in style long before the impending obscurity of the new millennium. They have always offered that hedge against uncertainty. People always have and always will need to think and listen and speak and read, and teachers can rely on this. Let’s not ever lose sight of literacy of any sort, in any venue. Which reminds me…

“Isn’t that tough when we don’t know what the jobs of the future will be?”

I must be frank and admit… this notion of unimaginable jobs of the future never resonated with me. I don’t remember when I first heard it, or even who coined it, but way-back-whenever, it instantly struck me as either amazing clairvoyance or patent nonsense. I’ve heard it uttered umpteen times by local school administrators, and visiting Ministry staff, and various politicians promoting the latest new curriculum. The idea is widely familiar to most people in education these days: jobs of the future, a future we can’t even imagine! Wow!

Well, if the unimaginable future puzzles even the government, then good lord! What hope for the rest of us? And if the future is so unimaginable, how can we even be certain to head in any direction at all? When you’re lost in the wilderness, the advice is to stay put and wait for rescue. On the other hand, staying put doesn’t seem appropriate to this discussion; education does need to adapt and evolve, so we should periodically review and revise curricula. But what of this word unimaginable?

For months prior to its launch, proponents of BC’s new curriculum clarified – although, really, they admonished – that learning is, among other things, no longer about fingers quaintly turning the pages of outmoded textbooks. To paraphrase the cliché, that ship didn’t just sail, it sank. No need to worry, though. All aboard were saved thanks to new PDFs– er, I mean PFDs, personal floatation devices– er, um, that is to say personal floatation e-devices, the latest MOBI-equipped e-readers, to be precise. As for coming to know things (you know, the whole reason behind “reading” and all…), well, we have Google and the Internet for everything you ever did, or didn’t, need to know, not to mention a 24/7 news cycle, all available at the click of a trackpad. It’s the 21st century, and learning has reserved passage aboard a newer, better, uber-modern cruise ship where students recline in ergonomic deck chairs, their fingertips sliding across Smart screens like shuffleboard pucks. Welcome aboard! And did I mention? Technology is no mere Unsinkable Ship, it’s Sustainable too, saving forests of trees from the printing press (at a gigawatt-cost of electricity, mind, but let’s not pack too much baggage on this voyage).

Sorry, yes, that’s all a little facetious, and I confess to swiping as broadly and inaccurately as calling the future “unimaginable.” More to the point: for heaven’s sake, if we aren’t able to imagine the future, how on earth do we prepare anybody for it? Looking back, we should probably excuse Harland & Wolff, too – evidently, they knew nothing of icebergs. Except that they did know, just as Captain Smith was supposed to know how to avoid them.

But time and tide wait for no one, which, as I gather, is how anything unimaginable arose in the first place. Very well, if we’re compelled toward the unknowable future, a cruise aboard the good ship Technology at least sounds pleasant. And if e-PFDs can save me weeks of exhausting, time-consuming, annoying life-skills practice – you know, like swimming lessons – so much the better. Who’s honestly got time for all that practical life-skills crap, anyway, particularly when technology can look after it for you – you know, like GPS.

If the 21st century tide is rising so rapidly that it’s literally unimaginable (I mean apart from being certain that we’re done with books), then I guess we’re wise to embrace this urgent… what is it, an alert? a prognostication? guesswork? Well, whatever it is, thank you, Whoever You Are, for such vivid foresight– hey, that’s another thing: who exactly receives the credit for guiding this voyage? Who’s our Captain aboard this cruise ship? Tech Departments might pilot the helm, or tend the engine room, but who’s the navigator charting our course to future ports of call? What’s our destination? Even the most desperate voyage has a destination; I wouldn’t even think a ship gets built unless it’s needed. Loosen your collars, everybody, it’s about to get teleological in here.

Q: What destination, good ship Technology?

A: The unknowable future…

Montague Dawson

Land?-ho! The Not-Quite-Yet-Discovered Country… hmm, would that be 21st century purgatory? Forgive my Hamlet reference – it’s from a mere book.

To comprehend the future, let’s consider the past. History can be instructive. Remember that apocryphal bit of historical nonsense, that Christopher Columbus “discovered America,” as if the entire North American continent lay indecipherably upon the planet, unbeknownst to Earthlings? (Or maybe you’re a 21st century zealot who only reads blogs and Twitter, I don’t know.) Faulty history aside, we can say that Columbus had an ambitious thesis, a western shipping route to Asia, without which he’d never have persuaded his political sponsors to back the attempt. You know what else we can say about Columbus, on top of his thesis? He also had navigation and seafaring skill, an established reputation that enabled him to approach his sponsors in the first place. Even as a man with a plan to chart the uncharted, Columbus possessed some means of measuring his progress and finding his way. In that respect, it might be more accurate to say he earned his small fleet of well-equipped ships. What history then unfolded tells its own tale, the point here simply that Columbus may not have had accurate charts, but he also didn’t set sail, clueless, to discover the unimaginable in a void of invisible nowhere.

But what void confronts us? Do we really have no clue what to expect? To hear the likes of Mackay tell it, with technological innovation this rapid, this influential, we’re going to need all hands on deck, all eyes trained forward, toward… what exactly? Why is the future so unimaginable? Here’s a theory of my very own: it’s not.

Snoopy the Blackeared Pirate

Discovering in the void might better describe Galileo, say, or Kepler, who against the mainstream recharted a mischarted solar system along with the physics that describe it. Where they disagreed over detail such as ocean tides (as I gather, Kepler was right), they each had pretty stable Copernican paradigms, mediated as much by their own empirical data as by education. Staring into the great void, perhaps these astronomers didn’t always recognise exactly what they saw, but they still had enough of the right stuff to interpret it. Again, the point here is not about reaching outcomes so much as holding a steady course. Galileo pilots himself against the political current and is historically vindicated on account of his curious mix of technological proficiency, field expertise, and persistent vision. For all that he was unable to predict or fully understand, Galileo still seemed to know where he was going.


I suppose if anyone might be accused of launching speculative missions into the great void of invisible nowhere, it would be NASA, but even there, there is clarity. Just to name a few: Pioneer, Apollo, Voyager, Hubble – missions with destinations, destinies, and legacies. Meanwhile, up in the middle of Nowhere, people now live in the International Space Station. NASA doesn’t launch people into space willy-nilly. It all happens, as it should, and as it must, in a context with articulated objectives. Such accomplishments do not arise because the future is unimaginable; on the contrary, they arise precisely because people are able to imagine the future.

Which brings me back to Mr Mackay and the government’s forum on education. It’s not accurate for me to pit one side against another when we all want students to succeed. If I’ve belaboured the point here, it’s because our task concerns young people, in loco parentis. Selling those efforts as some blind adventure seems, to me, the height of irresponsibility wrapped in an audacious marketing campaign disguised as an inevitable future, a ship setting sail so climb aboard, and hurry! Yes, I see where urgency is born of rapid innovation, technological advancement made obsolete mere weeks or months later. For some, I know that’s thrilling. For me, it’s more like the America’s Cup race in a typhoon: thanks, but no thanks, I’ll tarry ashore a while longer, in no rush to head for open sea, not even aboard a vaunted ocean liner.

We simply mustn’t be so eager to journey into the unknown without objectives and a plan, not even accompanied as we are by machines that contain microprocessors, which is all “technology” seems to imply nowadays. There’s the respect that makes calamity of downloading the latest tablet apps, or what-have-you, just because the technology exists to make it available. How many times have teachers said the issue is not technology per se so much as knowing how best to use it? Teleology, remember? By the way, since we’re on the subject, what is the meaning of life? One theme seems consistent: the ambition of human endeavour. Sharpen weapon, kill beast. Discover fire, cook beast! Discover agriculture, domesticate beast. Realise surplus, and world-spanning conquest follows, eventually reaching the stars.

Look, if learning is no longer about fingers quaintly turning the pages of outmoded textbooks, then fine. I still have my doubts – I’ve long said vinyl sounds better – but let that go. Can we please just drop the bandwagoning and sloganeering, and get more specific? By now, I’ve grown so weary of “the unimaginable future” as to give it the dreaded eye-roll. And if I’m a teenaged student, as much as I might be thrilled by inventing jobs of the future, I probably need to get to know me, too, what I’m all about.

In truth, educators do have one specific aim – personalized learning – which increasingly has come into curricular focus. Personalization raises some contentious issues, not least of which is sufficient funding since the need for individualized attention requires more time and resources per student. Nevertheless, it’s a strategy that I’ve found positive, and I agree it’s worth pursuing. That brings me back to Ken Osborne. One of the best lessons I gathered from his book was the practicality of meeting individuals wherever they reside as compared to determining broader needs and asking individuals to meet expectations.

Briefly, the debate presents itself as follows…

  • Side ‘A’ would determine communal needs and educate students to fill the roles

In my humble opinion, this is an eventual move toward social engineering and a return to unpleasant historical precedent. Know your history, everybody.

  • Side ‘B’ would assess an individual’s needs and educate a student to fulfil personal potential

In my humble opinion, this is a course that educators claim to follow every day, especially these days, and one that they would do well to continue pursuing in earnest.

Apollo 11

In my experience, students find collective learning models less relevant and less authentic than the inherent incentives found in personalized approaches that engender esteem and respect. Essentially, when we educate individuals, we leave them room to sort themselves out and accord them due respect for their ways and means along the way. In return, each person is able to grasp the value of personal responsibility. Just as importantly, the opportunity for self-actualisation is now not only unfettered but facilitated by school curricula, which I suspect is what was intended by all the “unimaginable” bluster. The communal roles from Osborne’s Side ‘A’ can still be filled, now by sheer numbers from the talent pool rather than by pre-conceived aims to sculpt square pegs for round holes.

Where I opened this essay with Anthony Mackay’s purposeful call to link business and education, I’ve been commenting as a professional educator because that is my field, so that is my purview. In fairness to government, I’ve found that more recent curricular promotion perhaps hints at reversing course from the murk of the “unimaginable” future by emphasizing, instead, more proactive talk of skills and empowerment. Even so, a different posture remains (touched upon in Katie Hyslop’s reporting of the forum and its participants, and a fairly common discursive thread in education in its own right) that implicitly conflates the aims of education and business, and even the arts. Curricular draft work distinguishes the “world of work” from details that otherwise describe British Columbia’s “educated citizen” (p. 2). [2] Both Ontario’s and Alberta’s curricular plans have developed comparably to BC’s, noting employers’ rising expectations that “a human capital plan” will address our ever-changing “world of work” (p. 5)[3] – it’s as if school’s industrial role were a given. Credit where it’s due, I suppose: they proceed from a vision towards a destination. And being neither an economist nor an industrialist, I don’t aim to question the broader need for business, entrepreneurship, or a healthy economy. Everybody needs to eat.

What I am is a professional educator, and that means I have been carefully and intentionally trained and accredited alongside my colleagues to envision, on behalf of all, what is best for students. So when I read a claim like Mr Mackay’s, that “what business wants in terms of the graduate is exactly what educators want in terms of the whole person,” I am wary that his educational vision and leadership are yielding our judgment to interests, such as commerce and industry, that lie beyond the immediately appropriate interests of students. Anthony Mackay demonstrates what is, for me, the greatest failing in education: leaders whose faulty vision makes impossible the very aims they set out to reach. (By the by, I’ve also watched such leadership condemn brilliant teaching that reaches those aims.) As much as it is a blanket statement, Mr Mackay’s is an unfounded one, and I could hardly find a better example of begging the question. If Mr Mackay is captain of the ship, then maybe responsible educators should be reading Herman Wouk – one last book, sorry, couldn’t resist.

Education is about empowering individuals to make their own decisions, and any way you slice it, individuals making decisions is how society diversifies itself. That includes diversifying the economy, as compared to the other way around (the economy differentiating individuals). Some people are inevitably more influential than others. All the more reason then for everybody, from captains of industry on down, to learn to accept responsibility for respecting an individual’s space, even while everybody learns to decide what course to ply for themselves. Personalized learning is the way to go as far as resources can be distributed, so leave that to the trained professional educators who are entrusted with the task, who are experts at reading the charts, spotting the hazards, and navigating the course, even through a void. Expertise is a headlight, or whatever those are called aboard ships, so where objectives require particular expertise, let us be led by qualified experts.

And stop with the nonsense. No unimaginable future “world of work” should be the aim of students to discover while their teachers tag along like tour guides. Anyway, I thought the whole Columbus “discovery” thing had helped us to amend that sort of thinking, or maybe I was wrong. Or maybe the wrong people decided to ignore history and spend their time, instead, staring at something they convinced themselves was impossible to see.


 

3 February 2015


Tracy Sherlock, Sun Education Reporter, The Vancouver Sun – tsherlock@vancouversun.com. View a longer version of this interview at vancouversun.com.

Changing education to meet new needs

“The learning partnership has got to go beyond the partnership of young person and family, teacher and school, to the community and supportive agencies.”
TONY MACKAY CEO, CENTRE FOR STRATEGIC EDUCATION IN AUSTRALIA

Tony Mackay, CEO at the Centre for Strategic Education in Australia, was in Vancouver recently, facilitating a forum about changing the education system to make it more flexible and personalized. He spoke about the rapidly changing world and what it means for education.

Q Why does the education system need to change?

A The needs of the economy and our society are changing and therefore you need to have a learning system that fits the purpose, and that purpose is constantly shifting. So it’s not just a matter of saying we can reach a particular level and we’ll be OK, because you’ve got such a dynamic global context that you have a compelling case that says we will never be able to ensure our ongoing level of economic and social prosperity unless we have a learning system that can deliver young people who are ready — ready for further education, ready for the workforce, ready for a global context. That’s the compelling case for change.

Q Isn’t that tough when we don’t know what the jobs of the future will be?

A In the past we knew what the skill set was and we could prepare young people for specialization in particular jobs. Now we’re talking about skill sets that include creativity, problem solving, collaboration, and the global competence to be flexible and to have cultural understanding. It’s not either/or, it’s both/and — you need fantastic learning and brilliant learning in the domains, which we know are fundamental, but you also need additional skills that increasingly focus on emotional and social, personal and inter-personal, and perseverance and enterprising spirit. And we’re not saying we just want that for some kids, we want to ensure that all young people graduate with that skill set. And we know they’re going to have to effectively “learn” a living — they’re going to have to keep on learning in order to have the kind of life that they want and that we’re going to need to have an economy that thrives. I believe that’s a pretty compelling case for change.

Q How do you teach flexibility?

A When I think about the conditions for quality learning, it’s pretty clear that you need to be in an environment where not only are you feeling emotionally positive, you are being challenged — there’s that sense that you are challenged to push yourself beyond a level of comfort, but not so much that it generates anxiety and it translates into a lack of success and a feeling of failure that creates blockages to learning. You need to be working with others at the same time — the social nature of learning is essential. When you’re working with others on a common problem that is real and you have to work as a team and be collaborative. You have to know how to show your levels of performance as an individual and as a group. You can’t do any of that sort of stuff as you are learning together without developing flexibility and being adaptive. If you don’t adapt to the kind of environment that is uncertain and volatile, then you’re not going to thrive.

Q What does the science of learning tell us?

A We now know more about the science of learning than ever before and the question is are we translating that into our teaching and learning programs? It’s not just deeper learning in the disciplines, but we want more powerful learning in those 21st-century skills we talked about. That means we have to know more than ever before about the emotions of learning and how to engage young people and how young people can encourage themselves to self-regulate their learning.

The truth is that education is increasingly about personalization. How do you make sure that an individual is being encouraged in their own learning path? How do we make sure we’re tapping into their strengths and their qualities? In the end, that passion and that success in whatever endeavour is what will make them more productive and frankly, happier.

Q But how do you change an entire education system?

A Once you learn what practice is done and is successful, how do you spread that practice in a school system so it’s not just pockets of excellence, but you’ve actually got an innovation strategy that helps you to spread new and emerging practice that’s powerful? You’re doing this all in the context of a rapidly changing environment, which is why you need those skills like flexibility and creativity. The learning partnership has got to go beyond the partnership of young person and family, teacher and school, to the community and supportive agencies. If we don’t get the business community into this call to action for lifelong learning even further, we are not going to be able to get there. In the end, we are all interdependent. The economy of the future — and we’re talking about tomorrow — is going to require young people with the knowledge, skills and dispositions that employers are confident about and can build on.

Previously available at http://www.pressreader.com/canada/the-vancouver-sun/20150203/282183649467268/TextView

and http://newsinvancouver.com/qa-changing-education-to-meet-new-needs/


What On Earth Were They Thinking?

How often have you heard somebody question people who lived “back then,” in the swirling historical mist, who somehow just didn’t have the knowledge, the nuance, or the capability that we so proudly wield today?

“Things back then were just a lot more simple.”

“They just weren’t as developed back then as we are today.”

“Society back then was a lot less informed than we are today.”

It’s difficult to even confront such statements out of their context, but where I’ve had all three of these spoken to me in (separate) conversations, I challenged each impression as an insinuation that we’re somehow smarter than all the peoples of history, more skilled, more sophisticated, more aware, more woke (as they say, these days), that we’re, in the main, altogether better than our elders merely for having lived later than they did. These days, apparently, “we’re more evolved” – ha! 🙂  more evolved, that’s always a good one, as if back then everyone was Australopithecus while here we are jetting across the oceans, toting our iPhones, and drinking fine wines. Well, sure, maybe things have changed since back then, whenever “then” was. But, more typically I’ve found, contemporary judgments levelled upon history are born of an unintended arrogance, a symptom of 20/20 hindsight and the self-righteous assurance that, today, we’ve finally seen the error – actually, make that the errors – of their ways.

Surely, these days, few – if any – would believe that we’re ignorant or unaccomplished or incapable, not the way someone might say of us while looking back from our future. At any given point on the historical timeline, I wonder whether a person at that point would feel as good about their era, looking back, as any person on some other point of the timeline would feel, also looking back, about theirs. Is it a common tendency, this judgment of contemporary superiority? These days, we might well feel superior to people who had no indoor plumbing or viral inoculations or Internet access, just as someone earlier would have appreciated, say, some technological tool, a hydraulic lift to raise heavy objects, or a set of pulleys, or a first-class lever – choose whatever you like! It really is arbitrary for a person, looking back at history, to feel better about contemporary life because contemporary life infuses all that person’s experience while history’s something that must be learnt.

I’d say that learning history means learning whatever has lasted and been passed on because what has lasted and been passed on was deemed to have merit. We’re taught the history that has been deemed worth remembering. What I’ve found has been deemed worth remembering (i.e., the kinds of things I learned from History classes) are the general mood of the times, the disputes and fights that occurred (violent or academic), a select handful of the figures involved, and an inventory of whichever non-living innovations and technologies simultaneously arose alongside it all. If, later, we demerit and no longer pass on what has lasted up until then, passing on instead some different history, then that’s entirely indicative of us, now, versus anyone who came before us, and it reflects changed but not necessarily smarter or better priorities and values.

For me, we shouldn’t be saying we’re any smarter or better, only different. So much literature has lasted, so much art. Commerce has lasted, institutions have lasted, so much has lasted. Civilization has lasted. Cleverness, ingenuity, shrewdness, wit, insight, intellect, cunning, wisdom, kindness, compassion, deceit, pretence, honesty, so many many human traits – and they all transcend generations and eras. People vary, but human nature perdures. I’ll trust the experts, far more immersed in specific historical study than me, to identify slow or subtle changes in our traits – hell, I’ll even grant we may have culturally evolved, after all – and I can only imagine in how many ways the world is different now as compared to before now. But what does it mean to be better? Better than other people? Really? And who decides, and what’s the measure?

We can measure efficiency, for instance, so to say technology has advanced and is better than before is, I think, fair. Even then, an individual will have a subjective opinion – yours, mine, anybody’s – making culture not proactive and definitive but rather reactive and variable, a reflection, the result of aggregated opinions that amplify what is shared and stifle what is not. As we’re taught merited history, we’re almost forced to concur, at least until we reconsider what has merit. That’s a sticking point because everyone will have an opinion on what is culturally better and what is culturally worse. Morality inevitably differs, and suddenly we have ethical debate, even disagreement, even discord. But to say people or culture are better, I think, is too subjective to rationalize and a questionable path to tread.

Consider this as well: we each know what we’ve learnt, and as individuals, we’ve each learnt what we value. But what we’re taught is what’s broadly valued and, thereby, prescribed for all. We’ve all heard that rather hackneyed epigram, that those who neglect history are doomed to repeat it. Well, maybe the ones screwing up just didn’t learn the right history to begin with. I tend to abide by another hackneyed epigram, that they are wisest who know how little they know. Real historical wisdom and real historical understanding would be seeing and thinking and understanding as people earlier did. But short of firing up the DeLorean for an extended visit some place, some time, it seems to me that judgments about history are made with an aplomb that might be better aimed at acknowledging our finite limitations. We’re no angels. If anything, this error of judgment speaks volumes about us. Condescension is what it is, but in my opinion, it’s no virtue.

We should hardly be judging the past as any less able or intelligent or kind or tolerant or virtuous than us, especially not if we aim to live up to today’s woke cultural embrace of acceptance. Being different should never be something critiqued; it should be something understood. Conversely, in proportion to how much we know, passing judgment is assumptive, and we all know what Oscar Wilde had to say about assuming (at least, we know if we’ve studied that piece of history). At the very least, we ought to admit our own current assumptions, mistakes, errors, accidents, troubles, disputes, and wars before we pass any judgment on historical ones. On that positive note, I will say that considering all this prompted me to notice something maybe more constructive – so often, at least in my experience, what we seemed to study in History class were troublemaking causes and effects, bad decisions, and selfishly motivated behaviours. Far more rarely was History class ever the study of effective decision-making and constructive endeavour – maybe the odd time, but not usually. Maybe my History teachers were, themselves, stifled as products of the system that educated them. What could they do but pass it along to me and my peers? Considering that, I might more readily understand how people, alive today, could conclude that all who came before were simply not as enlightened, as sophisticated, or as adept as we are now.

Yet that merely implicates contemporary ignorance: assumptions and mistakes still happen, errors still occur, accidents – preventable or not – haven’t stopped, troubles and disputes and wars rage on. If the axiom of being doomed to repeat history were no longer valid, we wouldn’t still feel and accept its truthful description, and it would have long ago faded from meaning. All I can figure is that we’re still poor at learning from history – the collective “we,” I mean, not you in particular (in case this essay was getting too personal). We need learned people in the right positions at the right times, if we hope to prevent the mistakes of history. Not enough people, I guess, have bothered to study the branches of history with genuine interest. Or, no, maybe enough people have studied various branches of history, but they don’t remember lessons sharply enough to take them on board. Or, no no, maybe plenty of people remember history, but the circumstances they face are just different enough to tip the scale out of favour. Epigram time: history doesn’t repeat, but it does rhyme. Or maybe we’re just full of ourselves, thinking that we’ve got it all solved when, evidently, we don’t.

It also dawned on me, considering all this, that high school “History” influences what many people think about broader “history.” My high school experience, university too, was mostly a study of politics and geography, and toss in what would be considered very elementary anthropology – all this as compared to those other branches of historical study. Archaeology and palaeontology come to mind as detailed, more scientific branches of history, but there are so many – literary history, philosophical history, religious, environmental, military, economic, technological, socio-cultural as I’ve already indicated, on and on they go, so many categories of human endeavour. I’ve even come across a thoughtful paper contemplating history as a kind of science, although one that is normative and susceptible to generational caprice. One final epigram: history is what gets written by the winners.

And that’s really the point here: throughout what we call human history, where we’ve subdivided it so many ways (right down to the perspective of every single person who ever bothered to contribute, if you want to break it down that far), it’s people all the way back, so it’s all biased, so it’s neither complete nor even accurate until you’ve spent oodles of time and effort creating a more composite comprehension of the available historical records. And, dear lord, who has time for that! History, in that respect, is barely conceivable in its entirety and hardly a thing to grasp so readily as to say, simply, “Back then…” History is people, and lives, and belief inscribed for all time. To know it is to know who lived it as well as who recorded it. Knowing others is empathy, and empathy is a skill trained and fuelled by curiosity and diligence, not emotion or opinion. Emotion and opinion come naturally and without effort. For me, valid history derives from informed empathy, not the other way around.

As far as recording history for future study, ultimately, it will have been people again recording and studying all of it, “it” being all of what people were doing to attract attention, and “doing” being whatever we care to remember and record. It’s all a bit cyclical, in itself, and completely biased, and someone will always be left out. So people might be forgiven when shaking their heads in judgment of the past because, without empathy, what else could they possibly know besides themselves?

The Latest Visual WHY

Click here to read the latest Visual Why.

Jean-Baptiste Regnault - Socrates Tears Alcibiades from the Embrace of Sensual Pleasure (1791)
“…could we ever know what art makes oneself better, if we were ignorant of what we are ourselves?”

“…a picture’s usually worth even more than the thousand words we commonly ascribe.”

The word “text” derives from the Latin texere, meaning to weave or fit together. For me, text connotes far more than just the printed word – photography, movies, music, sculpture, architecture, the list goes on and on. The Visual WHY offers a specific look at paintings, texts with no less substance and arguably far more aesthetic appeal. But underpinning the textuality of art altogether is its human endeavour. And beyond weaving something together for the sake of weaving, weavers – artists, people – have a further end, communication. Artists across all media are still people with influences and motives for expressing themselves. Conjointly, texts of all kinds are plenty human: provocative and reflective. Whether rich and symbolic for a global audience, or doodled sketches for your own amusement, art is text, and text has purpose. As we try to understand it more thoroughly, we can’t help but raise the level of discourse. Who knows, someday maybe art will save the world…

For those who’ve been wondering about the painting featured both here and on this site’s front page, this latest update will explain that, too.

Click here to read the latest Visual Why.