Play’s the Thing…

I used to say to my students, “Find the overlap between our English coursework and, say, Trigonometry, or the link from persuasive writing to PhysEd. Where does Hamlet end and organic chemistry begin? Find that one out… there’s genius in that.” The courses my Department offered were called “English” and, helmed by some teachers, they were more traditional, as one might expect. The most common feedback I received from students, though, was how unlike English our coursework seemed to them. I took those remarks as a measure of success: my aim was to prepare young people, soon enough entering the world as older people, to be responsible… to families, communities, careers, and so forth. For me, that’s the purpose of school and its teachers.

What prompted me to reflect was reading Science, Order, and Creativity, by David Bohm and F. David Peat – specifically, such remarks as “the appropriate relationship between thought and experience… [in which] creative new perceptions take place when needed” (p. 49). That distinction between thought and experience reminded me of another distinction, this between dialogue and conversation. And again I was prompted to recall my English courses – what we had, I’d say, were definitely conversations, scratching new surfaces and digging into things with fluid spontaneity, as compared to the “my turn / your turn” protocol of dialogue, which might dig one trench but deeper and deeper. Where dialogue strikes me as instrumental, a means to an end, conversation is an end in itself, without start or finish but continual – that is, until the bell rings. We notoriously lived beyond the rigour of scheduling in some of my courses.

Those conversations were hard to let go. And what exactly were we after? “The creative person does not strictly know what he or she is looking for,” say Bohm and Peat. “The whole activity [is] play itself,” and no better description of teaching (at least, my teaching) have I ever read. Who knew I was so creative? Not me although I did have fun. So who knew teaching was just so much play? “The play’s the thing / wherein I’ll catch the conscience of–” well, anybody, really. I should clarify that I respected my colleagues and our Departmental philosophy as well as my professional obligation to Ministry curricula. At the same time, I relied on my own interests and concerns to guide our coursework, by day and by year. The result was a mixture of reading, discussion, writing, and presenting about topics as disparate as literature, film, fine art, civics, politics, economics, philosophy, etymology, all manner of topics – yes, even science and math – all bundled together in a process of classical rhetoric. Eventually, I developed a suitably disparate canon of texts, too, that flowed meaningfully from English 9 through 12. And I relied on students’ differences to alter and adjust the flavour however they might. I loved teaching for how creative it allowed me to be, and for how much creativity it provoked in my students. “Let come what comes,” Laertes tells Claudius – brazen, even foolhardy. Genius, perhaps?

Bohm and Peat seem to suggest that genius is not creativity per se so much as the effect of having challenged some assumptions, and maybe that’s a mere semantic distinction. Either way, I like the notion. Later, reading Allen Repko, I found myself nodding likewise at what he calls “boundary crossing” (p. 22). There it was, this discovery of common threads in disparate disciplines, this crossing of amorphous boundaries, what my students have heard me call “genius,” although I might now redefine that trait as “ingenuity.” Accompanying “boundary crossing” is a reaching across disciplines, with intent, what Repko calls “bridge building.” This, I think, I would call visionary. Discovery and vision: both are what I would forever consider, as a teacher, to be meaningful developments of the learning process.

Repko also points out the origin of the word “discipline,” which derives from the Romans and their need to “relate education to specific economic, political, and ecclesiastical ends” (p. 32). How delightfully Roman! I thought, reading that. Such instrumentalism, “the logic of utility.”[1] Finis at its finest: How long, O Lord! Will their legacy never end? But I trust in teaching and my unfailing students.

I enjoyed sixteen years teaching Secondary English to brilliant students. In that time, we developed a philosophy, addressed the BIG Questions, and fed our curiosity. But my planning process was seldom more than make-it-up-as-we-go. “We could never get away with this in Math,” I used to say to them, “although if you do find a way, I’d love to hear about it.”

 


[1] Phelan, A. (2009). A new thing in an old world? Instrumentalism, teacher education, and responsibility. In C. Riches & F. J. Benson (Eds.), Engaging in Conversation about Ideas in Teacher Education (pp. 105–114). New York, NY: Peter Lang.

 

From The New York Times – “Free Speech and the Necessity of Discomfort” and further Reflections on Journalism

A needfully challenging appeal to raise the level of discourse, and an appropriate inclusion to The Rhetorical WHY, from an Opinion piece in The New York Times (Feb 22, 2018) by Op-Ed columnist Bret Stephens:

This is the text of a lecture delivered at the University of Michigan on Tuesday [Feb 20, 2018]. The speech was sponsored by Wallace House.

“I’d like to express my appreciation for Lynette Clemetson and her team at Knight-Wallace for hosting me in Ann Arbor today. It’s a great honor. I think of Knight-Wallace as a citadel of American journalism. And, Lord knows, we need a few citadels, because journalism today is a profession under several sieges.…” [continue reading]

 

'Oddly enough, I feel offended...'
All That’s Fit to Print, Wiley Miller

Some thoughts of my own on the significance of a free press to our lives…

Next, I offer a series of responses I made to remarks by Hannah Arendt, published October 26, 1978 in The New York Review of Books (itself a report of her interview with Roger Errera). I encountered them in a Facebook post from The Bureau of Investigative Journalism.

 


Arendt: “… how can you have an opinion if you are not informed?”

Everybody has opinions – our five senses give us opinions. In order to be “informed,” we need discernment enough to detect accurate information.

Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”

For me, continual lies ultimately yield zero trust, but again, how would I know who’s even lying, but for my own discernment and experience?

At the least, if I were aware that all around were lies, that much I’d know is true. It’s not that “nobody believes anything any longer,” so much as it’s “everybody goes about searching out truth on their own.” The downside is when those individual searches for truth become disrespectful, as we’ve seen lately, or worse, chaotic.

Nevertheless, investigate! Accept responsibility to inform yourself. Accept or believe all with a grain of salt until such time as you can prove to your own satisfaction who and what are trustworthy. And, at that point, be tolerant, if not respectful, of others – this applies to everybody, all sides, liberals and conservatives and all points between. Taking the high road is not to be done with pride or smug assurance. It’s easy to nod and say, “I already do while others do not,” but even so, reflect upon yourself with each conversation, each debate, each exchange.

Open-minded and open-hearted – both are virtues, but they don’t have to be the same thing.

Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”

On its face, this statement could only be accurate if you had some clairvoyance or a crystal ball.

By “everybody” doing their own investigation and accepting responsibility to inform themselves, I mean everybody. We’re able to trust news & media sources to the extent that they have lived up to their responsibility… to the extent we’re aware that they have. I support proper, professional investigative journalism and public intellectualism, both of which I gather to be in decline.


'Well, apparently you haven't heard. . . personal opinions are the new facts.'
The New Facts, Chris Wildt

Finally, I offer two sets of remarks about journalism by two long-retired anchor-journalists of PBS fame, partners Robert MacNeil and Jim Lehrer. The first is transcribed from an exchange between them during a tribute to MacNeil upon his retirement in October 1995. The second – comprising two parts – is Lehrer’s closing words upon the “retirement” of his name from the title of the PBS NewsHour, on December 04, 2009. Following that, I’ve included a thoughtful follow-up by the PBS Ombudsman, Michael Getler, published the next week on December 11.

MacNeil’s remarks upon his retirement (October 20, 1995)…


MacNeil: You know, I’m constantly asked, and I know you are in interviews, and there have been a lot of them just now – I’m constantly asked, “But isn’t your program a little boring to some people?” and I find that amazing, because, well, sure, it probably is, but they’re people who don’t watch. The people who watch it all the time don’t find it boring, or they wouldn’t watch.

Lehrer: That’s right.

MacNeil: And it’s the strange idea that’s come out of this medium, because it’s become so much a captive of its tool – as its use as a sales tool that it’s driven increasingly, I think, by a tyranny of the popular. I mean, after all, you and I’ve said this to each other lots of times – might as well share it with the audience: what is the role of an editor? The role of an editor is to make– is to make judgments somewhere between what he thinks is important or what they think is important and what they think is interesting and entertaining.


Jim Lehrer’s guidelines of journalism (December 04, 2009)…


Lehrer: People often ask me if there are guidelines in our practice of what I like to call MacNeil/Lehrer journalism. Well, yes, there are. And here they are:

* Do nothing I cannot defend.

* Cover, write and present every story with the care I would want if the story were about me.

* Assume there is at least one other side or version to every story.

* Assume the viewer is as smart and as caring and as good a person as I am.

* Assume the same about all people on whom I report.

* Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.

* Carefully separate opinion and analysis from straight news stories, and clearly label everything.

* Do not use anonymous sources or blind quotes, except on rare and monumental occasions.

* No one should ever be allowed to attack another anonymously.

* And, finally, I am not in the entertainment business.

Here is how I closed a speech about our changes to our PBS stations family last spring:

‘We really are the fortunate ones in the current tumultuous world of journalism right now. When we wake up in the morning, we only have to decide what the news is and how we are going to cover it. We never have to decide who we are and why we are there.’


 

I am struck by the continuity of their respective final comments, about entertainment – each, in his own way, seeks to distance journalism from vagary, each thereby implying that we are susceptible to emotional or whimsical tendencies, which evidently seem capable of overtaking our focus on learning; otherwise, why mention the point at all?

 


  • Watch Lehrer’s remarks here, in a functional if awkward series of video archives of that 2009 broadcast.
  • In May 2011, upon Lehrer’s retirement, MacNeil returned to offer his own reflections upon his friend and colleague that include some further worthwhile commentary upon contemporary TV journalism.
  • Watch them during a more recent (October 25, 2016) retrospective interview from 92nd Street Y, a Jewish cultural and community centre in Manhattan.

 

I recall “Lehrer’s Rules,” as they were called, making a small stir – some of it more substantive and meaningful, some of it the critical “woe-is-Us” lament at the passing of favourite things. In amongst it all, as I mentioned, were the following comments from PBS Ombudsman Michael Getler, which I include here, at length, on account of PBS webpages’ tendency to disappear.

In fact, a number of the PBS pages where I found these articles are no longer active – where possible, I have checked, updated, and even added weblinks. But I believe Getler’s comments, like the rest, are worth preserving, on account of their potential to provoke us to think and learn more about a free press and its relation to ourselves.

 


“Lehrer’s Rules” by Michael Getler (December 11, 2009)

A couple of people wrote to me in the aftermath of that Dec. 4 sign-off to say how much they liked Lehrer’s guidelines and asked how they could get a copy. That’s why they are reproduced above. A subscriber to the widely-read Romenesko media news site also posted them there on Dec. 6 and they also were posted on the campus site of the Society of Professional Journalists (SPJ). “Whether you agree with all of Lehrer’s guidelines, or not,” that posting read, “he has surely earned our attention.”

That’s certainly true in my case. I’ve also been a devoted watcher of the NewsHour in all of its evolutions during most of the past 30-plus years, long before I took on this job four years ago. Although segments of the program have been the subject of critical ombudsman columns on a number of occasions, I’ve also said many times that it remains the best and most informative hour of news anywhere on television, and it has never been more important. I follow the news closely but almost always learn something from this broadcast every night.

Boring, at Times, But a Luxury Always

Sometimes, of course, it can seem boring. Sometimes the devotion to balanced he said/she said panel discussions can leave you frustrated and angry and no smarter than you were 15 minutes earlier. Sometimes the interviewing is less challenging than one might hope. But the luxury of an uninterrupted hour of serious, straight-forward news and analysis is just that these days, a luxury. And, in today’s world of media where fact and fiction, news and opinion, too often seem hopelessly blurred, it is good to have Lehrer – clearly a person of trust – still at work.

I had the sense when he added his guidelines to that closing segment last Friday that the 75-year-old Lehrer was trying to re-plant the flag of traditional, verifiable journalism that he has carried so well all these years so that it grows well beyond his tenure – whatever that turns out to be – and spreads to all the new platforms and audiences that the contemporary media world now encompasses.

Oddly, I did not get any e-mail from viewers commenting on the new NewsHour format, other than one critical message that said “do not post.” Maybe that’s a good sign since people usually write to me to complain.

Make no mistake, the now defunct NewsHour with Jim Lehrer is still quite recognizable within the new PBS NewsHour. So those who wrote earlier and said they didn’t want any change won’t be terribly disappointed. I, personally, found the first few days of the new format and approach to be a distinct improvement. The program seemed to have more zip and energy, faster paced, with good interviews and without the always predictable language that introduced the show in the past. It presented its news judgments more quickly, benefitted from the early introduction of other top staff members as co-anchors, and from the introduction of a promising “new guy,” Hari Sreenivasan, a former CBS and ABC correspondent who presents a headline summary from the newsroom and is the liaison to an expanded NewsHour Web operation.

Now, just to keep this a respectable ombudsman’s column, let me add a few quibbles when it comes to Lehrer’s rules, as posted above.

First, one of the interesting things about American journalism is that there are no agreed-upon national standards, no journalistic equivalent of the Hippocratic Oath for physicians. There are, of course, many universal values and practices that vast numbers of journalists have voluntarily adhered to generally for many years, best exemplified by SPJ’s Code of Ethics. But the fact is that all major news organizations – from the Associated Press to the New York Times to PBS and CBS – have their own guidelines and standards that they try and live by. And they all have their differences.

Naturally, a Few Quibbles

Lehrer’s guidelines embody lots of the good, praiseworthy stuff, and we come out of the same journalistic generation and traditions. But I think on a couple of points they are actually too nice, too lofty, cruising somewhere above some of the grittier realities of journalism.

For example, “Assume the viewer is as smart and as caring and as good a person as I am. Assume the same about all people on whom I report.” Really? Bernard Madoff? Osama bin Laden?

Then there is: “Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.” I would argue, and have, that the NewsHour withheld from its viewers at the time a legitimate turn in a major story – reported by all other major news organizations – last year when it declined to inform them that a former senator and former candidate for the vice-presidency, John Edwards, issued a public statement and went on ABC Television to acknowledge that he had had an extra-marital affair with a woman who had been hired by his political action committee to make films for his campaign. That’s news.

Finally, there is, “Do not use anonymous sources or blind quotes, except on rare and monumental occasions.” I agree about the blind quotes when they are used to attack someone personally. But anonymous sources have often proved to be absolutely crucial to the public’s right to know what’s really going on in scores of major stories as they have unfolded from Watergate to secret CIA prisons overseas.

The most accurate and important pre-war stories challenging the Bush administration’s on-the-record but bogus case for Iraqi weapons of mass destruction were based on anonymous sources. Many of those stories, in part because they were based on anonymous sources, got buried or underplayed by newspapers at the time. Many of them never got reported at all on television, including the NewsHour. But there are times when there are mitigating circumstances – like internal threats within an administration or maybe jail time for leakers – when some sources must remain anonymous and when editors need to trust their reporters. And often you don’t know if the occasion is “rare and monumental” until it is too late. Pre-war Iraq, again, being Exhibit A.


 

Freedom of the Press, Matteo Bertelli

Some other links…

World Association of Newspapers and News Publishers

World Press Freedom Index

Ryerson School of Journalism

Edelman Trust Barometer

Freedom House

Deciding over Derrida’s Différance

As far as I understand Jacques Derrida’s différance, he observes that we understand our experiences as distinctive, but not exhaustive, communicated links or marks comprising an on-going decisive chain of experiential moments. As to the language we use to describe our experiences, a word has contextual meaning, both from its usage at any given time as well as from its etymology over the course of time. I tend to agree with this attendance to context as furnishing meaning, and I can also spot the rabbit hole that it poses. For example, to understand some word’s definition, I might look it up in the dictionary and be left to rely upon the definition of whoever decided what it meant while, at the same time, facing all sorts of words in the definition that now need looking up, too – Sisyphean, indeed! Cruel but so usual. On the other hand, thanks to whomever for compiling the dictionary, a pretty utile compendium, I have to say.

To be clear, I am not intending to invoke logocentrism, by which all our words are accorded a decided meaning from a cultural centre, which propagates existing biases or “privileges”; Derrida would roll over in his grave. Granted, I may already have laid grounds here to be accused of logocentrism, myself, by writing with words (and I confess to using English because I didn’t think anyone had the patience to muddle over Wingdings). My present aim is to suggest how we might address the afore-mentioned rabbit-hole dilemma by searching for or (… almost afraid to say it) by deciding upon some definitions of our own. Not like a dictionary, but more like– well yes, okay, like a dictionary, but one that we’ll fashion from the ground-up, like when the light bulb would go on above Darla’s head, and Spanky would snap his fingers to say, “Hey, everyone! Maybe we can put on a play!” So, in the spirit of dissemination, hey everybody, maybe we can compile a dictionary! A real, deconstructive, crowd-sourced dictionary!

I’m not really compiling a dictionary. I’m just trying to make some sense of Derrida and différance. Let me try to illustrate what I mean from my own experience. Sometimes I play Walking Football, a version of the game where players are not permitted to run. Naturally, the debate is over what differentiates walking from running. We’ve agreed that walking means “always having at least one foot in contact with the ground during the striding motion.” Running means “having both feet leave the ground at some point during the striding motion.” This makes for certainty, and so long as our eyes are trained enough to spot feet in motion, which I can spot sometimes so clearly, with such immediacy, that it’s more like I’m watching, not playing – I’m ghinding it tuff even now to ghet the right words, but trust me. And so long as each player is willing to obey the rules – and, ohh my, there’s always that one player who just won’t. You know who I mean… *sigh… Anyway, so long as they’re not just words uttered that then float away in the breeze, our definitions of the rules for walking and running are useful.

Luckily, too, I might add, when we clarify the rules, we do so out loud, together, and don’t whisper it around in a circle, like when my daughter plays Telephone at a birthday party – after all, we want everyone to be clear. Finally, even if we have trouble spotting feet in motion, because it all happens too quickly, or even if that one player is a cheater at heart, the definitions themselves remain clear, and usually at least one or two of us can remember them well enough to recite back, as needed, usually with a lot of finger-pointing and furrowed brows. One time we even wrote the no-running rule on the gym chalkboard, and even though no one challenged this, on the grounds that writing is secondary to speech, everyone still understood why it was scrawled there, by which I mean everyone knew exactly who should read it the most – wow, does every game have that player? Incorrigible.

Bottom line: accountability is down to the sincerity and respect offered to each player by every other player who decides to participate. As an aside, the need for a referee, an arbiter, is all the more clear when the stakes are as high as bragging rights and free beer. But, even as we play for fun, the rules exist or else the game, as such, does not. (On that note, I find a lot of players just don’t like Walking Football and would rather play with running, and that’s fine, too: it’s their decision, and plenty other like-minded players keep both games afloat. I find the Walking game amplifies decision-making, so maybe this feature just appeals to me. And I still play traditional football, too.) My broader point is that any one person must decide to accept what has been defined and, likewise, any group of people must reach a consensus. Shared meaning matters because, otherwise, as I say, we don’t have a game, or else we have a very different one, or we just have anarchy. But whether that person, alone, or the group, altogether, searching for a way to decide upon meaning, has the patience to delve down the rabbit hole… well, yes, context does indeed matter – both usage and etymology. I’ve said and written as much, myself, for a long time. So, in light of all this, I hope I’ve gathered a little something of Derrida’s différance. I’m still learning.

Another illustration: in my teaching, I occasionally introduced this matter of contextual meaning by offering students a list of synonyms: “slim,” “slender,” “skinny,” “thin,” “narrow.” Each word, of course, has its own particular meaning. “If they meant the same thing,” I offered, “then we’d use the same word,” so just what explains the need for all these synonyms? Well, students would say, there are lots of different things out there that possess or demonstrate these various adjectives (my word, not theirs), so we’ve come up with words to describe them (and I think that’s a charitable “we,” like the Royal “We.”) As the discussion proceeded, I might ask which of these words typically describe human traits versus those – leaving aside metaphors – that typically do not. Next, which words typically possess positive connotations, and which negative, or neutral? And, as it pertains to the personification metaphors, which words are more easily envisioned versus those that really stretch the imagination, or even credibility? Eventually, I would shift from ontology to epistemology, posing the questions at the heart of my intention: For any of the previous questions about these synonyms, how do you know what you’re talking about? For what each of these words could mean, where have your assurances come from? Of course, the most frequent reply to that question was “the dictionary,” followed by “my parents” or “books I’ve read,” or “just everyday experience, listening and talking to people.” Occasionally, the reply was something akin to “Who cares… it just means what it means, doesn’t it?” In every reply, though, one common thread was detectable: the involvement of other people as part of the meaning-making process. Fair enough, we can’t all be Thoreau.

One more example: when is “red” no longer red but perhaps orange or purple? Well, for one thing, if you’re colour blind, the question means something entirely different, which I say not flippantly but again to illustrate how important dialogue and community are to deciding what something means. For another thing, we might wish to ask, in keeping with context-dependency, “Why even ask?” Again, this is not flippant or dismissive but practical: when does it matter so that we distinctly need to identify the colour red? Where a group of people might face the question over what is red versus what is orange or purple, we might expect some kind of discussion to ensue. And, whether asking as part of such a group or as a hermit, alone, I submit that one person’s decision about what is “red” is ultimately down to one person to determine: “Red is this,” or “This is red,” or even, “Gosh, I still can’t really decide.” Even a coerced decision we can still attribute to the one who forces the issue – one person has decided on behalf of another, however benignly or violently: might makes right, or red, as it were.

Coercion introduces a political consideration about whose authority or power has influence, similar to needing referees on account of those players who decide to run. The point, for now, is simply that a decision over what something means to a person is ultimately made by a person, leaving others to deal with that decision on their own terms in whatever way. But other people are part of the meaning-making process, even passively, or else I wouldn’t need language to begin with since the rest of you wouldn’t trouble me by existing. Not to worry, by the way, I appreciate you reading this far. From what I understand (and I am convinced I must learn more, being no avid student of either postmodernism or Derrida), his observation of différance either discounts or else offers no account for the arbitrary decision-making that people might make when they decide they’ve had enough. People tend to land somewhere in a community, and it’s the rare person who lives and plays wholly and uncompromisingly by their own rules. However, the fact that he felt différance was worth the effort to publicise and explain to the rest of us does reflect an arbitrary decision on the part of Derrida and says something about him.

So this is where I have more fundamental trouble understanding Derrida and différance – the very notion of “different,” as in, in what world could there not be an arbiter? Even a life alone would face endless decisions: what to eat, where to go, when to sleep, and so forth. From such musing – speaking of rabbit holes – I was led to reading about another philosopher named Jacques, this one Rancière, and what he calls the axiom of equality. In pure nutshell form, I take this to mean that no (socio-political) inequality exists until it has been claimed to exist – and note that it’s claimed in a boat-rocking kind of way, what the kids these days are calling “disruptive.” The upshot is that equality, itself, can only ever be theoretical because someone somewhere inevitably is and always will be marginalised by the arbitrary decisions of cultural hegemony. Still learning.

Back to the Walking Football analogy: if the rabbit hole of defining a word in the context of those that surround it, and then having to define, even further, all those words, and on and on, and experience is inexhaustible, and what’s the point, and lift a glass to nihilism… if that kind of limitless indefinite deconstructive search-and-compare lies at the heart of what is different, then maybe Derrida just found it difficult to reach agreement with other people. It stands to reason that, if he played Walking Football, Derrida might be the worst cheater on the floor, continually running when he should be walking, then denying it just the same as he tried to gain advantage. Maybe, fed up being called a cheater, he would take his ball and go home to play by himself, where no one could say he was wrong. Being alone, who would be there, whether as an obedient player or as a sneaky one, to challenge him?

In fact, maybe that’s why he chose to return to play the next day – for all the arguing, he enjoyed the game, or the attention, or the camaraderie, or the exercise, or whatever, more than being accused of cheating. I wonder if, perhaps, in the great game of philosophy football, he would have been the only rival to strike real fear in Diogenes – I mean awe & respect kind of fear, just to clarify, and I mean if they had lived at the same time. It’s hard to know about Diogenes since nothing he wrote down ever survived, and these days, I doubt more than a few can recall any of whatever he said, besides that lamp-carrying honesty thing. (We should all have such good spirit when it comes to our first principles.) Anyway, I think Diogenes played for Wimbledon.

Well, I am being unkind to Derrida. Evidently, he was a kinder person by nature than I have let on, as well as an advocate for all voices, all people. And the professional care, the uncompromising expertise he took to convey his ideas, to trouble himself with delving down the rabbit hole so arbitrarily – to go down at all but, moreover, to go so far when he might, just the same, have decided to halt. Delve as far as you like, but accept responsibility for your decision, every time. In that respect, how does Derrida differ from any other person facing decisions? Did he have still other motivations? No player who kicks a football is deliberately playing to lose, not unless they have been coerced by someone else to do so. On the other hand, for all I know, maybe what Derrida called red I would call blue. Be careful not to pass the ball to the wrong team! (By the way, in sport, dynasties are remembered precisely because they eventually come to an end.)

Was Derrida no less accountable and open to scrutiny than you, or me, or anybody else? To suggest that a word only makes sense based on how it differs from those around it is no less arbitrary than its reciprocal suggestion, that a word only makes sense based on how it describes only what it describes. Half-full / half-empty, six of one… Two sides of the same coin are still the same coin. Alternatively, who put him up to all this? Meanwhile, on his own, surely Derrida had it within himself, as people do when they reach a point, simply to say, “Here is enough. I decide to stop here. For me, [the item in question] means this.” If that doesn’t ring true and sound like him, well, I’d say that can be just as telling of his character; I heard it suggested, once, how we can be helped in knowing something by what it is not. So, fine – for Derrida to stake the claim called différance, I’m willing to concede him that moment. We all land somewhere, and we’re all hardly alike, even when we’re alike.

We are, each and every one of us, individual. But together we comprise something just as dynamic on a larger scale – one might construe us societally, or perhaps historically, anthropologically, or on and on, in whatever way through whichever lens. For me, différance appears an attempt to speak for all about all, prescriptively. A grand stab at philosophy, no question, and that’s the beauty of the equality of philosophy, with thanks to Rancière: we all have a part to play and a right to respond. For the time being, as I have understood Derrida and his thinking, and I willingly stand to be instructed further, différance strikes me as ironic, being an advocacy for the dynamic development of people and language and culture that self-assuredly asserts its own accuracy. That is not an uncommon indictment of postmodernists. What’s more, it is ohh, so human.

Lest We Forget

I am indebted to three of my students – Maddy, Kira, and Shannon – for collaborating to write this essay, which we formally read aloud during a school Remembrance Day ceremony in 2013. As I told them at the time, our planning sessions together were as good as any committee-style work I’ve ever done – everyone thoughtful, respectful, contributing, and focused – and I remain as proud of our group effort today as I was back then.

I have only slightly revised our essay, for fluidity, to suit a print format, but I have endeavoured to avoid any substantive changes.

One hundred years ago, the Dominion of Canada’s soldiers fought in the Great War. By November 1917, the Canadian Expeditionary Force had been in Europe for over three years, staying one more year and sacrificing their safety and their lives for their country on behalf of the British Empire.

What is sacrifice? Sacrifice is soldiers seeing past terror on the battlefield, placing themselves into vulnerability, and giving themselves on our behalf. Each year on November 11, Remembrance Day in Canada, we recognize our soldiers by wearing a poppy over our hearts. Why we wear a poppy is more well-known; how the symbolic pin has remained potent since its adoption in 1921 is perhaps less well-considered. A century later, in such a different world, the relevance of the poppy as a way of honouring the sacrifice of wartime warrants reflection.

As time passes, the poppy’s symbolism, in and of itself, remains the same. We change – people, culture – and inevitably, as we change, our relationship with the poppy changes, too, however much or little. The poppy, the same symbol, is different for those who feel firsthand the costs of war, so many people separated, harmed, and displaced, so many lives lost. Pains of loss are felt most intensely when they occur, by those who are closest to the people involved. For those of us with no direct wartime experience, what we feel and know matters, yet it also differs. To activate a more complete appreciation, one meaningful place to which we might turn is poetry. During World War I, poetry was a common means for those with direct wartime experience to share, and to cope.

For the lover in Ivor Gurney’s poem “To His Love,” one particular soldier’s death has wiped out her dreams for a comfortable future. Experiencing the fresh pains of loss, she could not possibly forget her soldier or his sacrifice. The poppy we wear both honours his sacrifice and “[hides] that red wet thing,” her loss. But, because of our distance, our poppy does not hold the same raw pain as it does for her, for those who have so immediately lost their loved ones. So, if our poppies do not hold that same raw pain, why do we continue to wear them?

The poppy stays the same because each soldier’s death remains. Again, from “To His Love,” Gurney writes, “You would not know him now…” Generations removed, do we remember who this soldier was? Would we recognize him on the street? No. “But still he died.” We may find it difficult to assess the significance of his death, here in our world, far removed by time and distance. But let us appreciate, let us remember, in that moment of a soldier’s death, how he died: “with due regard for decent taste.” A soldier dies with dignity, for his own sake, because that is all he has. He is a small blip in the universe. “But still he died,” and that will forever be. And for all who loved him, and for all he loved, we remember.

We continue to honour our soldiers, and the sacrifices of all during wartime, because of the timelessness of that sacrifice, which each one makes. Even now, removed from war, we can find reasons to remember the deaths of soldiers because the memory that remains of each soldier embodies our definition of a hero: ordinary people facing extraordinary circumstances and giving themselves, perhaps giving their lives, on our behalf. When we wear their poppies, we let their deaths weigh on our present.

Let their deaths weigh on our present, and let their memories live in their stead.

“They shall grow not old, as we that are left grow old:

Age shall not weary them, nor the years condemn.

At the going down of the sun and in the morning

We will remember them.”

What On Earth Were They Thinking?

How often have you heard somebody question people who lived “back then,” in the swirling historical mist, who somehow just didn’t have the knowledge, the nuance, or the capability that we so proudly wield today?

“Things back then were just a lot more simple.”

“They just weren’t as developed back then as we are today.”

“Society back then was a lot less informed than we are today.”

It’s difficult to even confront such statements out of their context, but where I’ve had all three of these spoken to me in (separate) conversations, I challenged each impression as an insinuation that we’re somehow smarter than all the peoples of history, more skilled, more sophisticated, more aware, more woke (as they say, these days), that we’re, in the main, altogether better than our elders merely for having lived later than they did. These days, apparently, “we’re more evolved” – ha! 🙂  more evolved, that’s always a good one, as if back then everyone was Australopithecus while here we are jetting across the oceans, toting our iPhones, and drinking fine wines. Well, sure, maybe things have changed since back then, whenever “then” was. But, more typically I’ve found, contemporary judgments levelled upon history are born of an unintended arrogance, a symptom of 20/20 hindsight and the self-righteous assurance that, today, we’ve finally seen the error – actually, make that the errors – of their ways.

Surely, these days, few – if any – would believe that we’re ignorant or unaccomplished or incapable, not the way someone might say of us while looking back from our future. At any given point on the historical timeline, I wonder whether a person at that point would feel as good about their era, looking back, as any person on some other point of the timeline would feel, also looking back, about theirs. Is it a common tendency, this judgment of contemporary superiority? These days, we might well feel superior to people who had no indoor plumbing or viral inoculations or Internet access, just as someone earlier would have appreciated, say, some technological tool, a hydraulic lift to raise heavy objects, or a set of pulleys, or a first-class lever – choose whatever you like! It really is arbitrary for a person, looking back at history, to feel better about contemporary life because contemporary life infuses all that person’s experience while history’s something that must be learnt.

I’d say that learning history means learning whatever has lasted and been passed on because what has lasted and been passed on was deemed to have merit. We’re taught the history that has been deemed worth remembering. What I’ve found has been deemed worth remembering (i.e., the kinds of things I learned from History classes) are the general mood of the times, the disputes and fights that occurred (violent or academic), a select handful of the figures involved, and an inventory of whichever non-living innovations and technologies simultaneously arose alongside it all. If, later, we demerit and no longer pass on what has lasted up until then, passing on instead some different history, then that’s entirely indicative of us, now, versus anyone who came before us, and it reflects changed but not necessarily smarter or better priorities and values.

For me, we shouldn’t be saying we’re any smarter or better, only different. So much literature has lasted, so much art. Commerce has lasted, institutions have lasted, so much has lasted. Civilization has lasted. Cleverness, ingenuity, shrewdness, wit, insight, intellect, cunning, wisdom, kindness, compassion, deceit, pretence, honesty, so many many human traits – and they all transcend generations and eras. People vary, but human nature perdures. I’ll trust the experts, far more immersed in specific historical study than me, to identify slow or subtle changes in our traits – hell, I’ll even grant we may have culturally evolved, after all – and I can only imagine in how many ways the world is different now as compared to before now. But what does it mean to be better? Better than other people? Really? And who decides, and what’s the measure?

We can measure efficiency, for instance, so to say technology has advanced and is better than before is, I think, fair. Even then, an individual will have a subjective opinion – yours, mine, anybody’s – making culture not proactive and definitive but rather reactive and variable, a reflection, the result of the opinions that comprise it, which amplify what is shared and stifle what is not. As we’re taught merited history, we’re almost forced to concur, at least until we reconsider what has merit. That’s a sticking point because everyone will have an opinion on what is culturally better and what is culturally worse. Morality inevitably differs, and suddenly we have ethical debate, even disagreement, even discord. But to say people or culture are better, I think, is too subjective to rationalize and a questionable path to tread.

Consider this as well: we each know what we’ve learned, and as individuals, we’ve each learnt what we value. But what we’re taught is what’s broadly valued and, thereby, prescribed for all. We’ve all heard that rather hackneyed epigram, that those who neglect history are doomed to repeat it. Well, maybe the ones screwing up just didn’t learn the right history to begin with. I tend to abide by another hackneyed epigram, that they are wisest who know how little they know. Real historical wisdom and real historical understanding would be seeing and thinking and understanding as people earlier did. But short of firing up the Delorean for an extended visit some place, some time, it seems to me that judgments about history are made with an aplomb that might be better aimed at acknowledging our finite limitations. We’re no angels. If anything, this error of judgment speaks volumes about us. Condescension is what it is, but in my opinion, it’s no virtue.

We should hardly be judging the past as any less able or intelligent or kind or tolerant or virtuous than we are, especially not if we aim to live up to today’s woke cultural embrace of acceptance. Being different should never be something critiqued; it should be something understood. Conversely, in proportion to how much we know, passing judgment is assumptive, and we all know what Oscar Wilde had to say about assuming (at least, we know if we’ve studied that piece of history). At the very least, we ought to admit our own current assumptions, mistakes, errors, accidents, troubles, disputes, and wars before we pass any judgment on historical ones. On that positive note, I will say that considering all this prompted me to notice something maybe more constructive – so often, at least in my experience, what we seemed to study in History class were troublemaking causes and effects, bad decisions, and selfishly motivated behaviours. Far more rarely was History class ever the study of effective decision-making and constructive endeavour – maybe the odd time, but not usually. Maybe my History teachers were, themselves, stifled as products of the system that educated them. What could they do but pass it along to me and my peers? Considering that, I might more readily understand how people, alive today, could conclude that all who came before were simply not as enlightened, as sophisticated, or as adept as we are now.

Yet that merely implicates contemporary ignorance: assumptions and mistakes still happen, errors still occur, accidents – preventable or not – haven’t stopped, troubles and disputes and wars rage on. If the axiom of being doomed to repeat history were no longer valid, we wouldn’t still feel and accept its truthful description, and it would have long ago faded from meaning. All I can figure is that we’re still poor at learning from history – the collective “we,” I mean, not you in particular (in case this essay was getting too personal). We need learned people in the right positions at the right times, if we hope to prevent the mistakes of history. Not enough people, I guess, have bothered to study the branches of history with genuine interest. Or, no, maybe enough people have studied various branches of history, but they don’t remember lessons sharply enough to take them on board. Or, no no, maybe plenty of people remember history, but the circumstances they face are just different enough to tip the scale out-of-favour. Epigram time: history doesn’t repeat, but it does rhyme. Or maybe we’re just full of ourselves, thinking that we’ve got it all solved when, evidently, we don’t.

It also dawned on me, considering all this, that high school “History” influences what many people think about broader “history.” My high school experience, university too, was mostly a study of politics and geography, and toss in what would be considered very elementary anthropology – all this as compared to those other branches of historical study. Archaeology and palaeontology come to mind as detailed, more scientific branches of history, but there are so many – literary history, philosophical history, religious, environmental, military, economic, technological, socio-cultural as I’ve already indicated, on and on they go, so many categories of human endeavour. I’ve even come across a thoughtful paper contemplating history as a kind of science, although one that is normative and susceptible to generational caprice. One final epigram: history is what gets written by the winners.

And that’s really the point here: throughout what we call human history, where we’ve subdivided it so many ways (right down to the perspective of every single person who ever bothered to contribute, if you want to break it down that far), it’s people all the way back, so it’s all biased, so it’s neither complete nor even accurate until you’ve spent oodles of time and effort creating a more composite comprehension of the available historical records. And, dear lord, who has time for that! History, in that respect, is barely conceivable in its entirety and hardly a thing to grasp so readily as to say, simply, “Back then…” History is people, and lives, and belief inscribed for all time. To know it is to know who lived it as well as who recorded it. Knowing others is empathy, and empathy is a skill trained and fuelled by curiosity and diligence, not emotion or opinion. Emotion and opinion come naturally and without effort. For me, valid history derives from informed empathy, not the other way around.

As far as recording history for future study, ultimately, it will have been people again recording and studying all of it, “it” being all of what people were doing to attract attention, and “doing” being whatever we care to remember and record. It’s all a bit cyclical, in itself, and completely biased, and someone will always be left out. So people might be forgiven when shaking their heads in judgment of the past because, without empathy, what else could they possibly know besides themselves?

The Latest Visual WHY

Click here to read the latest Visual Why.

Jean-Baptiste Regnault - Socrates Tears Alcibiades from the Embrace of Sensual Pleasure (1791)
“…could we ever know what art makes oneself better, if we were ignorant of what we are ourselves?”

“…a picture’s usually worth even more than the thousand words we commonly ascribe.”

The word “text” derives from the Latin texere, meaning to weave or fit together. For me, text connotes far more than just the printed word – photography, movies, music, sculpture, architecture, the list goes on and on. The Visual WHY offers a specific look at paintings, texts with no less substance and arguably far more aesthetic appeal. But underpinning the textuality of art altogether is its human endeavour. And beyond weaving something together for the sake of weaving, weavers – artists, people – have a further end, communication. Artists across all media are still people with influences and motives for expressing themselves. Conjointly, texts of all kinds are plenty human: provocative and reflective. Whether rich and symbolic for a global audience, or doodled sketches for your own amusement, art is text, and text has purpose. As we try to understand it more thoroughly, we can’t help but raise the level of discourse. Who knows, someday maybe art will save the world…

For those who’ve been wondering about the painting featured both here and on this site’s front page, this latest update will explain that, too.

Click here to read the latest Visual Why.

Catch-22: A Masterpiece by Joseph Heller

WARNING! This post is an analysis and celebration of Joseph Heller’s novel, Catch-22, and it DOES contain PLOT SPOILERS. If you wish to read the novel for the first time, do not read this post.

“Give the first twelve chapters a chance” has long been my advice to anyone who asks about Catch-22, Joseph Heller’s modernist masterpiece that critiques the absurdity of the military during wartime. If you haven’t read the book, I will hardly spoil things by explaining how eagerly we witless first-timers set out to read such a lauded modern classic, only to be confronted by what might be the most frustrating paragon of show-versus-tell in existence. (However, I will be discussing spoiler details from here on, so be warned.) From the seemingly disparate chapter titles to the disjointed narrative, which repeatedly folds back upon itself, from a maddeningly mirthful plot device, which tempts you to toss the book aside and deny its existence, to an irresolute closing – if you make it that far – the book continually challenges readers to deduce what’s happening and piece together what’s happened. Toss in what seems like an endless cadre of characters, ranging from odder to oddest to perhaps not so odd, and the book is a challenge, no question.

For seven years, I assigned this book as summer reading for returning seniors. Oh, how the students complained about those twelve chapters – excessive! pointless! irritating! – only to feel more aggrieved at hearing, “Exactly,” my necessary reply. Once the venting subsided – usually at least half the first lesson – we’d begin discussing why Heller’s book could only be written this way as compared to some more conventional, accessible way.

For one thing, we need to meet the protagonist, Yossarian, and understand his circumstances so that, at appropriate upcoming times, which of course will have already occurred, we won’t criticise but will instead favour him. To this end, the entire story is told out-of-sequence, opening apparently in medias res during Yossarian’s hospital stay. We have character introductions and letter censoring, foreshadowing how words and language will be manipulated while characters will be isolated, alienated, and demeaned. Subsequently, we learn the logic of Catch-22 from Doc Daneeka. And that Snowden dies. If we’ve navigated the twelve opening chapters and lived to tell about it, we learn that Yossarian, originally a young, excited airman, once needed two passes over a target in order to bomb it successfully, which gets his crewmember, Kraft, killed. Yossarian is further distressed upon returning when he receives a medal for the mission. Meanwhile, Milo opens his syndicate. The tension of tedium, the injustice of fortune. The folly of command, the depravity of humankind. Capping the story is the gruesome account of Snowden’s death, the key incident that incites Yossarian’s fear and lands him in hospital, where we first meet him – naturally, Heller waits until the end to tell us the beginning.

Heller writes with an absurd, illogical narrative style that characterises Yossarian’s internal eternal predicament, wending its way through isolation, alienation, discord, misery, paranoia, fear, senselessness, deception, vice, cruelty, even rape and murder. Catch-22 being what it is, its victims have zero chance to overcome because the antagonists are permitted to do whatever the protagonists are unable to prevent. All along the way, Heller has Yossarian wanting out of the military (fly no missions = live), and he continually ups the ante between Yossarian and all the disturbing confrontations and contradictions that antagonise him, from his enemies and his commanders to his acquaintances and his comrades. But ultimately, and most potently, he has Yossarian suffering from his own self-interest. As the narrative flits and tumbles about, in its own progressive way, Yossarian’s self-interest evolves or, better to say, devolves. What does evolve, inversely to self-interest, is his compassion: he gradually grows more concerned for the men in his squadron, a concern which, by Chapter 40, “Catch-22,” has extended to all innocent people beset by oppression, prejudice, and exploitation. So when Colonel Cathcart’s promised deal to send him home safely, definitely, comes ironically (fittingly!) at the expense of the squadron, Yossarian ultimately recovers enough self-reliance to overcome his personal anguish but not enough to remand himself to the cycle of absurdity. Given Heller’s dispersed timeline, describing Yossarian’s character development as a narrative arc or an evolution is less accurate than the piecing together of a jigsaw or the unveiling of a secret.

Perhaps unsurprisingly, Yossarian’s instinct for self-preservation is the source of his personal torment. His despondency and disgust over the preponderance of all human self-interest finally turn Yossarian toward his decision to go AWOL, at criminal risk but toward personal safety. Such a climax works because readers – like Yossarian – are no longer fighting back but giving in, yet even then Heller offers no respite – the story ends ambiguously, leaving readers to satisfy their own vexation. Even so, I suspect that Heller appreciated John Chancellor’s life-imitating-art initiative as one inspired by more than a spirit of fandom. So where some characters have been subjects of compassion, others agents of absurdity, readers’ resultant responses have also undergone a perfectly natural evolution, mirroring Yossarian’s character development and culminating with his terrifying walk through Rome. The horrors of “The Eternal City,” in this light, are not only an essential but an inevitable piece in Heller’s plan.

Yossarian’s shall-we-say militant decision to desert is born of Snowden’s ugly death during the Avignon mission, only a week after the death of Kraft and the award of Yossarian’s medal. Seeing Snowden’s innards spilling rudely from his body nauseates Yossarian and haunts him throughout the entire story (or, from Yossarian’s perspective, for the rest of it). Yossarian, inset by Heller as a protagonist on behalf of soldiers, has no way of making things better. His futile effort at comfort, “There, there” (p. 166), is comically insincere for its honest helplessness, an understated shriek from all soldiers continually sent to face death – not death without context but without resonance. However, for Yossarian and his comrades, the context of sacrifice is all too irrationally clear: thanks very much. Catch-22. Soldiers face the dilemma of following orders that entirely devalue their very existence.

Participation as a soldier offends Yossarian to the core, yet it also helps him to reconcile his fear over death: “… man is matter,” finite, mortal and – without spirit – simply “garbage.” In fact, this sentence sums up human worth in a blunt statement: “The spirit gone, man is garbage” (p. 440). Six words of sad, harsh consequence: war, no longer wearing a comic mask. The absolute phrase, a terse syntactical effect, annuls man’s significance – spirited briefly, gone abruptly, an empty corporeal body left over, garbage. Garbage is a harsh image – rotting flesh, buzzing flies, scum, residue, stench. Pessimism, cynicism, worthlessness. On such terms, one wonders whether anyone might willingly die to save themselves, as it were – another troubling revelation engineered by a masterpiece of unprosaic illogic. Yet even on this point, Heller’s genius is flawless. Haunting though it is, Snowden’s death gradually reveals to Yossarian the very path to life and safety that he has pursued ever since the opening chapter in the hospital – which is to say, ever since Snowden’s death drove him there in the first place.

This is why Heller refers to Snowden’s death, specifically his entrails, as a “secret”: to reveal it any earlier would be to end the novel. And he calls it Snowden’s “grim secret” to illustrate Yossarian’s suppressed mental anguish. Heller has Yossarian recall Snowden a number of times, each admitting more detail, each growing more vivid, each driving him a little closer to his final resolution. Heller’s portrayal of Yossarian’s traumatised memories in this way suggests the nightmarish flashbacks that people, particularly soldiers, endure following the horrors of war. His final flashback in Chapter 41, “Snowden,” is prompted when Yossarian wards off the mysterious stranger in – where else? – the hospital. It’s most revelatory for Yossarian – and readers, by extension – because, here at the end of his patchy, appalling flashbacks, he is finally secure enough to divine for himself – or is it to admit to us? – the grim secret found in Snowden’s entrails. In the same way, the climax is most revelatory for readers who – at the mercy of Heller’s dispersed narrative structure – have been made to wait until the closing, when the time is finally ripe.

To get there, we are dragged unwittingly by Heller down a path of frustrating sympathy, illogical absurdity, and agonising anticipation. By the time Yossarian is introduced (in the opening chapter!) censoring letters and conniving a way to escape the war, he is that much nearer to desertion than we can yet know. Certainly, Snowden will convince us to desert as surely as he convinces Yossarian, but that will happen later, after Heller has aggravated our tolerance and mottled our innocence. Heller must drag us down Yossarian’s agonising path, or else he places us at risk of passing premature judgment upon not merely his protagonist but his entire message. Finally, when the moment arrives that we gather full appreciation of Snowden’s death, we have all we need to share in the vindication of Yossarian’s desertion.

So here is our way to grasp the grim secret behind the novel’s dissembling structure as restlessly and imperturbably as Yossarian does: the root of conflict, Snowden’s death, can only occur at the end of Heller’s narrative path, not Yossarian’s. The story simply works no other way.