Well, you had to know this one was coming… a meditation upon Hamlet.
This meditation, though, also happens to be a treatise on curriculum. I wrote this essay last year for a course I took with Dr William Pinar, who is Curricular Royalty on top of being a super guy. And, like me, he taught secondary English, so I felt I had a sympathetic ear.
Dr Pinar’s course was driven by chapters he was writing for a book about George Grant, who was (among many things) a philosopher, theologian, educator, and Canadian nationalist. Dr Pinar’s book is about Grant’s critique of time, technology, and teaching.
The series of posts, “A Kind of Certainty,” comprises my final paper, in which I attempt to present Hamlet, the character, by way of the same treatment that Dr Pinar presents Grant. That said, I don’t address technology here (although I do address it here and here), focusing instead upon teaching and curriculum, and granting due respect to the concept of time.
I debated how I might present this essay, whether to revise it into something more suited to the style and structure of my other blog posts. But it just proved far too difficult to change or remove anything without drastic revision, essentially having to rewrite the entire paper, so here it is in academic trim… citations, endnotes, and all – Dr Pinar is a big fan of endnotes, by the by, so that’s the explanation there.
I taught Hamlet in English 11. Over a unit that typically lasted five months, we considered, among other concepts, certainty and faith. One example I used to illustrate these was to ask a student why she sat down so readily on her classroom chair. She would be puzzled and say something like, “Huh?” My reply was to note how much faith she evidently placed in that chair to support her without collapsing. Then she would laugh, and I would ask further whether she knew the manufacturer, or the designer, but of course she knew neither. Then I would ask how many other chairs that week had collapsed beneath her, and (apart from one, unfortunately!) the reply would be, “None.” My point, of course, grew clearer to everyone as this conversation progressed, so my next question was to the class: “How many people rode in a vehicle sometime this past week?” Once most confirmed it, I would ask the same basic question as that of the chair: how were you certain that vehicle was safe? I was more tactful where it came to car accidents, usually using my own spectacular examples (… I have two). Ultimately, my claim was that we might have as much as 99% certainty, yet for whatever doubt exists, we rely on faith, or else we would never sit in chairs, or drive in cars, or whatever else. As my tone grew more grave, so did their nods and expressions, as if we ought to be dropping Hamlet to study car mechanics, or industrial first aid.
My students were typically alarmed when they realised their faith was only as certain as its object, be it a sturdy or rickety chair. Extremes present themselves obviously enough, but in any case of such offhanded faith, we make ourselves collateral. As if we live on credit, certain that all will remain as it has done, we borrow on faith against our future well-being until it comes time, as the fable has it, to pay the piper. Meanwhile, what seems certain to us we literally take for granted, begging the question with impunity, I suppose, since every day the sun continues to rise. Every day, we overlook the caution, familiar to investors, that past performance does not necessarily indicate future potential, or as they say in the casino, the House never loses.
Maybe we never stop to consider just how loosely we play with certainty and faith in our day-to-day because doing so might mean never again stepping outside the door – no sense everyone being as hamstrung as the Prince of Denmark. Having studied the play as much as I have, I find every one of its concepts up for debate – arrghh – and where certainty and faith can actually seem either opposed or synonymous, that determination depends on yet another concept from the play, perspective. In any case, where it comes to certainty and faith – at least from my perspective – Hamlet is particularly instructive.
No matter your perspective, I would warn students, no matter where you stand or land, the play will then present you with a challenge of certainty, something I called the “Yeah, but…,” which was naturally a source of unending frustration. Conversely, and ironically, it was also a source of certainty since, like Hamlet in duplicitous Elsinore, at least we can be certain that everybody else thinks, shall we say, uniquely, if not differently. Hamlet’s return home to the web of Catholic Elsinore from the symbolic bastion of Lutheran reform, Wittenberg, on account of his father’s death, finds him divided not unlike the Elizabethans comprising Shakespeare’s audience, caught between two branches of Christian belief. The Bard besets his tragic hero with a matrix of inner turmoil – both secular and spiritual, of fealty and faith – a tesseract of beliefs such that Hamlet cannot reconcile any one to another, even as he quakes yet pines for some grand repose. For each possible value he might set down in his tables, his same self-assurance prompts Hamlet to pose questions more profound, rendering him unable to decide about, well, anything. Doubting that anyone can even interpret what it means to exist and, thereby, doubting that concern over living, or dying, or even debating the question is worthwhile, Hamlet, like the actors he so admires, effectively stands for nothing. As such, I admitted to my students, he was hardly an exemplary role model.
So, I suggested, to avoid the debilitating trap that befalls the brooding Prince, that of “thinking too precisely on the event” (Shakespeare, 1997, 4.4.41), we must simply and ultimately decide what we believe after having drawn such conclusions from the best available evidence. Easily said, yet is this not exactly what Hamlet is trying to do? Little wonder students find him so frustrating. Then again, I pointed out, all our sighing and huffing is its own judgment call, a very palpable hit borne of the frustration of those who are upset with him. With Hamlet’s inability to decide for most of the play comprising most of the play, and with him chastising his own cowardice and rebuking God-given reason as a consequence (2.2.571-580, 4.4.36-39, 43), a spendthrift sigh of our own is hardly unreasonable. On the other hand, observed one student, well on her way to modern material success, he sells tickets. Unquestionably, yes, Shakespeare made a meal of Hamlet making a meal of things. And, even though he doomed his protagonist from the start, the playwright does release Hamlet from his torturous hamster wheel – mercifully? – just before he meets his grand moment of truth.
Throughout the play, Shakespeare includes what I call “Let…” statements. Of particular significance are the following four statements, presented here in sequential order:
Of Claudius’s machinations, Hamlet tells Gertrude to “let it work” (3.4.205)
Exacting vengeance for his father’s murder, Laertes will “let come what comes” (4.5.136)
Having finally made peace with the certainty of death as well as the uncertainty of what lies beyond, Hamlet tells himself (alongside Horatio) to “let be” (5.2.224)
Later, as Horatio confronts doubts of his own, Hamlet tells him to “let go” (5.2.343)
Alternatively arranged, these statements help comprise, for me, a response to the famous question, “To be, or not to be.” This alternative arrangement derives from a sentence analysis exercise that my students and I would complete while preparing for the play. The sentence is from an essay by Drez (2001) about American pilots during WWII: “There were no souvenirs, but the grisly task of scrubbing decomposing remains from their boots later left a lasting memory” (p. 144). Briefly, the words later, left, and lasting illustrate the creation and the span of the airmen’s memories over time – the future, past, and present, respectively – made all the more ironic since the souvenirs they found were hardly the ones they sought. Using these three words alongside my own interpretation of each “Let…” statement, I have arranged them chronologically out-of-sequence with the play, using instead an interpretive application of temporality as three discrete periods to challenge the common concept of linear time as historical calendar pages or a ticking clock.
Click here to read Pt II: Curriculum, or What You Will
 Shame on us for carrying on so fallaciously! At pedestrian-controlled stoplights, we eventually step off the curb believing that drivers have halted their oncoming vehicles rather than carrying on through and running us down. To call the stoplight “pedestrian-controlled” is somewhat of an embellishment on the part of the city engineers, I think, a deferral to who really is favoured, for whatever reason, in the equation. But for the pedestrian to step off the curb is an act of faith, surely, since they cede control to the driver, who has the car’s capability to accelerate and manoeuvre at his disposal. For that brief moment, only the driver’s motives keep the pedestrian safe. And careful though we are, accidents still happen in such everyday circumstances. Worst of all, as more recent times demonstrate, cars and trucks can be used precisely as weapons of terror against innocent people; the danger I speak of, the giving-and-taking of control, however uncommon, has now been realised. That changes attitudes profoundly.
Security measures, safety audits, protective equipment, government regulations – on and on goes the list of processes and people in which we place our faith, believing with some degree of certainty – or, as often as not, taking for granted on faith – that proper standards are being met that ensure our safety.
 Just my interpretation, mind you, “duplicitous Elsinore.” Certainly, you will have your own analysis.
 Since the time of those events described in the New Testament, their interpretation has divided Christian belief into myriad denominations, such as those found in both Shakespeare’s play and Elizabethan England: Catholicism and two respective branches of reform, the Protestant Reformation initiated by Martin Luther and the English Reformation decreed by King Henry VIII. I simply use “Christian belief” in a broad sense, wanting to avoid the suggestion that any particular denomination tops some hierarchy, since that sort of debate, here, is beside the point.
 For the duration of the essay, I shall refer to quotes from this cited edition of the play.
 Regrettably, but unsurprisingly, I’m hardly the first to devise this response to the famous question. Evidently, where my approach differs from other examples (Baumlin & Baumlin, 2002; Critchley & Webster, 2011) is connecting the four specified “Let…“ statements and Hamlet’s closing lines (5.2. 222-223, 358) with concepts of temporality.
 A full explanation of the four “Let…” statements and temporality demands its own essay, and I am already deep enough into Hamlet as it is, so for my weary negligence I ask some gracious leeway instead of a challenging “Yeah, but…”. Suffice to say, though, as we might feel this way or that about past or future, we still must inherently live each present moment, such as we are.
I used to say to my students, “Find the overlap between our English coursework and, say, Trigonometry, or the link from persuasive writing to PhysEd. Where does Hamlet end and organic chemistry begin? Find that one out… there’s genius in that.” The courses my Department offered were called “English” and, helmed by some teachers, they were more traditional, as one might expect. The most common feedback I received from students, though, was how unlike English our coursework seemed to them. I took those remarks as a measure of success: my aim was to prepare young people, soon enough entering the world as older people, to be responsible… to families, communities, careers, and so forth. For me, that’s the purpose of school and its teachers.
What prompted me to reflect was reading Science, Order, and Creativity, by David Bohm and F. David Peat – specifically, such remarks as “the appropriate relationship between thought and experience… [in which] creative new perceptions take place when needed” (p. 49). That distinction between thought and experience reminded me of another distinction, this between dialogue and conversation. And again I was prompted to recall my English courses – what we had, I’d say, were definitely conversations, scratching new surfaces and digging into things with fluid spontaneity, as compared to the “my turn / your turn” protocol of dialogue, which might dig one trench but deeper and deeper. Where dialogue strikes me as instrumental, a means to an end, conversation is an end in itself, without start or finish but continual – that is, until the bell rings. We notoriously lived beyond the rigour of scheduling in some of my courses.
Those conversations were hard to let go. And what exactly were we after? “The creative person does not strictly know what he or she is looking for,” say Bohm and Peat. “The whole activity [is] play itself,” and no better description of teaching (at least, my teaching) have I ever read. Who knew I was so creative? Not me, although I did have fun. So who knew teaching was just so much play? “The play’s the thing / wherein I’ll catch the conscience of–” well, anybody, really. I should clarify that I respected my colleagues and our Departmental philosophy as well as my professional obligation to Ministry curricula. At the same time, I relied on my own interests and concerns to guide our coursework, by day and by year. The result was a mixture of reading, discussion, writing, and presenting about topics as disparate as literature, film, fine art, civics, politics, economics, philosophy, and etymology – yes, even science and math – all bundled together in a process of classical rhetoric. Eventually, I developed a suitably disparate canon of texts, too, that flowed meaningfully from English 9 through 12. And I relied on students’ differences to alter and adjust the flavour however they might. I loved teaching for how creative it allowed me to be, and for how much creativity it provoked in my students. “Let come what comes,” Laertes tells Claudius – brazen, even foolhardy. Genius, perhaps?
Bohm and Peat seem to suggest that genius is not creativity per se so much as the effect of having challenged some assumptions, and maybe that’s a mere semantic distinction. Either way, I like the notion. Later, reading Allen Repko, I found myself nodding likewise at what he calls “boundary crossing” (p. 22). There it was, this discovery of common threads in disparate disciplines, this crossing of amorphous boundaries, what my students have heard me call “genius,” although I might now redefine that trait as “ingenuity.” Accompanying “boundary crossing” is a reaching across disciplines, with intent, what Repko calls “bridge building.” This, I think, I would call visionary. Discovery and vision: both are what I, as a teacher, would forever consider meaningful developments of the learning process.
Repko also points out the origin of the word “discipline,” deriving from the Romans and their need to “relate education to specific economic, political, and ecclesiastical ends” (p. 32). How delightfully Roman! I thought, reading that. Such instrumentalism, “the logic of utility.” Finis at its finest: How long, O Lord? Will their legacy never end? But I trust in teaching and my unfailing students.
I enjoyed sixteen years teaching Secondary English to brilliant students. In that time, we developed a philosophy, addressed the BIG Questions, and fed our curiosity. But my planning process was seldom more than make-it-up-as-we-go. “We could never get away with this in Math,” I used to say to them, “although if you do find a way, I’d love to hear about it.”
So, it’s interesting, listening to people talk these days, quite frankly, in terms of their words, their language, their speech. I have an issue with what everyone’s saying – not like everyone everyone but, you know, it’s just their actual words when they talk about complex issues and such, or like politics, what with the whole Trump thing, you know, that Russia probe and the Mueller investigation and everything that goes with that. I’m also a bit of a news hound, and that’s really where I started noticing this on-air style of speeching, of making it sound thoughtful and taking them seriously.
And it’s so much out there, like an epidemic or something, which is interesting, which speaks to on-line streaming and TV news, talk radio, and pretty much the whole 24-hour news cycle. I was a high school English teacher for sixteen years, and I also started noticing all this, you know, frankly, during class discussions, too. And there was me, like guilty as anyone.
Here’s the thing, though, because I guess substance will always be up for debate, but that’s just it – it’s so wide-ranging that it’s like people have no idea they’re even doing it, which is interesting. It’s almost like it’s the new normal, which really begs the question – are people getting dumber? Is education failing us? In terms of intelligent debate, that will always be something that probably might be true or false. And let’s have those conversations!
But in terms of intelligible debate, it’s interesting because, when I listen to how people are talking, it gets really interesting because when I listen what they actually say, it’s like they’re making it all up on the spot in the moment as they go, so it’s just that that makes me not as sure it’s intelligent as it’s less intelligible. But it’s all in a sober tone, and they’re just expressing their opinion, which is democracy.
And that’s the thing – if you challenge anybody with all what I’m saying, clarity-wise, it’s interesting, they’ll get all defensive and whatnot, like it’s a personal attack that you’re calling them stupid or whatever, like you’re some kind of Grammar Jedi.
And, I mean, I get that. So that’s where I think people don’t really get it because I totally get where they’re coming from.
Seriously, who would want to be called like not intelligent or anything all like that, whatever, especially if we’re trying to discuss serious world issues like the whole Russia thing that’s been happening or the environment or all the issues in China and the Middle East? Or terrorism and all? I mean, if you look at all that’s happening in the world right now, but you’re going to get that detailed of the way someone talks, maybe you should look in the mirror.
And I mean, SNL did the most amazingggggggggg job with all this, back in the day, with Cecily Strong on Weekend Update as The Girl You Wish You Hadn’t Started A Conversation With At A Party. Comedy-wise, she even like makes a point, but basically, she’s furthering on intelligence, except I’m talking about intelligibility. But still, if you haven’t seen it, what can I tell you? Your missing out, SO FUNNY. She. Is. Amazing.
And that’s the other thing, and this one’s especially interesting, is just how there’s just SO MUCH out there, what with Google and the Internet, and Wikipedia and all, so who could possibly be expected to know like every single detail about all the different political things or the economy and all the stuff that’s out there? And it’s even more with speaking because pretty much most people aren’t like writing a book or something. (Well, and that’s just it – nobody speaks the way they write, so… )
Anyway, so yeah, no, it’s interesting. At the end of the day, first and foremost, one of the most interesting things is that everybody deserves to have a say because that’s democracy. And I think that gets really interesting. But the world gets so serious, probs I just need to sit down. See the bright side, like jokey headlines from newsleader, Buzzfeed, or 2017’s “Comey Bingo” from FiveThirtyEight. Gamify, people! News it up! Nothing but love for the national media outlet that helps gets you wasted. Or the one about the viral tweet, for an audience intimately familiar with pop culture? News should be taken seriously, and the world faces serious aspects, for sure. But the thing is, work hard but party harder! I mean, we’re only here for a good time, not a long time!
And it’s interesting ‘cuz people seem to require more frequent, more intense, more repeated engagement, to spice up their attention spans. There’s some good drinking games, too, on that, because politicians! I know, right? But not like drunk drunk, just like happy drunk, you know? Not sure if all this counts as it means we’re getting dumber, per se, but it’s just interesting.
So, yeah, it’s interesting because we’ve come such a long way, and history fought for our freedom and everything, so I just really think going forward we should just really appreciate that, and all, you know?
What is the story? In a nutshell, Facebook permitted C.A. wide access to its users’ private data, without their consent. Authorities now suspect that C.A. used the data to exert political influence in various countries around the world, specifically by way of on-line advertisements and news stories, both factual and contrived. People were fed information tailored to appeal to them and thereby challenged to discern the factual from the fictional.
Here is a sampling of reports about the story and its fallout, some of which has been severe.
Raising the level of discourse is all about discerning and appreciating people’s motives, thereby helping to determine what people are after, in order to understand why they do what they do.
To my students, I am confident that our coursework has prepared you to face the kind of cognitive assault launched by C.A. on people across the world. For my part, I’m comfortable keeping my Facebook account open… for the time being, anyway!
To everybody else, yes, absolutely this post is a plug for raising the level of discourse, an approach I encourage all of you to consider.
Read more about my own general take on Facebook here.
A needfully challenging appeal to raise the level of discourse, and an appropriate inclusion to The Rhetorical WHY, from an Opinion piece in The New York Times (Feb 22, 2018) by Op-Ed columnist, Bret Stephens:
“This is the text of a lecture delivered at the University of Michigan on Tuesday [Feb 20, 2018]. The speech was sponsored by Wallace House.
“I’d like to express my appreciation for Lynette Clemetson and her team at Knight-Wallace for hosting me in Ann Arbor today. It’s a great honor. I think of Knight-Wallace as a citadel of American journalism. And, Lord knows, we need a few citadels, because journalism today is a profession under several sieges.…” [continue reading]
Some thoughts of my own on the significance of a free press to our lives…
Arendt: “… how can you have an opinion if you are not informed?”
Everybody has opinions – our five senses give us opinions. In order to be “informed,” we need discernment enough to detect accurate information.
Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”
For me, continual lies ultimately yield zero trust, but again, how would I know who’s even lying, but for my own discernment and experience?
At the least, if I were aware that all around were lies, that much I’d know is true. It’s not that “nobody believes anything any longer,” so much as it’s “everybody goes about searching out truth on their own.” The downside is when those individual searches for truth become disrespectful, as we’ve seen lately, or worse, chaotic.
Nevertheless, investigate! Accept responsibility to inform yourself. Accept or believe all with a grain of salt until such time as you can prove to your own satisfaction who and what are trustworthy. And, at that point, be tolerant, if not respectful, of others – this applies to everybody, all sides, liberals and conservatives and all points between. Taking the high road is not to be done with pride or smug assurance. It’s easy to nod and say, “I already do while others do not,” but even so, reflect upon yourself with each conversation, each debate, each exchange.
Open-minded and open-hearted – both are virtues, but they don’t have to be the same thing.
Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”
On its face, this statement could only be accurate if you had some clairvoyance or a crystal ball.
By “everybody” doing their own investigation and accepting responsibility to inform themselves, I mean everybody. We’re able to trust news & media sources to the extent that they have lived up to their responsibility… to the extent we’re aware that they have. I support proper, professional investigative journalism and public intellectualism, both of which I gather to be in decline.
Finally, I offer two sets of remarks about journalism by two long-retired anchor-journalists of PBS fame, partners Robert MacNeil and Jim Lehrer. The first is transcribed from an exchange between them during a tribute to MacNeil upon his retirement in October 1995. The second – comprising two parts – is Lehrer’s closing words upon the “retirement” of his name from the title of the PBS NewsHour, on December 04, 2009. Following that, I’ve included a thoughtful follow-up by the PBS Ombudsman, Michael Getler, published the next week on December 11.
MacNeil’s remarks upon his retirement (October 20, 1995)…
MacNeil: You know, I’m constantly asked, and I know you are in interviews, and there have been a lot of them just now – I’m constantly asked, “But isn’t your program a little boring to some people?” and I find that amazing, because, well, sure, it probably is, but they’re people who don’t watch. The people who watch it all the time don’t find it boring, or they wouldn’t watch.
Lehrer: That’s right.
MacNeil: And it’s the strange idea that’s come out of this medium, because it’s become so much a captive of its tool – as its use as a sales tool that it’s driven increasingly, I think, by a tyranny of the popular. I mean, after all, you and I’ve said this to each other lots of times – might as well share it with the audience: what is the role of an editor? The role of an editor is to make– is to make judgments somewhere between what he thinks is important or what they think is important and what they think is interesting and entertaining.
Jim Lehrer’s guidelines of journalism (December 04, 2009)…
Lehrer: People often ask me if there are guidelines in our practice of what I like to call MacNeil/Lehrer journalism. Well, yes, there are. And here they are:
* Do nothing I cannot defend.
* Cover, write and present every story with the care I would want if the story were about me.
* Assume there is at least one other side or version to every story.
* Assume the viewer is as smart and as caring and as good a person as I am.
* Assume the same about all people on whom I report.
* Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.
* Carefully separate opinion and analysis from straight news stories, and clearly label everything.
* Do not use anonymous sources or blind quotes, except on rare and monumental occasions.
* No one should ever be allowed to attack another anonymously.
* And, finally, I am not in the entertainment business.
Here is how I closed a speech about our changes to our PBS stations family last spring:
‘We really are the fortunate ones in the current tumultuous world of journalism right now. When we wake up in the morning, we only have to decide what the news is and how we are going to cover it. We never have to decide who we are and why we are there.’
I am struck by the continuity of their respective final comments about entertainment – each, in his own way, seeks to distance journalism from vagary, each thereby implying that we are susceptible to emotional or whimsical tendencies, which evidently seem capable of overtaking our focus on learning; otherwise, why mention the point at all?
Watch Lehrer’s remarks here, in a functional if awkward series of video archives of that 2009 broadcast.
In May 2011, upon Lehrer’s retirement, MacNeil returned to offer his own reflections upon his friend and colleague that include some further worthwhile commentary upon contemporary TV journalism.
I recall “Lehrer’s Rules,” as they were called, making a small stir – some of it substantive and meaningful, and some of it the critical “woe-is-us” lament at the passing of favourite things. In amongst it all, as I mentioned, were the following comments from PBS Ombudsman Michael Getler, which I include here, at length, on account of PBS webpages’ tendency to disappear.
In fact, a number of the PBS pages where I found these articles are no longer active – where possible, I have checked, updated, and even added weblinks. But I believe Getler’s comments, like the rest, are worth preserving, on account of their potential to provoke us to think and learn more about a free press and its relation to ourselves.
A couple of people wrote to me in the aftermath of that Dec. 4 sign-off to say how much they liked Lehrer’s guidelines and asked how they could get a copy. That’s why they are reproduced above. A subscriber to the widely-read Romenesko media news site also posted them there on Dec. 6 and they also were posted on the campus site of the Society of Professional Journalists (SPJ). “Whether you agree with all of Lehrer’s guidelines, or not,” that posting read, “he has surely earned our attention.”
That’s certainly true in my case. I’ve also been a devoted watcher of the NewsHour in all of its evolutions during most of the past 30-plus years, long before I took on this job four years ago. Although segments of the program have been the subject of critical ombudsman columns on a number of occasions, I’ve also said many times that it remains the best and most informative hour of news anywhere on television, and it has never been more important. I follow the news closely but almost always learn something from this broadcast every night.
Boring, at Times, But a Luxury Always
Sometimes, of course, it can seem boring. Sometimes the devotion to balanced he said/she said panel discussions can leave you frustrated and angry and no smarter than you were 15 minutes earlier. Sometimes the interviewing is less challenging than one might hope. But the luxury of an uninterrupted hour of serious, straight-forward news and analysis is just that these days, a luxury. And, in today’s world of media where fact and fiction, news and opinion, too often seem hopelessly blurred, it is good to have Lehrer – clearly a person of trust – still at work.
I had the sense when he added his guidelines to that closing segment last Friday that the 75-year-old Lehrer was trying to re-plant the flag of traditional, verifiable journalism that he has carried so well all these years so that it grows well beyond his tenure – whatever that turns out to be – and spreads to all the new platforms and audiences that the contemporary media world now encompasses.
Oddly, I did not get any e-mail from viewers commenting on the new NewsHour format, other than one critical message that said “do not post.” Maybe that’s a good sign since people usually write to me to complain.
Make no mistake, the now defunct NewsHour with Jim Lehrer is still quite recognizable within the new PBS NewsHour. So those who wrote earlier and said they didn’t want any change won’t be terribly disappointed. I, personally, found the first few days of the new format and approach to be a distinct improvement. The program seemed to have more zip and energy, faster paced, with good interviews and without the always predictable language that introduced the show in the past. It presented its news judgments more quickly, benefitted from the early introduction of other top staff members as co-anchors, and from the introduction of a promising “new guy,” Hari Sreenivasan, a former CBS and ABC correspondent who presents a headline summary from the newsroom and is the liaison to an expanded NewsHour Web operation.
Naturally, a Few Quibbles
Now, just to keep this a respectable ombudsman’s column, let me add a few quibbles when it comes to Lehrer’s rules, as posted above.
First, one of the interesting things about American journalism is that there are no agreed-upon national standards, no journalistic equivalent of the Hippocratic Oath for physicians. There are, of course, many universal values and practices that vast numbers of journalists have voluntarily adhered to for many years, best exemplified by SPJ’s Code of Ethics. But the fact is that all major news organizations – from the Associated Press to the New York Times to PBS and CBS – have their own guidelines and standards that they try to live by. And they all have their differences.
Lehrer’s guidelines embody lots of the good, praiseworthy stuff, and we come out of the same journalistic generation and traditions. But I think on a couple of points they are actually too nice, too lofty, cruising somewhere above some of the grittier realities of journalism.
For example, “Assume the viewer is as smart and as caring and as good a person as I am. Assume the same about all people on whom I report.” Really? Bernard Madoff? Osama bin Laden?
Then there is: “Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.” I would argue, and have, that the NewsHour withheld from its viewers a legitimate turn in a major story last year – one reported by all other major news organizations – when it declined to inform them that John Edwards, a former senator and former candidate for the vice-presidency, had issued a public statement and gone on ABC Television to acknowledge an extra-marital affair with a woman his political action committee had hired to make films for his campaign. That’s news.
Finally, there is, “Do not use anonymous sources or blind quotes, except on rare and monumental occasions.” I agree about the blind quotes when they are used to attack someone personally. But anonymous sources have often proved to be absolutely crucial to the public’s right to know what’s really going on in scores of major stories as they have unfolded from Watergate to secret CIA prisons overseas.
The most accurate and important pre-war stories challenging the Bush administration’s on-the-record but bogus case for Iraqi weapons of mass destruction were based on anonymous sources. Many of those stories, in part because they were based on anonymous sources, got buried or underplayed by newspapers at the time. Many of them never got reported at all on television, including the NewsHour. But there are times when there are mitigating circumstances – like internal threats within an administration or maybe jail time for leakers – when some sources must remain anonymous and when editors need to trust their reporters. And often you don’t know if the occasion is “rare and monumental” until it is too late. Pre-war Iraq, again, being Exhibit A.
As far as I understand Jacques Derrida’s différance, he observes that we understand our experiences as distinctive, but not exhaustive, communicated links or marks comprising an on-going decisive chain of experiential moments. As to the language we use to describe our experiences, a word has contextual meaning, both from its usage at any given time and from its etymology over the course of time. I tend to agree with this attendance to context as furnishing meaning, and I can also spot the rabbit hole that it poses. For example, to understand some word’s definition, I might look it up in the dictionary and be left to rely upon the definition of whoever decided what it meant while, at the same time, facing all sorts of words in the definition that now need looking up, too – Sisyphean, indeed! Cruel but so usual. On the other hand, thanks to whoever compiled the dictionary – a pretty utile compendium, I have to say.
To be clear, I am not intending to invoke logocentrism, by which all our words are accorded a decided meaning from a cultural centre, which propagates existing biases or “privileges”; Derrida would roll over in his grave. Granted, I may already have laid grounds here to be accused of logocentrism, myself, by writing with words (and I confess to using English because I didn’t think anyone had the patience to muddle over Wingdings). My present aim is to suggest how we might address the afore-mentioned rabbit-hole dilemma by searching for or (… almost afraid to say it) by deciding upon some definitions of our own. Not like a dictionary, but more like – well, yes, okay, like a dictionary, but one that we’ll fashion from the ground up, like when the light bulb would go on above Darla’s head, and Spanky would snap his fingers to say, “Hey, everyone! Maybe we can put on a play!” So, in the spirit of dissemination, hey everybody, maybe we can compile a dictionary! A real, deconstructive, crowd-sourced dictionary!
I’m not really compiling a dictionary. I’m just trying to make some sense of Derrida and différance. Let me try to illustrate what I mean from my own experience. Sometimes I play Walking Football, a version of the game where players are not permitted to run. Naturally, the debate is over what differentiates walking from running. We’ve agreed that walking means “always having at least one foot in contact with the ground during the striding motion.” Running means “having both feet leave the ground at some point during the striding motion.” This makes for certainty, so long as our eyes are trained enough to spot feet in motion, which I can spot sometimes so clearly, with such immediacy, that it’s more like I’m watching, not playing – I’m finding it tough even now to get the right words, but trust me. And so long as each player is willing to obey the rules – and, ohh my, there’s always that one player who just won’t. You know who I mean… *sigh… Anyway, so long as they’re not just words uttered that then float away in the breeze, our definitions of the rules for walking and running are useful.
Luckily, too, I might add, when we clarify the rules, we do so out loud, together, and don’t whisper them around a circle, like when my daughter plays Telephone at a birthday party – after all, we want everyone to be clear. Finally, even if we have trouble spotting feet in motion, because it all happens too quickly, or even if that one player is a cheater at heart, the definitions themselves remain clear, and usually at least one or two of us can remember them well enough to recite back, as needed, usually with a lot of finger-pointing and furrowed brows. One time we even wrote the no-running rule on the gym chalkboard, and even though no one challenged this, on the grounds that writing is secondary to speech, everyone still understood why it was scrawled there, by which I mean everyone knew exactly who should read it the most – wow, does every game have that player? Incorrigible.
Bottom line: accountability is down to the sincerity and respect offered to each player by every other player who decides to participate. As an aside, the need for a referee, an arbiter, is all the more clear when the stakes are as high as bragging rights and free beer. But, even as we play for fun, the rules exist or else the game, as such, does not. (On that note, I find a lot of players just don’t like Walking Football and would rather play with running, and that’s fine, too: it’s their decision, and plenty of other like-minded players keep both games afloat. I find the Walking game amplifies decision-making, so maybe this feature just appeals to me. And I still play traditional football, too.) My broader point is that any one person must decide to accept what has been defined and, likewise, any group of people must reach a consensus. Shared meaning matters because, otherwise, as I say, we don’t have a game, or else we have a very different one, or we just have anarchy. But whether that person, alone, or the group, altogether, searching for a way to decide upon meaning, has the patience to delve down the rabbit hole… well, yes, context does indeed matter – both usage and etymology. I’ve said and written as much, myself, for a long time. So, in light of all this, I hope I’ve gathered a little something of Derrida’s différance. I’m still learning.
Another illustration: in my teaching, I occasionally introduced this matter of contextual meaning by offering students a list of synonyms: “slim,” “slender,” “skinny,” “thin,” “narrow.” Each word, of course, has its own particular meaning. “If they meant the same thing,” I offered, “then we’d use the same word,” so just what explains the need for all these synonyms? Well, students would say, there are lots of different things out there that possess or demonstrate these various adjectives (my word, not theirs), so we’ve come up with words to describe them (and I think that’s a charitable “we,” like the Royal “We.”) As the discussion proceeded, I might ask which of these words typically describe human traits versus those – leaving aside metaphors – that typically do not. Next, which words typically possess positive connotations, and which negative, or neutral? And, as it pertains to the personification metaphors, which words are more easily envisioned versus those that really stretch the imagination, or even credibility?
Eventually, I would shift from ontology to epistemology, posing the questions at the heart of my intention: For any of the previous questions about these synonyms, how do you know what you’re talking about? For what each of these words could mean, where have your assurances come from? Of course, the most frequent reply to that question was “the dictionary,” followed by “my parents” or “books I’ve read,” or “just everyday experience, listening and talking to people.” Occasionally, the reply was something akin to “Who cares… it just means what it means, doesn’t it?” In every reply, though, one common thread was detectable: the involvement of other people as part of the meaning-making process. Fair enough, we can’t all be Thoreau.
One more example: when is “red” no longer red but perhaps orange or purple? Well, for one thing, if you’re colour blind, the question means something entirely different, which I say not flippantly but again to illustrate how important dialogue and community are to deciding what something means. For another thing, we might wish to ask, in keeping with context-dependency, “Why even ask?” Again, this is not flippant or dismissive but practical: when does it matter so that we distinctly need to identify the colour red? Where a group of people might face the question over what is red versus what is orange or purple, we might expect some kind of discussion to ensue. And, whether asking as part of such a group or as a hermit, alone, I submit that a decision about what is “red” is ultimately one person’s to make: “Red is this,” or “This is red,” or even, “Gosh, I still can’t really decide.” Even a coerced decision we can still attribute to the one who forces the issue – one person has decided on behalf of another, however benignly or violently: might makes right, or red, as it were.
Coercion introduces a political consideration about whose authority or power has influence, similar to needing referees on account of those players who decide to run. The point, for now, is simply that a decision over what something means to a person is ultimately made by a person, leaving others to deal with that decision on their own terms in whatever way. But other people are part of the meaning-making process, even passively, or else I wouldn’t need language to begin with since the rest of you wouldn’t trouble me by existing. Not to worry, by the way, I appreciate you reading this far. From what I understand (and I am convinced I must learn more, being no avid student of either postmodernism or Derrida), his observation of différance either discounts or else offers no account for the arbitrary decision-making that people might make when they decide they’ve had enough. People tend to land somewhere in a community, and it’s the rare person who lives and plays wholly and uncompromisingly by their own rules. However, the fact that he felt différance was worth the effort to publicise and explain to the rest of us does reflect an arbitrary decision on the part of Derrida and says something about him.
So this is where I have more fundamental trouble understanding Derrida and différance – the very notion of “different,” as in, in what world could there not be an arbiter? Even a life alone would face endless decisions: what to eat, where to go, when to sleep, and so forth. From such musing – speaking of rabbit holes – I was led to reading about another philosopher named Jacques, this one Rancière, and what he calls the axiom of equality. In pure nutshell form, I take this to mean that no (socio-political) inequality exists until it has been claimed to exist – and note that it’s claimed in a boat-rocking kind of way, what the kids these days are calling “disruptive.” The upshot is that equality, itself, can only ever be theoretical because someone somewhere inevitably is and always will be marginalised by the arbitrary decisions of cultural hegemony. Still learning.
Back to the Walking Football analogy: if the rabbit hole of defining a word in the context of those that surround it, and then having to define, even further, all those words, and on and on, and experience is inexhaustible, and what’s the point, and lift a glass to nihilism… if that kind of limitless indefinite deconstructive search-and-compare lies at the heart of what is different, then maybe Derrida just found it difficult to reach agreement with other people. It stands to reason that, if he played Walking Football, Derrida might be the worst cheater on the floor, continually running when he should be walking, then denying it just the same as he tried to gain advantage. Maybe, fed up being called a cheater, he would take his ball and go home to play by himself, where no one could say he was wrong. Being alone, who would be there, whether as an obedient player or as a sneaky one, to challenge him?
In fact, maybe that’s why he chose to return to play the next day – for all the arguing, he enjoyed the game, or the attention, or the camaraderie, or the exercise, or whatever, more than being accused of cheating. I wonder if, perhaps, in the great game of philosophy football, he would have been the only rival to strike real fear in Diogenes – I mean awe & respect kind of fear, just to clarify, and I mean if they had lived at the same time. It’s hard to know about Diogenes since nothing he wrote down ever survived, and these days, I doubt more than a few can recall any of whatever he said, besides that lamp-carrying honesty thing. (We should all have such good spirit when it comes to our first principles.) Anyway, I think Diogenes played for Wimbledon.
Well, I am being unkind to Derrida. Evidently, he was a kinder person by nature than I have let on, as well as an advocate for all voices, all people. And the professional care, the uncompromising expertise he took to convey his ideas, to trouble himself with delving down the rabbit hole so arbitrarily – to go down at all but, moreover, to go so far when he might, just the same, have decided to halt. Delve as far as you like, but accept responsibility for your decision, every time. In that respect, how does Derrida differ from any other person facing decisions? Did he have still other motivations? No player who kicks a football is deliberately playing to lose, not unless they have been coerced by someone else to do so. On the other hand, for all I know, maybe what Derrida called red I would call blue. Be careful not to pass the ball to the wrong team! (By the way, in sport, dynasties are remembered precisely because they eventually come to an end.)
Was Derrida no less accountable and open to scrutiny than you, or me, or anybody else? To suggest that a word only makes sense based on how it differs from those around it is no less arbitrary than its reciprocal suggestion: a word only makes sense based on how it describes only what it describes. Half-full / half-empty, six of one… Two sides of the same coin are still the same coin. Alternatively, who put him up to all this? Meanwhile, on his own, surely Derrida had it within himself, as people do when they reach a point, simply to say, “Here is enough. I decide to stop here. For me, [the item in question] means this.” If that doesn’t ring true and sound like him, well, I’d say that can be just as telling of his character; I heard it suggested, once, how we can be helped in knowing something by what it is not. So, fine – for Derrida to stake the claim called différance, I’m willing to concede him that moment. We all land somewhere, and we’re all hardly alike, even when we’re alike.
We are, each and every one of us, individual. But together we comprise something just as dynamic on a larger scale – one might construe us societally, or perhaps historically, anthropologically, or on and on, in whatever way through whichever lens. For me, différance appears an attempt to speak for all about all, prescriptively. A grand stab at philosophy, no question, and that’s the beauty of the equality of philosophy, with thanks to Rancière: we all have a part to play and a right to respond. For the time being, as I have understood Derrida and his thinking, and I willingly stand to be instructed further, différance strikes me as ironic, being an advocacy for the dynamic development of people and language and culture that self-assuredly asserts its own accuracy. That is not an uncommon indictment of postmodernists. What’s more, it is ohh, so human.
“The needs of the economy and our society are changing and therefore you need to have a learning system that fits the purpose, and that purpose is constantly shifting.”
So said Anthony Mackay, CEO of the Centre for Strategic Education (CSE) in Australia, during an interview with Tracy Sherlock from The Vancouver Sun. Mr Mackay was at SFU’s Wosk Centre for Dialogue in Vancouver on January 29, 2015, facilitating a forum about the changing face of education. Although links to the forum’s webcast archive and Sherlock’s interview are now inactive, I did save a copy of the interview text at the time, posted here beneath this essay. Tracy Sherlock has since told me that she doesn’t know why the interview’s links have been disconnected (e-mail communication, January 27, 2017). Nonetheless, there remains ample on-line and pdf-print promotion and coverage of the event.
The forum and the interview were first brought to my attention via e-mail, shared by an enthusiastic colleague who hoped to spur discussion, which is altogether not an uncommon thing for teachers. Originally, I wrote distinct yet connected responses to a series of quotations from Mr Mackay’s interview. Here, some thirty-two months later, I’ve edited things into a more fluid essay although, substantively, my thoughts remain unchanged. Regrettably, so does the bigger picture.
For starters, Mr Mackay’s remark tips his hand – and that of the CSE – when he precedes society with economy. Related news reports make the idea somewhat more plausible: that of a new curriculum “…addressing a chronic skills shortage in one of the few areas of the Canadian economy that is doing well” (Silcoff). Meanwhile, in Sherlock’s interview [posted below this essay], Mr Mackay concludes by invoking “the business community,” “the economy of the future,” and employers’ confidence. Make no mistake, Mr Mackay is as ideological as anyone out there, including me and you and everybody, and I credit him for being obvious. On the other hand, he plays into the hands of the grand voice of public educators, perhaps willfully yet in a way that strikes me as disingenuous, couched in language so positive that you’re a sinner to challenge him. Very well, I accept the challenge.
Whatever “purpose” of education Mr Mackay has in mind, here, it’s necessarily more specific unto itself than to any single student’s interests or passions. In other words, as I take his portrayal, some student somewhere is a square peg about to be shown a round hole. Yet this so-called purpose is also “constantly shifting,” so perhaps these are triangular or star-shaped holes, or whatever, as time passes by.
Enter “discovery learning” – by the way, are we in classrooms, or out-and-about on some experiential trip? – and the teacher says only what the problem is, leaving the students to, well, discover the rest. I can see where it has a place; how it enables learning seems obvious enough since we learn by doing – teach someone to fish, and all. But when it comes to deciding which fish to throw back, or how many fish are enough when you don’t have a fridge to store them in before they rot and attract hungry bears… when it comes to deciding what’s more versus less important, those minutiae of mastery, it’s not always as easy as an aphorism or a live-stream video conference. Where it’s more hands-off from the teacher, in order to accommodate the student, discovery learning seems to me better suited to learners well past any novice stage. And if the teacher said, “Sorry, that’s not discovery learning,” would the students remain motivated? Some would; others most certainly would not: their problem, or the teacher’s? When both the teacher and the students say, “We really do need to follow my lead just now,” which party needs to compromise for the other, and to what extent? Teaching and learning ought to be a negotiation, yes, but never an adversarial one! In the case of “discovery learning,” I wonder whether “teacher” is even the right title anymore.
In any case, Mr Mackay appears guilty of placing the cart before the horse where it comes to educating students according to some systemic purpose. I’ve got more to say about this particular detail, what he calls “personalization.” For now, it’s worth setting some foundation: Ken Osborne wrote a book called Education, which I would recommend as a good basis for challenging Mr Mackay’s remarks from this interview.
That Osborne’s book was published in 1999 I think serves my point, which is to say that discernment, critical thinking, effective communication, and other such lauded 21st century skills were in style long before the impending obscurity of the new millennium. They have always offered that hedge against uncertainty. People always have and always will need to think and listen and speak and read, and teachers can rely on this. Let’s not ever lose sight of literacy of any sort, in any venue. Which reminds me…
“Isn’t that tough when we don’t know what the jobs of the future will be?”
I must be frank and admit… this notion of unimaginable jobs of the future never resonated with me. I don’t remember when I first heard it, or even who coined it, but way-back-whenever, it instantly struck me as either amazing clairvoyance or patent nonsense. I’ve heard it uttered umpteen times by local school administrators, and visiting Ministry staff, and various politicians promoting the latest new curriculum. The idea is widely familiar to most people in education these days: jobs of the future, a future we can’t even imagine! Wow!
Well, if the unimaginable future puzzles even the government, then good lord! What hope, the rest of us? And if the future is so unimaginable, how are we even certain to head any direction at all? When you’re lost in the wilderness, the advice is to stay put and wait for rescue. On the other hand, staying put doesn’t seem appropriate to this discussion; education does need to adapt and evolve, so we should periodically review and revise curricula. But what of this word unimaginable?
For months prior to its launch, proponents of BC’s new curriculum clarified – although, really, they admonished – that learning is, among other things, no longer about fingers quaintly turning the pages of outmoded textbooks. To paraphrase the cliché, that ship didn’t just sail, it sank. No need to worry, though. All aboard were saved thanks to new PDFs – er, I mean PFDs, personal flotation devices – er, um, that is to say personal flotation e-devices, the latest MOBI-equipped e-readers, to be precise. As for coming to know things (you know, the whole reason behind “reading” and all…), well, we have Google and the Internet for everything you ever did, or didn’t, need to know, not to mention a 24/7 news cycle, all available at the click of a trackpad. It’s the 21st century, and learning has reserved passage aboard a newer, better, uber-modern cruise ship where students recline in ergonomic deck chairs, their fingertips sliding across Smart screens like shuffleboard pucks. Welcome aboard! And did I mention? Technology is no mere Unsinkable Ship, it’s Sustainable too, saving forests of trees from the printing press (at a gigawatt-cost of electricity, mind, but let’s not pack too much baggage on this voyage).
Sorry, yes, that’s all a little facetious, and I confess to swiping as broadly and inaccurately as calling the future “unimaginable.” More to the point: for heaven’s sake, if we aren’t able to imagine the future, how on earth do we prepare anybody for it? Looking back, we should probably excuse Harland & Wolff, too – evidently, they knew nothing of icebergs. Except that they did know, just as Captain Smith was supposed to know how to avoid them.
But time and tide wait for no one which, as I gather, is how anything unimaginable arose in the first place. Very well, if we’re compelled toward the unknowable future, a cruise aboard the good ship Technology at least sounds pleasant. And if e-PFDs can save me weeks of exhausting time-consuming annoying life-skills practice – you know, like swimming lessons – so much the better. Who’s honestly got time for all that practical life-skills crap, anyway, particularly when technology can look after it for you – you know, like GPS.
If the 21st century tide is rising so rapidly that it’s literally unimaginable (I mean apart from being certain that we’re done with books), then I guess we’re wise to embrace this urgent… what is it, an alert? a prognostication? guesswork? Well, whatever it is, thank you, Whoever You Are, for such vivid foresight – hey, that’s another thing: who exactly receives the credit for guiding this voyage? Who’s our Captain aboard this cruise ship? Tech Departments might pilot the helm, or tend the engine room, but who’s the navigator charting our course to future ports of call? What’s our destination? Even the most desperate voyage has a destination; I wouldn’t even think a ship gets built unless it’s needed. Loosen your collars, everybody, it’s about to get teleological in here.
Q: What destination, good ship Technology?
A: The unknowable future…
Land?-ho! The Not-Quite-Yet-Discovered Country… hmm, would that be 21st century purgatory? Forgive my Hamlet reference – it’s from a mere book.
To comprehend the future, let’s consider the past. History can be instructive. Remember that apocryphal bit of historical nonsense, that Christopher Columbus “discovered America,” as if the entire North American continent lay indecipherably upon the planet, unbeknownst to Earthlings? (Or maybe you’re a 21st century zealot who only reads blogs and Twitter, I don’t know.) Faulty history aside, we can say that Columbus had an ambitious thesis, a western shipping route to Asia, without which he’d never have persuaded his political sponsors to back the attempt. You know what else we can say about Columbus, on top of his thesis? He also had navigation and seafaring skill, an established reputation that enabled him to approach his sponsors in the first place. As a man with a plan to chart the uncharted, even so Columbus possessed some means of measuring his progress and finding his way. In that respect, it might be more accurate to say he earned his small fleet of well-equipped ships. What history then unfolded tells its own tale, the point here simply that Columbus may not have had accurate charts, but he also didn’t set sail, clueless, to discover the unimaginable in a void of invisible nowhere.
But what void confronts us? Do we really have no clue what to expect? To hear the likes of Mackay tell it, with technological innovation this rapid, this influential, we’re going to need all hands on deck, all eyes trained forward, toward… what exactly? Why is the future so unimaginable? Here’s a theory of my very own: it’s not.
Discovering in the void might better describe Galileo, say, or Kepler, who against the mainstream recharted a mischarted solar system along with the physics that describe it. Where they disagreed over detail such as ocean tides (as I gather, Kepler was right), they each had pretty stable Copernican paradigms, mediated as much by their own empirical data as by education. Staring into the great void, perhaps these astronomers didn’t always recognise exactly what they saw, but they still had enough of the right stuff to interpret it. Again, the point here is not about reaching outcomes so much as holding a steady course. Galileo pilots himself against the political current and is historically vindicated on account of his curious mix of technological proficiency, field expertise, and persistent vision. For all that he was unable to predict or fully understand, Galileo still seemed to know where he was going.
I suppose if anyone might be accused of launching speculative missions into the great void of invisible nowhere, it would be NASA, but even there is clarity. Just to name a few: Pioneer, Apollo, Voyager, Hubble – missions with destinations, destinies, and legacies. Meanwhile, up in the middle of Nowhere, people now live in the International Space Station. NASA doesn’t launch people into space willy-nilly. It all happens, as it should, and as it must, in a context with articulated objectives. Such accomplishments do not arise because the future is unimaginable; on the contrary, they arise precisely because people are able to imagine the future.
Which brings me back to Mr Mackay and the government’s forum on education. It’s not accurate for me to pit one side against another when we all want students to succeed. If I’ve belaboured the point here, it’s because our task concerns young people, in loco parentis. Selling those efforts as some blind adventure seems, to me, the height of irresponsibility wrapped in an audacious marketing campaign disguised as an inevitable future, a ship setting sail so climb aboard, and hurry! Yes, I see where urgency is borne of rapid innovation, technological advancement made obsolete mere weeks or months later. For some, I know that’s thrilling. For me, it’s more like the America’s Cup race in a typhoon: thanks, but no thanks, I’ll tarry ashore a while longer, in no rush to head for open sea, not even aboard a vaunted ocean liner.
We simply mustn’t be so eager to journey into the unknown without objectives and a plan, not even accompanied as we are by machines that contain microprocessors, which is all “technology” seems to imply nowadays. There’s the respect that makes calamity of downloading the latest tablet apps, or what-have-you, just because the technology exists to make it available. How many times have teachers said the issue is not technology per se so much as knowing how best to use it? Teleology, remember? By the way, since we’re on the subject, what is the meaning of life? One theme seems consistent: the ambition of human endeavour. Sharpen weapon, kill beast. Discover fire, cook beast! Discover agriculture, domesticate beast. Realise surplus, and follows world-spanning conquest that eventually reaches stars.
Look, if learning is no longer about fingers quaintly turning the pages of outmoded textbooks, then fine. I still have my doubts – I’ve long said vinyl sounds better – but let that go. Can we please just drop the bandwagoning and sloganeering, and get more specific? By now, I’ve grown so weary of “the unimaginable future” as to give it the dreaded eye-roll. And if I’m a teenaged student, as much as I might be thrilled by inventing jobs of the future, I probably need to get to know me, too, what I’m all about.
In truth, educators do have one specific aim – personalized learning – which increasingly has come into curricular focus. Personalization raises some contentious issues, not least of which is sufficient funding since the need for individualized attention requires more time and resources per student. Nevertheless, it’s a strategy that I’ve found positive, and I agree it’s worth pursuing. That brings me back to Ken Osborne. One of the best lessons I gathered from his book was the practicality of meeting individuals wherever they reside as compared to determining broader needs and asking individuals to meet expectations.
Briefly, the debate presents itself as follows…
Side ‘A’ would determine communal needs and educate students to fill the roles
In my humble opinion, this is an eventual move toward social engineering and a return to unpleasant historical precedent. Know your history, everybody.
Side ‘B’ would assess an individual’s needs and educate a student to fulfil personal potential
In my humble opinion, this is a course that educators claim to follow everyday, especially these days, and one that they would do well to continue pursuing in earnest.
In my experience, students find collective learning models less relevant and less authentic than the inherent incentives found in personalized approaches that engender esteem and respect. Essentially, when we educate individuals, we leave them room to sort themselves out and accord them due respect for their ways and means along the way. In return, each person is able to grasp the value of personal responsibility. Just as importantly, the opportunity for self-actualisation is now not only unfettered but facilitated by school curricula, which I suspect is what was intended by all the “unimaginable” bluster. The communal roles from Osborne’s Side ‘A’ can still be filled, now by sheer numbers from the talent pool rather than by pre-conceived aims to sculpt square pegs for round holes.
Where I opened this essay with Anthony Mackay’s purposeful call to link business and education, I’ve been commenting as a professional educator because that is my field, so that is my purview. In fairness to government, I’ve found that more recent curricular promotion perhaps hints at reversing course from the murk of the “unimaginable” future by emphasizing, instead, more proactive talk of skills and empowerment. Even so, a different posture remains (touched upon in Katie Hyslop’s reporting of the forum and its participants, and a fairly common discursive thread in education in its own right) that implicitly conflates the aims of education and business, and even the arts. Curricular draft work distinguishes the “world of work” from details that otherwise describe British Columbia’s “educated citizen” (p. 2). Both Ontario and Alberta’s curricular plans have developed comparably to BC’s, noting employers’ rising expectations that “a human capital plan” will address our ever-changing “world of work” (p. 5) – it’s as if school’s industrial role were a given. Credit where it’s due, I suppose: they proceed from a vision towards a destination. And being neither an economist nor an industrialist, I don’t aim to question the broader need for business, entrepreneurship, or a healthy economy. Everybody needs to eat.
What I am is a professional educator, and that means I have been carefully and intentionally trained and accredited alongside my colleagues to envision, on behalf of all, what is best for students. So when I read a claim like Mr Mackay’s, that “what business wants in terms of the graduate is exactly what educators want in terms of the whole person,” I am wary that his educational vision and leadership are yielding our judgment to interests, such as commerce and industry, that lie beyond the immediately appropriate interests of students. Anthony Mackay demonstrates what is, for me, the greatest failing in education: leaders whose faulty vision makes impossible the very aims they set out to reach. (By the by, I’ve also watched such leadership condemn brilliant teaching that reaches those aims.) As much as a blanket statement, Mr Mackay’s is an unfounded one, and I could hardly do better to find an example of begging the question. If Mr Mackay is captain of the ship, then maybe responsible educators should be reading Herman Wouk – one last book, sorry, couldn’t resist.
Education is about empowering individuals to make their own decisions, and any way you slice it, individuals making decisions is how society diversifies itself. That includes diversifying the economy, as compared to the other way around (the economy differentiating individuals). Some people are inevitably more influential than others. All the more reason then for everybody, from captains of industry on down, to learn to accept responsibility for respecting an individual’s space, even while everybody learns to decide what course to ply for themselves. Personalized learning is the way to go as far as resources can be distributed, so leave that to the trained professional educators who are entrusted with the task, who are experts at reading the charts, spotting the hazards, and navigating the course, even through a void. Expertise is a headlight, or whatever those are called aboard ships, so where objectives require particular expertise, let us be led by qualified experts.
And stop with the nonsense. No unimaginable future “world of work” should be the aim of students to discover while their teachers tag along like tour guides. Anyway, I thought the whole Columbus “discovery” thing had helped us to amend that sort of thinking, or maybe I was wrong. Or maybe the wrong people decided to ignore history and spend their time, instead, staring at something they convinced themselves was impossible to see.
“The learning partnership has got to go beyond the partnership of young person and family, teacher and school, to the community and supportive agencies.” – TONY MACKAY, CEO, CENTRE FOR STRATEGIC EDUCATION IN AUSTRALIA
Tony Mackay, CEO at the Centre for Strategic Education in Australia, was in Vancouver recently, facilitating a forum about changing the education system to make it more flexible and personalized. He spoke about the rapidly changing world and what it means for education.
Q Why does the education system need to change?
A The needs of the economy and our society are changing and therefore you need to have a learning system that fits the purpose, and that purpose is constantly shifting. So it’s not just a matter of saying we can reach a particular level and we’ll be OK, because you’ve got such a dynamic global context that you have a compelling case that says we will never be able to ensure our ongoing level of economic and social prosperity unless we have a learning system that can deliver young people who are ready — ready for further education, ready for the workforce, ready for a global context. That’s the compelling case for change.
Q Isn’t that tough when we don’t know what the jobs of the future will be?
A In the past we knew what the skill set was and we could prepare young people for specialization in particular jobs. Now we’re talking about skill sets that include creativity, problem solving, collaboration, and the global competence to be flexible and to have cultural understanding. It’s not either or, it’s both and — you need fantastic learning and brilliant learning in the domains, which we know are fundamental, but you also need additional skills that increasingly focus on emotional and social, personal and inter-personal, and perseverance and enterprising spirit. And we’re not saying we just want that for some kids, we want to ensure that all young people graduate with that skill set. And we know they’re going to have to effectively “learn” a living — they’re going to have to keep on learning in order to have the kind of life that they want and that we’re going to need to have an economy that thrives. I believe that’s a pretty compelling case for change.
Q How do you teach flexibility?
A When I think about the conditions for quality learning, it’s pretty clear that you need to be in an environment where not only are you feeling emotionally positive, you are being challenged — there’s that sense that you are challenged to push yourself beyond a level of comfort, but not so much that it generates anxiety and it translates into a lack of success and a feeling of failure that creates blockages to learning. You need to be working with others at the same time — the social nature of learning is essential. When you’re working with others on a common problem that is real and you have to work as a team and be collaborative. You have to know how to show your levels of performance as an individual and as a group. You can’t do any of that sort of stuff as you are learning together without developing flexibility and being adaptive. If you don’t adapt to the kind of environment that is uncertain and volatile, then you’re not going to thrive.
Q What does the science of learning tell us?
A We now know more about the science of learning than ever before and the question is are we translating that into our teaching and learning programs? It’s not just deeper learning in the disciplines, but we want more powerful learning in those 21st-century skills we talked about. That means we have to know more than ever before about the emotions of learning and how to engage young people and how young people can encourage themselves to self-regulate their learning.
The truth is that education is increasingly about personalization. How do you make sure that an individual is being encouraged in their own learning path? How do we make sure we’re tapping into their strengths and their qualities? In the end, that passion and that success in whatever endeavour is what will make them more productive and frankly, happier.
Q But how do you change an entire education system?
A Once you learn what practice is done and is successful, how do you spread that practice in a school system so it’s not just pockets of excellence, but you’ve actually got an innovation strategy that helps you to spread new and emerging practice that’s powerful? You’re doing this all in the context of a rapidly changing environment, which is why you need those skills like flexibility and creativity. The learning partnership has got to go beyond the partnership of young person and family, teacher and school, to the community and supportive agencies. If we don’t get the business community into this call to action for lifelong learning even further, we are not going to be able to get there. In the end, we are all interdependent. The economy of the future — and we’re talking about tomorrow — is going to require young people with the knowledge, skills and dispositions that employers are confident about and can build on.