Development vs Winning: Actually, There Is No Such Thing


Setting aside corruption – throwing the game – which has no place in this discussion, I submit that nobody deliberately plays to lose.

Specifically, I’m talking about football, less commonly known as soccer, though perhaps this discussion applies to many other sports as well. But, as a player and coach, football is the beautiful game that I know best, so here goes.

Playing football, we would anticipate that the team making the fewest mistakes ought to win – the fewest mistakes both in and out of possession, from kick-off until full-time. If so, then consistent quality performances are key because these should result in more opportunities to earn a win and prevent a loss. What’s more, as the reward for winning grows more lucrative and the stakes are raised, players must all the more learn to produce that “consistent quality performance” on demand, under whatever pressure: effective decisions, executed at the proper moments, skillfully, every time, or at least as frequently as possible. Developing this “quality performance” consistency also demands that opponents earn their victories rather than being handed the result, unimpeded, because now they’re challenged to execute just as consistently, if not just as flawlessly. As I say, no one competes to lose.

So, what of development and winning in light of all this? Too often, for me, these two ideas are falsely conflated into sides of what is truly a non-existent – or, at least, a very ill-conceived – debate. As ends in themselves, development and winning are typically deemed incompatible. Further, winning is then often vilified, since winners produce losers, while development is commended for being inclusive. At that point, I find the debate often sidetracks into competition versus fun, another false dichotomy, but in any case, the parameters are so muddled as to render it all a meaningless waste of breath. For the sake of dispensing with the issue, I simply ask: why would we not reasonably expect to see fun in conjunction with competition? These are not oil and water, nor do they need to be, nor should they be deemed to be.

Football, the Game, can be played for fun, exhilaration, fitness, camaraderie, focus, perseverance, discipline, teamwork, all manner of virtues and benefits, yet all these on account of the very nature of the Game as a contest of opposition. And where one person finds things fun and enjoyable, another does not necessarily agree, yet who’s to say who is correct, if the Game has enabled all? All sorts of people find all sorts of fun in all sorts of things – who’s to say that finding competition to be fun is wrong, if only because it makes you squeamish? Just the same, if someone’s threshold for intense competitive drive is lower than another’s, can each still not enjoy playing with like-minded peers? In fact, just for instance, this is exactly why various youth and adult leagues categorize levels of play into (for ease of this discussion) gold, silver, and bronze tiers. Everyone must learn to play, and development (to whatever degree) will occur as they go. That implicates teammates, the quality of coaching, and other factors relating to a team or league’s motives for playing in the first place (i.e. gold vs silver vs bronze). Motive, however, does not change the nature of the Game, itself, or the nature of effective learning, development, coaching, and teaching.

As I see it, the issue is not Development for its Own Sake versus Winning for its Own Sake or even Development for its Own Sake versus Development in order to Win. The issue is Development and Learning as a concept, altogether, period, because how else could you learn to play? And the more you play, the more you develop. Whether that development is good or poor is down to context, and a separate issue.

And when the arguments start, what’s really being debated, it seems to me, is how any one person simply wants to be “right” and demand that everyone else agree with what constitutes “successful” participation in the Game. Ironically, it’s a territorial argument over ideology. But to win an egotistical war suggests to me that we might better spend our efforts re-evaluating our culture and how we wish to treat other people.

Fair enough, people want to be “right.” We all have egos. But can we at least offer some basis from which to claim what the word “successful” can mean? So here goes.

Since losing a match always remains a possibility, no matter how consistent our quality performance might be, we ought to measure “success” as the degree to which a player or team has developed that consistent quality of performance (process) over time, at their corresponding level and motive for play, regardless of winning (product).

**I’ll specify, as I did above, that where wins are lucrative – such as in professional play – the stakes grow higher, and different debates will ensue about what “success” means. Yet that’s a commercial issue, relating to development and learning on the basis of people’s patience and tolerance for financial pleasure or pain: in other words, the two issues are not inherently related but coincidental: a crowd of supporters or sponsors is willing to pay to back the team for a season.**

For the Game, itself, we must let winning take care of itself because players control only what they are able to control, under conditions that also include the pitch, the ball, the referee, the weather, health, fitness, and so forth. So what can we measure? Measurements ought to fall under player and team control, e.g. shots at goal, completed passes, tackles won, saves made, etc. Far from counteracting the importance of winning, such consistent measurements of quality performance provide feedback: if our pass completion around the penalty box is 90%, then maybe we don’t score because our shooting is infrequent or inaccurate. One might even argue that the statistical measurements we gather are less important than the ones we’ve overlooked.
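To make that notion of player-controlled measurement concrete, here is a minimal sketch of how a coach might track such statistics over a run of matches. Every statistic, number, and name in it is invented for illustration – the Game doesn’t prescribe which details a team should record – and the point is only the shape of the exercise: compute a rate per match, then look at how steady that rate stays.

```python
from dataclasses import dataclass

@dataclass
class MatchStats:
    """Hypothetical per-match counts a coach might record."""
    passes_attempted: int
    passes_completed: int
    shots_at_goal: int
    shots_on_target: int

def pass_completion(m: MatchStats) -> float:
    """Share of attempted passes that reached a teammate."""
    return m.passes_completed / m.passes_attempted if m.passes_attempted else 0.0

def spread(values: list[float]) -> float:
    """Standard deviation across matches: lower spread = more consistent."""
    mean = sum(values) / len(values)
    return (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5

# Three invented matches, purely for illustration.
season = [
    MatchStats(420, 378, 14, 5),
    MatchStats(390, 344, 9, 2),
    MatchStats(455, 419, 16, 7),
]

rates = [pass_completion(m) for m in season]
print(f"mean pass completion: {sum(rates) / len(rates):.1%}")  # ~90%, per the example above
print(f"match-to-match spread: {spread(rates):.3f}")           # the consistency figure
```

The spread figure, not the average, is what carries the essay’s meaning: two teams can share the same mean pass completion yet differ wildly in how reliably they produce it, and it is that reliability we are calling “success.”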

In any case, successful players and successful teams identify strong and weak areas by regularly measuring consistent quality across a range of performance details, and they develop each area for consistency – which we anticipate will translate into more wins – because consistent quality performances usually translate into what can be measured as an “ongoing success.” Success now defines a degree of purposeful, committed, consistent hard work, which makes for more focused, more effective training. Developmentally, the more successful you are, the more often you can theoretically win – but if your opponents also train and measure, and respond better than you do, then guess what? That’s called competition.

Development and winning not only can but already do co-exist. And they always have. It’s people who separate them, falsely, perhaps because they want to win more than they want to earn wins – or, worse, perhaps because they merely want to win a territorial argument about development vs winning that never existed before someone’s ego dreamt it up.

Beyond on-field training and competing, development and learning should cover a range of areas that affect yet lie beyond the Game, e.g. health, fitness, nutrition, goal setting, mental preparation, personal responsibility. Coaches ought to take players beyond the Game, teaching them how to train, how to contribute to a team, how to compete at higher levels of skill and intensity, how to manage the dynamics and emotions of competition, and how to conduct themselves with personal integrity in all respects. Of course, the Game is included within the scope of these matters because that’s why we’re a team in the first place. And the range of these inclusions will comprise a more holistic football program. We implement and evaluate that program as we go, or we ought to.

Effective programs inevitably reveal the crux of commitment, either thanks to people’s dedication or on account of their inconsistency. Effective programs encourage trust and a shared pursuit of common goals. Where trust and commitment are maintained consistently and respectfully, a team and its members learn to measure quality and respond consistently, i.e. successfully. Such programs require time, discipline, and patience to learn, but the degree to which participants buy into the philosophy is met with concomitant developmental consistency, and again, one can expect winning to result more often than not, relative to the quality of the opposition. Likewise, individual people can take credit for this-or-that achievement only relative to their teammates, who are also active participants in the program.

Active participation should find team members applying complementary strengths by filling key roles on the path to team success. Individual contributions accumulate, and if these have been consistently defined by common goals and measured for consistent quality, “success” is more likely because people can envision it more clearly and pursue it more meaningfully.

Opponents, especially of equal or slightly higher abilities, likewise play a key role in a team’s pursuit of success since measuring consistent quality performances against them is, in one sense, what the Game – and what sport – is all about. Active involvement in a program unites a team, preparing everyone for more advanced challenges. Occasionally, a teammate might advance to more elite programs, and when a team member grows beyond the scope of the program, that is a team success that all of us can share.

Lest We Forget

I am indebted to three of my students – Maddy, Kira, and Shannon – for collaborating to write this essay, which we formally read aloud during a school Remembrance Day ceremony in 2013. As I told them at the time, our planning sessions together were as good as any committee-style work I’ve ever done – everyone thoughtful, respectful, contributing, and focused – and I remain as proud of our group effort today as I was back then.

I have only slightly revised our essay, for fluidity, to suit a print format, but have endeavoured to avoid any substantive changes.

One hundred years ago, the Dominion of Canada’s soldiers fought in the Great War. By November 1917, the Canadian Expeditionary Force had been in Europe for over three years, and it would stay one more, its soldiers sacrificing their safety and their lives for their country on behalf of the British Empire.

What is sacrifice? Sacrifice is soldiers seeing past terror on the battlefield, placing themselves into vulnerability, and giving themselves on our behalf. Each year on November 11, Remembrance Day in Canada, we recognize our soldiers by wearing a poppy over our hearts. Why we wear a poppy is perhaps well known; how the symbolic pin has remained potent since its adoption in 1921 is perhaps less well considered. A century later, in such a different world, the relevance of the poppy as a way of honouring the sacrifice of wartime warrants reflection.

As time passes, the poppy’s symbolism, in and of itself, remains the same. We change – people, culture – and inevitably, as we change, our relationship with the poppy changes, too, however much or little. The poppy, the same symbol, is different for those who feel firsthand the costs of war, so many people separated, harmed, and displaced, so many lives lost. Pains of loss are felt most intensely when they occur, by those who are closest to the people involved. For those of us with no direct wartime experience, what we feel and know matters, yet it also differs. To activate a more complete appreciation, one meaningful place to which we might turn is poetry. During World War I, poetry was a common means for those with direct wartime experience to share, and to cope.

For the lover in the poem, “To His Love,” by Ivor Gurney, one particular soldier’s death has wiped out his lover’s dreams for a comfortable future. Experiencing the fresh pains of loss, she could not possibly forget her soldier or his sacrifice. The poppy we wear both honours his sacrifice and “[hides] that red wet thing,” her loss. But, because of our distance, our poppy does not hold the same raw pain as it does for her, for those who have so immediately lost their loved ones. So, if our poppies do not hold that same raw pain, why do we continue to wear them?

The poppy stays the same because each soldier’s death remains. Again, from “To His Love,” Gurney writes, “You would not know him now…” Generations removed, do we remember who this soldier was? Would we recognize him on the street? No. “But still he died.” We may find it difficult to assess the significance of his death, here in our world, far removed by time and distance. But let us appreciate, let us remember, in that moment of a soldier’s death, how he died: “with due regard for decent taste.” A soldier dies with dignity, for his own sake, because that is all he has. He is a small blip in the universe. “But still he died,” and that will forever be. And for all who loved him, and for all he loved, we remember.

We continue to honour our soldiers, and the sacrifices of all during wartime, because of the timelessness of that sacrifice, which each one makes. Even now, removed from war, we can find reasons to remember the deaths of soldiers because the memory that remains of each soldier embodies our definition of a hero: ordinary people facing extraordinary circumstances and giving themselves, perhaps giving their lives, on our behalf. When we wear their poppies, we let their deaths weigh on our present.

Let their deaths weigh on our present, and let their memories live in their stead.

“They shall grow not old, as we that are left grow old:

Age shall not weary them, nor the years condemn.

At the going down of the sun and in the morning

We will remember them.”

Wanted on the Voyage: Professional Teachers are Experts in their Field

“The needs of the economy and our society are changing and therefore you need to have a learning system that fits the purpose, and that purpose is constantly shifting.”

So said Anthony Mackay, CEO of the Centre for Strategic Education (CSE) in Australia, during an interview with Tracy Sherlock from The Vancouver Sun. Mr Mackay was at SFU’s Wosk Centre for Dialogue in Vancouver on January 29, 2015, facilitating a forum about the changing face of education. Although links to the forum’s webcast archive and Sherlock’s interview are now inactive, I did save a copy of the interview text at the time, posted here beneath this essay. Tracy Sherlock has since told me that she doesn’t know why the interview’s links have been disconnected (e-mail communication, January 27, 2017). Nonetheless, there remains ample online and PDF promotion and coverage of the event.

The forum and the interview were first brought to my attention via e-mail, shared by an enthusiastic colleague who hoped to spur discussion, which is altogether not an uncommon thing for teachers. Originally, I wrote distinct yet connected responses to a series of quotations from Mr Mackay’s interview. Here, some thirty-two months later, I’ve edited things into a more fluid essay although, substantively, my thoughts remain unchanged. Regrettably, so does the bigger picture.

For starters, Mr Mackay’s remark tips his hand – and that of the CSE – when he places economy before society. Related news reports make the idea somewhat more plausible, that of a new curriculum “…addressing a chronic skills shortage in one of the few areas of the Canadian economy that is doing well” (Silcoff)[1]. Meanwhile, in Sherlock’s interview [posted below this essay], Mr Mackay concludes by invoking “the business community,” “the economy of the future,” and employers’ confidence. Make no mistake, Mr Mackay is as ideological as anyone out there, including me and you and everybody, and I credit him for being obvious. On the other hand, he plays into the hands of the grand voice of public educators, perhaps willfully yet in a way that strikes me as disingenuous, couched in language so positive that you’re a sinner to challenge him. Very well, I accept the challenge.

Whatever “purpose” of education Mr Mackay has in mind, here, it’s necessarily more specific unto itself than to any single student’s interests or passions. In other words, as I take his portrayal, some student somewhere is a square peg about to be shown a round hole. Yet this so-called purpose is also “constantly shifting,” so perhaps these are triangular or star-shaped holes, or whatever, as time passes by.

Enter “discovery learning” – by the way, are we in classrooms, or out-and-about on some experiential trip? – and the teacher says only what the problem is, leaving the students to, well, discover the rest. I can see where it has a place; how it enables learning seems obvious enough since we learn by doing – teach someone to fish, and all. But when it comes to deciding which fish to throw back, or how many fish are enough when you don’t have a fridge to store them in before they rot and attract hungry bears… when it comes to deciding what’s more versus less important, those minutiae of mastery, it’s not always as easy as an aphorism or a live-stream video conference. Where it’s more hands-off from the teacher, in order to accommodate the student, discovery learning seems to me better suited to learners well past any novice stage. And if the teacher said, “Sorry, that’s not discovery learning,” would the students remain motivated? Some would; others most certainly would not: their problem, or the teacher’s? When both the teacher and the students say, “We really do need to follow my lead just now,” which party needs to compromise for the other, and to what extent? Teaching and learning ought to be a negotiation, yes, but never an adversarial one! In the case of “discovery learning,” I wonder whether “teacher” is even the right title anymore.

In any case, Mr Mackay appears guilty of placing the cart before the horse when it comes to educating students according to some systemic purpose. I’ve got more to say about this particular detail, what he calls “personalization.” For now, it’s worth setting some foundation: Ken Osborne wrote a book called Education, which I would recommend as a good basis for challenging Mr Mackay’s remarks from this interview.

That Osborne’s book was published in 1999, I think, serves my point, which is to say that discernment, critical thinking, effective communication, and other such lauded 21st century skills were in style long before the impending obscurity of the new millennium. They have always offered that hedge against uncertainty. People always have and always will need to think and listen and speak and read, and teachers can rely on this. Let’s not ever lose sight of literacy of any sort, in any venue. Which reminds me…

“Isn’t that tough when we don’t know what the jobs of the future will be?”

I must be frank and admit… this notion of unimaginable jobs of the future never resonated with me. I don’t remember when I first heard it, or even who coined it, but way-back-whenever, it instantly struck me as either amazing clairvoyance or patent nonsense. I’ve heard it uttered umpteen times by local school administrators, and visiting Ministry staff, and various politicians promoting the latest new curriculum. The idea is widely familiar to most people in education these days: jobs of the future, a future we can’t even imagine! Wow!

Well, if the unimaginable future puzzles even the government, then good lord! What hope, the rest of us? And if the future is so unimaginable, how are we even certain to head in any direction at all? When you’re lost in the wilderness, the advice is to stay put and wait for rescue. On the other hand, staying put doesn’t seem appropriate to this discussion; education does need to adapt and evolve, so we should periodically review and revise curricula. But what of this word unimaginable?

For months prior to its launch, proponents of BC’s new curriculum clarified – although, really, they admonished – that learning is, among other things, no longer about fingers quaintly turning the pages of outmoded textbooks. To paraphrase the cliché, that ship didn’t just sail, it sank. No need to worry, though. All aboard were saved thanks to new PDFs– er, I mean PFDs, personal flotation devices– er, um, that is to say personal flotation e-devices, the latest MOBI-equipped e-readers, to be precise. As for coming to know things (you know, the whole reason behind “reading” and all…), well, we have Google and the Internet for everything you ever did, or didn’t, need to know, not to mention a 24/7 news cycle, all available at the click of a trackpad. It’s the 21st century, and learning has reserved passage aboard a newer, better, uber-modern cruise ship where students recline in ergonomic deck chairs, their fingertips sliding across Smart screens like shuffleboard pucks. Welcome aboard! And did I mention? Technology is no mere Unsinkable Ship, it’s Sustainable too, saving forests of trees from the printing press (at a gigawatt-cost of electricity, mind, but let’s not pack too much baggage on this voyage).

Sorry, yes, that’s all a little facetious, and I confess to swiping as broadly and inaccurately as calling the future “unimaginable.” More to the point: for heaven’s sake, if we aren’t able to imagine the future, how on earth do we prepare anybody for it? Looking back, we should probably excuse Harland & Wolff, too – evidently, they knew nothing of icebergs. Except that they did know, just as Captain Smith was supposed to know how to avoid them.

But time and tide wait for no one, which, as I gather, is how anything unimaginable arose in the first place. Very well, if we’re compelled toward the unknowable future, a cruise aboard the good ship Technology at least sounds pleasant. And if e-PFDs can save me weeks of exhausting time-consuming annoying life-skills practice – you know, like swimming lessons – so much the better. Who’s honestly got time for all that practical life-skills crap, anyway, particularly when technology can look after it for you – you know, like GPS.

If the 21st century tide is rising so rapidly that it’s literally unimaginable (I mean apart from being certain that we’re done with books), then I guess we’re wise to embrace this urgent… what is it, an alert? a prognostication? guesswork? Well, whatever it is, thank you, Whoever You Are, for such vivid foresight– hey, that’s another thing: who exactly receives the credit for guiding this voyage? Who’s our Captain aboard this cruise ship? Tech Departments might pilot the helm, or tend the engine room, but who’s the navigator charting our course to future ports of call? What’s our destination? Even the most desperate voyage has a destination; I wouldn’t even think a ship gets built unless it’s needed. Loosen your collars, everybody, it’s about to get teleological in here.

Q: What destination, good ship Technology?

A: The unknowable future…

[Painting: Montague Dawson]

Land ho? The Not-Quite-Yet-Discovered Country… hmm, would that be 21st century purgatory? Forgive my Hamlet reference – it’s from a mere book.

To comprehend the future, let’s consider the past. History can be instructive. Remember that apocryphal bit of historical nonsense, that Christopher Columbus “discovered America,” as if the entire North American continent lay indecipherably upon the planet, unbeknownst to Earthlings? (Or maybe you’re a 21st century zealot who only reads blogs and Twitter, I don’t know.) Faulty history aside, we can say that Columbus had an ambitious thesis, a western shipping route to Asia, without which he’d never have persuaded his political sponsors to back the attempt. You know what else we can say about Columbus, on top of his thesis? He also had navigation and seafaring skill, an established reputation that enabled him to approach his sponsors in the first place. As a man with a plan to chart the uncharted, Columbus nevertheless possessed some means of measuring his progress and finding his way. In that respect, it might be more accurate to say he earned his small fleet of well-equipped ships. The history that then unfolded tells its own tale; the point here is simply that Columbus may not have had accurate charts, but he also didn’t set sail, clueless, to discover the unimaginable in a void of invisible nowhere.

But what void confronts us? Do we really have no clue what to expect? To hear the likes of Mackay tell it, with technological innovation this rapid, this influential, we’re going to need all hands on deck, all eyes trained forward, toward… what exactly? Why is the future so unimaginable? Here’s a theory of my very own: it’s not.

[Image: Snoopy the Black-eared Pirate]

Discovering in the void might better describe Galileo, say, or Kepler, who against the mainstream recharted a mischarted solar system along with the physics that describe it. Where they disagreed over details such as ocean tides (as I gather, Kepler was right), they each had pretty stable Copernican paradigms, mediated as much by their own empirical data as by education. Staring into the great void, perhaps these astronomers didn’t always recognise exactly what they saw, but they still had enough of the right stuff to interpret it. Again, the point here is not about reaching outcomes so much as holding a steady course. Galileo pilots himself against the political current and is historically vindicated on account of his curious mix of technological proficiency, field expertise, and persistent vision. For all that he was unable to predict or fully understand, Galileo still seemed to know where he was going.

[Image: Copernicus, Kepler, Galileo]

I suppose if anyone might be accused of launching speculative missions into the great void of invisible nowhere, it would be NASA, but even there we find clarity. Just to name a few: Pioneer, Apollo, Voyager, Hubble – missions with destinations, destinies, and legacies. Meanwhile, up in the middle of Nowhere, people now live in the International Space Station. NASA doesn’t launch people into space willy-nilly. It all happens, as it should, and as it must, in a context with articulated objectives. Such accomplishments do not arise because the future is unimaginable; on the contrary, they arise precisely because people are able to imagine the future.

Which brings me back to Mr Mackay and the government’s forum on education. It’s not accurate for me to pit one side against another when we all want students to succeed. If I’ve belaboured the point here, it’s because our task concerns young people, in loco parentis. Selling those efforts as some blind adventure seems, to me, the height of irresponsibility wrapped in an audacious marketing campaign disguised as an inevitable future, a ship setting sail so climb aboard, and hurry! Yes, I see where urgency is borne of rapid innovation, technological advancement made obsolete mere weeks or months later. For some, I know that’s thrilling. For me, it’s more like the America’s Cup race in a typhoon: thanks, but no thanks, I’ll tarry ashore a while longer, in no rush to head for open sea, not even aboard a vaunted ocean liner.

We simply mustn’t be so eager to journey into the unknown without objectives and a plan, not even accompanied as we are by machines that contain microprocessors, which is all “technology” seems to imply nowadays. There’s the respect that makes calamity of downloading the latest tablet apps, or what-have-you, just because the technology exists to make it available. How many times have teachers said the issue is not technology per se so much as knowing how best to use it? Teleology, remember? By the way, since we’re on the subject, what is the meaning of life? One theme seems consistent: the ambition of human endeavour. Sharpen weapon, kill beast. Discover fire, cook beast! Discover agriculture, domesticate beast. Realise surplus, and world-spanning conquest follows, eventually reaching the stars.

Look, if learning is no longer about fingers quaintly turning the pages of outmoded textbooks, then fine. I still have my doubts – I’ve long said vinyl sounds better – but let that go. Can we please just drop the bandwagoning and sloganeering, and get more specific? By now, I’ve grown so weary of “the unimaginable future” as to give it the dreaded eye-roll. And if I’m a teenaged student, as much as I might be thrilled by inventing jobs of the future, I probably need to get to know me, too, what I’m all about.

In truth, educators do have one specific aim – personalized learning – which increasingly has come into curricular focus. Personalization raises some contentious issues, not least of which is sufficient funding since the need for individualized attention requires more time and resources per student. Nevertheless, it’s a strategy that I’ve found positive, and I agree it’s worth pursuing. That brings me back to Ken Osborne. One of the best lessons I gathered from his book was the practicality of meeting individuals wherever they reside as compared to determining broader needs and asking individuals to meet expectations.

Briefly, the debate presents itself as follows…

  • Side ‘A’ would determine communal needs and educate students to fill the roles

In my humble opinion, this is an eventual move toward social engineering and a return to unpleasant historical precedent. Know your history, everybody.

  • Side ‘B’ would assess an individual’s needs and educate a student to fulfil personal potential

In my humble opinion, this is a course that educators claim to follow every day, especially these days, and one that they would do well to continue pursuing in earnest.

[Image: Apollo 11]

In my experience, students find collective learning models less relevant and less authentic than the inherent incentives found in personalized approaches that engender esteem and respect. Essentially, when we educate individuals, we leave them room to sort themselves out and accord them due respect for their ways and means along the way. In return, each person is able to grasp the value of personal responsibility. Just as importantly, the opportunity for self-actualisation is now not only unfettered but facilitated by school curricula, which I suspect is what was intended by all the “unimaginable” bluster. The communal roles from Osborne’s Side ‘A’ can still be filled, now by sheer numbers from the talent pool rather than by pre-conceived aims to sculpt square pegs for round holes.

Where I opened this essay with Anthony Mackay’s purposeful call to link business and education, I’ve been commenting as a professional educator because that is my field, so that is my purview. In fairness to government, I’ve found that more recent curricular promotion perhaps hints at reversing course from the murk of the “unimaginable” future by emphasizing, instead, more proactive talk of skills and empowerment. Even so, a different posture remains (touched upon in Katie Hyslop’s reporting of the forum and its participants, and a fairly common discursive thread in education in its own right) that implicitly conflates the aims of education and business, and even the arts. Curricular draft work distinguishes the “world of work” from details that otherwise describe British Columbia’s “educated citizen” (p. 2).[2] Both Ontario’s and Alberta’s curricular plans have developed comparably to BC’s, noting employers’ rising expectations that “a human capital plan” will address our ever-changing “world of work” (p. 5)[3] – it’s as if school’s industrial role were a given. Credit where it’s due, I suppose: they proceed from a vision towards a destination. And being neither an economist nor an industrialist, I don’t aim to question the broader need for business, entrepreneurship, or a healthy economy. Everybody needs to eat.

What I am is a professional educator, and that means I have been carefully and intentionally trained and accredited alongside my colleagues to envision, on behalf of all, what is best for students. So when I read a claim like Mr Mackay’s, that “what business wants in terms of the graduate is exactly what educators want in terms of the whole person,” I am wary that his educational vision and leadership are yielding our judgment to interests, such as commerce and industry, that lie beyond the immediately appropriate interests of students. Anthony Mackay demonstrates what is, for me, the greatest failing in education: leaders whose faulty vision makes impossible the very aims they set out to reach. (By the by, I’ve also watched such leadership condemn brilliant teaching that reaches those aims.) As much as a blanket statement, Mr Mackay’s is an unfounded one, and I could hardly find a better example of begging the question. If Mr Mackay is captain of the ship, then maybe responsible educators should be reading Herman Wouk – one last book, sorry, couldn’t resist.

Education is about empowering individuals to make their own decisions, and any way you slice it, individuals making decisions is how society diversifies itself. That includes diversifying the economy, as compared to the other way around (the economy differentiating individuals). Some people are inevitably more influential than others. All the more reason, then, for everybody, from captains of industry on down, to learn to accept responsibility for respecting an individual’s space, even while everybody learns to decide what course to ply for themselves. Personalized learning is the way to go as far as resources can be distributed, so leave that to the trained professional educators who are entrusted with the task, who are experts at reading the charts, spotting the hazards, and navigating the course, even through a void. Expertise is a headlight, or whatever those are called aboard ships, so where objectives require particular expertise, let us be led by qualified experts.

And stop with the nonsense. No unimaginable future “world of work” should be the aim of students to discover while their teachers tag along like tour guides. Anyway, I thought the whole Columbus “discovery” thing had helped us to amend that sort of thinking, or maybe I was wrong. Or maybe the wrong people decided to ignore history and spend their time, instead, staring at something they convinced themselves was impossible to see.


 

3 February 2015

Tracy Sherlock, Sun Education Reporter, The Vancouver Sun – tsherlock@vancouversun.com
(View a longer version of this interview at vancouversun.com)

Changing education to meet new needs

“The learning partnership has got to go beyond the partnership of young person and family, teacher and school, to the community and supportive agencies.”
— Tony Mackay, CEO, Centre for Strategic Education in Australia

Tony Mackay, CEO at the Centre for Strategic Education in Australia, was in Vancouver recently, facilitating a forum about changing the education system to make it more flexible and personalized. He spoke about the rapidly changing world and what it means for education.

Q Why does the education system need to change?

A The needs of the economy and our society are changing and therefore you need to have a learning system that fits the purpose, and that purpose is constantly shifting. So it’s not just a matter of saying we can reach a particular level and we’ll be OK, because you’ve got such a dynamic global context that you have a compelling case that says we will never be able to ensure our ongoing level of economic and social prosperity unless we have a learning system that can deliver young people who are ready — ready for further education, ready for the workforce, ready for a global context. That’s the compelling case for change.

Q Isn’t that tough when we don’t know what the jobs of the future will be?

A In the past we knew what the skill set was and we could prepare young people for specialization in particular jobs. Now we’re talking about skill sets that include creativity, problem solving, collaboration, and the global competence to be flexible and to have cultural understanding. It’s not either or, it’s both and — you need fantastic learning and brilliant learning in the domains, which we know are fundamental, but you also need additional skills that increasingly focus on emotional and social, personal and inter-personal, and perseverance and enterprising spirit. And we’re not saying we just want that for some kids, we want to ensure that all young people graduate with that skill set. And we know they’re going to have to effectively “learn” a living — they’re going to have to keep on learning in order to have the kind of life that they want and that we’re going to need to have an economy that thrives. I believe that’s a pretty compelling case for change.

Q How do you teach flexibility?

A When I think about the conditions for quality learning, it’s pretty clear that you need to be in an environment where not only are you feeling emotionally positive, you are being challenged — there’s that sense that you are challenged to push yourself beyond a level of comfort, but not so much that it generates anxiety and it translates into a lack of success and a feeling of failure that creates blockages to learning. You need to be working with others at the same time — the social nature of learning is essential. When you’re working with others on a common problem that is real and you have to work as a team and be collaborative. You have to know how to show your levels of performance as an individual and as a group. You can’t do any of that sort of stuff as you are learning together without developing flexibility and being adaptive. If you don’t adapt to the kind of environment that is uncertain and volatile, then you’re not going to thrive.

Q What does the science of learning tell us?

A We now know more about the science of learning than ever before and the question is are we translating that into our teaching and learning programs? It’s not just deeper learning in the disciplines, but we want more powerful learning in those 21st-century skills we talked about. That means we have to know more than ever before about the emotions of learning and how to engage young people and how young people can encourage themselves to self-regulate their learning.

The truth is that education is increasingly about personalization. How do you make sure that an individual is being encouraged in their own learning path? How do we make sure we’re tapping into their strengths and their qualities? In the end, that passion and that success in whatever endeavour is what will make them more productive and frankly, happier.

Q But how do you change an entire education system?

A Once you learn what practice is done and is successful, how do you spread that practice in a school system so it’s not just pockets of excellence, but you’ve actually got an innovation strategy that helps you to spread new and emerging practice that’s powerful? You’re doing this all in the context of a rapidly changing environment, which is why you need those skills like flexibility and creativity. The learning partnership has got to go beyond the partnership of young person and family, teacher and school, to the community and supportive agencies. If we don’t get the business community into this call to action for lifelong learning even further, we are not going to be able to get there. In the end, we are all interdependent. The economy of the future — and we’re talking about tomorrow — is going to require young people with the knowledge, skills and dispositions that employers are confident about and can build on.

Previously available at http://www.pressreader.com/canada/the-vancouver-sun/20150203/282183649467268/TextView

and http://newsinvancouver.com/qa-changing-education-to-meet-new-needs/

[1] Sean Silcoff, “BC to add computer coding to school curriculum,” The Globe and Mail (January 17, 2016)

[2] Introduction to British Columbia’s Redesigned Curriculum (August 2015 – Draft)

[3] Inspiring Action on Education (June 2010)

From KQED News – How a Chilliwack School Ditched Awards and Assemblies to Refocus on Kids and Learning

What a terrific story from Chilliwack, BC, covered by Linda Flanagan for California’s Bay Area PBS news outlet.

Photograph credit: Flickr/Sarah

Agreed! Excellence is a culture.

Some leaders talk a great game, but no matter the words coming out of their mouths, people respond to the culture they’re part of, and within it, they respond in both overt and subtle ways.

By the way, leaders aren’t limited to those in the head office… leaders are people who take initiative, work to their strengths, and lift others to do the same thing… so pay attention to making your strengths and inspiration constructive instead of deflating or injurious.

If you aren’t getting the results you expected, then reflect and consider what reality people are experiencing and which messages they may be receiving around your place, maybe unintended ones, that might conflict or work against your aims for attitude and behaviour.

It’s a shame when aims and culture contradict. It’s hypocrisy when aims are ignored or undermined by deliberately contradictory culture.

No shame in reflection. Reflection is learning, and learning’s a virtue.

Also agreed! School education should be “looking beyond the short term and thinking more about what kinds of adults they’re trying to develop.” That’s always been my approach.

Post-secondary, career, parenthood, civic involvement… all these and more will come about, and with guidance, let each person find their own way. But the adult human beings making their life decisions need a virtuous, thoughtful, positive foundation, and that’s what school education should always be about.


What On Earth Were They Thinking?

How often have you heard somebody question people who lived “back then,” in the swirling historical mist, who somehow just didn’t have the knowledge, the nuance, or the capability that we so proudly wield today?

“Things back then were just a lot more simple.”

“They just weren’t as developed back then as we are today.”

“Society back then was a lot less informed than we are today.”

It’s difficult to even confront such statements out of their context, but where I’ve had all three of these spoken to me in (separate) conversations, I challenged each impression as an insinuation that we’re somehow smarter than all the peoples of history, more skilled, more sophisticated, more aware, more woke (as they say, these days), that we’re, in the main, altogether better than our elders merely for having lived later than they did. These days, apparently, “we’re more evolved” – ha! 🙂 more evolved, that’s always a good one, as if back then everyone was Australopithecus while here we are jetting across the oceans, toting our iPhones, and drinking fine wines. Well, sure, maybe things have changed since back then, whenever “then” was. But, more typically I’ve found, contemporary judgments levelled upon history are borne of an unintended arrogance, a symptom of 20/20 hindsight and the self-righteous assurance that, today, we’ve finally seen the error – actually, make that the errors – of their ways.

Surely, these days, few – if any – would believe that we’re ignorant or unaccomplished or incapable, not the way someone might say of us while looking back from our future. At any given point on the historical timeline, I wonder whether a person at that point would feel as good about their era, looking back, as any person at some other point of the timeline would feel, also looking back, about theirs. Is it a common tendency, this judgment of contemporary superiority? These days, we might well feel superior to people who had no indoor plumbing or viral inoculations or Internet access, just as someone earlier would have appreciated, say, some technological tool, a hydraulic lift to raise heavy objects, or a set of pulleys, or a first-class lever – choose whatever you like! It really is arbitrary for a person, looking back at history, to feel better about contemporary life because contemporary life infuses all that person’s experience while history’s something that must be learnt.

I’d say that learning history means learning whatever has lasted and been passed on because what has lasted and been passed on was deemed to have merit. We’re taught the history that has been deemed worth remembering. What I’ve found has been deemed worth remembering (i.e., the kinds of things I learned from History classes) are the general mood of the times, the disputes and fights that occurred (violent or academic), a select handful of the figures involved, and an inventory of whichever non-living innovations and technologies simultaneously arose alongside it all. If, later, we demerit and no longer pass on what has lasted up until then, passing on instead some different history, then that’s entirely indicative of us, now, versus anyone who came before us, and it reflects changed but not necessarily smarter or better priorities and values.

For me, we shouldn’t be saying we’re any smarter or better, only different. So much literature has lasted, so much art. Commerce has lasted, institutions have lasted, so much has lasted. Civilization has lasted. Cleverness, ingenuity, shrewdness, wit, insight, intellect, cunning, wisdom, kindness, compassion, deceit, pretence, honesty, so many many human traits – and they all transcend generations and eras. People vary, but human nature perdures. I’ll trust the experts, far more immersed in specific historical study than me, to identify slow or subtle changes in our traits – hell, I’ll even grant we may have culturally evolved, after all – and I can only imagine in how many ways the world is different now as compared to before now. But what does it mean to be better? Better than other people? Really? And who decides, and what’s the measure?

We can measure efficiency, for instance, so to say technology has advanced and is better than before is, I think, fair. Even then, an individual will have a subjective opinion – yours, mine, anybody’s – making culture not proactive and definitive but rather reactive and variable, a reflection, the result of combined opinions that amplify what is shared and stifle what is not. As we’re taught merited history, we’re almost forced to concur, at least until we reconsider what has merit. That’s a sticking point because everyone will have an opinion on what is culturally better and what is culturally worse. Morality inevitably differs, and suddenly we have ethical debate, even disagreement, even discord. But to say people or culture are better, I think, is too subjective to rationalize and a questionable path to tread.

Consider this as well: we each know what we’ve learnt, and as individuals, we’ve each learnt what we value. But what we’re taught is what’s broadly valued and, thereby, prescribed for all. We’ve all heard that rather hackneyed epigram, that those who neglect history are doomed to repeat it. Well, maybe the ones screwing up just didn’t learn the right history to begin with. I tend to abide by another hackneyed epigram, that they are wisest who know how little they know. Real historical wisdom and real historical understanding would be seeing and thinking and understanding as people earlier did. But short of firing up the DeLorean for an extended visit some place, some time, it seems to me that judgments about history are made with an aplomb that might be better aimed at acknowledging our finite limitations. We’re no angels. If anything, this error of judgment speaks volumes about us. Condescension is what it is, but in my opinion, it’s no virtue.

We should hardly be judging the past as any less able or intelligent or kind or tolerant or virtuous than ourselves, especially not if we aim to live up to today’s woke cultural embrace of acceptance. Being different should never be something critiqued; it should be something understood. Conversely, in proportion to how much we know, passing judgment is assumptive, and we all know what Oscar Wilde had to say about assuming (at least, we know if we’ve studied that piece of history). At the very least, we ought to admit our own current assumptions, mistakes, errors, accidents, troubles, disputes, and wars before we pass any judgment on historical ones. On that positive note, I will say that considering all this prompted me to notice something maybe more constructive – so often, at least in my experience, what we seemed to study in History class were troublemaking causes and effects, bad decisions, and selfishly motivated behaviours. Far more rarely was History class ever the study of effective decision-making and constructive endeavour – maybe the odd time, but not usually. Maybe my History teachers were, themselves, stifled as products of the system that educated them. What could they do but pass it along to me and my peers? Considering that, I might more readily understand how people, alive today, could conclude that all who came before were simply not as enlightened, as sophisticated, or as adept as we are now.

Yet that merely implicates contemporary ignorance: assumptions and mistakes still happen, errors still occur, accidents – preventable or not – haven’t stopped, troubles and disputes and wars rage on. If the axiom of being doomed to repeat history were no longer valid, we wouldn’t still feel and accept its truthful description, and it would have long ago faded from meaning. All I can figure is that we’re still poor at learning from history – the collective “we,” I mean, not you in particular (in case this essay was getting too personal). We need learned people in the right positions at the right times, if we hope to prevent the mistakes of history. Not enough people, I guess, have bothered to study the branches of history with genuine interest. Or, no, maybe enough people have studied various branches of history, but they don’t remember lessons sharply enough to take them on board. Or, no no, maybe plenty of people remember history, but the circumstances they face are just different enough to tip the scale out of favour. Epigram time: history doesn’t repeat, but it does rhyme. Or maybe we’re just full of ourselves, thinking that we’ve got it all solved when, evidently, we don’t.

It also dawned on me, considering all this, that high school “History” influences what many people think about broader “history.” My high school experience, university too, was mostly a study of politics and geography, and toss in what would be considered very elementary anthropology – all this as compared to those other branches of historical study. Archaeology and palaeontology come to mind as detailed, more scientific branches of history, but there are so many – literary history, philosophical history, religious, environmental, military, economic, technological, socio-cultural as I’ve already indicated, on and on they go, so many categories of human endeavour. I’ve even come across a thoughtful paper contemplating history as a kind of science, although one that is normative and susceptible to generational caprice. One final epigram: history is what gets written by the winners.

And that’s really the point here: throughout what we call human history, where we’ve subdivided it so many ways (right down to the perspective of every single person who ever bothered to contribute, if you want to break it down that far), it’s people all the way back, so it’s all biased, so it’s neither complete nor even accurate until you’ve spent oodles of time and effort creating a more composite comprehension of the available historical records. And, dear lord, who has time for that! History, in that respect, is barely conceivable in its entirety and hardly a thing to grasp so readily as to say, simply, “Back then…” History is people, and lives, and belief inscribed for all time. To know it is to know who lived it as well as who recorded it. Knowing others is empathy, and empathy is a skill trained and fuelled by curiosity and diligence, not emotion or opinion. Emotion and opinion come naturally and without effort. For me, valid history derives from informed empathy, not the other way around.

As far as recording history for future study, ultimately, it will have been people again recording and studying all of it, “it” being all of what people were doing to attract attention, and “doing” being whatever we care to remember and record. It’s all a bit cyclical, in itself, and completely biased, and someone will always be left out. So people might be forgiven when shaking their heads in judgment of the past because, without empathy, what else could they possibly know besides themselves?

Suppose Trivial Grammar Were Actually the Road Map

Sometimes, the hardest part of teaching felt like finding a way to reach students when they just didn’t get it. But if there’s one thing I learned while teaching, it’s that it takes two. In fact, the hardest part of teaching was coming to realise it wasn’t them not getting it, it was me not getting them. In my own defence, I think we just never can know what another person’s motive truly is. It was times like that when I realised the true constructive value of respect and a good rapport. To have any hope of being open-minded, I intentionally needed to respect my students’ dignity, and I needed to be more self-aware as to how open- or closed-minded I was being. Humility has that way of being, well, humbling. These days I’m still fallible but a lot better off for knowing it. And, yes, humility’s easier said than done.

Over sixteen marvellous years teaching secondary English in a high school classroom, I learned that teaching is a relationship. Better still, it’s a rapport. I learned that it takes two, not just hearing and talking but listening and speaking in turn, and willingly. And, because bias is inescapable, I learned to consider a constructive question: what motives and incentives are driving anyone to listen and speak to anyone else? It has an admittedly unscrupulous undertone: what’s in it for me, what’s in it for them, who’s more likely to win out? The thought of incentives in high school likely evokes report cards, which is undeniable. But where listening (maybe speaking, too) to some degree means interpreting, what my students and I valued most was open-minded class discussion. With great respect for our rapport, we found the most positive approach was, “What’s in it for us?” The resulting back-and-forth was a continual quest for clarity, motivated on everyone’s behalf by incentives to want to understand – mutual trust and respect. Looking back, I’m pleased to say that tests and curricula seldom prevented us from pursuing what stimulated us most of all. We enjoyed very constructive lessons.

Of course, we studied through a lens of language and literature. Of particular interest to me was the construction of writing, by which I mean not just words but the grammar and punctuation that fit them together. My fascination with writing has been one of the best consequences of my own education, and I had encouraging – and one very demanding – writing teachers. In the classroom and on my own, I’ve always been drawn to structure as much as content, if not more so, which isn’t unorthodox although maybe not so common. The structure of writing gets me thinking on behalf of others: why has the writer phrased it this certain way? What other ways might be more or less well-suited for this audience? How might I have phrased something differently than this writer, and why? Most English teachers I know would agree that pondering such questions embodies a valuable constructive skill, these days trumpeted as critical thinking. I’d argue further that it’s even a pathway to virtue. Situated in context, such questions are inexhaustible, enabling a lifetime of learning, as literally every moment or utterance might be chosen for study.

In that respect, we loosely defined text beyond writing to include speech, body language, film, painting, music, architecture – literally any human interaction or endeavour. I’ll stick mostly with listening and speaking, reading and writing, just to simplify this discussion. The scope being so wide, really what our class sought to consider were aim and intention. So when students read a text for content, the WHAT, I’d ask them to consider choices made around vocabulary, syntax, arrangement, and so forth, the HOW. That inevitably posed further questions about occasion and motive, the WHY, which obliged varying degrees of empathy, humility, and discernment in reply: for a given writer, how best to write effectively on a topic while, for a given audience, what makes for skillful reading? What motives are inherent to each side of the dialogue? What incentives? These and others were the broader-based “BIG Question” objectives of my courses. They demanded detailed understanding of texts – heaven knows we did plenty of that. More importantly, the BIG Questions widened our context and appreciation even while they gave us focus. When times were frustrating, we had an answer for why studying texts mattered. Questions reflect motivation. Prior to exercising a constructive frame-of-mind, they help create one.

Questions, like everything else, also occur in a particular context. “Context is everything,” I would famously say, to the point where one class had it stencilled for me on a T-shirt. So much packed into those three plain words – everything, I suppose. And that’s really my thesis here: if we aim to be constructive, and somehow do justice to that over-taxed concept, critical thinking, then we need to be actively considering what we hear and say or read and write alongside other people, and what it all makes us think for ourselves – especially when we disagree. (Is active thinking the same as critical thinking? The phrase is hardly original, I’m sure, but I’ll consider the two kinds of thinking synonymous.) During my last three or four years in the classroom, all this came to be known by the rallying cry, “Raise the level of discourse!” These days, however, the sentiment is proving far more serious than something emblazoned on a T-shirt.

I’m referring, of course, to the debacle that has been the 2016 U.S. Presidential election and its aftermath. Specifically, I have in mind two individual remarks, classic teachable moments inspired by current events. The first remark, from an NPR article by Brian Naylor on the fallout over the executive order halting travel from seven predominantly Muslim countries, is attributed to the President. The second remark is a response in the comment section that follows Naylor’s article, representative of many commenters’ opinions. To begin, I’ll explain how something as detailed as grammar and punctuation can help raise the level of discourse, especially with so divisive a topic. From there, I’ll consider more broadly how and why we must always accept responsibility for our language in this active sense – sometimes correct grammar should matter not just to nit-pickers but to everybody.

In the article (February 8, 2017), Brian Naylor writes:

“Trump read parts of the statute that he says gives him authority to issue the ban on travel from seven predominantly Muslim nations, as well as a temporary halt in refugee admissions. ‘A bad high school student would understand this; anybody would understand this,’ he said.”

We all know the 45th U.S. President can be brusque, even bellicose, besides his already being a belligerent blundering buffoon. This comment was received in that light by plenty, me included. For instance, by classifying “bad” (versus “good”), the President appeals at once to familiar opposites: insecurity and self-worth. We’ve all felt the highs and lows of being judged by others, so “bad” versus “good” is an easy comparison and, thereby, a rudimentary emotional appeal. However, more to my point, his choice to compare high school students with lawyers, hyperbole or not, was readily construed as belittling since, rationally, everyone knows the difference between adult judges and teenaged students. Make no mistake: his ire on this occasion was aimed at U.S. District Judge James Robart. Ironically, though, the President invokes the support of minors in a situation where they have neither legal standing nor professional qualification, rendering his remark not just unnecessarily divisive but inappropriate, and ignorant besides – although he must have known kids aren’t judges, right?

To be fair, here’s a slightly longer quotation of the President’s first usage of “bad student”:

“I thought, before I spoke about what we’re really here to speak about, I would read something to you. Because you could be a lawyer– or you don’t have to be a lawyer: if you were a good student in high school or a bad student in high school, you can understand this.”

Notice, in the first place, that I’ve transcribed and punctuated his spoken statement, having watched and listened to video coverage. As a result, I have subtly yet inevitably interpreted his intended meaning, whatever it actually was. My punctuation offers only what I believe the President meant – the marks, after all, are mine.

So here’s another way to punctuate it, for anyone who feels this is what the President said:

“Because you could be a lawyer, or you don’t have to be a lawyer – if you were a good student in high school or a bad student in high school, you can understand this.”

Here’s another:

“Because you could be a lawyer. Or you don’t have to be a lawyer. If you were a good student in high school or a bad student in high school, you can understand this.”

Finally, but not exhaustively, here’s another:

“Because you could be a lawyer… or you don’t have to be a lawyer; if you were a good student in high school or a bad student in high school, you can understand this.”

Other combinations are possible.

Rather than dismiss all this as pedantry, I’d encourage you to see where I’m coming from and consider the semantics of punctuation. I’m hardly the only one to make the claim, and I don’t just refer to Lynne Truss. Punctuation does affect meaning, both what was intended and what was perceived. To interpret the President’s tone-of-voice, or his self-interrupting stream-of-consciousness, or his jarring pattern-of-speech, or whatever else, is to partly infer what he had in mind while speaking. We interpret all the time, listening not only to words but tone and volume, and by watching body language and facial expression. None of that is typically written down as such, except perhaps as narrative prose in some novel. The point here is that, in writing, punctuation fills part of the interpretive gloss.

Note also that a number of news headlines have inserted the word “even” – an interpreted addition of a word the President did not actually say. Depending upon how we punctuate his statement, inclusive of everything from words to tone to gestures to previous behaviour, perhaps we can conclude that he did imply “even” or, more accurately, perhaps it’s fair to suggest that it’s what he intended to imply. But he didn’t say it.

If we’re going to raise the level of discourse to something constructive, we need to balance whatever the President intended to mean by his statement against what we’ve decided he intended to mean. In the classroom, I put it to students this way: “Ask yourself where his meaning ends and yours begins.” It’s something akin to the difference between assuming (based on out-and-out guesswork because, honestly, who besides himself could possibly know what the President is thinking) and presuming (based on some likelihood from the past because, heaven knows, this President has offered plenty to influence our expectations). Whatever he meant by referring to good and bad students might be enraging, humbling, enlightening – anything. But only if we consider the overlap, where his meaning ends and ours begins, are we any better off ourselves, as analysts. Effective communication takes two sides, and critical thinking accounts for both of them.

“Effective,” though, is sometimes up for debate – not merely defining it but even deciding why it matters. Anyway, can’t we all generally figure out what somebody means? Isn’t fussing over details like grammar more about somebody’s need to be right? I’d argue that taking responsibility for our language includes details like grammar precisely so that an audience is not left to figure things out or, at least, so they have as little ambiguity to figure out as possible. Anything less from a speaker or writer is lazy and irresponsible.

In the Comments section following Naylor’s article, a reader responds as follows:

“Precisely describing Trump’s base…bad high school students who’s [sic] level of education topped out in high school, and poorly at that. This is exactly what Trump and the GOP want, a poorly educated populous [sic] that they can control with lies and bigoted rhetoric.”

Substantively, the commenter – let’s call him Joe – uses words that (a) oversimplify, blanketing his fellow citizens, and (b) presume, placing Joe inside the President’s intentions. Who knows, maybe Joe’s correct, but I doubt he’s clairvoyant or part of the President’s inner circle. On the other hand, we’re all free to draw conclusions, to figure things out. So, on what basis has Joe made his claims? At a word count of 42, what was he aiming to contribute? Some of his diction is charged, yet at a mere two sentences, it’s chancy to discern his motives or incentives – we risk being as guilty as he is, characterising him as he characterises the President. Even if I’m supportive of Joe, defending his remarks is problematic for the same reason: they leave such a gap to fill. At 42 words, where he ends is necessarily where the rest of us begin, and maybe I’m simply better off ignoring his comment and starting from scratch. Maybe that’s fine, too, since we should all have our own opinions. In any event, Joe has hardly lived up to any measure of responsibility to anybody, himself included – here I am parsing his words months later in another country. I’d even say Joe loses this fight since his inflammatory diction and sweeping fallacy play to his opponents, if they so choose. Unsurprisingly, Joe’s comment is not at all constructive.

For all its faults, his comment aptly demonstrates the two-way nature of dialogue. On the one side, responsibility falls to each reader or listener to bring their research and experience, then discern for themselves what was meant. In that regard, Joe has left us with a lot of work to do, if we’re motivated enough to bother. Yet I chose his particular comment as mere illustration – literally hundreds of others, just as brief and labour-intensive, scroll by below Naylor’s article… so much work for us to do, or else to dismiss, or perhaps to gainsay, if not insult. On that note, consider the other side: responsibility falls to the speaker or writer to offer substantive claims as well as the evidence that prompted them. In this instance, no matter the justification for offering something at all, what can a two-sentence comment add to issues as complex and long-standing as, say, Presidential politics? Whether or not on-line comments are democracy in action, 42 words in two sentences certainly struggle to promote a meaningful, substantive exchange of ideas.

For my students, I used to liken such on-line comments to debating with others while standing in line for coffee, then returning to our cars and our lives, none the more informed – perhaps annoyed by some, appreciative of others. With the best intentions, we might excuse people, overlooking that we’re the ones who walked out and drove away – maybe we were late for work that day. We’ve been closed-minded to the degree that we haven’t sought to reach a thorough understanding, and certainly we’ve failed to raise the level of discourse. Would it have been better to just say nothing, grab our coffee, and leave?

Yes, I think so, which may not be easy to accept. Consider, though, that reasoning from presumption and enthymeme is not reasoning at all. Consider, further, that two sentences of 42 words or a few minutes spent chatting in the coffee line will barely scratch the surface. Who can say what motivates people to contribute so readily yet so sparsely? Recent times are emotional, growing more volatile and, as a result, potentially far more dangerous. We see in Joe’s comment, and so many others like it, that trust and respect are divisively encased in separate echo chambers. By virtue of us versus them, both sides are challenged to be open-minded.

Worse, the so-called era of “post-truth” impedes exactly the constructive dialogue we need right now, raising ire and diatribe in place of substance and equanimity. Satire compounds disagreement, growing that much more venomous, and ridicule has a way of locking doors that were already closed. I don’t support proceeding from pretence or unfounded opinion – there’s nothing whatsoever to show for an exchange-of-opinion based on falsehood. The burden of post-truth is far too high. A bias and the truth can co-exist, and they do, guaranteed – one truth, objective, and one bias per person, subjective. Bias is an inevitable fact of existence. Left unchecked, bias crowds out respect, which is why a constructive approach is so crucial. As I’ve said elsewhere, post-truth is anti-trust, at least for me, and, at its furthest extent, a threat to civil security, which sounds alarmist – good, let it. We need to attend to this. More than ever now, we need respect or, failing that, at least greater tolerance. That’s for starters.

Worse still, in this post-truth world, fictional claims face no arbiter but the other side, so distrusted and maligned. The kind of polarised situation made infamous in Washington, DC is spreading, realised in a zillion on-line comments like Joe’s with every article published. Hopefully not this one, unless maybe someone hasn’t actually read this. On such a perilous path – facts in dispute, emotions enflamed – each side qualifies “open-minded” as unique to themselves and misappropriated by the rest. That is deeply divisive, and it’s the recipe for unrest that I spy, the one that sounds my alarm. In that divided state, with nothing left to discuss, even as reality has its way of catching up, what damage might already be done? Especially when facing fellow citizens, whatever we choose now must accord with what we’re prepared to accept later. Let that sober thought sink to the core because the less we share common goals, the more we’re set to clash over unshared ones. But it’s within us to converse and to converge.

Let’s be willing to listen with empathy, understand with compassion, research with diligence, and respond with substance. Do your own investigation. Accept responsibility to inform yourself. Yes, take what you find with a grain of salt until you can believe to your own satisfaction what is right and trustworthy. Yet, even then, be tolerant if not respectful of others – too much salt is harmful. We all have our own motives and incentives for listening and participating, so let’s dig deeper than how pissed off we are with the other side: walking the high road with pride or smug assurance is really the low road and a path of hubris. It’s closed-minded, but not in the sense that we haven’t sought to reach a thorough understanding of the other side. It’s closed-minded to the degree that we haven’t sought to understand how and why the other side reached their position to begin with.

None of this is hard to understand. Once upon a time, we decided that education mattered, and it’s no accident that the trivium – grammar, rhetoric, dialectic – was given a central role. These days, the trivium survives mainly in niche markets, notably private Christian education, which is enough to switch some people off – and which sadly exemplifies this entire discussion. I believe classical education is valuable for all. We’ve neglected it to our detriment, perhaps to our peril. We have a lot in common, more than we might credit, with our neighbours and fellow citizens. It’s not like they grew up on Mars. We’re not significantly different – hands up if you’re a human being. Start with that, some basic human dignity.

There’s a lot to be offered by rapport in our relationships, and little to expect without it. All we can do is understand the other person’s interpretation, and they ours, and go from there – or else not. And it’s easy to nod and say, “I already do that while others do not.” But reflect upon yourself anyway, in every conversation, debate, or exchange. Humility is a virtue, even when kept low-key. Everybody bears responsibility for their own participation. The more we live up to being respectful, even of those whom we oppose, the more progress we’re liable to make – however slowly it might happen.

As I said at the outset, yes, humility’s easier said than done. But by the same token, why write this essay if 42 words would do? We must neither hide ourselves away nor proceed prematurely. We must be able to discern flaws of reason, and we must be able to communicate with humility if we aim to speak – and, more critically, if we hope to be heard – from a place of thoughtfully considered understanding. Whether or not we truly trust one another, let’s help put the logos back in dialogue and accept our own responsibility to approach people with intentional self-awareness. Let’s seize the opportunity to be role models – you just never know what somebody else is thinking. Let’s raise the level of discourse. And let’s remember that taking the high road must be open-hearted as well as open-minded.


Teaching Open-Mindedly in the Post-Truth Era

I had brilliant students, can’t say enough about them, won’t stop trying. I happened to be in touch with one alumna – as sharp a thinker as I’ve ever met, and a beautiful writer – in the wake of the U.S. election campaign last winter and wrote the following piece in response to a question she posed:

How do you teach open-mindedly in the post-truth era?

I was pleased that she asked, doubly so at having a challenging question to consider! And I thoroughly enjoyed the chance to compose a thoughtful reply. Not my own best writing, perhaps, but containing some insight, anyway.

I’ve revised things here a little, for a broader audience, but nothing of substance has changed.

How do you teach open-mindedly in the post-truth era?

Good heavens. Hmm… with respect for people’s dignity, is my most immediate response. But such a question.

Ultimately, it takes two because any kind of teaching is a relationship – better still, a rapport, listening and speaking in turn, and willingly. Listening, not just hearing. But if listening (and speaking) is interpreting, then bias is inescapable, and there need to be continual back-and-forth efforts to clarify, motivated by incentives to want to understand: that means mutual trust and respect, and both sides openly committed. So one question I’d pose in return pertains to the motives and incentives for teaching (or learning) ‘X’ in the first place. Maybe this question needs a scenario, to really illustrate details, but trust and respect seem generally clear enough.

Without trust and respect, Side ‘A’ is left to say, “Well, maybe some day they’ll come around to our way of thinking” (that being a kind portrayal) and simply walks away. This, I think, is closed-minded to the degree that ‘A’ hasn’t sought to reach a thorough understanding (although maybe ‘A’ has). Anyway, it’s not necessarily mean-spirited for someone to say this. With the best intentions, ‘A’ might conclude that ‘B’ is just not ready for the “truth.” But broadly, I’d say this is more like quitting than teaching, which is to say a total failure to teach, at least as I take “teaching” from your question. It would differ somewhat if said by the learner vs the teacher. Then we might conclude that the learner lacked motivation or confidence, for some reason, or perhaps felt alone or unsupported, but again… scenarios.

Another thing to say is, “Well, you just can’t argue with stupid,” as in we can’t even agree on facts, but saying this would be passing judgment on ol’ stupid over there, and likewise hardly open-minded. Personally, I’d never say bias precludes truth, only that we’ll never escape our biases. The real trouble is having bias at all, which I think is what necessitates trust and respect because the less we have of these, the more turmoil. I figure any person’s incentive to listen arises from whatever they think will be to their own benefit for having listened. But “benefit” you could define to infinity, and that’s where the post-truth bit is really the troublesome bit because all you have is to trust that other person’s interpretation, and they yours, or else not.

Yeah, I see “post-truth” as “anti-trust,” and for me, that’s a powderkeg, the most ominous outcome to arise of late. People need incentives to listen, but if treating them with dignity and respect isn’t reaching them, then whatever they wanted to begin with wasn’t likely a positive relationship with me. That’s telling of the one side, if not both sides. At the same time, it’s harder to say of students that they have no incentives to listen or that they don’t trust teachers on account of some broader post-truth culture – that might be changing, who knows, but I hope not.

But I’m leaving some of your question behind, and I don’t want to lose sight of how it’s directed more towards the person doing the teaching (how do you teach open-mindedly…).

That part of the question was also in my immediate reaction: respect people’s dignity. If it’s me teaching, say, then to have any hope of being open-minded, I intentionally need to respect that other person’s dignity. I need to be more self-aware, on a sliding scale, as to how open- or closed-minded I’m being just now, on this-or-that issue. So that’s empathy, but it’s self-aware, and it’s intentional.

Me being me, I’d still be the realist and say you just can never really know what the other person’s motive truly is – pre-truth or post-truth world, doesn’t matter. But whether or not you trust the other, or they you, the real valuable skill is being able to discern flaws of reason, which is what I always said about you – you’ve always been one to see through the bullshit and get to the core of something. I’m no guru or icon, I’m just me, but as I see it just now, the zeitgeist is an emotional one more than a rational one, and there’s plenty to explain why that might be the case. Given that emotional dominance, I do think post-truth makes the world potentially far more dangerous, as a result.

Whatever incentives people are identifying for themselves, these days, are pretty distinct, and that’s a hard one for unity. That saying about partisan politics – “We want the same things; we just differ on how to get there” – doesn’t apply as widely right now. By virtue of the other side being “the other” side, neither side is even able to be open-minded beyond itself because trust and respect are encased in the echo chambers. Things have become distinctly divisive – partisan politics, I mean, but I wonder how deeply those divisions can cut.

So, for instance, lately I find with my Dad that I listen and may not always agree, but where I don’t always agree, he’s still my Dad, and I find myself considering what he says based on his longevity – he’s seen the historic cycle, lived through history repeating itself. And I obviously trust and respect my Dad, figuring, on certain issues, that he must know more than me. On other issues, he claims to know more. On others still, I presume he does. Based on trust and respect, I give him the benefit of the doubt through and through. One of us has to give, when we disagree, or else we’d just continually argue over every disagreement. Someone has to make “benefit” finite by acquiescing, by compromising themselves right out of the debate. If you want peace, someone has to give, right? So should I trust my Dad? I respect him because he’s given me plenty good reason after such a long time. Certainly I’m familiar with his bias, grown accustomed to it – how many times over my life have I simply taken his bias for granted?

I see it even more clearly with my daughter, now, who trusts me on account of (i) her vulnerability yet (ii) my love. The more she lives and learns alongside me, as time passes by, the more that outlook reiterates itself, cyclically, the way a self-fulfilling prophecy works. Other parents keep warning me that the day’s coming when she’ll become the cynical teenager, and I’m sure it will – I remember going through it, myself. But I’m older, now, and back to respecting my Dad, so at least for some relationships, the benefit of the doubt returns. My Dad preceded me, kept different circles than me, and has lived through two or three very different generations than I have. Even as we see the same world, we kind of don’t. So this is what I wonder about that deep cut of division, reaching the level of family – not just one given family but right across the entire population. Do I fact-check my Dad, or myself, or maybe both? Should I? Because neither one of us is infallible.

Anyway, the child and the parent: it’s as good an example as I can think of for questioning what it means to learn with an open mind because there’s no such thing as “unbiased,” yet love, trust, and respect are hardly what we’d call “closed-minded” – except that they are, just in a positive way. They leave no room for scepticism or wariness, the kinds of traits we consider acceptable in healthy proportions. But “teaching” with an open mind takes on so much more baggage, I think, because the teacher occupies the de facto and the de jure seat-of-power, at least early on – school is not a democracy (although that seems, now, to be changing, too), yet teachers are no more or less trustworthy on the face of it than any other person. That’s also why I reduce my response to respecting human dignity because, even if it’s closed-minded in that positive way, at least it’s a do-no-harm approach.

That jibes with everything I’ve learned about good teaching, as in good teaching ultimately reduces to strong, healthy relationships. Short-term fear vs long-term respect – it’s obvious which has more lasting positive influence. And since influencing others with our bias is inevitable, we ought to take responsibility for pursuing constructive outcomes, or else it’s all just so much gambling. At the core, something has to matter to everybody, or we’re done.