Life in the Classroom

People often say that school is a training ground, or a practice session, a rehearsal for life, which usually means career. This whole “real-life” authentic learning stuff basically privileges the workplace over the school classroom.

If the flipside to words like “real” and “authentic” is words like “fake,” “false,” or “contrived,” then maybe we’re just being sloppy with our words. Unless that’s actually what we think of school. Well, I’m probably just being too fussy because none of this is what’s meant by real-life authentic learning, is it?

Or, if it is, wouldn’t it mean that no one’s keeping score until “real life” happens, and no one’s getting paid in school since results are pretend, and no one’s responsible because, hey, it’s all just theoretical? School the Great Training Ground implies that school is not really a place that matters because it only teaches about life or career, which comes later…

And hey, wait, that’s just fine, isn’t it? School does teach stuff for later, for when things matter, for when it really counts.

But hey, wait.

Doesn’t anybody think school actually is part of life? Part of kids’ lives, and teachers’ lives? Kids and teachers are alive. School is not some neverland of make-believe. School is real. It has real, live people in it. A classroom is an authentic place. It’s a – go ahead, say it with me, now – “c l a s s r o o m.” Right! What goes on in school is plenty real and means more than “Yeah, this doesn’t really count.” Laughter and friendship matter. Anxiety and stress matter. Living affects us. Learning is meaningful. Here you go, how about this: if classrooms were inauthentic, if learning were just practice for later, if school weren’t real, then why are we grading people? Shouldn’t we save that for the regular season, when it really matters?

Students are young people who live a huge chunk of their lives every day in a classroom. For them, school is every bit as real as any other place they go.

Image by Gratisography from Pexels

It has ups and downs, friendship and rivalry, anxiety and stress and reward and good times. Calling school “sheltered” essentially discounts being there because a kid’s life, relative to themselves and their own experience, is all there is. Kind of like any other point in our lives. Our lives are real, and we don’t just switch on like a light bulb upon leaving some unreal, inauthentic place – there is no such place.

School being this pretend place is adult condescension – oh-so well-intentioned, of course, on that infamous and overcrowded road to hell. Sure, kids have limited experience. Sure, they might be sheltered according to someone else’s perspective – fine, then that’s how big a kid’s world is. So be it. It remains that kid’s perspective. It’s not pretend.

How big a life is becomes a question of how much that person can cope with. School life, family life, sports, hobbies, friends, on and on – all of these teach us something about living. Are none of these other parts of life somehow sheltered, too, like school? Who decided that school takes that prize? How about piano lessons? Those are something like school… teachers, practice, grades. Tests, recitals, performances. How about sports, with competitors, and winning and losing. With coaches and referees who hold us accountable. Are these somehow more real – less inauthentic, less pretend – than a teacher? Are they some kind of real-life learning that school is not? We often compare sport to life, and value the life lessons it teaches, but when do we short-change sport by saying it’s less real or some kind of shelter?

In fact, there’s a subtle difference to what we say about sport: “Such great preparation for life!” Affirming, valuable, such a boon. In my experience, the attitude bestowed by adults upon the value of youth sport is nothing but positive, as compared to school…

Which parts of life is nobody grading? If school is so sheltered, aren’t we actually handcuffing teachers who attempt to teach accountability? I suppose it’s not ironic that kids live in “the real world” every minute they’re not in school.

Take the kid who sees school as sheltered preparation for the real world. The day that kid argues with their friend on Saturday, what has school really prepared them for? How to argue with friends, or how to be sheltered?

Telling kids that school shelters them from the harsh world out there is misleading.

Teacher: “You need to do better on your homework. If this were a job and I were your boss, you’d never get away with this.”

Student: “Right, and since it’s not a job…”

What’s the chance this kid did a less-than-stellar job on homework because they’ve learned school is sheltered? Hey, if it’s all just rehearsal, why bother? If it’s only my teacher telling me off, and not a real boss, then who cares? Duly noted: someday it will matter. But not here in school, not today. If we constantly send the message that school doesn’t really matter because the real world is still out there, what will young people grow to understand from their time in school?

But if kids were told and shown that school matters, just like the rest of life, maybe they’d understand it more respectfully, value it more meaningfully. Adults like to say that life is about learning and that we should all be lifelong learners. Seems to me that would make school a place to practise learning. And if the focus in school were simply on learning, then all the rest – doing your homework, careers and jobs, “If the teacher were your boss…,” whichever-whatever details – would just be mere details, as in not really all that important.

If life is about learning – lifelong learning – then what possible sense does it make to shelter anybody from anything? Shouldn’t we just live, and learn all the way through? At school, learn at school. At work, learn at work. With family, learn with family, and so in every circumstance. When life happens, you learn about it, and now you know a little more than before. If learning is what matters, how about we learn how to learn? And then learn, in every possible situation. Life’s going to happen, anyway.

Featured Image Credit: Image by sunil kargwal from Pixabay

On Having Re-Read Garnet Angeconeb’s “Speaking My Truth: The Journey to Reconciliation”

Problematizer, philosopher, contrarian. Critic, cynic, shit-disturber. I think I’ve been a problematizer all my life, just not in every part of my life.

This post is not me trying to be an Opinion columnist. And it’s not me trying to be some do-gooder or, worse, some feel-gooder. This is from me about me, something I was prompted to write from having read about somebody else. It’s a step in a process. Read on, and take issue as you must or as you will.


As far back as I can remember, Canada has told me that we embrace diversity. Not always using that word, but still that message.

Our various governments and politicians, our cultural day-to-day. Our jobs, our coursework. Campus life, news coverage, social media, friends around a table… continually, there it is, Canada telling me, telling all of us, how diverse and tolerant we are. For me, the word “diversity” became Canada’s overwhelming self-portrayal soon after the 9/11 terror attacks. But I remember the call as far back as my Grade 1 class, an amenable time of life for anyone to be told such grand concepts.

We sat cross-legged around the front carpet as Mrs McCrae sounded out the word cosmopolitan on the chalkboard. This word, she told us, was what our Prime Minister, Pierre Trudeau, wanted for the country, for us: a mixture of people from all over the world, a good thing. That winter, 1977, Amanda’s Uncle Ron visited with footage of his trip through China – silent Super-8 images of crowded urban Shanghai, so drab and different from Vancouver, and the Great Wall, so mythic and unhurried. I remember we read Holling C. Holling’s marvellous book, Paddle-to-the-Sea, and sat wholly entranced not once but twice, a bunch of six-year-olds, watching Bill Mason’s “Paddle to the Sea.” I can still hear the pops and scratches of the spliced soundtrack churning past the projector bulb, can still smell the chilly school building, which stands as ever just a few blocks away.

And I can still see Mrs McCrae tapping out the syllables with her chalk that day on the front carpet. Whether she gripped the class, I can’t say, though I knew we were listening. I was listening. I remember thinking and thinking about what she told us, thinking intensely, and when Trudeau was re-elected a year or so later, his name and face sounded it out for me, again and again: “Cos-mo-pol-i-tan.” Canada, for me, has quietly been cosmopolitan all my life because, when I was six, that’s what I was told.

Childhood experiences are momentous… their measure smaller, more intense, less diluted than an adult’s. Two years before, at our sunny backyard farewell, my pre-school teacher, Mrs Needham, gave me a goodbye card, green construction paper with a cheerful yellow lion glued to the front. I sat in her arms when my turn came, and she read to me what she wrote: “Remember, Scott,” printed in big friendly letters, “to think before you act.” To say I never forgot, really, I always remembered. I remembered and did what she told me to do. I still have the card, bundled with the rest of my degrees and diplomas – the one that cost the least in time and money, perhaps worth the most of all.

Primary lessons, intangibles, engrained so early. They can resurface, borne of reflection, yet years might pass before we notice them staring back from the mirror. When we’re children, we take our cues from adults, as we learn to become adults ourselves. My Dad coached my soccer team a few seasons. One evening, he said, “Hands up if you can think.” We dutifully raised our hands, and he told us, “If you can think, you can play.” Some nodded, some laughed. In fact, his point was to single out one particular boy, whose name is lost although I see his face so clearly, as a way to build his confidence. It’s something I only learned when my Dad told me years later. I’ve been coaching, myself, nearly thirty years now, and I still use this one: if you can think, you can play. I like its appeal, that basis of something commonly shared.

By the time I was an adult, whenever that happened, lessons like these were so engrained it took me years to notice them – and these just some influences I remember. How many more have I forgotten? Fair enough. Memories, influences, lessons… intangibles are borne of reflection. What took longer still was noticing their effect, or better to say having their effect pointed out to me, despite their staring back my whole life whenever I looked in the mirror. These days, I’m obliged to notice that who and what I am affects the way people treat me. I just took longer to feel obliged, and to notice that I noticed. I was blind to my self, really. But eventually… and I could list off a bunch of traits. It’s just that listing them seems a bit petty. There’s nothing petty about how I’ve been treated most of my life on account of who and what I am.

Once, long ago, I felt compelled to correct a customer who took our shared resemblance as a chance to share his bigotry. It was finally my girlfriend’s photo that told him we disagreed. Another time it was my daughter’s photo although this person assured me her comments weren’t about my daughter, specifically, just other people. Last spring, on my daughter’s field trip, the guide gave me the instructions instead of her teacher, who stood next to him. He’d just greeted her, shaken her hand. At the hospital, some of the nurses and specialists – not all of them, just some – spoke to me as though my Dad weren’t even in the room, even while answering the lucid, specific questions that he himself was asking. At a restaurant, the smiling host walked right past our party planner and welcomed me with an outstretched hand, well over twenty feet away. I was holding the door open for our group to come inside.

Innocuous moments… hardly seem worth mentioning. No injuries, no threats. No danger. Yet also nothing trivial about somebody else reduced by a decision to single me out. Each time, I clarified. Told them different, set them straight. Other times, too, since then. And I only surmise what distinguished me as somehow different each time because no one ever actually said. On the other hand, I well understood what made me a punchline for howlie-jokes… party with the new in-laws, around the table, in the rec room, wherever it was. Once or twice I’d retort, but no no, they insisted, I was out of line. Just joking, they said. “All in fun, Shark Bait!” No offense, just some gentle family hazing.

Just… now there’s a versatile word. Maybe they were right. Maybe I just needed to learn how to smile. Years later, outside the Blue Mosque, made to feel especially white and tall and exceptional, I was more prepared, thanks to those (now ex-) jokester in-laws. Thanks, then, from an “out-law,” as that family liked to call us, we who married in. I’m better at it now, laughing and letting go. The stakes no longer feel so high. But I’ve had good fortune in life. Not that I mean to compare.

Where or why or how differences began encouraging us to neglect each other I don’t know. People warrant consideration. Dignity and respect for all is not a platitude. I can only speak for myself. Even in difficulty, my childhood was fortunate. I took a while to notice that who I am, what I am, affects the way I’m treated, or as I said before, I took a while to notice that I noticed. But I did. For years after, I kept to myself, thinking that nothing I had to offer could amount to what people knew better for themselves. Respect others by minding my own business.

I suppose I offered when it seemed constructive. But mine, like any life, was simply and entirely mine, so I actively sought to respect everyone else’s autonomy to go it alone, just like I was doing. And surely they did go it alone, didn’t they, since that’s what I was doing… I became expert at minding my own business, and you should have, too – it would have saved me years of road rage. Anyway, none of this was my decision to begin with, right? Tall, white, heterosexual, male, the petty list… offering an opinion about how to treat people, about anything, really, would not just look self-righteous, it would be self-righteous. Wouldn’t it? Respectful detachment, I thought. Respect people to be themselves. People are different, being themselves. Let them be. Respectful detachment.

Diversity. Diversity is Canada’s strength, after all. That’s what Prime Minister Justin Trudeau has been telling the world. He and I are the same age; I wonder if his teachers ever told him about being cosmopolitan. I figure his Dad must have done. Our greatest strength, says Trudeau the Lesser. Strong because of our differences. And, he warned, so easy to take for granted – that bit reminded me of when people say Canada is “historically a nation of Peacekeepers.” Not that the two messages relate, but just the presumption to think that every Canadian has known this all along. As for peacekeeping, I think the BNA Act preceded the UN by some 78 years, during which time Canada earned a reputation for fighting in both World Wars and dozens of armed conflicts prior to the 20th century. Although, granted, for much of that time, “Canada” wasn’t even a country yet. Are we squeamish over such history? p.s. in 2017, a Canadian sniper fired a record-setting rifle shot, keeping the peace from 3.54 km away. That sentence makes the sniper sound cowardly, but that’s not my intention. Irony is my intention, and Canadians are my target. So hey, why not just rewrite my sentence?

When the times are a-changin’, does that make us more informed, egalitarian, advanced, sophisticated, woke? I guess we’re just better now than we were back then. Or maybe we put ourselves at risk by misconstruing our memories and cherry-picking our history for judgments good and ill. I mean our judgments about history, by the way, not theirs when it wasn’t yet history. Big difference. Diverse, you might say. I’d say there’s a lot more to history than anyone’s record of it. Historical records are just the bits that someone decided to keep, out of what someone could remember, out of what people felt was worth remembering that way, for the reasons they wanted to remember it.

As surely as Canada has always been this nation of peacekeepers– p.p.s. if you’ve missed the irony, I’m saying we haven’t always been peacekeepers. I’m saying that we owe much to our military for their combat as much as their peacekeeping. I’m saying our trumpeting diversity is afforded time and space thanks to a lot of others who assertively provide for Canada’s stability. I’m saying the world is far more layered and interconnected and ironic in time and space than any lone one of us can account for in any one moment. Anyway, as surely as the doubt cast upon “always this nation of peacekeepers,” have we also always been that nation of immigrants? As in, are we not colonists and settlers? That change to history’s now a-comin’, too, judged and rewritten every time we utter the words.

So who’s responsible for Canada’s history? I mean all of it, not just the recorded bits. Which people, or culture, or government, exactly, should we say has slowly come around to initiate change… change that respects dignity and culture, change that redresses abusive power and obtuse ignorance and acknowledges history? Whose decisions were they that have caused incalculable suffering to entire nations of people over a few centuries? Where’s the change that leaves less room to agree with platitudes like diversity being Canada’s greatest strength? Weigh that against change that gets our goods moving again, and our traffic, and jobs and our economy. Both matter, or is this just long-term / short-term? Rights and recognition vs. stable day-to-day? Driving change or punting the ball? These are different motives because they have different outcomes. Our priorities drive our politics. I don’t know what the Prime Minister thinks before he acts. But politics is a tangled web, and I suspect he would welcome that benefit to my doubt. Well, it’s a free country, right? We all have opinions. Diversity is a strength except when it isn’t.

This is but one moment. I’ve been prompted to look back at myself. It’s the history I know best. I’m compelled to think before I act because that’s what my teacher told me to do. Luckily, it’s also a lesson that seems wisely conceived, if not always exercised, like when emotions run high. As for cosmopolitan mixture, life alongside others without need for comparison or category– With every passing day, I still benefit in ways I will not even realise, thanks to the historical currency borne by that list of traits issued from a lottery of birth. But, since then, also thanks to snipers in far-away places. Thanks to a lot of things, really, way more than I can explain or know about. Maybe it’s easiest just to say I’m not alone, in my history or my present day-to-day or, so it would seem, in my future. I am not alone, and neither are you. Here we are, with each other, all of us. And all my respectful detachment… amounted to nothing, or may as well have. That could be disappointing – a life’s effort – or, in fact, just perfectly Canadian, things in this country being so unresolved.

Something needs to happen, so I’ve begun by working with who and what’s been given to me. I encourage all to do the same.


Assessment as Analogy

Photo by Dave Mullen on Unsplash


We teachers like to talk about teacher stuff, like assessment and curriculum. But shop talk can leave everyone else yawning in the aisles, so sometimes I like to try using analogies. Analogies are great because they’re not exact, which can actually help shine more light on what you’re trying to understand.

Here’s an analogy: I walk into a theatre and sit down behind somebody. Seeing the back of his head, I now know what it looks like – his hairstyle, for instance, or the shape of his head. I have no idea what his face looks like since I cannot see that from behind. Something else I notice is his height – even sitting down, he’s obviously going to block my view of the screen and, since I’ve been waiting a long time to see this movie, I decide to change seats.

So I move ahead into his row. Now I sit almost beside this tall stranger, just a few seats away. Now, from the side, I can see the profile of his face – eyebrows, nose, chin. At one point, he turns to face me, looking out for his friend who went for popcorn, and I can fully take in his face. Now I know what he looks like from the front. Except it’s probably clearer to say, Now I’ve seen him from the front and remember what his face looks like – I make this distinction because it’s not like he turned to look for his friend, then stayed that way. He turned, briefly, then turned back toward the screen, facing forward again, and I’m left seeing his profile once more.

Sitting a few seats away, to say that I know what his profile looks like, side-on, or that I remember what his face looks like, just as I remember what the back of his head looks like – this is probably most accurate. In this theatre situation, nothing’s too hard to remember, anyway, since the whole experience only takes a minute or two and, besides, we’re sitting near enough to remind me of those other views, even though I can only see him from one of three perspectives at a time: either behind, or beside, or facing.

An analogy, remember? This one’s kind of dumb, I guess, but I think it gets the point across. The point I’m comparing is assessment, how we test for stuff we’ve learned.

As I understand the shift from traditional education (positivist knowledge-based curricula, teacher-led instruction, transactional testing) to what’s being called “the New Education” (constructivist student-centred curricula, self-directed students, transformational learning), I’d liken traditional testing to trying to remember what the back of his head looked like after I switched seats. As I say, I might get some clues from his profile while sitting beside him. But once I’m no longer actually sitting behind him, then all I can really do is remember. “What can you remember?” = assessment-of-learning (AoL)

In the New Education, I wouldn’t need to remember the back of his head because that’s probably not what I’d be asked. Where I sit, now, is beside him, so an assessment would account for where I now sit, beside him, not where I used to sit, behind. That makes the assessment task no longer about remembering but more in line with something immediate, something now as I sit beside him, seeing his profile, or during that moment when he turns and I fully see his face. A test might ask me to illustrate what I was thinking or how I was feeling right at that moment. “What are you thinking?” = assessment-for-learning (AfL)

There’s also assessment-as-learning (AaL), which could be me and my friend assessing each other’s reactions, say, as we both watch this tall stranger beside us. In the New Education, AaL is the most valued assessment of all because it places students into a pseudo-teaching role by getting them thinking about how and why assessment is helpful.

When proponents of the New Education talk about authentic learning and real-life problems, what I think they mean – by analogy – are those things staring us in the face. Making something meaningful of my current perspective doesn’t necessarily require me to remember something specific. I might well remember something, but that’s not the test. The New Education is all about now for the future.

In fact, both traditional education and the New Education favour a perspective that gives us some direction, heading into the future: traditional education is about the past perspective, what we remember from where we were then, while the New Education is about the present perspective, what we see now from where we are now. It’s a worthy side note that, traditional and contemporary alike, education is about perspective – where we are and where we focus.

By favouring the past, assessing what we remember, traditional education implies a continuation of the past into the future. Sure, it might pay lip service to the future, but that’s not as potent as what comes about from stressing remembrance of the past. Meanwhile, lying in between, the present is little more than a vehicle or conveyance for getting from back then to later on. You’re only “in the moment,” as it were, as you work to reach that next place. But this is ironic because, as we perceive living and life chronologically, we’re always in the moment, looking back to the past and ahead to the future. So it must seem like the future never arrives – pretty frustrating.

The New Education looks to the future, too, asking us to speculate or imagine from someplace we might later be. But, by favouring the present, assessing what we think and feel, and what we imagine might be, the New Education trades away the frustration of awaiting the future for a more satisfying “living in the moment.” We seem to live in a cultural era right now that really values living in the moment, living for now – whether that’s cause or effect of the New Education, I don’t know. In any case, says the New Education, the future is where we’re headed, and the present is how we’re getting there, so sit back and let’s enjoy the ride.

As it regards the past, the New Education seems to pose at least two different attitudes. First, the New Education seems to embrace the past if that past meets the criterion that it was oppressive and is now in need of restoration. Maybe this is a coincidental occurrence of cultural change and curricular change that happen to suit each other. Or maybe this is what comes of living in the moment, focusing on the here-and-now: we’re able to take stock, assess for the future, and identify things, which have long been one way, that now we feel compelled to change. Second, the New Education seems dismissive of the past. Maybe this is also because of that past oppression, or maybe it’s leftover ill will for traditional education, which is kind of the same thing. What often swings a pendulum is vilification.

Whatever it is, we ought to remember that dismissing the past dismisses our plurality – we are all always only from the past, being ever-present as we are. We can’t time-travel. We are inescapably constrained by the past from the instant we’re born. What has happened is unalterable. The future arrives, and we take it each moment by moment. To dismiss the past is delusory because the past did happen – we exist as living proof.

For all its fondness and all its regret, the past is as undeniable as the future is unavoidable, for all its expectancy and all its anxiety. As we occupy the place we are, here, with the perspective it affords us, now, we need the courage to face the future along with the discipline to contextualise the past. As we live in the moment, we are bound and beholden to all three perspectives – past, present, future. Incidentally, that happens to be where my analogy broke down. In a theatre, we can only sit in one seat at a time. Let’s count our blessings that living and learning offer so much more.

What it Was is What it Is: I Don’t Know What Else to Say

At the same school, in the same department, for so long… eventually I found what seemed to be some effective teaching strategies and stuck with those, but boy it took a while. There’s been more than one teacher to have offered something like an apology, half-joking, half-rueful, to all those early students, who were basically guinea pigs while we figured ourselves out in the classroom. I mentioned this in a paper for a graduate course and earned the critique of “triumphalism” – feedback from the professor, which I took as a suggestion to go ahead and “problematize my assumptions,” to use the lingo. In the moment, I bristled, the new kid in town learning how to be part of the academy, wondering what exactly had prompted my professor to claim with such certainty the question of my certainty.

Maybe I’ll just mention, since I’ve brought it up… I’ve since found the academy has an endemic logical pitfall all its own, an oddly hypocritical veneer of uncertainty: “All knowledge is provisional.” Post-modernism at its finest? Indeed, who can really say.

In all seriousness, though, and fairness, I grant the aim of the sceptical outlook. Heck, I try to possess one – healthy scepticism, to guard against arrogance and narrow thinking (… and innovation too, come to think of it, although that one maybe for another time). I value Socratic humility, which I ultimately decided not to call Socratic ignorance, and try to model it although how successfully I can’t say – especially not *joking-slash-rueful* back in those early days. So when someone with expertise in curriculum and teaching theory laid triumphalism at my feet, I thought to myself, Well, at least I ought to consider it. And I did.

And I do, and I still am. That reflective side of critique, the side you get from being on the receiving end, it can help us spot our assumptions and our shortcomings. I suspect the whole point was simply to light a fire within me. And hey, I’ve gone and written this, haven’t I? And hey, if settling into some effective teaching strategies weren’t triumphalist and undesirable, that would probably encourage complacency among teachers, or possibly even stagnation. On the other hand, after so long teaching in the same department at the same school, I suspect there’s more than one teacher who’s ended up feeling like part of the woodwork. Certainly, for me, as I’m sure for the students, there was a marked difference between me, the new guy, and me five, ten, fifteen years later. Then again–

Looking back, now, at what I called “effective”… it rounds out as, well, effective because what happened happened that way – nothing’s perfect, but all considered, my students seemed broadly to have learned what they felt were some useful things. The classroom years I spent, developing as I did to reach the point I reached, came about from the feedback I received each day, each term, as students and I came together lesson upon lesson, class after class. Details along the way, course evaluations I asked students to complete each June, reports back from post-secondary adventuring… there are always issues to address along with encouragements to appreciate, and I admit: no grand theory did I have in mind, as though I were contributing to the historical record. I just wanted to make things better for kids the following year, which eventually I think I was able to do.

Where I gave thought to improving my teaching was (a) relative to myself, (b) on behalf of my students, (c) in the context of my school. At least, that was what I thought when I was teaching. In that respect, what can I possibly say now, looking back, as to what might have been apart from what did be? I had to do something. And my life was never going to be any less full or busy or complicated than it turned out to be, so in all sincerity I did what I could. Eventually, it seemed to work out pretty well. Effectively.

Look, if somebody did celebrate triumphantly, in the classroom, facing the students, day in day out… ? What an ass! As it was, for those students who did find my teaching effective in this way or that, or worse, for those who didn’t – did I leave them with some suggestion that I basked in triumphant glow? I hope not. Like I said, I eventually found and stuck with what I thought worked, and that took years. Meanwhile, that’s the job. Isn’t it?

For me, the professor’s criticism, in whatever light it was offered, reflects more upon her embrace of uncertainty (presumably the academic embrace I described above) than it does upon my curricular relationships when I was teaching. And I heed the lesson, not for the first time in my career, that sitting in judgment of others can be a difficult perch.

Teaching’s Other Greatest Reward

“Texts are not the curriculum,” I was told during Pro-D by an administrator, the Director of Curriculum and Innovation. The session had been arranged to introduce a revised K–12 curriculum and was billed as a great unfolding at the onset of the 21st century. “Texts are a resource for implementing lessons and practising skills,” she concluded. By this, I took her to mean that notation, for example, is a resource for students to finger piano keys or pluck guitar strings, which is something music teachers might accept. I took her to mean that landscape is fodder for brushstrokes and blending, something art teachers might accept. I took her to mean that a poet’s intimate, inspired reveries, shared in careful verse, are raw material for students who are learning to analyse and write, which I grant English teachers might accept. I took her to mean that I should consider her remark a resource and that this issue was now settled, which some teachers in earshot seemed to accept. To this day, I wonder whether a musician, or a painter, or a poet might accept her remark, but in that moment, I let it go.

I suppose I should be more forthcoming: I used to joke with parents, on Meet the Teacher Night, that I could be teaching my coursework just as well using texts like Curious George and a recipe book. That I decided to use Shakespeare, or Sandra Cisneros, or Thomas King, and that I would in fact be asking students literally to stare out the window as part of a textual analysis exercise—all just as arbitrary—illustrated the point: I built my course around some particular themes that reflected me and what I believed important about life. This, in turn, was meant to illustrate to students, and now parents, how bias plays a noteworthy if subtly influential role in our lives and our learning.

My larger points were twofold: firstly, no, texts are not the curriculum per se and, secondly, our Department’s approach to English Language Arts (ELA) focused more on skill development, less on content consumption. For us, anyway, the revised curriculum was reaffirming. What I merely assumed in all this—and presumed that parents assumed it, too—was that our Department’s approach was commensurate with the school’s expectations, and the Ministry’s, as well as with our province’s educational history and the general ELA approach found in classrooms across North America, for which I had some, albeit minimal, evidence to make the claim. As a secondary ELA teacher, I chose my texts on the basis that they helped expedite my curricular responsibilities. I suppose it would be fair to say that, for me, texts were a resource for implementing lessons and practising skills.

What was it, then, that niggled me about the Director’s comment at the Pro-D session? Did it have to do with decision-making, as in who gets to decide what to teach, and how, and why? Would that make it about autonomy, some territorial drawing of lines in professional sand? Was it more my own personal confrontation, realising that musicians and painters and poets deserve better than to be considered lesson fodder? I had never approached my lessons so clinically or instrumentally before—had I? Maybe I was having my attention drawn into really considering curriculum, taking the time to puzzle out what that word means, and implies, and represents. And if I never really had puzzled it out, what kind of experience was I creating for my students? I’ve always felt that I have done right by my students, but even so… how much better still might have been done?

Months later, I sat at a table doing prep work next to a colleague, and a third sat down to join us. Eventually, as the conversation turned from incidents to editorials, the third teacher spread her hands wide and concluded, “But ultimately education is all about relationships.” In the next split-second moment, I was confronted by the entirety of my teaching philosophy, nearly a clarion call except I had nowhere to stand and run, so I just remained in my seat, quietly agreeing and chuckling at the truth of it all. We all did. That was my final year before returning as a student to a doctoral program. These days, I search and select texts to read so I can write texts of my own about particular themes that reflect me and what I believe important about curriculum, and teaching, and education.

I should say I no longer wonder why the Director’s remark that day, about texts, didn’t set me to thinking about curriculum the way my colleagues did, sitting and chatting around that table.

The Conceit of A.I.

From a technological perspective, I can offer a lay opinion of A.I. But check out some more technical opinions than mine, too:

MIT: The Seven Deadly Sins

Edge: The Myth of AI

The Guardian: The Discourse is Unhinged

NYT: John Markoff

Futurism: You Have No Idea…

IEET: Is AI a Myth?

Open Mind: Provably Beneficial Artificial Intelligence

Medium: A Critical Reading List

AdWeek: Burger King


Time and energy… the one infinite, the other hardly so. The one an abstraction, the other all too real. But while time ticks ceaselessly onward, energy forever needs replenishing. We assign arbitrary limits to time, by calendar, by clock, and as the saying goes, there’s only so much time in a day. Energy, too, we can measure, yet often we equate both time and energy monetarily, if not by actual dollars and cents: we can pay attention, spend a day at the beach, save energy – the less you burn, the more you earn! And certainly, as with money, most people would agree that we just never seem to have enough time or energy.

Another way to frame time and energy is as an investment. We might invest our time and energy learning to be literate, or proficient with various tools, or with some device that requires skilful application. Everything, from a keyboard or a forklift or a tennis racquet to a paring knife or an elevator or a golf club to a cell phone or a self-serve kiosk or the new TV remote, everything takes some knowledge and practice. By that measure, there are all kinds of literacies – we might even say, one of every kind. But no matter what it is, or how long it takes to master, or why we’d even bother, we shall reap what we sow, which is an investment analogy I bet nobody expected.

Technology returns efficiency. In fact, like nothing else, it excels at creating surplus time and energy, enabling us to devote ourselves to other things and improve whichever so-called literacies we choose. The corollary, of course, is that some literacies fade as technology advances. Does this matter, with so many diverse interests and only so much time and energy to invest? How many of us even try everything we encounter, much less master it? Besides, for every technological advancement we face, a whole new batch of things must now be learned. So, for all that technological advancement aids our learning and creates surplus time and energy, we as learners remain the central determinant as to how to use our time and energy.

Enter, into the classroom, what’s lately been called Artificial Intelligence (A.I.). Of course, A.I. has received plenty of enthusiastic attention, concern, and critique as a developing technological tool, for learning as well as plenty of other endeavours and industries. A lengthy consideration from The New York Times offers a useful, broad overview of A.I.: a kind of sophisticated computer programming that collates, provides, and predicts information in real time. Silicon Valley designers aim to have A.I. work at least somewhat independently of its users, so they have stepped away from older, familiar input-output modes, what’s called symbolic A.I., a “top down” approach that demands tediously lengthy entry of preparatory rules and data. Instead, they are engineering “from the ground up,” building inside the computer a neural network that mimics a brain – albeit a very small one, rivalling a mouse’s – that can teach itself via trial-and-error to detect and assess patterns found in the data that its computer receives. At these highest echelons, the advancement of A.I. is awe-inspiring.
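
If a concrete picture helps, here’s a toy sketch of that trial-and-error idea in Python – my own illustration, I should stress, not anything from the Times piece, and a far cry from Silicon Valley’s mouse-brain. A few dozen numbers get nudged, over and over, against the error in their own guesses, until the program picks out a pattern (here, XOR – true when exactly one input is on) that nobody typed in as a rule:

    # A toy "ground-up" learner: a tiny neural network that teaches itself
    # the XOR pattern by trial and error, rather than by rules entered up front.
    # (An illustrative sketch only, not any particular classroom product.)
    import numpy as np

    rng = np.random.default_rng(1)

    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # the data it receives
    y = np.array([[0.], [1.], [1.], [0.]])                  # the pattern to detect

    # Two layers of weights and biases, started as random guesses.
    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(10000):
        h = sigmoid(X @ W1 + b1)           # guess...
        out = sigmoid(h @ W2 + b2)
        err = out - y                      # ...measure the error...
        g_out = err * out * (1.0 - out)    # ...then nudge every weight a little
        g_h = (g_out @ W2.T) * h * (1.0 - h)
        W2 -= h.T @ g_out
        b2 -= g_out.sum(axis=0)
        W1 -= X.T @ g_h
        b1 -= g_h.sum(axis=0)

    print(np.round(out, 2))  # should settle near [[0], [1], [1], [0]]

No intelligence required, you might say: just arithmetic, repeated patiently. Scale those few dozen numbers into millions, though, and the same principle underwrites the awe-inspiring stuff.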

Now for the polemic.

In the field of education, where I’m trained and most familiar, nothing about A.I. is nearly so clear. Typically, I’ve found classroom A.I. described cursorily, by function or task:

  • A.I. facilitates individualized learning
  • A.I. furnishes helpful feedback
  • A.I. monitors student progress
  • A.I. highlights possible areas of concern
  • A.I. lightens the marking load

On it goes… A.I., the panacea. Okay, then, so in a classroom, how should we picture what is meant by “A.I.”?

“Anybody remember Mr. Dukane?”

Specific examples of classroom A.I. are hard to come by, beyond top ten lists and other generalized descriptions. I remember those library film-strip projectors we used in Grade 1, with the tape decks attached. Pressing “Play,” “Stop,” and “Eject” was easy enough for my six-year-old fingers, thanks to engineers who designed the machines and producers who made the film strips, even if, the odd time, the librarian had to load them for us. (At home, in a similar vein, how many parents ruefully if necessarily consider the T.V. a “babysitter” – although, granted, these days it’s probably an iPad? But personification does not make for intelligence… does it? Didn’t we all understand that Max Headroom was just a cartoon?) There’s a trivia game app with the hand-held clickers, and there’s an on-line plagiarism detector – both, apparently, are A.I. For years, I had a Smart Board although I think that kind of branding is just so much capitalism, and harshly cynical at that. Next to the Smart Board was a whiteboard, and I used to wonder if, someday, they’d develop some windshield wiper thing to clean it. I even wondered if someday I wouldn’t use it anymore. For the record, I like whiteboards. I use them, happily, all the time.

Look, I can appreciate this “ground-up” concept as it applies to e-machines. (I taught English for sixteen years, so metaphor’s my thing.) But intelligence? Anyway, there seems no clear definition of classroom A.I., and far from seeming intelligent to me, none of what’s out there even seems particularly dim-witted so much as pre-programmed. As far as I can tell, so-called classroom A.I. is stuff that’s been with us all along, no different these days than any tool we already know and use. So how is “classroom A.I.” A. I. of any kind, symbolic or otherwise?

“Hey, so who’s the Sub today?”

Symbolic A.I., at least the basis of it, seems not too dissimilar to what I remember about computers and even some video arcade favourites from back in the day. Granted, integrated circuits and micro-processors are a tad smaller and faster these days compared to, say, 1982 (… technology benefitting from its own surplus?). Perhaps more germane to this issue is the learning curve, the literacy, demanded of something “intelligent.” Apparently, a robot vacuum learns the room that it cleans, which as I gather is the “ground-up” kind of A.I., not the symbolic kind. Now, for all the respect and awe I can muster for a vacuum cleaner—and setting all “ground-up” puns aside—I still expect slightly less from this robot than passing the written analysis section of the final exam. (I taught English for sixteen years, so written analysis is my thing.) It seems to me that a given tool can be no more effective than its engineering and usage, and for that, isn’t A.I.’s “intelligence” more indicative of its creator’s ingenuity or its user’s aptitude than of itself or its pre-programmed attributes?

Press Any Key to Begin

By the same token, could proponents of classroom A.I. maybe just ease off a bit from their retcon appropriation of language? I appreciate getting caught up in the excitement, the hype—I mean, it’s 21st century mania out there, candy floss and roller coasters—but that doesn’t mean you can just go about proclaiming things as “A.I.” or, worse, proclaiming A.I. to be some burgeoning technological wonder of classrooms nationwide when… it’s really not. Current classroom A.I. is simply every device that has always already existed in classrooms for decades—that could include living breathing teachers, if the list of functions above is any guide. Okay then, hey! just for fun: if classroom tools can include teachers who live and breathe, by the same turn let’s be more inclusive and call A.I. a “substitute teacher.”

Another similarly common tendency I’ve noted in descriptions of classroom A.I. is to use words like “data,” “algorithm,” and “training” as anthropomorphic proxy for experience, decision-making, and judgment, i.e. for learning. Such connotations are applied as simply as we might borrow a shirt from our sibling’s closet, as liberally as we might shake salt on fries, and they appeal to the like-minded, who share the same excitement. To my mind, judicious intelligence is never so cavalier, and it doesn’t take much horse-sense to know that too much salt is bad for you, or that your sibling might be pissed off after they find their shirt missing. As for actually manufacturing some kind of machine-based intelligence, well… it sure is easy to name something “Artificial Intelligence,” much less bestow “intelligence” by simply declaring it! The kind of help I had back in the day, as I see it, was something I just now decided to call “S.I.”: sentient intelligence.

Facetiousness aside, I grant probably every teacher has spent some time flying on auto-pilot, and I’ve definitely had days that left me feeling like an android. And fair enough: something new shakes things up and may require some basic literacy. There’s no proper use of any tool, device, or interface without some learned practical foundation: pencil and paper, protractor, chalk slates, the abacus. How about books, or by ultimate extension, written language, itself? These are all teaching tools, and each has a learning curve. So is A.I. a tool, a device, an interface? All of the above? I draw the line when it comes to classroom tools that don’t coach the basketball team or have kids of their own to pick up by 5pm: the moniker, “A.I.,” seems more than a bit generous. And hey, one more thing, on that note: wouldn’t a truer account of A.I., the tool, honour its overt yet seemingly ignored tag, “artificial”? R2-D2 and C-3PO may be the droids we’re looking for, but they’re still just science fiction.

Fantastic tales aside, technological advancements in what is called the field of A.I. have yielded and will continue to yield useful, efficient innovation. And now I mean real Silicon Valley A.I., not retcon classroom A.I. But even so, to what ends? What specifically is this-or-that A.I. for? In a word: why? We’re headed down an ontological road, and even though people can’t agree on whether we can truly consider our own self, we’re proceeding with A.I. in the eventual belief that it can. “It will,” some say. Not likely, I suspect. Not ever. But even if I’m wrong, why would anyone hope that A.I. could think for itself?

10 BE “A.I.”   20 GOTO 10   RUN

Hasn’t Heidegger presented us with enough of a challenge, as it is? Speaking of time and energy, let’s talk opportunity costs. Far greater minds than mine have lamented our ominous embrace of technology. Isn’t the time and energy spent on A.I.—every second, every joule of it—a slap in the face to our young people and to the investment that could have been made in them? It’s ironic that we teach them to develop the very technology that will eventually wash them away.

Except that it won’t. I may be out on a limb to say so, but I suspect we will sooner fall prey to the Twitterverse and screen-worship than A.I. will fulfil some sentient Rise of the Machines. The Borg make good villains, and even as I watch a lobby full of Senior Band students in Italy, staring at their iPhones, and fear assimilation and, yes, worry for humanity… I reconsider because the Borg are still just a metaphor (… sixteen years, remember?). Anyway, as a teacher I am more driven to reach my students with my own message than I am to snatch that blasted iPhone from their hands, much as I might like to. On the other hand, faced with a dystopian onslaught of Replicants, Westworld Gunslingers, and Decepticons, would we not find ourselves merely quivering under the bed, frantically reading up on Isaac Asimov while awaiting the arrival of Iron Man? Even Luke Skywalker proved susceptible to the Dark Side’s tempting allure of Mechanized Humanity; what possible response could we expect from a mere IB cohort of inquiry-based Grade 12 critical thinkers and problem-solvers?

“Resistance is futile.”

At the very least, any interruption of learners by teachers with some classroom tool ought to be (i) preceded by a primer on its literacy, i.e. explaining how to use that particular tool in (ii) a meaningful context or future setting, i.e. explaining why to use that particular tool, before anybody (iii) begins rehearsing and/or mastering that particular tool, i.e. successfully executing whatever it does. If technology helps create surplus time and energy, then how and why and what had better be considered because we only have so much time and energy at our disposal. The what, the how, and the why are hardly new concepts, but they aren’t always fully considered or appreciated either. They are, however, a means of helpful focusing that few lessons should be without.

As a teacher, sure, I tend to think about the future. But that means spending time and paying attention to what we’re up to, here and now, in the present. To that end, I have an interest in protecting words like “learning” and “intelligence” from ambiguity and overuse. For all the 21st century hearts thumping over the Cinderella-transformation of ENIAC programmable computation to A.I., and the I.o.T., and whatever lies beyond, our meagre acknowledgement of the ugly step-sister, artificiality, is foreboding. Mimicry is inauthentic, but it is not therefore without consequence. Let’s take care that the tools we create as means don’t replace the ends we originally had in mind because if any one human trait can match the trumpeting of technology’s sky-high potential—for me at least, not sure for you—I’d say it’s hubris.

Another fantastic tale comes to mind: Frankenstein’s monster. Technological advancement can be as wonderful as it is horrifying, though probably usually somewhere in between. However it’s characterised or defined by those who create it, it will be realised in the end by those who use it, if not those who face it. For most people, the concept of cell phones in 1982 was hardly imagined. Four decades later, faces down and thumbs rapid-fire, the ubiquity of cell phones is hardly noticed.

I May Be Wrong About This, But…

Before introducing the moral pairing of right and wrong to my students, I actually began with selfish and selfless because I believe morality has a subjective element, even in the context of religion, where we tend to decide for ourselves whether or not we believe or subscribe to a faith.

As I propose them, selfish and selfless are literal, more tangible, even quantifiable: there’s me, and there’s not me. For this reason, I conversely used right and wrong to discuss thinking and bias. For instance, we often discussed Hamlet’s invocation of thinking: “… there is nothing either good or bad, but thinking makes it so” (II, ii, 249-250). Good and bad, good and evil, right and wrong… while not exactly synonymous, these different pairings do play in the same ballpark. Still, as I often said to my students about synonyms, “If they meant the same thing, we’d use the same word.” So leaving good and bad to the pet dog, and good and evil to fairy tales, I presently consider the pairing of right and wrong, by which I mean morality, as a means to reconcile Hamlet’s declaration about thinking as some kind of moral authority.

My own thinking is that we have an innate sense of right and wrong, deriving in part from empathy, our capacity to stand in someone else’s shoes and identify with that perspective – look no further than storytelling itself. Being intrinsic and relative to others, empathy suggests an emotional response and opens the door to compassion, what we sometimes call the Golden Rule. Compassion, for Martha Nussbaum, is that means of “[hooking] our imaginations to the good of others… an invaluable way of extending our ethical awareness” (pp. 13-14). Of course, the better the storytelling, the sharper the hook, and the more we can relate; with more to go on, our capacity for empathy, i.e. our compassion, rises. Does that mean we actually will care more? Who knows! But I think the more we care about others, the more we tend to agree with them about life and living. If all this is so, broadly speaking, if our measure for right derives from empathy, then perhaps one measure for what is right is compassion.

And if we don’t care, or care less? After all, empathy’s no guarantee. We might just as reasonably expect to face from other people continued self-interest, deriving from “the more intense and ambivalent emotions of… personal life” (p. 14). Emotions have “history,” Nussbaum decides (p. 175), which we remember in our day-to-day encounters. They are, in general, multifaceted, neither a “special saintly distillation” of positive nor some “dark and selfish” litany of negative, to use the words of Robert Solomon (p. 4). In fact, Solomon claims that we’re not naturally selfish to begin with, and although I disagree with that, on its face, I might accept it with qualification: our relationships can supersede our selfishness when we decide to prioritise them. So if we accept that right and wrong are sensed not just individually but collectively, we might even anticipate where one could compel another to agree. Alongside compassion, then, to help measure right, perhaps coercion can help us to measure wrong: yes, we may care about other people, but if we care for some reason, maybe that’s why we agree with them, or assist them, or whatever. Yet maybe we’re just out to gain for ourselves. Whatever our motive, we treat other people accordingly, and it all gets variously deemed “right” or “wrong.”

I’m not suggesting morality is limited solely to the workings of compassion and coercion, but since I limited this discussion to right and wrong, I hope it’s helping illuminate why I had students begin first with what is selfish and selfless. That matters get “variously deemed,” as I’ve just put it, suggests that people seldom see any-and-all things so morally black and white as to conclude, “That is definitely wrong, and this is obviously right.” Sometimes, of course, but not all people always for all things. And everybody having an opinion – mine being mine, yours being yours, as the case may be – is neither here nor there to the fact that every body has one. On some things, we’ll agree while, on some things, we won’t.

At issue is the degree that I’m (un)able to make personal decisions about right and wrong, the degree that I might feel conspicuous, perhaps uneasy, even cornered or fearful – and wrong – as compared to feeling assured, supported, or proud, even sanctimonious – and right. Standing alone from the crowd can be, well… lonely. What’s more, having some innate sense of right and wrong doesn’t necessarily help me act, not if I feel alone, particularly not if I feel exposed. At that point, whether from peer pressure or social custom peering over my shoulder, the moral question about right and wrong can lapse into an ethical dilemma, the moral spectacle of my right confronted by some other right: would I steal a loaf of bread to feed my starving family? For me, morality is mediated (although not necessarily defined, as Hamlet suggests) by where one stands at that moment, by perspective, in which I include experience, education, relationships, and whatever values and beliefs one brings to the decisive moment. I’m implying what amounts to conscience as a personal measure for morality, but there’s that one more consideration that keeps intervening: community. Other people. Besides selfish me, everybody else. Selfless not me.

Since we stand so often as members of communities, we inevitably derive some values and beliefs from those pre-eminent opinions and long-standing traditions that comprise them. Yet I hardly mean to suggest that a shared culture of community is uniform – again, few matters are so black or white. Despite all that might be commonly held, individual beliefs comprising shared culture, if anything, are likely heterogeneous: it’s the proverbial family dinner table on election night. Even “shared” doesn’t rule out some differentiation. Conceivably, there could be as many opinions as people possessing them. What we understand as conscience, then, isn’t limited to what “I believe” because it still may not be so easy to disregard how-many-other opinions and traditions. Hence the need for discussion – to listen, and think – for mutual understanding, in order to determine right from wrong. Morality, in that sense, is concerted self-awareness plus empathy, the realised outcome of combined inner and outer influences, as we actively and intuitively adopt measures that compare how much we care about the things we face every day.

Say we encounter someone enduring loss or pain. We still might conceivably halt our sympathies before falling too deeply into them: Don’t get too involved, you might tell yourself, you’ve got plenty of your own to deal with. Maybe cold reason deserves a reputation for callusing our decision-making, but evidently, empathy does not preclude our capacity to reason with self. On the other hand, as inconsistent as it might seem, one could not function or decide much of anything, individually, without empathy because, without it, we would have no measure. As we seem able to reason past our own feelings, we also wrestle echoing pangs of conscience that tug from the other side, which sometimes we call compassion or, other times, a guilt trip. Whatever to call it, clearly we hardly live like hermits, devoid of human contact and its resultant emotions. Right and wrong, in that respect, are socially as well as individually determined.

One more example… there’s this argument that we’re desensitized by movies, video games, the TV news cycle, and so forth. For how-many-people, news coverage of a war-torn city warrants hardly more than the glance at the weather report that follows. In fact, for how-many-people, the weather matters more. Does this detachment arise from watching things once-removed, two-dimensionally, on a viewscreen? Surely, attitudes would be different if, instead of rain, it were shells and bombs falling on our heads from above. Is it no surprise, then, as easily as we’re shocked or distressed by the immediacy of witnessing a car accident on the way to our favourite restaurant, that fifteen minutes later we might conceivably feel more annoyed that there’s no parking? Or that, fifteen minutes later again, engrossed by a menu of appetizers and entrees and desserts, we’re exasperated because they’re out of fresh calamari. Are right and wrong more individually than socially determined? Have we just become adept at prioritising them, even diverting them, by whatever is immediately critical to individual well-being? That victim of the car accident isn’t nearly as worried about missing their dinner reservation.

Somewhat aside from all this, but not really… I partially accept the idea that we can’t control what happens, we can only control our response. By “partially” I mean that, given time, yes, we learn to reflect, plan, act, and keep calm, carrying on like the greatest of t-shirts. After a while, we grow more accustomed to challenges and learn to cope. But sometimes what we encounter is so sudden, or unexpected, or shocking that we can’t contain a visceral response, no matter how accustomed or disciplined we may be. However, there is a way to take Hamlet’s remark about “thinking” that upends this entire meditation, as if to say our reaction was predisposed, even premeditated, like having a crystal ball that foresees the upcoming shock. Then we could prepare ourselves, rationalise, and control not what happens but our response to it while simply awaiting the playing-out of events.

Is Solomon wise to claim that we aren’t essentially or naturally selfish? Maybe he just travelled in kinder, gentler circles – certainly, he was greatly admired. Alas, though, poor Hamlet… troubled by jealousy, troubled by conscience, troubled by ignorance or by knowledge, troubled by anger and death. Troubled by love and honesty, troubled by trust. Troubled by religion, philosophy, troubled by existence itself. Is there a more selfish character in literature? He’s definitely more selfish than me! Or maybe… maybe Hamlet’s right, after all, and it really is all just how you look at things: good or bad, it’s really just a state of mind. For my part, I just can’t shake the sense that Solomon’s wrong about our innate selfishness, and for that, I guess I’m my own best example. So, for being unable to accept his claim, well, I guess that one’s on me.