The Conceit of A.I.


 

On the technology itself, I can offer only a lay opinion of A.I. So check out some more technical opinions than mine, too:

MIT: The Seven Deadly Sins

Edge: The Myth of AI

The Guardian: The Discourse is Unhinged

NYT: John Markoff

Futurism: You Have No Idea…

IEET: Is AI a Myth?

Medium: A Critical Reading List

AdWeek: Burger King

 



Time and energy… the one infinite, the other hardly so. The one an abstraction, the other all too real. But while time ticks ceaselessly onward, energy forever needs replenishing. We assign arbitrary limits to time, by calendar, by clock, and as the saying goes, there’s only so much time in a day. Energy, too, we can measure, yet often we equate both time and energy monetarily, if not by actual dollars and cents: we can pay attention, spend a day at the beach, save energy – the less you burn, the more you earn! And certainly, as with money, most people would agree that we just never seem to have enough time or energy.

Another way to frame time and energy is as an investment. We might invest our time and energy learning to be literate, or proficient with various tools, or with some device that requires skilful application. Everything, from a keyboard or a forklift or a tennis racquet to a paring knife or an elevator or a golf club to a cell phone or a self-serve kiosk or the new TV remote, everything takes some knowledge and practice. By that measure, there are all kinds of literacies – we might even say, one of every kind. But no matter what it is, or how long it takes to master, or why we’d even bother, we shall reap what we sow, which is an investment analogy I bet nobody expected.

Technology returns efficiency. In fact, like nothing else, it excels at creating surplus time and energy, enabling us to devote ourselves to other things and improve whichever so-called literacies we choose. The corollary, of course, is that some literacies fade as technology advances. Does this matter, with so many diverse interests and only so much time and energy to invest? How many of us even try everything we encounter, much less master it? Besides, for every technological advancement we face, a whole new batch of things must now be learned. So, for all that technological advancement aids our learning and creates surplus time and energy, we as learners remain the central determinant of how we use our time and energy.

Enter the classroom: what’s lately been called Artificial Intelligence (A.I.). Of course, A.I. has received plenty of enthusiastic attention, concern, and critique as a developing technological tool, for learning as well as for plenty of other endeavours and industries. A lengthy consideration from The New York Times offers a useful, broad overview of A.I.: a kind of sophisticated computer programming that collates, provides, and predicts information in real time. Silicon Valley designers aim to have A.I. work at least somewhat independently of its users, so they have stepped away from older, familiar input-output modes, what’s called symbolic A.I., a “top down” approach that demands tediously lengthy entry of preparatory rules and data. Instead, they are engineering “from the ground up,” building inside the computer a neural network that mimics a brain – albeit a very small one, rivalling a mouse’s – that can teach itself via trial-and-error to detect and assess patterns found in the data that its computer receives. At these highest echelons, the advancement of A.I. is awe-inspiring.
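For readers who like to see the gears, that “ground-up,” trial-and-error idea can be caricatured in a few lines of code. What follows is a minimal sketch of my own devising – all the names are invented for illustration, not anyone’s actual product: a single artificial “neuron” that starts from random weights and nudges them whenever it guesses wrong, until it reliably detects a simple pattern (here, whether both inputs in a pair are 1).

```python
import random

# A toy version of trial-and-error learning: one artificial neuron
# that adjusts its internal weights every time it guesses wrong,
# until it detects a simple pattern in its data.

def train_neuron(examples, epochs=50, lr=0.1, seed=0):
    rng = random.Random(seed)
    w1, w2, b = rng.random(), rng.random(), rng.random()  # random start
    for _ in range(epochs):
        for (x1, x2), target in examples:
            guess = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            error = target - guess       # the "trial"...
            w1 += lr * error * x1        # ...and the error-driven
            w2 += lr * error * x2        #    correction
            b += lr * error
    return w1, w2, b

# The pattern to detect: output 1 only when both inputs are 1.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_neuron(examples)
for (x1, x2), target in examples:
    guess = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
    print((x1, x2), "->", guess)
```

Scale this from one neuron to millions wired together and you have, in caricature, the neural networks those designers describe – still nothing here, note, but arithmetic and correction.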

Now for the polemic.

In the field of education, where I’m trained and most familiar, nothing about A.I. is nearly so clear. Typically, I’ve found classroom A.I. described cursorily, by function or task:

  • A.I. facilitates individualized learning
  • A.I. furnishes helpful feedback
  • A.I. monitors student progress
  • A.I. highlights possible areas of concern
  • A.I. lightens the marking load

On it goes… A.I., the panacea. Okay, then, so in a classroom, how should we picture what is meant by “A.I.”?

Mr. Dukane
“Anybody remember Mr. Dukane?”

Specific examples of classroom A.I. are hard to come by, beyond top ten lists and other generalized descriptions. I remember those library film-strip projectors we used in Grade 1, with the tape decks attached. Pressing “Play,” “Stop,” and “Eject” was easy enough for my six-year-old fingers, thanks to engineers, who designed the machines, and producers, who made the film strips, even if the odd time the librarian had to load them for us. (At home, in a similar vein, how many parents ruefully, if necessarily, consider the T.V. a “babysitter” – although, granted, these days it’s probably an iPad. But personification does not make for intelligence… does it? Didn’t we all understand that Max Headroom was just a cartoon?) There’s a trivia game app with the hand-held clickers, and there’s an on-line plagiarism detector – both, apparently, are A.I. For years, I had a Smart Board, although I think that kind of branding is just so much capitalism, and harshly cynical at that. Next to the Smart Board was a whiteboard, and I used to wonder if, someday, they’d develop some windshield wiper thing to clean it. I even wondered if someday I wouldn’t use it anymore. For the record, I like whiteboards. I use them, happily, all the time.

Look, I can appreciate this “ground-up” concept as it applies to e-machines. (I taught English for sixteen years, so metaphor’s my thing.) But intelligence? Anyway, there seems no clear definition of classroom A.I., and far from seeming intelligent to me, none of what’s out there even seems particularly dim-witted so much as pre-programmed. As far as I can tell, so-called classroom A.I. is stuff that’s been with us all along, no different these days than any tool we already know and use. So how is “classroom A.I.” A. I. of any kind, symbolic or otherwise?

“Hey, so who’s the Sub today?”

Symbolic A.I., at least the basis of it, seems not too dissimilar to what I remember about computers and even some video arcade favourites from back in the day. Granted, integrated circuits and micro-processors are a tad smaller and faster these days compared to, say, 1982 (… technology benefitting from its own surplus?). Perhaps more germane to this issue is the learning curve, the literacy, demanded of something “intelligent.” Apparently, a robot vacuum learns the room that it cleans, which as I gather is the “ground-up” kind of A.I., not the symbolic kind. Now, for all the respect and awe I can muster for a vacuum cleaner—and setting all “ground-up” puns aside—I still expect slightly less from this robot than passing the written analysis section of the final exam. (I taught English for sixteen years, so written analysis is my thing.) It seems to me that a given tool can be no more effective than its engineering and usage, and for that, isn’t A.I.’s “intelligence” more indicative of its creator’s ingenuity or its user’s aptitude than of itself or its pre-programmed attributes?

Press Any Key to Begin

By the same token, could proponents of classroom A.I. maybe just ease off a bit from their retcon appropriation of language? I appreciate getting caught up in the excitement, the hype—I mean, it’s 21st century mania out there, with candy floss and roller coasters!—but that doesn’t mean you can just go about proclaiming things as “A.I.” or, worse, proclaiming A.I. to be some burgeoning technological wonder of classrooms nationwide when… it’s really not. Current classroom A.I. is simply every device that has always already existed in classrooms for decades—that could include living breathing teachers, if the list of functions above is any guide. Okay then, hey! just for fun: if classroom tools can include teachers who live and breathe, by the same turn let’s be more inclusive and call A.I. a “substitute teacher.”

Another similarly common tendency I’ve noted in descriptions of classroom A.I. is to use words like “data,” “algorithm,” and “training” as anthropomorphic proxy for experience, decision-making, and judgment, i.e. for learning. Such connotations are applied as simply as we might borrow a shirt from our sibling’s closet, as liberally as we might shake salt on fries, and they appeal to the like-minded, who share the same excitement. To my mind, judicious intelligence is never so cavalier, and it doesn’t take much horse-sense to know that too much salt is bad for you, or that your sibling might be pissed off after they find their shirt missing. As for actually manufacturing some kind of machine-based intelligence, well… it sure is easy to name something “Artificial Intelligence”; it’s quite another thing to bestow “intelligence” by simply declaring it! The kind of help I had back in the day, as I see it, was something I just now decided to call “S.I.”: sentient intelligence.

Facetiousness aside, I grant that probably every teacher has spent some time flying on auto-pilot, and I’ve definitely had days that left me feeling like an android. And fair enough: something new shakes things up and may require some basic literacy. There’s no proper use of any tool, device, or interface without some learned practical foundation: pencil and paper, protractor, chalk slates, the abacus. How about books or, by ultimate extension, written language itself? These are all teaching tools, and each has a learning curve. So is A.I. a tool, a device, an interface? All of the above? I draw the line when it comes to classroom tools that don’t coach the basketball team or have kids of their own to pick up by 5pm: the moniker “A.I.” seems more than a bit generous. And hey, one more thing, on that note: wouldn’t a truer account of A.I., the tool, honour its overt yet seemingly ignored tag, “artificial”? R2D2 and C-3PO may be the droids we’re looking for, but they’re still just science fiction.

Fantastic tales aside, technological advancements in what is called the field of A.I. have yielded, and will continue to yield, useful, efficient innovation. And now I mean real Silicon Valley A.I., not retcon classroom A.I. But even so, to what ends? What specifically is this-or-that A.I. for? In a word: why? We’re headed down an ontological road, and even though people can’t agree on whether we can truly contemplate our own selves, we’re proceeding with A.I. in the eventual belief that it can. “It will,” some say. Not likely, I suspect. Not ever. But even if I’m wrong, why would anyone hope that A.I. could think for itself?

Artificial Intelligence
10 BE “A.I.”    20 GOTO 10     RUN

Hasn’t Heidegger presented us with enough of a challenge, as it is? Speaking of time and energy, let’s talk opportunity costs. Far greater minds than mine have lamented our ominous embrace with technology. Isn’t the time and energy spent on A.I.—every second, every joule of it—a slap in the face of our young people and of the investment that could have been made in them? It’s ironic that we teach them to develop the very technology that will eventually wash them away.

Except that it won’t. I may be out on a limb to say so, but I suspect we will sooner fall prey to the Twitterverse and screen-worship than A.I. will fulfil some sentient Rise of the Machines. The Borg make good villains, and even as I watch a lobby full of Senior Band students in Italy, staring at their iPhones, and fear assimilation and, yes, worry for humanity… I reconsider because the Borg are still just a metaphor (… sixteen years, remember?). Anyway, as a teacher I am more driven to reach my students with my own message than I am to snatch that blasted iPhone from their hands, much as I might like to. On the other hand, faced with a dystopian onslaught of Replicants, Westworld Gunslingers, and Decepticons, would we not find ourselves merely quivering under the bed, frantically reading up on Isaac Asimov while awaiting the arrival of Iron Man? Even Luke Skywalker proved susceptible to the Dark Side’s tempting allure of Mechanized Humanity; what possible response could we expect from a mere IB cohort of inquiry-based Grade 12 critical thinkers and problem-solvers?

The Borg
“Resistance is futile.”

At the very least, any interruption of learners by teachers with some classroom tool ought to be (i) preceded by a primer on its literacy, i.e. explaining how to use that particular tool; (ii) set in a meaningful context or future setting, i.e. explaining why to use that particular tool; and only then (iii) rehearsed and/or mastered, i.e. successfully executing whatever it does. If technology helps create surplus time and energy, then how and why and what had better be considered, because we only have so much time and energy at our disposal. The what, the how, and the why are hardly new concepts, but they aren’t always fully considered or appreciated, either. They are, however, a means of helpful focusing that few lessons should be without.

As a teacher, sure, I tend to think about the future. But that means spending time and paying attention to what we’re up to, here and now, in the present. To that end, I have an interest in protecting words like “learning” and “intelligence” from ambiguity and overuse. For all the 21st century hearts thumping over the Cinderella-transformation of ENIAC programmable computation to A.I., and the I.o.T., and whatever lies beyond, our meagre acknowledgement of the ugly step-sister, artificiality, is foreboding. Mimicry is inauthentic, but it is not without consequence. Let’s take care that the tools we create as means don’t replace the ends we originally had in mind because if any one human trait can match the trumpeting of technology’s sky-high potential—for me at least, not sure for you—I’d say it’s hubris.

Another fantastic tale comes to mind: Frankenstein’s monster. Technological advancement can be as wonderful as it is horrifying – probably, usually, somewhere in between. However it’s characterised or defined by those who create it, it will be realised in the end by those who use it, if not by those who face it. For most people, the concept of cell phones in 1982 was hardly imagined. Four decades later, faces down and thumbs rapid-fire, the ubiquity of cell phones is hardly noticed.

A Kind of Certainty: IV. A Kind of Faith

Click here to read Pt III. A Scripture of Truth

 



For all this, what exactly does it mean to be educated? From the sole perspective – yours, mine, anybody’s – free thinking means freedom granted to individuals to believe and behave as they do, then investing proportionate faith that they continue to believe and behave as we do. Of course, anyone’s beliefs might vary, freely, from ours, as compared to everyone conforming to the same beliefs and behaviours. Imagine that world, where every inhabitant lived according to self-established morality. In such a world, how would there come about any rule of law? Even real, lived experience here in Canada is tenuous, relying on everyone to rely on everyone else.[1] Whether out of respect for each other, out of gaining some advantage, out of fear of paying a fine or going to jail – on it goes, accountability, but the individual freedom we avouch is as ready to dissipate as the smoke of a powderkeg. For all its enlightenment, free-thinking is quicksand: shifting, uncertain, deceiving, solid ground by mere appearance. Is it any wonder that the liberty and reason of Enlightenment individuation have led us to Postmodern relativism, identity politics, and alternative facts? Be careful what you wish for. If there are any true binaries, to trust or not to trust must certainly be one. What need for faith when we trust that we are all alike, that all around is 100% certain?

Such a world is hardly plausible for me. I have learned not to trust everybody I meet. In the world I know, we need discernment and persuasive rhetorical skill to skirt potential conflicts and get others onside. And when others have discernment and persuasive rhetorical skill, too? Seen in that light, the curricular task is competitive, not cooperative. Even so, we might still argue that curriculum is collaborative, and it does not have to be belligerent. Curriculum falls within the scope of some given morality, morality being a question of right and wrong, positive opposing negative: to x, or not to x. However, curriculum itself is an ethical choice between alternatives and is, thereby, an empowering decision. We must therefore ask to x, or to y, which are positives, a question of competing rights, and not right competing against wrong.

And anywhere right does oppose wrong, curriculum should not permit a choice because wrong is simply wrong and not something that responsible choice can decide.[2] Beyond simply learning about the freedom to think, curriculum is about learning how to make choices that are set within the scope of defined morality. Question the morality, compare it to another morality, and we are Hamlet: we are lost. But decide, and accept the morality, and question only those choices intrinsic to its milieu… now we are educating ourselves and others, however precisely or narrowly, for as long as we care to pursue whatever makes us curious.

For me, someone is educated who thinks, and discerns, and has aims. Admittedly, such aims could be countered or rationalised pragmatically or else, more perversely, aimed beyond oneself to harm others – thinking in itself, after all, is not inherently moral. So if morality is a thing to be taught and also learned, then an educated person, for me, is someone who learns generosity of some kind, hospitality. Being educated means learning to give of oneself, for others or on behalf of others, in positive, constructive ways. This belief, I suppose, reflects my learned morality, which I am pleased, as much in caring as in utility, to pass along. Perhaps your morality differs. To that end, education, in itself, should intentionally be both constructive and benevolent in consideration of that sense of kairos, what is appropriate in the moment for teacher and learner, even as those moments accumulate over the passage of chronos-time, like endless waves upon the shore.[3] Then again, who am I to anybody that the sole importance of my opinion should determine an education? If I am outnumbered, what is this sense of education that I describe but some solitary means of facing an existence nasty, brutish, and short? This thing called school will be the death of me!

Hawai'i Summer 2008
” ‘Bove the contentious waves he kept, and oar’d / Himself with his good arms in lusty stroke–” Away from shore? A certainty all its own…

See? Recruiting Hamlet’s cycle of misery seems all too easy “‘where the postmodern turn of mind appears to privilege the particular over the general’” (Roberts, 2003, p. 458). Frankly, I think our present culture regards the individual far too much. Naturally, the consequent short-changing of the bigger community picture has been playing out over chronos-time since, with every decision, there has been consequence. However, Roberts continues, “… ‘for Freire both [the particular and the general] depend on each other for their intelligibility’.” So perhaps a good education – by which I mean not just a moral one but an effectual one – is best measured with due consideration for its balance of the particular and the general, the heterogeneous and the homogeneous, the certainty and the ambiguity, the inductive and the deductive. A little healthy scepticism, a little cloud for the silver lining. A little dram in the substance, to paraphrase Hamlet. “A little dab’ll do ya,” quips McMurphy.[4] You can’t have one without the other, sings the primus inter pares.[5]

We defy augury by flouting convention, even law, because we are free agents who do what we please. Some will have more courage than others, and some are just more foolhardy, but no one is literally predictable. We defy augury by being unpredictable, even inscrutable, although maybe the rest of you just never really knew me that well to begin with. Sometimes I even surprise myself. We defy augury by defying our senses, by not comprehending the world that we apprehend, which really is to say we see only what we want to see and recognise only what we already know. If there is special providence in the fall of a sparrow, what matter when we have spent all our time watching the chickadees? I cannot shake free from critiquing our cultural veneration of the individual: the less our shared beliefs converge and reciprocate a healthy community, and the greater our insistence upon personal liberty to go our own way, the more we miss the point of what freedom really is. True freedom results from having choices, and what creates choice is not the persuasive liberty of unequivocal individualism but discipline: to do ‘x’, or ‘y’, or ‘z’.

Shakespeare’s “Let…” statements are not so colloquial as to suggest the fatalism of c’est la vie, or the aimlessness of go with the flow[6] – these, for me, amount to giving up, or else giving in. The tragedy of Hamlet is that the curriculum he really needed – the people he could trust, who would be willing to help him – they were already there, at his side the whole time, as ready and willing as ever, so long as he gave a little back, so long as he offered just a dram of willingness to coincide with their beliefs – to his own scandal, maybe, but who in the real world is so selfish as to expect to have their cake and eat it, too?[7] As compared to going it alone, Hamlet might have humbled himself and cast his lot with those to whom he was closest.[8] His education from Wittenberg proved sufficient to challenge his upbringing in Elsinore, amply suggested by his continued trust to enlist and confide in Horatio throughout the play; as far as that went, the rest of us would do well to heed his lesson with due respect: if only Hamlet had not divided his loyalty but decided, once and finally, exactly who he was and whom he trusted, then lived up to his declaration with discipline. With integrity.

The most common criticism aimed his way by my students was essentially, “Get over yourself, and grow up!” Make a decision with the discipline to accept the consequences, which is to say, accept your personal responsibility. To be fair, Hamlet finally, triumphantly, does place his faith in Horatio, whom he entrusts to tell his story. Granted, he only asks once he is terminally poisoned, but hey, better to ask while alive to breathe the words than to come back and haunt Horatio as the next in a line of Ghosts. As for Shakespeare, whatever exactly it was that he saw in us, this ethical curricular dilemma, evidently he felt its redemptive quality was worth its cost, as Horatio makes known – or will do – by pledging to tell his dying friend’s tale to Fortinbras. Shakespeare’s appeal by way of Hamlet is not one of giving up or giving in. It is one of giving over, to something bigger than ourselves, to something in which faith placed is faith assured, and “attuned” (Pinar, 2017b, p. 1), and certain beyond our own devices.

What that object of faith might be… perhaps it comes as no surprise, but Shakespeare has a “Let…” statement for that, too: “… let your own discretion be your tutor” (3.2.17). I never included this one in the list for my students because, until writing this essay, I had never fit it in as such a central constituent. Hamlet delivers the line, as any nervous director might do opening night, during the aforementioned lecture to the Players before the Mousetrap performance.[9] All the more ironic, of course, is that his lecture hardly exemplifies the statement, which would be fine if Hamlet, the director, did not assume the stage during the performance but let the actors get on with their craft. Hamlet, by contrast, twice assumes the stage to augment the performance. (Ahh, what to do about such insecurity! At least he sells tickets, you may remember.) Anxious or not, the wisdom of his advisement, taken for all, is easy for a lay audience to misinterpret, particularly as it comes buried within lines of such mundane theatrical detail. Shakespeare does not suggest that we give in to our discretion, carte blanche. He suggests that we give over to our discretion as a kind of teacher-student relationship.

Let curriculum be to trust your own better judgment, to search your feelings,[10] yet to grant with humility that more may exist than meets the eye. Let discretion be a “tutor,” yet while you let it, also think before you act – and think during and after, too – because “… the purpose of playing… was and is, to hold… the mirror up to nature” (3.2.17-23). Whether this amounts to something esoteric or spiritual is down to the beholder,[11] yet if that is true for any one of us, it must be true for all of us. Each one of us is finite and individual, and curriculum is composite, a whole greater than the sum of its parts, as in all of us, transcending time and space. As a force of faith, curriculum is vast indeed.

 

Click here for the Bibliography

Click here to read the closing reflection to “A Kind of Certainty”: Pt V. Fleeting Uncertainty

 


Endnotes

[1] How often I referred students to Canadian Liberal MP Stephen Owen’s definition for democracy: “the pluralistic respect for citizens empowered to self-govern within the rule of law.” Democracy, so often simplified as “majority rule,” is more accurately understood (in my opinion) as entirely dependent upon its constituents. Democracy works because we all agree to make it work. Every member therefore has a personal responsibility to respect and live up to the standard of the law on behalf of every other member. One disobedient person weakens the system and places everybody, including themselves, at risk. Either we set that person straight, or we jail them, but unless we protect the system, we are only certain to lose it.

[2] *Sigh… culture precedes law, I would argue, and we endlessly debate and litigate what should be right versus what should be wrong. This is politics and the justice system at work, issue by issue, and with enough lobbying and/or civil disobedience, any given topic might be up for consideration.

[3] Okay, so I did find a way to toss in some surf.

[4] https://youtu.be/d_mASr1djMM?t=1m33s (Zaentz, Douglas, & Forman, 1975)

[5] aka the Chairman of the Board, aka Ol’ Blue Eyes

[6] In Canada, we might say that Shakespeare’s appeal to “let go” means don’t grip the stick too tight. “Hold on loosely,” as Donnie Van Zant would sing, or “Give a little bit,” from Roger Hodgson. None fully clarifies the expression, as I gather Shakespeare intended it, but the notion of giving way in deference to others is helpful, for a start.

[7] Of course, the best rejoinder here would be, “He who dies with the most toys wins,” to which I would reply, “You can’t take it with you.” But dialectical bumper-stickers were never my strong suit, and I digress, even for end-notes.

On second thought, the best rejoinder is to say Hamlet is fictional, not of the real world. All the more reason to admire him as perhaps Shakespeare’s best creative feat, so life-like are he and the rest of the characters who populate the play.

[8] Between Ophelia and Horatio, he nearly does so twice, and even towards Gertrude he aims some meagre hope and sympathy. Alas, yet another essay…

[9] Shakespeare includes numerous allusions throughout the play to the theatre milieu, its characters and culture, and its place in Elizabethan society, many of which can be construed as humorous and even as insider jokes shared amongst his theatre company and his regular audience.

[10] https://youtu.be/bv20ZoBcdO8?t=1m43s (Kurtz & Kershner, 1980). A clever mash-up of the Star Wars scene with characters from The Lion King (https://www.youtube.com/watch?v=Mf6JKC_V4v0) suggests that this idea about an inner core of balanced discretion or a healthy scepticism, if not desperate inner turmoil, has resonated beyond Shakespeare’s work into our own theatrical pop culture.

[11] I learned, for my own spiritual belief, to distinguish between what many religions have people do, as compared to what God through Christ has already done. The primary reference, here, is to the Resurrection and what Christ has done for all. Whether one chooses to believe or not is up to them, and should be, which is the essence of my belief: what comes down to a matter of personal choice is to believe, or not to believe. Consider Ephesians 2:8-9, for example, in which Paul explains that we are saved not by works but by grace, so that none can boast: justification by grace through faith in God is the essence of Christianity, and I emphasise that part of it left up to us, to have faith in God. Some consider this ridiculous, and that is neither here nor there to me although I wish no ill upon anyone. Upon believing, upon faith, one can grasp how a selfless attitude of giving – giving of oneself – matters as compared to more selfish concerns over what is given or how much is given.

Such concerns do arise since, as I believe, all inherit Original Sin, a concept that one must accept before anything else in Christian doctrine of any stripe will make sense: we all have inherited an imperfection to believe and have faith in our selves, apart from the God who created us; to go our own way; to obey our own inclinations and not His. This pride-of-self, set in motion by the conniving serpent’s lure that whetted Eve’s curiosity, then Adam’s, enough for them to disobey one simple command… this original “missing of the mark” prompted Adam, Eve, and all their offspring to realise within themselves what had never before even appeared on their radar screens: that obedience was only appreciable once disobedience had been tried. It’s the same binary idea as saying, “You only really understand peace once you experience war,” and so forth. So, for instance, in offering to God (Genesis 4:3-4), where Cain brings some, Abel brings the choicest; yes, each still gives, yet Cain is furious upon seeing the difference in God’s response between their offerings. The sense is that Abel gives in faithful obedience what Cain withholds for himself, Abel trusting God, in a way that Cain does not, that God will give back and look after him. Cain trusts in what he can manage and control for himself; evidently, he does not trust like his brother that God will give back. Perhaps he does not even believe that God created them although, if he does believe this, how much worse his distrust.

Avenging his own honour by killing his brother is a choice Cain makes, entirely selfish and sinfully predictable. This, for me, begins to explain why God allows evil to prosper: He gave us free will, in His image, out of love, to choose or to not choose His gift of salvation; to believe or not to believe in His Gospel, as a matter of faith; to trust Him or to trust something else. In either case, we, the people, are answerable for all we do. As I say, back then, Cain perhaps did or didn’t know he was God’s creation – he is left to his own account for that. These days, though, how many people hardly even consider God as real, much less as Creator or Benefactor? However, if God offered us no doubt of His existence, then what would necessitate faith? Were He to provide 100% certainty, anyone then would have no choice but to believe, of necessity, or else be a fool not to believe and delude themselves in spite of the certainty. As it is, some think believers are deluded; truly, you can’t convince all the people all the time, and you definitely should not force belief. All this, for me, is consistent with a caring God who has conferred free will. So, where some condemn believers as guilty of the crimes and evils committed in the name of Christianity (or religions altogether), in fact, I fully agree: hateful beliefs and violent acts are an abomination of how God would have us treat each other.

But, again, He has bestowed upon us the free will to decide and behave, and I argue that all such crimes and evils, whether in the name of religions or not, reflect Original Sin, our turning-away from God; they do not reflect God. They cannot reflect the character of God, whose nature is neither criminal nor evil; rather, they reflect the character of our selves, who are selfishly proud. People are responsible for bastardising and usurping doctrine in order to gain for themselves, something akin to Cain, so blatantly, transparently selfish. Further, as that kind of belief and behaviour continues, it roots until generations have perhaps forgotten or lost any other way to believe and behave. We are human, taken for all, and finite in power and awareness. We can do no other than we continue to prove ourselves capable of doing – and in this I include both good and evil that we do – and this, truly, is why we’re in need of salvation. So much gets lost in scriptural debate over details – details that warrant discussion yet, being details, they are also prone to misinterpretation and thereby require careful, long-studied contextual understanding – but the basic doctrine and the loving character of God I find rather straightforward. It’s people who complicate and screw it up, not God. And I’m as guilty, neither better nor worse but just plain equal to every other person trying to live under our circumstances. So I try my best to respect people’s dignity, everyone’s.

My choice has been to believe based on the preponderance of evidence that I’ve learned and studied for many years – the careful, long-studied contextual understanding I mention above. I have plenty more to learn, but my point is that I did have to learn, to begin with. I did not just suddenly have some nuanced supreme understanding of Christian doctrine – indeed, I’m wary that superficial knowledge is so frequently the cause of the crimes and evils people commit in the name of religion. I consider myself blessed to have had the freedom to choose what to study without duress and to have had an education provided by good teachers who understood what makes for good curriculum. I have never felt assaulted or oppressed as far as my education is concerned – or my life, for that matter – and, furthermore, I achingly, mournfully recognise that so, so many others cannot say the same. Why not me, I can’t say, but I count myself as blessed for this, if for no other reason in my existence. I know so well that not everyone has enjoyed such Providence.

There is so much abuse and violence out there, person-upon-person, and I suggest that I, or you or anyone, ought to be enabled to read, search, and decide for ourselves whether or not to believe something. And never forced, and never judged. Personally, I’m not a big church-goer – I have done, but I don’t much anymore. But I still quietly personally maintain my faith. Even offering this endnote struck me as bold, but I wanted this post to be thorough and honest. I believe evidence exists – we have only to look for it: “Knock, and the door shall be opened” is God’s encouragement, to be proactive and search for Him rather than sitting idly by awaiting, or else ignoring, His imminent return. Nonsense, this, for some. And I can comprehend the doubt. But I don’t share it. By the same token, I offer my testimony, but I don’t impose it. People today who demand to see evidence – God performing miracles, say – are asking Him to lay foundations all over again. But, by analogy, a building only needs one foundation, so why would God repeat that process? Enough evidence has been documented over time, for me, that I now readily believe and join the church being built on the existing foundation. Again, as I opened this rather long endnote, what matters most is what He has already done: we have only to believe, with no further need to see more miracles, which is really what having faith is all about.

A Kind of Certainty: III. A Scripture of Truth

Click here to read Pt II. Curriculum, or What You Will

 


A Kind of Certainty

3. A Scripture of Truth

Motive is the key, I would suggest to students: to know motive is to know the truth. And I offered this suggestion knowing full well the timeworn joke about the actor who asks, “What’s my motivation?” Just as we can never cover it all and must go with whatever we decide to include, we also cannot (and should not) try to present it all, ask it all, or attempt it all in one go. Yes, the odd non sequitur can break the monotony – everyone needs a laugh, now and then. But as with all clever comedy, timing is everything, and curriculum is about more than good humour and bad logic. In that regard, given what has already been said about spotting pertinence, curriculum is about motives: to include, or not to include.

And we must try to comprehend this decision from more than one perspective; each in their own way, both teacher and student ponder what to include and what to disregard during any given lesson: “Teachers are problem-posing, not just in the obvious sense that they require students to doubt whether they know something… [but] implicitly [asking] them to question their understanding of what counts as knowledge” (Beckett, 2013, pp. 54-55). People generally will not doubt themselves without good reason, or else with a lot of faith in whoever is asking. Challenged to reconstruct or reorganise an experience (Dewey, 1916), more than likely we will want to know why. Curriculum addresses ‘why’.

Why! take Hamlet, for instance… presuming to know a little something about role-playing, he offers some curricular particulars while lecturing the Players ahead of the Mousetrap performance, although really this is to say Shakespeare offered them. Writers famously cringe as rehearsing actors and directors dismember their carefully worked dialogue – or is that another hackneyed joke? In any case, Shakespeare opens Act 3 with some forty lines of advice from Hamlet to the Players, whose replies are little beyond short and polite (although ‘why’ has evidently been left for you and your theatre company to ascertain). These follow some forty lines in Act 2 during an exchange between Hamlet and Rosencrantz about theatre companies, all of which could simply be played as a dose of comic relief amidst the far “weightier matters” of the play (Guyton, 2013). Tried another way, Hamlet’s lines about acting embody the very perplexity of his prolonged tumult: he takes for granted that his listener will attempt to reconcile what he says with whatever uncertainty they might have. What better job description for a “teacher”? Otherwise, why even bother to open his mouth?

What need to teach when we trust that we are all alike, that all around is 100% certain? As it pertains to telling the Players about acting, Hamlet wants no assurance that his audience must bridge some gap of certainty over his trustworthiness, not so far as he is concerned.[1] Indeed, common to live productions that I have watched, he is as relaxed and certain in offering his advice as the Players are in hearing it, like preaching to the choir.[2] Their relationship, apparently going back some time, suggests mutual respect and a shared faith not merely to listen but to understand in listening. It suggests a kind of shared attunement, something mutual, like a kind of curriculum founded upon trust. For all we might want to trust those around us, for all we might want some certainty that we are respected by others – or, perhaps more so, that we are believed – what a torment life would be if our every utterance were considered a lie. Then the only certainty would be the assurance that no one ever believed you, and if that still counts for something, it is dreadfully cold comfort.[3]

We citizens of 21st-century, post-modernist [your label here] North America may not have descended nearly so low, although Klein (2014) does presciently discuss politics, the national discourse, and an observed decline in public intellectualism (Byers, 2014; Coates, 2014; Herman, 2017; Mishra & Gregory, 2015). Where Klein encompasses individuals and the processes, systems, and institutions that they innervate while going about their daily lives, he describes Dewey’s “conjoint communicated experience” (Dewey, 1916, p. 101) and implicates “an extraordinarily complicated conversation” (Pinar, Reynolds, Slattery & Taubman, 2006, p. 848), one that occurs every day and includes everybody. But since we are forbidden to compel the beliefs of free thinkers and may only persuade them, we realise that all our perceived uncertainty can only be bridged by a kind of faith: we depend either upon others to see things as we do, or else upon our rhetorical skill to persuade them toward our way. Or we live tense lives full of disagreement and antipathy. ’Swounds, but life would be a lot more stable and certain if we all just believed the same things!

Hamlet craves certainty, to the point where the dilemma of his doubt halts him so dead in his tracks that he is prompted to question existence itself. Where it comes to enacting vengeance – but, really, where it comes to everything we witness in the play – Hamlet (and, really, every character[4]) craves certainty and assurance while suffering uncertainty and reluctance, which means, of course, that he craves and suffers from both ends. Indeed, a piece of him is certain. But comprising “one part wisdom and ever three parts coward” (4.4.42-43), he wages an unequal battle against himself. He wanders from room to room searching to free himself from his purgatorial tesseract, challenged not simply by one retrograde faith but by several, the consequence of conveying curriculum from Wittenberg back to Elsinore where, previously, he had received, to say the least, an impressionable upbringing. The upshot, given the conflicting decisions he faces, is that Hamlet would rather renounce mutual faith of any sort and rely upon a certainty all his own: himself.

Yet he even doubts his ability to self-persuade, just as he holds no faith in anyone whose judgment he fears. As a result, he is rightly miserable and lives an exaggerated moment-to-moment existence, “…enraptured with, submerged in, the present, no longer a moment in but a suspension of time, absorbed by – fused with – the images in front of [his] face, oblivious to what might be beyond [him]” (Pinar, 2017, p. 12). Pinar describes a kairos moment of chronos time, as if Cecelia, while watching The Purple Rose of Cairo (Greenhut & Allen, 1985), could press “Pause.” He may not have been Woody Allen’s modernist contemporary, but Shakespeare still appeared to possess enough prescience to machinate a rather, shall we say, enlightened viewpoint; many consider The Tragedy of Hamlet, Prince of Denmark to be the magnum opus not just of Shakespeare but of all English literature. Evidently, he knew exactly how to craft such a rich and roundly individuated protagonist, one certain enough to persist for over 400 years. Certainty the Bard found within himself, and that he bestows (albeit perversely) upon Prince Hamlet, who “[knows] not seems” (1.2.76). Faith he found within himself, too, but that he saves for his audience, trusting them, freeing them, to spot it when the time is right, rendering what they will get unto those who will get it.

By the same token, may the rest get whatever they will get. As far as curriculum is concerned, one size has never fit all, nor should it ever be so.

 

Click here for the Bibliography

Click here to read Pt IV. A Kind of Faith

 


Endnotes

[1] I always suspected a handful of my students were just humouring me – have I mentioned they were brilliant?

[2] Sometimes, these lines have even been cut, to help shorten the play from its typical four-hour length.

[3] Elsinore seems just such a place. But they are wise who “… give it welcome” (1.5.165) since at least, then, you can get on with functioning, knowing where you stand relative to all the other prevaricating liars and weasels who inhabit the place alongside you.

[4] Every character, that is, with the possible exceptions of the Gravedigger, who apparently is most cheerful and self-assured, and Fortinbras, who suffers perhaps not pains of doubt so much as loss, and then always with something up his sleeve. I might also include Horatio in this reflection, but I fear, then, the need for an endnote to the endnotes, to do him any justice.

A Kind of Certainty: II. Curriculum, or What You Will

Click here to read Pt I. An Uncertain Faith

 


A Kind of Certainty

2. Curriculum, or What You Will

Baumlin (2002) distinguishes three concepts of temporality. Chronos is linearity, our colloquial passage of time, “non-human; impersonal objective nature” (p. 155), from which we understandably define past, present, and future. In relation to this is kairos, a single point in time, “[describing] the quintessentially human experience of time as an aspect of individual consciousness, deliberation, and action… that single fleeting moment … when an individual’s fortune is ‘set in motion’, … [providing] the means” and yielding “Fortuna, the consequences” (p. 155). Interwoven with kairos, then, is Occasio, the cause to Fortuna’s effect, a sense of “‘right-timing’ and prudent[1] action,” an opportunity[2] to better the capricious lies of fortune and fate. Although this sense of opportunity was emancipating, it also engendered accountability for consequences.

The developing belief that we possessed not mere agency but free will weighed upon Renaissance thinking, a trait that Shakespeare often imparted to his characters, Hamlet (4.4.46-52) being but one example.[3] By the time 17th century Elizabethans first watched Hamlet on stage, the humanist challenge to “a grim… Christian sufferance and resignation to time” (Baumlin, 2002, p. 149) was well underway. Unsurprisingly, Shakespeare offers nothing firm in Hamlet as to where our belief should lie, either with fortune or with free will; indeed, leaving the debate ruptured and inconclusive seems more to his point. To this end, perhaps most notable is his placement of Hamlet alongside Horatio in the graveyard to ponder the dust and fortune of Alexander, Yorick, and – hard upon – Ophelia.

In handling Yorick’s skull, Hamlet revives the poor fellow’s “infinite jest [and] excellent fancy” (5.1.186), memories of such fond “pitch and moment” (3.1.86) as to “reactivate” (Pinar, 2017a, p. 4) his own childhood, even momentarily. Such specific remembrances educed by Hamlet (which is to say, by Shakespeare) expose the springe of kairos; ultimately, certainty is beyond our capacity, rough-hew it[4] how we will. Colloquially, this might seem obvious (i.e., “the best laid plans…” and so forth, and no one person apparently able to pick the right lottery numbers each week). Yet the extent to which we consider ourselves masters of our own fortune is, for Hamlet, presently in the graveyard, a kind of epiphany, “a spiritual (re-) awakening, a transformation” (Baumlin & Baumlin, 2002, p. 180).[5] He decides that yielding himself over to “divinity” (5.2.10) is wise as compared to the folly of trying to control what was never within his grasp to begin with.

He does not give up any freedom so much as give over to dependence, which of course is a leap of faith. Shakespeare poses a question of allegiance – to obey, or not to obey – further compounded by which allegiance – obedience to father, or to Father; to free will, or to fortune; to an unweeded garden, or to what dreams may come – all these are the question.[6] Shakespeare has Hamlet “reconstruct” (Pinar, 2017a, p. 7) his conceptions of allegiance and obedience during the exchange with the Gravedigger, which hardens Hamlet’s resolve yet also enables him to come to terms with his tormenting dilemma over fealty and honour. By the time his confrontation with Claudius is inevitable,[7] Hamlet’s decision to “let be” (5.2.224) “[marks his] final transcendence of deliberative action in worldly time” (Baumlin & Baumlin, 2002, p. 180). Thus is indicated the subtle dominance of the third temporal concept, aion, “the fulfillment of time” (Baumlin, 2002, p. 155), a circularity like the uroboros, the serpent swallowing its tail. As such, aion signifies what is boundless or infinite, neither more nor less than eternity.

Oddly enough, these three concepts, in concert, can seem both time and place,[8] describing a “spatial-temporal sequence … from point, to line, to circle”; from “natural to human to divine orders” (p. 155). I am not fixed to the idea of a “sequence,” but the general composite still shapes my response to Hamlet’s most famous question of all.[9]

 


Let go. Learn from the past, but don’t dwell on it. [left: the past]

Let it work. Anticipate the future, but no need to control it. [later: the future]

Let come what comes. Every possible decision will still yield consequences.

Let be. Pay attention now to what is now. “The readiness is all” (5.2.222-223). [lasting: the present]

“The rest is silence” (5.2.358) – a clever double meaning: “the rest” is either past regrets and future anxieties, or else the undiscovered country, death.


 

As I take them, these four “Let…” statements amount to sound wisdom, like trusted advice from teacher to student or parent to child. As a student and child myself, writing this paper, I faced some question of certainty – the same question, strangely enough, that we ask about curriculum: what is worth including? By the same token, what is worth omitting, and from there, what will also be otherwise left out or unmentioned? Whatever we decide, one thing is certain: we can neither cover nor even conceive it all, which of course was my original problem. In fact, knowing as much as we know can even shed paradoxical light onto how much we remain in the dark. Eventually, as my Dad recommended over the phone, I simply needed the courage to make a decision and go with it, and even with his voice in my ear, I knew my own advice to my students had always been the same.

Hanging up, I reasoned further that any feedback I did receive – from peers during revision or from my professor’s formal evaluation – would illustrate how effectively I had collated and communicated my message. Beyond that, say revising the paper for publishing, I would have some ready direction. And there it was, I realised, staring me in the face, curriculum in a nutshell: conversations, decisions, actions, evaluations, reflections – all these, in relation to me as I wrote this essay, amounted to a lived curricular experience of my very own.[10] My curriculum, like this essay, does not simply pose the straightforward question about what is worth including. That question is insufficient. More particularly, my curriculum, like this essay,[11] prompts me to consider what is worth including in light of the audience, the topic, what is already known about the topic, and whatever aims exist in further pursuit of the topic.[12] Succinctly, my curriculum – all our curricula – is contextual, multitudinous, and a question of – questions of – what is particularly worth knowing about any topic of study under the sun: “Why this, why here, and why now?”[13] That is the question.

Well, maybe that is the question. The essence of this question, this curricular particular, lies in kairos, the concept of opportune timing or occasion that “signals the need to bring universal ideas and principles to bear in historical time and situations [i.e., deductively] and, thus, calls for decisions [requiring] wisdom and critical judgment” (Smith, 1986, p. 15). We can only note what matters to us once we have a reference point. And since nothing occurs in a vacuum, any detail can be potentially informative, so we must learn to pointedly ask not, “In what way(s) do I already know what I’m looking at?” but rather, “In what way(s) do I not know what I am looking at?” which tends to be deductive. Typically, curriculum begins inductively, with what someone already knows, and we all know plenty of things. But we generally bring to bear only what we deem relevant to the moment. By the same token, someone who knows what is relevant to the moment has a kind of prescient “mechanism” (Christodoulou, 2014, p. 54) for spotting what will likely be of use.[14] So curriculum is a means of determining, if not discovering, in the moment what works. It is, therefore, also a means of coming to know ourselves.

As we develop confidence and self-esteem, and dignity, we grow to feel that we have something to contribute, that we matter, all of which prepares us for helping others. Curriculum helps us to sort out our values and beliefs,[15] which provide a frame of reference by which to select and, later, to measure our day-to-day efforts. Of course, none of this happens immediately; we need time to grow more self- and other-aware, each kairos experience filing in alongside the rest, like a crowd of ticket holders. I can only wonder whether Shakespeare might have characterised curriculum as something akin to being held over for an indefinite engagement. In any event, we never stop learning – may our auditoriums ever sell out – as we continually induce as well as encounter influence. But how deliberately do we do so? Maybe that is the question.

 

Click here for the Bibliography

Click here for Pt III. A Scripture of Truth

 


Endnotes

[1] As Baumlin (2002) notes, “For the student of prudentia, time reveals itself as golden Opportunity rather than as fickle, devastating Fortune” (p. 141). Certainly, Shakespeare and his Elizabethan audiences were feeling such debate permeate their own lived experiences, a dram of an idea that, once diffused, might only thereafter suffuse.

[2] According to Claflin (1921), “‘opportunity’ in Shakespeare means more than it does now [in the 20th century]; it is closer to the original force of Latin opportunus, and means ‘a specially favourable occasion’” (p. 347). Curiously enough, however, as I searched a concordance of Hamlet (Crystal & Crystal, 2002), I found no usage of “opportunity” whatsoever and only three of “chance,” most notably that of Hamlet to Horatio: “You that look pale and tremble at this chance…” (5.2.334) in reference to the dead and dying at the play’s closing. Of further interest is the concordance’s report that Shakespeare used “opportunity” throughout his entire catalogue of poems and plays only sixteen times as compared to “chance,” which he used 114 times.

[3] Kiefer (1983) examines Fortune at length as one colour in Shakespeare’s palette for his characters, noting of King Lear: “In no other of Shakespeare’s plays do characters invoke Fortune so insistently [or] so frequently at pivotal points of the action” (p. 296).

[4] Read either “certainty” or “our capacity,” here, in place of “it”; either works just as well. The line from the play I have paraphrased, of course, because the original antecedent is “our ends” (5.2.10) in place of “them” (5.2.11). However, where I have changed the diction of the thought, as a matter of perspective, the meaning remains intact. The implication that we – in essence – play God might not be nearly so alien for Shakespeare’s audience as to their feudal predecessors. By contrast, to postmodern audiences these days, the notion of a divinity standing apart from our own free will and shaping our ends might be the more alien concept.

I might finally point out that Shakespeare, as his creator, is Hamlet’s god, of a kind. But that analogy does not last long under scrutiny since Hamlet, being a fictional character, has no sentience, free agency, or tangibility, and actors who portray him are left with prescribed dialogue and beliefs.

[5] Because I am ultimately discussing what Shakespeare did, his characters being only conveyances as such, I was tempted to complete this sentence with a line from Macbeth, as follows: “The extent to which he considers himself master of his own fortune, presently in the graveyard, is laid plain for Hamlet, leaving him to conclude only that ‘…all our yesterdays have lighted fools the way to dusty death’ (5.5.22-23).” The key difference, of course, is that Hamlet decides against being a fool whereas Macbeth seems all too keen to excel at it. Where Hamlet best demonstrates a respect for “divinity [shaping] our ends,” Macbeth better represents the rough-hewing bit, which makes him a far less redeeming character in the end. So, upon reflection, it seemed prudent to stick substantively to just the one play. Thank heaven for endnotes, I guess.

[6] Had he fallen clearly to one side, as a subject to his monarch, Shakespeare might very well have sealed whatever freedom he did enjoy; his own response, evidently, was to render unto Caesar, and render unto God, and continue writing plays. Four centuries on, what is there about us, that we might think we are any less susceptible than he was to coming to terms with our finite nature? We live in civil society, by the rule of law under a Constitution, within which are Rights and Freedoms that include the assurance to believe, or not to believe, whatever we decide suits us best. Furthermore, we have the advantage over Hamlet in that his example exhorts us, interminably – just ask my students, remember? Alas, though, poor Yorick.

[7] As Horatio notes, “It must be shortly known [to Claudius]” that Hamlet has tricked Rosencrantz and Guildenstern to their deaths at the hands of England (5.2.71-72), a move by Hamlet in his contest that must certainly harden his uncle’s resolve to have Hamlet dealt with once and for all. Of course, Claudius had sent Hamlet to England to be killed, but in secret, on account of both Gertrude and the public’s love for the Prince (4.7.5-24). However, in dispatching his childhood comrades – and with such calculation (5.2.57-70) – Hamlet has now given Claudius justifiable means to overcome any such favourable opinion as might have benefitted Gertrude’s “son” (5.1.296).

[8] Time and place are what we commonly refer to as setting in English class, which is a curious way to consider eternity.

[9] Seldom mentioned amidst all the consternation is that Hamlet does not actually ask a question. If he had, he might have worded it as, “Is it to be, or not to be?” In that case, we would need to know what “it” means. Alive? Dead? Happy? Sad? Anything goes, I suppose, but then… what would you expect? He might have been asking, “Am I…” or “Are we to be, or not to be?” But where that is still somewhat existential and vague, now we might want to know whether his use of the verb to be is existential or merely copular. I suspect Shakespeare knew enough about a general audience to trust that only the most fastidious grammarians would fuss over particulars such as antecedents and verb tenses in the dialogue. Otherwise, why decide to use the most protean verb in the English language?

[10] As far as lived curricular experiences go, there are many like it – as many as there are people to have them – but this one is mine.

[11] At this early stage, I confess: why struggle writing a paper when I could use the age-old trick of writing a paper about writing the paper? Why…? Because the age-old trick is just that – a trick – and spinning academic wheels stalls any hope of contributing to knowledge, so I would hardly be honouring my responsibility if I tried pulling that off. Still… the paper within a paper got me thinking about Hamlet, which oddly enough had been my original inspiration for this essay. As my students used to say… once you study Hamlet, he just never goes away. How true, how true…

[12] According to Hartmann (2014), it was just such questions that prompted Ezra Klein to leave The Washington Post and establish Vox.com in 2014.

[13] Students in all my courses learned to rue the question “Why?” so much so, one year, that it became a running joke simply to utter “why” as a pat response, whether as a question, an interjection, a plea, a curse, an epithet – those last two maybe reserved for me, I don’t really know. In honour of their perseverance, and their angst, I named my blog The Rhetorical WHY.

[14] Surrounded by Winkies, confronted by certain capture, only Scarecrow eyes the chandelier above the Wicked Witch, so only he can yank Tin Man’s axe across in time to chop the rope that suspends it. Hardly the grandeur or the gravitas of Hamlet, I realise, but The Wizard of Oz has much to offer pertaining to curricular theory as well as teacher autonomy.

[15] In keeping with the three temporal concepts, perhaps a more suitable metaphor than threading our own needles would be to say we surf a long pipeline. But, this essay being more concerned with curriculum and theatre, any such Hang-Ten imagery is better suited to another time, like connecting curriculum to gnarly waves and bodacious beaches (“Surf’s Up,” 2015). Anyway, certainly no one would ever dream of linking Hamlet to surfing (“’Hamlet’s BlackBerry’,” 2010) in the first place, would they?

Play’s the Thing…

I used to say to my students, “Find the overlap between our English coursework and, say, Trigonometry, or the link from persuasive writing to PhysEd. Where does Hamlet end and organic chemistry begin? Find that one out… there’s genius in that.” The courses my Department offered were called “English” and, helmed by some teachers, they were more traditional, as one might expect. The most common feedback I received from students, though, was how unlike English our coursework seemed to them. I took those remarks as a measure of success: my aim was to prepare young people, soon enough entering the world as older people, to be responsible… to families, communities, careers, and so forth. For me, that’s the purpose of school and its teachers.

What prompted me to reflect was reading Science, Order, and Creativity, by David Bohm and F. David Peat – specifically, such remarks as “the appropriate relationship between thought and experience… [in which] creative new perceptions take place when needed” (p. 49). That distinction between thought and experience reminded me of another distinction, that between dialogue and conversation. And again I was prompted to recall my English courses – what we had, I’d say, were definitely conversations, scratching new surfaces and digging into things with fluid spontaneity, as compared to the “my turn / your turn” protocol of dialogue, which might dig one trench but deeper and deeper. Where dialogue strikes me as instrumental, a means to an end, conversation is an end in itself, without start or finish but continual – that is, until the bell rings. We notoriously lived beyond the rigour of scheduling in some of my courses.

Those conversations were hard to let go. And what exactly were we after? “The creative person does not strictly know what he or she is looking for,” say Bohm and Peat. “The whole activity [is] play itself,” and no better description of teaching (at least, my teaching) have I ever read. Who knew I was so creative? Not me, although I did have fun. So who knew teaching was just so much play? “The play’s the thing / wherein I’ll catch the conscience of–” well, anybody, really. I should clarify that I respected my colleagues and our Departmental philosophy as well as my professional obligation to Ministry curricula. At the same time, I relied on my own interests and concerns to guide our coursework, by day and by year. The result was a mixture of reading, discussion, writing, and presenting about topics as disparate as literature, film, fine art, civics, politics, economics, philosophy, etymology, all manner of topics – yes, even science and math – all bundled together in a process of classical rhetoric. Eventually, I developed a suitably disparate canon of texts, too, one that flowed meaningfully from English 9 through 12. And I relied on students’ differences to alter and adjust the flavour however they might. I loved teaching for how creative it allowed me to be, and for how much creativity it provoked in my students. “Let come what comes,” Laertes tells Claudius – brazen, even foolhardy. Genius, perhaps?

Bohm and Peat seem to suggest that genius is not creativity per se so much as the effect of having challenged some assumptions, and maybe that’s a mere semantic distinction. Either way, I like the notion. Later, reading Allen Repko, I found myself nodding likewise at what he calls “boundary crossing” (p. 22). There it was, this discovery of common threads in disparate disciplines, this crossing of amorphous boundaries, what my students have heard me call “genius” although I might now redefine that trait as “ingenuity.” Accompanying “boundary crossing” is a reaching across disciplines, with intent, what Repko calls “bridge building.” This, I think, I would call visionary. Discovery and vision: both are what I would forever consider, as a teacher, to be meaningful developments of the learning process.

Repko also points out the origin of the word “discipline,” which derives from the Romans and their need to “relate education to specific economic, political, and ecclesiastical ends” (p. 32). How delightfully Roman! I thought, reading that. Such instrumentalism, “the logic of utility.”[1] Finis at its finest: How long, O Lord! Will their legacy never end? But I trust in teaching and my unfailing students.

I enjoyed sixteen years teaching Secondary English to brilliant students. In that time, we developed a philosophy, addressed the BIG Questions, and fed our curiosity. But my planning process was seldom more than make-it-up-as-we-go. “We could never get away with this in Math,” I used to say to them, “although if you do find a way, I’d love to hear about it.”

 


[1] Phelan, A. (2009). A new thing in an old world? Instrumentalism, teacher education, and responsibility. In C. Riches & F. J. Benson (Eds.), Engaging in conversation about ideas in teacher education (pp. 105–114). New York, NY: Peter Lang.

 

Development and Learning: Part II – Youth Football

In the previous post, I proposed that development and learning co-exist alongside winning and that contriving a debate to place them at odds misconstrues their concerted relationship. I add, here, that development and learning are characteristic of people, while winning and losing are inherent to the Game of Football and to sport in general. In other words, development & learning and winning & losing are not at odds; they arise in concert as people compete with one another, participating as opponents in a game.

I also suggest in that post that all sorts of people have fun playing the Game of Football for all sorts of reasons and that competition and fun, like development and winning, are not and should not be mutually exclusive.

Another facet of this topic, arising from the inherent place of winning & losing in sport, is the purist claim that any and all attempts to win are justifiable. This discussion becomes especially heated in the context of youth sport because such a purist approach can be detrimental to players as they learn how to play and how to be members of a team. In that light, what I discuss below is development & learning in youth sport – specifically, in youth football (soccer).

To those who say that the Game is purely about winning & losing: the claim is a red herring. We must account for the fact that youth football has been distinguished from the adult game, and that this distinction exists for good reason.

Although at first it might seem contradictory, I already grant that the objective of the Game of Football is to win. I have clearly claimed that every team plays to win. Nobody plays to lose – in sport, or cards, or board games, or any game. Youth modifications don’t change that. Yes, as in any game, the objective in the Game of Football is to win.

Old Trafford
Theatre of Dreams

But the objective lies apart from learning how to play and training to play to win. The modifications to Youth Football have come about on account of younger people’s traits and abilities. By analogy, it’s like cars modified for those learning to drive: dual steering controls, wider mirrors, practice on quieter out-of-the-way roads, or VR simulators. There’s a gradual learning process by which new drivers grow accustomed to the road.

Carrying that analogy back to football: U9s play 7-a-side on a smaller field with a smaller ball and various rule alterations – the very existence of such modifications is evidence that the Youth Game differs from the Adult Game because youth differ from adults.

Anyone coaching a Youth team in accordance with these modifications tacitly acknowledges the difference. Therefore, to see nothing wrong with a purist viewpoint – that winning is utterly and always justifiable, even in the context of youth football – strikes me as insincere: perhaps a denial that young people differ from adults, a sign that priorities are skewed to place the self-security of winning above all else, an ignorance of or disinterest in child growth & development, or some combination of these.

To simply say the Game is about winning… yes, it’s correct insofar as the pure Game is understood as a concept, but it leaves no margin for error. On that basis, we’d better be flawless now, and play with mastery, or else we amount to nothing more than losers and failures. I suspect none of our teams is flawless, however much a purist belief might require them to be.

One youth team I coached (as assistant coach) years ago was successful enough that, during our U11 year, we were able to play three professional F.A. Academy sides. The results were 0-15, 0-5, 0-9. We had no illusions, yet our players were shattered by the reality that same-age teams could have such quality and be so dominant – just as we were back home, which is how we were accepted to play these Academies in the first place. In any event, there it was: a level of mastery, relative to us, that we were obliged to respect.

Match #4 vs Aston Villa Academy

So, given a belief in the purist objective of winning… unless you take on opponents who can genuinely challenge your team, purist winning reflects poorly upon you, making you look ignorant, if not cowardly. If the Game is simply to be played at its purest, then nothing short of mastery will do. And if that playing field is to be a level one, then the best example of mastery we have, in reality, is the pro game. To purists, I say this: if you test your youth team at that level, as I’ve done, you may well discover that…

(a) your challenge may not even be accepted but, if it is, then

(b) you may have a rude awakening.

In fact, that may be exactly what a purist needs. On the other hand, if it comes at your players’ expense, it’s not worth the cost. As I say, our team was shattered, and we had a great deal of respect for youth training and development, being professional educators and researchers as we (still) are.

Birmingham City FC Training Ground
A Visit to Birmingham City FC

Things are always much easier when all’s well and we’re winning. Real humility is found when we aren’t. To those who take a purist approach to sport: enjoy the ride at the top while it lasts because, someday, you may discover that you yourselves have never learned how to cope.

Development vs Winning: Actually, There Is No Such Thing

Also read the follow-up article to this post.

Setting aside corruption – throwing the game – which has no place in this discussion, I submit that nobody deliberately plays to lose.

Specifically, I’m talking about football, less commonly known as soccer, and perhaps this discussion even applies to many different sports. But, as a player and coach, football is the beautiful game that I know best, so here goes.

Playing football, we would anticipate that the team making the fewest mistakes ought to win – the fewest mistakes both in and out of possession, from kick-off until full-time. If so, then consistent quality performances are key because these should yield more opportunities to earn a win and prevent a loss. What’s more, as the reward for winning grows more lucrative and the stakes are raised, players must all the more learn to produce that “consistent quality performance” on demand, under whatever pressure: effective decisions, executed at the proper moments, skillfully, every time, or at least as frequently as possible. Developing this consistency also forces opponents to earn their victories rather than being handed the result unimpeded, because now they are challenged to execute just as consistently, if not just as flawlessly. As I say, no one competes to lose.

So, what of development and winning in light of all this? Too often, for me, these two ideas are falsely conflated into sides of what is truly a non-existent – or, at least, a very ill-conceived – debate. As ends-in-themselves, development and winning are typically deemed incompatible. Further, winning is then often vilified since winners produce losers while development is commended for being inclusive. At that point, I find the debate often sidetracks into competition versus fun, another false dichotomy, but in any case, the parameters are so muddled as to render all a meaningless waste of breath. For the sake of dispensing with the issue, I simply ask: why would we not reasonably expect to see fun in conjunction with competition? These are not oil and water, nor do they need to be, nor should they be deemed to be.

Football, the Game, can be played for fun, exhilaration, fitness, camaraderie, focus, perseverance, discipline, teamwork, all manner of virtues and benefits, yet all these on account of the very nature of the Game as a contest of opposition. And where one person finds things fun and enjoyable, another does not necessarily agree, yet who’s to say who is correct, if the Game has enabled all? All sorts of people find all sorts of fun in all sorts of things – who’s to say that finding competition to be fun is wrong, if only because it makes you squeamish? Just the same, if someone’s threshold for intense competitive drive is lower than another’s, can each still not enjoy playing with like-minded peers? In fact, just for instance, this is exactly why various youth and adult leagues categorize levels of play into (for ease of this discussion) gold, silver, and bronze tiers. Everyone must learn to play, and development (to whatever degree) will occur as they go. That implicates teammates, the quality of coaching, and other factors relating to a team or league’s motives for playing in the first place (i.e. gold vs silver vs bronze). Motive, however, does not change the nature of the Game, itself, or the nature of effective learning, development, coaching, and teaching.

As I see it, the issue is not Development for its Own Sake versus Winning for its Own Sake or even …for its Own Sake versus Development in order to Win. The issue is Development and Learning as a concept, altogether, period, because how else could you learn to play? And the more you play, the more you develop. Whether that development is good or poor is down to context, and a separate issue.

And when the arguments start, what’s really being debated, it seems to me, is how any one person simply wants to be “right” and demand that everyone else agree on what constitutes “successful” participation in the Game. Ironically, it’s a territorial argument over ideology. But winning an egotistical war suggests to me that we might better spend our efforts re-evaluating our culture and how we wish to treat other people.

Fair enough, people want to be “right.” We all have egos. But can we at least offer some basis from which to claim what the word “successful” can mean? So here goes.

Since losing a match always remains a possibility, no matter how consistent our quality performance might be, we ought to measure “success” as the degree to which a player or team has developed that consistent quality of performance (process) over time, at their corresponding level and motive for play, regardless of winning (product).

**I’ll specify, as I did above, that where wins are lucrative – as in professional play – the stakes grow higher, and different debates will ensue about what “success” means. Yet that’s a commercial issue, relating to development and learning only by way of people’s patience and tolerance for financial pleasure or pain. In other words, the two issues are not inherently related but coincidental: a crowd of supporters or sponsors is willing to pay to back the team for a season.**

For the Game itself, we must let winning take care of itself because players control only what they are able to control, under conditions that also include the pitch, the ball, the referee, the weather, health, fitness, and so forth. So what can we measure? Measurements ought to fall under player and team control, e.g. shots at goal, completed passes, tackles won, saves made, etc. Far from counteracting the importance of winning, such consistent measurements of quality performance provide feedback: if our pass completion is 90% around the penalty box, yet we don’t score, perhaps our shooting is infrequent or inaccurate. One might even argue that the statistical measurements we gather are less important than the ones we’ve overlooked.
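The feedback loop described here – tally what players control, then read the rates against one another – can be sketched in a few lines. This is purely illustrative; all metric names and figures below are hypothetical, not real match data:

```python
# Minimal sketch: tally a few metrics that players and teams actually
# control, then compute success rates to compare against one another.
# All figures and metric names are hypothetical, for illustration only.

def rate(successes: int, attempts: int) -> float:
    """Return a success rate as a percentage (0.0 when nothing was attempted)."""
    return round(100 * successes / attempts, 1) if attempts else 0.0

# Hypothetical single-match tallies.
match_stats = {
    "passes_completed": {"made": 412, "tried": 458},
    "shots_on_target":  {"made": 3,   "tried": 11},
    "tackles_won":      {"made": 14,  "tried": 19},
}

rates = {name: rate(s["made"], s["tried"]) for name, s in match_stats.items()}
# A ~90% pass-completion rate alongside a low shots-on-target rate is
# exactly the kind of contrast described above: quality build-up play
# that is not yet translating into scoring chances.
```

Read side by side, the rates point training where it is needed, which is the feedback the measurements exist to give.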

In any case, successful players and successful teams identify strong and weak areas by regularly measuring quality across a range of performance details, and they develop each area for consistency – which we anticipate will translate into more wins, because consistent quality performances usually register as ongoing success. “Success” now denotes a degree of purposeful, committed, consistent hard work, which makes for more focused, more effective training. Developmentally, the more successful you are, the more often you can theoretically win – but if your opponents also train and measure, and respond better than you do, then guess what? That’s called competition.

Development and winning not only can but already do co-exist. And they always have. It’s people who separate them, falsely, perhaps because they want to win more than they want to earn wins – or, worse, perhaps because they merely want to win a territorial argument about development vs winning that never existed before someone’s ego dreamt it up.

Beyond on-field training and competing, development and learning should cover a range of areas that affect yet lie beyond the Game, e.g. health, fitness, nutrition, goal setting, mental preparation, personal responsibility. Coaches ought to take players beyond the Game, teaching them how to train, how to contribute to a team, how to compete at higher levels of skill and intensity, how to manage the dynamics and emotions of competition, and how to conduct themselves with personal integrity in all respects. Of course, the Game is included within the scope of these matters because it is why we are a team in the first place. Together, these inclusions constitute a more holistic football program, which we implement and evaluate as we go – or we ought to.

Effective programs inevitably reveal the crux of commitment, whether thanks to people’s dedication or on account of their inconsistency. Effective programs encourage trust and a shared pursuit of common goals. Where trust and commitment are maintained consistently and respectfully, a team and its members learn to measure quality and respond consistently, i.e. successfully. Such programs require time, discipline, and patience to learn, but the degree to which participants buy into the philosophy is met with commensurate developmental consistency and, again, one can expect winning to result more often than not, relative to the quality of the opposition. Likewise, individuals can take credit for this-or-that achievement only relative to their teammates, who are also active participants in the program.

Active participation should find team members applying complementary strengths by filling key roles on the path to team success. Individual contributions accumulate, and if these have been consistently defined by common goals and measured for consistent quality, “success” is more likely because people can envision it more clearly and pursue it more meaningfully.

Opponents, especially of equal or slightly higher abilities, likewise play a key role in a team’s pursuit of success since measuring consistent quality performances against them is, in one sense, what the Game – and what sport – is all about. Active involvement in a program unites a team, preparing everyone for more advanced challenges. Occasionally, a teammate might advance to more elite programs, and when a team member grows beyond the scope of the program, that is a team success that all of us can share.