Teaching’s Other Greatest Reward

“Texts are not the curriculum,” I was told during Pro-D by an administrator, the Director of Curriculum and Innovation. The session had been arranged to introduce a revised K–12 curriculum and was billed as a great unfolding at the onset of the 21st century. “Texts are a resource for implementing lessons and practising skills,” she concluded. By this, I took her to mean that notation, for example, is a resource for students to finger piano keys or pluck guitar strings, which is something music teachers might accept. I took her to mean that landscape is fodder for brushstrokes and blending, something art teachers might accept. I took her to mean that a poet’s intimate, inspired reveries, shared in careful verse, are raw material for students who are learning to analyse and write, which I grant English teachers might accept. I took her to mean that I should consider her remark a resource and that this issue was now settled, which some teachers in earshot seemed to accept. To this day, I wonder whether a musician, or a painter, or a poet might accept her remark, but in that moment, I let it go.

I suppose I should be more forthcoming: I used to joke with parents, on Meet the Teacher Night, that I could be teaching my coursework just as well using texts like Curious George and a recipe book. That I decided to use Shakespeare, or Sandra Cisneros, or Thomas King, and that I would in fact be asking students literally to stare out the window as part of a textual analysis exercise—all just as arbitrary—illustrated the point: I built my course around some particular themes that reflected me and what I believed important about life. This, in turn, was meant to illustrate to students, and now parents, how bias plays a noteworthy if subtly influential role in our lives and our learning.

My larger points were twofold: firstly, no, texts are not the curriculum per se, and, secondly, our Department’s approach to English Language Arts (ELA) focused more on skill development, less on content consumption. For us, anyway, the revised curriculum was reaffirming. What I merely assumed in all this—and presumed that parents assumed it, too—was that our Department’s approach was commensurate with the school’s expectations, and the Ministry’s, as well as with our province’s educational history and the general ELA approach found in classrooms across North America, for which I had some, albeit minimal, evidence. As a secondary ELA teacher, I chose my texts on the basis that they helped expedite my curricular responsibilities. I suppose it would be fair to say that, for me, texts were a resource for implementing lessons and practising skills.

What was it, then, that niggled me about the Director’s comment at the Pro-D session? Did it have to do with decision-making, as in who gets to decide what to teach, and how, and why? Would that make it about autonomy, some territorial drawing of lines in professional sand? Was it more my own personal confrontation, realising that musicians and painters and poets deserve better than to be considered lesson fodder? I had never approached my lessons so clinically or instrumentally before—had I? Maybe I was having my attention drawn into really considering curriculum, taking the time to puzzle out what that word means, and implies, and represents. And if I never really had puzzled it out, what kind of experience was I creating for my students? I’ve always felt that I have done right by students, but even so… how much better, still, might be done?

Months later, I sat at a table doing prep work next to a colleague, and a third sat down to join us. Eventually, as the conversation turned from incidents to editorials, the third teacher spread her hands wide and concluded, “But ultimately education is all about relationships.” In the next split-second moment, I was confronted by the entirety of my teaching philosophy, nearly a clarion call, except I had nowhere to stand and run, so I just remained in my seat, quietly agreeing and chuckling at the truth of it all. We all did. That was my final year before returning as a student to a doctoral program. These days, I search and select texts to read so I can write texts of my own about particular themes that reflect me and what I believe important about curriculum, and teaching, and education.

I should say I no longer wonder why it was not the Director’s remark that day, about texts, that set me to thinking about curriculum, but rather my colleagues, sitting and chatting around that table.

The Conceit of A.I.


From a technological perspective, I can offer a lay opinion of A.I. But check out some opinions more technical than mine, too:

MIT: The Seven Deadly Sins

Edge: The Myth of AI

The Guardian: The Discourse is Unhinged

NYT: John Markoff

Futurism: You Have No Idea…

IEET: Is AI a Myth?

Open Mind: Provably Beneficial Artificial Intelligence

Medium: A Critical Reading List

AdWeek: Burger King



Time and energy… the one infinite, the other hardly so. The one an abstraction, the other all too real. But while time ticks ceaselessly onward, energy forever needs replenishing. We assign arbitrary limits to time, by calendar, by clock, and as the saying goes, there’s only so much time in a day. Energy, too, we can measure, yet often we equate both time and energy monetarily, if not by actual dollars and cents: we can pay attention, spend a day at the beach, save energy – the less you burn, the more you earn! And certainly, as with money, most people would agree that we just never seem to have enough time or energy.

Another way to frame time and energy is as an investment. We might invest our time and energy learning to be literate, or proficient with various tools, or with some device that requires skilful application. Everything, from a keyboard or a forklift or a tennis racquet to a paring knife or an elevator or a golf club to a cell phone or a self-serve kiosk or the new TV remote, everything takes some knowledge and practice. By that measure, there are all kinds of literacies – we might even say, one of every kind. But no matter what it is, or how long it takes to master, or why we’d even bother, we shall reap what we sow, which is an investment analogy I bet nobody expected.

Technology returns efficiency. In fact, like nothing else, it excels at creating surplus time and energy, enabling us to devote ourselves to other things and improve whichever so-called literacies we choose. The corollary, of course, is that some literacies fade as technology advances. Does this matter, with so many diverse interests and only so much time and energy to invest? How many of us even try everything we encounter, much less master it? Besides, for every technological advancement we face, a whole new batch of things must now be learned. So, for all that technological advancement aids our learning and creates surplus time and energy, we as learners remain the central determinant of how we use our time and energy.

Enter, into the classroom, what’s lately been called Artificial Intelligence (A.I.). Of course, A.I. has received plenty of enthusiasm, attention, concern, and critique as a developing technological tool, for learning as well as plenty of other endeavours and industries. A lengthy consideration from The New York Times offers a useful, broad overview of A.I.: a kind of sophisticated computer programming that collates, provides, and predicts information in real time. Silicon Valley designers aim to have A.I. work at least somewhat independently of its users, so they have stepped away from older, familiar input-output modes, what’s called symbolic A.I., a “top down” approach that demands tediously lengthy entry of preparatory rules and data. Instead, they are engineering “from the ground up,” building inside the computer a neural network that mimics a brain – albeit a very small one, rivalling a mouse’s – that can teach itself via trial and error to detect and assess patterns found in the data that its computer receives. At these highest echelons, the advancement of A.I. is awe-inspiring.
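
If a picture helps, here is a toy sketch of my own devising – Python, chosen merely for illustration, and nothing any Silicon Valley lab actually ships – showing one trivial task handled both ways: “top down,” with the rule typed in beforehand, and “ground up,” with a single artificial neuron left to find the rule by trial and error.

```python
import random

# "Top down" (symbolic A.I.): the programmer supplies the rule explicitly.
def symbolic_classify(x, y):
    return "inside" if x + y > 1.0 else "outside"

# "Ground up": one artificial neuron starts knowing nothing and nudges
# its weights whenever trial-and-error feedback says it guessed wrong.
w1, w2, b = 0.0, 0.0, 0.0
rate = 0.1

for _ in range(10000):
    x, y = random.random() * 2, random.random() * 2
    target = 1.0 if x + y > 1.0 else 0.0                 # feedback from the world
    guess = 1.0 if (w1 * x + w2 * y + b) > 0 else 0.0    # the neuron's attempt
    error = target - guess
    w1 += rate * error * x                               # adjust toward less error
    w2 += rate * error * y
    b += rate * error

# Both now answer alike, but only one of them was ever told the rule.
print(symbolic_classify(0.9, 0.8))                                # "inside"
print("inside" if (w1 * 0.9 + w2 * 0.8 + b) > 0 else "outside")   # learned: "inside"
```

A mouse’s brain it is not, but scale that single neuron up by a few million, and the “ground up” metaphor starts to earn its keep.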

Now for the polemic.

In the field of education, where I’m trained and most familiar, nothing about A.I. is nearly so clear. Typically, I’ve found classroom A.I. described cursorily, by function or task:

  • A.I. facilitates individualized learning
  • A.I. furnishes helpful feedback
  • A.I. monitors student progress
  • A.I. highlights possible areas of concern
  • A.I. lightens the marking load

On it goes… A.I., the panacea. Okay, then, so in a classroom, how should we picture what is meant by “A.I.”?

Mr. Dukane
“Anybody remember Mr. Dukane?”

Specific examples of classroom A.I. are hard to come by, beyond top ten lists and other generalized descriptions. I remember those library film-strip projectors we used in Grade 1, with the tape decks attached. Pressing “Play,” “Stop,” and “Eject” was easy enough for my six-year-old fingers, thanks to engineers who designed the machines and producers who made the film strips, even if, the odd time, the librarian had to load them for us. (At home, in a similar vein, how many parents ruefully if necessarily consider the T.V. a “babysitter”? Although, granted, these days it’s probably an iPad. But personification does not make for intelligence… does it? Didn’t we all understand that Max Headroom was just a cartoon?) There’s a trivia game app with the hand-held clickers, and there’s an on-line plagiarism detector – both, apparently, are A.I. For years, I had a Smart Board although I think that kind of branding is just so much capitalism, and harshly cynical at that. Next to the Smart Board was a whiteboard, and I used to wonder if, someday, they’d develop some windshield wiper thing to clean it. I even wondered if someday I wouldn’t use it anymore. For the record, I like whiteboards. I use them, happily, all the time.

Look, I can appreciate this “ground-up” concept as it applies to e-machines. (I taught English for sixteen years, so metaphor’s my thing.) But intelligence? Anyway, there seems no clear definition of classroom A.I., and far from seeming intelligent to me, none of what’s out there even seems particularly dim-witted so much as pre-programmed. As far as I can tell, so-called classroom A.I. is stuff that’s been with us all along, no different these days than any tool we already know and use. So how is “classroom A.I.” A.I. of any kind, symbolic or otherwise?

"... so whose the Sub?"
“Hey, so who’s the Sub today?”

Symbolic A.I., at least the basis of it, seems not too dissimilar to what I remember about computers and even some video arcade favourites from back in the day. Granted, integrated circuits and micro-processors are a tad smaller and faster these days compared to, say, 1982 (… technology benefitting from its own surplus?). Perhaps more germane to this issue is the learning curve, the literacy, demanded of something “intelligent.” Apparently, a robot vacuum learns the room that it cleans, which, as I gather, is the “ground-up” kind of A.I. Now, for all the respect and awe I can muster for a vacuum cleaner—and setting all “ground-up” puns aside—I still expect slightly less from this robot than passing the written analysis section of the final exam. (I taught English for sixteen years, so written analysis is my thing.) It seems to me that a given tool can be no more effective than its engineering and usage, and for that, isn’t A.I.’s “intelligence” more indicative of its creator’s ingenuity or its user’s aptitude than of itself or its pre-programmed attributes?
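
For the curious, the vacuum’s feat might be sketched something like this – a toy of my own invention, certainly not any vendor’s firmware – wherein “learning the room” amounts to remembering where the bumps happened:

```python
import random

# A pretend floor plan: '#' is wall or furniture, '.' is open floor.
room = [
    "#######",
    "#.....#",
    "#..#..#",
    "#.....#",
    "#######",
]

learned_obstacles = set()              # the robot's entire "education"
pos = (1, 1)                           # start in an open corner
moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]

for _ in range(500):                   # wander, bump, remember
    dr, dc = random.choice(moves)
    nxt = (pos[0] + dr, pos[1] + dc)
    if room[nxt[0]][nxt[1]] == "#":
        learned_obstacles.add(nxt)     # a pattern detected: don't go there
    else:
        pos = nxt

print(f"Mapped {len(learned_obstacles)} obstacles; written analysis: still pending.")
```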

Press Any Key to Begin

By the same token, could proponents of classroom A.I. maybe just ease off a bit from their retcon appropriation of language? I appreciate getting caught up in the excitement, the hype—I mean, it’s 21st century mania out there, candy floss and roller coasters—but that doesn’t mean you can just go about proclaiming things as “A.I.” or, worse, proclaiming A.I. to be some burgeoning technological wonder of classrooms nationwide when… it’s really not. Current classroom A.I. is simply every device that has always already existed in classrooms for decades—that could include living, breathing teachers, if the list of functions above is any guide. Okay then, hey, just for fun: if classroom tools can include teachers who live and breathe, by the same turn let’s be more inclusive and call A.I. a “substitute teacher.”

Another similarly common tendency I’ve noted in descriptions of classroom A.I. is to use words like “data,” “algorithm,” and “training” as anthropomorphic proxy for experience, decision-making, and judgment, i.e. for learning. Such connotations are applied as simply as we might borrow a shirt from our sibling’s closet, as liberally as we might shake salt on fries, and they appeal to the like-minded, who share the same excitement. To my mind, judicious intelligence is never so cavalier, and it doesn’t take much horse-sense to know that too much salt is bad for you, or that your sibling might be pissed off after they find their shirt missing. As for actually manufacturing some kind of machine-based intelligence, well… it sure is easy to name something “Artificial Intelligence”; it’s quite another thing to bestow “intelligence” by simply declaring it! The kind of help I had back in the day, as I see it, was something I just now decided to call “S.I.”: sentient intelligence.

Facetiousness aside, I grant probably every teacher has spent some time flying on auto-pilot, and I’ve definitely had days that left me feeling like an android. And fair enough: something new shakes things up and may require some basic literacy. There’s no proper use of any tool, device, or interface without some learned practical foundation: pencil and paper, protractor, chalk slates, the abacus. How about books, or by ultimate extension, written language, itself? These are all teaching tools, and each has a learning curve. So is A.I. a tool, a device, an interface? All of the above? I draw the line when it comes to classroom tools that don’t coach the basketball team or have kids of their own to pick up by 5pm: the moniker “A.I.” seems more than a bit generous. And hey, one more thing, on that note: wouldn’t a truer account of A.I., the tool, honour its overt yet seemingly ignored tag, “artificial”? R2D2 and C-3PO may be the droids we’re looking for, but they’re still just science fiction.

Fantastic tales aside, technological advancements in what is called the field of A.I. have yielded, and will continue to yield, useful, efficient innovation. And now I mean real Silicon Valley A.I., not retcon classroom A.I. But even so, to what ends? What specifically is this-or-that A.I. for? In a word: why? We’re headed down an ontological road, and even though people can’t agree on whether we can truly know our own selves, we’re proceeding with A.I. in the eventual belief that it can. “It will,” some say. Not likely, I suspect. Not ever. But even if I’m wrong, why would anyone hope that A.I. could think for itself?

Artificial Intelligence
10. Be “A.I.”    20. Go to 10     Run

Hasn’t Heidegger presented us with enough of a challenge, as it is? Speaking of time and energy, let’s talk opportunity costs. Far greater minds than mine have lamented our ominous embrace with technology. Isn’t the time and energy spent on A.I.—every second, every joule of it—a slap in the face to our young people, and to the investment that could have been made in them? It’s ironic that we teach them to develop the very technology that will eventually wash them away.

Except that it won’t. I may be out on a limb to say so, but I suspect we will sooner fall prey to the Twitterverse and screen-worship than A.I. will fulfil some sentient Rise of the Machines. The Borg make good villains, and even as I watch a lobby full of Senior Band students in Italy, staring at their iPhones, and fear assimilation and, yes, worry for humanity… even then I reconsider because the Borg are still just a metaphor (… sixteen years, remember?). As a teacher, I am more driven to reach my students with my own message than I am to snatch that blasted iPhone from their hands, much as I might like to. On the other hand, faced with a dystopian onslaught of Replicants, Westworld Gunslingers, and Decepticons, would we not find ourselves merely quivering under the bed, frantically reading up on Isaac Asimov while awaiting the arrival of Iron Man? Even Luke Skywalker proved susceptible to the Dark Side’s tempting allure of Mechanized Humanity; what possible response could we expect from a mere IB cohort of inquiry-based Grade 12 critical thinkers and problem-solvers?

The Borg
“Resistance is futile.”

At the very least, any interruption of learners by teachers with some classroom tool ought to be…

  1. preceded by a primer on its literacy,
    • i.e. explaining how to use that particular tool in…
  2. a meaningful context or future setting,
    • i.e. explaining why to use that particular tool, before anybody…
  3. begins rehearsing and/or mastering that particular tool,
    • i.e. successfully executing whatever it does

If technology helps create surplus time and energy, then how and why and what had better be considered because we only have so much time and energy at our disposal. The what, the how, and the why are hardly new concepts, but they aren’t always fully considered or appreciated either. They are, however, a helpful means of focusing that few lessons should be without.

As a teacher, sure, I tend to think about the future. But that means spending time and paying attention to what we’re up to, here and now, in the present. To that end, I have an interest in protecting words like “learning” and “intelligence” from ambiguity and overuse. For all the 21st century hearts thumping over the Cinderella-transformation of ENIAC programmable computation to A.I., and the I.o.T., and whatever lies beyond… for all that, our meagre acknowledgement of the ugly step-sister, artificiality, is foreboding. Mimicry is inauthentic, but it is not without consequence. Let’s take care that the tools we create as means don’t replace the ends we originally had in mind because if any one human trait can match the trumpeting of technology’s sky-high potential—for me at least, not sure for you—I’d say it’s hubris.

Another fantastic tale comes to mind: Frankenstein’s monster. Technological advancement can be as wonderful as it is horrifying, though usually it lands somewhere in between. However it’s characterised or defined by those who create it, it will be realised in the end by those who use it, if not by those who face it. In 1982, the cell phone was hardly imagined. Four decades later, faces down and thumbs rapid-fire, its ubiquity is hardly noticed.



A Kind of Certainty

IV. A Kind of Faith

For all this, what exactly does it mean to be educated? From the sole perspective – yours, mine, anybody’s – free thinking means freedom granted to individuals to believe and behave as they do, then investing proportionate faith that they will continue to believe and behave as we do.

Of course, anyone’s beliefs might vary, freely, from ours, as compared to everyone conforming to the same beliefs and behaviours. Imagine that world, where every inhabitant lived according to self-established morality. In such a world, how would there come about any rule of law? Even real, lived experience here in Canada is tenuous, relying on everyone to rely on everyone else.[1] Whether out of respect for each other, out of gaining some advantage, out of fear of paying a fine or going to jail – on it goes, accountability, but the individual freedom we avouch is as ready to dissipate as the smoke of a powderkeg. For all its enlightenment, free-thinking is quicksand: shifting, uncertain, deceiving, solid ground by mere appearance. Is it any wonder that the liberty and reason of Enlightenment individuation has led us to Post-modernism, relativism, identity politics, and alternative facts? Be careful what you wish for. If there are any true binaries, to trust or not to trust must certainly be one. What need for faith when we trust that we are all alike, that all around is 100% certain?

Such a world is hardly plausible for me. I have learned not to trust everybody I meet. In the world I know, we need discernment and persuasive rhetorical skill to skirt potential conflicts and get others onside. And when others have discernment and persuasive rhetorical skill, too? Seen in that light, the curricular task is competitive, not cooperative. Even so, we might still argue that curriculum is collaborative, and it does not have to be belligerent. Curriculum falls within the scope of some given morality, morality being a question of right and wrong, positive opposing negative: to x, or not to x. However, curriculum itself is an ethical choice between alternatives and is, thereby, an empowering decision. We must therefore ask to x, or to y, which are positives, a question of competing rights, and not right competing against wrong.

And anywhere right does oppose wrong, curriculum should not permit a choice because wrong is simply wrong and not something that responsible choice can decide.[2] Beyond simply learning about the freedom to think, curriculum is about learning how to make choices that are set within the scope of defined morality. Question the morality, compare it to another morality, and we are Hamlet: we are lost. But decide, and accept the morality, and question only those choices intrinsic to its milieu… now we are educating ourselves and others, however precisely or narrowly, for as long as we care to pursue whatever makes us curious.

For me, someone is educated who thinks, and discerns, and has aims. Admittedly, such aims could be countered or rationalised pragmatically or else, more perversely, aimed beyond oneself to harm others – thinking in itself, after all, is not inherently moral. So if morality is a thing to be taught and also learned, then an educated person, for me, is someone who learns generosity of some kind, hospitality. Being educated means learning to give of oneself, for others or on behalf of others, in positive, constructive ways. This belief, I suppose, reflects my learned morality, which I am pleased, as much in caring as in utility, to pass along. Perhaps your morality differs.

To that end, education, in itself, should intentionally be both constructive and benevolent in consideration of that sense of kairos, what is appropriate in the moment for teacher and learner, even as those moments accumulate over the passage of chronos-time, like endless waves upon the shore.[3] Then again, who am I to anybody, such that the sole importance of my opinion should determine an education? If I am outnumbered, what is this sense of education that I describe but some solitary means of facing an existence nasty, brutish, and short? This thing called school will be the death of me!

Hawai'i Summer 2008
“ ’Bove the contentious waves he kept, and oar’d / Himself with his good arms in lusty stroke–” Away from shore? A certainty all its own…

See? Recruiting Hamlet’s cycle of misery seems all too easy “where the postmodern turn of mind appears to privilege the particular over the general” (Roberts, 2003, p. 458). Frankly, I think our present culture regards the individual far too much. Naturally, the consequent short-changing of the bigger community picture has been playing out over chronos-time since, with every decision, there has been consequence. However, Roberts continues, “for Freire both [the particular and the general] depend on each other for their intelligibility.” So perhaps a good education – by which I mean not just a moral one but an effectual one – is best measured with due consideration for its balance of the particular and the general, the heterogeneous and the homogeneous, the certainty and the ambiguity, the inductive and the deductive. A little healthy scepticism, a little cloud for the silver lining. A little dram in the substance, to paraphrase Hamlet. “A little dab’ll do ya,” quips McMurphy.[4] You can’t have one without the other, sings the primus inter pares.[5]

We defy augury by flouting convention, even law, because we are free agents who do what we please. Some will have more courage than others, and some are just more foolhardy, but no one is entirely predictable. We defy augury by being unpredictable, even inscrutable, although maybe the rest of you just never really knew me that well to begin with. Sometimes I even surprise myself. We defy augury by defying our senses, by not comprehending the world that we apprehend, which really is to say we see only what we want to see and recognise only what we already know. If there is special providence in the fall of a sparrow, what matter when we have spent all our time watching the chickadees? I cannot shake free from critiquing our cultural veneration of the individual: the less our shared beliefs converge and reciprocate a healthy community, and the greater our insistence upon personal liberty to go our own way, the more we miss the point of what freedom really is. True freedom results from having choices, and what creates choice is not the persuasive liberty of unequivocal individualism but discipline: to do ‘x’, or ‘y’, or ‘z’.

Shakespeare’s “Let…” statements are not so colloquial as to suggest the fatalism of c’est la vie, or the aimlessness of go with the flow[6] – these, for me, amount to giving up, or else giving in. The tragedy of Hamlet is that the curriculum he really needed – the people he could trust, who would be willing to help him – they were already there, at his side the whole time, as ready and willing as ever, so long as he gave a little back, so long as he offered just a dram of willingness to coincide with their beliefs – to his own scandal, maybe, but who in the real world is so selfish as to expect to have their cake and eat it, too?[7] As compared to going it alone, Hamlet might have humbled himself and cast his lot with those to whom he is closest.[8] His education from Wittenberg proved sufficient to challenge his upbringing in Elsinore, amply suggested by his continued trust to enlist and confide in Horatio throughout the play; as far as that went, the rest of us would do well to heed his lesson with due respect: if only Hamlet had not divided his loyalty but decided, once and finally, exactly who he was and whom he trusted, then lived up to his declaration with discipline. With integrity.

The most common criticism aimed his way by my students was essentially, “Get over yourself, and grow up!” Make a decision with the discipline to accept the consequences, which is to say, accept your personal responsibility. To be fair, Hamlet finally, triumphantly, does place his faith in Horatio, whom he entrusts to tell his story. Granted, he only asks once he is terminally poisoned, but hey, better to ask while alive to breathe the words than come back and haunt Horatio as the next in a line of Ghosts. As for Shakespeare, whatever exactly it was that he saw in us, this ethical curricular dilemma, evidently he felt its redemptive quality was worth its cost, as Horatio makes known – or will do – by pledging to tell his dying friend’s tale to Fortinbras. Shakespeare’s appeal by way of Hamlet is not one of giving up or giving in. It is one of giving over, to something bigger than ourselves, to something in which faith placed is faith assured, and “attuned” (Pinar, 2017b, p. 1), and certain beyond our own devices.

What that object of faith might be… perhaps it comes as no surprise, but Shakespeare has a “Let…” statement for that, too: “… let your own discretion be your tutor” (3.2.17). I never included this one in the list for my students because, until writing this essay, I had never fit it in as such a central constituent. Hamlet delivers the line, as any nervous director might do opening night, during the aforementioned lecture to the Players before the Mousetrap performance.[9] All the more ironic, of course, is that his lecture hardly exemplifies the statement, which would be fine if Hamlet, the director, did not assume the stage during the performance but let the actors get on with their craft. Hamlet, by contrast, twice assumes the stage to augment the performance. (Ahh, what to do about such insecurity… at least he sells tickets, you may remember.) Anxious or not, the wisdom of his advisement, taken for all, is easy for a lay audience to misinterpret, particularly as it comes buried within lines of such mundane theatrical detail. Shakespeare does not suggest that we give in to our discretion, carte blanche. He suggests that we give over to our discretion as a kind of teacher-student relationship.

Let curriculum be to trust your own better judgment, to search your feelings,[10] yet to grant with humility that more may exist than meets the eye. Let discretion be a “tutor,” yet while you let it, also think before you act – and think during and after, too – because “… the purpose of playing… was and is, to hold… the mirror up to nature” (3.2.17-23). Whether this amounts to something esoteric or spiritual is down to the beholder,[11] yet if that is true for any one of us, it must be true for all of us. Each one of us is finite and individual, and curriculum is composite, a whole greater than the sum of its parts, as in all of us, transcending time and space. As a force of faith, curriculum is vast indeed.




Endnotes

[1] How often I referred students to Canadian Liberal MP Stephen Owen’s definition for democracy: “the pluralistic respect for citizens empowered to self-govern within the rule of law.” Democracy, so often simplified as “majority rule,” is more accurately understood (in my opinion) as entirely dependent upon its constituents. Democracy works because we all agree to make it work. Every member therefore has a personal responsibility to respect and live up to the standard of the law on behalf of every other member. One disobedient person weakens the system and places everybody, including themselves, at risk. Either we set that person straight, or we jail them, but unless we protect the system, we are only certain to lose it.

[2] *Sigh*… culture precedes law, I would argue, and we endlessly debate and litigate what should be right versus what should be wrong. This is politics and the justice system at work, issue by issue, and with enough lobbying and/or civil disobedience, any given topic might be up for consideration.

[3] Okay, so I did find a way to toss in some surf.

[4] https://youtu.be/d_mASr1djMM?t=1m33s (Zaentz, Douglas, & Forman, 1975)

[5] aka the Chairman of the Board, aka Ol’ Blue Eyes

[6] In Canada, we might say that Shakespeare’s appeal to “let go” means don’t grip the stick too tight. “Hold on loosely,” as Donnie Van Zant would sing, or “Give a little bit,” from Roger Hodgson. Neither fully clarifies the expression, as I gather Shakespeare intended it, but the notion of giving way in deference to others is helpful, for a start.

[7] Of course, the best rejoinder here would be, “He who dies with the most toys wins,” to which I would reply, “You can’t take it with you.” But dialectical bumper-stickers were never my strong suit, and I digress, even for end-notes.

On second thought, the best rejoinder is to say Hamlet is fictional, not of the real world. All the more reason to admire him as perhaps Shakespeare’s best creative feat, so life-like are he and the rest of the characters who populate the play.

[8] Between Ophelia and Horatio, he nearly does so twice, and even towards Gertrude he aims some meager hope and sympathy. Alas, yet another essay…

[9] Shakespeare includes numerous allusions throughout the play to the theatre milieu, its characters and culture, and its place in Elizabethan society, many of which can be construed as humorous and even as insider jokes shared amongst his theatre company and his regular audience.

[10] https://youtu.be/bv20ZoBcdO8?t=1m43s (Kurtz & Kershner, 1980). A clever mash-up of the Star Wars scene with characters from The Lion King (https://www.youtube.com/watch?v=Mf6JKC_V4v0) suggests that this idea about an inner core of balanced discretion or a healthy scepticism, if not desperate inner turmoil, has resonated beyond Shakespeare’s work into our own theatrical pop culture.

[11] I learned, for my own spiritual belief, to distinguish between what many religions have people do, as compared to what God through Christ has already done. The primary reference, here, is to the Resurrection and what Christ has done for all. Whether one chooses to believe or not is up to them, and should be, which is the essence of my belief: what comes down to a matter of personal choice is to believe, or not to believe. Consider Ephesians 2:8-9, for example, in which Paul explains that we are saved not by works but by grace, so that none can boast: justification by grace through faith in God is the essence of Christianity, and I emphasise that part of it left up to us, to have faith in God. Some consider this ridiculous, and that is neither here nor there to me although I wish no ill upon anyone. Upon believing, upon faith, one can grasp how a selfless attitude of giving – giving of oneself – matters as compared to more selfish concerns over what is given or how much is given.

Such concerns do arise since, as I believe, all inherit Original Sin, a concept that one must accept before anything else in Christian doctrine of any stripe will make sense: we all have inherited an imperfection to believe and have faith in our selves, apart from the God who created us; to go our own way; to obey our own inclinations and not His. This pride-of-self, set in motion by the conniving serpent’s lure that whetted Eve’s curiosity, then Adam’s, enough for them to disobey one simple command… this original “missing of the mark” prompted Adam, Eve, and all their offspring to realise within themselves what had never before even appeared on their radar screens: that obedience was only appreciable once disobedience had been tried. It’s the same binary idea as saying, “You only really understand peace once you experience war,” and so forth. So, for instance, in offering to God (Genesis 4:3-4)… where Cain brings some, Abel brings the choicest; yes, each still gives, yet Cain is furious upon seeing the difference in God’s response between their offerings. The sense is that Abel gives in faithful obedience what Cain withholds for himself, Abel trusting God, in a way that Cain does not, that God will give back and look after him. Cain trusts in what he can manage and control for himself; evidently, he does not trust like his brother that God will give back. Perhaps he does not even believe that God created them although, if he does believe this, how much worse his distrust.

Avenging his own honour by killing his brother is a choice Cain makes, entirely selfish and sinfully predictable. This, for me, opens explanation as to why God allows evil to prosper: He gave us free will, in His image, out of love, to choose or to not choose His gift of salvation; to believe or not to believe in His Gospel, as a matter of faith; to trust Him or to trust something else. In either case, we, the people, are answerable for all we do. As I say, back then, Cain perhaps did or didn’t know he was God’s creation – he is left to his own account for that. These days, though, how many people hardly even consider God as real, much less as Creator or Benefactor? However, if God offered us no doubt of His existence, then what would necessitate faith? Were He to provide 100% certainty, anyone then would have no choice but to believe, of necessity, or else be a fool not to believe and delude themselves in spite of the certainty. As it is, some think believers are deluded; truly, you can’t convince all the people all the time, and you definitely should not force belief. All this, for me, is consistent with a caring God who has conferred free will. So, where some condemn believers as guilty of the crimes and evils committed in the name of Christianity (or religions altogether), in fact, I fully agree: hateful beliefs and violent acts are an abomination of how God would have us treat each other.

But, again, he has bestowed upon us the free will to decide and behave, and I argue that all such crimes and evils, whether in the name of religions or not, reflect Original Sin, our turning-away from God; they do not reflect God. They cannot reflect the character of God, whose nature is neither criminal nor evil; rather, they reflect the character of our selves, who are selfishly proud. People are responsible for bastardising and usurping doctrine in order to gain for themselves, something akin to Cain, so blatantly, transparently selfish. Further, as that kind of belief and behaviour continues, it roots until generations have perhaps forgotten or lost any other way to believe and behave. We are human, taken for all, and finite in power and awareness. We can do no other than we continue to prove ourselves capable of doing – and in this I include both good and evil that we do – and this, truly, is why we’re in need of salvation. So much gets lost in scriptural debate over details – details that warrant discussion yet, being details, they are also prone to misinterpretation and thereby require careful, long-studied contextual understanding – but the basic doctrine and the loving character of God I find rather straightforward. It’s people who complicate and screw it up, not God. And I’m as guilty, neither better nor worse but just plain equal to every other person trying to live under our circumstances. So I try my best to respect people’s dignity, everyone’s.

My choice has been to believe based on the preponderance of evidence that I’ve learned and studied for many years – the careful, long-studied contextual understanding I mention above. I have plenty more to learn, but my point is that I did have to learn, to begin with. I did not just suddenly have some nuanced supreme understanding of Christian doctrine – indeed, I’m wary that superficial knowledge is so frequently the cause of the crimes and evils people commit in the name of religion. I consider myself blessed to have had the freedom to choose what to study without duress and to have had an education provided by good teachers who understood what makes for good curriculum. I have never felt assaulted or oppressed as far as my education is concerned – or my life, for that matter – and, furthermore, I achingly, mournfully recognise that so so many others cannot agree. Why not me, I can’t say, but I count myself as blessed for this, if for no other reason in my existence. I know so well that not everyone has enjoyed such Providence.

There is so much abuse and violence out there, person-upon-person, and I suggest that I, or you or anyone, ought to be enabled to read, search, and decide for ourselves whether or not to believe something. And never forced, and never judged. Personally, I’m not a big church-goer – I have done, but I don’t much anymore. But I still quietly personally maintain my faith. Even offering this endnote struck me as bold, but I wanted this post to be thorough and honest. I believe evidence exists – we have only to look for it: “Knock, and the door shall be opened” is God’s encouragement, to be proactive and search for Him rather than sitting idly by awaiting, or else ignoring, His imminent return. Nonsense, this, for some. And I can comprehend the doubt. But I don’t share it. By the same token, I offer my testimony, but I don’t impose it. People today who demand to see evidence – God performing miracles, say – are asking Him to lay foundations all over again. But, by analogy, a building only needs one foundation, so why would God repeat that process? Enough evidence has been documented over time, for me, that I now readily believe and join the church being built on the existing foundation. Again, as I opened this rather long endnote, what matters most is what He has already done: we have only to believe, with no further need to see more miracles, which is really what having faith is all about.