“… Whose the Forest of Them All?” See What I Did There?

Imagine somebody offers you a friendly smile, but you snarl back. What might be their next reaction? Would they be amused and take it as a friendly jibe, just typical “you”? I suppose that would depend on how well they knew you. Would they be bemused because they don’t know you so well? Really, snarling at a friendly smile…? We’re perfect strangers, for goodness’ sake! Would they be confused because they’re not from around here and just can’t reckon the response in any way? A person’s reaction to your snarl might conceivably be anything—it depends on so many factors, and even in these three suggestions, one can find how-many-more details, nuances, and possibilities that take things further. Any “next reaction,” you might finally conclude, just depends on the person.

That response, “it depends,” is often criticised as merely wishy-washy, yet there’s an ironic ring of absoluteness to it, like the postmodern clarion call that nothing is true except for this statement. The reason I pose the scenario at all is to consider who really provides us with our sense of self. Supposing this person smiled at me, I might snarl in the first way, as a jibe, because I’m sure they’ll get the joke. But what if they don’t get it? What if this person even knows me pretty well, and they just don’t get it, not this time? Or what if they feel this just wasn’t the time for joking around? Their next reaction will depend on these and/or plenty of other factors. But again, I raise the scenario to consider how we gather—or, no no, to consider who really provides us with—our sense of self.

And there you have it, the issue: do we each gather our own sense of self, internally, or do others provide us with our sense of self, externally?

I don’t want to revert simply to the nature-nurture argument or chicken-and-the-egg. We seem inescapably bound to considering these by degree—hence, the absolutism that it depends. So, then, to consider by degree… the metaphor I have in mind is that of a mirror. Something someone does induces a response from me. Subsequently, what I provoke in that other person can tell me something more about myself, so long as I’m willing (and able?) to discern my self—myself?—from what they reflect. Whatever next reaction of theirs follows my snarl, this other person’s reaction serves as a mirrored reflection of me, at least insofar as this other person is concerned. If they laugh at my snarl, then hey, I guess they affirm me as a friend with an appropriate sense of humour; the jibe is appreciated, and maybe we’re even a little closer friends than before. Their positive reaction is my feedback, like looking at myself in a mirror, and my sense of self is in some way provoked on account of them by what they reflect.

I suppose there’s room to discuss a lack of empathy, here, even sociopathic behaviour—these seem also to be part of that endless list of details, nuances, and possibilities. But in acknowledging them, let’s leave them for another day.

If my snarl induces a frown from the other person, or some kind of puzzlement or disapproval, then what they affirm for me is less friendly or wonderful, yet may be just as clear—maybe they snarl back, even more fiercely, or maybe they stomp away with clenched fists. Maybe now I feel worried, in which case my sense of self could suffer from insecurity or dismay—oh dear, they didn’t get the joke! Or maybe they are saddened, and I feel smug—take that, you deserve it—or hostile—get lost, I never liked you anyway—which reinforces my sense of superiority, some kind of self-importance. The list of possibilities goes on—it depends—but, in any case, I’m able to find myself reaffirmed by that other person’s reaction. I’m “able to” because my snarl clearly exposes my stake in how this other person influences the way I consider my sense of self: why would I even take notice of them in the first place, much less snarl, much less concern myself with their next reaction, if they meant nothing to me?

The point is that the other person’s reaction provides me a measure, a reason, a reflection by which to gauge my self as myself. Basically, thank you, because I couldn’t do it without you and everybody else, and you’re welcome because neither could you without me, or everybody else.

Now, pretend there are no other people—you, alone, exist as the sole human being. You happen to be walking through a grove of, say, birch trees, obviously getting no reactions as we’ve just considered about smiles and snarls. But as the wind whishes by, fluttering leaves and swaying branches, you take in the world around you with a relative means of judgment that wades through various combinations of reactors provoking reactions from reactees: Are the trees reacting to me? Is the wind reacting to me, or the trees to the wind? and so forth. You can see all sorts of things happening, but how can you be sure what provokes or reflects what else? Some songbirds are flitting about, high up in the branches: Are they chirping at me? You might not even call them “song” birds (that is, if you even had language—what need for language, really, as one sole person?). For all we know, the birds would actually scare you, and you might rightly call them “scarebirds” or something—in this pretend scenario, with you the sole human being, we’re also pretending that you know nothing in the way of biology or flora or fauna. Those are ways of understanding the world developed in the real-life community of human beings, not in some pretend scenario of solo existence.

In that land of pretend, after weeks of sunshine, what might be your sense of self on the day it rained, or on the day the leaves yellowed and fell to the ground in heavier, colder winds? Would you even be considering your “self” apart from the entirety of what surrounds you? Here we are, again, at nature-nurture, only this time you might conceivably consider the two in synthesis: not as separately discrete influences—there is nature, and there is nurture—but as one-and-the-same, naturenurture, thereby placing you into the world of existence as part-of-a-greater-whole. Your sense of self could conceivably be more cosmic, in that literal sense of orderliness, and more holistic, in that sense of connectedness. To mix metaphors, you might feel a mere cog in the wheel, a mere wheel of the gears, yet entirely necessary, just the same. Or how about this: I wonder how imperative my right hand feels, as compared to my left, when I write with a pen, but they’re both pretty important when I play golf.

We can conceivably warrant our selves to ourselves, but—as we step back into the land of real life and other people—we cannot live in total oblivion of the people around us. I grant the possibility of living within ourselves as our selves, rendering the responses and reactions of any one, and those alongside, as nothing other than colliding self-interests, but still… That other people can authorize our sense of self—your sense, my sense—seems as inescapable, as definite, as did nature-nurture or chicken-and-the-egg.

In this little thought experiment, I’ve been wondering whether we each sense our self as reflective of the reactions we induce. How much do we incorporate the feedback we get after snarling at a friendly smile? Do we see that other person as though staring at ourselves in a mirror? And, if so, does that mean we’re each of us necessarily, essentially, and thereby compellingly part of a greater whole? Like trees of a forest, or cogs in a wheel, or limbs to a body? For all this, maybe it’s only an issue because we’re able to raise such questions, to begin with.

I May Be Wrong About This, But…

Before introducing the moral pairing of right and wrong to my students, I actually began with selfish and selfless because I believe morality has a subjective element, even in the context of religion, where we tend to decide for ourselves whether or not we believe or subscribe to a faith.

As I propose them, selfish and selfless are literal, more tangible, even quantifiable: there’s me, and there’s not me. For this reason, I conversely used right and wrong to discuss thinking and bias. For instance, we often discussed Hamlet’s invocation of thinking: “… there is nothing good or bad, but thinking makes it so” (2.2.249-250). Good and bad, good and evil, right and wrong… while not exactly synonymous, these different pairings do play in the same ballpark. Still, as I often said to my students about synonyms, “If they meant the same thing, we’d use the same word.” So leaving good and bad to the pet dog, and good and evil to fairy tales, I presently consider the pairing of right and wrong, by which I mean morality, as a means to reconcile Hamlet’s declaration about thinking as some kind of moral authority.

My own thinking is that we have an innate sense of right and wrong, deriving in part from empathy, our capacity to stand in someone else’s shoes and identify with that perspective – look no further than storytelling itself. Being intrinsic and relative to others, empathy suggests an emotional response and opens the door to compassion, what we sometimes call the Golden Rule. Compassion, for Martha Nussbaum, is that means of “[hooking] our imaginations to the good of others… an invaluable way of extending our ethical awareness” (pp. 13-14). Of course, the better the storytelling, the sharper the hook, and the more we can relate; with more to go on, our capacity for empathy, i.e. our compassion, rises. Does that mean we actually will care more? Who knows! But I think the more we care about others, the more we tend to agree with them about life and living. If all this is so, broadly speaking, if our measure for right derives from empathy, then perhaps one measure for what is right is compassion.

And if we don’t care, or care less? After all, empathy’s no guarantee. We might just as reasonably expect to face from other people continued self-interest, deriving from “the more intense and ambivalent emotions of… personal life” (p. 14). Emotions have “history,” Nussbaum decides (p. 175), which we remember in our day-to-day encounters. They are, in general, multifaceted, neither a “special saintly distillation” of positive nor some “dark and selfish” litany of negative, to use the words of Robert Solomon (p. 4). In fact, Solomon claims that we’re not naturally selfish to begin with, and although I disagree with that, on its face, I might accept it with qualification: our relationships can supersede our selfishness when we decide to prioritise them. So if we accept that right and wrong are sensed not just individually but collectively, we might even anticipate where one could compel another to agree. Alongside compassion, then, to help measure right, perhaps coercion can help us to measure wrong: yes, we may care about other people, but if we care for some reason, maybe that’s why we agree with them, or assist them, or whatever. Yet maybe we’re just out to gain for ourselves. Whatever our motive, we treat other people accordingly, and it all gets variously deemed “right” or “wrong.”

I’m not suggesting morality is limited solely to the workings of compassion and coercion, but since I limited this discussion to right and wrong, I hope it’s helping illuminate why I had students begin first with what is selfish and selfless. That matters get “variously deemed,” as I’ve just put it, suggests that people seldom see any-and-all things so morally black and white as to conclude, “That is definitely wrong, and this is obviously right.” Sometimes, of course, but not all people always for all things. Everybody having an opinion – mine being mine, yours being yours, as the case may be – that’s still neither here nor there to the fact that every body has an opinion, mine being mine and yours being yours. On some things, we’ll agree while, on some things, we won’t.

At issue is the degree that I’m (un)able to make personal decisions about right and wrong, the degree that I might feel conspicuous, perhaps uneasy, even cornered or fearful – and wrong – as compared to feeling assured, supported, or proud, even sanctimonious – and right. Standing alone from the crowd can be, well… lonely. What’s more, having some innate sense of right and wrong doesn’t necessarily help me act, not if I feel alone, particularly not if I feel exposed. At that point, whether from peer pressure or social custom peering over my shoulder, the moral question about right and wrong can lapse into an ethical dilemma, the moral spectacle of my right confronted by some other right: would I steal a loaf of bread to feed my starving family? For me, morality is mediated (although not necessarily defined, as Hamlet suggests) by where one stands at that moment, by perspective, in which I include experience, education, relationships, and whatever values and beliefs one brings to the decisive moment. I’m implying what amounts to conscience as a personal measure for morality, but there’s that one more consideration that keeps intervening: community. Other people. Besides selfish me, everybody else. Selfless not me.

Since we stand so often as members of communities, we inevitably derive some values and beliefs from the pre-eminent opinions and long-standing traditions that constitute them. Yet I hardly mean to suggest that a shared culture of community is uniform – again, few matters are so black or white. Despite all that might be commonly held, the individual beliefs composing a shared culture, if anything, are likely heterogeneous: it’s the proverbial family dinner table on election night. Even “shared” doesn’t rule out some differentiation. Conceivably, there could be as many opinions as people possessing them. What we understand as conscience, then, isn’t limited to what “I believe” because it still may not be so easy to disregard how-many-other opinions and traditions. Hence the need for discussion – to listen, and think – for mutual understanding, in order to determine right from wrong. Morality, in that sense, is concerted self-awareness plus empathy, the realised outcome of combined inner and outer influences, as we actively and intuitively adopt measures that compare how much we care about the things we face every day.

Say we encounter someone enduring loss or pain. We still might conceivably halt our sympathies before falling too deeply into them: Don’t get too involved, you might tell yourself, you’ve got plenty of your own to deal with. Maybe cold reason deserves a reputation for callousing our decision-making, but evidently, empathy does not preclude our capacity to reason with ourselves. On the other hand, as inconsistent as it might seem, one could not function or decide much of anything, individually, without empathy because, without it, we would have no measure. As we seem able to reason past our own feelings, we also wrestle echoing pangs of conscience that tug from the other side, which sometimes we call compassion or, other times, a guilt trip. Whatever we call it, clearly we hardly live like hermits, devoid of human contact and its resultant emotions. Right and wrong, in that respect, are as much socially as individually determined.

One more example… there’s this argument that we’re desensitized by movies, video games, the TV news cycle, and so forth. For how-many-people, news coverage of a war-torn city warrants hardly more than the glance at the weather report that follows. In fact, for how-many-people, the weather matters more. Does this detachment arise from watching things once-removed, two-dimensionally, on a viewscreen? Surely, attitudes would be different if, instead of rain, it were shells and bombs falling on our heads from above. Is it any surprise, then, as easily as we’re shocked or distressed by the immediacy of witnessing a car accident on the way to our favourite restaurant, that fifteen minutes later we might conceivably feel more annoyed that there’s no parking? Or that, fifteen minutes later again, engrossed by a menu of appetizers and entrees and desserts, we’re exasperated because they’re out of fresh calamari? Are right and wrong more individually than socially determined? Have we just become adept at prioritising them, even diverting them, by whatever is immediately critical to individual well-being? That victim of the car accident isn’t nearly as worried about missing their dinner reservation.

Somewhat aside from all this, but not really… I partially accept the idea that we can’t control what happens, we can only control our response. By “partially” I mean that, given time, yes, we learn to reflect, plan, act, and keep calm and carry on, like the greatest of t-shirts. After a while, we grow more accustomed to challenges and learn to cope. But sometimes what we encounter is so sudden, or unexpected, or shocking that we can’t contain a visceral response, no matter how accustomed or disciplined we may be. However, there is a way to take Hamlet’s remark about “thinking” that upends this entire meditation, as if to say our reaction was predisposed, even premeditated, like having a crystal ball that foresees the upcoming shock. Then we could prepare ourselves, rationalise, and control not what happens but our response to it while simply awaiting the playing-out of events.

Is Solomon wise to claim that we aren’t essentially or naturally selfish? Maybe he just travelled in kinder, gentler circles – certainly, he was greatly admired. Alas, though, poor Hamlet… troubled by jealousy, troubled by conscience, troubled by ignorance or by knowledge, troubled by anger and death. Troubled by love and honesty, troubled by trust. Troubled by religion, philosophy, troubled by existence itself. Is there a more selfish character in literature? He’s definitely more selfish than me! Or maybe… maybe Hamlet’s right, after all, and it really is all just how you look at things: good or bad, it’s really just a state of mind. For my part, I just can’t shake the sense that Solomon’s wrong about our innate selfishness, and for that, I guess I’m my own best example. So, for being unable to accept his claim, well, I guess that one’s on me.

A Kind of Certainty: II. Curriculum, or What You Will

Click here to read Pt I. An Uncertain Faith



Baumlin (2002) distinguishes three concepts of temporality. Chronos is linearity, our colloquial passage of time, “non-human; impersonal objective nature” (p. 155), from which we understandably define past, present, and future. In relation to this is kairos, a single point in time, “[describing] the quintessentially human experience of time as an aspect of individual consciousness, deliberation, and action… that single fleeting moment … when an individual’s fortune is ‘set in motion’, … [providing] the means” and yielding “Fortuna, the consequences” (p. 155). Interwoven with kairos, then, is Occasio, the cause to Fortuna’s effect, a sense of “‘right-timing’ and prudent[1] action,” an opportunity[2] to better the capricious lies of fortune and fate. Although this sense of opportunity was emancipating, it also engendered accountability for consequences.

The developing belief that we possessed not mere agency but free will weighed upon Renaissance thinking and was a trait that Shakespeare often imparted to his characters, Hamlet (4.4.46-52) being but one example.[3] By the time 17th century Elizabethans first watched Hamlet on stage, the humanist challenge to “a grim… Christian sufferance and resignation to time” (Baumlin, 2002, p. 149) was well underway. Unsurprisingly, Shakespeare offers nothing firm in Hamlet as to where our belief should lie, either with fortune or with free will; indeed, leaving the debate ruptured and inconclusive seems more to his point. To this end, perhaps most notable is his placement of Hamlet alongside Horatio in the graveyard to ponder the dust and fortune of Alexander, Yorick, and – hard upon – Ophelia.

In handling Yorick’s skull, Hamlet revives the poor fellow’s “infinite jest [and] excellent fancy” (5.1.186), memories of such fond “pitch and moment” (3.1.86) as to “reactivate” (Pinar, 2017a, p. 4) his own childhood, even momentarily. Such specific remembrances educed by Hamlet (which is to say, by Shakespeare) expose the springe of kairos; ultimately, certainty is beyond our capacity, rough-hew it[4] how we will. Colloquially, this might seem obvious, i.e. “the best laid plans…” and so forth, with no one, apparently, able to pick the right lottery numbers each week. Yet the extent to which we consider ourselves masters of our own fortune is, for Hamlet, presently in the graveyard, a kind of epiphany, “a spiritual (re-) awakening, a transformation” (Baumlin & Baumlin, 2002, p. 180).[5] He decides that yielding himself over to “divinity” (5.2.10) is wise as compared to the folly of trying to control what was never within his grasp to begin with.

He does not give up any freedom so much as give over to dependence, which of course is a leap of faith. Shakespeare poses a question of allegiance – to obey, or not to obey – further compounded by which allegiance – obedience to father, or to Father; to free will, or to fortune; to an unweeded garden, or to what dreams may come – all these are the question.[6] Shakespeare has Hamlet “reconstruct” (Pinar, 2017a, p. 7) his conceptions of allegiance and obedience during the exchange with the Gravedigger, which hardens Hamlet’s resolve yet also enables him to come to terms with his tormenting dilemma over fealty and honour. By the time his confrontation with Claudius is inevitable,[7] Hamlet’s decision to “let be” (5.2.224) “[marks his] final transcendence of deliberative action in worldly time” (Baumlin & Baumlin, 2002, p. 180). Thus is indicated the subtle dominance of the third temporal concept, aion, “the fulfillment of time” (Baumlin, 2002, p. 155), a circularity like the uroboros, the serpent swallowing its tail. As such, aion signifies what is boundless or infinite, neither more nor less than eternity.

Oddly enough, these three concepts, in concert, can seem both time and place,[8] describing a “spatial-temporal sequence… from point, to line, to circle”; from “natural to human to divine orders” (p. 155). I am not fixed to the idea of a “sequence,” but the general composite still shapes my response to Hamlet’s most famous question of all.[9]


left (past)
Let go. Learn from the past, but don’t dwell on it.

later (future)
Let it work. Anticipate the future, but no need to control it.

lasting (present)
Let come what comes. Every possible decision will still yield consequences.
Let be. Pay attention now to what is now.
“The readiness is all.” (5.2.222-223)

“The rest is silence.” (5.2.358)
(a clever double meaning here: “the rest” = either past regrets and future anxieties, or else the undiscovered country, death)


As I take them, these four “Let…” statements amount to sound wisdom, like trusted advice from teacher to student or parent to child. As a student and child, myself, writing this paper, I faced some question of certainty – the same question, strangely enough, that we ask about curriculum: what is worth including? By the same token, what is worth omitting, and from there, what will also be otherwise left out or unmentioned? Whatever we decide, one thing is certain: we can neither cover nor even conceive it all, which of course was my original problem. In fact, knowing as much as we know can even shed paradoxical light onto how much we remain in the dark. Eventually, as my Dad recommended over the phone, I simply needed the courage to make a decision and go with it, and even with his voice in my ear, I knew my own advice with my students had always been the same.

Hanging up, I reasoned further that any feedback I did receive – from peers during revision or from my professor’s formal evaluation – would illustrate how effectively I had collated and communicated my message. Beyond that – say, revising the paper for publication – I would have some ready direction. And there it was, I realised, staring me in the face, curriculum in a nutshell: conversations, decisions, actions, evaluations, reflections – all these, in relation to me as I wrote this essay, amounted to a lived curricular experience of my very own.[10] My curriculum, like this essay, does not simply pose the straightforward question about what is worth including. That question is insufficient. More particularly, my curriculum, like this essay,[11] prompts me to consider what is worth including in light of the audience, the topic, what is already known about the topic, and whatever aims exist in further pursuit of the topic.[12] Succinctly, my curriculum – all our curricula – is contextual, multitudinous, and a question of – questions of – what is particularly worth knowing about any topic of study under the sun: “Why this, why here, and why now?”[13] That is the question.

Well, maybe that is the question. The essence of this question, this curricular particular, lies in kairos, the concept of opportune timing or occasion that “signals the need to bring universal ideas and principles to bear in historical time and situations [i.e. deductively] and, thus, calls for decisions [requiring] wisdom and critical judgment” (Smith, 1986, p. 15). We can only note what matters to us once we have a reference point. And since nothing occurs in a vacuum, any detail can be potentially informative, so we must learn to pointedly ask not, “In what way(s) do I already know what I’m looking at?” but rather, “In what way(s) do I not know what I am looking at?” which tends to be deductive. Typically, curriculum begins inductively, with what someone already knows, and we all know plenty of things. But we generally bring to bear only what we deem relevant to the moment. By the same token, someone who knows what is relevant to the moment has a kind of prescient “mechanism” (Christodoulou, 2014, p. 54) for spotting what will likely be of use.[14] So curriculum is a means of determining, if not discovering, in the moment what works. It is, therefore, also a means of coming to know ourselves.

As we develop confidence and self-esteem, and dignity, we grow to feel that we have something to contribute, that we matter, all of which prepares us for helping others. Curriculum helps us to sort out our values and beliefs,[15] which provide a frame-of-reference in order to select and, later, to measure our day-to-day efforts. Of course, none of this happens immediately; we need time to grow more self- and other-aware, each kairos experience filing in alongside the rest, like a crowd of ticket holders. I can only wonder whether Shakespeare might have characterised curriculum as something akin to being held over for an indefinite engagement. In any event, we never stop learning – may our auditoriums ever sell out – as we continually induce as well as encounter influence. But how deliberately do we do so? Maybe that is the question.

Click here for the Bibliography

Click here for Pt III. A Scripture of Truth


Endnotes

[1] As Baumlin (2002) notes, “For the student of prudentia, time reveals itself as golden Opportunity rather than as fickle, devastating Fortune” (p. 141). Certainly, Shakespeare and his Elizabethan audiences were feeling such debate permeate their own lived experiences, a dram of an idea that, once diffused, might only thereafter suffuse.

[2] According to Claflin (1921), “‘opportunity’ in Shakespeare means more than it does now [in the 20th century]; it is closer to the original force of Latin opportunus, and means ‘a specially favourable occasion’” (p. 347). Curiously enough, however, as I searched a concordance of Hamlet (Crystal & Crystal, 2002), I found no usage of “opportunity” whatsoever and only three of “chance,” most notably that of Hamlet to Horatio: “You that look pale and tremble at this chance…” (5.2.334) in reference to the dead and dying at the play’s closing. Of further interest is the concordance’s report that Shakespeare used “opportunity” throughout his entire catalogue of poems and plays only sixteen times as compared to “chance,” which he used 114 times.

[3] Kiefer (1983) examines Fortune at length as one colour in Shakespeare’s palette for his characters, noting of King Lear: “In no other of Shakespeare’s plays do characters invoke Fortune so insistently [or] so frequently at pivotal points of the action” (p. 296).

[4] Read either “certainty” or “our capacity,” here, in place of “it”; either works just as well. The line from the play I have paraphrased, of course, because the original antecedent is “our ends” (5.2.10) in place of “them” (5.2.11). However, where I have changed the diction of the thought, as a matter of perspective, the meaning remains intact. The implication that we – in essence – play God might not be nearly so alien for Shakespeare’s audience as to their feudal predecessors. By contrast, to postmodern audiences these days, the notion of a divinity standing apart from our own free will and shaping our ends might be the more alien concept.

I might finally point out that Shakespeare, as his creator, is Hamlet’s god, of a kind. But that analogy does not last long under scrutiny since Hamlet, being a fictional character, has no sentience, free agency, or tangibility, and actors who portray him are left with prescribed dialogue and beliefs.

[5] Because I am ultimately discussing what Shakespeare did, his characters being only conveyances as such, I was tempted to complete this sentence with a line from Macbeth, as follows: “The extent to which he considers himself master of his own fortune, presently in the graveyard, is laid plain for Hamlet, leaving him to conclude only that ‘…all our yesterdays have lighted fools the way to dusty death’ (5.5.22-23).” The key difference, of course, is that Hamlet decides against being a fool whereas Macbeth seems all too keen to excel at it. Where Hamlet best demonstrates a respect for “divinity [shaping] our ends,” Macbeth better represents the rough-hewing bit, which makes him a far less redeeming character in the end. So, upon reflection, it seemed prudent to stick substantively to just the one play. Thank heaven for endnotes, I guess.

[6] Had he fallen clearly to one side, as a subject to his monarch, Shakespeare might very well have sealed whatever freedom he did enjoy; his own response, evidently, was to render unto Caesar, and render unto God, and continue writing plays. Four centuries on, what is there about us that we might think we are any less susceptible than he was to coming to terms with our finite nature? We live in civil society, by the rule of law under a Constitution, within which are Rights and Freedoms that include the assurance to believe, or not to believe, whatever we decide suits us best. Furthermore, we have the advantage over Hamlet in that his example exhorts us, interminably – just ask my students, remember? Alas, though, poor Yorick.

[7] As Horatio notes, “It must be shortly known [to Claudius]” that Hamlet has tricked Rosencrantz and Guildenstern to their deaths at the hands of England (5.2.71-72), a move by Hamlet in his contest that must certainly harden his uncle’s resolve to have Hamlet dealt with once and for all. Of course, Claudius had sent Hamlet to England to be killed, but in secret, on account of both Gertrude and the public’s love for the Prince (4.7.5-24). However, in dispatching his childhood comrades – and with such calculation (5.2.57-70) – Hamlet has now given Claudius justifiable means to overcome any such favourable opinion as might have benefitted Gertrude’s “son” (5.1.296).

[8] Time and place are what we commonly refer to as setting in English class, which is a curious way to consider eternity.

[9] Seldom mentioned amidst all the consternation is that Hamlet does not actually ask a question. If he had, he might have worded it as, “Is it to be, or not to be?” In that case, we would need to know what “it” means. Alive? Dead? Happy? Sad? Anything goes, I suppose, but then… what would you expect? He might have been asking, “Am I…” or “Are we to be, or not to be?” But where that is still somewhat existential and vague, now we might want to know whether his use of the verb, to be, is more open-ended or copular. I suspect Shakespeare knew enough about a general audience to trust that only the most fastidious grammarians would fuss over particulars such as antecedents and verb tenses in the dialogue. Otherwise, why decide to use the most protean verb in the English language?

[10] As far as lived curricular experiences go, there are many like it – as many as there are people to have them – but this one is mine.

[11] At this early stage, I confess: why struggle writing a paper when I could use the age-old trick of writing a paper about writing the paper? Why…? Because the age-old trick is just that – a trick – and spinning academic wheels stalls any hope of contributing to knowledge, so I would hardly be honouring my responsibility if I tried pulling that off. Still… the paper within a paper got me thinking about Hamlet, which oddly enough had been my original inspiration for this essay. As my students used to say… once you study Hamlet, he just never goes away. How true, how true…

[12] According to Hartmann (2014), it was just such questions that prompted Ezra Klein to leave The Washington Post and establish Vox.com in 2014.

[13] Students in all my courses learned to rue the question “Why?” so much so, one year, that it became a running joke simply to utter “why” as a pat-response, whether as a question, an interjection, a plea, a curse, an epithet – those last two maybe reserved for me, I don’t really know. In honour of their perseverance, and their angst, I named my blog The Rhetorical WHY.

[14] Surrounded by Winkies, confronted by certain capture, only Scarecrow eyes the chandelier above the Wicked Witch, so only he can yank Tin Man’s axe across in time to chop the rope that suspends it. Hardly the grandeur or the gravitas of Hamlet, I realise, but The Wizard of Oz has much to offer pertaining to curricular theory as well as teacher autonomy.

[15] In keeping with the three temporal concepts, perhaps a more suitable metaphor than threading our own needles would be to say we surf a long pipeline. But, this essay being more concerned with curriculum and theatre, any such Hang-Ten imagery is better suited to another time, like connecting curriculum to gnarly waves and bodacious beaches (“Surf’s Up,” 2015). Anyway, certainly no one would ever dream of linking Hamlet to surfing (“‘Hamlet’s BlackBerry’,” 2010) in the first place, would they?
