I May Be Wrong About This, But…

Before introducing the moral pairing of right and wrong to my students, I actually began with selfish and selfless because I believe morality has a subjective element, even in the context of religion, where we tend to decide for ourselves whether or not we believe in or subscribe to a faith.

As I propose them, selfish and selfless are literal, more tangible, even quantifiable: there’s me, and there’s not me. For this reason, I conversely used right and wrong to discuss thinking and bias. For instance, we often discussed Hamlet’s invocation of thinking: “… there is nothing good or bad, but thinking makes it so” (II, ii, 249-250). Good and bad, good and evil, right and wrong… while not exactly synonymous, these different pairings do play in the same ballpark. Still, as I often said to my students about synonyms, “If they meant the same thing, we’d use the same word.” So leaving good and bad to the pet dog, and good and evil to fairy tales, I presently consider the pairing of right and wrong, by which I mean morality, as a means to reconcile Hamlet’s declaration about thinking as some kind of moral authority.

My own thinking is that we have an innate sense of right and wrong, deriving in part from empathy, our capacity to stand in someone else’s shoes and identify with that perspective – look no further than storytelling itself. Being intrinsic and relative to others, empathy suggests an emotional response and opens the door to compassion, what we sometimes call the Golden Rule. Compassion, for Martha Nussbaum, is that means of “[hooking] our imaginations to the good of others… an invaluable way of extending our ethical awareness” (pp. 13-14). Of course, the better the storytelling, the sharper the hook, and the more we can relate; with more to go on, our capacity for empathy, i.e. our compassion, rises. Does that mean we actually will care more? Who knows! But I think the more we care about others, the more we tend to agree with them about life and living. If all this is so, broadly speaking, if our measure for right derives from empathy, then perhaps one measure for what is right is compassion.

And if we don’t care, or care less? After all, empathy’s no guarantee. We might just as reasonably expect to face continued self-interest from other people, deriving from “the more intense and ambivalent emotions of… personal life” (p. 14). Emotions have “history,” Nussbaum decides (p. 175), which we remember in our day-to-day encounters. They are, in general, multifaceted, neither a “special saintly distillation” of positive nor some “dark and selfish” litany of negative, to use the words of Robert Solomon (p. 4). In fact, Solomon claims that we’re not naturally selfish to begin with, and although I disagree with that on its face, I might accept it with qualification: our relationships can supersede our selfishness when we decide to prioritise them. So if we accept that right and wrong are sensed not just individually but collectively, we might even anticipate where one could compel another to agree. Alongside compassion, then, to help measure right, perhaps coercion can help us to measure wrong: yes, we may care about other people, but if we care for some reason, maybe that’s why we agree with them, or assist them, or whatever. Yet maybe we’re just out to gain for ourselves. Whatever our motive, we treat other people accordingly, and it all gets variously deemed “right” or “wrong.”

I’m not suggesting morality is limited solely to the workings of compassion and coercion, but since I limited this discussion to right and wrong, I hope it helps illuminate why I had students begin with what is selfish and selfless. That matters get “variously deemed,” as I’ve just put it, suggests that people seldom see any-and-all things so morally black and white as to conclude, “That is definitely wrong, and this is obviously right.” Sometimes, of course, but not all people, always, for all things. That everybody has an opinion – mine being mine, yours being yours, as the case may be – is still neither here nor there to the fact that every body has an opinion. On some things, we’ll agree while, on some things, we won’t.

At issue is the degree to which I’m (un)able to make personal decisions about right and wrong, the degree to which I might feel conspicuous, perhaps uneasy, even cornered or fearful – and wrong – as compared to feeling assured, supported, or proud, even sanctimonious – and right. Standing alone from the crowd can be, well… lonely. What’s more, having some innate sense of right and wrong doesn’t necessarily help me act, not if I feel alone, particularly not if I feel exposed. At that point, whether from peer pressure or social custom peering over my shoulder, the moral question about right and wrong can lapse into an ethical dilemma, the moral spectacle of my right confronted by some other right: would I steal a loaf of bread to feed my starving family? For me, morality is mediated (although not necessarily defined, as Hamlet suggests) by where one stands at that moment, by perspective, in which I include experience, education, relationships, and whatever values and beliefs one brings to the decisive moment. I’m implying what amounts to conscience as a personal measure for morality, but there’s that one more consideration that keeps intervening: community. Other people. Besides selfish me, everybody else. Selfless not me.

Since we stand so often as members of communities, we inevitably derive some values and beliefs from the pre-eminent opinions and long-standing traditions that constitute them. Yet I hardly mean to suggest that a shared culture of community is uniform – again, few matters are so black or white. Despite all that might be commonly held, the individual beliefs constituting a shared culture, if anything, are likely heterogeneous: it’s the proverbial family dinner table on election night. Even “shared” doesn’t rule out some differentiation. Conceivably, there could be as many opinions as people possessing them. What we understand as conscience, then, isn’t limited to what “I believe” because it still may not be so easy to disregard how-many-other opinions and traditions. Hence the need for discussion – to listen, and think – for mutual understanding, in order to determine right from wrong. Morality, in that sense, is concerted self-awareness plus empathy, the realised outcome of combined inner and outer influences, as we actively and intuitively adopt measures that compare how much we care about the things we face every day.

Say we encounter someone enduring loss or pain. We still might conceivably halt our sympathies before falling too deeply into them: Don’t get too involved, you might tell yourself, you’ve got plenty of your own to deal with. Maybe cold reason deserves a reputation for callousing our decision-making, but evidently, empathy does not preclude our capacity to reason with ourselves. On the other hand, as inconsistent as it might seem, one could not function or decide much of anything, individually, without empathy because, without it, we would have no measure. As we seem able to reason past our own feelings, we also wrestle with echoing pangs of conscience that tug from the other side, which sometimes we call compassion or, other times, a guilt trip. Whatever we call it, clearly we hardly live like hermits, devoid of human contact and its resultant emotions. Right and wrong, in that respect, are socially as well as individually determined.

One more example… there’s this argument that we’re desensitized by movies, video games, the TV news cycle, and so forth. For how-many-people, news coverage of a war-torn city warrants hardly more than the glance at the weather report that follows. In fact, for how-many-people, the weather matters more. Does this detachment arise from watching things once-removed, two-dimensionally, on a viewscreen? Surely, attitudes would be different if, instead of rain, it were shells and bombs falling on our heads from above. Is it any surprise, then, as easily as we’re shocked or distressed by the immediacy of witnessing a car accident on the way to our favourite restaurant, that fifteen minutes later we might conceivably feel more annoyed that there’s no parking? Or that, fifteen minutes later again, engrossed by a menu of appetizers and entrees and desserts, we’re exasperated because they’re out of fresh calamari? Are right and wrong more individually than socially determined? Have we just become adept at prioritising them, even diverting them, by whatever is immediately critical to individual well-being? That victim of the car accident isn’t nearly as worried about missing their dinner reservation.

Somewhat aside from all this, but not really… I partially accept the idea that we can’t control what happens; we can only control our response. By “partially” I mean that, given time, yes, we learn to reflect, plan, act, and keep calm and carry on, like the greatest of t-shirts. After a while, we grow more accustomed to challenges and learn to cope. But sometimes what we encounter is so sudden, or unexpected, or shocking that we can’t contain a visceral response, no matter how accustomed or disciplined we may be. However, there is a way to take Hamlet’s remark about “thinking” that upends this entire meditation, as if to say our reaction was predisposed, even premeditated, like having a crystal ball that foresees the upcoming shock. Then we could prepare ourselves, rationalise, and control not what happens but our response to it while simply awaiting the playing-out of events.

Is Solomon wise to claim that we aren’t essentially or naturally selfish? Maybe he just travelled in kinder, gentler circles – certainly, he was greatly admired. Alas, though, poor Hamlet… troubled by jealousy, troubled by conscience, troubled by ignorance or by knowledge, troubled by anger and death. Troubled by love and honesty, troubled by trust. Troubled by religion, philosophy, troubled by existence itself. Is there a more selfish character in literature? He’s definitely more selfish than me! Or maybe… maybe Hamlet’s right, after all, and it really is all just how you look at things: good or bad, it’s really just a state of mind. For my part, I just can’t shake the sense that Solomon’s wrong about our innate selfishness, and for that, I guess I’m my own best example. So, for being unable to accept his claim, well, I guess that one’s on me.

A Kind of Certainty: III. A Scripture of Truth

Click here to read Pt II. Curriculum, or What You Will

 


A Kind of Certainty

3. A Scripture of Truth

Motive is the key, I would suggest to students: to know motive is to know the truth. And I offered this suggestion knowing full well the timeworn joke about the actor who asks, “What’s my motivation?” Just as we can never cover it all and must go with whatever we decide to include, we also cannot (nor should we) try to present it all, ask it all, or attempt it all in one go. Yes, the odd non sequitur can break the monotony – everyone needs a laugh, now and then. But as with all clever comedy, timing is everything, and curriculum is about more than good humour and bad logic. In that regard, given what has already been said about spotting pertinence, curriculum is about motives: to include, or not to include.

And we must try to comprehend this decision from more than one perspective; each in their own way, both teacher and student ponder what to include and what to disregard during any given lesson: “Teachers are problem-posing, not just in the obvious sense that they require students to doubt whether they know something… [but] implicitly [asking] them to question their understanding of what counts as knowledge” (Beckett, 2013, pp. 54-55). People generally will not doubt themselves without good reason or, at least, a lot of faith in whoever is asking. Challenged to reconstruct or reorganise an experience (Dewey, 1916), more than likely we will want to know why. Curriculum addresses ‘why’.

Why! Take Hamlet, for instance… deigning to know a little something about role-playing, he offers some curricular particulars while lecturing the Players ahead of the Mousetrap performance, although really this is to say Shakespeare offered them. Writers famously cringe as rehearsing actors and directors dismember their carefully worked dialogue – or is that another hackneyed joke? In any case, Shakespeare opens Act 3 with some forty lines of advice from Hamlet to the Players, whose replies are little more than short and polite (although ‘why’ has evidently been left for you and your theatre company to ascertain). These follow some forty lines in Act 2 during an exchange between Hamlet and Rosencrantz about theatre companies, all of which could simply be played as a dose of comic relief amidst the far “weightier matters” of the play (Guyton, 2013). Tried another way, Hamlet’s lines about acting embody the very perplexity of his prolonged tumult: he takes for granted that his listener will attempt to reconcile what he says with whatever uncertainty they might have. What better job description for a “teacher”? Otherwise, why even bother to open his mouth?

What need is there to teach when we trust that we are all alike, that all around us is 100% certain? In telling the Players about acting, Hamlet seeks no assurance; so far as he is concerned, his audience need bridge no gap of certainty over his trustworthiness.[1] Indeed, in the live productions that I have watched, he is as relaxed and certain in offering his advice as the Players are in hearing it, like preaching to the choir.[2] Their relationship, apparently going back some time, suggests mutual respect and a shared faith not merely to listen but to understand in listening. It suggests a kind of shared attunement, something mutual, like a kind of curriculum founded upon trust. For all we might want to trust those around us, for all we might want some certainty that we are respected by others – or, perhaps more so, that we are believed – what a torment life would be if our every utterance were considered a lie. Then the only certainty would be the assurance that no one ever believed you, and if that still counts for something, it is dreadfully cold comfort.[3]

We citizens of 21st century post-modernist [your label here] North America may not have descended nearly so low, although Klein (2014) does presciently discuss politics, the national discourse, and an observed decline in public intellectualism (Byers, 2014; Coates, 2014; Herman, 2017; Mishra & Gregory, 2015). Where Klein encompasses individuals and the processes, systems, and institutions that they innervate while going about their daily lives, he describes Dewey’s “conjoint communicated experience” (Dewey, 1916, p. 101) and implicates “an extraordinarily complicated conversation” (Pinar, Reynolds, Slattery & Taubman, 2006, p. 848), one that occurs every day and includes everybody. But since we may not compel the beliefs of free thinkers, only persuade them, we realise that all our perceived uncertainty can only be bridged by a kind of faith: we depend either upon others to see things as we do, or else we depend upon our rhetorical skill to persuade them toward our way. Or we live tense lives full of disagreement and antipathy. ’Swounds, but life would be a lot more stable and certain if we all just believed the same things!

Hamlet craves certainty, to the point where the dilemma of his doubt halts him so dead in his tracks that he is prompted to question existence itself. Where it comes to enacting vengeance – but, really, where it comes to everything we witness in the play – Hamlet – and, really, every character[4] – craves certainty and assurance while suffering from uncertainty and reluctance, which means, of course, that he craves and suffers from both ends. Indeed, a piece of him is certain. But comprising “one part wisdom and ever three parts coward” (4.4.42-43), he wages an unequal battle against himself. He wanders from room to room, searching for a way to free himself from his purgatorial tesseract, challenged not simply by one retrograde faith but by several, the consequence of conveying curriculum from Wittenberg back to Elsinore where, previously, he had received, to say the least, an impressionable upbringing. The upshot, given the conflicting decisions he faces, is that Hamlet would rather renounce mutual faith of any sort and rely upon a certainty all his own: himself.

Yet he even doubts his ability to self-persuade, just as he holds no faith in anyone whose judgment he fears. As a result, he is rightly miserable and lives an exaggerated moment-to-moment existence, “…enraptured with, submerged in, the present, no longer a moment in but a suspension of time, absorbed by – fused with – the images in front of [his] face, oblivious to what might be beyond [him]” (Pinar, 2017, p. 12). Pinar describes a kairos moment of chronos time, as if Cecilia, while watching The Purple Rose of Cairo (Greenhut & Allen, 1985), could press “Pause.” He may not have been Woody Allen’s modernist contemporary, but Shakespeare still appeared to possess enough prescience to machinate a rather, shall we say, enlightened viewpoint; many consider The Tragedy of Hamlet, Prince of Denmark to be the magnum opus not just of Shakespeare but of English literature. Evidently, he knew exactly how to craft such a rich and roundly individuated protagonist, one certain enough to persist for over 400 years. Certainty the Bard found within himself, and that he bestows (albeit perversely) upon Prince Hamlet, who “[knows] not seems” (1.2.76). Faith he found within himself, too, but that he saves for his audience, trusting them, freeing them, to spot it when the time is right, rendering what they will get unto those who will get it.

By the same token, may the rest get whatever they will get. As far as curriculum is concerned, one size has never fit all, nor should it ever be so.

 

Click here for the Bibliography

Click here to read Pt IV. A Kind of Faith

 


Endnotes

[1] I always suspected a handful of my students were just humoring me – have I mentioned they were brilliant?

[2] Sometimes, these lines have even been cut, to help shorten the play from its typical four-hour length.

[3] Elsinore seems just such a place. But they are wise who “… give it welcome” (1.5.165) since at least, then, you can get on with functioning, knowing where you stand relative to all the other prevaricating liars and weasels who inhabit the place alongside you.

[4] Every character, that is, with the possible exceptions of the Gravedigger, who apparently is most cheerful and self-assured, and Fortinbras, who suffers perhaps not pains of doubt so much as loss, and then always with something up his sleeve. I might also include Horatio in this reflection, but I fear, then, the need for an endnote to the endnotes, to do him any justice.

A Kind of Certainty: II. Curriculum, or What You Will

Click here to read Pt I. An Uncertain Faith

 


A Kind of Certainty

2. Curriculum, or What You Will

Baumlin (2002) distinguishes three concepts of temporality. Chronos is linearity, our colloquial passage of time, “non-human; impersonal objective nature” (p. 155), from which we understandably define past, present, and future. In relation to this is kairos, a single point in time, “[describing] the quintessentially human experience of time as an aspect of individual consciousness, deliberation, and action… that single fleeting moment … when an individual’s fortune is ‘set in motion’, … [providing] the means” and yielding “Fortuna, the consequences” (p. 155). Interwoven with kairos, then, is Occasio, the cause to Fortuna’s effect, a sense of “‘right-timing’ and prudent[1] action,” an opportunity[2] to better the capricious lies of fortune and fate. Although this sense of opportunity was emancipating, it also engendered accountability for consequences.

The developing belief that we possessed not mere agency but free will weighed upon Renaissance thinking, a trait that Shakespeare often imparted to his characters, Hamlet (4.4.46-52) being but one example.[3] By the time 17th century Elizabethans first watched Hamlet on stage, the humanist challenge to “a grim… Christian sufferance and resignation to time” (Baumlin, 2002, p. 149) was well underway. Unsurprisingly, Shakespeare offers nothing firm in Hamlet as to where our belief should lie, either with fortune or with free will; indeed, leaving the debate ruptured and inconclusive seems more to his point. To this end, perhaps most notable is his placement of Hamlet alongside Horatio in the graveyard to ponder the dust and fortune of Alexander, Yorick, and – hard upon – Ophelia.

In handling Yorick’s skull, Hamlet revives the poor fellow’s “infinite jest [and] excellent fancy” (5.1.186), memories of such fond “pitch and moment” (3.1.86) as to “reactivate” (Pinar, 2017a, p. 4) his own childhood, even momentarily. Such specific remembrances educed by Hamlet (which is to say, by Shakespeare) expose the springe of kairos; ultimately, certainty is beyond our capacity, rough-hew it[4] how we will. Colloquially, this might seem obvious (i.e. “the best laid plans…” and so forth, and no one person apparently able to pick the right lottery numbers each week). Yet the extent to which we consider ourselves masters of our own fortune is, for Hamlet, presently in the graveyard, a kind of epiphany, “a spiritual (re-) awakening, a transformation” (Baumlin & Baumlin, 2002, p. 180).[5] He decides that yielding himself over to “divinity” (5.2.10) is wise as compared to the folly of trying to control what was never within his grasp to begin with.

He does not give up any freedom so much as give over to dependence, which of course is a leap of faith. Shakespeare poses a question of allegiance – to obey, or not to obey – further compounded by which allegiance – obedience to father, or to Father; to free will, or to fortune; to an unweeded garden, or to what dreams may come – all these are the question.[6] Shakespeare has Hamlet “reconstruct” (Pinar, 2017a, p. 7) his conceptions of allegiance and obedience during the exchange with the Gravedigger, which hardens Hamlet’s resolve yet also enables him to come to terms with his tormenting dilemma over fealty and honour. By the time his confrontation with Claudius is inevitable,[7] Hamlet’s decision to “let be” (5.2.224) “[marks his] final transcendence of deliberative action in worldly time” (Baumlin & Baumlin, 2002, p. 180). Thus is indicated the subtle dominance of the third temporal concept, aion, “the fulfillment of time” (Baumlin, 2002, p. 155), a circularity like the uroboros, the serpent swallowing its tail. As such, aion signifies what is boundless or infinite, neither more nor less than eternity.

Oddly enough, these three concepts, in concert, can seem both time and place,[8] describing a “spatial-temporal sequence … from point, to line, to circle”; from “natural to human to divine orders” (p. 155). I am not fixed to the idea of a “sequence,” but the general composite still shapes my response to Hamlet’s most famous question of all.[9]

 


Let go. Learn from the past, but don’t dwell on it. – left (past)

Let it work. Anticipate the future, but no need to control it. – later (future)

Let come what comes. Every possible decision will still yield consequences. Let be. Pay attention now to what is now. “The readiness is all” (5.2.222-223). – lasting (present)

The rest is silence. (5.2.358) – a clever double-meaning here: “the rest” = either past regrets and future anxieties or else the undiscovered country, death.


 

As I take them, these four “Let…” statements amount to sound wisdom, like trusted advice from teacher to student or parent to child. As a student and child, myself, writing this paper, I faced some question of certainty – the same question, strangely enough, that we ask about curriculum: what is worth including? By the same token, what is worth omitting, and from there, what will also be otherwise left out or unmentioned? Whatever we decide, one thing is certain: we can neither cover nor even conceive it all, which of course was my original problem. In fact, knowing as much as we know can even shed paradoxical light onto how much we remain in the dark. Eventually, as my Dad recommended over the phone, I simply needed the courage to make a decision and go with it, and even with his voice in my ear, I knew my own advice with my students had always been the same.

Hanging up, I reasoned further that any feedback I did receive – from peers during revision or from my professor’s formal evaluation – would illustrate how effectively I had collated and communicated my message. Beyond that, say revising the paper for publishing, I would have some ready direction. And there it was, I realised, staring me in the face, curriculum in a nutshell: conversations, decisions, actions, evaluations, reflections – all these, in relation to me as I wrote this essay, amounted to a lived curricular experience of my very own.[10] My curriculum, like this essay, does not simply pose the straightforward question about what is worth including. That question is insufficient. More particularly, my curriculum, like this essay,[11] prompts me to consider what is worth including in light of the audience, the topic, what is already known about the topic, and whatever aims exist in further pursuit of the topic.[12] Succinctly, my curriculum – all our curricula – is contextual, multitudinous, and a question of – questions of – what is particularly worth knowing about any topic of study under the sun: “Why this, why here, and why now?”[13] That is the question.

Well, maybe that is the question. The essence of this question, this curricular particular, lies in kairos, the concept of opportune timing or occasion that “signals the need to bring universal ideas and principles to bear in historical time and situations [i.e., deductively] and, thus, calls for decisions [requiring] wisdom and critical judgment” (Smith, 1986, p. 15). We can only note what matters to us once we have a reference point. And since nothing occurs in a vacuum, any detail can be potentially informative, so we must learn to pointedly ask not, “In what way(s) do I already know what I’m looking at?” but rather, “In what way(s) do I not know what I am looking at?” which tends to be deductive. Typically, curriculum begins inductively, with what someone already knows, and we all know plenty of things. But we generally bring to bear only what we deem relevant to the moment. By the same token, someone who knows what is relevant to the moment has a kind of prescient “mechanism” (Christodoulou, 2014, p. 54) for spotting what will likely be of use.[14] So curriculum is a means of determining, if not discovering, in the moment what works. It is, therefore, also a means of coming to know ourselves.

As we develop confidence and self-esteem, and dignity, we grow to feel that we have something to contribute, that we matter, all of which prepares us for helping others. Curriculum helps us to sort out our values and beliefs,[15] which provide a frame-of-reference in order to select and, later, to measure our day-to-day efforts. Of course, none of this happens immediately; we need time to grow more self- and other-aware, each kairos experience filing in alongside the rest, like a crowd of ticket holders. I can only wonder whether Shakespeare might have characterised curriculum as something akin to being held over for an indefinite engagement. In any event, we never stop learning – may our auditoriums ever sell out – as we continually induce as well as encounter influence. But how deliberately do we do so? Maybe that is the question.

 

Click here for the Bibliography

Click here for Pt III. A Scripture of Truth

 


Endnotes

[1] As Baumlin (2002) notes, “For the student of prudentia, time reveals itself as golden Opportunity rather than as fickle, devastating Fortune” (p. 141). Certainly, Shakespeare and his Elizabethan audiences were feeling such debate permeate their own lived experiences, a dram of an idea that, once diffused, might only thereafter suffuse.

[2] According to Claflin (1921), “‘opportunity’ in Shakespeare means more than it does now [in the 20th century]; it is closer to the original force of Latin opportunus, and means ‘a specially favourable occasion’” (p. 347). Curiously enough, however, as I searched a concordance of Hamlet (Crystal & Crystal, 2002), I found no usage of “opportunity” whatsoever and only three of “chance,” most notably that of Hamlet to Horatio: “You that look pale and tremble at this chance…” (5.2.334) in reference to the dead and dying at the play’s closing. Of further interest is the concordance’s report that Shakespeare used “opportunity” throughout his entire catalogue of poems and plays only sixteen times as compared to “chance,” which he used 114 times.

[3] Kiefer (1983) examines Fortune at length as one colour in Shakespeare’s palette for his characters, noting of King Lear: “In no other of Shakespeare’s plays do characters invoke Fortune so insistently [or] so frequently at pivotal points of the action” (p. 296).

[4] Read either “certainty” or “our capacity,” here, in place of “it”; either works just as well. The line from the play I have paraphrased, of course, because the original antecedent is “our ends” (5.2.10) in place of “them” (5.2.11). However, where I have changed the diction of the thought, as a matter of perspective, the meaning remains intact. The implication that we – in essence – play God might not be nearly so alien for Shakespeare’s audience as to their feudal predecessors. By contrast, to postmodern audiences these days, the notion of a divinity standing apart from our own free will and shaping our ends might be the more alien concept.

I might finally point out that Shakespeare, as his creator, is Hamlet’s god, of a kind. But that analogy does not last long under scrutiny since Hamlet, being a fictional character, has no sentience, free agency, or tangibility, and actors who portray him are left with prescribed dialogue and beliefs.

[5] Because I am ultimately discussing what Shakespeare did, his characters being only conveyances as such, I was tempted to complete this sentence with a line from Macbeth, as follows: “The extent to which he considers himself master of his own fortune, presently in the graveyard, is laid plain for Hamlet, leaving him to conclude only that ‘…all our yesterdays have lighted fools the way to dusty death’ (5.5.22-23).” The key difference, of course, is that Hamlet decides against being a fool whereas Macbeth seems all too keen to excel at it. Where Hamlet best demonstrates a respect for “divinity [shaping] our ends,” Macbeth better represents the rough-hewing bit, which makes him a far less redeeming character in the end. So, upon reflection, it seemed prudent to stick substantively to just the one play. Thank heaven for endnotes, I guess.

[6] Had he fallen clearly to one side, as a subject to his monarch, Shakespeare might very well have sealed whatever freedom he did enjoy; his own response, evidently, was to render unto Caesar, and render unto God, and continue writing plays. Four centuries on, what is there about us, that we might think we are any less susceptible than he was to coming to terms with our finite nature? We live in civil society, by the rule of law under a Constitution, within which are Rights and Freedoms that include the assurance to believe, or not to believe, whatever we decide suits us best. Furthermore, we have the advantage over Hamlet in that his example exhorts us, interminably – just ask my students, remember? Alas, though, poor Yorick.

[7] As Horatio notes, “It must be shortly known [to Claudius]” that Hamlet has tricked Rosencrantz and Guildenstern to their deaths at the hands of England (5.2.71-72), a move by Hamlet in his contest that must certainly harden his uncle’s resolve to have Hamlet dealt with once and for all. Of course, Claudius had sent Hamlet to England to be killed, but in secret, on account of both Gertrude and the public’s love for the Prince (4.7.5-24). However, in dispatching his childhood comrades – and with such calculation (5.2.57-70) – Hamlet has now given Claudius justifiable means to overcome any such favourable opinion as might have benefitted Gertrude’s “son” (5.1.296).

[8] Time and place are what we commonly refer to as setting in English class, which is a curious way to consider eternity.

[9] Seldom mentioned amidst all the consternation is that Hamlet does not actually ask a question. If he had, he might have worded it as, “Is it to be, or not to be?” In that case, we would need to know what “it” means. Alive? Dead? Happy? Sad? Anything goes, I suppose, but then… what would you expect? He might have been asking, “Am I…” or “Are we to be, or not to be?” But where that is still somewhat existential and vague, now we might want to know whether his use of the verb, to be, is more open-ended or copular. I suspect Shakespeare knew enough about a general audience to trust that only the most fastidious grammarians would fuss over particulars such as antecedents and verb tenses in the dialogue. Otherwise, why decide to use the most protean verb in the English language?

[10] As far as lived curricular experiences go, there are many like it – as many as there are people to have them – but this one is mine.

[11] At this early stage, I confess: why struggle writing a paper when I could use the age-old trick of writing a paper about writing the paper? Why…? Because the age-old trick is just that – a trick – and spinning academic wheels stalls any hope of contributing to knowledge, so I would hardly be honouring my responsibility if I tried pulling that off. Still… the paper within a paper got me thinking about Hamlet, which oddly enough had been my original inspiration for this essay. As my students used to say… once you study Hamlet, he just never goes away. How true, how true…

[12] According to Hartmann (2014), it was just such questions that prompted Ezra Klein to leave The Washington Post and establish Vox.com in 2014.

[13] Students in all my courses learned to rue the question “Why?” so much so, one year, that it became a running joke simply to utter “why” as a pat-response, whether as a question, an interjection, a plea, a curse, an epithet – those last two maybe reserved for me, I don’t really know. In honour of their perseverance, and their angst, I named my blog The Rhetorical WHY.

[14] Surrounded by Winkies, confronted by certain capture, only Scarecrow eyes the chandelier above the Wicked Witch, so only he can yank Tin Man’s axe across in time to chop the rope that suspends it. Hardly the grandeur or the gravitas of Hamlet, I realise, but The Wizard of Oz has much to offer pertaining to curricular theory as well as teacher autonomy.

[15] In keeping with the three temporal concepts, perhaps a more suitable metaphor than threading our own needles would be to say we surf a long pipeline. But, this essay being more concerned with curriculum and theatre, any such Hang-Ten imagery is better suited to another time, like connecting curriculum to gnarly waves and bodacious beaches (“Surf’s Up,” 2015). Anyway, certainly no one would ever dream of linking Hamlet to surfing (“’Hamlet’s BlackBerry’,” 2010) in the first place, would they?

A Kind of Certainty: I. An Uncertain Faith

Well, you had to know this one was coming… a meditation upon Hamlet.

This meditation, though, also happens to be a treatise on curriculum. I wrote this essay last year for a course I took with Dr William Pinar, who is Curricular Royalty on top of being a super guy. And, like me, he taught secondary English, so I felt I had a sympathetic ear.

Dr Pinar’s course was driven by chapters he’s written for an upcoming book about George Grant, who was (among many things) a philosopher, theologian, educator, and Canadian nationalist. Dr Pinar’s book is about Grant’s critique of time, technology, and teaching.

The series of posts, “A Kind of Certainty,” comprises my final paper, in which I attempt to present Hamlet, the character, by way of the same treatment that Dr Pinar presents Grant. That said, I don’t address technology here (although I do address it elsewhere), focusing instead upon teaching and curriculum, and granting due respect to the concept of time.

I debated how I might present this essay, whether to revise it into something more suited to the style and structure of my other blog posts. But it just proved far too difficult to change or remove anything without drastic revision, essentially having to rewrite the entire paper, so here it is in academic trim… citations, endnotes, and all – Dr Pinar is a big fan of endnotes, by the by, so that’s the explanation there.

Here, too, is the bibliography.

 


A Kind of Certainty

1. An Uncertain Faith

I taught Hamlet in English 11. During what typically lasted five months, we considered, among other concepts, certainty and faith. One example of mine to illustrate these was to ask a student why she sat down so readily on her classroom chair. She would be puzzled and say something like, “Huh?” My reply was to note how much faith she evidently placed in that chair to support her without collapsing. Then she would laugh, and I would ask further whether she knew the manufacturer, or the designer, but of course she knew neither. Then I would ask how many other chairs that week had collapsed beneath her, and (apart from one, unfortunately!) the reply would be, “None.” My point, of course, grew clearer to everyone as this conversation progressed, so my next question was to the class: “How many people rode in a vehicle sometime this past week?” Once most confirmed it, I would ask the same basic question as that of the chair: how were you certain that vehicle was safe? I was more tactful where it came to car accidents, usually using my own spectacular examples (… I have two). Ultimately, my claim was that we might have as much as 99% certainty, yet for whatever doubt exists, we rely on faith or else we would never sit in chairs, or drive in cars, or whatever else. As my tone grew more grave, so did their nods and expressions, as if we ought to be dropping Hamlet to study car mechanics, or industrial first aid.

My students were typically alarmed when they realised their faith was only as certain as its object, be it a sturdy or rickety chair. Extremes present themselves rather obviously but, even so, in any case of such offhanded faith, we make ourselves collateral. As if we live on credit, certain that all will remain as it has done, we borrow on faith against our future well-being until it comes time, as it says in the fable, to pay the piper. Meanwhile, what seems certain to us we literally take for granted, begging the question with impunity, I suppose, since every day the sun continues to rise.[1] Every day, we overlook the caution, familiar to investors, that past performance does not necessarily indicate future potential, or as they say in the casino, the House never loses.

Maybe we never stop to consider just how loosely we play with certainty and faith in our day-to-day because doing so might mean never again stepping outside the door – no sense everyone being as hamstrung as the Prince of Denmark. Having studied the play as much as I have, I find every one of its concepts up for debate – arrghh – and where certainty and faith can actually seem either opposed or synonymous, that determination depends on yet another concept from the play, perspective. In any case, where it comes to certainty and faith – at least from my perspective – Hamlet is particularly instructive.

No matter your perspective, I would warn students, no matter where you stand or land, the play will then present you with a challenge of certainty, something I called the “Yeah, but…,” which was naturally a source of unending frustration. Conversely, and ironically, it was also a source of certainty since, like Hamlet in duplicitous Elsinore,[2] at least we can be certain that everybody else thinks, shall we say, uniquely, if not differently. Hamlet’s return home to the web of Catholic Elsinore from the symbolic bastion of Lutheran reform, Wittenberg, on account of his father’s death, finds him divided not unlike the Elizabethans comprising Shakespeare’s audience, caught between two branches of Christian belief.[3] The Bard besets his tragic hero with a matrix of inner turmoil – both secular and spiritual, of fealty and faith – a tesseract of beliefs such that Hamlet cannot reconcile any one to another, even as he quakes yet pines for some grand repose. For each possible value he might set down in his tables, his same self-assurance prompts Hamlet to pose questions more profound, rendering him unable to decide about, well, anything. Doubting that anyone can even interpret what it means to exist and, thereby, doubting that concern over living, or dying, or even debating the question is worthwhile, Hamlet, like the actors he so admires, effectively stands for nothing. As such, I admitted to my students, he was hardly an exemplary role model.

So, I suggested, to avoid the debilitating trap that befalls the brooding Prince, that of “thinking too precisely on the event” (Shakespeare, 1997, 4.4.41),[4] we must simply and ultimately decide what we believe after having drawn such conclusions from the best available evidence. Easily said, yet is this not exactly what Hamlet is trying to do? Little wonder students find him so frustrating. Then again, I pointed out, all our sighing and huffing is its own judgment call, a very palpable hit borne of the frustration of those who are upset with him. With Hamlet’s inability to decide for most of the play comprising most of the play, and with him chastising his own cowardice and rebuking God-given reason as a consequence (2.2.571-580, 4.4.36-39, 43), a spendthrift sigh of our own is hardly unreasonable. On the other hand, observed one student, well on her way to modern material success, he sells tickets. Unquestionably, yes, Shakespeare made a meal of Hamlet making a meal of things. And, even though he doomed his protagonist from the start, the playwright does release Hamlet from his torturous hamster wheel – mercifully? – just before he meets his grand moment of truth.

Throughout the play, Shakespeare includes what I call “Let…” statements. Of particular significance are the following four statements, presented here in sequential order:

  1. Of Claudius’s machinations, Hamlet tells Gertrude to “let it work” (3.4.205)
  2. Exacting vengeance for his father’s murder, Laertes will “let come what comes” (4.5.136)
  3. Having finally made peace with the certainty of death as well as the uncertainty of what lies beyond, Hamlet tells himself (alongside Horatio) to “let be” (5.2.224)
  4. Later, as Horatio confronts doubts of his own, Hamlet tells him to “let go” (5.2.343)

Alternatively arranged, these statements help comprise, for me, a response to the famous question, “To be, or not to be.”[5] This alternative arrangement derives from a sentence analysis exercise that my students and I would complete while preparing for the play. The sentence is from an essay by Drez (2001) about American pilots during WWII: “There were no souvenirs, but the grisly task of scrubbing decomposing remains from their boots later left a lasting memory” (p. 144). Briefly, the words later, left, and lasting illustrate the creation and the span of the airmen’s memories over time – the future, past, and present, respectively – made all the more ironic since the souvenirs they found were hardly the ones they sought. Using these three words alongside my own interpretation of each “Let…” statement, I have arranged them chronologically out-of-sequence with the play, using instead an interpretive application of temporality as three discrete periods[6] to challenge the common concept of linear time as historical calendar pages or a ticking clock.

 

Click here for the Bibliography

Click here to read Pt II: Curriculum, or What You Will

 


Endnotes

[1] Shame on us for carrying on so fallaciously! At pedestrian-controlled stoplights, we eventually step off the curb believing that drivers have halted their oncoming vehicles rather than carrying on through and running us down. To call the stoplight “pedestrian-controlled” is somewhat of an embellishment on the part of the city engineers, I think, a deferral to who really is favoured, for whatever reason, in the equation. But for the pedestrian to step off the curb is an act of faith, surely, since they cede control to the driver who has the car’s capability to accelerate and manoeuvre at his disposal. For that brief moment, only the driver’s motives keep the pedestrian safe. And careful though we are, accidents still happen in such everyday circumstances. Worst of all, as more recent times demonstrate, cars and trucks can be used precisely as weapons of terror against innocent people; the danger I speak of, the giving-and-taking of control, however uncommon, has now been realised. That changes attitudes profoundly.

Security measures, safety audits, protective equipment, government regulations – on and on goes the list of processes and people in which we place our faith, believing with some degree of certainty – or, as often as not, taking for granted on faith – that proper standards are being met that ensure our safety.

[2] Just my interpretation, mind you, “duplicitous Elsinore.” Certainly, you will have your own analysis.

[3] Since the time of those events described in the New Testament, their interpretation has divided Christian belief into myriad denominations, such as those found in both Shakespeare’s play and Elizabethan England: Catholicism and two respective branches of reform, the Protestant Reformation initiated by Martin Luther and the English Reformation decreed by King Henry VIII. I simply use “Christian belief” in a broad sense, wanting to avoid the suggestion that any particular denomination tops some hierarchy, since that sort of debate, here, is beside the point.

[4] For the duration of the essay, I shall refer to quotes from this cited edition of the play.

[5] Regrettably, but unsurprisingly, I’m hardly the first to devise this response to the famous question. Evidently, where my approach differs from other examples (Baumlin & Baumlin, 2002; Critchley & Webster, 2011) is connecting the four specified “Let…“ statements and Hamlet’s closing lines (5.2. 222-223, 358) with concepts of temporality.

[6] A full explanation of the four “Let…” statements and temporality demands its own essay, and I am already deep enough into Hamlet as it is, so for my weary negligence I ask some gracious leeway instead of a challenging “Yeah, but…”. Suffice to say, though, as we might feel this way or that about past or future, we still must inherently live each present moment, such as we are.

Teaching Open-Mindedly in the Post-Truth Era

A year on, and this one, sadly, only seems more relevant…

[Originally published June 16, 2017]

I had brilliant students, can’t say enough about them, won’t stop trying. I happened to be in touch with one alumna – as sharp a thinker as I’ve ever met, and a beautiful writer – in the wake of the 2016 U.S. election campaign and wrote the following piece in response to a question she posed:

How do you teach open-mindedly in the post-truth era?

I was pleased that she asked, doubly so at having a challenging question to consider. And I thoroughly enjoyed the chance to compose a thoughtful reply.

I’ve revised things a little, for a broader audience, but the substance remains unchanged.

How do you teach open-mindedly in the post-truth era?

Good heavens. Hmm… with respect for people’s dignity is my most immediate response. But such a question.

Ultimately, it takes two because any kind of teaching is a relationship – better still, a rapport, listening and speaking in turn, and willingly. Listening, not just hearing. But if listening (and speaking) is interpreting, then bias is inescapable, and there need to be continual back-and-forth efforts to clarify, motivated by incentives to want to understand: that means mutual trust and respect, and both sides openly committed. So one question I’d pose back to this question pertains to the motives and incentives for teaching (or learning) ‘X’ in the first place. Maybe this question needs a scenario, to really illustrate details, but trust and respect seem generally clear enough.

Without trust and respect, Side ‘A’ is left to say, “Well, maybe some day they’ll come around to our way of thinking” (… that being a kind portrayal) and simply walks away. This, I think, is closed-minded to the degree that ‘A’ hasn’t sought to reach a thorough understanding (although maybe ‘A’ has). Whatever the case, it’s not necessarily mean-spirited that someone might say this. With the best intentions, ‘A’ might conclude that ‘B’ is just not ready for the “truth.” More broadly, I’d consider ‘A’s attitude more akin to quitting than teaching, which is to say a total failure to “teach”, as far as I define it from your question. It would differ somewhat if ‘A’ were the learner saying this vs being the teacher. In that case, we might conclude that the learner lacked motivation or confidence, for some reason, or perhaps felt alone or unsupported, but again… scenarios.

Another thing to say is, “Well, you just can’t argue with stupid,” as in we can’t even agree on facts, but saying this is certainly passing judgment on ol’ stupid over there, and perhaps also less than open-minded. To be clear… personally, I’d never say bias precludes truth, only that we’ll never escape our biases. The real trouble is having bias at all, which I think is what necessitates trust and respect because the less we have of these, the more turmoil. I figure any person’s incentive to listen arises from whatever they think will be to their own benefit for having listened. But “benefit” you could define to infinity, and that’s where the post-truth bit is really the troublesome bit because all you have is to trust the other person’s interpretation, and they yours, or else not.

Yeah, I see “post-truth” as “anti-trust,” and that’s a powderkeg, the most ominous outcome to arise of late. People need incentives to listen, but if treating them with dignity and respect isn’t reaching them, then a positive relationship with me wasn’t likely what they wanted to begin with. That’s telling of the one side, if not both sides. At the same time, it’s harder to say in my experience that students have no incentives to listen or that, on account of some broader post-truth culture, they don’t trust teachers – that might be changing, who knows, but I hope not.

But I’m leaving some of your question behind, and I don’t want to lose sight of where it’s directed more towards the person doing the teaching (you asked, how do you teach open-mindedly…).

That part of the question was also in my immediate reaction: respect people’s dignity. For me, when I’m teaching, if I’m to have any hope of being open-minded, I intentionally need to respect the other person’s dignity. I need to be more self-aware, on a sliding scale, as to how open- or closed-minded I’m being just now, on this-or-that issue. So even while that’s empathy, it’s also self-aware, and it’s intentional. It’s not “me” and “the other.” It’s “us.”

Me being me, I’d still be the realist and say you just can never really know what that other person’s motive truly is – whether it’s a pre-truth or post-truth world doesn’t matter. But whether or not you trust the other, or they you, the really valuable skill is being able to discern flaws of reason, which is what I always said about you – you’ve always been one to see through the bullshit and get to the core of something. I’m no guru or icon, I’m just me, but as I see it just now, the zeitgeist is an emotional one more than a rational one. And there’s plenty to explain why that might be the case. And given that emotional dominance, I do think post-truth makes the world potentially far more dangerous.

Whatever incentives people are identifying for themselves, these days, are pretty distinct, and that’s a hard one for unity. That saying about partisan politics – “We want the same things; we just differ how to get there” – that doesn’t apply as widely right now. So, by virtue of the other side being “the other” side, neither side’s even able to be open-minded beyond themselves because trust and respect are encased in the echo chambers. More than I’ve ever known, things have become distinctly divisive – partisan politics, I mean – and I wonder how much more deeply those divisions have room to cut. Selfish incentives cut the deepest. Trust and respect guard us from deep cuts.

So, for instance, lately I find with my Dad that I listen and may not always agree, but where I don’t always agree, he’s still my Dad, and I find myself considering what he says based on his longevity – he’s seen the historic cycle, lived through history repeating itself. And I obviously trust and respect my Dad, figuring, on certain issues, that he must know more than me. On other issues, he claims to know more. On others still, I presume he does. Based on trust and respect, I give him the benefit of the doubt, through and through. One of us has to give, when we disagree, or else we’d just continually argue over every disagreement. If you want peace, someone has to give, right? Better that both share it, but eventually one must acquiesce to their “doubt” and make their “benefit” finite, stop the cutting, compromise themselves, if they’re to see an end to the debate. So should I trust my Dad? I respect him because he’s given me plenty good reason after such a long time. Certainly I’m familiar with his bias, grown accustomed to it – how many times over my life have I simply taken his bias for granted? Too bad the rest of the world doesn’t get along as well as my Dad and I do.

I see it even more clearly with my daughter, now, who trusts me on account of (i) her vulnerability yet (ii) my love. The more she lives and learns alongside me, as time passes by, the more cyclically her outlook is reiterated, a bit like a self-fulfilling prophecy. Other parents have warned me that the day’s coming when she’ll become the cynical teenager, and I’m sure it will – I remember going through it, myself. But I’m older, now, and back to respecting my Dad, so at least for some relationships, the benefit of the doubt returns. My Dad preceded me, kept different circles than me, and lived through two or three very different generations than me. Even as we see the same world, we kind of don’t. So this is what I wonder about that deep cut of division, reaching the level of family – and, further than one given family, right across the entire population. Do I fact-check my Dad, or myself, or maybe both? Should I? Even if I do, neither one of us is infallible, and we’re only as trustworthy as our fact-checking proficiency.

Anyway, the child of the parent, it’s as good an example as I can think of for questioning what it means to learn with an open mind because there’s no such thing as “unbiased.” Yet love, trust, and respect are hardly what we’d call “closed-minded,” except that they are, just in a positive way. Love, trust, and respect leave no room for scepticism, wariness, and such traits as we consider acceptable in healthy proportions (for reasons about motive that I explained above).

But “teaching” with an open mind takes on so much more baggage, I think, because the teacher occupies the de facto as well as the de jure seat of power, at least early on – school is not a democracy (although that now seems to be changing, too). Yet teachers are no more or less trustworthy on the face of it than any other person. That’s probably most of all why I reduce my response to respecting human dignity because where it’s closed-minded, for all its “positive,” it’s also a do-no-harm approach.

That jibes with everything I’ve learned about good teaching, as in good teaching ultimately reduces to strong, healthy relationships. Short-term fear vs long-term respect – it’s obvious which has more lasting positive influence. And since influencing others with our bias is inevitable, we ought to take responsibility for pursuing constructive outcomes, or else it’s all just so much gambling. At the core, something has to matter to everybody, or we’re done.

Deciding over Derrida’s Différance

As far as I understand Jacques Derrida’s différance, he observes that we understand our experiences as distinctive, but not exhaustive, communicated links or marks comprising an on-going decisive chain of experiential moments. As to the language we use to describe our experiences, a word has contextual meaning, both from its usage at any given time as well as from its etymology over the course of time. I tend to agree with this attendance to context as furnishing meaning, and I can also spot the rabbit hole that it poses. For example, to understand some word’s definition, I might look it up in the dictionary and be left to rely upon the definition of whoever decided what it meant while, at the same time, facing all sorts of words in the definition that now need looking up, too – Sisyphean, indeed! Cruel but so usual. On the other hand, thanks to whoever compiled the dictionary, a pretty utile compendium, I have to say.

To be clear, I am not intending to invoke logocentrism, by which all our words are accorded a decided meaning from a cultural centre, which propagates existing biases or “privileges”; Derrida would roll over in his grave. Granted, I may already have laid grounds here to be accused of logocentrism, myself, by writing with words (and I confess to using English because I didn’t think anyone had the patience to muddle over Wingdings). My present aim is to suggest how we might address the aforementioned rabbit-hole dilemma by searching for or (… almost afraid to say it) by deciding upon some definitions of our own. Not like a dictionary, but more like– well yes, okay, like a dictionary, but one that we’ll fashion from the ground up, like when the light bulb would go on above Darla’s head, and Spanky would snap his fingers to say, “Hey, everyone! Maybe we can put on a play!” So, in the spirit of dissemination, hey everybody, maybe we can compile a dictionary! A real, deconstructive, crowd-sourced dictionary!

I’m not really compiling a dictionary. I’m just trying to make some sense of Derrida and différance. Let me try to illustrate what I mean from my own experience. Sometimes I play Walking Football, a version of the game where players are not permitted to run. Naturally, the debate is over what differentiates walking from running. We’ve agreed that walking means “always having at least one foot in contact with the ground during the striding motion.” Running means “having both feet leave the ground at some point during the striding motion.” This makes for certainty, so long as our eyes are trained enough to spot feet in motion – which I can spot sometimes so clearly, with such immediacy, that it’s more like I’m watching, not playing; I’m finding it tough even now to get the right words, but trust me. And so long as each player is willing to obey the rules – and, ohh my, there’s always that one player who just won’t. You know who I mean… *sigh… Anyway, so long as they’re not just words uttered that then float away in the breeze, our definitions of the rules for walking and running are useful.

Luckily, too, I might add, when we clarify the rules, we do so out loud, together, and don’t whisper them around in a circle, like when my daughter plays Telephone at a birthday party – after all, we want everyone to be clear. Finally, even if we have trouble spotting feet in motion, because it all happens too quickly, or even if that one player is a cheater at heart, the definitions themselves remain clear, and usually at least one or two of us can remember them well enough to recite back, as needed, usually with a lot of finger-pointing and furrowed brows. One time we even wrote the no-running rule on the gym chalkboard, and even though no one challenged this on the grounds that writing is secondary to speech, everyone still understood why it was scrawled there, by which I mean everyone knew exactly who should read it the most – wow, does every game have that player? Incorrigible.

Bottom line: accountability is down to the sincerity and respect offered to each player by every other player who decides to participate. As an aside, the need for a referee, an arbiter, is all the more clear when the stakes are as high as bragging rights and free beer. But, even as we play for fun, the rules exist or else the game, as such, does not. (On that note, I find a lot of players just don’t like Walking Football and would rather play with running, and that’s fine, too: it’s their decision, and plenty of other like-minded players keep both games afloat. I find the Walking game amplifies decision-making, so maybe this feature just appeals to me. And I still play traditional football, too.) My broader point is that any one person must decide to accept what has been defined and, likewise, any group of people must reach a consensus. Shared meaning matters because, otherwise, as I say, we don’t have a game, or else we have a very different one, or we just have anarchy. But whether that person, alone, or the group, altogether, searching for a way to decide upon meaning, has the patience to delve down the rabbit hole… well, yes, context does indeed matter – both usage and etymology. I’ve said and written as much, myself, for a long time. So, in light of all this, I hope I’ve gathered a little something of Derrida’s différance. I’m still learning.

Another illustration: in my teaching, I occasionally introduced this matter of contextual meaning by offering students a list of synonyms: “slim,” “slender,” “skinny,” “thin,” “narrow.” Each word, of course, has its own particular meaning. “If they meant the same thing,” I offered, “then we’d use the same word,” so just what explains the need for all these synonyms? Well, students would say, there are lots of different things out there that possess or demonstrate these various adjectives (my word, not theirs), so we’ve come up with words to describe them (and I think that’s a charitable “we,” like the Royal “We.”) As the discussion proceeded, I might ask which of these words typically describe human traits versus those – leaving aside metaphors – that typically do not. Next, which words typically possess positive connotations, and which negative, or neutral? And, as it pertains to the personification metaphors, which words are more easily envisioned versus those that really stretch the imagination, or even credibility? Eventually, I would shift from ontology to epistemology, posing the questions at the heart of my intention: For any of the previous questions about these synonyms, how do you know what you’re talking about? For what each of these words could mean, where have your assurances come from? Of course, the most frequent reply to that question was “the dictionary,” followed by “my parents” or “books I’ve read,” or “just everyday experience, listening and talking to people.” Occasionally, the reply was something akin to “Who cares… it just means what it means, doesn’t it?” In every reply, though, one common thread was detectable: the involvement of other people as part of the meaning-making process. Fair enough, we can’t all be Thoreau.

One more example: when is “red” no longer red but perhaps orange or purple? Well, for one thing, if you’re colour blind, the question means something entirely different, which I say not flippantly but again to illustrate how important dialogue and community are to deciding what something means. For another thing, we might wish to ask, in keeping with context-dependency, “Why even ask?” Again, this is not flippant or dismissive but practical: when does it matter so much that we distinctly need to identify the colour red? Where a group of people might face the question over what is red versus what is orange or purple, we might expect some kind of discussion to ensue. And, whether asking as part of such a group or as a hermit, alone, I submit that the decision about what is “red” is ultimately down to one person to make: “Red is this,” or “This is red,” or even, “Gosh, I still can’t really decide.” Even a coerced decision we can still attribute to the one who forces the issue – one person has decided on behalf of another, however benignly or violently: might makes right, or red, as it were.

Coercion introduces a political consideration about whose authority or power has influence, similar to needing referees on account of those players who decide to run. The point, for now, is simply that a decision over what something means to a person is ultimately made by that person, leaving others to deal with the decision on their own terms in whatever way. But other people are part of the meaning-making process, even passively, or else I wouldn’t need language to begin with since the rest of you wouldn’t trouble me by existing. Not to worry, by the way, I appreciate you reading this far. From what I understand (and I am convinced I must learn more, being no avid student of either postmodernism or Derrida), Derrida’s observation of différance either discounts or else offers no account of the arbitrary decisions people make when they decide they’ve had enough. People tend to land somewhere in a community, and it’s the rare person who lives and plays wholly and uncompromisingly by their own rules. However, the fact that he felt différance was worth the effort to publicise and explain to the rest of us does reflect an arbitrary decision on his part and says something about him.

So this is where I have more fundamental trouble understanding Derrida and différance – the very notion of “different,” as in, in what world could there not be an arbiter? Even a life alone would face endless decisions: what to eat, where to go, when to sleep, and so forth. From such musing – speaking of rabbit holes – I was led to reading about another philosopher named Jacques, this one Rancière, and what he calls the axiom of equality. In pure nutshell form, I take this to mean that no (socio-political) inequality exists until it has been claimed to exist – and note that it’s claimed in a boat-rocking kind of way, what the kids these days are calling “disruptive.” The upshot is that equality, itself, can only ever be theoretical because someone somewhere inevitably is and always will be marginalised by the arbitrary decisions of cultural hegemony. Still learning.

Back to the Walking Football analogy: if the rabbit hole of defining a word in the context of those that surround it, and then having to define, even further, all those words, and on and on, and experience is inexhaustible, and what’s the point, and lift a glass to nihilism… if that kind of limitless indefinite deconstructive search-and-compare lies at the heart of what is different, then maybe Derrida just found it difficult to reach agreement with other people. It stands to reason that, if he played Walking Football, Derrida might be the worst cheater on the floor, continually running when he should be walking, then denying it just the same as he tried to gain advantage. Maybe, fed up being called a cheater, he would take his ball and go home to play by himself, where no one could say he was wrong. Being alone, who would be there, whether as an obedient player or as a sneaky one, to challenge him? In fact, maybe that’s why he chose to return to play the next day – for all the arguing, he enjoyed the game, or the attention, or the camaraderie, or the exercise, or whatever, more than being accused of cheating. I wonder if, perhaps, in the great game of philosophy football, he would have been the only rival to strike real fear in Diogenes – I mean awe & respect kind of fear, just to clarify, and I mean if they had lived at the same time. It’s hard to know about Diogenes since nothing he wrote down ever survived, and these days, I doubt more than a few can recall any of whatever he said, besides that lamp-carrying honesty thing. (We should all have such good spirit when it comes to our first principles.) Anyway, I think Diogenes played for Wimbledon.

Well, I am being unkind to Derrida. Evidently, he was a kinder person by nature than I have let on, as well as an advocate for all voices, all people. And consider the professional care, the uncompromising expertise he took to convey his ideas, to trouble himself with delving down the rabbit hole so arbitrarily – to go down at all but, moreover, to go so far when he might, just the same, have decided to halt. Delve as far as you like, but accept responsibility for your decision, every time. In that respect, how does Derrida differ from any other person facing decisions? Did he have still other motivations? No player who kicks a football is deliberately playing to lose, not unless they have been coerced by someone else to do so. On the other hand, for all I know, maybe what Derrida called red I would call blue. Be careful not to pass the ball to the wrong team! (By the way, in sport, dynasties are remembered precisely because they eventually come to an end.)

Was Derrida any less accountable and open to scrutiny than you, or me, or anybody else? To suggest that a word only makes sense based on how it differs from those around it is no less arbitrary than the reciprocal suggestion, that a word only makes sense based on how it describes only what it describes. Half-full / half-empty, six of one… Two sides of the same coin are still the same coin. Alternatively, who put him up to all this? Meanwhile, on his own, surely Derrida had it within himself, as people do when they reach a point, simply to say, “Here is enough. I decide to stop here. For me, [the item in question] means this.” If that doesn’t ring true and sound like him, well, I’d say that can be just as telling of his character; I heard it suggested, once, that we can be helped in knowing something by what it is not. So, fine – for Derrida to stake the claim called différance, I’m willing to concede him that moment. We all land somewhere, and we’re all hardly alike, even when we’re alike.

We are, each and every one of us, individual. But together we comprise something just as dynamic on a larger scale – one might construe us societally, or perhaps historically, anthropologically, or on and on, in whatever way through whichever lens. For me, différance appears an attempt to speak for all about all, prescriptively. A grand stab at philosophy, no question, and that’s the beauty of the equality of philosophy, with thanks to Rancière: we all have a part to play and a right to respond. For the time being, as I have understood Derrida and his thinking, and I willingly stand to be instructed further, différance strikes me as ironic, being an advocacy for the dynamic development of people and language and culture that self-assuredly asserts its own accuracy. That is not an uncommon indictment of postmodernists. What’s more, it is ohh, so human.

What On Earth Were They Thinking?

How often have you heard somebody question people who lived “back then,” in the swirling historical mist, who somehow just didn’t have the knowledge, the nuance, or the capability that we so proudly wield today?

“Things back then were just a lot more simple.”

“They just weren’t as developed back then as we are today.”

“Society back then was a lot less informed than we are today.”

It’s difficult to even confront such statements out of their context, but where I’ve had all three of these spoken to me in (separate) conversations, I challenged each impression as an insinuation that we’re somehow smarter than all the peoples of history, more skilled, more sophisticated, more aware, more woke (as they say, these days), that we’re, in the main, altogether better than our elders merely for having lived later than they did. These days, apparently, “we’re more evolved” – ha! 🙂 More evolved, that’s always a good one, as if back then everyone was Australopithecus while here we are jetting across the oceans, toting our iPhones, and drinking fine wines. Well, sure, maybe things have changed since back then, whenever “then” was. But, more typically, I’ve found that contemporary judgments levelled upon history are born of an unintended arrogance, a symptom of 20/20 hindsight and the self-righteous assurance that, today, we’ve finally seen the error – actually, make that the errors – of our ways.

Surely, these days, few – if any – would believe that we’re ignorant or unaccomplished or incapable, not the way someone might say of us while looking back from our future. At any given point on the historical timeline, I wonder whether a person at that point would feel as good about their era, looking back, as any person on some other point of the timeline would feel, also looking back, about theirs. Is it a common tendency, this judgment of contemporary superiority? These days, we might well feel superior to people who had no indoor plumbing or viral inoculations or Internet access, just as someone earlier would have appreciated, say, some technological tool, a hydraulic lift to raise heavy objects, or a set of pulleys, or a first-class lever – choose whatever you like! It really is arbitrary for a person, looking back at history, to feel better about contemporary life because contemporary life infuses all that person’s experience while history’s something that must be learnt.

I’d say that learning history means learning whatever has lasted and been passed on because what has lasted and been passed on was deemed to have merit. We’re taught the history that has been deemed worth remembering. What I’ve found has been deemed worth remembering (i.e., the kinds of things I learned from History classes) are the general mood of the times, the disputes and fights that occurred (violent or academic), a select handful of the figures involved, and an inventory of whichever non-living innovations and technologies simultaneously arose alongside it all. If, later, we demerit and no longer pass on what has lasted up until then, passing on instead some different history, then that’s entirely indicative of us, now, versus anyone who came before us, and it reflects changed but not necessarily smarter or better priorities and values.

For me, we shouldn’t be saying we’re any smarter or better, only different. So much literature has lasted, so much art. Commerce has lasted, institutions have lasted, so much has lasted. Civilization has lasted. Cleverness, ingenuity, shrewdness, wit, insight, intellect, cunning, wisdom, kindness, compassion, deceit, pretence, honesty, so many, many human traits – and they all transcend generations and eras. People vary, but human nature perdures. I’ll trust the experts, far more immersed in specific historical study than me, to identify slow or subtle changes in our traits – hell, I’ll even grant we may have culturally evolved, after all – and I can only imagine in how many ways the world is different now as compared to before now. But what does it mean to be better? Better than other people? Really? And who decides, and what’s the measure?

We can measure efficiency, for instance, so to say technology has advanced and is better than before is, I think, fair. Even then, an individual will have a subjective opinion – yours, mine, anybody’s – making culture not proactive and definitive but rather reactive and variable, a reflection, the result of combined opinions that amplify what is shared and stifle what is not. As we’re taught merited history, we’re almost forced to concur, at least until we reconsider what has merit. That’s a sticking point because everyone will have an opinion on what is culturally better and what is culturally worse. Morality inevitably differs, and suddenly we have ethical debate, even disagreement, even discord. But to say people or culture are better is, I think, too subjective to rationalize and a questionable path to tread.

Consider this as well: we each know what we’ve learnt, and as individuals, we’ve each learnt what we value. But what we’re taught is what’s broadly valued and, thereby, prescribed for all. We’ve all heard that rather hackneyed epigram, that those who neglect history are doomed to repeat it. Well, maybe the ones screwing up just didn’t learn the right history to begin with. I tend to abide by another hackneyed epigram, that they are wisest who know how little they know. Real historical wisdom and real historical understanding would be seeing and thinking and understanding as people earlier did. But short of firing up the DeLorean for an extended visit some place, some time, it seems to me that judgments about history are made with an aplomb that might be better aimed at acknowledging our finite limitations. We’re no angels. If anything, this error of judgment speaks volumes about us. Condescension is what it is, but in my opinion, it’s no virtue.

We should hardly be judging the past as any less able or intelligent or kind or tolerant or virtuous than us, especially not if we aim to live up to today’s woke cultural embrace of acceptance. Being different should never be something critiqued; it should be something understood. Conversely, in proportion to how little we know, passing judgment is assumptive, and we all know what Oscar Wilde had to say about assuming (at least, we know if we’ve studied that piece of history). At the very least, we ought to admit our own current assumptions, mistakes, errors, accidents, troubles, disputes, and wars before we pass any judgment on historical ones. On that positive note, I will say that considering all this prompted me to notice something maybe more constructive – so often, at least in my experience, what we seemed to study in History class were troublemaking causes and effects, bad decisions, and selfishly motivated behaviours. Far more rarely was History class ever the study of effective decision-making and constructive endeavour – maybe the odd time, but not usually. Maybe my History teachers were, themselves, stifled as products of the system that educated them. What could they do but pass it along to me and my peers? Considering that, I might more readily understand how people, alive today, could conclude that all who came before were simply not as enlightened, as sophisticated, or as adept as we are now.

Yet that merely implicates contemporary ignorance: assumptions and mistakes still happen, errors still occur, accidents – preventable or not – haven’t stopped, troubles and disputes and wars rage on. If the axiom of being doomed to repeat history were no longer valid, we wouldn’t still feel and accept it as a truthful description, and it would have long ago faded from meaning. All I can figure is that we’re still poor at learning from history – the collective “we,” I mean, not you in particular (in case this essay was getting too personal). We need learned people in the right positions at the right times, if we hope to prevent the mistakes of history. Not enough people, I guess, have bothered to study the branches of history with genuine interest. Or, no, maybe enough people have studied various branches of history, but they don’t remember the lessons sharply enough to take them on board. Or, no no, maybe plenty of people remember history, but the circumstances they face are just different enough to tip the scale out of favour. Epigram time: history doesn’t repeat, but it does rhyme. Or maybe we’re just full of ourselves, thinking that we’ve got it all solved when, evidently, we don’t.

It also dawned on me, considering all this, that high school “History” influences what many people think about broader “history.” My high school experience, university too, was mostly a study of politics and geography, with what would be considered very elementary anthropology tossed in – all this as compared to the other branches of historical study. Archaeology and palaeontology come to mind as detailed, more scientific branches of history, but there are so many – literary history, philosophical history, religious, environmental, military, economic, technological, socio-cultural as I’ve already indicated, on and on they go, so many categories of human endeavour. I’ve even come across a thoughtful paper contemplating history as a kind of science, although one that is normative and susceptible to generational caprice. One final epigram: history is what gets written by the winners.

And that’s really the point here: throughout what we call human history, where we’ve subdivided it so many ways (right down to the perspective of every single person who ever bothered to contribute, if you want to break it down that far), it’s people all the way back, so it’s all biased, so it’s neither complete nor even accurate until you’ve spent oodles of time and effort creating a more composite comprehension of the available historical records. And, dear lord, who has time for that! History, in that respect, is barely conceivable in its entirety and hardly a thing to grasp so readily as to say, simply, “Back then…” History is people, and lives, and belief inscribed for all time. To know it is to know who lived it as well as who recorded it. Knowing others is empathy, and empathy is a skill trained and fuelled by curiosity and diligence, not emotion or opinion. Emotion and opinion come naturally and without effort. For me, valid history derives from informed empathy, not the other way around.

As far as recording history for future study goes, ultimately, it will have been people again recording and studying all of it, “it” being all of what people were doing to attract attention, and “doing” being whatever we care to remember and record. It’s all a bit cyclical, in itself, and completely biased, and someone will always be left out. So people might be forgiven when shaking their heads in judgment of the past because, without empathy, what else could they possibly know besides themselves?