Before introducing the moral pairing of right and wrong to my students, I actually began with selfish and selfless because I believe morality has a subjective element, even in the context of religion, where we tend to decide for ourselves whether or not we believe or subscribe to a faith.
As I propose them, selfish and selfless are literal, more tangible, even quantifiable: there’s me, and there’s not me. For this reason, I conversely used right and wrong to discuss thinking and bias. For instance, we often discussed Hamlet’s invocation of thinking: “… there is nothing good or bad, but thinking makes it so” (2.2.249-250). Good and bad, good and evil, right and wrong… while not exactly synonymous, these different pairings do play in the same ballpark. Still, as I often said to my students about synonyms, “If they meant the same thing, we’d use the same word.” So leaving good and bad to the pet dog, and good and evil to fairy tales, I presently consider the pairing of right and wrong, by which I mean morality, as a means to reconcile Hamlet’s declaration about thinking as some kind of moral authority.
My own thinking is that we have an innate sense of right and wrong, deriving in part from empathy, our capacity to stand in someone else’s shoes and identify with that perspective – look no further than storytelling itself. Being intrinsic and relative to others, empathy suggests an emotional response and opens the door to compassion, what we sometimes call the Golden Rule. Compassion, for Martha Nussbaum, is that means of “[hooking] our imaginations to the good of others… an invaluable way of extending our ethical awareness” (pp. 13-14). Of course, the better the storytelling, the sharper the hook, and the more we can relate; with more to go on, our capacity for empathy, i.e. our compassion, rises. Does that mean we actually will care more? Who knows! But I think the more we care about others, the more we tend to agree with them about life and living. If all this is so, broadly speaking, if our measure for right derives from empathy, then perhaps one measure for what is right is compassion.
And if we don’t care, or care less? After all, empathy’s no guarantee. We might just as reasonably expect to face from other people continued self-interest, deriving from “the more intense and ambivalent emotions of… personal life” (p. 14). Emotions have “history,” Nussbaum decides (p. 175), which we remember in our day-to-day encounters. They are, in general, multifaceted, neither a “special saintly distillation” of positive nor some “dark and selfish” litany of negative, to use the words of Robert Solomon (p. 4). In fact, Solomon claims that we’re not naturally selfish to begin with, and although I disagree with that, on its face, I might accept it with qualification: our relationships can supersede our selfishness when we decide to prioritise them. So if we accept that right and wrong are sensed not just individually but collectively, we might even anticipate where one could compel another to agree. Alongside compassion, then, to help measure right, perhaps coercion can help us to measure wrong: yes, we may care about other people, but if we care for some reason, maybe that’s why we agree with them, or assist them, or whatever. Yet maybe we’re just out to gain for ourselves. Whatever our motive, we treat other people accordingly, and it all gets variously deemed “right” or “wrong.”
I’m not suggesting morality is limited solely to the workings of compassion and coercion, but since I limited this discussion to right and wrong, I hope it’s helping illuminate why I had students begin first with what is selfish and selfless. That matters get “variously deemed,” as I’ve just put it, suggests that people seldom see any-and-all things so morally black and white as to conclude, “That is definitely wrong, and this is obviously right.” Sometimes, of course, but not all people always for all things. That everybody has an opinion – mine being mine, yours being yours, as the case may be – is still neither here nor there to the fact that every body has an opinion. On some things, we’ll agree while, on some things, we won’t.
At issue is the degree that I’m (un)able to make personal decisions about right and wrong, the degree that I might feel conspicuous, perhaps uneasy, even cornered or fearful – and wrong – as compared to feeling assured, supported, or proud, even sanctimonious – and right. Standing alone from the crowd can be, well… lonely. What’s more, having some innate sense of right and wrong doesn’t necessarily help me act, not if I feel alone, particularly not if I feel exposed. At that point, whether from peer pressure or social custom peering over my shoulder, the moral question about right and wrong can lapse into an ethical dilemma, the moral spectacle of my right confronted by some other right: would I steal a loaf of bread to feed my starving family? For me, morality is mediated (although not necessarily defined, as Hamlet suggests) by where one stands at that moment, by perspective, in which I include experience, education, relationships, and whatever values and beliefs one brings to the decisive moment. I’m implying what amounts to conscience as a personal measure for morality, but there’s that one more consideration that keeps intervening: community. Other people. Besides selfish me, everybody else. Selfless not me.
Since we stand so often as members of communities, we inevitably derive some values and beliefs from those pre-eminent opinions and long-standing traditions that comprise them. Yet I hardly mean to suggest that a shared culture of community is uniform – again, few matters are so black or white. Despite all that might be commonly held, individual beliefs comprising shared culture, if anything, are likely heterogeneous: it’s the proverbial family dinner table on election night. Even “shared” doesn’t rule out some differentiation. Conceivably, there could be as many opinions as people possessing them. What we understand as conscience, then, isn’t limited to what “I believe” because it still may not be so easy to disregard how-many-other opinions and traditions. Hence the need for discussion – to listen, and think – for mutual understanding, in order to determine right from wrong. Morality, in that sense, is concerted self-awareness plus empathy, the realised outcome of combined inner and outer influences, as we actively and intuitively adopt measures that compare how much we care about the things we face every day.
Say we encounter someone enduring loss or pain. We still might conceivably halt our sympathies before falling too deeply into them: Don’t get too involved, you might tell yourself, you’ve got plenty of your own to deal with. Maybe cold reason deserves a reputation for callusing our decision-making, but evidently, empathy does not preclude our capacity to reason with self. On the other hand, as inconsistent as it might seem, one could not function or decide much of anything, individually, without empathy because, without it, we would have no measure. As we seem able to reason past our own feelings, we also wrestle with echoing pangs of conscience that tug from the other side, which sometimes we call compassion or, other times, a guilt trip. Whatever we call it, clearly we hardly live like hermits, devoid of human contact and its resultant emotions. Right and wrong, in that respect, are as much socially as individually determined.
One more example… there’s this argument that we’re desensitized by movies, video games, the TV news cycle, and so forth. For how-many-people, news coverage of a war-torn city warrants hardly more than the glance at the weather report that follows. In fact, for how-many-people, the weather matters more. Does this detachment arise from watching things once-removed, two-dimensionally, on a viewscreen? Surely, attitudes would be different if, instead of rain, it were shells and bombs falling on our heads from above. Is it any surprise, then, that as easily as we’re shocked or distressed by the immediacy of witnessing a car accident on the way to our favourite restaurant, fifteen minutes later we might conceivably feel more annoyed that there’s no parking? Or that, fifteen minutes later again, engrossed by a menu of appetizers and entrees and desserts, we’re exasperated because they’re out of fresh calamari? Are right and wrong more individually than socially determined? Have we just become adept at prioritising them, even diverting them, by whatever is immediately critical to individual well-being? That victim of the car accident isn’t nearly as worried about missing their dinner reservation.
Somewhat aside from all this, but not really… I partially accept the idea that we can’t control what happens, we can only control our response. By “partially” I mean that, given time, yes, we learn to reflect, plan, act, and keep calm and carry on, like the greatest of t-shirts. After a while, we grow more accustomed to challenges and learn to cope. But sometimes what we encounter is so sudden, or unexpected, or shocking that we can’t contain a visceral response, no matter how accustomed or disciplined we may be. However, there is a way to take Hamlet’s remark about “thinking” that upends this entire meditation, as if to say our reaction was predisposed, even premeditated, like having a crystal ball that foresees the upcoming shock. Then we could prepare ourselves, rationalise, and control not what happens but our response to it while simply awaiting the playing-out of events.
Is Solomon wise to claim that we aren’t essentially or naturally selfish? Maybe he just travelled in kinder, gentler circles – certainly, he was greatly admired. Alas, though, poor Hamlet… troubled by jealousy, troubled by conscience, troubled by ignorance or by knowledge, troubled by anger and death. Troubled by love and honesty, troubled by trust. Troubled by religion, philosophy, troubled by existence itself. Is there a more selfish character in literature? He’s definitely more selfish than me! Or maybe… maybe Hamlet’s right, after all, and it really is all just how you look at things: good or bad, it’s really just a state of mind. For my part, I just can’t shake the sense that Solomon’s wrong about our innate selfishness, and for that, I guess I’m my own best example. So, for being unable to accept his claim, well, I guess that one’s on me.
Like a vast sea of experience is all that we know and learn and encounter every single day. We are but tiny ships bobbing and rolling upon its waves, its currents steering us here and there. How on earth do we discern and decide what we value, what we believe, in order to collaborate with others in meaningful curricular relationships? (I almost wish I could just be waylaid by pirates, or something.) For me, one way to decide is to consider our shared motives, and find incentives to collaborate from there. Notwithstanding the degree to which people are educated, or by whom, everybody has motives.
But we do not all necessarily have a particular destination or a future port-of-call. So the aim for curriculum appears to be that of shaping motives to coincide with the current state of affairs such that, in a broad sense, people can (a) function – a measure of the self-ful – and then (b) contribute – a measure of the selfless. Upon this vast sea, we are not so much bound for any one destination as we are bound to assist each other, each underway to wherever best suits our particular circumstances at that time – yours for you, and mine for me – and let the tangents direct us as they will.
Education, I have come to learn, is learning to have more than a destination or purpose of my own. It is to convoy with others and have faith that they do the same for others and for me, and putting in to decidedly worthwhile ports-of-call on the way. On the way, we chart our courses, but as similar as the ocean might look any given moment, wave after rolling wave, no two moments are ever exactly alike. To that degree, everyone must chart on their own. How intentionally we aid each other, how much or how little we trust, how sincerely we navigate, it is our shared curricula that will determine how effectively we undertake any particular decision we are ever likely to face, alongside whomever we find ourselves. The more we convoy in earnest, the safer we will be. With that kind of support, what is it that would sink us?
One final cautionary note: if and when some finally do make landfall somewhere, with certainty to their decision, we must acknowledge that their perspective will shift dramatically from those others who remain, however more or less certain to remain, at sea. Not everyone wants to remain at sea, and such variances our curricula are obliged to accommodate, if not fully comprehend or appreciate. There on that solid shore might be a tighter homogeneous culture that yields a more one-sided – or dogmatic? prejudiced? – communal certainty all its own. On that shore we might find a trade-off that sets the communal trustworthiness of the bobbing convoy against the stable individual footing of landfall. Yet somehow we all must sustain what we share, no matter the differences that may arise between sailor and landlubber – and why?
Because what remains the same amongst us – indeed, that which makes us who and what we are – is what we have in common. Common to all of us is being alive, being a person, being a human being, someone deserving of a basic respect for human dignity. Each of us, all of us, every one of us. We are all people. In this regard, really all that differs between us is where we are, and when. For people to think in any way differently than this about other people is narrow, delusional, perhaps cruel, and flat-out wrong. That may hardly feel a satisfactory closing, maybe even anti-climactic, but who ever said learning was meant to be entertainment? Learning’s the thing wherein we catch the conscience of each other.
 Forgive the invention, “self-ful.” I hesitated to use “selfish,” which tends to connote self-seeking and self-aggrandizing behaviour (in that colloquial sense of “No, you can’t have any of my ice cream”), and taking inspiration from the Bard, I just made up a word of my own. Likewise, I do not use “selfless” in some altruistic way so much as simply to counter “self-ful”; as a pair, I intend them to signify simply the notion of there being, for each of us, an intrinsic “me” and plenty of extrinsic “not me’s.” Further, with my students, I would liken self-fulness to each one’s academic efforts and scholarship, and selflessness to voluntary service and community stewardship of whatever kind. The longer-term idea was teaching students to balance these as required by kairos, by circumstance – an appropriate time for each, and the wisdom to know the difference.
 Or maybe, just maybe, there’s a curricular role for those gnarly amphibious surfers, after all.
For all this, what exactly does it mean to be educated? From the sole perspective – yours, mine, anybody’s – free thinking means freedom granted to individuals to believe and behave as they do, then investing proportionate faith that they continue to believe and behave as we do. Of course, anyone’s beliefs might vary, freely, from ours, as compared to everyone conforming to the same beliefs and behaviours. Imagine that world, where every inhabitant lived according to self-established morality. In such a world, how would there come about any rule of law? Even real, lived experience here in Canada is tenuous, relying on everyone to rely on everyone else. Whether out of respect for each other, out of gaining some advantage, out of fear of paying a fine or going to jail – on it goes, accountability, but the individual freedom we avouch is as ready to dissipate as the smoke of a powderkeg. For all its enlightenment, free thinking is quicksand: shifting, uncertain, deceiving, solid ground by mere appearance. Is it any wonder that the liberty and reason of Enlightenment individuation has led us to Postmodern relativism, identity politics, and alternative facts? Be careful what you wish for. If there are any true binaries, to trust or not to trust must certainly be one. What need for faith when we trust that we are all alike, that all around is 100% certain?
Such a world is hardly plausible for me. I have learned not to trust everybody I meet. In the world I know, we need discernment and persuasive rhetorical skill to skirt potential conflicts and get others onside. And when others have discernment and persuasive rhetorical skill, too? Seen in that light, the curricular task is competitive, not cooperative. Even so, we might still argue that curriculum is collaborative, and it does not have to be belligerent. Curriculum falls within the scope of some given morality, morality being a question of right and wrong, positive opposing negative: to x, or not to x. However, curriculum itself is an ethical choice between alternatives and is, thereby, an empowering decision. We must therefore ask to x, or to y, which are positives, a question of competing rights, and not right competing against wrong.
And anywhere right does oppose wrong, curriculum should not permit a choice because wrong is simply wrong and not something that responsible choice can decide. Beyond simply learning about the freedom to think, curriculum is about learning how to make choices that are set within the scope of defined morality. Question the morality, compare it to another morality, and we are Hamlet: we are lost. But decide, and accept the morality, and question only those choices intrinsic to its milieu… now we are educating ourselves and others, however precisely or narrowly, for as long as we care to pursue whatever makes us curious.
For me, someone is educated who thinks, and discerns, and has aims. Admittedly, such aims could be countered or rationalised pragmatically or else, more perversely, aimed beyond oneself to harm others – thinking in itself, after all, is not inherently moral. So if morality is a thing to be taught and also learned, then an educated person, for me, is someone who learns generosity of some kind, hospitality. Being educated means learning to give of oneself, for others or on behalf of others, in positive, constructive ways. This belief, I suppose, reflects my learned morality, which I am pleased, as much out of caring as utility, to pass along. Perhaps your morality differs. To that end, education, in itself, should intentionally be both constructive and benevolent in consideration of that sense of kairos, what is appropriate in the moment for teacher and learner, even as those moments accumulate over the passage of chronos-time, like endless waves upon the shore. Then again, who am I to anybody that the sole importance of my opinion should determine an education? If I am outnumbered, what is this sense of education that I describe but some solitary means of facing an existence nasty, brutish, and short? This thing called school will be the death of me!
See? Recruiting Hamlet’s cycle of misery seems all too easy “‘where the postmodern turn of mind appears to privilege the particular over the general’” (Roberts, 2003, p. 458). Frankly, I think our present culture regards the individual far too much. Naturally, the consequent short-changing of the bigger community picture has been playing out over chronos-time since, with every decision, there has been consequence. However, Roberts continues, “… ‘for Freire both [the particular and the general] depend on each other for their intelligibility’.” So perhaps a good education – by which I mean not just a moral one but an effectual one – is best measured with due consideration for its balance of the particular and the general, the heterogeneous and the homogeneous, the certainty and the ambiguity, the inductive and the deductive. A little healthy scepticism, a little cloud for the silver lining. A little dram in the substance, to paraphrase Hamlet. “A little dab’ll do ya,” quips McMurphy. You can’t have one without the other, sings the primus inter pares.
We defy augury by flouting convention, even law, because we are free agents who do what we please. Some will have more courage than others, and some are just more foolhardy, but no one is literally predictable. We defy augury by being unpredictable, even inscrutable, although maybe the rest of you just never really knew me that well to begin with. Sometimes I even surprise myself. We defy augury by defying our senses, by not comprehending the world that we apprehend, which really is to say we see only what we want to see and recognise only what we already know. If there is special providence in the fall of a sparrow, what matter when we have spent all our time watching the chickadees? I cannot shake free from critiquing our cultural veneration of the individual: the less our shared beliefs converge and reciprocate a healthy community, the greater our insistence upon personal liberty to go our own way, then all the more do we miss the point of understanding exactly what freedom really is. True freedom results from having choices, and what creates choice is not the persuasive liberty of unequivocal individualism but discipline: to do ‘x’, or ‘y’, or ‘z’.
Shakespeare’s “Let…” statements are not so colloquial as to suggest the fatalism of c’est la vie, or the aimlessness of go with the flow – these, for me, amount to giving up, or else giving in. The tragedy of Hamlet is that the curriculum he really needed – the people he could trust, who would be willing to help him – they were already there, at his side the whole time, as ready and willing as ever, so long as he gave a little back, so long as he offered just a dram of willingness to coincide with their beliefs – to his own scandal, maybe, but who in the real world is so selfish as to expect to have their cake and eat it, too? As compared to going it alone, Hamlet might have humbled himself and cast his lot with those to whom he was closest. His education from Wittenberg proved sufficient to challenge his upbringing in Elsinore, amply suggested by his continued trust to enlist and confide in Horatio throughout the play; as far as that went, the rest of us would do well to heed his lesson with due respect: if only Hamlet had not divided his loyalty but decided, once and finally, exactly who he was and whom he trusted, then lived up to his declaration with discipline. With integrity.
The most common criticism aimed his way by my students was essentially, “Get over yourself, and grow up!” Make a decision with the discipline to accept the consequences, which is to say, accept your personal responsibility. To be fair, Hamlet finally, triumphantly, does place his faith in Horatio, whom he entrusts to tell his story. Granted, he only asks once he is terminally poisoned but hey, better to ask while alive to breathe the words than come back and haunt Horatio as the next in a line of Ghosts. As for Shakespeare, whatever exactly it was that he saw in us, this ethical curricular dilemma, evidently he felt its redemptive quality was worth its cost, as Horatio makes known – or will do – for pledging to tell his dying friend’s tale to Fortinbras. Shakespeare’s appeal by way of Hamlet is not one of giving up or giving in. It is one of giving over, to something bigger than ourselves, to something in which faith placed is faith assured, and “attuned” (Pinar, 2017b, p. 1), and certain beyond our own devices.
What that object of faith might be… perhaps it comes as no surprise, but Shakespeare has a “Let…” statement for that, too: “… let your own discretion be your tutor” (3.2.17). I never included this one in the list for my students because, until writing this essay, I had never fit it in as such a central constituent. Hamlet delivers the line, as any nervous director might do opening night, during the aforementioned lecture to the Players before the Mousetrap performance. All the more ironic, of course, is that his lecture hardly exemplifies the statement, which would be fine if Hamlet, the director, did not assume the stage during the performance but let the actors get on with their craft. Hamlet, by contrast, twice assumes the stage to augment the performance. (Ahh, what to do about such insecurity! At least he sells tickets, you may remember.) Anxious or not, the wisdom of his advisement, taken for all, is easy for a lay audience to misinterpret, particularly as it comes buried within lines of such mundane theatrical detail. Shakespeare does not suggest that we give in to our discretion, carte blanche. He suggests that we give over to our discretion as a kind of teacher-student relationship.
Let curriculum be to trust your own better judgment, to search your feelings, yet to grant with humility that more may exist than meets the eye. Let discretion be a “tutor,” yet while you let it, also think before you act – and think during and after, too – because “… the purpose of playing… was and is, to hold… the mirror up to nature” (3.2.17-23). Whether this amounts to something esoteric or spiritual is down to the beholder, yet if that is true for any one of us, it must be true for all of us. Each one of us is finite and individual, and curriculum is composite, a whole greater than the sum of its parts, as in all of us, transcending time and space. As a force of faith, curriculum is vast indeed.
Read the closing reflection to “A Kind of Certainty”: Pt V. Fleeting Uncertainty
 How often I referred students to Canadian Liberal MP Stephen Owen’s definition for democracy: “the pluralistic respect for citizens empowered to self-govern within the rule of law.” Democracy, so often simplified as “majority rule,” is more accurately understood (in my opinion) as entirely dependent upon its constituents. Democracy works because we all agree to make it work. Every member therefore has a personal responsibility to respect and live up to the standard of the law on behalf of every other member. One disobedient person weakens the system and places everybody, including themselves, at risk. Either we set that person straight, or we jail them, but unless we protect the system, we are only certain to lose it.
 *Sigh… culture precedes law, I would argue, and we endlessly debate and litigate what should be right versus what should be wrong. This is politics and the justice system at work, issue by issue, and with enough lobbying and/or civil disobedience, any given topic might be up for consideration.
 Okay, so I did find a way to toss in some surf.
 aka the Chairman of the Board, aka Ol’ Blue Eyes
 In Canada, we might say that Shakespeare’s appeal to “let go” means don’t grip the stick too tight. “Hold on loosely,” as Donnie Van Zant would sing, or “Give a little bit,” from Roger Hodgson. Neither fully clarifies the expression, as I gather Shakespeare intended it, but the notion of giving way in deference to others is helpful, for a start.
 Of course, the best rejoinder here would be, “He who dies with the most toys wins,” to which I would reply, “You can’t take it with you.” But dialectical bumper-stickers were never my strong suit, and I digress, even for end-notes.
On second thought, the best rejoinder is to say Hamlet is fictional, not of the real world. All the more reason to admire him as perhaps Shakespeare’s best creative feat, so life-like are he and the rest of the characters who populate the play.
 Between Ophelia and Horatio, he nearly does so twice, and even towards Gertrude he aims some meager hope and sympathy. Alas, yet another essay…
 Shakespeare includes numerous allusions throughout the play to the theatre milieu, its characters and culture, and its place in Elizabethan society, many of which can be construed as humorous and even as insider jokes shared amongst his theatre company and his regular audience.
 I learned, for my own spiritual belief, to distinguish between what many religions have people do, as compared to what God through Christ has already done. The primary reference, here, is to the Resurrection and what Christ has done for all. Whether one chooses to believe or not is up to them, and should be, which is the essence of my belief: what comes down to a matter of personal choice is to believe, or not to believe. Consider Ephesians 2:8-9, for example, in which Paul explains that we are saved not by works but by grace, so that none can boast: justification by grace through faith in God is the essence of Christianity, and I emphasise that part of it left up to us, to have faith in God. Some consider this ridiculous, and that is neither here nor there to me although I wish no ill upon anyone. Upon believing, upon faith, one can grasp how a selfless attitude of giving – giving of oneself – matters as compared to more selfish concerns over what is given or how much is given.
Such concerns do arise since, as I believe, all inherit Original Sin, a concept that one must accept before anything else in Christian doctrine of any stripe will make sense: we all have inherited an imperfection to believe and have faith in our selves, apart from the God who created us; to go our own way; to obey our own inclinations and not His. This pride-of-self, set in motion by the conniving serpent’s lure that whetted Eve’s curiosity, then Adam’s, enough for them to disobey one simple command… this original “missing of the mark” prompted Adam, Eve, and all their offspring to realise within themselves what had never before even appeared on their radar screens: that obedience was only appreciable once disobedience had been tried. It’s the same binary idea as saying, “You only really understand peace once you experience war,” and so forth. So, for instance, in offering to God (Genesis 4:3-4), where Cain brings some, Abel brings the choicest; yes, each still gives, yet Cain is furious upon seeing the difference in God’s response between their offerings. The sense is that Abel gives in faithful obedience what Cain withholds for himself, Abel trusting God, in a way that Cain does not, that God will give back and look after him. Cain trusts in what he can manage and control for himself; evidently, he does not trust like his brother that God will give back. Perhaps he does not even believe that God created them although, if he does believe this, how much worse his distrust.
Avenging his own honour by killing his brother is a choice Cain makes, entirely selfish and sinfully predictable. This, for me, begins to explain why God allows evil to prosper: He gave us free will, in His image, out of love, to choose or to not choose His gift of salvation; to believe or not to believe in His Gospel, as a matter of faith; to trust Him or to trust something else. In any case, we, the people, are answerable for all we do. As I say, back then, Cain perhaps did or didn’t know he was God’s creation – he is left to his own account for that. These days, though, how many people hardly even consider God as real, much less as Creator or Benefactor? However, if God offered us no doubt of His existence, then what would necessitate faith? Were He to provide 100% certainty, anyone then would have no choice but to believe, of necessity, or else be a fool not to believe and delude themselves in spite of the certainty. As it is, some think believers are deluded; truly, you can’t convince all the people all the time, and you definitely should not force belief. All this, for me, is consistent with a caring God who has conferred free will. So, where some condemn believers as guilty of the crimes and evils committed in the name of Christianity (or religions altogether), in fact, I fully agree: hateful beliefs and violent acts are an abomination of how God would have us treat each other.
But, again, He has bestowed upon us the free will to decide and behave, and I argue that all such crimes and evils, whether in the name of religions or not, reflect Original Sin, our turning-away from God; they do not reflect God. They cannot reflect the character of God, whose nature is neither criminal nor evil; rather, they reflect the character of our selves, who are selfishly proud. People are responsible for bastardising and usurping doctrine in order to gain for themselves, something akin to Cain, so blatantly, transparently selfish. Further, as that kind of belief and behaviour continues, it roots until generations have perhaps forgotten or lost any other way to believe and behave. We are human, taken for all, and finite in power and awareness. We can do no other than we continue to prove ourselves capable of doing – and in this I include both good and evil that we do – and this, truly, is why we’re in need of salvation. So much gets lost in scriptural debate over details – details that warrant discussion yet, being details, they are also prone to misinterpretation and thereby require careful, long-studied contextual understanding – but the basic doctrine and the loving character of God I find rather straightforward. It’s people who complicate and screw it up, not God. And I’m as guilty, neither better nor worse but just plain equal to every other person trying to live under our circumstances. So I try my best to respect people’s dignity, everyone’s.
My choice has been to believe based on the preponderance of evidence that I’ve learned and studied for many years – the careful, long-studied contextual understanding I mention above. I have plenty more to learn, but my point is that I did have to learn, to begin with. I did not just suddenly have some nuanced, supreme understanding of Christian doctrine – indeed, I’m wary that superficial knowledge is so frequently the cause of the crimes and evils people commit in the name of religion. I consider myself blessed to have had the freedom to choose what to study without duress and to have had an education provided by good teachers who understood what makes for a good curriculum. I have never felt assaulted or oppressed as far as my education is concerned – or my life, for that matter – and, furthermore, I achingly, mournfully recognise that so, so many others cannot say the same. Why not me, I can’t say, but I count myself as blessed for this, if for no other reason in my existence. I know so well that not everyone has enjoyed such Providence.
There is so much abuse and violence out there, person-upon-person, and I suggest that I, or you or anyone, ought to be enabled to read, search, and decide for ourselves whether or not to believe something. And never forced, and never judged. Personally, I’m not a big church-goer – I have done, but I don’t much anymore. But I still quietly personally maintain my faith. Even offering this endnote struck me as bold, but I wanted this post to be thorough and honest. I believe evidence exists – we have only to look for it: “Knock, and the door shall be opened” is God’s encouragement, to be proactive and search for Him rather than sitting idly by awaiting, or else ignoring, His imminent return. Nonsense, this, for some. And I can comprehend the doubt. But I don’t share it. By the same token, I offer my testimony, but I don’t impose it. People today who demand to see evidence – God performing miracles, say – are asking Him to lay foundations all over again. But, by analogy, a building only needs one foundation, so why would God repeat that process? Enough evidence has been documented over time, for me, that I now readily believe and join the church being built on the existing foundation. Again, as I opened this rather long endnote, what matters most is what He has already done: we have only to believe, with no further need to see more miracles, which is really what having faith is all about.
A year on, and this one, sadly, only seems more relevant…
[Originally published June 16, 2017]
I had brilliant students, can’t say enough about them, won’t stop trying. I happened to be in touch with one alumna – as sharp a thinker as I’ve ever met, and a beautiful writer – in the wake of the 2016 U.S. election campaign and wrote the following piece in response to a question she posed:
How do you teach open-mindedly in the post-truth era?
I was pleased that she asked, doubly so at having a challenging question to consider. And I thoroughly enjoyed the chance to compose a thoughtful reply.
I’ve revised things a little, for a broader audience, but the substance remains unchanged.
How do you teach open-mindedly in the post-truth era?
Good heavens. Hmm… with respect for people’s dignity, is my most immediate response. But such a question.
Ultimately, it takes two because any kind of teaching is a relationship – better still, a rapport, listening and speaking in turn, and willingly. Listening, not just hearing. But if listening (and speaking) is interpreting, then bias is inescapable, and there need to be continual back-and-forth efforts to clarify, motivated by incentives to want to understand: that means mutual trust and respect, and both sides openly committed. So one question I’d pose back to this question pertains to the motives and incentives for teaching (or learning) ‘X’ in the first place. Maybe this question needs a scenario, to really illustrate details, but trust and respect seem generally clear enough.
Without trust and respect, Side ‘A’ is left to say, “Well, maybe some day they’ll come around to our way of thinking” (… that being a kind portrayal) and simply walks away. This, I think, is closed-minded to the degree that ‘A’ hasn’t sought to reach a thorough understanding (although maybe ‘A’ has). Whatever the case, it’s not necessarily mean-spirited that someone might say this. With the best intentions, ‘A’ might conclude that ‘B’ is just not ready for the “truth.” More broadly, I’d consider ‘A’s attitude more akin to quitting than teaching, which is to say a total failure to “teach”, as far as I define it from your question. It would differ somewhat if ‘A’ were the learner saying this vs being the teacher. In that case, we might conclude that the learner lacked motivation or confidence, for some reason, or perhaps felt alone or unsupported, but again… scenarios.
Another thing to say is, “Well, you just can’t argue with stupid,” as in we can’t even agree on facts, but saying this is certainly passing judgment on ol’ stupid over there, and perhaps also less than open-minded. To be clear… personally, I’d never say bias precludes truth, only that we’ll never escape our biases. The real trouble is having bias at all, which I think is what necessitates trust and respect because the less we have of these, the more turmoil we invite. I figure any person’s incentive to listen arises from whatever they think will be to their own benefit for having listened. But “benefit” you could define to infinity, and that’s where the post-truth bit is really the troublesome bit because all you have is to trust the other person’s interpretation, and they yours, or else not.
Yeah, I see “post-truth” as “anti-trust,” and that’s a powderkeg, the most ominous outcome to have arisen of late. People need incentives to listen, but if treating them with dignity and respect isn’t reaching them, then a positive relationship with me wasn’t likely what they wanted to begin with. That’s telling of the one side, if not both sides. At the same time, it’s harder to say in my experience that students have no incentives to listen or that, on account of some broader post-truth culture, they don’t trust teachers – that might be changing, who knows, but I hope not.
But I’m leaving some of your question behind, and I don’t want to lose sight of where it’s directed more towards the person doing the teaching (you asked, how do you teach open-mindedly…).
That part of the question was also in my immediate reaction: respect people’s dignity. For me, when I’m teaching, if I’m to have any hope of being open-minded, I intentionally need to respect the other person’s dignity. I need to be more self-aware, on a sliding scale, as to how open- or closed-minded I’m being just now, on this-or-that issue. So even while that’s empathy, it’s also self-aware, and it’s intentional. It’s not “me” and “the other.” It’s “us.”
Me being me, I’d still be the realist and say you just can never really know what that other person’s motive truly is – whether it’s a pre-truth or post-truth world doesn’t matter. But whether or not you trust the other, or they you, the real valuable skill is being able to discern flaws of reason, which is what I always said about you – you’ve always been one to see through the bullshit and get to the core of something. I’m no guru or icon, I’m just me, but as I see it just now, the zeitgeist is an emotional one more than a rational one. And there’s plenty to understand why that might be the case. And given that emotional dominance, I do think post-truth makes the world potentially far more dangerous, as a result.
Whatever incentives people are identifying for themselves, these days, are pretty distinct, and that’s a hard one for unity. That saying about partisan politics – “We want the same things; we just differ on how to get there” – that doesn’t apply as widely right now. So, by virtue of the other side being “the other” side, neither side’s even able to be open-minded beyond themselves because trust and respect are encased in the echo chambers. More than I’ve ever known, things have become distinctly divisive – partisan politics, I mean – and I wonder how much more deeply those divisions have room to cut. Selfish incentives cut the deepest. Trust and respect guard us from deep cuts.
So, for instance, lately I find with my Dad that I listen and may not always agree, but where I don’t always agree, he’s still my Dad, and I find myself considering what he says based on his longevity – he’s seen the historic cycle, lived through history repeating itself. And I obviously trust and respect my Dad, figuring, on certain issues, that he must know more than me. On other issues, he claims to know more. On others still, I presume he does. Based on trust and respect, I give him the benefit of the doubt, through and through. One of us has to give, when we disagree, or else we’d just continually argue over every disagreement. If you want peace, someone has to give, right? Better that both share it, but eventually one must acquiesce to their “doubt” and make their “benefit” finite, stop the cutting, compromise themselves, if they’re to see an end to the debate. So should I trust my Dad? I respect him because he’s given me plenty of good reason after such a long time. Certainly I’m familiar with his bias, grown accustomed to it – how many times over my life have I simply taken his bias for granted? Too bad the rest of the world doesn’t get along as well as my Dad and I do.
I see it even more clearly with my daughter, now, who trusts me on account of (i) her vulnerability yet (ii) my love. The more she lives and learns alongside me, as time passes by, the more cyclically her outlook is reiterated, a bit like a self-fulfilling prophecy. Other parents have warned me that the day’s coming when she’ll become the cynical teenager, and I’m sure it will come – I remember going through it, myself. But I’m older, now, and back to respecting my Dad, so at least for some relationships, the benefit of the doubt returns. My Dad preceded me, kept different circles than me, and lived through two or three very different generations than me. Even as we see the same world, we kind of don’t. So this is what I wonder about that deep cut of division, reaching the level of family – and, further than one given family, right across the entire population. Do I fact-check my Dad, or myself, or maybe both? Should I? Even if I do, neither one of us is infallible, and we’re only as trustworthy as our fact-checking proficiency.
Anyway, the child of the parent, it’s as good an example as I can think of for questioning what it means to learn with an open mind because there’s no such thing as “unbiased.” Yet love, trust, and respect are hardly what we’d call “closed-minded,” except that they are, just in a positive way. Love, trust, and respect leave no room for scepticism, wariness, and such traits as we consider acceptable in healthy proportions (for reasons about motive that I explained above).
But “teaching” with an open mind takes on so much more baggage, I think, because the teacher occupies the de facto as well as the de jure seat-of-power, at least early on – school is not a democracy (although that now seems to be changing, too). Yet teachers are no more or less trustworthy on the face of it than any other person. That’s probably most of all why I reduce my response to respecting human dignity because where it’s closed-minded, for all its “positive,” it’s also a do-no-harm approach.
That jibes with everything I’ve learned about good teaching, as in good teaching ultimately reduces to strong, healthy relationships. Short-term fear vs long-term respect – it’s obvious which has more lasting positive influence. And since influencing others with our bias is inevitable, we ought to take responsibility for pursuing constructive outcomes, or else it’s all just so much gambling. At the core, something has to matter to everybody, or we’re done.
Outside of corruption (throwing the game), which has no place in this discussion, I submit that nobody deliberately plays to lose.
Specifically, I’m talking about football, less commonly known as soccer, and perhaps this discussion even applies to many different sports. But, as a player and coach, football is the beautiful game that I know best, so here goes.
Playing football, we would anticipate the team that makes the fewest mistakes ought to win – as in, the fewest mistakes both in and out of possession, from the kick-off until full-time. If so, then consistent quality performances are key because these should result in more opportunities to earn a win and prevent a loss. What’s more, as the reward for winning grows more lucrative, and the stakes are raised, players must all the more learn to develop that “consistent quality performance” on demand, under whatever pressure: effective decisions, executed at the proper moments, skilfully, every time, or at least as frequently as possible. Developing this “quality performance” consistency also demands that opponents earn their victories rather than being handed the result, unimpeded, because now they’re challenged to execute just as consistently, if not just as flawlessly. As I say, no one competes to lose.
So, what of development and winning in light of all this? Too often, for me, these two ideas are falsely conflated into sides of what is truly a non-existent – or, at least, a very ill-conceived – debate. As ends-in-themselves, development and winning are typically deemed incompatible. Further, winning is then often vilified since winners produce losers while development is commended for being inclusive. At that point, I find the debate often sidetracks into competition versus fun, another false dichotomy, but in any case, the parameters are so muddled as to render it all a meaningless waste of breath. For the sake of dispensing with the issue, I simply ask: why would we not reasonably expect to see fun in conjunction with competition? These are not oil and water, nor do they need to be, nor should they be deemed to be.
Football, the Game, can be played for fun, exhilaration, fitness, camaraderie, focus, perseverance, discipline, teamwork, all manner of virtues and benefits, yet all these on account of the very nature of the Game as a contest of opposition. And where one person finds things fun and enjoyable, another does not necessarily agree, yet who’s to say who is correct, if the Game has enabled all? All sorts of people find all sorts of fun in all sorts of things – who’s to say that finding competition to be fun is wrong, if only because it makes you squeamish? Just the same, if someone’s threshold for intense competitive drive is lower than another’s, can each still not enjoy playing with like-minded peers? In fact, just for instance, this is exactly why various youth and adult leagues categorise levels of play into (for ease of this discussion) gold, silver, and bronze tiers. Everyone must learn to play, and development (to whatever degree) will occur as they go. That implicates teammates, the quality of coaching, and other factors relating to a team or league’s motives for playing in the first place (i.e. gold vs silver vs bronze). Motive, however, does not change the nature of the Game, itself, or the nature of effective learning, development, coaching, and teaching.
As I see it, the issue is not Development for its Own Sake versus Winning for its Own Sake or even …for its Own Sake versus Development in order to Win. The issue is Development and Learning as a concept, altogether, period, because how else could you learn to play? And the more you play, the more you develop. Whether that development is good or poor is down to context, and a separate issue.
And when the arguments start, what’s really being debated, it seems to me, is how any one person simply wants to be “right” and demand that everyone else agree with what constitutes “successful” participation in the Game. Ironically, it’s a territorial argument over ideology. But to win an egotistical war suggests to me that we might better spend our efforts re-evaluating our culture and how we wish to treat other people.
Fair enough, people want to be “right.” We all have egos. But can we at least offer some basis from which to claim what the word “successful” can mean? So here goes.
Since losing a match always remains a possibility, no matter how consistent our quality performance might be, we ought to measure “success” as the degree to which a player or team has developed that consistent quality of performance (process) over time, at their corresponding level and motive for play, regardless of winning (product).
**I’ll specify, as I did above, that where wins are lucrative – such as in professional play – the stakes grow higher, and different debates will ensue about what “success” means. Yet that’s a commercial issue, relating to development and learning on the basis of people’s patience and tolerance for financial pleasure or pain. In other words, the two issues are not inherently related but coincidental: a crowd of supporters or sponsors is willing to pay to back the team for a season.**
For the Game, itself, we must let winning take care of itself because players control what they are able to control, under conditions that also include the pitch, the ball, the referee, the weather, health, fitness, and so forth. So what can we measure? Measurements ought to fall under player and team control, e.g. shots at goal, completed passes, tackles won, saves made, etc. Far from counteracting the importance of winning, such consistent measurements of quality performance provide feedback, i.e. if our pass completion is 90% successful around the penalty box, then maybe we don’t score because our shooting is infrequent or inaccurate. One might even argue that the statistical measurements we gather are less important than the ones we’ve overlooked.
In any case, successful players and successful teams identify strong and weak areas by regularly measuring consistent quality across a range of performance details, and they develop each area for consistency – which we anticipate will translate into more wins – because consistent quality performances usually translate into what can be measured as an “ongoing success.” Success now defines a degree of purposeful, committed, consistent hard work, which makes for more focused, more effective training. Developmentally, the more successful you are, the more often you can theoretically win – but if your opponents also train and measure, and respond better than you do, then guess what? That’s called competition.
Development and winning not only can but already do co-exist. And they always have. It’s people who separate them, falsely, perhaps because they want to win more than they want to earn wins – or, worse, perhaps because they merely want to win a territorial argument about development vs winning that never existed before someone’s ego dreamt it up.
Beyond on-field training and competing, development and learning should cover a range of areas that affect yet lie beyond the Game, e.g. health, fitness, nutrition, goal setting, mental preparation, personal responsibility. Coaches ought to take players beyond the Game, teaching them how to train, how to contribute to a team, how to compete at higher levels of skill and intensity, how to manage the dynamics and emotions of competition, and how to conduct themselves with personal integrity in all respects. Of course, the Game is included within the scope of these matters because that’s why we’re a team in the first place. And the range of these inclusions will comprise a more holistic football program. We implement and evaluate that program as we go, or we ought to.
Effective programs inevitably reveal the crux of commitment, either thanks to people’s dedication or on account of their inconsistency. Effective programs encourage trust and a shared pursuit of common goals. Where trust and commitment are maintained consistently and respectfully, a team and its members learn to measure quality and respond consistently, i.e. successfully. Such programs require time, discipline, and patience to learn, but the degree to which participants buy into the philosophy is met with concomitant developmental consistency, and again, one can expect winning to result more often than not, relative to the quality of the opposition. Likewise, individual people can take credit for this-or-that achievement only relative to their teammates, who are also active participants in the program.
Active participation should find team members applying complementary strengths by filling key roles on the path to team success. Individual contributions accumulate, and if these have been consistently defined by common goals and measured for consistent quality, “success” is more likely because people can envision it more clearly and pursue it more meaningfully.
Opponents, especially of equal or slightly higher abilities, likewise play a key role in a team’s pursuit of success since measuring consistent quality performances against them is, in one sense, what the Game – and what sport – is all about. Active involvement in a program unites a team, preparing everyone for more advanced challenges. Occasionally, a teammate might advance to more elite programs, and when a team member grows beyond the scope of the program, that is a team success that all of us can share.
So, it’s interesting, listening to people talk these days, quite frankly, in terms of their words, their language, their speech. I have an issue with what everyone’s saying – not like everyone everyone but, you know, it’s just their actual words when they talk about complex issues and such, or like politics, what with the whole Trump thing, you know, that Russia probe and the Mueller investigation and everything that goes with that. I’m also a bit of a news hound, and that’s really where I started noticing this on-air style of speeching, of making it sound thoughtful and taking them seriously.
And it’s so much out there, like an epidemic or something, which is interesting, which speaks to on-line streaming and TV news, talk radio, and pretty much the whole 24-hour news cycle. I was a high school English teacher for sixteen years, and I also started noticing all this, you know, frankly, during class discussions, too. And there was me, like guilty as anyone.
Here’s the thing, though, because I guess substance will always be up for debate, but that’s just it – it’s so wide-ranging that it’s like people have no idea they’re even doing it, which is interesting. It’s almost like it’s the new normal, which really begs the question – are people getting dumber? Is education failing us? In terms of intelligent debate, that will always be something that probably might be true or false. And let’s have those conversations!
But in terms of intelligible debate, it’s interesting because, when I listen to how people are talking, it gets really interesting because when I listen what they actually say, it’s like they’re making it all up on the spot in the moment as they go, so it’s just that that makes me not as sure it’s intelligent as it’s less intelligible. But it’s all in a sober tone, and they’re just expressing their opinion, which is democracy.
And that’s the thing – if you challenge anybody with all what I’m saying, clarity-wise, it’s interesting, they’ll get all defensive and whatnot, like it’s a personal attack that you’re calling them stupid or whatever, like you’re some kind of Grammar Jedi.
And, I mean, I get that. So that’s where I think people don’t really get it because I totally get where they’re coming from.
Seriously, who would want to be called like not intelligent or anything all like that, whatever, especially if we’re trying to discuss serious world issues like the whole Russia thing that’s been happening or the environment or all the issues in China and the Middle East? Or terrorism and all? I mean, if you look at all that’s happening in the world right now, but you’re going to get that detailed of the way someone talks, maybe you should look in the mirror.
And I mean, SNL did the most amazingggggggggg job with all this, back in the day, with Cecily Strong on Weekend Update as The Girl You Wish You Hadn’t Started A Conversation With At A Party. Comedy-wise, she even like makes a point, but basically, she’s furthering on intelligence, except I’m talking about intelligibility. But still, if you haven’t seen it, what can I tell you? Your missing out, SO FUNNY. She. Is. Amazing.
And that’s the other thing, and this one’s especially interesting, is just how there’s just SO MUCH out there, what with Google and the Internet, and Wikipedia and all, so who could possibly be expected to know like every single detail about all the different political things or the economy and all the stuff that’s out there? And it’s even more with speaking because pretty much most people aren’t like writing a book or something. (W’ll, and that’s just it – nobody speaks the way they write, so… )
Anyway, so yeah, no, it’s interesting. At the end of the day, first and foremost, one of the most interesting things is that everybody deserves to have a say because that’s democracy. And I think that gets really interesting. But the world gets so serious, probs I just need to sit down. See the bright side, like jokey headlines from newsleader, Buzzfeed, or last year’s “Comey Bingo” from FiveThirtyEight. Gamify, people! News it up! Nothing but love for the national media outlet that helps gets you wasted. Or the one about the viral tweet, for an audience intimately familiar with pop culture? News should be taken seriously, and the world faces serious aspects, for sure. But the thing is, work hard but party harder! I mean, we’re only here for a good time, not a long time!
And it’s interesting ‘cuz people seem to require more frequent, more intense, more repeated engagement, to spice up their attention spans. There’s some good drinkinggames, too, on that, because politicians! I know, right? But not like drunk drunk, just like happy drunk, you know? Not sure if all this counts as it means we’re getting dumber, per se, but it’s just interesting.
So, yeah, it’s interesting because we’ve come such a long way, and history fought for our freedom and everything, so I just really think going forward we should just really appreciate that, and all, you know?
As far as I understand Jacques Derrida’s différance, he observes that we understand our experiences as distinctive, but not exhaustive, communicated links or marks comprising an on-going decisive chain of experiential moments. As to the language we use to describe our experiences, a word has contextual meaning, both from its usage at any given time as well as from its etymology over the course of time. I tend to agree with this attendance to context as furnishing meaning, and I can also spot the rabbit hole that it poses. For example, to understand some word’s definition, I might look it up in the dictionary and be left to rely upon the definition of whoever decided what it meant while, at the same time, facing all sorts of words in the definition that now need looking up, too – Sisyphean, indeed! Cruel but so usual. On the other hand, thanks to whoever compiled the dictionary, a pretty utile compendium, I have to say.
To be clear, I am not intending to invoke logocentrism, by which all our words are accorded a decided meaning from a cultural centre, which propagates existing biases or “privileges”; Derrida would roll over in his grave. Granted, I may already have laid grounds here to be accused of logocentrism, myself, by writing with words (and I confess to using English because I didn’t think anyone had the patience to muddle over Wingdings). My present aim is to suggest how we might address the aforementioned rabbit-hole dilemma by searching for or (… almost afraid to say it) by deciding upon some definitions of our own. Not like a dictionary, but more like – well yes, okay, like a dictionary, but one that we’ll fashion from the ground-up, like when the light bulb would go on above Darla’s head, and Spanky would snap his fingers to say, “Hey, everyone! Maybe we can put on a play!” So, in the spirit of dissemination, hey everybody, maybe we can compile a dictionary! A real, deconstructive, crowd-sourced dictionary!
I’m not really compiling a dictionary. I’m just trying to make some sense of Derrida and différance. Let me try to illustrate what I mean from my own experience. Sometimes I play Walking Football, a version of the game where players are not permitted to run. Naturally, the debate is over what differentiates walking from running. We’ve agreed that walking means “always having at least one foot in contact with the ground during the striding motion.” Running means “having both feet leave the ground at some point during the striding motion.” This makes for certainty, and so long as our eyes are trained enough to spot feet in motion, which I can spot sometimes so clearly, with such immediacy, that it’s more like I’m watching, not playing – I’m finding it tough even now to get the right words, but trust me. And so long as each player is willing to obey the rules – and, ohh my, there’s always that one player who just won’t. You know who I mean… *sigh… Anyway, so long as they’re not just words uttered that then float away in the breeze, our definitions of the rules for walking and running are useful.
Luckily, too, I might add, when we clarify the rules, we do so out loud, together, and don’t whisper them around in a circle, like when my daughter plays Telephone at a birthday party – after all, we want everyone to be clear. Finally, even if we have trouble spotting feet in motion, because it all happens too quickly, or even if that one player is a cheater at heart, the definitions themselves remain clear, and usually at least one or two of us can remember them well enough to recite back, as needed, usually with a lot of finger-pointing and furrowed brows. One time we even wrote the no-running rule on the gym chalkboard, and even though no one challenged this, on the grounds that writing is secondary to speech, everyone still understood why it was scrawled there, by which I mean everyone knew exactly who should read it the most – wow, does every game have that player? Incorrigible.
Bottom line: accountability is down to the sincerity and respect offered to each player by every other player who decides to participate. As an aside, the need for a referee, an arbiter, is all the more clear when the stakes are as high as bragging rights and free beer. But, even as we play for fun, the rules exist or else the game, as such, does not. (On that note, I find a lot of players just don’t like Walking Football and would rather play with running, and that’s fine, too: it’s their decision, and plenty of other like-minded players keep both games afloat. I find the Walking game amplifies decision-making, so maybe this feature just appeals to me. And I still play traditional football, too.) My broader point is that any one person must decide to accept what has been defined and, likewise, any group of people must reach a consensus. Shared meaning matters because, otherwise, as I say, we don’t have a game, or else we have a very different one, or we just have anarchy. But whether that person, alone, or the group, altogether, searching for a way to decide upon meaning, has the patience to delve down the rabbit hole… well, yes, context does indeed matter – both usage and etymology. I’ve said and written as much, myself, for a long time. So, in light of all this, I hope I’ve gathered a little something of Derrida’s différance. I’m still learning.
Another illustration: in my teaching, I occasionally introduced this matter of contextual meaning by offering students a list of synonyms: “slim,” “slender,” “skinny,” “thin,” “narrow.” Each word, of course, has its own particular meaning. “If they meant the same thing,” I offered, “then we’d use the same word,” so just what explains the need for all these synonyms? Well, students would say, there are lots of different things out there that possess or demonstrate these various adjectives (my word, not theirs), so we’ve come up with words to describe them (and I think that’s a charitable “we,” like the Royal “We.”) As the discussion proceeded, I might ask which of these words typically describe human traits versus those – leaving aside metaphors – that typically do not. Next, which words typically possess positive connotations, and which negative, or neutral? And, as it pertains to the personification metaphors, which words are more easily envisioned versus those that really stretch the imagination, or even credibility? Eventually, I would shift from ontology to epistemology, posing the questions at the heart of my intention: For any of the previous questions about these synonyms, how do you know what you’re talking about? For what each of these words could mean, where have your assurances come from? Of course, the most frequent reply to that question was “the dictionary,” followed by “my parents” or “books I’ve read,” or “just everyday experience, listening and talking to people.” Occasionally, the reply was something akin to “Who cares… it just means what it means, doesn’t it?” In every reply, though, one common thread was detectable: the involvement of other people as part of the meaning-making process. Fair enough, we can’t all be Thoreau.
One more example: when is “red” no longer red but perhaps orange or purple? Well, for one thing, if you’re colour blind, the question means something entirely different, which I say not flippantly but again to illustrate how important dialogue and community are to deciding what something means. For another thing, we might wish to ask, in keeping with context-dependency, “Why even ask?” Again, this is not flippant or dismissive but practical: when does it matter such that we distinctly need to identify the colour red? Where a group of people might face the question over what is red versus what is orange or purple, we might expect some kind of discussion to ensue. And, whether asking as part of such a group or as a hermit, alone, I submit that the decision about what is “red” is ultimately down to one person to determine: “Red is this,” or “This is red,” or even, “Gosh, I still can’t really decide.” Even a coerced decision we can still attribute to the one who forces the issue – one person has decided on behalf of another, however benignly or violently: might makes right, or red, as it were.
Coercion introduces a political consideration about whose authority or power has influence, similar to needing referees on account of those players who decide to run. The point, for now, is simply that a decision over what something means to a person is ultimately made by a person, leaving others to deal with that decision on their own terms in whatever way. But other people are part of the meaning-making process, even passively, or else I wouldn’t need language to begin with since the rest of you wouldn’t trouble me by existing. Not to worry, by the way, I appreciate you reading this far. From what I understand (and I am convinced I must learn more, being no avid student of either postmodernism or Derrida), his observation of différance either discounts or else offers no account of the arbitrary decisions people make when they decide they’ve had enough. People tend to land somewhere in a community, and it’s the rare person who lives and plays wholly and uncompromisingly by their own rules. However, the fact that he felt différance was worth the effort to publicise and explain to the rest of us does reflect an arbitrary decision on Derrida’s part and says something about him.
So this is where I have more fundamental trouble understanding Derrida and différance – the very notion of “different”: in what world could there not be an arbiter? Even a life alone would face endless decisions: what to eat, where to go, when to sleep, and so forth. From such musing – speaking of rabbit holes – I was led to reading about another philosopher named Jacques, this one Rancière, and what he calls the axiom of equality. In pure nutshell form, I take this to mean that no (socio-political) inequality exists until it has been claimed to exist – and note that it’s claimed in a boat-rocking kind of way, what the kids these days are calling “disruptive.” The upshot is that equality, itself, can only ever be theoretical because someone somewhere inevitably is and always will be marginalised by the arbitrary decisions of cultural hegemony. Still learning.
Back to the Walking Football analogy: if the rabbit hole of defining a word in the context of those that surround it, and then having to define, even further, all those words, and on and on, and experience is inexhaustible, and what’s the point, and lift a glass to nihilism… if that kind of limitless, indefinite deconstructive search-and-compare lies at the heart of what is different, then maybe Derrida just found it difficult to reach agreement with other people. It stands to reason that, if he played Walking Football, Derrida might be the worst cheater on the floor, continually running when he should be walking, then denying it just the same as he tried to gain advantage. Maybe, fed up with being called a cheater, he would take his ball and go home to play by himself, where no one could say he was wrong. Being alone, who would be there, whether as an obedient player or as a sneaky one, to challenge him? In fact, maybe that’s why he chose to return to play the next day – for all the arguing, he enjoyed the game, or the attention, or the camaraderie, or the exercise, or whatever, more than being accused of cheating. I wonder if, perhaps, in the great game of philosophy football, he would have been the only rival to strike real fear in Diogenes – I mean an awe-and-respect kind of fear, just to clarify, and I mean if they had lived at the same time. It’s hard to know about Diogenes since nothing he wrote down ever survived, and these days, I doubt more than a few can recall any of whatever he said, besides that lamp-carrying honesty thing. (We should all have such good spirit when it comes to our first principles.) Anyway, I think Diogenes played for Wimbledon.
Well, I am being unkind to Derrida. Evidently, he was a kinder person by nature than I have let on, as well as an advocate for all voices, all people. And consider the professional care, the uncompromising expertise he took to convey his ideas, to trouble himself with delving down the rabbit hole so arbitrarily – to go down at all but, moreover, to go so far when he might, just the same, have decided to halt. Delve as far as you like, but accept responsibility for your decision, every time. In that respect, how does Derrida differ from any other person facing decisions? Did he have still other motivations? No player who kicks a football is deliberately playing to lose, not unless they have been coerced by someone else to do so. On the other hand, for all I know, maybe what Derrida called red I would call blue. Be careful not to pass the ball to the wrong team! (By the way, in sport, dynasties are remembered precisely because they eventually come to an end.)
Was Derrida any less accountable and open to scrutiny than you, or me, or anybody else? To suggest that a word only makes sense based on how it differs from those around it is no less arbitrary than its reciprocal suggestion, that a word only makes sense based on what it, itself, describes. Half-full / half-empty, six of one… Two sides of the same coin are still the same coin. Alternatively, who put him up to all this? Meanwhile, on his own, surely Derrida had it within himself, as people do when they reach a point, simply to say, “Here is enough. I decide to stop here. For me, [the item in question] means this.” If that doesn’t ring true and sound like him, well, I’d say that can be just as telling of his character; I heard it suggested, once, that we can be helped in knowing something by what it is not. So, fine – for Derrida to stake the claim called différance, I’m willing to concede him that moment. We all land somewhere, and we’re all hardly alike, even when we’re alike.
We are, each and every one of us, individual. But together we comprise something just as dynamic on a larger scale – one might construe us societally, or perhaps historically, anthropologically, or on and on, in whatever way through whichever lens. For me, différance appears to be an attempt to speak for all about all, prescriptively. A grand stab at philosophy, no question, and that’s the beauty of the equality of philosophy, with thanks to Rancière: we all have a part to play and a right to respond. For the time being, as I have understood Derrida and his thinking, and I willingly stand to be instructed further, différance strikes me as ironic, being an advocacy for the dynamic development of people and language and culture that self-assuredly asserts its own accuracy. That is not an uncommon indictment of postmodernists. What’s more, it is oh, so human.