A Kind of Certainty: II. Curriculum, or What You Will

Click here to read Pt I. An Uncertain Faith

 



Baumlin (2002) distinguishes three concepts of temporality. Chronos is linearity, our colloquial passage of time, “non-human; impersonal objective nature” (p. 155), from which we understandably define past, present, and future. In relation to this is kairos, a single point in time, “[describing] the quintessentially human experience of time as an aspect of individual consciousness, deliberation, and action… that single fleeting moment … when an individual’s fortune is ‘set in motion’, … [providing] the means” and yielding “Fortuna, the consequences” (p. 155). Interwoven with kairos, then, is Occasio, the cause to Fortuna’s effect, a sense of “‘right-timing’ and prudent[1] action,” an opportunity[2] to better the capricious lies of fortune and fate. Although this sense of opportunity was emancipating, it also engendered accountability for consequences.

The developing belief that we possessed not mere agency but free will weighed upon Renaissance thinking and was a trait that Shakespeare often imparted to his characters, Hamlet (4.4.46-52) being but one example.[3] By the time 17th century Elizabethans first watched Hamlet on stage, the humanist challenge to “a grim… Christian sufferance and resignation to time” (Baumlin, 2002, p. 149) was well underway. Unsurprisingly, Shakespeare offers nothing firm in Hamlet as to where our belief should lie, either with fortune or with free will; indeed, leaving the debate ruptured and inconclusive seems more to his point. To this end, perhaps most notable is his placement of Hamlet alongside Horatio in the graveyard to ponder the dust and fortune of Alexander, Yorick, and – hard upon – Ophelia.

In handling Yorick’s skull, Hamlet revives the poor fellow’s “infinite jest [and] excellent fancy” (5.1.186), memories of such fond “pitch and moment” (3.1.86) as to “reactivate” (Pinar, 2017a, p. 4) his own childhood, even momentarily. Such specific remembrances educed by Hamlet (which is to say, by Shakespeare) expose the springe of kairos; ultimately, certainty is beyond our capacity, rough-hew it[4] how we will. Colloquially, this might seem obvious (i.e., “the best-laid plans…” and so forth, with no one apparently able to pick the right lottery numbers each week). Yet the extent to which we consider ourselves masters of our own fortune is, for Hamlet, presently in the graveyard, a kind of epiphany, “a spiritual (re-) awakening, a transformation” (Baumlin & Baumlin, 2002, p. 180).[5] He decides that yielding himself over to “divinity” (5.2.10) is wiser than the folly of trying to control what was never within his grasp to begin with.

He does not give up any freedom so much as give over to dependence, which of course is a leap of faith. Shakespeare poses a question of allegiance – to obey, or not to obey – further compounded by which allegiance – obedience to father, or to Father; to free will, or to fortune; to an unweeded garden, or to what dreams may come – all these are the question.[6] Shakespeare has Hamlet “reconstruct” (Pinar, 2017a, p. 7) his conceptions of allegiance and obedience during the exchange with the Gravedigger, which hardens Hamlet’s resolve yet also enables him to come to terms with his tormenting dilemma over fealty and honour. By the time his confrontation with Claudius has become inevitable,[7] Hamlet’s decision to “let be” (5.2.224) “[marks his] final transcendence of deliberative action in worldly time” (Baumlin & Baumlin, 2002, p. 180). Thus is indicated the subtle dominance of the third temporal concept, aion, “the fulfillment of time” (Baumlin, 2002, p. 155), a circularity like the uroboros, the serpent swallowing its tail. As such, aion signifies what is boundless or infinite, neither more nor less than eternity.

Oddly enough, these three concepts, in concert, can seem both time and place,[8] describing a “spatial-temporal sequence … from point, to line, to circle”; from “natural to human to divine orders” (p. 155). I am not fixed to the idea of a “sequence,” but the general composite still shapes my response to Hamlet’s most famous question of all.[9]

 


Let go. Learn from the past, but don’t dwell on it.
– left (past)

Let it work. Anticipate the future, but no need to control it.
– later (future)

Let come what comes. Every possible decision will still yield consequences.

Let be. Pay attention now to what is now.
The readiness is all. (5.2.222-223)
– lasting (present)

The rest is silence. (5.2.358)

(a clever double-meaning here: “the rest” = either past regrets and future anxieties or else the undiscovered country, death)


 

As I take them, these four “Let…” statements amount to sound wisdom, like trusted advice from teacher to student or parent to child. As a student and child myself, writing this paper, I faced a question of certainty – the same question, strangely enough, that we ask about curriculum: what is worth including? By the same token, what is worth omitting, and from there, what will also be otherwise left out or unmentioned? Whatever we decide, one thing is certain: we can neither cover nor even conceive it all, which of course was my original problem. In fact, knowing as much as we know can even shed paradoxical light onto how much we remain in the dark. Eventually, as my Dad recommended over the phone, I simply needed the courage to make a decision and go with it, and even with his voice in my ear, I knew my own advice to my students had always been the same.

Hanging up, I reasoned further that any feedback I did receive – from peers during revision or from my professor’s formal evaluation – would illustrate how effectively I had collated and communicated my message. Beyond that – say, in revising the paper for publication – I would have some ready direction. And there it was, I realised, staring me in the face, curriculum in a nutshell: conversations, decisions, actions, evaluations, reflections – all these, in relation to me as I wrote this essay, amounted to a lived curricular experience of my very own.[10] My curriculum, like this essay, does not simply pose the straightforward question of what is worth including. That question is insufficient. More particularly, my curriculum, like this essay,[11] prompts me to consider what is worth including in light of the audience, the topic, what is already known about the topic, and whatever aims exist in further pursuit of the topic.[12] Succinctly, my curriculum – all our curricula – is contextual, multitudinous, and a question of – questions of – what is particularly worth knowing about any topic of study under the sun: “Why this, why here, and why now?”[13] That is the question.

Well, maybe that is the question. The essence of this question, this curricular particular, lies in kairos, the concept of opportune timing or occasion that “signals the need to bring universal ideas and principles to bear in historical time and situations [i.e., deductively] and, thus, calls for decisions [requiring] wisdom and critical judgment” (Smith, 1986, p. 15). We can only note what matters to us once we have a reference point. And since nothing occurs in a vacuum, any detail can be potentially informative, so we must learn to pointedly ask not, “In what way(s) do I already know what I’m looking at?” but rather, “In what way(s) do I not know what I am looking at?” which tends to be deductive. Typically, curriculum begins inductively, with what someone already knows, and we all know plenty of things. But we generally bring to bear only what we deem relevant to the moment. By the same token, someone who knows what is relevant to the moment has a kind of prescient “mechanism” (Christodoulou, 2014, p. 54) for spotting what will likely be of use.[14] So curriculum is a means of determining, if not discovering, in the moment what works. It is, therefore, also a means of coming to know ourselves.

As we develop confidence and self-esteem, and dignity, we grow to feel that we have something to contribute, that we matter, all of which prepares us for helping others. Curriculum helps us to sort out our values and beliefs,[15] which provide a frame-of-reference in order to select and, later, to measure our day-to-day efforts. Of course, none of this happens immediately; we need time to grow more self- and other-aware, each kairos experience filing in alongside the rest, like a crowd of ticket holders. I can only wonder whether Shakespeare might have characterised curriculum as something akin to being held over for an indefinite engagement. In any event, we never stop learning – may our auditoriums ever sell out – as we continually induce as well as encounter influence. But how deliberately do we do so? Maybe that is the question.

 

Click here for the Bibliography

Click here for Pt III. A Scripture of Truth

 


Endnotes

[1] As Baumlin (2002) notes, “For the student of prudentia, time reveals itself as golden Opportunity rather than as fickle, devastating Fortune” (p. 141). Certainly, Shakespeare and his Elizabethan audiences were feeling such debate permeate their own lived experiences, a dram of an idea that, once diffused, might only thereafter suffuse.

[2] According to Claflin (1921), “‘opportunity’ in Shakespeare means more than it does now [in the 20th century]; it is closer to the original force of Latin opportunus, and means ‘a specially favourable occasion’” (p. 347). Curiously enough, however, as I searched a concordance of Hamlet (Crystal & Crystal, 2002), I found no usage of “opportunity” whatsoever and only three of “chance,” most notably that of Hamlet to Horatio: “You that look pale and tremble at this chance…” (5.2.334) in reference to the dead and dying at the play’s closing. Of further interest is the concordance’s report that Shakespeare used “opportunity” throughout his entire catalogue of poems and plays only sixteen times as compared to “chance,” which he used 114 times.

[3] Kiefer (1983) examines Fortune at length as one colour in Shakespeare’s palette for his characters, noting of King Lear: “In no other of Shakespeare’s plays do characters invoke Fortune so insistently [or] so frequently at pivotal points of the action” (p. 296).

[4] Read either “certainty” or “our capacity,” here, in place of “it”; either works just as well. I have paraphrased the line from the play, of course, because the original antecedent of “them” (5.2.11) is “our ends” (5.2.10). However, although I have changed the diction of the thought, as a matter of perspective, the meaning remains intact. The implication that we – in essence – play God might not be nearly so alien for Shakespeare’s audience as for their feudal predecessors. By contrast, to postmodern audiences these days, the notion of a divinity standing apart from our own free will and shaping our ends might be the more alien concept.

I might finally point out that Shakespeare, as his creator, is Hamlet’s god, of a kind. But that analogy does not last long under scrutiny since Hamlet, being a fictional character, has no sentience, free agency, or tangibility, and actors who portray him are left with prescribed dialogue and beliefs.

[5] Because I am ultimately discussing what Shakespeare did, his characters being only conveyances as such, I was tempted to complete this sentence with a line from Macbeth, as follows: “The extent to which he considers himself master of his own fortune, presently in the graveyard, is laid plain for Hamlet, leaving him to conclude only that ‘…all our yesterdays have lighted fools the way to dusty death’ (5.5.22-23).” The key difference, of course, is that Hamlet decides against being a fool whereas Macbeth seems all too keen to excel at it. Where Hamlet best demonstrates a respect for “divinity [shaping] our ends,” Macbeth better represents the rough-hewing bit, which makes him a far less redeeming character in the end. So, upon reflection, it seemed prudent to stick substantively to just the one play. Thank heaven for endnotes, I guess.

[6] Had he fallen clearly to one side, as a subject to his monarch, Shakespeare might very well have sealed whatever freedom he did enjoy; his own response, evidently, was to render unto Caesar, and render unto God, and continue writing plays. Four centuries on, what is there about us that we might think we are any less susceptible than he was to coming to terms with our finite nature? We live in civil society, by the rule of law under a Constitution, within which are Rights and Freedoms that include the assurance to believe, or not to believe, whatever we decide suits us best. Furthermore, we have the advantage over Hamlet in that his example exhorts us, interminably – just ask my students, remember? Alas, though, poor Yorick.

[7] As Horatio notes, “It must be shortly known [to Claudius]” that Hamlet has tricked Rosencrantz and Guildenstern to their deaths at the hands of England (5.2.71-72), a move by Hamlet in his contest that must certainly harden his uncle’s resolve to have Hamlet dealt with once and for all. Of course, Claudius had sent Hamlet to England to be killed, but in secret, on account of both Gertrude and the public’s love for the Prince (4.7.5-24). However, in dispatching his childhood comrades – and with such calculation (5.2.57-70) – Hamlet has now given Claudius justifiable means to overcome any such favourable opinion as might have benefitted Gertrude’s “son” (5.1.296).

[8] Time and place are what we commonly refer to as setting in English class, which is a curious way to consider eternity.

[9] Seldom mentioned amidst all the consternation is that Hamlet does not actually ask a question. If he had, he might have worded it as, “Is it to be, or not to be?” In that case, we would need to know what “it” means. Alive? Dead? Happy? Sad? Anything goes, I suppose, but then… what would you expect? He might have been asking, “Am I…” or “Are we to be, or not to be?” But where that is still somewhat existential and vague, now we might want to know whether his use of the verb, to be, is more open-ended or copular. I suspect Shakespeare knew enough about a general audience to trust that only the most fastidious grammarians would fuss over particulars such as antecedents and verb tenses in the dialogue. Otherwise, why decide to use the most protean verb in the English language?

[10] As far as lived curricular experiences go, there are many like it – as many as there are people to have them – but this one is mine.

[11] At this early stage, I confess: why struggle writing a paper when I could use the age-old trick of writing a paper about writing the paper? Why…? Because the age-old trick is just that – a trick – and spinning academic wheels stalls any hope of contributing to knowledge, so I would hardly be honouring my responsibility if I tried pulling that off. Still… the paper within a paper got me thinking about Hamlet, which oddly enough had been my original inspiration for this essay. As my students used to say… once you study Hamlet, he just never goes away. How true, how true…

[12] According to Hartmann (2014), it was just such questions that prompted Ezra Klein to leave The Washington Post and establish Vox.com in 2014.

[13] Students in all my courses learned to rue the question “Why?” – so much so, one year, that it became a running joke simply to utter “why” as a pat response, whether as a question, an interjection, a plea, a curse, an epithet – those last two maybe reserved for me, I don’t really know. In honour of their perseverance, and their angst, I named my blog The Rhetorical WHY.

[14] Surrounded by Winkies, confronted by certain capture, only Scarecrow eyes the chandelier above the Wicked Witch, so only he can yank Tin Man’s axe across in time to chop the rope that suspends it. Hardly the grandeur or the gravitas of Hamlet, I realise, but The Wizard of Oz has much to offer pertaining to curricular theory as well as teacher autonomy.

[15] In keeping with the three temporal concepts, perhaps a more suitable metaphor than threading our own needles would be to say we surf a long pipeline. But, this essay being more concerned with curriculum and theatre, any such Hang-Ten imagery is better suited to another time, like connecting curriculum to gnarly waves and bodacious beaches (“Surf’s Up,” 2015). Anyway, certainly no one would ever dream of linking Hamlet to surfing (“’Hamlet’s BlackBerry’,” 2010) in the first place, would they?

Umm, This.

[Originally published June 11, 2017]

So, it’s interesting, listening to people talk these days, quite frankly, in terms of their words, their language, their speech. I have an issue with what everyone’s saying – not like everyone everyone but, you know, it’s just their actual words when they talk about complex issues and such, or like politics, what with the whole Trump thing, you know, that Russia probe and the Mueller investigation and everything that goes with that. I’m also a bit of a news hound, and that’s really where I started noticing this on-air style of speeching, of making it sound thoughtful and taking them seriously.

And it’s so much out there, like an epidemic or something, which is interesting, which speaks to on-line streaming and TV news, talk radio, and pretty much the whole 24-hour news cycle. I was a high school English teacher for sixteen years, and I also started noticing all this, you know, frankly, during class discussions, too. And there was me, like guilty as anyone.

Here’s the thing, though, because I guess substance will always be up for debate, but that’s just it – it’s so wide-ranging that it’s like people have no idea they’re even doing it, which is interesting. It’s almost like it’s the new normal, which really begs the question – are people getting dumber? Is education failing us? In terms of intelligent debate, that will always be something that probably might be true or false. And let’s have those conversations!

But in terms of intelligible debate, it’s interesting because, when I listen to how people are talking, it gets really interesting because when I listen what they actually say, it’s like they’re making it all up on the spot in the moment as they go, so it’s just that that makes me not as sure it’s intelligent as it’s less intelligible. But it’s all in a sober tone, and they’re just expressing their opinion, which is democracy.

And that’s the thing – if you challenge anybody with all what I’m saying, clarity-wise, it’s interesting, they’ll get all defensive and whatnot, like it’s a personal attack that you’re calling them stupid or whatever, like you’re some kind of Grammar Jedi.

And, I mean, I get that. So that’s where I think people don’t really get it because I totally get where they’re coming from.

Seriously, who would want to be called like not intelligent or anything all like that, whatever, especially if we’re trying to discuss serious world issues like the whole Russia thing that’s been happening or the environment or all the issues in China and the Middle East? Or terrorism and all? I mean, if you look at all that’s happening in the world right now, but you’re going to get that detailed of the way someone talks, maybe you should look in the mirror.

And I mean, SNL did the most amazingggggggggg job with all this, back in the day, with Cecily Strong on Weekend Update as The Girl You Wish You Hadn’t Started A Conversation With At A Party. Comedy-wise, she even like makes a point, but basically, she’s furthering on intelligence, except I’m talking about intelligibility. But still, if you haven’t seen it, what can I tell you? Your missing out, SO FUNNY. She. Is. Amazing.

And that’s the other thing, and this one’s especially interesting, is just how there’s just SO MUCH out there, what with Google and the Internet, and Wikipedia and all, so who could possibly be expected to know like every single detail about all the different political things or the economy and all the stuff that’s out there? And it’s even more with speaking because pretty much most people aren’t like writing a book or something. (W’ll, and that’s just it – nobody speaks the way they write, so… )

Anyway, so yeah, no, it’s interesting. At the end of the day, first and foremost, one of the most interesting things is that everybody deserves to have a say because that’s democracy. And I think that gets really interesting. But the world gets so serious, probs I just need to sit down. See the bright side, like jokey headlines from newsleader, Buzzfeed, or 2017’s “Comey Bingo” from FiveThirtyEight. Gamify, people! News it up! Nothing but love for the national media outlet that helps gets you wasted. Or the one about the viral tweet, for an audience intimately familiar with pop culture? News should be taken seriously, and the world faces serious aspects, for sure. But the thing is, work hard but party harder! I mean, we’re only here for a good time, not a long time!

And it’s interesting ‘cuz people seem to require more frequent, more intense, more repeated engagement, to spice up their attention spans. There’s some good drinking games, too, on that, because politicians! I know, right? But not like drunk drunk, just like happy drunk, you know? Not sure if all this counts as it means we’re getting dumber, per se, but it’s just interesting.

So, yeah, it’s interesting because we’ve come such a long way, and history fought for our freedom and everything, so I just really think going forward we should just really appreciate that, and all, you know?

From The New York Times – “Free Speech and the Necessity of Discomfort” and further Reflections on Journalism

A needfully challenging appeal to raise the level of discourse, and an appropriate inclusion to The Rhetorical WHY, from an Opinion piece in The New York Times (Feb 22, 2018) by Op-Ed columnist, Bret Stephens:

This is the text of a lecture delivered at the University of Michigan on Tuesday [Feb 20, 2018]. The speech was sponsored by Wallace House.

“I’d like to express my appreciation for Lynette Clemetson and her team at Knight-Wallace for hosting me in Ann Arbor today. It’s a great honor. I think of Knight-Wallace as a citadel of American journalism. And, Lord knows, we need a few citadels, because journalism today is a profession under several sieges.…” [continue reading]

 

'Oddly enough, I feel offended...'
All That’s Fit to Print, Wiley Miller

Some thoughts of my own on the significance of a free press to our lives…

Next, I offer a series of responses I made to remarks by Hannah Arendt, published October 26, 1978 in The New York Review of Books (itself a report of her interview with Roger Errera). I encountered them in a Facebook post from The Bureau of Investigative Journalism.

 


Arendt: “… how can you have an opinion if you are not informed?”

Everybody has opinions – our five senses give us opinions. In order to be “informed,” we need discernment enough to detect accurate information.

Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”

For me, continual lies ultimately yield zero trust, but again, how would I know who’s even lying, but for my own discernment and experience?

At the least, if I were aware that all around were lies, that much I’d know is true. It’s not that “nobody believes anything any longer,” so much as it’s “everybody goes about searching out truth on their own.” The downside is when those individual searches for truth become disrespectful, as we’ve seen lately, or worse, chaotic.

Nevertheless, investigate! Accept responsibility to inform yourself. Accept or believe all with a grain of salt until such time as you can prove to your own satisfaction who and what are trustworthy. And, at that point, be tolerant, if not respectful, of others – this applies to everybody, all sides, liberals and conservatives and all points between. Taking the high road is not to be done with pride or smug assurance. It’s easy to nod and say, “I already do while others do not,” but even so, reflect upon yourself with each conversation, each debate, each exchange.

Open-minded and open-hearted – both are virtues, but they don’t have to be the same thing.

Arendt: “If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer.”

On its face, this statement could only be accurate if you had some clairvoyance or a crystal ball.

By “everybody” doing their own investigation and accepting responsibility to inform themselves, I mean everybody. We’re able to trust news & media sources to the extent that they have lived up to their responsibility… to the extent we’re aware that they have. I support proper, professional investigative journalism and public intellectualism, both of which I gather to be in decline.


'Well, apparently you haven't heard. . . personal opinions are the new facts.'
The New Facts, Chris Wildt

Finally, I offer two sets of remarks about journalism by two long-retired anchor-journalists of PBS fame, partners Robert MacNeil and Jim Lehrer. The first is transcribed from an exchange between them during a tribute to MacNeil upon his retirement in October 1995. The second – comprising two parts – is Lehrer’s closing words upon the “retirement” of his name from the title of the PBS NewsHour, on December 04, 2009. Following that, I’ve included a thoughtful follow-up by the PBS Ombudsman, Michael Getler, published the next week on December 11.

MacNeil’s remarks upon his retirement (October 20, 1995)…


MacNeil: You know, I’m constantly asked, and I know you are in interviews, and there have been a lot of them just now – I’m constantly asked, “But isn’t your program a little boring to some people?” and I find that amazing, because, well, sure, it probably is, but they’re people who don’t watch. The people who watch it all the time don’t find it boring, or they wouldn’t watch.

Lehrer: That’s right.

MacNeil: And it’s the strange idea that’s come out of this medium, because it’s become so much a captive of its tool – as its use as a sales tool that it’s driven increasingly, I think, by a tyranny of the popular. I mean, after all, you and I’ve said this to each other lots of times – might as well share it with the audience: what is the role of an editor? The role of an editor is to make– is to make judgments somewhere between what he thinks is important or what they think is important and what they think is interesting and entertaining.


Jim Lehrer’s guidelines of journalism (December 04, 2009)…


Lehrer: People often ask me if there are guidelines in our practice of what I like to call MacNeil/Lehrer journalism. Well, yes, there are. And here they are:

* Do nothing I cannot defend.

* Cover, write and present every story with the care I would want if the story were about me.

* Assume there is at least one other side or version to every story.

* Assume the viewer is as smart and as caring and as good a person as I am.

* Assume the same about all people on whom I report.

* Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.

* Carefully separate opinion and analysis from straight news stories, and clearly label everything.

* Do not use anonymous sources or blind quotes, except on rare and monumental occasions.

* No one should ever be allowed to attack another anonymously.

* And, finally, I am not in the entertainment business.

Here is how I closed a speech about our changes to our PBS stations family last spring:

‘We really are the fortunate ones in the current tumultuous world of journalism right now. When we wake up in the morning, we only have to decide what the news is and how we are going to cover it. We never have to decide who we are and why we are there.’


 

I am struck by the continuity of their respective final comments about entertainment. Each, in his own way, seeks to distance journalism from vagary, thereby implying that we are susceptible to emotional or whimsical tendencies that seem capable of overtaking our focus to learn; otherwise, why mention the point at all?

 


  • Watch Lehrer’s remarks here, in a functional if awkward series of video archives of that 2009 broadcast.
  • In May 2011, upon Lehrer’s retirement, MacNeil returned to offer his own reflections upon his friend and colleague that include some further worthwhile commentary upon contemporary TV journalism.
  • Watch them during a more recent (October 25, 2016) retrospective interview from 92nd Street Y, a Jewish cultural and community centre in Manhattan.

 

I recall “Lehrer’s Rules,” as they were called, making a small stir – some of it substantive and meaningful, some of it the critical “woe-is-Us” lament at the passing of favourite things. In amongst it all, as I mentioned, were the following comments from PBS Ombudsman, Michael Getler, which I include here, at length, on account of PBS webpages’ tendency to disappear.

In fact, a number of the PBS pages where I found these articles are no longer active – where possible, I have checked, updated, and even added weblinks. But I believe Getler’s comments, like the rest, are worth preserving, on account of their potential to provoke us to think and learn more about a free press and its relation to ourselves.

 


“Lehrer’s Rules” by Michael Getler (December 11, 2009)

A couple of people wrote to me in the aftermath of that Dec. 4 sign-off to say how much they liked Lehrer’s guidelines and asked how they could get a copy. That’s why they are reproduced above. A subscriber to the widely-read Romenesko media news site also posted them there on Dec. 6 and they also were posted on the campus site of the Society of Professional Journalists (SPJ). “Whether you agree with all of Lehrer’s guidelines, or not,” that posting read, “he has surely earned our attention.”

That’s certainly true in my case. I’ve also been a devoted watcher of the NewsHour in all of its evolutions during most of the past 30-plus years, long before I took on this job four years ago. Although segments of the program have been the subject of critical ombudsman columns on a number of occasions, I’ve also said many times that it remains the best and most informative hour of news anywhere on television, and it has never been more important. I follow the news closely but almost always learn something from this broadcast every night.

Boring, at Times, But a Luxury Always

Sometimes, of course, it can seem boring. Sometimes the devotion to balanced he said/she said panel discussions can leave you frustrated and angry and no smarter than you were 15 minutes earlier. Sometimes the interviewing is less challenging than one might hope. But the luxury of an uninterrupted hour of serious, straight-forward news and analysis is just that these days, a luxury. And, in today’s world of media where fact and fiction, news and opinion, too often seem hopelessly blurred, it is good to have Lehrer – clearly a person of trust – still at work.

I had the sense when he added his guidelines to that closing segment last Friday that the 75-year-old Lehrer was trying to re-plant the flag of traditional, verifiable journalism that he has carried so well all these years so that it grows well beyond his tenure – whatever that turns out to be – and spreads to all the new platforms and audiences that the contemporary media world now encompasses.

Oddly, I did not get any e-mail from viewers commenting on the new NewsHour format, other than one critical message that said “do not post.” Maybe that’s a good sign since people usually write to me to complain.

Make no mistake, the now defunct NewsHour with Jim Lehrer is still quite recognizable within the new PBS NewsHour. So those who wrote earlier and said they didn’t want any change won’t be terribly disappointed. I, personally, found the first few days of the new format and approach to be a distinct improvement. The program seemed to have more zip and energy, faster paced, with good interviews and without the always predictable language that introduced the show in the past. It presented its news judgments more quickly, benefitted from the early introduction of other top staff members as co-anchors, and from the introduction of a promising “new guy,” Hari Sreenivasan, a former CBS and ABC correspondent who presents a headline summary from the newsroom and is the liaison to an expanded NewsHour Web operation.

Now, just to keep this a respectable ombudsman’s column, let me add a few quibbles when it comes to Lehrer’s rules, as posted above.

First, one of the interesting things about American journalism is that there are no agreed-upon national standards, no journalistic equivalent of the Hippocratic Oath for physicians. There are, of course, many universal values and practices that vast numbers of journalists have voluntarily adhered to for many years, best exemplified by SPJ’s Code of Ethics. But the fact is that all major news organizations – from the Associated Press to the New York Times to PBS and CBS – have their own guidelines and standards that they try to live by. And they all have their differences.

Naturally, a Few Quibbles

Lehrer’s guidelines embody lots of the good, praiseworthy stuff, and we come out of the same journalistic generation and traditions. But I think on a couple of points they are actually too nice, too lofty, cruising somewhere above some of the grittier realities of journalism.

For example, “Assume the viewer is as smart and as caring and as good a person as I am. Assume the same about all people on whom I report.” Really? Bernard Madoff? Osama bin Laden?

Then there is: “Assume personal lives are a private matter, until a legitimate turn in the story absolutely mandates otherwise.” I would argue, and have, that the NewsHour withheld a legitimate turn in a major story from its viewers last year – one reported by all other major news organizations – when it declined to inform them that John Edwards, a former senator and former candidate for the vice-presidency, had issued a public statement and gone on ABC Television to acknowledge an extra-marital affair with a woman hired by his political action committee to make films for his campaign. That’s news.

Finally, there is, “Do not use anonymous sources or blind quotes, except on rare and monumental occasions.” I agree about the blind quotes when they are used to attack someone personally. But anonymous sources have often proved to be absolutely crucial to the public’s right to know what’s really going on in scores of major stories as they have unfolded from Watergate to secret CIA prisons overseas.

The most accurate and important pre-war stories challenging the Bush administration’s on-the-record but bogus case for Iraqi weapons of mass destruction were based on anonymous sources. Many of those stories, in part because they were based on anonymous sources, got buried or underplayed by newspapers at the time. Many of them never got reported at all on television, including the NewsHour. But there are times when there are mitigating circumstances – like internal threats within an administration or maybe jail time for leakers – when some sources must remain anonymous and when editors need to trust their reporters. And often you don’t know if the occasion is “rare and monumental” until it is too late. Pre-war Iraq, again, being Exhibit A.



Freedom of the Press, Matteo Bertelli

Some other links…

World Association of Newspapers and News Publishers

World Press Freedom Index

Ryerson School of Journalism

Edelman Trust Barometer

Freedom House

Deciding over Derrida’s Différance

As far as I understand Jacques Derrida’s différance, he observes that we understand our experiences as distinctive, but not exhaustive, communicated links or marks comprising an on-going decisive chain of experiential moments. As to the language we use to describe our experiences, a word has contextual meaning, both from its usage at any given time as well as from its etymology over the course of time. I tend to agree with this attendance to context as furnishing meaning, and I can also spot the rabbit hole that it poses. For example, to understand some word’s definition, I might look it up in the dictionary and be left to rely upon the definition of whoever decided what it meant while, at the same time, facing all sorts of words in the definition that now need looking up, too – Sisyphean, indeed! Cruel but so usual. On the other hand, thanks to whoever compiled the dictionary, a pretty utile compendium, I have to say.

To be clear, I am not intending to invoke logocentrism, by which all our words are accorded a decided meaning from a cultural centre, which propagates existing biases or “privileges”; Derrida would roll over in his grave. Granted, I may already have laid grounds here to be accused of logocentrism, myself, by writing with words (and I confess to using English because I didn’t think anyone had the patience to muddle over Wingdings). My present aim is to suggest how we might address the aforementioned rabbit-hole dilemma by searching for or (… almost afraid to say it) by deciding upon some definitions of our own. Not like a dictionary, but more like– well yes, okay, like a dictionary, but one that we’ll fashion from the ground up, like when the light bulb would go on above Darla’s head, and Spanky would snap his fingers to say, “Hey, everyone! Maybe we can put on a play!” So, in the spirit of dissemination, hey everybody, maybe we can compile a dictionary! A real, deconstructive, crowd-sourced dictionary!

I’m not really compiling a dictionary. I’m just trying to make some sense of Derrida and différance. Let me try to illustrate what I mean from my own experience. Sometimes I play Walking Football, a version of the game where players are not permitted to run. Naturally, the debate is over what differentiates walking from running. We’ve agreed that walking means “always having at least one foot in contact with the ground during the striding motion.” Running means “having both feet leave the ground at some point during the striding motion.” This makes for certainty – so long as our eyes are trained enough to spot feet in motion, which I can sometimes spot so clearly, with such immediacy, that it’s more like I’m watching, not playing – I’m finding it tough even now to get the right words, but trust me. And so long as each player is willing to obey the rules – and, ohh my, there’s always that one player who just won’t. You know who I mean… *sigh… Anyway, so long as they’re not just words uttered that then float away in the breeze, our definitions of the rules for walking and running are useful.

Luckily, too, I might add, when we clarify the rules, we do so out loud, together, and don’t whisper it around in a circle, like when my daughter plays Telephone at a birthday party – after all, we want everyone to be clear. Finally, even if we have trouble spotting feet in motion, because it all happens too quickly, or even if that one player is a cheater at heart, the definitions themselves remain clear, and usually at least one or two of us can remember them well enough to recite back, as needed, usually with a lot of finger-pointing and furrowed brows. One time we even wrote the no-running rule on the gym chalkboard, and even though no one challenged this, on the grounds that writing is secondary to speech, everyone still understood why it was scrawled there, by which I mean everyone knew exactly who should read it the most – wow, does every game have that player? Incorrigible.

Bottom line: accountability is down to the sincerity and respect offered to each player by every other player who decides to participate. As an aside, the need for a referee, an arbiter, is all the more clear when the stakes are as high as bragging rights and free beer. But, even as we play for fun, the rules exist or else the game, as such, does not. (On that note, I find a lot of players just don’t like Walking Football and would rather play with running, and that’s fine, too: it’s their decision, and plenty of other like-minded players keep both games afloat. I find the Walking game amplifies decision-making, so maybe this feature just appeals to me. And I still play traditional football, too.) My broader point is that any one person must decide to accept what has been defined and, likewise, any group of people must reach a consensus. Shared meaning matters because, otherwise, as I say, we don’t have a game, or else we have a very different one, or we just have anarchy. But whether that person, alone, or the group, altogether, searching for a way to decide upon meaning, has the patience to delve down the rabbit hole… well, yes, context does indeed matter – both usage and etymology. I’ve said and written as much, myself, for a long time. So, in light of all this, I hope I’ve gathered a little something of Derrida’s différance. I’m still learning.

Another illustration: in my teaching, I occasionally introduced this matter of contextual meaning by offering students a list of synonyms: “slim,” “slender,” “skinny,” “thin,” “narrow.” Each word, of course, has its own particular meaning. “If they meant the same thing,” I offered, “then we’d use the same word,” so just what explains the need for all these synonyms? Well, students would say, there are lots of different things out there that possess or demonstrate these various adjectives (my word, not theirs), so we’ve come up with words to describe them (and I think that’s a charitable “we,” like the Royal “We.”) As the discussion proceeded, I might ask which of these words typically describe human traits versus those – leaving aside metaphors – that typically do not. Next, which words typically possess positive connotations, and which negative, or neutral? And, as it pertains to the personification metaphors, which words are more easily envisioned versus those that really stretch the imagination, or even credibility? Eventually, I would shift from ontology to epistemology, posing the questions at the heart of my intention: For any of the previous questions about these synonyms, how do you know what you’re talking about? For what each of these words could mean, where have your assurances come from? Of course, the most frequent reply to that question was “the dictionary,” followed by “my parents” or “books I’ve read,” or “just everyday experience, listening and talking to people.” Occasionally, the reply was something akin to “Who cares… it just means what it means, doesn’t it?” In every reply, though, one common thread was detectable: the involvement of other people as part of the meaning-making process. Fair enough, we can’t all be Thoreau.

One more example: when is “red” no longer red but perhaps orange or purple? Well, for one thing, if you’re colour blind, the question means something entirely different, which I say not flippantly but again to illustrate how important dialogue and community are to deciding what something means. For another thing, we might wish to ask, in keeping with context-dependency, “Why even ask?” Again, this is not flippant or dismissive but practical: when does it matter so that we distinctly need to identify the colour red? Where a group of people might face the question over what is red versus what is orange or purple, we might expect some kind of discussion to ensue. And, whether asking as part of such a group or as a hermit, alone, I submit that one person’s decision about what is “red” is ultimately down to one person to determine: “Red is this,” or “This is red,” or even, “Gosh, I still can’t really decide.” Even a coerced decision we can still attribute to the one who forces the issue – one person has decided on behalf of another, however benignly or violently: might makes right, or red, as it were.

Coercion introduces a political consideration about whose authority or power has influence, similar to needing referees on account of those players who decide to run. The point, for now, is simply that a decision over what something means to a person is ultimately made by a person, leaving others to deal with that decision on their own terms in whatever way. But other people are part of the meaning-making process, even passively, or else I wouldn’t need language to begin with since the rest of you wouldn’t trouble me by existing. Not to worry, by the way, I appreciate you reading this far. From what I understand (and I am convinced I must learn more, being no avid student of either postmodernism or Derrida), Derrida’s observation of différance either discounts or else offers no account of the arbitrary decision-making that people might exercise when they decide they’ve had enough. People tend to land somewhere in a community, and it’s the rare person who lives and plays wholly and uncompromisingly by their own rules. However, the fact that Derrida felt différance was worth the effort to publicise and explain to the rest of us does reflect an arbitrary decision on his part and says something about him.

So this is where I have more fundamental trouble understanding Derrida and différance – the very notion of “different,” as in, in what world could there not be an arbiter? Even a life alone would face endless decisions: what to eat, where to go, when to sleep, and so forth. From such musing – speaking of rabbit holes – I was led to reading about another philosopher named Jacques, this one Rancière, and what he calls the axiom of equality. In pure nutshell form, I take this to mean that no (socio-political) inequality exists until it has been claimed to exist – and note that it’s claimed in a boat-rocking kind of way, what the kids these days are calling “disruptive.” The upshot is that equality, itself, can only ever be theoretical because someone somewhere inevitably is and always will be marginalised by the arbitrary decisions of cultural hegemony. Still learning.

Back to the Walking Football analogy: if the rabbit hole of defining a word in the context of those that surround it, and then having to define, even further, all those words, and on and on, and experience is inexhaustible, and what’s the point, and lift a glass to nihilism… if that kind of limitless indefinite deconstructive search-and-compare lies at the heart of what is different, then maybe Derrida just found it difficult to reach agreement with other people. It stands to reason that, if he played Walking Football, Derrida might be the worst cheater on the floor, continually running when he should be walking, then denying it just the same as he tried to gain advantage. Maybe, fed up being called a cheater, he would take his ball and go home to play by himself, where no one could say he was wrong. Being alone, who would be there, whether as an obedient player or as a sneaky one, to challenge him?

In fact, maybe that’s why he chose to return to play the next day – for all the arguing, he enjoyed the game, or the attention, or the camaraderie, or the exercise, or whatever, more than being accused of cheating. I wonder if, perhaps, in the great game of philosophy football, he would have been the only rival to strike real fear in Diogenes – I mean awe & respect kind of fear, just to clarify, and I mean if they had lived at the same time. It’s hard to know about Diogenes since nothing he wrote down ever survived, and these days, I doubt more than a few can recall any of whatever he said, besides that lamp-carrying honesty thing. (We should all have such good spirit when it comes to our first principles.) Anyway, I think Diogenes played for Wimbledon.

Well, I am being unkind to Derrida. Evidently, he was a kinder person by nature than I have let on, as well as an advocate for all voices, all people. And the professional care, the uncompromising expertise he took to convey his ideas, to trouble himself with delving down the rabbit hole so arbitrarily – to go down at all but, moreover, to go so far when he might, just the same, have decided to halt. Delve as far as you like, but accept responsibility for your decision, every time. In that respect, how does Derrida differ from any other person facing decisions? Did he have still other motivations? No player who kicks a football is deliberately playing to lose, not unless they have been coerced by someone else to do so. On the other hand, for all I know, maybe what Derrida called red I would call blue. Be careful not to pass the ball to the wrong team! (By the way, in sport, dynasties are remembered precisely because they eventually come to an end.)

Was Derrida no less accountable and open to scrutiny than you, or me, or anybody else? To suggest that a word only makes sense based on how it differs from those around it is no less arbitrary than its reciprocal suggestion, that a word only makes sense based on what it describes. Half-full / half-empty, six of one… Two sides of the same coin are still the same coin. Alternatively, who put him up to all this? Meanwhile, on his own, surely Derrida had it within himself, as people do when they reach a point, simply to say, “Here is enough. I decide to stop here. For me, [the item in question] means this.” If that doesn’t ring true and sound like him, well, I’d say that can be just as telling of his character; I heard it suggested, once, that we can be helped in knowing something by what it is not. So, fine – for Derrida to stake the claim called différance, I’m willing to concede him that moment. We all land somewhere, and we’re all hardly alike, even when we’re alike.

We are, each and every one of us, individual. But together we comprise something just as dynamic on a larger scale – one might construe us societally, or perhaps historically, anthropologically, or on and on, in whatever way through whichever lens. For me, différance appears an attempt to speak for all about all, prescriptively. A grand stab at philosophy, no question, and that’s the beauty of the equality of philosophy, with thanks to Rancière: we all have a part to play and a right to respond. For the time being, as I have understood Derrida and his thinking, and I willingly stand to be instructed further, différance strikes me as ironic, being an advocacy for the dynamic development of people and language and culture that self-assuredly asserts its own accuracy. That is not an uncommon indictment of postmodernists. What’s more, it is ohh, so human.

The Latest Visual WHY

Click here to read the latest Visual Why.

Jean-Baptiste Regnault - Socrates Tears Alcibiades from the Embrace of Sensual Pleasure (1791)
“…could we ever know what art makes oneself better, if we were ignorant of what we are ourselves?”

“…a picture’s usually worth even more than the thousand words we commonly ascribe.”

The word “text” derives from the Latin texere, meaning to weave or fit together. For me, text connotes far more than just the printed word – photography, movies, music, sculpture, architecture, the list goes on and on. The Visual WHY offers a specific look at paintings, texts with no less substance and arguably far more aesthetic appeal. But underpinning all the textuality of art is human endeavour. And beyond weaving something together for the sake of weaving, weavers – artists, people – have a further end, communication. Artists across all media are still people with influences and motives for expressing themselves. Conjointly, texts of all kinds are plenty human: provocative and reflective. Whether rich and symbolic for a global audience, or doodled sketches for your own amusement, art is text, and text has purpose. As we try to understand it more thoroughly, we can’t help but raise the level of discourse. Who knows, someday maybe art will save the world…

For those who’ve been wondering about the painting featured both here and on this site’s front page, this latest update will explain that, too.


From PBS – Nova Next: “Fake News is Spreading Thanks to Information Overload”

Originally posted on Monday, June 26, 2017


Has the Internet Revolution helped to raise the level of discourse? Not necessarily – read on. But the situation isn’t entirely without hope, according to Bianca Datta, whose article also offers a dose of good news.

Why Read All This Stuff Into Literature?

Teaching Secondary English for sixteen years, I frequently faced questions from students like the ones listed below…

  • Why do we analyse stuff so much in English class?
  • Why read all this into a poem? How can we ever really know what the poet actually meant? Does it matter anyway? Why even study poetry in the first place?
  • Is [some deep analysis] what Shakespeare really intended? Didn’t he just write his plays to entertain people?
  • By over-analysing literature, aren’t we unfairly putting words in someone’s mouth?

Fair questions, all. Indeed, why do we read stuff into literature? English teachers can drive students up the wall with questions like, “What does this poem mean to you?” or “What’s the significance of the theme?”

“AHHHRRRRG!!!!!” replies the student.

Here are some initial thoughts I’ve had on the subject (far from exhaustive!) to address this frustration. For simplicity, I refer to Shakespeare as one primary example, plus a few other writers, but I don’t mean thereby to leave out or ignore other artists or other forms of creative expression beyond literature. Therein, students must transpose for themselves.

  • Is our analysis what Shakespeare really intended? Didn’t he just write his plays to entertain people and make a living?

Shakespeare had his own intentions, of course. We will never really know what was on his mind. What’s more, Shakespeare is dead now, gone, kaput. For 400 years, the man himself has not been a factor in the literature – sad but inevitable!

But what he’s left behind, his poems and his plays, is what we have to work with. In his words live ideas and people and memories that he knew and loved and remembered. Shakespeare said as much himself in Sonnet 55:

“Not marble nor the gilded monuments

Of princes shall outlive this powerful rime;

But you shall shine more bright in these contents

Than unswept stone, …”

And sure, maybe it’s not a one-to-one correspondence to actual living (now dead) people, although there are oodles of studied conjecture about many of his characters and plot lines. But, despite the fascinating academic and journalistic exploration regarding possible real-life connections, consider that (a) nobody will ever truly know, since Shakespeare never explained one way or the other, and he’s dead now, and (b) Shakespeare had to have drawn upon his real-life relationships, cultural observations, and historical knowledge to develop any characters and situations, because what else could any human being ever do? It’s not as if he lived in isolation from all human contact on another planet or suffered from daily bouts of amnesia (although truly, again, who knows…?). But if you’re a human being who writes, especially for a living, I’ll bet you know other people and work from your human experience. Surely, your own life is your best source. Write what you know, isn’t that the advice?

You might also read the poems listed below, which express similar sentiments – the magic and music of poetic words and imagery; the lasting immutability of words and ideas that resist the entropic passing of time; the futility of our human attempts to create enduring institutions and monuments of brick and stone. Read on, and do not despair…

“Scorn Not The Sonnet” by William Wordsworth

“Ozymandias” by Percy Bysshe Shelley

“Shakespeare” by Matthew Arnold

  • Why read all this detail into a poem or piece of literature? How can we ever really know what the poet / author actually meant? Does it matter anyway? Why even study poetry in the first place?

Not to ignore the human being who created something artistic, but what really matters as far as this question’s concerned is the literature itself – the poem, the song, the novel; the images, the ideas, the sensations that live within the words; the themes and nuances and rhythms that play in our ears or in front of our gaze or within our thoughts.

Shakespeare’s lingual creations are wonderful, if only for their magical, playful, balanced rhythms – it is beautiful poetry. We enjoy music for the same reason. We enjoy a painting on our wall for its palette, or its brushstrokes, or its portrayal. Sometimes we don’t even understand it, yet like it all the same. A sculpture, a garden, a finely prepared meal, all these we enjoy and understand as art, for myriad reasons ranging from our senses and our emotions to our beliefs and our memories.

But not only are these art forms musical, or aesthetic, or appealing (or whatever), they’re also deeply human – art is a human response to the world around us, in which we live and share with other people and other living things and other non-living things. That world and all its contents and relationships are there for us to perceive. And if they are not enough, we’re able to look up at the sky, day or night, or down into the ocean, or into a microscope, or a microprocessor, or how about inward to our own imagination? On and on and on they go, our curiosity and creativity, hand in marvellous hand. The perceived world is the artist’s muse, and it’s likewise the artist’s resource. It’s fuel for the imagination.

In his short book, The Educated Imagination, Northrop Frye opens by suggesting that we possess three levels of perception, of “language”:

  1. State of Consciousness or Awareness: we distinguish things in the world around us from ourselves; a kind of “self-expression” prompting thinking and conversation
  2. Pragmatic Attitude: we project upon this outside world a “human” way of understanding such things; a kind of “social participation” prompting deliberate knowledge and practical action
  3. Imaginative Attitude: we combine the first two perceptions and imagine a world we would prefer to see as compared to the real one we occupy; a kind of creative exercise, producing art forms, literature among them

Back to the artist: in his literature, Shakespeare created such depth and complexity, so closely mirroring the real world, and our relationships and the human condition, that we simply can’t help but see ourselves in them. Yes, he was that good, and his works resonate so much that we still find ourselves studying, admiring, and marvelling over them four hundred years later. So, for example, the genius of Shakespeare’s plays, for me, is how realistically he captures the essence of people and motives and relationships, all by way of dialogue – we watch his plays, we study his literature, we know ourselves better. More than that, we admire it, we revel in it! In the same way that we listen to our most favourite music to enjoy its melodies, we read and watch and listen to Shakespeare because he stirs us at the core.

Let’s consider one more example, the author J. R. R. Tolkien. He tenaciously disliked allegory, and he continually denied – as invalid, as definitely not his deliberate intention – all comparisons made between The Lord Of The Rings and the two World Wars [1]. Even so, evident in Tolkien’s storytelling, upon closer reading, are his concerns for the environment, polluted and abused by industrial development, his disquiet for the human race, stricken as it is with the potential for hubris and evil, and his belief that hope still exists for us, exemplified best by his beloved characters, the hobbits. Do we attribute concerns like these to something from his life, say his Catholicism? Does it even matter, since they’re genuine concerns whether you’re Catholic or not? His books express what they express, that much is evident. What need to pursue Tolkien’s motives or intentions when we’re able to glean something meaningful for ourselves? So far as there’s room for allegorical interpretation, it’s merely an intellectual luxury. Alright, then: Sauron is not a deliberate parallel either to Hitler or to Satan; there is nothing we are meant to spy of either Jesus Christ or an infantryman in Samwise Gamgee. The list goes ever on, but according to the man himself, Tolkien intended none of it for us to correlate in any direct way. So I will take him at his word.

Yet how could Tolkien have written any of his stories without them somehow reflecting his background and beliefs? The case has been made, apart from any allegory, that his life necessarily had an impact on his storytelling. He remembered the Somme in dreadful detail, while he wrote parts of The Lord Of The Rings during the London blitz. The very substance of his creative writing is as clear a reflection of Tolkien, the man, as any allegory we may seek to attribute to him. The Hobbit was born of bedtime stories for his children, adventures he felt they would enjoy, undertaken by characters he thought they would like. We might even conclude that no other books were possible from Tolkien apart from the ones he produced. Tolkien himself described the entire phenomenon of Middle Earth as arising from his professional passion, philology, and his desire to express and share his passions in a tangible way. Sounds like art to me.

For a superb exposition on why to bother specifically with poetry, check out this article. For a briefer but still informative meditation, check out this one.

  • By over-analysing literature, aren’t we unfairly putting words in someone’s mouth?

Not really, no, and what’s more, be careful – asking this question is like giving away the freedom to decide for yourself. Now, I know when my students asked, they just didn’t want to do any thinking or work, and teens will be teens. But anyone seriously considering this question owes it to the rest of us, if not themselves, to reconsider.

As I mentioned above about Tolkien, he had intentions and motives behind his storytelling – who doesn’t? – irrespective of us knowing what they were. After it was published, though, Tolkien was famously bemused and upset by those in the audience who turned The Lord Of The Rings into an LSD experience, and by those who made it a screenplay. But whose problem is that? Copyright laws protect content and intellectual property, but heaven knows we’re unable to stop people from thinking. Attempts might be made to control what is available to think about, or how what’s available is presented, or worse, how people think, period. We ignore such detail at our peril because, whether we’re for or against a particular interpretation, how and why something is offered is at least as important as what. This is a whole ’nuther topic.

But, as to Tolkien’s problem, the only substantive recourse would have been not to publish his story at all. That balance between private creativity and public expression is inescapable, unless you’re a hermit, I guess. Be as creative as you want, go crazy, but don’t forget that creativity for an audience means some lost degree of autonomy because everybody has an opinion. I don’t know if it’s a zero-sum equation, but there is a sliding scale from artist to audience, as far as we’re considering any kind of interpretive control. What else is criticism but artistic interpretation, Frye’s third language? The critic is an artist, sometimes a very good one, and there’s something to be said about the decline of public discourse (as compared to debate over the decline of public intellectualism, which is a different topic). But opinions are as numerous as the people who possess them, and sometimes just as popular.

For artists like Shakespeare and Tolkien, who have since died, this question of putting words in their mouths is moot anyway. And, as I mentioned above, once the creative process ends, once the artist decides it’s time to stop creating and start exhibiting, they rather exit the equation of their own accord. Even an explicit interpretive explanation from the artist during an interview or in a press release or on the back of a napkin is itself subject to interpretation. Bias is inescapable (and makes life worth living, so long as it’s checked by responsible morality). An artist is a catalyst, in this respect, nothing more. There are bound to be audience members who defend an artist’s interpretation of X-Y-Z, just as there are bound to be members who dispute it. But aside from any intended message, the artist inevitably cedes interpretation to the audience. There is no other way.

What of the creation itself, the artefact or text that contains the message – or should we say, by which the message is conveyed? As the medium, it enables us to experience a relationship with the artist by proxy – this avenue has led to some fascinating cultural restoration that far exceeds in scope and import anything I offer here. It has also yielded some curious responses on behalf of artists by other artists. Still, for some others, an object of art somehow literally has its own “words in its mouth,” as people imbue an artefact with a life all its own. To me, that’s the same as putting words in the artist’s mouth because, again, the kinds of artefacts I’ve had in mind here are not living, breathing things but more conventional creations such as books, paintings, songs, and sculptures.

I suppose someone out there will know of a living, breathing artefact or text in this more conventional sense of art – fair enough, although we then fall into discussion over a definition of “art,” which I’d argue has intentionality as its basis, and intentionality necessarily implicates the artist, not the artefact. Any conversation that might occur between an audience and an artefact or text would still depend on the frame of mind and experience of the audience, after first depending on the creative will of the artist. And with no two relationships ever the same, every person unique, an art experience is by its very nature an intimate, personal phenomenon. Even when someone experiences the same art piece over and over, the interpretation will vary: a movie is different depending on your mood each time you watch it, say, or your age. In that sense, you’re putting words in your own mouth from the last time you watched it!

The same idea put a different way: I once heard from a Jimi Hendrix fan that he liked listening to songs over and over because he heard new things each time, stuff he had missed those other times. Then, after taking up guitar ten years later, he listened to the same songs with yet another new appreciation, since his understanding of guitar playing had grown more sophisticated. It’s an easy example to grasp, and it respects both sides of the medium – in this instance, guitar players and guitar listeners.

So yes, then, we can respect this question asked on behalf of an artist, that of putting words in their mouth, but it risks silencing your own words, which, these days especially, is anathema. When an audience gets involved, a published or exhibited creation of any kind takes on a new, unpredictable existence – many, in fact, each one personally derived by those who partake. As I briefly acknowledged above, this discussion can encompass not just objects or texts or songs but, more broadly, ideas, beliefs, and cultural practices. But I leave the details of any such areas for others with the appropriate nuance and expertise to discuss.

Finally, an anonymous quotation – if anybody knows the source, please say so! – on the topic of literary interpretation and criticism in general…

“One way students learn to interpret fiction, poetry, and drama is through an understanding of the conventional elements of literature. Another way to deepen our appreciation and enhance our understanding of literature is to learn how readers have interpreted it over the years since it was written. Therefore, reading and discussing the ideas of scholars who have studied particular authors, periods, or genres provide us with knowledge that enriches all our reading experiences. Likewise, reading literature with an awareness of specific critical views of the major schools of literary theory – such as mythological, historical, psychological, reader-response, feminist, or deconstructive perspectives – further expands our reading experiences. Through knowledge of the historical impact of literature and the major literary theories, students of literature learn to appreciate a work of literature for its social, moral, or spiritual worth – in other words, its cultural value.”

It’s nothing so ground-breaking, nor, as I mentioned, music to the ears of students who just don’t want to study. Yet this response underscores a most fundamental precept of academics across every discipline, just as it pleads for culture: the sincere stewardship of our stores of knowledge, inherited, refined, and passed on over time to future generations, is an artistic task of creativity – that is to say, a shared process – and we all therefore bear a measure of responsibility. If that’s not a compelling “motive for metaphor,” then I don’t know what else to say!

I hope this helps to address some of the more common issues and questions that people raise regarding the analysis of literature, specifically, and of art in general.

[1] The Letters of J.R.R. Tolkien, ed. Humphrey Carpenter with Christopher Tolkien. London: George Allen and Unwin, 1981.