What On Earth Were They Thinking?

How often have you heard somebody question people who lived “back then,” in the swirling historical mist, who somehow just didn’t have the knowledge, the nuance, or the capability that we so proudly wield today?

“Things back then were just a lot more simple.”

“They just weren’t as developed back then as we are today.”

“Society back then was a lot less informed than we are today.”

It’s difficult to confront such statements out of their context, but having had all three of these said to me in separate conversations, I challenged each as an insinuation that we’re somehow smarter than all the peoples of history, more skilled, more sophisticated, more aware, more woke (as they say, these days), that we’re, in the main, altogether better than our elders merely for having lived later than they did. These days, apparently, “we’re more evolved” – ha! 🙂 More evolved, that’s always a good one, as if back then everyone was Australopithecus while here we are jetting across the oceans, toting our iPhones, and drinking fine wines. Well, sure, maybe things have changed since back then, whenever “then” was. But, more typically, I’ve found that contemporary judgments levelled upon history are born of an unintended arrogance, a symptom of 20/20 hindsight and the self-righteous assurance that, today, we’ve finally seen the error – actually, make that the errors – of their ways.

Surely, these days, few – if any – would believe that we’re ignorant or unaccomplished or incapable, not the way someone looking back at us from our future might say. At any given point on the historical timeline, I wonder whether a person at that point would feel as good about their era, looking back, as a person at some other point of the timeline would feel, also looking back, about theirs. Is it a common tendency, this judgment of contemporary superiority? These days, we might well feel superior to people who had no indoor plumbing or viral inoculations or Internet access, just as someone earlier would have appreciated, say, some technological tool – a hydraulic lift to raise heavy objects, or a set of pulleys, or a first-class lever – choose whatever you like! It really is arbitrary for a person, looking back at history, to feel better about contemporary life: contemporary life infuses all that person’s experience while history is something that must be learnt.

I’d say that learning history means learning whatever has lasted and been passed on because what has lasted and been passed on was deemed to have merit. We’re taught the history that has been deemed worth remembering. What I’ve found has been deemed worth remembering (i.e., the kinds of things I learned from History classes) are the general mood of the times, the disputes and fights that occurred (violent or academic), a select handful of the figures involved, and an inventory of whichever non-living innovations and technologies arose alongside it all. If, later, we demerit and no longer pass on what has lasted up until then, passing on instead some different history, then that’s entirely indicative of us, now, versus anyone who came before us, and it reflects changed but not necessarily smarter or better priorities and values.

For me, we shouldn’t be saying we’re any smarter or better, only different. So much literature has lasted, so much art. Commerce has lasted, institutions have lasted, so much has lasted. Civilization has lasted. Cleverness, ingenuity, shrewdness, wit, insight, intellect, cunning, wisdom, kindness, compassion, deceit, pretence, honesty, so many, many human traits – and they all transcend generations and eras. People vary, but human nature perdures. I’ll trust the experts, far more immersed in specific historical study than I am, to identify slow or subtle changes in our traits – hell, I’ll even grant we may have culturally evolved, after all – and I can only imagine how many ways the world differs now from the way it was before. But what does it mean to be better? Better than other people? Really? And who decides, and what’s the measure?

We can measure efficiency, for instance, so to say technology has advanced and is better than before is, I think, fair. Even then, an individual will have a subjective opinion – yours, mine, anybody’s – making culture not proactive and definitive but rather reactive and variable, a reflection, the result of composite opinions that amplify what is shared and stifle what is not. Since we’re taught merited history, we’re almost forced to concur, at least until we reconsider what has merit. That’s a sticking point because everyone will have an opinion on what is culturally better and what is culturally worse. Morality inevitably differs, and suddenly we have ethical debate, even disagreement, even discord. But to say people or culture are better is, I think, too subjective to rationalize and a questionable path to tread.

Consider this as well: we each know what we’ve learned, and as individuals, we’ve each learned what we value. But what we’re taught is what’s broadly valued and, thereby, prescribed for all. We’ve all heard that rather hackneyed epigram, that those who neglect history are doomed to repeat it. Well, maybe the ones screwing up just didn’t learn the right history to begin with. I tend to abide by another hackneyed epigram, that they are wisest who know how little they know. Real historical wisdom and real historical understanding would be seeing and thinking and understanding as people earlier did. But short of firing up the DeLorean for an extended visit some place, some time, it seems to me that judgments about history are made with an aplomb that might be better aimed at acknowledging our finite limitations. We’re no angels. If anything, this error of judgment speaks volumes about us. Condescension is what it is, but in my opinion, it’s no virtue.

We should hardly be judging the past as any less able or intelligent or kind or tolerant or virtuous than us, especially not if we aim to live up to today’s woke cultural embrace of acceptance. Being different should never be something critiqued; it should be something understood. Conversely, passing judgment beyond what we actually know is assumptive, and we all know what Oscar Wilde reputedly had to say about assuming (at least, we know if we’ve studied that piece of history). At the very least, we ought to admit our own current assumptions, mistakes, errors, accidents, troubles, disputes, and wars before we pass any judgment on historical ones.

On that positive note, I will say that considering all this has prompted me to notice something maybe more constructive: so often, at least in my experience, what we seemed to study in History class were troublemaking causes and effects, bad decisions, and selfishly motivated behaviours. Far more rarely was History class ever the study of effective decision-making and constructive endeavour – maybe the odd time, but not usually. Maybe my History teachers were, themselves, stifled as products of the system that educated them. What could they do but pass it along to me and my peers? Considering that, I might more readily understand how people, alive today, could conclude that all who came before were simply not as enlightened, as sophisticated, or as adept as we are now.

Yet that merely implicates contemporary ignorance: assumptions and mistakes still happen, errors still occur, accidents – preventable or not – haven’t stopped, troubles and disputes and wars rage on. If the axiom of being doomed to repeat history were no longer valid, we wouldn’t still feel and accept it as a truthful description, and it would have long ago faded from meaning. All I can figure is that we’re still poor at learning from history – the collective “we,” I mean, not you in particular (in case this essay was getting too personal). We need learned people in the right positions at the right times if we hope to prevent the mistakes of history. Not enough people, I guess, have bothered to study the branches of history with genuine interest. Or, no, maybe enough people have studied various branches of history, but they don’t remember the lessons sharply enough to take them on board. Or, no, no, maybe plenty of people remember history, but the circumstances they face are just different enough to tip the scale out of favour. Epigram time: history doesn’t repeat, but it does rhyme. Or maybe we’re just full of ourselves, thinking that we’ve got it all solved when, evidently, we don’t.

It also dawned on me, considering all this, that high school “History” influences what many people think about broader “history.” My high school experience, university too, was mostly a study of politics and geography, plus what would be considered very elementary anthropology – all this as compared to the other branches of historical study. Archaeology and palaeontology come to mind as detailed, more scientific branches of history, but there are so many – literary history, philosophical history, religious, environmental, military, economic, technological, socio-cultural as I’ve already indicated, on and on they go, so many categories of human endeavour. I’ve even come across a thoughtful paper contemplating history as a kind of science, although one that is normative and susceptible to generational caprice. One final epigram: history is what gets written by the winners.

And that’s really the point here: throughout what we call human history, where we’ve subdivided it so many ways (right down to the perspective of every single person who ever bothered to contribute, if you want to break it down that far), it’s people all the way back, so it’s all biased, so it’s neither complete nor even accurate until you’ve spent oodles of time and effort creating a more composite comprehension of the available historical records. And, dear lord, who has time for that! History, in that respect, is barely conceivable in its entirety and hardly a thing to grasp so readily as to say, simply, “Back then…” History is people, and lives, and belief inscribed for all time. To know it is to know who lived it as well as who recorded it. Knowing others is empathy, and empathy is a skill trained and fuelled by curiosity and diligence, not emotion or opinion. Emotion and opinion come naturally and without effort. For me, valid history derives from informed empathy, not the other way around.

As far as recording history for future study goes, ultimately, it will have been people again recording and studying all of it, “it” being whatever we care to remember and record about what somebody was doing, and “doing” being whatever people were doing to attract the attention of those doing the recording. It’s all a bit cyclical, in itself, and completely biased, and someone will always be left out. So people might be forgiven when shaking their heads in judgment of the past because, without empathy, what else could they possibly know besides themselves?

Suppose Trivial Grammar Were Actually the Road Map

Satire compounds disagreement, and ridicule has a way of locking closed doors

Sometimes, the hardest part of teaching felt like finding a way to reach students when they just didn’t get it. But if there’s one thing I learned while teaching, it’s that it takes two. In fact, the hardest part of teaching was coming to realise it wasn’t them not getting it; it was me not getting them. In my own defence, I think we can just never know what another person’s motive truly is. It was at times like those that I realised the true constructive value of respect and a good rapport. To have any hope of being open-minded, I intentionally needed to respect my students’ dignity, and I needed to be more self-aware as to how open- or closed-minded I was being. Humility has that way of being, well, humbling. These days I’m still fallible but a lot better off for knowing it. And, yes, humility’s easier said than done.

Over sixteen marvellous years teaching secondary English in a high school classroom, I learned that teaching is a relationship. Better still, it’s a rapport. I learned that it takes two, not just hearing and talking but listening and speaking in turn, and willingly. And, because bias is inescapable, I learned to consider a constructive question: what motives and incentives are driving anyone to listen and speak to anyone else? It has an admittedly unscrupulous undertone: what’s in it for me, what’s in it for them, who’s more likely to win out? The thought of incentives in high school likely evokes report cards, which is undeniable. But where listening (maybe speaking, too) to some degree means interpreting, what my students and I valued most was open-minded class discussion. With great respect for our rapport, we found the most positive approach was, “What’s in it for us?” The resulting back-and-forth was a continual quest for clarity, motivated on everyone’s behalf by incentives to want to understand – mutual trust and respect. Looking back, I’m pleased to say that tests and curricula seldom prevented us from pursuing what stimulated us most of all. We enjoyed very constructive lessons.

Of course, we studied through a lens of language and literature. Of particular interest to me was the construction of writing, by which I mean not just words but the grammar and punctuation that fit them together. My fascination with writing has been one of the best consequences of my own education, and I had encouraging writing teachers – one of them very demanding. In the classroom and on my own, I’ve always been drawn to structure as much as content, if not more so, which isn’t unorthodox although maybe not so common. The structure of writing gets me thinking on behalf of others: why has the writer phrased it this certain way? What other ways might be more or less well-suited for this audience? How might I have phrased something differently from this writer, and why? Most English teachers I know would agree that pondering such questions embodies a valuable constructive skill, these days trumpeted as critical thinking. I’d argue further that it’s even a pathway to virtue. Situated in context, such questions are inexhaustible, enabling a lifetime of learning, as literally every moment or utterance might be chosen for study.

In that respect, we loosely defined text beyond writing to include speech, body language, film, painting, music, architecture – literally any human interaction or endeavour. I’ll stick mostly with listening and speaking, reading and writing, just to simplify this discussion. The scope being so wide, what our class really sought to consider were aim and intention. So when students read a text for content, the WHAT, I’d ask them to consider choices made around vocabulary, syntax, arrangement, and so forth, the HOW. That inevitably posed further questions about occasion and motive, the WHY, which obliged varying degrees of empathy, humility, and discernment in reply: for a given writer, what makes for effective writing on a topic and, for a given audience, what makes for skillful reading? What motives are inherent to each side of the dialogue? What incentives? These and others were the broader-based “BIG Question” objectives of my courses. They demanded detailed understanding of texts – heaven knows we did plenty of that. More importantly, the BIG Questions widened our context and appreciation even while they gave us focus. When times were frustrating, we had an answer for why studying texts mattered. Questions reflect motivation. Prior to exercising a constructive frame-of-mind, they help create one.

Questions, like everything else, also occur in a particular context. “Context is everything,” I would famously say, to the point where one class had it stencilled for me on a T-shirt. So much packed into those three plain words – everything, I suppose. And that’s really my thesis here: if we aim to be constructive, and somehow do justice to that over-taxed concept, critical thinking, then we need to be actively considering what we hear and say or read and write alongside other people, and what it all makes us think for ourselves – especially when we disagree. (Is active thinking the same as critical thinking? I’m sure the phrase is hardly original, but I’ll consider the two kinds of thinking synonymous.) During my last three or four years in the classroom, all this came to be known by the rallying cry, “Raise the level of discourse!” These days, however, the sentiment is proving far more serious than something emblazoned on a T-shirt.

I’m referring, of course, to the debacle that has been the 2016 U.S. Presidential election and its aftermath. Specifically, I have in mind two individual remarks, classic teachable moments inspired by current events. The first remark, from an NPR article by Brian Naylor on the fallout over the executive order halting travel from seven predominantly Muslim countries, is attributed to the President. The second remark is a response in the comment section that follows Naylor’s article, representative of many commenters’ opinions. To begin, I’ll explain how something as detailed as grammar and punctuation can help raise the level of discourse, especially with such a divisive topic. From there, I’ll consider more broadly how and why we must always accept responsibility for our active use of language – sometimes correct grammar should matter not just to nit-pickers but to everybody.

In the article (February 8, 2017), Brian Naylor writes:

“Trump read parts of the statute that he says gives him authority to issue the ban on travel from seven predominantly Muslim nations, as well as a temporary halt in refugee admissions. ‘A bad high school student would understand this; anybody would understand this,’ he said.”

We all know the 45th U.S. President can be brusque, even bellicose, besides his already being a belligerent blundering buffoon. This comment was received in that light by plenty, me included. For instance, by classifying “bad” (versus “good”), the President appeals at once to familiar opposites: insecurity and self-worth. We’ve all felt the highs and lows of being judged by others, so “bad” versus “good” is an easy comparison and, thereby, a rudimentary emotional appeal. However, more to my point, his choice to compare high school students with lawyers, hyperbole or not, was readily construed as belittling since, rationally, everyone knows the difference between adult judges and teenaged students. That his ire on this occasion was aimed at U.S. District Judge James Robart should not be misunderstood. Ironically, though, the President invokes the support of minors in a situation where they have neither legal standing nor professional qualification, rendering his remark not just unnecessarily divisive but inappropriate, and ignorant besides – although he must have known kids aren’t judges, right?

To be fair, here’s a slightly longer quotation of the President’s first usage of “bad student”:

“I thought, before I spoke about what we’re really here to speak about, I would read something to you. Because you could be a lawyer – or you don’t have to be a lawyer: if you were a good student in high school or a bad student in high school, you can understand this.”

Notice, in the first place, that I’ve transcribed and punctuated his vocal statement, having watched and listened to video coverage. As a result, I have subtly yet inevitably interpreted his intended meaning, whatever it actually was. Yet my punctuation offers only what I believe the President meant, since the marks are mine.

So here’s another way to punctuate it, for anyone who feels this is what the President said:

“Because you could be a lawyer, or you don’t have to be a lawyer – if you were a good student in high school or a bad student in high school, you can understand this.”

Here’s another:

“Because you could be a lawyer. Or you don’t have to be a lawyer. If you were a good student in high school or a bad student in high school, you can understand this.”

Finally, but not exhaustively, here’s another:

“Because you could be a lawyer… or you don’t have to be a lawyer; if you were a good student in high school or a bad student in high school, you can understand this.”

Other combinations are possible.

Rather than dismiss all this as pedantry, I’d encourage you to see where I’m coming from and consider the semantics of punctuation. I’m hardly the only one to make the claim, and I don’t just refer to Lynne Truss. Punctuation does affect meaning, both what was intended and what was perceived. To interpret the President’s tone-of-voice, or his self-interrupting stream-of-consciousness, or his jarring pattern-of-speech, or whatever else, is to partly infer what he had in mind while speaking. We interpret all the time, listening not only to words but tone and volume, and by watching body language and facial expression. None of that is typically written down as such, except perhaps as narrative prose in some novel. The point here is that, in writing, punctuation fills part of the interpretive gloss.

Note also that a number of news headlines have inserted the word “even” – an interpreted addition of a word the President did not actually say. Depending upon how we punctuate his statement, inclusive of everything from words to tone to gestures to previous behaviour, perhaps we can conclude that he did imply “even” or, more accurately, perhaps it’s okay to suggest that it’s what he intended to imply. But he didn’t say it.

If we’re going to raise the level of discourse to something constructive, we need to balance whatever the President intended to mean by his statement against what we’ve decided he intended to mean. In the classroom, I put it to students this way: “Ask yourself where his meaning ends and yours begins.” It’s something akin to the difference between assuming (based on out-and-out guesswork because, honestly, who besides himself could possibly know what the President is thinking) and presuming (based on some likelihood from the past because, heaven knows, this President has offered plenty to influence our expectations). Whatever he meant by referring to good and bad students might be enraging, humbling, enlightening – anything. But only if we consider the overlap, where his meaning ends and ours begins, are we any better off ourselves, as analysts. Effective communication takes two sides, and critical thinking accounts for both of them.

Effective, though, is sometimes up for debate, not merely defining it but even deciding why it matters. Anyway, can’t we all generally figure out what somebody means? Isn’t fussing over details like grammar more about somebody’s need to be right? I’d argue that taking responsibility for our language includes details like grammar precisely so that an audience is not left to figure things out, or at least so they have as little ambiguity to figure out as possible. Anything less from a speaker or writer is lazy and irresponsible.

In the Comments section following Naylor’s article, a reader responds as follows:

“Precisely describing Trump’s base…bad high school students who’s [sic] level of education topped out in high school, and poorly at that. This is exactly what Trump and the GOP want, a poorly educated populous [sic] that they can control with lies and bigoted rhetoric.”

Substantively, the commenter – let’s call him Joe – uses words that (a) oversimplify, blanketing his fellow citizens, and (b) presume, placing Joe inside the President’s intentions. Who knows, maybe Joe’s correct, but I doubt he’s clairvoyant or part of the President’s inner circle. On the other hand, we’re all free to draw conclusions, to figure things out. So, on what basis has Joe made his claims? At a word count of 42, what was he aiming to contribute? Some of his diction is charged, yet at a mere two sentences, it’s chancy to discern his motives or incentives, lest we be as guilty as he is by characterising him as he characterises the President. Even if I’m supportive of Joe, it’s problematic defending his remarks for the same reason – they leave such a gap to fill. At 42 words, where he ends is necessarily where the rest of us begin, and maybe I’m simply better off ignoring his comment and starting from scratch. Maybe that’s fine, too, since we should all have our own opinions. In any event, Joe has hardly lived up to any measure of responsibility to anybody, himself included – here I am parsing his words months later in another country. I’d even say Joe loses this fight since his inflammatory diction and sweeping generalisations play into his opponents’ hands, if they so choose. Unsurprisingly, Joe’s comment is not at all constructive.

For all its faults, his comment aptly demonstrates the two-way nature of dialogue. On the one side, responsibility falls to each reader or listener to bring their research and experience, then discern for themselves what was meant. In that regard, Joe has left us with a lot of work to do, if we’re motivated enough to bother. Yet I chose his particular comment as mere illustration – literally hundreds of others, just as brief and labour-intensive, scroll by below Naylor’s article… so much work for us to do, or else to dismiss, or perhaps to gainsay, if not insult. On that note, consider the other side: responsibility falls to the speaker or writer to offer substantive claims as well as the evidence that prompted them. In this instance, no matter the justification for offering something at all, what can a two-sentence comment add to issues as complex and long-standing as, say, Presidential politics? Whether or not on-line comments are democracy in action, certainly offering 42 words in two sentences struggles to promote a meaningful, substantive exchange of ideas.

For my students, I used to liken such on-line comments to debating with strangers while standing in line for coffee, before returning to our cars and our lives, none the more informed, except perhaps annoyed by some and appreciative of others. With the best intentions, we might excuse people, overlooking that we’re the ones who walked out and drove away – maybe we were late for work that day. We’ve been closed-minded to the degree that we haven’t sought to reach a thorough understanding, and certainly we’ve failed to raise the level of discourse. Would it have been better to just say nothing, grab our coffee, and leave?

Yes, I think so, which may not be easy to accept. Consider, though, that reasoning from presumption and enthymeme is not reasoning at all. Further, consider that two sentences of 42 words or a few minutes spent chatting in the coffee line will barely scratch the surface. Who can say what motivates people to contribute so readily yet so sparsely? Recent times are emotional, growing more volatile and potentially far more dangerous as a result. We see in Joe’s comment, and so many others like it, that trust and respect are divisively encased in separate echo chambers. In an atmosphere of us versus them, both sides struggle to be open-minded.

Worse, the so-called era of “post-truth” impedes exactly the constructive dialogue we need right now, raising ire and diatribe in place of substance and equanimity. Satire compounds disagreement and grows that much more venomous, and ridicule has a way of locking closed doors. I don’t support proceeding from pretence or unfounded opinion – there’s nothing whatsoever to show for an exchange-of-opinion based on falsehood. The burden of post-truth is far too high. A bias and the truth can co-exist, and they do, guaranteed – one truth, objective, and one bias per person, subjective. Bias is an inevitable fact of existence. Left unchecked, bias crowds out respect, which is why a constructive approach is so crucial. As I’ve said elsewhere, post-truth is anti-trust, at least for me, and, at its furthest extent, a threat to civil security, which sounds alarmist – good, let it. We need to attend to this. More than ever now, we need respect or, failing that, at least greater tolerance. That’s for starters.

Worse still, in this post-truth world, fictional claims face no arbiter but the other side so distrusted and maligned. The kind of polarised situation made infamous in Washington, DC is spreading, realised in a zillion on-line comments like Joe’s with every article published. Hopefully not this one, unless maybe someone hasn’t actually read this. On such a perilous path – facts in dispute, emotions enflamed – each side qualifies “open-minded” as unique to themselves and misappropriated by the rest. That’s deeply divisive, the very recipe for unrest that I spy, and it sounds my alarm. In that divided state, with nothing left to discuss, even as reality has its way of catching up, what damage might already be done? Especially when facing fellow citizens, whatever we choose now must accord with what we’re prepared to accept later. Let that sober thought sink to the core because the less we share common goals, the more we’re set to clash over unshared ones. But it’s within us to converse and to converge.

Let’s be willing to listen with empathy, understand with compassion, research with diligence, and respond with substance. Do your own investigation. Accept responsibility to inform yourself. Yes, take what you find with a grain of salt until you can believe to your own satisfaction what is right and trustworthy. Yet, even then, be tolerant if not respectful of others – too much salt is harmful. We all have our own motives and incentives for listening and participating, so let’s dig deeper than how pissed off we are with the other side: walking the high road with pride or smug assurance is really the low road and a path of hubris. It’s closed-minded, but not in the sense that we haven’t sought to reach a thorough understanding of the other side. It’s closed-minded to the degree that we haven’t sought to understand how and why the other side reached their position to begin with.

None of this is hard to understand. Once upon a time, we decided that education mattered, and it’s no accident that the trivium – grammar, rhetoric, dialectic – was given a central role. These days, its association with niche markets, notably private Christian education, is enough to switch some people off, which sadly exemplifies this entire discussion. I believe classical education is valuable for all. We’ve neglected it to our detriment, perhaps to our peril. We have a lot in common, more than we might credit, with our neighbours and fellow citizens. It’s not like they grew up on Mars. We’re not significantly different – hands up if you’re a human being. Start with that, some basic human dignity.

There’s a lot to be offered by rapport in our relationships, and little to expect without it. All we can do is understand the other person’s interpretation, and they ours, and go from there – or else not. And it’s easy to nod and say, “I already do that while others do not.” But reflect upon yourself anyway, in every conversation, debate, or exchange. Humility is a virtue, even when kept low-key. Everybody bears responsibility for their own participation. The more we live up to being respectful, even of those whom we oppose, the more progress we’re liable to make – however slowly it might happen.

As I said at the outset, yes, humility’s easier said than done. But by the same token, why write this essay if 42 words would do? We must neither hide ourselves away nor proceed prematurely. We must be able to discern flaws of reason, and we must be able to communicate with humility if we aim to deliver – and, more critically, if we hope to be received – from a place of thoughtfully considered understanding. Whether or not we truly trust one another, let’s help put the logos back in dialogue and accept our own responsibility to approach people with intentional self-awareness. Let’s seize the opportunity to be role-models – you just never know what somebody else is thinking. Let’s raise the level of discourse. And let’s remember that taking the high road must be open-hearted as well as open-minded.

The Three Appeals Must Be Measured

Have you seen this? This photo was published via Twitter by The Humane League, an advocacy group who “[work] relentlessly to reduce animal suffering through grassroots education to change eating habits and corporate campaigns to reform farm animal treatment.” In a broader respect, their aims relate to mine here on this blog – that is, to reach people by way of their decision-making in order to effect change in our (what I take to be cultural) behaviour. They posted this photo on-line, and I’ve had it show up in my Facebook news feed after a friend “likes” it. Twice when it appeared, I posted my thoughts, which I share now for the third (and last) time.

The statement on the sign is clear enough, but what is the process behind enacting the words, specifically “we” and “use” and “leaves”? “We” who, which land owners, which legislators, in which jurisdictions, incentivized how, by whom, coordinated in what manner, measured by what instruments or methods, affirmed by which authorities… the list of questions goes on. The statement on the sign is clear enough, but the sought-for outcome seems way more complicated to achieve than simply eating less meat because Paul Rudd holds a sign.

How could any coordinated effort ever be mounted, any assurance given that you not eating meat, and me not eating meat, and him, and her, and them, any of us… how could we ever know we were making an active difference versus just randomly taking turns not eating meat? On Mondays. Surely, if it were this easy… Meanwhile, the rest of the world who missed Paul Rudd’s signs and movies eats meat.

He ought, at the very least, to hold up a sign with instructions for organising or joining an effort to set aside land as a reserve. Wouldn’t it require financial incentives built in for whoever would otherwise use the land, i.e., to replace the revenue they were getting from this particular sort of agriculture? There’s one more word on the sign that is far too vague: “agriculture.” Specify… Wouldn’t changing the control of the resources, not the consumption, be a more directly effective approach? This ad seems to oversimplify, and that undermines whatever might be worthwhile about the issue.

But further, and maybe more troublesome, oversimplifying does little to flatter the ad’s proponents beyond portraying some collective feel-good wishing for a better world, which would seem to play right into the hands of anyone whom they’re disputing. Farms and land-use and pollution and economics and history are realities to be dealt with responsibly, realities that need intentional, incentivized discussion among however many sides there are. Reality is life – decisions and consequences. If certain land-use can be proven ill-suited, or destructive, whatever the case, then approach with efforts on that basis, but don’t undermine the issue and waste valuable time and attention with fame-dropping. That must be very insulting to people who work hard and earn their livelihood farming, the very people who need to be engaged in the discussion.

It seems to me that employing celebrity appeal (financially or colloquially) as the method for touching an audience emotionally is most responsibly done when it’s also stabilized, or held accountable, by a detailed plan – doing for the message what roots do for a plant, both gathering water and nutrients and anchoring it in place. Emotions stirred up by reputation ought to be channelled or marshalled or else they could run rampant, and suddenly the whole message gets lost in the maelstrom – or maybe better to say, a new message takes hold. However, responsibility and effectiveness do not always converge.

This photo relies on [whomever]’s favour for a given actor – Paul Rudd, of all people, but it might have been anybody. So at least they chose a fairly peaceful, fun-loving actor and not someone more polarizing or controversial. Evidently, Paul Rudd consented to pose for this (or else hasn’t complained that he was photoshopped? …doesn’t matter, not the point), but in any case, The Humane League believed he’d appeal to the audience they sought. Doubtful they were seeking me, even though (a) I’ve always liked Paul Rudd’s acting and (b) I can sympathise with animal advocacy. But whomever they were seeking, they seem to rely upon the appeal of an actor to suffuse their ad with substance. And if I’m wrong on that point, if The Humane League can point to some ongoing program that addresses what I’ve raised, then this ad is not only wasted effort but unnecessarily provocative, and irresponsible for what it might stir up in spite of real programming.

The Humane League, professing that a “sense of purpose drives every action in our unrelenting march forward…,” was not an organisation I knew of prior to their ad showing up in my news feed. Despite their advocacy, they worry me – not about a march that’s unrelenting so much as a march that’s unchecked, which is why I bothered posting three times. I’m hoping to reach an audience, too, not with purpose merely sensed but more consequentially realised.