Category Archives: Diction

Reclaiming the “narrative”

In the wake of the Mueller report on the Trump-Russia investigation, the White House and its allies were quick to strike out against the liberal and mainstream media for their coverage of the two-year inquiry. The scrolling news ticker on Fox News mocked the liberal media’s “collusion narrative,” and an op-ed in USA Today gloated that “the drummed-up narrative of collusion has now imploded.” The president must have especially relished the blaring headline on The Hill’s front page, which read “Treason Narrative Collapses.” His fortunes reversed, Trump lost no time in flinging that charge of treason back on the liberal press. It seemed the president had taken control of the “narrative.”

Except, of course, he hadn’t. Neither, certainly, did the Democratic camp, despite some professions of hope; on the day Attorney General William Barr released his summary report, Politico’s Bill Scher still saw a tactical opportunity in “the narrative that Trump is a threat.” A week later, that opportunity seemed to be shrinking; critics of the president were concerned that the White House’s version of events – the “first narrative” – would be hard to dislodge, even if damaging findings against the president were revealed in the pending full report. As the New York Times put it, “because Mr. Barr created the first narrative of the special counsel’s findings, Americans’ views will have hardened before the investigation’s conclusions become public.” On the eve of the full (redacted) report’s release on April 18th, Democrats accused Barr of continuing to harden those views. As Jerrold Nadler put it, Barr was still “trying to bake in the narrative about the report to the benefit of the White House.”

Whatever the outcome, there will be no winner in this contest of so-called narratives, since the word, in its current usage, bears almost no relationship to its actual meaning. And narrative, properly understood, may have something useful to offer to politics and journalism at the present conjuncture.

In his influential essay “Permission to Narrate,” Edward Said provides an eloquent defense of narrative as a means of political self-determination. Said argues that narrative’s story-based ordering of lived experience can foster a meaningful collective vision of identity by combining historical memory with a purpose-driven orientation to the future.* However, for that same reason, he points out, narratives are often the object of competing forces that promote or suppress them. Occupied Palestine is Said’s case in point; the scholar views Israel’s persistent negation of the history of the Palestinians as a refusal to see their experience as narratively legitimate.

This focus on narrative lends a particular acuity to Said’s critique of press coverage. Writing on findings of Israeli war crimes that went largely unreported, Said says that “the findings are horrifying – and almost as much because they are forgotten or routinely denied in press reports as because they occurred.” The scholar’s claim seems almost shocking in itself, as he virtually equates tragic death and destruction on a massive scale with the “horror” of some missing lines of teletype. Far from being hyperbole or a mere argumentative ploy, Said’s rhetorical gesture shows to what extent he views narrative as being materially bound up with the forces that can authorize existence. Facts require narrative; absent a legitimating account of one’s history and purpose, even a favorable rendering of the facts can undermine the rightful claim that narrative authority confers on history and human agency (265). Narrative, then, provides an organizing and justifying rationale for one’s continuing right to exist, and it does so not only through the force of a unifying story but also by encompassing the full complexity of lived experience, including such things as “absences and gaps” (256) and an “overwhelming mess” of anecdotes, evidence and vignettes (257), even aspects of life and experience that are “prenarrative” and “antinarrative” (256).

Said’s much-missed scholarly voice combined a patient dedication to literary analysis with the restless urgency of political advocacy. His vision of narrative is very remote, to say the least, from its usage in contemporary mediaspeak, where the word’s meaning has shriveled into a synonym for “messaging,” “spin” and “disinformation.” As a result of this semantic demotion, we risk losing critical traction on terrain where narrative study and a sense of narrative’s political value – call it literacy – can help to advance progressive causes.

Should anyone doubt the continuing relevance of Said’s analysis in “Permission to Narrate,” the fallout from the release of the Mueller report’s main findings on March 25th provided sobering proof. On the very day that the White House asserted control of the Trump-Russia “narrative,” the president signed an order recognizing Israel’s claim on the Golan Heights; meanwhile, Israel launched an air campaign in the Gaza Strip. Reporting from Jerusalem, The Guardian’s Oliver Holmes says that “the fight over the narrative” routinely makes PR in Israel more important than information. “Unlike anywhere I’ve ever reported,” he says, “the focus here is not on what happens, but how that story is told.” Clearly, Said’s analysis remains pertinent, though the recent demotion of the term “narrative” adds a new and troubling dimension to his critique of the press. If Said could call media coverage “horrifying” for its denial of specific cases of injustice, the negation of narrative extends the range of that horror potentially very far indeed. By demoting the meaning of narrative, the chorus of voices in today’s media participates in a broader silencing which, by delegitimizing the activity of storytelling, denies us a fundamental means of human agency and political self-determination.

In other words, this is no quarrel about diction. It is not only that “narrative,” in the current vernacular, is fundamentally simplistic; one could make the claim about almost any vocable spit out by the corporate media. The more concerning problem is that the word’s connotations today are virtually always negative. To speak of a “false narrative” is to waste an adjective. In the current mediascape, “narratives” are understood to be manipulative, and willfully so; one’s own narratives are embraced with the cynicism of an ad man, while rival narratives are flatly refused. This crippled usage, unfortunately, is endemic on both the left and right. Well before the collusion theory was debunked, Robert Reich parsed the “underlying message” of the president’s attacks on the media as a “narrative” that Trump’s critics are enemies “conspiring” to undermine the presidency. “It’s a narrative,” the professor darkly warns us, “that’s showing up increasingly on right-wing websites.” In this usage, the term “narrative” is quite literally equivalent to the idea of conspiratorial propaganda, and is therefore identical to the crude way the term is wielded on the Republican side.

Reich, the would-be liberal thought leader, is thus wrong on two counts: both in his vulgar usage of the term and in his dismissal of a liberal conspiracy, which was unmasked to such disastrous effect on March 25. On that fateful day, Glenn Greenwald gave a merciless critique of Trump-Russia conspiracy talk, which devoured the liberal airwaves for two years and whose failure has now reset the white nationalist agenda. Greenwald’s scathing intervention was also exemplary in that he specifically castigated the liberal media for its poor narrative imagination: in his assessment, the treason story peddled on such platforms as MSNBC amounted to the crudest form of plot: a story, Greenwald said, worthy of a novel by Tom Clancy.

Here, as elsewhere, Republicans have managed to yank progressives rightward — the dominant trend in US politics over the past 40 years. It is not only in policy, however, but in the entire discursive imagination that leftists and progressives have yielded terrain to the fascists. This is what makes Professor Reich’s critique of a video from the National Rifle Association so tragic, as it tacitly reinforces the cynical equation of narrative discourse with sheer artifice and deceit. Beneath the disagreement, in other words, lies a common accord. And yet the stakes of the argument could hardly be higher; the video Reich refers to was, at the time of writing, one of the starkest expressions of American fascism to have been publicly voiced by an established, if utterly hateful, political lobbying organization. In the video, NRA spokeswoman Dana Loesch rails with quite terrifying menace at a left-wing cabal that, as she says, “use their media to assassinate real news. They use their schools to teach children that their president is another Hitler. They use their movie stars and singers and comedy shows and award shows to repeat their narrative over and over again.”

It might be argued that this use of the term “narrative” doesn’t reflect any fundamental change in the idea in its proper sense. However, like the demotion of “myth” in the positivist, rationalistic 19th century, the contemporary usage of “narrative” surely reflects a shift in storytelling’s cultural value and discursive authority. As scientific “techno-hype” and market logics undermine all sense of human purpose, a lived experience of time and finitude escapes us, Donna Jones argues. “We have no meaningful narrative of our lives,” asserts Jones.** True to this zeitgeist, even nominally positive attitudes to narrative are markedly inflected toward instrumental notions of its purpose. In an article last week in the Hill Times, for instance, Lisa Van Dusen bemoans the decline of narrative in the current political climate of weaponized and “engineered” stories. Unfortunately, the journalist’s own idea of narrative is tragically reductive. In Van Dusen’s account, narrative should not only be more truthful and honest than the stories currently peddled in politics, it should aspire to a fully scientific standard of veracity; to reclaim narrative, she says, is to reassert control over “empirical” information and factual data. This eminently practical vision yields a succinct definition for our age: narrative, the journalist says, consists of “chronological facts and the biographical colour or other content connecting them.” If “biographical colour” seems a concession, however halfhearted, to narrative art, it also betrays the journalist’s instrumental notion of liberal inclusivity, coming as it does right after the author’s avowed admiration for the “triumph” of Obama’s campaign story. The slip is telling; in liberal narratives as in liberal society, “colour or content” are additive, not transformative.

The point, however, is that narratives are constructive and transformative versions of reality; not because they are by nature artificial, and thus false, but because they create worlds of meaning. So nothing could be more erroneous than to claim, as the journalist does, that narratives convey “chronological facts.” Narratives are time-based, but they are anything but chronological; they portray time to the extent that it is meaningful – constructed in memory and anticipation, and with all the “absences and gaps” Said takes care to mention, without which there would be no pacing, plot or suspense, but also no rhythm or heartbeat. Neither would there be history, understood as a discursive construction of the time we share, always partial and limited, true, but without thereby being necessarily false or partisan.

These distinctions are obviously lost in the rush to promote or demolish so-called “narratives.” And if liberals, progressives and fascists are equally at fault in this problematic state of things, the fascists arguably have an advantage in the contest. Progressives and liberals will not win many battles in defending the naked truth or howling at “alternative facts.” Neither will they inspire the public by asking us to cope with fateful neoliberal inevitabilities – a chronological, managerial vision of disenchanted progress. Narrative is the original alternative to facts; it allows for a creative, transformative engagement with material reality, without thereby undermining all truth claims or yielding to cynical fabrication. The fascists may be liars, but in their euphoric delusions, willful fabrications and vicious conspiracy theories one can detect an impulse that falls well short of narrative yet shares something with all creative efforts to construct a world of meaning.

“Politics is numerous,” says Partha Chatterjee.*** Sadly, in its partisan acceptation “narrative” cannot seem to embrace a political multitude. A lonely exception is Naomi Klein, who sees narrative as the key to a mass movement on a global scale. “The urgency of the climate crisis,” Klein says, “could form the basis of a powerful mass movement, one that would weave … a coherent narrative about how to protect humanity from the ravages of both a savagely unjust economic system and a destabilized climate system.” **** A livable world, in Klein’s account, is a meaningful one, and its unifying “coherence” is at once political and narrative. That is a story worth sharing.


* Edward Said, “Permission to Narrate,” in The Edward Said Reader (New York: Knopf, 2007).

** Donna V. Jones, “Inheritance and Finitude: Toward a Literary Phenomenology of Time,” ELH 85.2 (Summer 2018): 301.

*** Partha Chatterjee, Politics of the Governed: Reflections on Popular Politics in Most of the World (New York: Columbia University Press, 2004), x.

**** Naomi Klein, This Changes Everything: Capitalism vs. the Climate (New York: Simon & Schuster, 2014), 8.


Filed under Diction, Politics of Discourse

Fresh hell

You can’t help noticing a bad grammar mistake, but it doesn’t set off an alarm until you hear it again. With repetition the error becomes the verbal equivalent of a sin; but what then of the first transgression, you wonder, and where does the fault lie if it exists only in the plural? Meanwhile, unfazed by your moral worries, the locution spreads, becomes commonly accepted; you wonder if it might enter the dictionaries. You wonder, that is, how low we can go.

“Downfall.” The word is increasingly used to mean drawback, presumably by association with downside. “The only downfall was the bathroom,” says a restaurant review on Yelp. “The only downfall is the service,” says another. In a Seattle hotel review, “the only downfall” was urine on the sheets of “one of our beds.” When so much can go wrong, it seems that a lone downfall, or a single soiled bed, can be a blessing. In this way, mentioning a “downfall” can emphasize an overall success. Out of sheer scrupulousness, apparently, a patron of La Quinta details her frustration with the hotel’s excessively soft pillows. “Nonetheless,” she cheerily concludes, “that was the only downfall!”

Since “downfall,” in its proper sense, signifies a uniquely terrible and often terminal ruination, its use in these cases can be judged hyperbolic. Add the adjective “only” and the expression seems wholly redundant. But this is where the word’s new usage parts company with its staid cousin. The chorus of downfalls in our current vernacular suggests that ruination is in fact common and ordinary; the word “only” implies that the downfall in question is just one among many potential or even infinite possible downfalls. Other phrases suggest the same: the ever-relevant “new low,” for instance, and the increasingly popular “fresh hell” — as in the phrase (rhetorical question? Or not?) what fresh hell is this? Grammar prescriptivists may scoff, but who can deny the aptness of these phrases in a time when every low point turns out to be a false bottom, like a trap door to endless stacked gallows?

A book written by Christian Marazzi in the wake of the 2008 financial crisis implies as much by predicting “repetitive downfalls” in the world economy: “Over the course of 2009, and beyond,” says the author, “we will witness the succession of a false recovery, a hiccups movement in the stock exchange followed by repetitive downfalls.”* With its attention to the language of economics, Marazzi’s volume is prescient in more ways than one; the book includes a handy appendix, titled “Words in Crisis,” that lists the key terms of contemporary financial jargon, the watchwords of our grim reality. But the language of the analysis itself, including “repetitive downfalls” and “hiccups movement,” seems to be symptomatic of crisis too, its stuttering awkwardness the sign of a hurried attempt to catch up with careering events — unless it is due to the translator, whom we imagine (why not?) as harried, wretched, underpaid, no doubt desperate to keep to her deadlines and obligatory word count. Whatever the case, the prognosis of “repetitive downfalls” in The Violence of Financial Capitalism lends support to Colin Crouch’s insight that, following its apparent demise, neoliberalism now persists in a weird state of continual “non-death,”** as well as to the grim prognosis of “permanent economic collapse” that David Wallace-Wells says is our likely fate on an increasingly overheated planet. More recently, an ominous editorial by Eugene Robinson points to “a succession of new lows” in U.S. politics and darkly predicts that “the worst is yet to come.”

As a complex of overdetermined meanings, then, “the only downfall” is a culturally valid expression. It contains a host of social and economic anxieties and is therefore “crucial enough to pass along,” as a film critic says of horror’s infectious appeal.*** Google the words “only downfall” and your first hit is likely to be a popular and widely-shared quote signed r.h. Sin: “The only downfall of having a good heart is that you’re constantly looking for angels inside of demons.” The quote suggests that the word “downfall” may have migrated from its standard meaning, glomming new ones in the process, but still remains close to the source: Sin’s “downfall” unmistakably suggests Lucifer’s fall and man’s degraded state.

No joke, then; no mistake; no exaggeration. Wordsmiths at the blast furnace, our amateur linguists who labor pro bono on Yelp, Twitter, and interminable comment threads may be right to complain. They say we’re in hell, and glad “nonetheless” that our punishment is “only” this bad, that it’s “only” a few degrees warmer, and it’s “only” just begun.


*Christian Marazzi, The Violence of Financial Capitalism, Kristina Lebedeva, trans. (Los Angeles: Semiotexte, 2010), 11. It is noteworthy that the later edition of the book, translated by Lebedeva and Jason Francis McGimsey, substitutes the word “downturns” for “downfalls” — a translation no doubt more grammatically accurate but less symptomatically true to the crisis of the writing. Indeed, the previous “error” might be said to be closer to the events the book reconstructs (the start of the Wall Street crash) as well as those of the author and his translator, both suffering its immediate aftermath. Accordingly, a truer historical account of the causes and consequences of the 2008 financial crisis might be best pursued not in the field of economics but in close reading — specifically, close reading of significant errors, including the word “downfall.” What justifies this foray of literary analysis into the study of contemporary finance? We have seen that the word “downfall” is laden with an implicit knowledge of the socio-political conditions of the present. Further, grammatical errors, due to their dense overdetermination of meanings, are close to poetic expressions in their semantic complexity, and thus to artistic expression more generally. As a result, Argentine author César Aira’s theory of interpretation can be brought to bear on the socio-economic historiography of our recent past. Indeed, Aira’s theory of historical reconstruction, according to which “art … permits the reconstruction of the real-life circumstances from which it emerged” is the only theory capable of approaching the time-frame of financial trading, whose complex operations, like dreams, can occur in mere nanoseconds. In this way, interpretation can yield results that are a great deal more concrete than linguistic meanings and mathematical figures, even if the reconstructed “particles of reality,” in their increasing detail, tend ineluctably toward the impalpable. As Aira says, “The course of events that preceded the composition [of the work of art] can be deduced from the text, in ever greater detail, as one reads it over and over again. Perceptual data is recovered in this way, but also psychological binding elements, including memories, daydreams, oversights, uncertainties and even subliminal brain flashes. The treatment of the external conditions should be similarly inclusive: the succession can be progressively enriched with particles of reality, down to the subatomic level and beyond.” See César Aira, Varamo, Chris Andrews, trans. (New York: New Directions, 2012), 45, 44.

**See Colin Crouch, The Strange Non-death of Neoliberalism (Cambridge: Polity Press, 2011).

*** James B. Twitchell, cited in Carol Clover, Men, Women, and Chain Saws (Princeton: Princeton University Press, 1992), 11.

 


Filed under Diction, Politics of Discourse

A Political Pathology

Election eve, 2016

American political discourse is rife with incoherence, from Sarah Palin’s word salads to Donald Trump’s staccato bluster. But like a word emerging from an infant’s babbling, misuse can yield a verbal coinage. George W. Bush, hardly a wordsmith, sometimes made a suggestive gaffe.

A word appeared this week that, to our knowledge, hasn’t been seen in print before. Since the new word didn’t draw any notice — vernacular linguists have their hands full lately — we point it out here. On November 2, 2016, The Guardian published a story about Republicans who were threatening to block any future Supreme Court candidate nominated by a Hillary Clinton administration. An interview with Senator Marco Rubio quoted him as saying that he wouldn’t reject such candidates in advance; unlike his intemperate colleagues, he would not, as he put it, “predispose” the nominees.

“No, I don’t believe that we should do that if they propose nominees that are good,” Rubio said. “I’m not going to go and predispose them that way.”

In spite of his denial, Rubio’s statement is equivocal at best; his qualification that the Clinton administration must offer “good” candidates signals his likely rejection of their nominees. In other words, or rather, in Rubio’s own new wording, the senator is very liable to “predispose” them.

Rubio’s solecism presumably draws on the sense of “dispose” as in to dispose of something. But the preposition of is not the only thing he has disposed of here.

Interestingly, the senator’s use of the word “predispose” seems tacitly linked to the dictionary’s standard notion of “predisposition”; in denying his Republican temperament and obstructionist leanings Rubio disavows his political “predisposition.” If this is true, the new coinage, predispose, is itself born of predisposal: the anticipatory negation of the senator’s own political character, whether through willful mendaciousness or unconscious displacement. Either way, a political pathology.

Future dictionaries may not cite this as verifiable etymology; in retracing word origins lexicographers don’t tend to plumb psychic motives. However, the authorities provide an enlightening psychological link between politics and disease in their definition of predisposition: “a liability or tendency to suffer from a particular condition, hold a particular attitude, or act in a particular way,” according to Oxford; “the state of being likely to behave in a particular way or to suffer from a particular disease,” according to Cambridge.

Our own suggested dictionary entry?

Predispose (v.): to reject something in advance; to throw out beforehand; to trash ahead of time: “Climate skeptics predisposed the future.”

 


Filed under Diction, Politics of Discourse

Begging questions

Does someone approaching the cash register beg the question, “Did you find everything OK?” Does a diner in a restaurant beg the question, “How is everything tasting?” And if there are two diners, does the second one beg the question, “… and yourself?”

The answer is no, and not only because the expressions are as misguided as they are ubiquitous. Yourself is a needless overcorrection of the more usual pronoun, probably because you is thought to be too crude or invasive, not suitably deferential. But then why the strange locution How is everything tasting?, which undoes what the word yourself attempted, the phrase invading your personal space, practically intruding on your tongue, getting all up in your grill. As for Did you find everything OK?, the phrase might be credited for its use of syllepsis, the word “find” doing double duty, referring both to the act of locating something and the abstract idea of an impression or feeling. But maybe in places of overwork and undercompensation words just have to work twice as hard?

In any case, nothing “begs the question” this way. The phrase, in fact, has become one of the sorriest misuses to have recently climbed from grammatical purgatory to common acceptance. How better to gauge this verbal ascension than to scan our most respected cultural publications? When this writer first noted the expression “begging question” in a recent issue of the New Yorker, it seemed the magazine was hedging its bets, acknowledging the creeping legitimacy of the phrase but balking at its replication. In last month’s March 28 issue, however, the phrase achieves a kind of grammatical consecration. This happens in the “Shouts and Murmurs” column, on page 33, and the text is signed Suzanna Wolff. The column’s premise that week is the satirical description of a number of humorously bad wireless internet plans, with absurd names ranging from the “TV-Buff-Infuriating Buffer Plan” to the “Thrill-Seeker Triple-Refresh Bundle.” One especially frustrating wi-fi service is mockingly touted as follows: “Get away from it all with this Internet connection, which begs the question, ‘Do I actually need to be in contact with the outside world?’”

Maybe the author’s satire justifies the misuse of the expression. But the offending phrase isn’t attributed to the service providers she mocks; it belongs to the language of the satirist herself. A margin of uncertainty remains, though, as in the elusive narrative voice of Flaubert’s style indirect libre. Who’s speaking here? The author, writing from a lofty arbiter of culture? Or a hapless anybody, their vernacular snarky and cynical?

A recent story in The Guardian makes the case that grammar prescriptivists wage war on the language of the underprivileged, and that the conventions they defend are often “unimportant.” The phrase “begging the question,” however, is hardly insignificant. It means that a speaker’s argument includes a premise that assumes the conclusion. The “question” at stake in the phrase is that questionable assumption; a person “begs the question” when they have overlooked or tried to hide a false premise through circular reasoning or sheer deviousness.

No doubt this fine point of forensics is lost in the headlong forward rush of news coverage and social media commentary. Who has time to look back and reconsider their question begging, their prejudices and false assumptions, the historical legacy of unpaid debts and unmourned lives? In contemporary politics, there’s no looking back, which seems to be why, even in the rhetoric of nostalgia, America’s so-called former greatness isn’t asserted through substantive claims but by piling query upon query, not so as to substantiate a claim, but as if, to use the current parlance, we were begging those questions: “When was the last time anybody saw us beating, let’s say, China in a trade deal? … When did we beat Japan at anything? They send their cars over by the millions, and what do we do? When was the last time you saw a Chevrolet in Tokyo?” (Donald Trump, Tuesday, June 16, 2015).

 

 

 

 


Filed under Diction, Politics of Discourse

A linguistic Utopia

[Image: Bünting clover-leaf map, 1581]

In John Reader’s Africa: The Biography of a Continent, the author tells us of an island in Lake Victoria named Ukara. Never having heard of the place, we find our interest piqued, and in this sprawling, meticulously researched book one’s focus is always rewarded. But what holds our attention is the strange locution at the end of the introductory sentence: “Ukara is an island lying off the south-eastern shore of Lake Victoria, part of what is now Tanzania.”*

Why does the author qualify the name Tanzania in this way? What justifies the turn of phrase? Presumably, historical perspective on Ukara calls for verbal discretion; the author implies that the island both pre-dates and may outlast the African state whose territory it now occupies. A valid qualification, then, particularly in a book that studies Africa’s history in a vast geological time-frame. But in that case, couldn’t one also refer to Lake Victoria as “what is now called Lake Victoria,” and Ukara as “what is now called Ukara”? After all, the English name Victoria is a colonial imposition on the landscape, while even the name Ukara is surely not immune from the vagaries of history. But of course such a painstakingly conscientious sentence would be a stylistic horror.

And yet the lack of such scruples can lead to abominations of its own. Consider Edith Wharton’s heedless anachronism as she writes on the paleolithic cave-paintings of southern France. In her devoted praise of her adopted European homeland, the American author extends the modern country’s name to the place that existed there before the last ice age, when the paintings were created.

“Thirty thousand years ago … there were men in France so advanced in observation and training of eye and hand that they could represent fishes swimming in a river, stags grazing or fighting, bison charging with lowered heads … and long lines of reindeer in perspective.”**

The claim that “men in France” made the cave paintings is not an accidental lapse, as the author repeats it, pressing her point that France, unlike less cultured places, has a long and unbroken artistic heritage: “drawing, painting and even sculpture of a highly developed kind,” Wharton says, “were practiced in France long before Babylon” (78-9). The phrase, then, is more than a convenient shorthand expression; it is a territorial ‘claim’ laden with ethnocentric bias. Nativism and national pride are always based on retroactive fictions and aggressive rivalry. In this respect the conservative and aristocratic Wharton shares something with the racist delusions of France’s National Front party.

In contrast, and as a direct challenge to such ethnocentrism, the historian Graham Robb pointedly refuses to extend the name “France” to the prehistoric territory that predates the country. Likewise, in his version of history the indigenous inhabitants of the place share no solid link with the modern French state. Robb says, “The only coherent, indigenous group that a historically sound National Front party could claim to represent would be the very first wandering band of pre-human primates that occupied this section of the Western European isthmus.”***

Robb’s periphrasis “this section of the Western European isthmus” is admirably neutral. It makes one dream of a language so cold and detached, so reasonable and impartial it could defeat all chauvinism and vanity. A language that might not have the power to impose a higher standard of reason, but that could smother patriots’ emotions and bore them to tears. A language that could defang all political sound bites. The historian falls somewhat short of this linguistic utopia, however. He should have known that his “primates” were not “pre-human” but fully homo sapiens. And might he not have pointed out they came from Africa, and qualified the name of that continent with a sensible and scrupulous, if pleonastic, redundant, and circumlocutory “what is now”?


* John Reader, Africa: The Biography of a Continent (New York: Vintage, 1999), 255.

** Edith Wharton, French Ways and their Meaning (New York: Appleton, 1919), 77.

*** Graham Robb, The Discovery of France: A Historical Geography (New York: Norton, 2007), 26.

 

 


Filed under Diction

Ruins of the University

A recent issue of The Economist includes a special report on the state of higher education, grandly titled “Excellence v Equity.” It’s a dispiriting read. The magazine tackles its sprawling, complex topic with hard-nosed pragmatism, monetizing everything, reducing all to the cold, hard reality of economic exchange. A strange picture of the university emerges from the magazine’s frigid gaze. From the Economist’s viewpoint, “education” has no real content; its value lies solely in the “sign” it confers on the consumer who purchases it. “Students,” the Economist claims, “…are not buying education. They are buying degrees, whose main purpose is to signal to employers that an individual went to a — preferably highly-selective — university.”*

What happens when students object to this cynical, consumerist model of the university? In Quebec, where a proposed hike in tuition fees was understood as yet another dangerous step toward privatization, students went on strike in 2012 to defend the notion of the university as a public good and a common right. Predictably, The Economist dismisses the striking students as a “European” anomaly and implicitly sides with Quebec’s complacent neighbors: “the rest of Canada, used — American-style — to much higher fees, was baffled by their fury” (12). It’s no small irony that the Economist approves of university students being baffled. But the kind of ignorance the magazine endorses here is not the receptive state of non-knowing familiar to anyone whose job is to teach and educate, it’s an irremediable incapacity to understand. The Economist’s readers are encouraged to believe that once accustomed to paying high fees, students will no longer be able to bridge the “deep cultural differences” that privatization has caused by severing them from the ethos of the traditional public university.

Twenty years ago, writing from his post in Comparative Literature at the Université de Montréal, Bill Readings penned an influential diagnosis of higher education, The University in Ruins. The author pointed out that writing from Quebec in the ‘nineties afforded him a unique perspective on the transformations happening at universities around the globe. Quebec, as an independent state in the making, still retained a sense of its universities as forming national citizens within a distinct cultural context; Readings argued that this “national cultural mission,” in decline most everywhere else in the first world, contrasted with the emerging paradigm of the globalized university as a “transnational bureaucratic corporation.”** Readings singled out one word from the bureaucratic jargon of new university administrators and devoted a whole chapter to its analysis: the word “excellence,” a basically vacuous term, which replaces references to “culture,” and so is symptomatic of the shift from the traditional university to a globalized entity tied to the abstract dictates of accounting. Twenty years later, “excellence” is now a ubiquitous term in university bureaucratese. Looking back, one notes that the catch-word emerged at the same time as Wayne’s World and Bill and Ted’s Excellent Adventure — which suggests the new university is “excellent” the way Pez is “awesome” and an iPhone is “dank.” “Excellence,” appropriately enough, is the first word of The Economist’s so-called “Special Report.”

Readings’ analysis was prescient. “Excellence,” he wrote in The University in Ruins, “is clearly a purely internal unit of value that effectively brackets all questions of reference or function, thus creating an internal market. Henceforth, the question of the University is only the question of relative value-for-money, the question posed to a student who is situated entirely as a consumer, rather than as someone who wants to think” (27). Given that Readings’ bleak prognosis matches up with The Economist’s callous picture of students “buying degrees,” we might conclude that two decades later the contemporary university is now completely “ruined.” Why, then, is there so much talk in the “Special Report” about “destroying” the university? How is it possible to “destroy” a “ruin”?

The answer to this question is obvious to anyone teaching in the university sector today. Privatization has downsized faculty, raised teaching loads and increased class sizes; underpaid and overworked adjunct teachers outnumber traditional, full-time faculty; the student body is demoralized by the instrumental nature of the student-faculty relationship; and rising tuition and fees have locked students into the all-important need to snag a lucrative career. Under these conditions the university can no longer justify its former educational role. It’s time for the privatizers, the former advocates of the ruined university, to go in for the kill, to destroy what’s left. Never mind that privatization, by removing state funding from universities, is responsible for shifting the cost of “public” education onto individual students. Unconcerned by the contradiction, The Economist and its allies now strike a populist note, demanding that education be cheaper, more “equitable” for the sake of the students. The aim, however, is simply to cut costs, shed jobs, and redistribute wealth upward. Enter the MOOC.

“When massive open online courses (MOOCs) took off three years ago, there was much concern that they would destroy [sic] traditional universities,” The Economist scoffs. “That isn’t happening. ‘We’re doing a better job of improving job skills than of transforming the university sector,’ says Rick Levin, a former president of Yale, who runs Coursera, the biggest of the MOOCs” (18). The passage calls for some textual analysis, as it performs a kind of rhetorical bait-and-switch. It opens with a high-handed dismissal of the idea that edtech would “destroy” education as we know it, only to confirm in what follows that such is in fact the intention. After all, the quote from Mr. Levin would be a non-sequitur unless the idea of “transforming” the university were somehow equivalent to “destroying” it. The apparently reassuring statement that destruction “isn’t happening” should then be read as saying, at best, that it hasn’t happened yet. In the rest of the article, which promotes the supposed benefits of online education, the language of destruction is shifting and ambiguous. Thus, The Economist predicts that edtech startups “will eventually undermine traditional high-cost university education” — presumably a good thing — whereas some universities “are wary of undermining the value of their degrees” — probably a bad thing. These ambiguities derive from the equivocal meaning of the term “disruption,” which waves like a pirate jack from the article’s title: “Online learning could disrupt higher education, but many universities are resisting it.” Clearly, disruption is two-sided, as any crisis has its winners and losers. But from the lofty standpoint of The Economist, capital always benefits from disruption, never mind the costs.

If faculty wield the language of destruction, however, The Economist mocks and derides their concerns, all the while promoting its own plans for creative disruption. A case in point: when San Jose State University refused to adopt a MOOC, the faculty stated that it would “replace professors, dismantle departments and provide a diminished education for students.” Replace, dismantle, diminish: a fairly dismal assessment from the front lines of the crisis. The Economist, however, parses the statement as simply meaning that greedy professors want to keep their jobs. How, then, to pry those jobs away? How to make professors’ work more disposable, more flexible and insecure? The Economist suggests that faculty must be weaned from their quaint idea of teaching as a vocation, or, as the magazine puts it, “the belief that education is not an occupation but a calling.”*** It’s a running theme in The Economist’s “Special Report”: “For now,” the magazine concludes with thinly-veiled scorn, “the interests of academics … prevail over those of students.”


* The Economist, March 28th-April 3rd 2015, 16.

** Bill Readings, The University in Ruins (Cambridge, MA: Harvard University Press, 1996), 3.

*** Giorgio Agamben’s insight into the “confusion … between jobs and vocations” is pertinent here. “The idea that anyone can do or be anything — the suspicion that not only could the doctor who examines me today be a video artist tomorrow, but that even the executioner who kills me is actually, as in Kafka’s The Trial, also a singer — is nothing but the reflection of the awareness that everyone is simply bending him- or herself according to this flexibility that is today the primary quality that the market demands.” Giorgio Agamben, “On What We Can Not Do,” in Nudities (Stanford: Stanford University Press, 2011), 45.

 

 


Filed under Diction

Fueling Debate

On Easter weekend the Big Oil firebrand and Tea Party favorite Ted Cruz released the first TV ad of the US presidential election campaign. The inane homilies of Senator Cruz’s televised message set the tone for a publicity barrage that will culminate next year in the empty ritual of the so-called presidential “debate.” As a platform for scripted statements and one-liners, the candidates’ live televised exchange will be, as always, a predictable extension of their TV ad campaigns, a debasement of the art of rhetoric and an affront to the idea of dialogue.

When did articulate civic discourse and reasoned argument fall into decline? Jürgen Habermas and other public sphere theorists have long pointed to the role of mass media in undoing the spatio-temporal ideal of civic communication. Before the advent of radio and TV, they say, public space still provided a shared forum of ideas in the live, face-to-face encounters of rational citizens. But that communicative ideal has been criticized as a nostalgic bourgeois illusion blind to its excluded others: the illegitimate voices of women, the silence of the subaltern, the ‘babble’ of the colonized. Social theory’s most important stakes now lie in the idea of “dissensus” rather than agreement; more specifically, disagreements that can’t be resolved in a contest of ideas, because they point to a situation in which the forum, the very grounds of speaking, is itself in question. As Jacques Rancière puts it, “Disagreement occurs wherever contention over what speaking means constitutes the very rationality of the speech situation.”* Such “contention over what speaking means” comes to a head when excluded parties make their often baffling voices heard, and so put both language and political space into crisis. As Rancière argues, that crisis or “division” in social space is what constitutes politics itself. Politics, Rancière says, “is primarily conflict over the existence of a common stage and over the existence and status of those present on it” (26-7).

This definition may account for the cacophony of much of our political environment. It also seems to capture the meaning of the Tea Party’s latter-day putsch in the fateful year 2009, when pasty-looking insurgents burst onto the American political scene, storming congressional “town hall” meetings in order to throw the staid, civil (Habermasian?) proceedings into disarray. A glaring irony of the Tea Party’s shouting and disruptions, however, is that their raging bluster simply aped the grievances of past social movements without any of their moral motivations. Another is that their undermining of the Rancièrian “speech situation” meshes with the right wing’s discrediting of participatory democracy in favor of an unfettered market. Far from intervening in the “speech situation,” then, the Tea Party’s role in the larger neoliberal juggernaut is to scramble the airwaves, to sow confusion, to undermine speech itself. Their assault on language would be the exact counterpart to Obama’s grandiloquence, if the president’s rhetorical prowess didn’t in fact wind up proving the same thing: that one man’s righteous words, inspired by generations of striving social movements, count for nothing in the larger scheme of American imperial power. Where does debate fit into this context?

The poet Ben Lerner gives a specific date to the demise of debate in the United States: the year 1979, when corporate dollars reshaped the priorities of high school forensics, separating “values” from “policy” in competitive debates. As Lerner tells it, Phillips Petroleum, the main corporate sponsor of the US National Forensics League, was alarmed at how debates on political topics focused on figures and statistics to the detriment of eloquent, straightforward communication. The solution Phillips proposed was to establish a separate form of debate devoted to the vague, blurry realm of so-called “values.” The kind of talk heard in this forum is familiar to anyone who has endured the contentless obfuscation of a presidential hopeful’s TV pitch. Presumably our overlords speak more concretely of policy details in private. And this is Lerner’s point: the two modes of discourse point to a scission in the body politic that divides actual content from the fuzzy eloquence we’re habitually fed through the media. It’s no coincidence, then, that Phillips’ “sundering of values from policy” happens just a year before the notorious Reagan-Carter debate, when image, branding and asinine quips decisively trumped verbal argumentation. As Lerner says, “I can’t believe that the existence of a corporately sponsored separation of value and policy in high school debate can be separated from that separation in the political culture at large.”**

The wedge that petrodollars have driven between “values” and “policy” can be seen everywhere Big Oil’s destructive reach conflicts with decent citizens’ concerns for peace, sustainability, and environmental protection. It can be seen in the predatory scheme called “Fuel Your School,” in which Chevron corporation provides support to public schools by funding classroom projects in the US and abroad. Predictably, the program emphasizes STEM subjects (Science, Technology, Engineering and Math). As the company’s website puts it, the program’s aim is “to help prepare students for the growing number of technical jobs in the modern economy, including possible engineering positions at Chevron.” Some school districts, such as Vancouver, BC, have rejected Fuel Your School, and even in surrounding districts where the program has been implemented local teachers have organized to oppose it. The controversy has given rise to what is commonly called “debate.” But in a context where apparently unstoppable neoliberal economic policies undermine public schools’ autonomy and make them increasingly vulnerable to market forces, money can speak very loudly. Moreover, as Fuel Your School won’t fund the humanities, the prospects for language arts, including debate and critical thinking, are likely to dwindle.

Chevron spokesman Adrien Byrne was no doubt banking on this foregone conclusion when he wrote a letter addressing the controversy. The text is a marvel of robotic bureaucratese. “Chevron welcomes robust debate [sic] on education funding in the community, and encourages input from all relevant stakeholders” (24 Hours, 11/26/2014, p.6).

__________

* Jacques Rancière, Disagreement: Politics and Philosophy (Minneapolis: University of Minnesota Press, 1999), xi.

** Ben Lerner, “Contest of Words: High School Debate and the Demise of Public Speech” (Harper’s, October 2012).


Filed under Diction, Politics of Discourse

Verbal impunity

The vernacular hive mind sometimes hits on a phrase that can’t be improved, an idiom for the ages. Once uttered, the phrase is indispensable; once heard, ubiquitous. This popular idiom is to spoken language what the mot juste is to an exacting writer: the right word in the right place, perfectly chosen yet seemingly imposing itself of its own accord, as if dictated by language alone.

The past months have seen the spread of a host of new phrases, urgent slogans of the most significant mass social movement in recent years: “Black Lives Matter”; “I Can’t Breathe”; “Hands Up, Don’t Shoot.” Another expression, less successful, has lately emerged that seems still in flux, not having found its best formulation. It’s as if we were witnessing a phrase’s stuttering birth, a verbal catastrophe in slow motion. “There’s only one way to say it,” goes one expression; “Only one word can describe it,” goes another. Strangely enough, the phrases circle around their own obsessive notion of a mot juste — the “single right word” dear to Gustave Flaubert,* “the one and only correct word to use,” as Hemingway put it, translating.** But in a grotesque parody of that ideal of verbal precision, the new phrases often miss the mark, even as they strike a dogmatic tone; the word chosen may be neither right nor correct, though it insists that it is just.

[Image: one-way sign]

Darren Wilson, the police officer who shot an unarmed Michael Brown in Ferguson, Missouri, made repeated use of the offending expression in his notorious grand jury testimony. “When I grabbed him,” Wilson said of his young victim, “the only way I can describe it [sic] is I felt like a 5-year-old holding onto Hulk Hogan.” Jamelle Bouie, in an article in Slate, helpfully points out that Wilson is 6-foot-4 and weighs 210 pounds; Brown was 6-foot-5 and 290 pounds — a fairly close match, physically speaking. According to the police officer, however, there is only one way to compare the two bodies: as the confrontation of a small child with an indomitable giant. This confirms Judith Butler’s recent insight into the hallucinatory nature of “schematic racism”: even when subdued or imperiled, the black target of police violence “never stops looming as a threat to security,” Butler says. Similarly, when Wilson fired his first shot at Brown he said that the boy “had the most aggressive face. That’s the only way I can describe it [sic], it looks like a demon, that’s how angry he looked.”

Interestingly, Wilson’s inflexible phrase “the only way I can describe it” gave way to a moment of verbal compunction when the officer struggled to find the right word to portray the boy he killed. “I’ve never seen anybody look that, for lack of a better word, crazy,” Wilson said. But this conflict between certainty and scruples shouldn’t surprise us. It’s precisely the “lack of a better word” that supports the officer’s fantasy of a just word. After all, Wilson’s peremptory descriptions aren’t founded on any real certainty but rather on the denial of their own patently obvious speciousness, a disavowal of the truth that yields the man’s racist cartoon-world phantasmagoria of projected cruelty and horror. We find a similar conflation of ignorance and conviction in the lyrics to a current American song, where the mot juste is not a carefully selected word but its exact opposite, the word one uses because one can’t think of anything else, because one doesn’t know any better: “Only one word comes to mind / There’s only one word to describe…”.

Like its variants in popular language today, the murdering officer’s hapless phrase “the only way I can describe it” seems to derive from the more established expression, “X can only be described as Y.” Typically, in a phrase of this kind, the descriptor is pejorative and exaggerated, sometimes to humorous ends. As such, the turn of phrase is a rhetorical hyperbole. The speaker stretches the truth to make a point — stretches it, that is, except when referring to something itself hyperbolically nasty, such as racism in the American criminal-justice system. Here, the figural expression turns denotative. A UN special rapporteur, lambasting the United States for its treatment of Black Panther Albert Woodfox (one of the so-called Angola Three), provides a well-formed version of the phrase that ends in a judiciously chosen noun: “Four decades in solitary confinement,” the rapporteur says, “can only be described as torture.”

What accounts for the apparent warping of this well-known and effective locution into the strange, limping phrases we see in Wilson’s testimony and seemingly everywhere in the contemporary mediascape? Does a similar disavowed impotence underlie the quasi-fascistic high-handedness of American speech? Perhaps the mutating phrases are tending toward a police-state’s perfect mot juste, the phrase to end all phrases and all discussion, the negation of language and dialogue. Accordingly, “there’s only one way to describe it” becomes I don’t care how you describe it. Isn’t this the implicit meaning of Wilson’s self-serving justifications? And isn’t it the message sent by the grand jury when it refused to hear reason, to consider the evidence and heed months of righteous protests?

American English is always refining the vocabulary of prudery and violence. Appropriately enough, it falls to a snarling Rudolph Giuliani to coin the definitive phrase from out of the vernacular babble. In an interview on “Fox and Friends” after two policemen were gunned down in Brooklyn, Giuliani perfectly captured the heady spirit of authoritarian counterrevolt that day as police officers snatched back the cause of social justice from vulnerable citizens and threatened a blue-shirted putsch on New York’s liberal mayor. Future lexicographers may credit the would-be strongman for the expression he used on the occasion. “We’ve had four months of propaganda, starting with the president, that everybody should hate the police,” Giuliani said. “I don’t care how you want to describe it [sic]: That’s what those protests are all about” (The Washington Post, 12/22/2014).

__________

* Flaubert always insisted on the paramount role of diction in literature. “Tout le talent d’écrire ne consiste après tout que dans le choix des mots. C’est la précision qui fait la force” (Correspondance, II, 471). In a letter to Sainte-Beuve, Flaubert says, “Si je mets bleues après pierres, c’est que bleues est le mot juste, croyez-moi” (Ibid., V, 67). Elsewhere Flaubert uses the phrase l’expression juste; to George Sand he writes, “A force de chercher, je trouve l’expression juste, qui était la seule et qui est, en même temps, l’harmonieuse” (Ibid., VII, 290). To Sand again, he insists on the “rapport nécessaire entre le mot juste et le mot musical.” See Flaubert, Correspondance (Paris: Conard, 1929).

** Ernest Hemingway, A Moveable Feast (New York: Bantam Books, 1965), 132.


Filed under Diction, Politics of Discourse

At home in the void


When Copernicus showed that the Earth orbits the sun, his heliocentric model of the cosmos placed our local star not only in the middle of the solar system but at the center of the universe itself. The error is revealing. The astronomer may have knocked mankind from its Earthly pedestal, but he warded off an even more profound displacement with his compact and orderly vision of concentric heavenly bodies. This Copernican double movement of decentering and recentering is typical of scientific knowledge, if not of human knowledge as a whole.

Copernicus comes to mind when one considers the announcement last week of the discovery of Laniakea, the enormous galactic supercluster that contains Earth’s galaxy. “Say Hello to Milky Way’s New Home,” said a headline in the Tech Times; “Welcome to Laniakea, our Galactic Home,” said another. Nature magazine announced “Earth’s New Address,” and the Daily Beast touted “The Milky Way’s Place in the Heavens.” The articles’ cheerful tone and their stubborn insistence on the idea of “home” betray the literally domesticating impulse in cosmological understanding. It seems that no earth-shaking discovery is so disorienting that it can’t be made familiar, friendly and reassuringly divine. Unsurprisingly, the name given to the supercluster puts a comforting metaphysical stamp on the cosmic discovery: Laniakea, derived from the Hawaiian language, means “immense” or “immeasurable heaven.”

Laniakea. Red dot indicates position of Milky Way

At least one commentator has pointed out that the name “immeasurable heaven” is unwittingly ironic, given that, as he says, “measuring it is exactly what we’re doing.” But one despairs at a grasp of “irony” that can invoke exactitude when referring to measurements on such a vast scale, or that blithely wields an interpellating “we,” the journalist’s casual shorthand, presumably, for “our common humanity.” Still, no commentator is immune from a genuine sense of wonder and amazement at the magnitude of the cosmic discovery. This, however, becomes the occasion for a kind of ritual of self-debasement in the press that is quickly converted into a posture of mastery. As the Slate columnist puts it, “Astronomy is both ennobling and humbling. It tells us our place in the Universe, which can make you feel small … but don’t forget that we’re a part of that Universe, and the fact that we can figure this stuff out at all makes us very big indeed.” Geek magazine echoes this symptomatic disavowal of human tininess by first invoking the “unknowable” only to follow it up with a reference to man’s intelligence, which the columnist imagines as mirrored by a similar species aping our own actions “from the other side.”

“It’s amazing to think that Laniakea is just one of many superclusters…. It contains (literally) quadrillions of stars and an unknowable number of planets. Maybe somewhere out there is another intelligent species looking out into space and mapping the same supercluster from the other side.”

The size of Laniakea beggars the mind, and writing about it surely calls for some minimal verbal compunction. After all, when speaking of a place some 520 million light-years away — the “other side” of the supercluster — how can one plausibly use the present verb tense, as if humans shared the same ontological time as beings among those faint faraway lights, many of which are ghostly afterimages of worlds that no longer exist? Laniakea throws the human scale of space and time utterly out of operation. Our frail, earthly meaning-making loses all relevant context and perspective. One might think the encounter with such cosmic enormity would provoke an experience of the sublime.

Immanuel Kant’s Critique of Judgment offers the most compelling and influential account of sublime experience. When confronted with a spectacle of overwhelming natural grandeur, Kant says that the observing mind takes stock of the imagination’s incapacity to fully render it in an adequate image. The experience is painful, Kant says, as it demeans the mind and makes it realize its powerlessness. But as it turns out, the failure of the imagination engenders a victory of the mind, since the latter, he says, can go so far as to conceive of the infinite, even if it cannot picture it. In his account of sublime experience, then, Kant leads us to the very limits of what can be humanly represented and conceived, but at that perilous brink the philosopher rediscovers a quasi-divine force of “supersensible” intuition that rescues the mind from its mortifying sense of impotence.

Andromeda. Charles Messier

Kant the polymath also authored a Universal Natural History and Theory of the Heavens, which, drawing on the work of contemporary astronomers, speculated on the puzzling “nebula” formations readily visible from Earth. The general belief in the mid-18th century was that nebulae such as Andromeda, which the philosopher called “island universes,” formed part of our own Milky Way. It wasn’t until the 1920s that the Andromeda “cloud” was conclusively proven to be a separate galaxy far outside the Milky Way and twice its size as well. Since then the number of recognized galaxies has grown at an exponential rate. In 1995 the Hubble Space Telescope peered deep into an apparently empty speck of sky and discovered no fewer than 3,000 previously unknown galaxies. Some 100,000 galaxies are encompassed by the Laniakea supercluster alone, but this is only a tiny fraction of the universe’s estimated total of 125 billion galaxies.
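A back-of-the-envelope division using the two estimates just cited puts that “tiny fraction” in perspective:

$$\frac{100{,}000}{125{,}000{,}000{,}000} = 8 \times 10^{-7} \approx 0.00008\%$$

In other words, roughly one known galaxy in every 1.25 million belongs to our “home” supercluster.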

Critics have faulted Kant for refusing to own up to human finitude in the text where he explores it so well, and the stakes of Kant’s failure are more pressing than ever in a time of inflationary sublimity. Indeed, pedestrian cosmologists seem to enact in their own way a Kantian recoil from the evidence of the senses. Kant has been accused of propping philosophical critique on illusions of spiritual mastery, and Jean-François Lyotard has suggested that “The Analytic of the Sublime” could be taken as the staging of “a philosophical neurosis.”* In an effort to reclaim the most radical aspects of Kant’s text, Lyotard has argued that certain extreme experiences should be thought of as defying representation. However, in a biting critique titled “Are Some Things Unrepresentable?” Jacques Rancière scorns Lyotard’s claims as imposing on the visual arts a sanctimonious and quasi-religious ban on images in the face of “holy terror.”** Such a ban logically implies a dictatorial notion of appropriate forms of representation, Rancière argues, and he sarcastically asserts that “this idea is vacuous” (137). But what, we might ask, is the idea of the vacuous? Is there, in Kantian terms, a representation adequate to sheer vacuity — say, the empty intergalactic space comprising a mere atom per cubic meter? Can we form an idea or even an image of the “cosmic void” over which the earth is apparently perched, clinging to the tip of a near-infinite strand of filaments at the far reaches of Laniakea?


* Jean-François Lyotard, Lessons on the Analytic of the Sublime, 150.

** Jacques Rancière, “Are Some Things Unrepresentable?” in The Future of the Image.

 

 

1 Comment

Filed under Diction

Too menny

July 11 is the United Nations’ World Population Day, and we submit a pair of graphs for the occasion. The first graph shows the incidence rate of the word “overpopulate” over the past two hundred years; the second charts the world population’s growth over that same period. Taken together, the graphs portray an inverse relationship between word and thing: in recent decades, just as the global population began to increase dramatically, the word designating that phenomenon went into steep decline.

Screenshot of Google definition search feature, with graph of word incidence rate

Note peak in word incidence, circa 1960 (above); sudden angle and sharp rise in world population curve, circa 1950 (below).

Graph courtesy of Ourfiniteworld.com
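
For readers who want to recreate the juxtaposition described above, here is a minimal Python sketch (not the method behind the graphs shown here) for overlaying the two curves on a shared timeline. The file names and column labels are hypothetical placeholders; the word-frequency series is assumed to come from a Google Books Ngram-style export, and the population series from any standard set of historical estimates.

import csv
import matplotlib.pyplot as plt

def read_series(path, year_col, value_col):
    """Read (year, value) pairs from a CSV file."""
    years, values = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            years.append(int(row[year_col]))
            values.append(float(row[value_col]))
    return years, values

# Hypothetical input files covering roughly 1800-2000.
ngram_years, ngram_freq = read_series("overpopulate_ngram.csv", "year", "frequency")
pop_years, population = read_series("world_population.csv", "year", "population")

fig, ax1 = plt.subplots()
ax1.plot(ngram_years, ngram_freq, color="tab:blue")
ax1.set_xlabel("Year")
ax1.set_ylabel('Incidence rate of "overpopulate"', color="tab:blue")

# A second y-axis lets the two very different scales share one timeline.
ax2 = ax1.twinx()
ax2.plot(pop_years, population, color="tab:red")
ax2.set_ylabel("World population", color="tab:red")

fig.suptitle("Word versus thing, 1800-2000")
fig.tight_layout()
plt.show()

Plotted this way, the post-1960 decline in the word’s incidence set against the steepening population curve is exactly the inverse relationship described above.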

The organization HowMany.org claims that a “population taboo” prevents coverage of global population issues in contemporary media and politics, and that as a result the root causes of the world’s social and ecological woes remain largely unexamined. There are exceptions, though, and the tide may be turning (one can see the suggestion of an uptick in the Google graph). But even a recent Thomas Friedman op-ed sporting the blunt, if not brutal, headline “The Earth is Full” somehow avoids using the word “overpopulation.”

As happens in catastrophes, words fail us. This conjunction of trauma and inarticulateness is memorably captured in the murderous words penned by “Father Time” in Jude the Obscure: “we are too menny.”

 

Leave a comment

Filed under Diction, Politics of Discourse