In fact sometimes a word can have contradictory meanings with no one being the wiser. I once got into an argument with a linguist friend over the meaning of the sentence “The pool was deceptively shallow.” I maintained that it meant that the pool was shallower than it looked, and he said it meant that the pool was deeper than it looked. To settle the argument I took advantage of my role as chair of the usage panel of the American Heritage Dictionary, a group of 175 noted writers we poll every so often on usage questions. But when we asked the panelists what “The pool is deceptively shallow” means, the results were curiously inconclusive. Half of them said that it means “The pool is shallower than it appears,” a third said that it means “The pool is deeper than it appears,” and the rest said it could go either way. In other words, if you put that sentence on a warning sign you can be dead certain that anywhere from a third to a half of the people who see it will get the wrong message.
Or take minimal. “She ran best when she had a minimal amount of food in her stomach”—does that mean she ran best when she’d eaten nothing or when she’d eaten a bit? The usage panel was split on that one, too—a third took it the first way, a third the second, and another third said it could mean either one.
If you’re of a pessimistic turn of mind, you could take all this as a reminder of how elusive understanding can be—it puts us on guard against what Adrienne Rich called the dream of a common language. But maybe the wonder of it all is that we manage to muddle through, breakfast after breakfast, trusting to good faith to bridge over all the gaps in comprehension. To paraphrase another poet, Randall Jarrell: We understand each other worse, and it matters less, than any of us suppose.
The Bloody Crossroads of Grammar and Politics
The New York Times Week in Review, June 1, 2003
Is there a grammatical error in the following sentence? Toni Morrison’s genius enables her to create novels that arise from and express the injustices African Americans have endured. Not according to the Educational Testing Service, which included the item on the PSAT given on October 15 of last year. But Kevin Keegan, a high-school journalism teacher from Silver Spring, Maryland, protested that a number of grammar books assert that it is incorrect to use a pronoun with a possessive antecedent like “Toni Morrison’s”—or at least, not unless the pronoun is itself a possessive, as in Toni Morrison’s fans adore her books.
After months of exchanges with the tenacious Keegan, the College Board finally agreed to adjust the scores of students who had marked the underlined pronoun her as incorrect. That’s only fair. When you’re asking students to pick out errors of grammar, you ought to make sure you haven’t included anything that might bring the grammarati out of the woodwork.
Some read the test item as the token of a wider malaise. “Talk about standards,” wrote David Skinner, a columnist at the conservative Weekly Standard. Not only had the example sentence been “proven to contain an error of grammar,” but the sentence’s celebration of Toni Morrison, a “mediocre contemporary author,” betrayed the “faddish, racialist, wishful thinking that our educational institutions should be guarding against.”
That may seem like a lot to lay on the back of a grammar example. But it was telling how easily Skinner’s indignation encompassed both the grammatical and cultural implications of the sentence. In recent decades, the defense of usage standards has become a flagship issue for the cultural right: The people who are most vociferous about grammatical correctness tend to be those most dismissive of the political variety. And along the way, grammatical correctness itself has become an increasingly esoteric and arbitrary notion.
Take the rule about pronouns and possessives that Keegan cited in his challenge to the testing service. Unlike the hoary shibboleths about the split infinitive or beginning sentences with “but,” this one is a relative newcomer, which seems to have surfaced in grammar books only in the 1960s. Wilson Follett endorsed it in his 1966 Modern American Usage, and it was then picked up by a number of other usage writers, including Jacques Barzun and John Simon.
The assumption behind the rule is that a pronoun has to be of the same part of speech as its antecedent. Since possessives are adjectives, the reasoning goes, they can’t be followed by pronouns, even if the resulting sentence is perfectly clear.
If you accept that logic, you’ll eschew sentences like Napoleon’s fame preceded him (rewrite as His fame preceded Napoleon). In fact you’ll have to take a red pencil to just about all of the great works of English literature, starting with Shakespeare and the King James Bible (“And Joseph’s master took him, and put him into the prison”). The construction shows up in Dickens and Thackeray, not to mention H. W. Fowler’s Modern English Usage and Strunk and White’s Elements of Style, where we find “The writer’s colleagues . . . have greatly helped him in the preparation of his manuscript.” And it’s pervasive not just in The New York Times and the New Yorker, but in the pages of the Weekly Standard, not excluding David Skinner’s own columns. (“It may be Bush’s utter lack of self-doubt that his detractors hate most about him.”)
The ubiquity of those examples ought to put us on our guard—maybe the English language knows something that the usage writers don’t. In fact the rule in question is a perfect example of muddy grammatical thinking. For one thing, possessives like Mary’s aren’t adjectives; they’re what linguists call determiner phrases. (If you doubt that, try substituting Mary’s for the adjective happy in sentences like The child looks happy or We saw only healthy and happy children.)
And if a nonpossessive pronoun can’t have a possessive antecedent, logic should dictate that things can’t work the other way around, either—if you’re going to throw out Hamlet’s mother loved him, then why accept Hamlet loved his mother? That’s an awful lot to throw over the side in the name of consistency.
But that’s what “correct grammar” often comes down to nowadays. It has been taken over by cultists who learned everything they needed to know about grammar in ninth grade, and who have turned the enterprise into an insider’s game of gotcha! For those purposes, the more obscure and unintuitive the rule, the better. Pity the poor writers who come at grammar armed only with common sense and a knowledge of what English writers have done in the past. You’re walking down the street minding your own business, and all of a sudden the grammar police swoop down and bust you for violating some ordinance you couldn’t possibly have been aware of.
Not all modern usage writers take doctrinaire views of grammar. But the politicization of usage has contributed to its trivialization, and has vitiated it as an exercise in intellectual discrimination. The more vehemently people insist on upholding standards in general, the less need there is to justify them in the particular. For many, usage standards boil down to the unquestioned truths of “traditional grammar,” even if some of the traditions turn out to be only a few decades old.
Take the way Skinner asserted that the College Board examination sentence was “proven to contain an error of grammar” in the way you might talk about a document being proven to be a forgery—it’s as if the rules of grammar were mysterious dicta handed down from long-forgotten sages. For some writers, that’s a natural pairing. The English conservative writer Roger Scruton has described the controversies over usage as merely a special case of the debate between conservative and liberal views of politics. But until fifty years ago, nobody talked about “conservative” and “liberal” positions on usage, and usage writers were drawn from both sides of the aisle.
Even today, it would be silly to claim that conservatives actually care more deeply about usage standards than liberals do, much less that they write more clearly or correctly. In language as elsewhere, it isn’t as if vices are less prevalent among the people who denounce them most energetically.
But people who have reservations about the program of the cultural right often find themselves in an uneasy position when the discussion turns to usage. How do you defend the distinction between disinterested and uninterested without suggesting that its disappearance is a harbinger of the decline of the West? For that matter, how do you make the general case for standards when they’re no longer answerable to common educated consent?
Not that the cultural left is blameless in this. Some of the usage reforms they championed have been widely adopted, and society is the better for it. There aren’t a lot of male executives around who still refer to their secretaries as “my girl.” But many of the locutions and usage rules that have recently been proposed in the name of social justice are as much insider codes as the arcane strictures of the grammar cultists—think of the s/he business, for example. They’re exercises in moral fastidiousness that no one really expects will catch on generally.
To younger writers, a lot of these discussions of usage seem to be less about winning consensus than about winning points. It’s no wonder they tend to regard the whole business with a weary indifference. WHAT-ever—will this be on the test?
Letter Perfect
Fresh Air Commentary, June 3, 2003
TLAMs and RPGs, MREs and SSEs, EPWs and WMDs. The language we were hearing from the Iraq war had a decidedly alphabetic ring. But then that’s only appropriate. The word “acronym” itself was first used exactly sixty years ago to describe military coinings like WAC, ANZAC, and radar, all drawn from the initials of longer phrases. By then, the process was familiar enough so that servicemen could make fun of it with the term snafu, for “situation normal, all fucked up.” (Snafu was so successful that it gave rise to other phrases like fubb, for “fucked up beyond belief,” and cummfu for “complete utter monumental military fuck-up.” But none of these had legs except fubar, “fucked up beyond all recognition,” which under the spelling foobar has survived as programmer’s slang, though it’s used now as a generic file name or command.)
True, the American fondness for acronyms and abbreviations dates from well before World War II. After all, ours was the first modern nation to be known by its initials—the abbreviation U.S. dates from the 1830s. Nineteenth-century Americans gave the language items like C.O.D., S.O.B., and P.D.Q., not to mention O.K.—certainly the most successful American contribution to the languages of the world, even if nobody’s sure what the letters originally stood for.
It wasn’t until the mid-twentieth century that acronyms became the linguistic wallpaper of modern life. One of the most successful acronyms of the 1950s was veep, which was used affectionately for Truman’s vice president Alben Barkley. It was largely abandoned in 1953 when Richard Nixon took over the job and made it known that he didn’t like the appellation—Nixon wasn’t a man for whom the phrase “lighten up” had a lot of resonance. But the Republican administration compensated by introducing other acronyms like riff for “reduction in force.” That was the first bureaucratic euphemism for layoffs—and as it turns out, a surprisingly resilient one.
Some sticklers insist that acronym should only be used for a string of letters that’s pronounced as a word, like riff or NATO—items like FBI and LSD they call initialisms. I suppose that’s a valid distinction, but most people can’t be bothered with it, and anyway, it misses the main point. However they’re pronounced, the crucial thing about these expressions is the way they come to live lives of their own as separate words. AC in a real-estate classified ad is just an abbreviation for “air conditioning.” But AC/DC is a distinct word when it’s used to describe a sexual orientation—if you heard someone’s sexuality described as “alternating current/direct current” your thoughts would run in a very different direction. And you may have your doubts about UFOs, but no one denies the existence of unidentified flying objects.
It’s astonishing how pervasive these coinings have become over the past sixty years. I’m not thinking just of the ones that come from bureaucracy and technology. Those are mostly used for efficiency—strings like SEC and EEG flow a lot more trippingly from the tongue than Securities and Exchange Commission and electroencephalogram.
But by now acronyms are piled up in every room in the American house. BLTs, PB&Js, and OJ in the kitchen; GTOs, RVs, and SUVs in the garage; and TP in the bathroom, proving that someone remembered to stop at the A&P. In the living room we turn on the VCR or put on a CD by REM, ELO, or UB40; in the bedroom we slip out of our BVDs and cop some Zs. Yuppies and WASPs, LSD and PCP, TGIF and BYOB, CBGBs and MTV—you could sketch the social history of the postwar period just by listing the initials it has carved on the walls.
Items like these don’t have much to do with a propensity for conciseness—in fact most of them barely save any syllables over their spelled-out equivalents. The urge to acronymize goes deeper than that. It’s as if we’re moving towards a purely analytic language, where the shape of every word reveals its meaning to the initiates who possess the secret key. There’s a profane manifestation of that urge in the stories people tell about how our most charged words are secret acronyms derived from phrases like “For Unlawful Carnal Knowledge”—a bit of linguistic folklore that seems impervious to philological correction.
In that sense, acronyms are the slang of a textual world. There’s a mysterious sense of destiny to these names. The cabalists used the process called notarikon to form new names for God by combining the first or last letters of the words from phrases or biblical verses. And the Tudors studded their verses with acronyms and acrostics—though probably not nearly as many as scholars have claimed to find in their efforts to prove that Shakespeare’s works were written by someone else.
That’s the same impulse that leads people to rig the game—they start with a plausible acronym and then contrive a description to fit it. As best I can tell, the first of these was WAVES, the name the Navy coined in 1942 for “Women Accepted for Volunteer Emergency Service,” a description that managed to be simultaneously condescending and inaccurate. In later years the same process has given us organizational names like CARE, NOW, and MADD.
No one is more adept at this game than legislators. Over the past few years we’ve had the RAVE Act, for “Reducing Americans’ Vulnerability to Ecstasy,” and operation TIPS, the Terrorism Information and Prevention System. And then there’s the antiterrorism act that goes by the name of “Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism.” That acronymizes as USA PATRIOT—a happy accident indeed.
A Thousand Pictures
Fresh Air Commentary, August 7, 2003
Some years ago, I inherited a seven-volume set of the 1907 Grand Larousse French dictionary. The set is pretty tattered by now, but it’s still glorious, with its dark red covers embossed with trees and gold letters, its art nouveau frontispieces and letter pages, and above all its intricate engravings and maps and its resplendent color plates—plates of animals, birds and insects; plates of costumes and furniture; plates of eggs, locomotives, and fencing positions; and lots and lots of plates of military uniforms.
That’s the dictionary that Jean-Paul Sartre recounts reading as a child in his Alsatian grandfather’s study, in his memoir Les Mots:
The Grand Larousse was everything to me; I would take down a volume at random, behind the desk, on the next-to-last shelf. A-bello, belloc-Ch, or Ci-D . . . (these associations of syllables had become proper names that denoted the sectors of universal knowledge: there was the Ci-D region, the Pr-Z region, with their flora and fauna, their cities, their great men and their battles). . . . Men and beasts were there in person—the engravings were their bodies, the text was their souls, their unique essences.
An American child isn’t likely to have that experience of the dictionary nowadays. For all their considerable virtues, most of our dictionaries aren’t books to take us lands away. The average dictionary may include a handful of color plates, but it’s basically a textual affair—the illustrations serve mostly to break the unrelieved monotony of the columns of type. Those dictionaries do fine on souls, but there’s not a lot to satisfy our hunger for bodies.
Still, even those dictionaries have their visual charms, and it was nice to see a piece in the Sunday New York Times Magazine on Jeffrey Middleton, who’s the illustrator of the new eleventh edition of Merriam-Webster’s Collegiate. Middleton is responsible for the elegant little pen-and-ink drawings inserted every couple of pages or so alongside the entries, in a style that hasn’t much changed in the past seventy-five years.
Dictionary traditionalists argue that those drawings do a better job of rendering the idea of a word than elaborate plates or photographs. As the noted lexicographer Sidney Landau once put it, “Photographs are necessarily of unidealized individual things, whether zebras, geese, or medieval churches, [whereas] drawings may represent a composite distillation.”
Actually, that remark says as much about our conception of photography as it does about dictionaries. It’s a fair bet that Pierre Larousse would have used photographs in his grand dictionary if he’d had the technology to print them properly. In that age, people didn’t have any problem thinking of photographs as the representations of abstract ideas or imaginary settings. Victorian photographers like Oscar Rejlander and Henry Peach Robinson produced staged allegories and dramatic scenes with titles like “Youth and Age” or “The Two Ways of Life.” Julia Margaret Cameron did photographic illustrations for Tennyson’s “Idylls of the King,” and Henry James allowed the use of photographs to illustrate a 1909 edition of The Golden Bowl. And Darwin’s cousin Francis Galton made composite photographs aimed at isolating the physiognomic traits of various types and classes—criminals, Jews, and the members of the Academy of Sciences.
In that age, abstraction was an accepted goal of photography—in the words of the critic Charles Caffin, a collaborator of Alfred Stieglitz: “the artist [photographer] must make some abstract quality the prime feature of his picture.” It wasn’t until the early twentieth century that people began to think of photography as a pure record of the concrete facts before the lens. For Paul Strand, André Kertész, and Henri Cartier-Bresson, there could be no photographs but of things. As Strand once said, “[T]he camera machine cannot evade the objects that are in front of it.”