Going Nucular
Geoff Nunberg
Culture at Large
Plastics!
San Jose Mercury News, January 5, 2003
We’re all attuned to the word games that other people try to play on us—what we have to watch out for are the ones we play on ourselves. Consider the curious transformation of plastic. For the first part of the twentieth century, that word connoted all the blessings that science was bestowing on modern life. Then, forty years ago, it suddenly became the P-word, and synthetic materials started having to deny their paternity.
The American enchantment with synthetics began in the 1920s, when Bakelite caught on as a material for everything from fountain pens to telephones, and haute couture designers like Elsa Schiaparelli redeemed viscose from its chintzy associations under the new name of rayon. The name was taken from the French word for “beam of light” and set the pattern for later names like nylon, Dacron, Orlon, and Ban-lon.
Those were the glamour years of plastics. The transparent version of viscose called cellophane was such a success in both packaging and fashion that Cole Porter listed it among the superlatives of You’re the Top, alongside the Coliseum, the Louvre Museum, Mickey Mouse, and a summer night in Spain. And in a 1940 poll to determine the most beautiful word in the English language, cellophane came in third, right behind mother and memory. Hardly a year went by that some new miracle fiber didn’t capture the public imagination. Nylon took the world by storm when it made its debut at the World’s Fair of 1939, where the DuPont pavilion featured a shapely Miss Chemistry reclining on a podium in nylon stockings.
Over the following decades, the press was full of exotic new names like Lucite, vinyl, Formica, Styrofoam, Dacron, and Saran Wrap, each of them replacing another natural material. By 1950, Popular Mechanics could show an illustration of a woman spraying a sleek couch with a garden hose, the water running off into a drain in the floor, above the caption, “Because all her furniture is waterproof, the housewife of 2000 can do her daily cleaning with a hose.”
But technological predictions have a way of going awry. Back in 1950, no one could have foreseen that little more than a decade later, plastic would become a problematic word. (And so, a little later on, would housewife.) The break in the filament was signaled in two events in 1963. In the fall of that year, DuPont introduced the leather substitute Corfam at the Chicago Shoe Show and went on to make the material the centerpiece of its 1964 World’s Fair pavilion, a triumph of synthesis from its Tedlar roof and Delrin doorknobs to its Mylar curtains and Fabrilite seat upholstery.
DuPont had reason to be confident. Corfam was light and durable and could be cleaned with a wet sponge—it seemed a natural, if you’ll excuse the expression. But Corfam was a marketing catastrophe. A few years later DuPont took a $100 million write-off and sold off its Corfam operation to a company in the People’s Republic of Poland, where the fabric quickly became the cynosure of captive-nation haute couture.
People had good practical reasons for rejecting Corfam, which didn’t breathe or break in the way leather did. But the material was also the victim of a more equivocal attitude toward synthetic products. As it happens, in fact, 1963 also recorded the first use of the word plastic to refer to something superficial or insincere. By 1967, the nation was snickering at the line in Mike Nichols’s The Graduate, in which a skeptical young Dustin Hoffman received career advice from a family friend: “I just want to say one word to you. . . . Plastics!”
That line marked the end of America’s innocent faith in the synthetic future: From then on, plastic would be charged with a curious ambiguity. For the hippies and later the greens, the word stood in for all the wastefulness and superficiality of American consumer culture. As Stephen Fenichell put it in his lively social history Plastic: The Making of a Synthetic Century, plastic embodies the features that people like to denigrate about the twentieth century—artificiality, disposability, and synthesis.
Frank Zappa sounded that note in his 1967 Plastic People: “I’m sure that love will never be / A product of plasticity.” That was the progenitor of a line of musical plastiphobia that was carried on in songs like Radiohead’s Fake Plastic Trees and Alanis Morissette’s Plastic. (“You got a plastic girl in a plastic bed. . . . Got a plastic smile on a plastic face/But it’s underneath that you can’t erase.”)
But the 1960s also saw the birth of a new kind of plastiphilia, which had less to do with the corporate triumphalism of DuPont’s “Better Living Through Chemistry” than with the ironic detachment of pop art and the mods of Swinging London. That sensibility was what led performers to take names like Plastic Bertrand and the Plastic Ono Band. And it had its own anthems, from the Jefferson Airplane’s Plastic Fantastic Lover to Björk’s Dear Plastic, a paean to artifice: “Dear Plastic/Be proud/Don’t imitate anything/ You’re pure, pure, pure.”
In the end, both parties prevailed. The plastiphiles left us with a new distinction between hip plastic and unhip plastic. Unhip plastic was AstroTurf, Lucite chandeliers, disposable diapers, and the double-knit polyester leisure suits that were leaving pills on the upholstery of discotheques across America. Hip plastic was girls in vinyl Mary Quant miniskirts dancing the Watusi (a song by the Orlons). It was the plastic chain-mail dresses of Paco Rabanne and the spandex outfits of David Bowie and the glamrockers he spawned.
Unhip plastic was foam cups and cigarette wrappers that people dropped on the beach; hip plastic was the million square feet of polypropylene sheeting that Christo used to wrap a mile-long section of the Australian coast near Sydney. Needless to say, those distinctions have nothing to do with chemistry or environmental apprehensions—the same molecules can be unhip in car upholstery and hip in a Gucci bag.
But the plastiphobes had an effect as well. Once plastic became a term of derision, people started to avoid using the word for the new materials coming out of the labs. Since the sixties, plastic hasn’t meant just any manufactured polymer, the way it did in the 1950s; now it connotes only glossy materials like vinyl, polystyrene, and Lucite. Ask people what their computer housings are made of, and they’ll fumble for a name.
For that matter, the word polyester has been dropped from the advertising lexicon. People may still drape their bodies in it, but now it’s sold as microfiber, or under brand names like Gore-Tex, Polar Fleece, and Eco-Spun—names free of any of the tacky, down-market evocations of Saturday Night Fever. And forty years after Corfam tanked, synthetic leather was back as pleather, the fabric of choice for animal-friendly performers like Britney Spears and Janet Jackson—with the P-word reduced to an unobtrusive prefix p. It’s an efficient way of accommodating our aesthetic or ecological scruples about plastic—we merely call it something else and go on as before. It’s just one word.
Keeping Ahead of the Joneses
The New York Times Week in Review, November 24, 2002
You can tell a lot about an age from the way it adapts prefixes to its purposes. Take our enthusiasm for using post- in new ways. Sometimes it means “late” rather than “after,” as in postcapitalism, and sometimes, as in postmodernism, it means something like “once more without feeling.”
As it happens, the same sort of process has been going on more quietly with pre- at the other end of the scale. Time was, pre- chiefly meant “before,” as in prewar or prepubescent. In recent years, though, the meaning of the prefix seems to be shifting to “in advance.”
The world offers us preowned cars, preassembled furniture, precooked meals, prewashed denim, and preapproved loans. Hotels urge us to prebook our rooms (invariably more productive than postbooking them). At the airport gate, they announce preboarding, and the VIPs and first-class passengers are already getting on the plane. It’s an ideal prefix for an age that’s preoccupied with getting a leg up on things.
The importance of getting an early advantage was behind the shift in the meaning of preschool about fifty or sixty years ago. Before that, the word could refer only to children too young to attend school. Then progressive educators drafted it into service as a new name for the nursery school, a name that brought to mind a place where toddlers frittered away their days in idle play. Preschool implied a more goal-directed curriculum—a “developmentally oriented readiness” program, as the 92nd Street Y in Manhattan puts it.
That’s unquestionably a worthy object—it’s implicit in the name Head Start, after all. But in the case of institutions like the 92nd Street Y, preschool is understood a bit more specifically than it normally is—it’s really a shorthand for pre-pre-Harvard-Yale-or-Princeton. So it isn’t surprising that parents go to considerable lengths to secure a place for their children on the first rung of that ladder.
“There are no bounds for what you do for your children,” the Citigroup analyst Jack B. Grubman said in the e-mail message that suggested he might have traded a bullish rating on AT&T for help from his boss, Sanford Weill, in getting his two-year-old twins into the Y program. If that’s correct, then few parents have a better right to make that claim.
In an earlier age of unapologetic privilege, a man making $20 million a year wouldn’t have had to bother with such things—members of the coupon-clipping classes simply put their sons down at birth for Groton or Eton. Now, anxious parents with no strings to pull are obliged to enlist friends and family in a frenzy of dialing on the morning after Labor Day, as they desperately try to secure one of the limited places on the schools’ applicant lists. “It’s the most democratic way,” explained Alix Friedman, the Y’s director of public relations. Indeed, it seems to have become a tenet of modern egalitarianism that the fairest way to apportion scarce resources, whether nursery-school places or postseason tickets, is according to people’s deftness in handling the speed-dial button. Life is getting to feel a lot like Jeopardy.
So it’s natural that people should get indignant when someone’s caught cutting to the head of the line—not that most of us wouldn’t have done the same if a pair of World Series tickets were in the balance.
That’s what predestination comes down to in these postmeritocratic times; it’s a matter of going through the motions of equal access at the same time you’re frantically trying to game the system. Among all the denials and disclaimers that were making the rounds after Mr. Grubman’s concession, in fact, the “butter-wouldn’t-melt” award would have to go to Ms. Friedman’s assertion that a million-dollar donation from Citigroup hadn’t helped to grease the way for Grubman’s twins. “No child is guaranteed admission here,” she said. “Every child—every child—goes through the same rigorous admissions process.”
What the skeptics found hard to swallow about that wasn’t just the implausibility of supposing that the Y accepted Citigroup’s money and then said, “We’ll get back to you on the Grubman kids”—or, what seems like an even bigger stretch, that a shrewd postcapitalist like Weill would have handed the Y a million bucks on pure spec. It’s the claim that the Y evaluates its two-and-a-half-year-old applicants according to a rigorous admissions process, as if all those little Warburgs, Schiffs, Allens, and Stings had been chosen on merit alone. It reminds you that precocious is just the Latin for “precooked”—you wonder if there are objective tests that can preselect the toddlers who will rise like soufflés under the preschool’s warm attentions.
Wouldn’t the parents of a 92nd Street Y rejectee be happier if the Y just came out and said that the fix was in? Otherwise, it’s a bit like being told that the passengers who are preboarding in first class were selected because they looked as if they would make the best use of the champagne, not because they paid more for their tickets.
It may be, as Nicholas Lemann suggests in The Big Test: The Secret History of the American Meritocracy, that the meritocratic system began to unravel when it ceased to be aimed at picking the best people for public service and became largely a matter of deciding how to hand out the goodies. What is clear is that opportunity is increasingly a matter of getting an early jump on things—of pretesting, preselection, preapproval, preadmission, and, not to be coy about it, prepayment. Or, as a letter I received the other day informed me, “You may already be a winner.”
Caucasian Talk Circles
Fresh Air commentary, October 6, 2003
The recall has been getting all the ink, but the item on tomorrow’s California ballot that has the most important national implications is what backers call the “racial privacy initiative,” which sharply restricts the state’s ability to classify people according to race.
Opponents of the measure argue that it will hamper efforts to gather information on discrimination, student progress, hate crimes, and health questions. Supporters defend it with the new rhetoric of color-blindness—they ridicule the stew of ethnic and racial identifications that students are required to tick off on University of California admission forms. As they put it, it’s time to “junk a 17th-century racial classification system that has no place in 21st-century America.”
They’re not going to get much argument on that. But the classification system they want to sweep away is more of a modern creation than an antique one. And if the language of racial classification seems inconsistent and jumbled, that’s the fault of the uneven social landscape we’re asking it to map.
Those inconsistencies came to the surface in a recent story about a fifteen-year-old high-school freshman in Oakley, California, who had gathered 250 signatures to start a Caucasian club. If African Americans, Latinos, and Asians could have clubs to “teach them their cultures,” as she put it, then why shouldn’t whites have one as well?
That logic was plausible to a lot of people, including 87 percent of the respondents to a poll conducted by a Los Angeles TV station. And while others thought the club was an ill-conceived idea, a lot of them blamed the multiculturalists for setting a bad example. As National Review’s Jay Nordlinger put it, “A Caucasian club—ugh! Enough of the Balkanization of America. . . . It is the Left’s fault. It is the fault of all of those who have insisted on the prominence—virtually the primacy—of race.”
Actually, the most revealing word there is “ugh.” What is it about the history of that quaint word Caucasian that makes even conservatives a little squeamish about seeing it in an organization’s bylaws? As it happens, the word is exactly as old as the American nation. It was invented in 1776 by the German anthropologist Johann Friedrich Blumenbach, a disciple of Linnaeus, as the name of one of the four basic racial stocks of mankind—he chose Caucasian in the belief that the white race began its peregrinations when Noah’s ark landed on Mount Ararat in the Caucasus.
By all rights, the Caucasian label should have vanished a long time ago, along with Mongoloid, Negroid, and the other categories of discredited racial theories. But it proved to be a conveniently genteel term for excluding people of the wrong sort. In fact Americans have rarely used Caucasian in its original anthropological meaning, which included not just Europeans but the peoples of the Middle East and North Africa. In 1919, the secretary of an immigration reform group remarked that the United States gave citizenship to many who were not Caucasians, including “Tartars, Finns, Hungarians, Jews, Turks, Syrians, Persians, Hindus, Mexicans, Zulus, Hottentots [and] Kafirs.” The Finns and Hungarians were presumably ruled out because they didn’t speak an Indo-European language; the Persians and Hindus because of low surface albedo.
By the 1920s, the word was turning up in the “Caucasian clauses” of organizational bylaws and in the restrictive housing covenants that became common in the North in the years following World War I. As late as 1947, a civil-rights report of the American Missionary Association said there was no immediate prospect of “a mass migration of Negroes, Jews, and other minorities into exclusively Caucasian areas.” And when Jews were reclassified as Caucasians soon after that, it had more to do with a reevaluation of their effects on property values than with any new findings in physical anthropology.
Even now, that dispensation hasn’t been extended to the other Semitic peoples. “It’s not Arabs against Caucasians,” explained the CNN anchor Jack Cafferty shortly after the September 11, 2001, attacks. It’s unlikely he would have been tempted to put that as “Arabs against whites.” When it comes to the crunch, Caucasian doesn’t mean much more than “white people who play golf.”
In fact the Caucasian label has become even more common in recent years, to the point where it’s part of the active vocabulary of a high-school freshman. That’s partly a response to the need for a term to pair with African American, another odd entry in the American racial lexicon. When African American was popularized in the late 1980s, it was supposed to suggest an identity defined by color rather than one defined by ancestry. But we don’t use African American the way we use labels like Italian American, where we feel free to drop the American when the context makes it clear. We talk about an Italian neighborhood, but not an African one. The “African” of African American isn’t a geographical label, it’s just a prefix that means “black.”
If we were being consistent, we’d contrast African American with European American. But that term has never caught on widely, probably because Europe is a more diverse place than Africa in the mental geography of most Americans. About the only people you see using European American are scholars and the modern racialists who have tricked out their programs in the language of multiculturalism. (A couple of years ago, an outfit called the European American Issues Forum persuaded the California legislature to proclaim a European American Heritage Month. That occasioned a lot less comment than if they’d called it White Heritage Month.)