Now there is W. Joseph Campbell’s “1995: The Year the Future Began” (California), a worthy, informative, and sporting attempt to convince us that the world we live in was crucially shaped by things that happened in 1995. (Campbell insists that there is a distinction between “the x that changed the world” books and his own “the year the future began” book, although the distinction is hard to grasp.)
The book is not completely persuasive, but that’s not important. None of the “x that changed the world” books are completely persuasive, for the reason that all dots have dots of their own. Unless you count God, there is no uncaused cause. Even the butterfly that started the hurricane flapped its wings for a reason. Whatever happened in 33 or 1959 or 1995 never would have happened unless certain things had happened in 32, 1958, and 1994. And so on, back into the protozoic slime. All points are turning points.
All points might not be tipping points. But that’s not what these books are arguing. They are seeking to confer before-and-after explanatory power on a single thing, or on what happened on a single date on the calendar. We can doubt the premise. But what the melodramatic titles are really and usefully doing is drawing our attention to something—pepper or 1959—that we might otherwise have ignored. Do melodramatic titles also sell books? So what if they do? We’re in favor of selling books.
Campbell’s book draws our attention to the nineteen-nineties. And he’s right when he points out that the decade is pretty much ignored. Maybe this is because many Americans remember the nineteen-nineties as a tranquil time, or maybe it’s because the decade is wedged between two periods that attract a lot of industrial-strength historical notice: the Reagan era and the “age of terror.”
How tranquil were the nineteen-nineties? “Our Long National Nightmare of Peace and Prosperity Is Finally Over” was the headline in the Onion the month George W. Bush took office, January, 2001. His Administration took care of that in a hurry. In fact, though, the nineteen-nineties were not so peaceful. Dozens of wars were under way around the world. It’s just that, especially while Bill Clinton was President, the United States was involved in very few of them.
It was, however, genuinely a time of prosperity. In 1993, the year Clinton became President, median household income in the United States was $48,884. Six years later, it was $56,080, and the federal government ran a $125.6-billion surplus. There was an even bigger surplus in 2000, and ever since 2002 the federal government has been in the red. In 2013, median household income was $51,939, and the budget deficit was $680 billion (which was small by post-Clinton standards).
The stock market began the nineteen-nineties with the Dow at 2,753. At the end of trading in 1999, the Dow was at 11,497. Middle-class Americans tend to feel that life is good when their 401(k)s are robust. But the quality of public life in the nineteen-nineties, as measured by the headlines, was actually somewhat sad and tawdry. Names in the news: Tonya Harding, Rodney King, Ted Kaczynski, Lorena Bobbitt, Amy Fisher, Heidi Fleiss, Susan Smith, Clarence Thomas and his can of Coke. The movie of the decade was “Titanic.” The No. 1 pop star was Mariah Carey. In baseball, it was the steroid era. (In basketball, there was Michael Jordan, so that much was good.)
The nineteen-nineties was Columbine, the Atlanta Olympics backpack bombing, the World Trade Center truck bombing, and the siege in Waco. Elsewhere around the globe, there was a civil war in Somalia, genocide in Rwanda, and ethnic cleansing in the place formerly known as Yugoslavia. Chechnya was at war with Russia, and a civil war began in Sierra Leone that lasted eleven years. The decade ended with the worldwide Y2K hysteria, a nutty cocktail of digital overthink and Luddite millennialism.
“Don’t ask, don’t tell,” the phrase coined to sum up the Clinton Administration’s policy of resolving the issue of gays in the military by resolving to leave the issue unresolved, seems a fitting slogan for the era. It was a period of loose ends, of isolated eruptions, spasmodic violence, and one-off scandals. Nothing went with anything else. This is because there was no context to hold the headlines together. There was no Cold War, no civil-rights movement, no Vietnam or oil embargo or Reagan revolution, no catchy new mode of music or art or fashion to be forever and fondly associated with the times. Clinton was the obvious person to give the decade an imprint, but he turned out to be the protagonist in yet one more set of depressing headlines about behavior that made no sense.
Clinton’s affair with Monica Lewinsky is one of the five things that happened in 1995 that Campbell believes opened the door to the future. The others are the O. J. Simpson trial, the Oklahoma City bombing, the Dayton negotiations that settled the Bosnian war, and the rise and fall of the Internet browser Netscape Navigator.
The list certainly reflects the inchoate spirit of the age. But that is not Campbell’s point. His point is that our contemporary (American) world started with a White House sex scandal; the murder trial of a former football star; a set of agreements hammered out among foreign heads of state on an Air Force base in Ohio; a loner who thought that blowing up a federal office building was justified on political principles; and a computer program that ultimately lost the “browser wars” to Microsoft. You have to admire a historian who proposes to extract reverse-prediction gold from that material.
Campbell’s specialty—he teaches in the School of Communication at American University—is the history of journalism. He is the author of the indispensable “Getting It Wrong: Ten of the Greatest Misreported Stories in American Journalism,” a debunking of exaggerated or fallacious stories that were repeated so often they became what he calls “media myths.” These range from William Randolph Hearst’s promise to “furnish the war” with Spain, in 1897, to the Jessica Lynch story, in 2003, and the coverage of Hurricane Katrina. So “1995” is devoted less to the five world-changing events of that year than to the way they were covered, interpreted, and handed down to us.
What was the lasting importance of the O. J. Simpson trial, which began on January 24, 1995, and concluded on October 3rd? Was it the demonstration that a rich defendant can lawyer up and beat a criminal prosecution? That hardly seems news. A lot of people have thought that the importance of the Simpson trial had to do with race. When the verdict was announced, many white Americans were surprised that a jury could acquit a man who had motive, opportunity, and no alibi, and whose blood appeared to be all over the place. Most people consider it highly unusual for their blood to be anywhere outside their bodies. Black Americans tended to be surprised (or not) that white Americans could be surprised that the case of a black defendant might be mishandled by the cops. The trial was therefore taken to expose the insidious role that race plays in the law-enforcement and criminal-justice systems, and the response to the verdict to reveal a deep split between white and black views on the state of racial relations.
Campbell thinks that the significance of the Simpson trial had almost nothing to do with race. He thinks that Simpson was acquitted because, well, he was a rich defendant who lawyered up and beat the rap. Campbell doesn’t put it this way, but if Simpson had been a white sports celebrity he would very likely also have got off—and although some people might have been upset by the outcome, no one would have been astonished. The Simpson verdict was an anomaly because Simpson was an anomaly, a wealthy, unthreatening, well-connected entertainment star who happened to be African-American. Money and fame bought him a huddle of high-priced lawyers, the kind who don’t chase just any ambulance.
The day the Simpson verdict was announced—the judge, Lance Ito, had held the verdict overnight, in part to insure that the announcement would receive maximum coverage—was an interesting moment in the history of American race relations. Campbell doesn’t dispute this. What he disputes is that it was a moment of enduring impact. He says that the trial “dented but did not reverse” a trend in public-opinion polls showing that both white and black Americans believed that racial relations were improving. Simpson’s acquittal was a blip, not a turning point.
So what does Campbell think the enduring significance of the Simpson trial was? It established the credibility of DNA evidence. That is not the first thing that jumps to mind. After all, the DNA evidence against Simpson was ripped apart by one of his lawyers, Barry Scheck. How could this have made people more comfortable with the use of DNA evidence in criminal trials? Campbell argues that Scheck never challenged the validity of DNA evidence per se. He only challenged the handling of that evidence by police investigators. The implication of Scheck’s argument—that Simpson’s samples were corrupted—was that properly handled samples would have yielded admissible results. Which is, in fact, what Barry Scheck believes.
This is sideways history. A relatively technical courtroom exchange has unexpected consequences for the criminal-justice system—and only because the defendant happened to be famous and the crime spectacular, which meant that the trial was televised and millions of people watched it. Like a lot of sideways history, the theory is provocative and a little deflating, especially for someone who knows that, no matter how productively he spends the rest of his life, he will never make back the time he spent following the Simpson trial. Such a person would hope that the experience held a grander lesson than this.
Much of “1995” is sideways history, extracting unintended or unexpected long-term consequences from apparently isolated and eccentric events. But Campbell’s discussion of Netscape Navigator and the Internet is an exception. There he makes a tipping-point argument.
Netscape Navigator was a browser created by a group led by a twenty-four-year-old named Marc Andreessen, who was described in Newsweek as “the über-super-wunder whiz kid of cyberspace.” The company’s I.P.O., on August 9, 1995, was a huge success. Five million shares went on sale on Nasdaq, at twenty-eight dollars a share; they closed the day at $58.25. The Times called it “the best opening day for a stock in Wall Street history for an issue of its size.”
A little more than two weeks later, Microsoft released Windows 95, backed by what was reported to be a three-hundred-million-dollar marketing campaign, along with its own browser, Internet Explorer 1.0, and the browser wars were on. Netscape, of course, was quickly and easily outmuscled by Microsoft. In 1998, Netscape was acquired by AOL, and it faded into insignificance. (Although, Campbell points out, a nonprofit venture that Netscape had set up, Mozilla, later produced the popular open-source browser Firefox.)
Campbell thinks that the Netscape I.P.O. woke the world up to the Internet. It “brought the Web into popular consciousness,” he says; it “demonstrated that the Web could be a place to make fortunes fast.” This does seem a lesson of lasting impact, although no one wants to invent anything so complicated as a browser or an operating system anymore. Today, everyone dreams of inventing an app with a couple of friends from college, selling it to Google for a hundred mil, and kicking back for the rest of their lives.
Possibly the Netscape I.P.O. was not what tipped the Web into mainstream life. But it was arguably part of a critical mass of Internet phenomena that emerged at almost the same moment. Campbell estimates that in 1995 between twenty and forty million people used the Internet. That number seems the key to what happened: twenty to forty million people was just enough for entrepreneurs to figure that it might be worthwhile launching a boat or two in the direction of this unmapped continent. It might turn out to be Greenland—but what if it was India!
Craigslist, eBay, and Salon all started up in 1995. Yahoo! was incorporated that year, and the New York Times Web site began appearing. Java was introduced, by Sun Microsystems, in 1995. And 1995 was the début year of Amazon and of the wiki (which Campbell reports is Hawaiian for “quick”). In an industry that has tremendous turnover, and in which capital seems to chase every new idea the moment it’s whiteboarded, the longevity of these early sites is impressive.
But what kept them in business was the transformation of the computer from a place of work into a place of recreation. Ideally, if you are selling things, you want people to be somewhere you can find them, and to be there for reasons other than to be sold something. People read magazines for the stories, not the advertisements; they watch television for the shows, not the commercials.
In the beginning, what got people to turn on their computers during leisure hours was the computer game. Gamers were a reliable, even addicted, audience, but they were not a huge audience. What transformed the Internet into the virtual “place” for almost every kind of transaction was social media. The big mover there was Facebook, a Web site that people would apparently go online to check out every time they had a spare nanosecond. And Facebook wasn’t launched until 2004. A turning point. The Web site that changed the world. Someone is probably writing a book about it right now. ♦