
Ten Things That Made the 1990s

by Christopher Dale
fact checked by Darci Heikkinen

Each new decade seems to bring disdain for its immediate predecessor but infatuation with the next-to-last one. The Yuppie 1980s were intrigued by the Hippie 1960s, the tech-driven 2000s by the pre-web ’80s.

So while the 2010s obsession with all things 1990s wasn’t surprising, it seemed like more than that. The ’90s became idealized as perhaps our last decade of smooth sailing. The Cold War was ending, the Internet Age was beginning, and liberal democracy was poised to proliferate indefinitely. We were out of danger and full of hope.

Here are ten things that defined the 1990s—a decade that feels more like a century ago than a mere generation.

Related: 10 Things Your Ancestors Did Better Than You

10 The Naïve ’90s: A 12-year Decade with Bookends

The fall of the Berlin Wall in 1989

In terms of the global order, few decades are as chronologically neat as the 1990s. They began two months early with the air of Western invincibility; they ended 20 months late with that fantasy in smoldering ruins.

It started with an exceptional promise: on November 9, 1989, the Berlin Wall began crumbling. East Berliners embraced loved ones they hadn’t seen in nearly three decades as masses fled the Soviet-controlled sector for those occupied by the U.S., the UK, and France. While the USSR itself would hang on for another two years, the message was loud and clear: the West had won.

Overnight, a bifurcated world had a single superpower: the USA. The 1991 Gulf War rout (see the next entry) reinforced this. Despite the small setback of 1993’s failed operation in Somalia—the so-called “Black Hawk Down” incident—the U.S.-led NATO mission to stop dictator Slobodan Milošević from slaughtering ethnic Albanians in Kosovo showed what the full might of Western military power could accomplish.

America’s perceived invincibility lasted a year and a half into the next decade. It came crashing down, along with two of the world’s largest skyscrapers, on September 11, 2001. Instantly, American citizens no longer felt exceptional in their own safety. Soon, failed attempts to capture Osama bin Laden and the disastrous invasion of Iraq proved America’s military far from infallible. The naïve ’90s were over.[1]

9 The Gulf War and the Dawn of a Superpower’s Hubris

President George H.W. Bush Announces Persian Gulf War 1-16-91

In August 1990, Iraqi forces under dictator Saddam Hussein invaded the tiny Persian Gulf country of Kuwait. The response from the West was clear: the illegal incursion would not stand.

Largely lost to history, though, is that many feared freeing Kuwait would mean massive military casualties and devolve into a Vietnam-esque quagmire. After all, the Iraqi army was the world’s fifth-largest, with some 950,000 personnel, 5,500 tanks, 10,000 armored vehicles, and 4,000 artillery pieces. “If they choose to stand and fight, we are going to take casualties,” said allied commander General Norman Schwarzkopf. “If they choose to dump chemicals on you, they might even win.”

Regardless, when Hussein refused to withdraw, armed conflict began on January 16, 1991. President Bush announced the start of the operation, and the world braced for a long-term engagement that could kill tens of thousands of coalition troops.

What ensued was among the most lopsided fights in modern military history. Iraq lost at least 25,000 soldiers and was completely driven from Kuwait by the end of February. And despite the wide-scale, open-territory fighting, coalition forces lost fewer than 400.

For the U.S., the rout was one reason the 1990s became a decade-long pat on the back for what appeared to be the last superpower standing. America’s righteous might would ensure the protection and extension of liberal democracies worldwide. We could, it seemed, police the globe.[2]


8 “The End of History” (and Common Sense)

Francis Fukuyama and The End of History

The Berlin Wall, the Gulf War, and the collapse of the Soviet Union: America and the West were riding high. Maybe too high—as in “what are you smoking?” high.

It was against such a back-slapping backdrop that the most naïve political philosophy book of all time was penned. In 1992, American political scientist Francis Fukuyama published The End of History and the Last Man. In the wake of liberal democracy’s triumph over Soviet-style communism, Fukuyama declared “not just…the passing of a particular period of post-war history, but the end of history as such: That is, the end-point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.”

Humanity had, Fukuyama argued, evolved to its predestined governance endgame. He contended that history was an evolutionary process and that liberal democracy was the best—and therefore the final—form of government for all nations. For proof, Fukuyama pointed to the progressive ethical, political, and economic benefits of liberal democracy since the late 1700s.

In fairness, Fukuyama gave his victory declaration some wiggle room: authoritarianism may indeed resurface, he predicted, but it would eventually be overwhelmed by the righteousness of democracy.

To say that Fukuyama’s tome has not aged well would be an understatement of historic proportions. Radical Islam, Vladimir Putin’s USSR 2.0, and democracy’s erosion everywhere from Eastern Europe to the United States have proven Fukuyama little more than an eloquent fool.[3]

7 Nirvana and the Age of Apathy

Nirvana – Smells Like Teen Spirit (Official Music Video)

At a time when seemingly everyone who’s ever picked up a guitar (and many who haven’t) is gaining induction into the Rock & Roll Hall of Fame, saying one album “changed everything” has become a watered-down phrase. Many bands make hits; few change paradigms.

On September 17, 1991, a wildly successful 1980s rock band, Guns N’ Roses, released an ambitious double album, Use Your Illusion I & II. Anchored by hits like “November Rain” and “Don’t Cry,” it was a critical and commercial success.

But a week later, it was instantly old-fashioned. On September 24, Nirvana’s Nevermind brought ’80s glam rock to a grinding halt.

Television expedited this transition from leather pants and puffy hair to ripped jeans and flannels. In fact, the video for Nevermind’s debut single, “Smells Like Teen Spirit,” is arguably the most culturally influential video in music history. Power ballads had their day; the era of discordant grunge had arrived, driven by a short, skinny kid with an ability to yell in rhythm (another Nevermind single, “Lithium,” showcases this).

Kurt Cobain’s outsized talent belied his undersized ambitions. This was great music delivered with a cynical, “whatever” angst that bands would exude, to varying degrees, through the ’90s and beyond. Buttressing Nirvana’s game-changing prominence was the success of other Pacific Northwest bands, including Pearl Jam and Soundgarden.[4]


6 Success = Sellout: The Decade of Anti-Validation

Fight Club – Tyler Durden speech

Every time we check our social media feeds—or our comments under online articles like this one—we do something that’s, like, totally not the ’90s. Because, dude, who really cares what they think?

The 1990s had a cultural conundrum: it wasn’t cool to try too hard, so people made a conscious effort not to try too hard…or at least to appear as though they weren’t trying too hard. Being cool took work, and lazy was cool.

This sort of poseur nonchalance arose from a confluence of factors. One was the aforementioned cultural phenomenon of Nirvana and the subsequent advent of grunge and alternative rock music, which ushered in both the fashion and the ethos of anti-aspiration.

Another was a dearth of causes. Newly freed from the specter of nuclear annihilation and tiring of upbeat, patriotic Reaganism, Generation X found itself purposeless, adrift, and largely unenthused. This chronological circumstance would be aptly summarized in a speech from the 1999 film Fight Club, when Brad Pitt’s Tyler Durden declares the gathering of angry, aimless men the “middle children of history.” “Our great war is a spiritual one,” he says. “Our Great Depression is our lives.”

Such dejection was contagious and ultimately endemic to the ’90s. Doing something with the goal of wealth or acclaim became uncool, and being labeled “inauthentic” was a cultural death sentence. With the free-love, mass-protesting 1960s dead and buried, ’90s rebellion against corporatism and conformity thrived on personal disdain and lethargy.[5]

5 Ross Perot and the Rise of the New Democrats

George H.W. Bush-Debate with Bill Clinton and Ross Perot (October 11, 1992)

In the past hundred years, no U.S. president has been elected with a lower percentage of the vote than…Donald Trump? George W. Bush, perhaps?

Nope. The answer is William Jefferson Clinton, who in 1992 became the 42nd president with just 43.01% of the vote. (Runner-up: Nixon in 1968, with 43.42%.)

Arkansas Governor Clinton became U.S. President Clinton because of two words: Ross Perot. In 1992, the short, eccentric billionaire from Texas dropped in and out of the race, espousing protectionism and low taxes along the way. Officially, he ran as an independent (his Reform Party would come later); unofficially, the quirky conservative was siphoning support from the otherwise popular sitting president, George H.W. Bush. In the end, Perot won 18.9% of the popular vote—the highest share for a candidate outside the two major parties since 1912.

What resulted was predictable: Clinton’s Democratic Party got walloped in the 1994 midterm elections, partly because voters really hadn’t voted for a liberal direction two years prior. So to have any chance of re-election in 1996, Clinton tacked toward the center, embracing measures like funding for more police officers and 1996’s welfare-to-work reform, which placed tighter stipulations on government assistance.

Clinton won easily in 1996. In the process, the Democratic Party went from trying to implement universal health care in 1993 to deregulating the financial system (the 1999 repeal of key Glass-Steagall provisions). Many were left wondering whether there were really two political parties anymore.[6]


4 Minorities Go Mainstream as Rap Matures

2Pac – I Get Around

In the 1990s, America suddenly remembered it had Black people for the first time since…oh, say 1968 or so. While that’s probably overstating it, it’s no exaggeration that the early 1990s were a demarcation point for Black culture in both the United States and the world. A major reason for this is the rapid ascent of hip-hop and reality-based rap.

Was rap around in the 1980s? Sure. But it was rarely played on the radio, and MTV largely refused to air it, using its traditional rock and new wave formats as an excuse to shy away from biting, often violent rap lyrics.

And let’s face it: rap was still feeling its way throughout the 1980s, and despite fond memories of trailblazers like Kurtis Blow, Grandmaster Flash, and even Run-D.M.C., most ’80s artists can’t hold a candle to the talent that emerged in the early 1990s. (Most, not all; Public Enemy and Ice-T are exceptions.) Others, like N.W.A., formed in the late ’80s and spawned solo legends—Dr. Dre, Ice Cube—after the turn of the decade.

Biggie. Tupac. Wu-Tang Clan. Snoop Dogg. Rap matured at the same time it went mainstream. Cheesy acts like MC Hammer, Vanilla Ice, and Sir Mix-a-Lot were soon laughed off the stage, making way for Jay-Z, Missy Elliott, Nas, and others. While rap was pioneered in the ’80s, it proliferated in the ’90s, cementing hip-hop and Black culture in mainstream Western society.[7]

3 Rodney King and the First Viral Video

30 years after Rodney King LAPD beating video, what has changed?

On April 29, 1992, four Los Angeles police officers were acquitted of using excessive force. The incident in question occurred after an intoxicated paroled felon led them on a high-speed chase and initially put up mild resistance to arrest.

The motorist was Black. And unarmed. And this type of thing had played out far too often in the United States and, especially, Los Angeles. And the whole thing would have gone unnoticed had someone not caught it on a handheld camcorder.

The video of the March 3, 1991, altercation between the motorist, Rodney King, and the four officers quickly ran on nightly newscasts across the country. Regardless of the details, it was clear that the officers were kicking King and beating him with nightsticks while he lay on the ground, largely harmless.

It was on camera, on the airwaves, and now, many wanted the handcuffers in handcuffs. The jury, which included no Black members, disagreed and acquitted the officers.

The streets exploded. The Los Angeles Riots marked the widest-scale racially-motivated violence in a quarter-century. The Black community understandably took the verdict as proof that law enforcement not only abused them but did so without consequences.

Ironically, three years later, the Rodney King verdict was cited as a main reason a mixed-ethnicity jury acquitted accused murderer O.J. Simpson. Thirty years on, the ability of everyday people not only to record but to stream video has proven the King footage an ill omen, one in which bruises are replaced by bullets.[8]


2 A Decade Divided as the World Comes Online

AOL: The Rise and Fall of the First Internet Empire

One of the most overlooked dates in human history is August 6, 1991. That’s the day the first website was launched. The address, info.cern.ch, hosted a rousing, rollicking…instruction manual for how to use the World Wide Web. How quaint.

All Al Gore memes aside, the site (and, with it, the world) was brought online by British computer scientist Tim Berners-Lee of the European Organization for Nuclear Research, a.k.a. CERN. It was the first successful attempt at “going viral”: at the site, people could learn how to create their own web pages and about hypertext—coded words or phrases that link to additional content.

Just like that, a new medium was born for the first time since the advent of television nearly half a century earlier. Soon after, an ambitious startup called Quantum Computer Services renamed itself America Online and, in 1993, began mailing install kits to get customers up and running. By 1996, AOL had five million customers, and Netscape’s Navigator browser had revolutionized surfing.

All the while, the public’s ability to search the internet matured. Yahoo! launched its web directory in 1994; AltaVista followed in 1995 as the first fast search engine to scour a significant portion of the web. Both had a few years in the sun before the 800-pound gorilla, Google, launched in 1998.

In 1990, the web basically didn’t exist. By 1999, businesses and major institutions (like universities) had already graduated from dial-up modems to high-speed Ethernet connections.[9]

1 Boom to Bust: The Dot-Com Bubble

What Caused the Dot-Com Bubble?

Several entries on this list deal with hubris, and for a good reason: the ’90s were chock full of it. And when you combine an air of invincibility with a promising new medium, the result is predictable: a bubble built on confidence and burst by reality.

As the internet took shape, its potential was obvious. Here was a platform to promote products, share information, and improve consumer experiences and lives. The web had disruptive potential that could expedite research and make communication more convenient and less expensive.

Now, if we could just figure out how to make money off this damn thing…

That was the ’90s Dot-Com Bubble in a nutshell: a lot of neat new websites from a lot of neat new cyber-companies…with not a clue how to monetize them. As the fledgling tech-heavy NASDAQ stock index rose from 1000 in 1995 to 5000 in 2000, stock equity ballooned largely on speculation. The market collectively assumed these companies would soon develop sound strategies for turning their temporarily in-the-red startups into permanently in-the-black behemoths.

The market was, collectively, wrong. Pop went the web weasel, and the early 2000s recession returned the world to reality. Many sites quickly vanished. Remember Pets.com? Webvan.com? eToys.com or Flooz.com, perhaps? Yeah, me neither.[10]

Christopher Dale

Chris writes op-eds for major daily newspapers, fatherhood pieces for Parents.com and, because he's not quite right in the head, essays for sobriety outlets and mental health publications.


