The 1990s: When Technology Upended Our World

If you were to pick the one, singular, culture-defining moment from the ’90s—a decade that gave us so many—you’d be hard pressed to beat the Bill Clinton–Monica Lewinsky affair. Even now, in our current climate of oversharing and punch-drunk numbness to the spewing of digital media…

Bill Clinton Once Struck a Nuclear Deal With North Korea

President Bill Clinton took the podium on October 18, 1994, with a speech that reads like a sigh of relief—the announcement of a landmark nuclear agreement between the United States and North Korea. “This agreement is good for the United States, good for our allies, and good for…”

1990s: The Good Decade

On August 11, 1992, I was in suburban Minneapolis for the opening of the Mall of America, the largest shopping mall on Earth. I was on a book tour, and in the early afternoon I spent 60 minutes with the local AM radio station host, seated in chairs on a small plywood platform…


1990s Fashion Trends

The Ladette was 90s feminism – we could be just as messy as the boys, and fight and swear and drink as much. And talking of messy, Grunge popped up in the US to show us that wearing vintage clothing any-old-how was the way forward, and that gender equality could also be achieved by men wearing pretty dresses and smudged eyeliner with their jeans, just like women did. Hedonistic and illegal rave culture also had its own codes in 1990s fashion.

On the flip side, we all worried about nuclear disasters (Chernobyl had happened only a few years earlier), the hole in the ozone layer and saving the whales.

Moschino Cheap and Chic, suit with Roy Lichtenstein print, 1991

90s Fashion: Women

On the international 90s fashion stage, minimal designers like Calvin Klein, Jil Sander, Prada and Donna Karan gave us stripped down outfits and slip dresses, and the Belgian designers dubbed the Antwerp Six promoted a doomy sort of cerebral design.

Tom Ford provided sexy and sleek gowns. John Galliano and Alexander McQueen proved that British designers could do perfectly executed and very creative glamour, even as they partied their heads off with The Primrose Hill set, a posh and beautiful group of actors, models and assorted rich people whose coke-snorting exploits the press just could not get enough of.

90s Supermodels

There was a new breed of model in town, the supermodels, and “We don’t wake up for less than $10,000 a day”, as Linda Evangelista said in 1990. That statement provoked outrage but it was probably true – Linda and her fellow models dubbed “The Big Six” (Christy Turlington, Naomi Campbell, Tatjana Patitz, Claudia Schiffer and Cindy Crawford) were EVERYWHERE in the 1990s, not only on the catwalks and magazine covers but in the gossip columns too. Christy Turlington, as the face of Maybelline, earned $800,000 for 12 days’ work a year. 1995 saw Claudia Schiffer earn a reported $12 million.

Heroin chic

Those supermodels were tall, classically beautiful women, but another type of beauty standard was in town too: heroin chic. Initially it was epitomised by Kate Moss – only 5’7” and not a six-foot Glamazon, with skinny, bandy legs, flat-chested and very young – she was 14 when she was “discovered” and 16 when she shot to fame, photographed wearing a grubby vest and knickers and no makeup in a grungy flat by Corinne Day for The Face magazine.

This kick-started a trend for models so thin they looked ill, or like drug addicts – hence the heroin chic label. The crumpled, unstructured clothes in haphazard layers that Moss and fellow skinny models were put in heightened this general effect.

Faith Ford at the Emmy Awards, 1994. Photo by Alan Light.

1990s Fashion: Daywear and Evening Wear

90s Grunge

Over in the US, a recession was going on, and so people began rummaging in second-hand clothes stores for outfits. There was a huge music scene in Seattle, where people just picked up guitars and started to strum, and hey, apparently people in Seattle like to stay warm, though not by any means immaculate. So their thrift shop finds mainly seemed to consist of plaid flannel shirts worn over T-shirts from any decade, with a cardigan on top. Just jeans and boots did fine on the bottom half.

The style was good for both male and female, and when those Seattle bands (Nirvana, Pearl Jam, Soundgarden) made it big in the UK, the nation’s teenagers wore the same.

Kate Moss – Decorté advertisement

Prom dresses, petticoats and slips in 1990s Fashion

A variation on the unisex look was to rifle through and find old prom dresses or even old petticoats and slips and wear them. But please, don’t iron or repair them first, and make sure they are worn with discordant big boots, and don’t spend time on your makeup – smear it on and sprinkle glitter over the top.

The world was shocked when Marc Jacobs, the designer for preppy all-American brand Perry Ellis, took grunge style and tried to make it high fashion. He had cheap cotton flannel shirts recreated in Italian silk, and he had holey knitwear specially made. It was not a success and he lost his job.

Slip dress design as minidress, Calvin Klein, 1991

The slip dress

Calvin Klein did rather better when he simply adopted the slip dress part of the grunge dress code. The Calvin Klein brand had, up till now, been riotously sexy, but with the AIDS epidemic, sexiness was suspect. Klein needed a new direction, and he cleverly chose androgyny and the waif-like Kate Moss, who represented his brand throughout the nineties.

The slip dress, though it looked like underwear, was subversively understated in 1990s fashion, and many brands took up the look. When Princess Diana wished to reinvent herself as a fashion icon in 1996, John Galliano at Dior made a slip dress for her in navy and black. It caused a furore. Slip dresses were worn as both day and evening wear, layered over jeans or on their own.

The slip dress sort of straddled maximal grunge and minimal futurism in 1990s fashion, depending on how it was worn. Minimalism was the clean, uncluttered lines of designers like Donna Karan and Prada. Like the 60s futurists, they looked at silver and white colours, and futuristic fabrics. Prada designer Miuccia Prada made nylon the most wanted material, with simple nylon shopper bags, jackets and dresses in black with the distinctive small red label shunning ornamentation. But Miuccia Prada had her decorative side and launched Miu Miu too, with its quirky, little girl dresses and cute silver shoes.

Givenchy evening suit, 1990

Earth colours

For grown-ups who didn’t want to be little girls or space aliens, tonal layers of earth colours were the answer, in comfortable but low-key luxury fabrics like cashmere. These natural colours were also used by environmentalists keen to create collections without harsh chemical dyes, using lower-impact fabrics like hemp and linen.

1990s Bridal

Bridal wear often follows trends in evening wear, with versions of the same dresses but in white. 1990s fashion was no exception, and some of the most fashionable brides wore little white slip dresses. For added romance, some, like Cindy Crawford, had lace-covered versions; others, like Stella Tennant, who was a staunch supporter of Helmut Lang, had a minimalist version, but with layers of tulle as a concession to the bridal theme.

1990s Perfume

Really clean, unisex perfumes became popular in the 1990s, as designers like Calvin Klein made youthful perfumes that could be worn by everyone, like CK One and CK Be. He also made the very popular and more feminine Escape and Obsession. But as well as this there were still heavy, sexy orientals, like Guerlain’s Samsara, Yves Saint Laurent’s Opium, and Coco by Chanel. Christian Dior’s Poison was still popular from the 80s, too. More “pretty” scents included Estée Lauder’s Pleasures and Beautiful.

1990s Makeup

Fashion in the 90s took bright 1980s makeup and softened it into pretty washes of colour on lids and cheeks, perhaps a pastel blue eyeshadow and pale pink blusher with pink lipgloss. A strong red lip with minimal eye makeup was a powerful alternative.

But what the decade is more remembered for is a “no makeup makeup look” – with all your flaws and imperfections smoothed out and features subtly highlighted with, say, a dash of tinted moisturiser and a slick of clear mascara. The aim was to look naturally glowing, with no obvious colour. Alternatively, you could just literally wear no makeup.

Manuela Arcuri – Marie Claire photoshoot, 1996

Hair in 1990s fashion

Long, loose hair was the most popular women’s style, though for practicality or a sporty vibe it was scraped back into a ponytail. Mostly it was left artfully product-free and a little fluffy, although mousses and gels were available to tame the flyaways. A romantic, curly look was celebrated, either long, full and free or gently gathered so that tendrils fell around the face.

The classic long bob looked neat, though some supermodels (and Uma Thurman in Pulp Fiction) went for a short, sharp, classic bob.

The Rachel

The most requested hairstyle of the 1990s was said to be The Rachel. The TV series “Friends” debuted in 1994, and Jennifer Aniston’s character, Rachel Green, had the haircut people wanted – bouncy, layered, shoulder-length, obviously styled to within an inch of its life yet at the same time artfully tousled.

For all the long hair styles that proliferated in 1990s fashion, short, choppy hairstyles looked cute too. Skin, the singer from Skunk Anansie, was majestically beautiful with her bald head, and if you wanted to look alternative, you could go for the undercut – a conventional longish length on top but shaved short at the back up to the top of the ears, usually worn pulled back in a ponytail to show the two lengths.

1990s Underwear

As with everything in 1990s fashion, you could go in two opposite directions with underwear. One was to go for cotton jersey triangle bras, or no bra and just a little vest with spaghetti straps. No lace decoration or little bows, just plain jersey. Knickers were the same, plain jersey in block colours of black, white or ecru, the only decoration perhaps a wide elastic waistband with the name of the brand you were wearing – Calvin Klein for preference.

Knickers were cut quite high in the leg and slightly high-waisted, to make legs look longer and to make sure the elastic showed above baggy, low-slung trousers; the alternative was triangle briefs with string sides.

Alternatively, you could try the Wonderbra. Padded, uplifting and determinedly sexy, the brand proved a sensation when Eva Herzigová starred in its “Hello Boys” campaign.

1990s Fashion Accessories

Hats weren’t really a thing in the 90s, unless you count the ubiquitous cap, the bucket hats worn by ravers, and pop star JK of the band Jamiroquai’s trademark crazy headwear. Stephen Jones made some lovely creations to go with John Galliano’s catwalk shows, but grungy girls and ravers alike preferred cutesy, ironic hair accessories, little-girl clips and elastic bands to actual hats.

Footwear

For footwear in 1990s fashion, anyone of an alternative bent wore Dr Martens boots with their floaty dresses, or even huge, clompy army surplus boots on the end of bruised bare legs or black tights. Blocky loafers were another option – Patrick Cox’s “Wannabe” loafers with a block heel, in a multitude of colours and finishes including mock croc, were the designer desirable. Child-like jelly shoes were also sought after, and Cox made a version of these too, with plastic figurines embedded in the heel.

Trainers never lost their hold and were as collectible as ever, with classic Stan Smiths and new Nike Air Jordan high-tops utilising technology for a bouncy sole.

For an elegant look, kitten heels were both practical and coquettish. They had pointed toes and sometimes a slingback, and jewelled or velvet finishes. Sometimes an all-black outfit would be accessorised with a quirky leopard-print or fuchsia velvet kitten heel.

Perspex ring, transparent with stripes

1990s jewellery

Because 1990s outfits were so over the top, often in silver, glitter or metallics, or worn with beaded belts or jewelled shoes, jewellery wasn’t much worn in 1990s fashion.

The exception was probably the velvet choker, worn with or without a little pendant, and for those who were baring their bellies, a new trend of not only nose but belly button piercing shocked and surprised many who had never considered piercing anything but their ears.

Child-like (preferably actual children’s) plastic rings, beads and bracelets, maybe with glitter embedded in them, were layered for the grunge look, along with deliberately tacky cheap tiaras and hair accessories.

For the minimalists, space-age silver chokers and cuff bracelets by designers like Elsa Peretti were worn to accessorise an outfit, along with clear Perspex.

Pearl necklaces and bracelets never went away for the more traditional.

1990s Fashion: Sportswear

Pop stars All Saints were revered by teenage girls for their cool uniform of very baggy camo trousers teamed with a sports crop-top bra, showing stretches of stomach in between. They paired this with baggy tracksuit jackets, matching trainers and caps. “Sporty Spice” – real name Melanie Chisholm – kept to a similar look, with tracksuit bottoms instead of camo trousers.

Ralph Lauren’s Polo line, Tommy Hilfiger and Calvin Klein provided the “designer” version of sportswear, under particularly desirable names.


Goo Goo, Growl Growl

The latter part of the ’90s was all over the map when it came to rock music. Hip-hop and dance started to trickle in between guitar riffs. Sugar Ray excelled in carefree party anthems (1997’s “Fly”), thanks to a combo of singer Mark McGrath’s frat-boy good looks and DJ Homicide’s crackling beats. Goo Goo Dolls, once a grittier blues-punk band, went the adult contemporary route with its 1998 mega-hit, “Iris.” And nice-guy group Matchbox Twenty made it OK for rockers to wear their hearts on their sleeves. (It helped in getting the girl.)

Conversely, a brash noise was arising thanks to the rap-rock and nu-metal genres. Braggadocio and drop-C guitars reigned supreme for bigwigs like Limp Bizkit, Korn and Kid Rock. This infusion of machismo may have been to blame for the mayhem at Woodstock 1999, essentially putting the nail in the coffin of the decade that smelled like teen spirit.


William Gibson and Bruce Sterling publish The Difference Engine

William Gibson and Bruce Sterling are known as two of the leading lights in developing Cyberpunk literature in the 1980s. In 1990, the pair collaborate on what many consider to be the first blockbuster "Steampunk" novel. Imagining a world where Charles Babbage's Analytical Engine was built and the pace of technology greatly accelerated, The Difference Engine featured many historical characters, such as Lord Byron, Ada Lovelace, and John Keats, placed in an alternate history where rival factions competed to capture a stack of secret punched cards containing an important program.


Francis Fukuyama publishes The End of History and the Last Man

The central claims of Fukuyama's thesis:

  • History should be viewed as an evolutionary process.
  • Events still occur at the end of history.
  • Pessimism about humanity's future is warranted because of humanity's inability to control technology.
  • The end of history means liberal democracy is the final form of government for all nations. There can be no progression from liberal democracy to an alternative system.

Misinterpretations

According to Fukuyama, since the French Revolution, liberal democracy has repeatedly proven to be a fundamentally better system (ethically, politically, economically) than any of the alternatives. [1]

The most basic (and prevalent) error in discussing Fukuyama's work is to confuse "history" with "events". [3] Fukuyama claims not that events will stop occurring in the future, but rather that all that will happen in the future (even if totalitarianism returns) is that democracy will become more and more prevalent in the long term, although it may suffer "temporary" setbacks (which may, of course, last for centuries).

Some argue that Fukuyama presents "American-style" democracy as the only "correct" political system and argues that all countries must inevitably follow this particular system of government. [4] [5] However, many Fukuyama scholars claim this is a misreading of his work. Fukuyama's argument is only that in the future there will be more and more governments that use the framework of parliamentary democracy and that contain markets of some sort. Indeed, Fukuyama has stated:

The End of History was never linked to a specifically American model of social or political organization. Following Alexandre Kojève, the Russian-French philosopher who inspired my original argument, I believe that the European Union more accurately reflects what the world will look like at the end of history than the contemporary United States. The EU's attempt to transcend sovereignty and traditional power politics by establishing a transnational rule of law is much more in line with a "post-historical" world than the Americans' continuing belief in God, national sovereignty, and their military. [6]

An argument in favour of Fukuyama's thesis is the democratic peace theory, which argues that mature democracies rarely or never go to war with one another. This theory has faced criticism, with arguments largely resting on conflicting definitions of "war" and "mature democracy". Part of the difficulty in assessing the theory is that democracy as a widespread global phenomenon emerged only very recently in human history, which makes generalizing about it difficult. (See also list of wars between democracies.)

Other major empirical evidence includes the elimination of interstate warfare in South America, Southeast Asia, and Eastern Europe among countries that moved from military dictatorships to liberal democracies.

According to several studies, the end of the Cold War and the subsequent increase in the number of liberal democratic states were accompanied by a sudden and dramatic decline in total warfare, interstate wars, ethnic wars, revolutionary wars, and the number of refugees and displaced persons. [7] [8]

Critics of liberal democracy

In Specters of Marx: The State of the Debt, the Work of Mourning and the New International (1993), Jacques Derrida criticized Fukuyama as a "come-lately reader" of the philosopher-statesman Alexandre Kojève (1902–1968), who "in the tradition of Leo Strauss" (1899–1973), in the 1950s, already had described the society of the U.S. as the "realization of communism" and said that the public-intellectual celebrity of Fukuyama and the mainstream popularity of his book, The End of History and the Last Man, were symptoms of right-wing, cultural anxiety about ensuring the "Death of Marx." In criticising Fukuyama's celebration of the economic and cultural hegemony of Western liberalism, Derrida said:

For it must be cried out, at a time when some have the audacity to neo-evangelize in the name of the ideal of a liberal democracy that has finally realized itself as the ideal of human history: never have violence, inequality, exclusion, famine, and thus economic oppression affected as many human beings in the history of the earth and of humanity. Instead of singing the advent of the ideal of liberal democracy and of the capitalist market in the euphoria of the end of history, instead of celebrating the ‘end of ideologies’ and the end of the great emancipatory discourses, let us never neglect this obvious, macroscopic fact, made up of innumerable, singular sites of suffering: no degree of progress allows one to ignore that never before, in absolute figures, have so many men, women and children been subjugated, starved or exterminated on the earth. [9]

Therefore, Derrida said: "This end of History is essentially a Christian eschatology. It is consonant with the current discourse of the Pope on the European Community: Destined to become [either] a Christian State or [a] Super-State, [but] this community would still belong, therefore, to some Holy Alliance." Derrida argued that Fukuyama practised an intellectual "sleight-of-hand trick": he used empirical data whenever it suited his message, and appealed to an abstract ideal whenever the empirical data contradicted his end-of-history thesis. Fukuyama, Derrida said, sees the United States and the European Union as imperfect political entities when compared to the distinct ideals of liberal democracy and the free market, and understands that such abstractions (ideals) are not demonstrated with empirical evidence, nor ever could be, because they are philosophical and religious abstractions that originated from the Gospels and from the philosophy of Hegel. Yet Fukuyama still uses empirical observations, which he himself agrees are imperfect and incomplete, to validate his end-of-history thesis, which remains an abstraction. [9]

Radical Islam, tribalism, and the "Clash of Civilizations"

Various Western commentators have described the thesis of The End of History as flawed because it does not sufficiently take into account the power of ethnic loyalties and religious fundamentalism as a counter-force to the spread of liberal democracy, with the specific example of Islamic fundamentalism, or radical Islam, as the most powerful of these.

Benjamin Barber wrote a 1992 article and a 1995 book, Jihad vs. McWorld, that addressed this theme. Barber described "McWorld" as a secular, liberal, corporate-friendly transformation of the world and used the word "jihad" to refer to the competing forces of tribalism and religious fundamentalism, with a special emphasis on Islamic fundamentalism.

Samuel P. Huntington wrote a 1993 essay, "The Clash of Civilizations", in direct response to The End of History; he then expanded the essay into a 1996 book, The Clash of Civilizations and the Remaking of World Order. In the essay and book, Huntington argued that the temporary conflict between ideologies is being replaced by the ancient conflict between civilizations. The dominant civilization decides the form of human government, and these will not be constant. He especially singled out Islam, which he described as having "bloody borders".

After the September 11, 2001, attacks, The End of History was cited by some commentators as a symbol of the supposed naiveté and undue optimism of the Western world during the 1990s, in thinking that the end of the Cold War also represented the end of major global conflict. In the weeks after the attacks, Fareed Zakaria called the events "the end of the end of history", while George Will wrote that history had "returned from vacation". [10]

Fukuyama did discuss radical Islam briefly in The End of History. He argued that Islam is not an imperialist force like Stalinism and fascism; that is, it has little intellectual or emotional appeal outside the Islamic "heartlands". Fukuyama pointed to the economic and political difficulties that Iran and Saudi Arabia face and argued that such states are fundamentally unstable: either they will become democracies with a Muslim society (like Turkey) or they will simply disintegrate. Moreover, when Islamic states have actually been created, they were easily dominated by the powerful Western states.

In October 2001, Fukuyama, in a Wall Street Journal opinion piece, responded to the declarations that the September 11 attacks had disproved his views by stating that "time and resources are on the side of modernity, and I see no lack of a will to prevail in the United States today." He also noted that his original thesis "does not imply a world free from conflict, nor the disappearance of culture as a distinguishing characteristic of societies." [10]

The resurgence of Russia and China

Another challenge to the "end of history" thesis is the growth in the economic and political power of two countries, Russia and China. China has a one-party state government, while Russia, though formally a democracy, is often described as an autocracy; it is categorized as an anocracy in the Polity data series. [11]

Azar Gat, Professor of National Security at Tel Aviv University, argued this point in his 2007 Foreign Affairs article, "The Return of Authoritarian Great Powers", stating that the success of these two countries could "end the end of history". [12] Gat also discussed radical Islam, but stated that the movements associated with it "represent no viable alternative to modernity and pose no significant military threat to the developed world". He considered the challenge of China and Russia to be the major threat, since they could pose a viable rival model which could inspire other states.

This view was echoed by Robert Kagan in his 2008 book, The Return of History and the End of Dreams, whose title was a deliberate rejoinder to The End of History. [13]

In his 2008 Washington Post opinion piece, Fukuyama also addressed this point. He wrote, "Despite recent authoritarian advances, liberal democracy remains the strongest, most broadly appealing idea out there. Most autocrats, including Putin and Chávez, still feel that they have to conform to the outward rituals of democracy even as they gut its substance. Even China's Hu Jintao felt compelled to talk about democracy in the run-up to Beijing's Olympic Games." [14]

Failure of civil society and political decay

In 2014, on the occasion of the 25th anniversary of the publication of the original essay, "The End of History?", Fukuyama wrote a column in The Wall Street Journal again updating his hypothesis. He wrote that, while liberal democracy still had no real competition from more authoritarian systems of government "in the realm of ideas", nevertheless he was less idealistic than he had been "during the heady days of 1989." Fukuyama noted the Orange Revolution in Ukraine and the Arab Spring, both of which seemed to have failed in their pro-democracy goals, as well as the "backsliding" of democracy in countries including Thailand, Turkey and Nicaragua. He stated that the biggest problem for the democratically elected governments in some countries was not ideological but "their failure to provide the substance of what people want from government: personal security, shared economic growth and the basic public services … that are needed to achieve individual opportunity." Though he believed that economic growth, improved government and civic institutions all reinforced one another, he wrote that it was not inevitable that "all countries will … get on that escalator." [15]

Twenty-five years later, the most serious threat to the end-of-history hypothesis isn't that there is a higher, better model out there that will someday supersede liberal democracy; neither Islamist theocracy nor Chinese capitalism cuts it. Once societies get on the up escalator of industrialization, their social structure begins to change in ways that increase demands for political participation. If political elites accommodate these demands, we arrive at some version of democracy.

Fukuyama also warned of "political decay," which he wrote could also affect established democracies like the United States, in which corruption and crony capitalism erode liberty and economic opportunity. Nevertheless, he expressed his continued belief that "the power of the democratic ideal remains immense." [15]

Following Britain's decision to leave the European Union and the election of Donald Trump as President of the United States in 2016, Fukuyama feared for the future of liberal democracy in the face of resurgent populism, [16] [17] [18] and the rise of a "post-fact world", [19] saying that "twenty five years ago, I didn't have a sense or a theory about how democracies can go backward. And I think they clearly can." He warned that America's political rot was infecting the world order to the point where it "could be as big as the Soviet collapse". Fukuyama also highlighted Russia's interference in the Brexit referendum and 2016 U.S. elections. [18]

Posthuman future

Fukuyama has also stated that his thesis was incomplete, but for a different reason: "there can be no end of history without an end of modern natural science and technology" (quoted from Our Posthuman Future). Fukuyama predicts that humanity's control of its own evolution will have a great and possibly terrible effect on liberal democracy.



Trends in the 90s: Films with Serious Themes

The trend toward sequels from the previous decade continued, but Hollywood was also attempting to deal with serious themes, including homelessness, the Holocaust, AIDS, feminism, and racism, while making bottom-line profits. There were a number of mainstream films that confronted the issues in a profound way. Director Jonathan Demme's Philadelphia (1993) was the first big-studio attempt to deal with AIDS, winning Tom Hanks the first of his two consecutive Best Actor Oscars. He starred as a lawyer with AIDS who found that Denzel Washington was the only person who would take his case.

With seven Oscars including Best Picture and Best Director, Steven Spielberg's long and serious B/W Holocaust 'prestige' epic Schindler's List (1993) was a significant milestone but also a grim story about an opportunistic German businessman (Liam Neeson) in Poland who ultimately saved over 1,000 Jews from a Holocaust death by employing them as cheap labor. This historical drama was released only a few months after Spielberg's major entertainment blockbuster Jurassic Park (1993).

Shortly before its bankruptcy in 1991, Orion Pictures distributed director/producer/actor Kevin Costner's three-hour western Dances with Wolves (1990), which retold the story of the Wild West from the viewpoint of Native Americans and presented some of its dialogue in subtitled Sioux. Kevin Costner starred as Lt. John Dunbar, who married Stands With a Fist (Mary McDonnell) and inadvertently became a hero. The film won the Best Picture Academy Award and six other Oscars, the first "western" to do so since Cimarron (1931).

Orion also released Jonathan Demme's The Silence of the Lambs (1991), a chilling thriller about serial killer Hannibal Lecter (Anthony Hopkins) and featuring Jodie Foster as young agent Clarice Starling seeking help from the psychopath to catch another psychopath named Buffalo Bill. The remarkable film swept the top five Oscars (Picture, Director, Actor, Actress, and Writer) - it was the first horror film to be so honored. [It repeated the earlier success of two other sweep films: One Flew Over the Cuckoo's Nest (1975) and It Happened One Night (1934).]

The immensely popular box-office hit Forrest Gump (1994) by director Robert Zemeckis looked back on the 60s and Vietnam War era through the eyes of a slow-witted Everyman (Best Actor-winning Tom Hanks, with his second Oscar win), reportedly with an IQ of 75. He would spout Gump-isms ("Life is like a box of chocolates"), and the film's exceptional special effects placed the unassuming Forrest within documentary newsreels - creating the illusion of meetings with Presidents (Kennedy, Johnson and Nixon) - as well as Lt. Dan's (Gary Sinise) missing legs and the finale's floating feather.

Racial issues and social tensions were disturbingly portrayed by Michael Douglas in Falling Down (1993). Actor Tim Robbins' independent film Dead Man Walking (1995) confronted the issue of capital punishment in a powerful tale, with Oscar-winning Susan Sarandon portraying real-life Catholic nun Sister Helen Prejean as a death-row spiritual counselor for convicted murderer and rapist Sean Penn. In Scent of a Woman (1992), Best Actor-winning Al Pacino starred as a blind, retired Army officer (Col. Frank Slade), known for saying "Hoo-ah!", who intended to commit suicide - until hiring a young college student (Chris O'Donnell).

African-American Film-Makers:

And black filmmakers, including John Singleton, Spike Lee and Mario Van Peebles (among others), were making an impact. Twenty-three-year-old writer/director John Singleton made his directorial debut with the semi-autobiographical Boyz N The Hood (1991), a powerful film about gang violence in South Central L.A. [Singleton's nominated film brought him the distinction of being the first African-American and the youngest person ever nominated for the Best Director Academy Award. At the time of its release, the film was the highest-grossing black-themed film ever, earning ten times its $6 million budget.] However, the film's marketing incited some violence and trouble when it opened in various theaters.

After Mo' Better Blues (1990), writer/director Spike Lee's Jungle Fever (1991) told a story of inter-racial romance between Annabella Sciorra and Wesley Snipes. Mario Van Peebles' New Jack City (1991), also with Wesley Snipes, was a realistic film about inner-city drug violence. Lee's big-budget biopic of the black political and religious activist Malcolm X (1992), with Denzel Washington in the title role as the radical leader assassinated in 1965, was an ambitious, stirring production, and caused controversy among African-American groups. Another conflict arose when the high-school basketball documentary about inner-city Chicago, Hoop Dreams (1994), failed to receive an Oscar nomination (as Best Picture or Best Documentary), although it won documentary awards from the Sundance Film Festival, the New York Film Critics Circle, and the LA Film Critics Association.

Co-scripter-director Lee's Clockers (1995), co-produced by Martin Scorsese, was a penetrating, sober examination of the vicious cycle of drug-dealing on the streets in the modern urban world. The Hughes brothers' Dead Presidents (1995) followed the coming-of-age odyssey of a young black man (Larenz Tate) from his late 60s Bronx upbringing through a Vietnam tour of duty and back to his life in the 'hood in 1973. Allen and Albert Hughes directed the powerful and daring inner-city film about a conflicted teenage gang member entitled Menace II Society (1993).

Female Film-Makers and a New Feminist Consciousness:

Similarly, from the 80s into the 90s, female directors were exerting greater influence and demonstrating their skill within the film industry: Barbra Streisand with her first film Yentl (1983) (as director/producer/co-writer/actor) and The Prince of Tides (1991), Penny Marshall with the soul-transference fantasy/comedy Big (1988) starring Tom Hanks as a youngster in an adult body, Penelope Spheeris with Wayne's World (1992) starring Mike Myers and Dana Carvey from TV's Saturday Night Live, Kathryn Bigelow with the exciting, fast-moving action-crime film Point Break (1991) and the dark, virtual-reality futuristic film Strange Days (1995), and New Zealand director Jane Campion with the Oscar-winning, sensual and haunting masterpiece filmed from a female perspective, The Piano (1993) - a love story set in 19th century New Zealand about the tragic consequences of an arranged marriage and erotic passion between mute Holly Hunter and native Harvey Keitel.

Ridley Scott's radical wide-screen film Thelma and Louise (1991) with a debut script by Callie Khouri (that won a Best Screenplay Oscar) has been noted as the first feminist-buddy/road film with two raging female heroines. Although controversial and defiant (like its counterpart Easy Rider (1969) was in its day), it offered splendid character roles to two actresses Susan Sarandon (as fed-up, overworked waitress Louise) and Geena Davis (as housewife Thelma) portraying outlaw women in flight across the American Southwest from abusiveness in marriage, rape and the law. [Incidentally, the film also launched Brad Pitt as a new star.] Director Penny Marshall's A League of Their Own (1992) was the true story of WWII manpower shortages that impacted baseball. Tom Hanks managed an all-female baseball team which included Geena Davis and Madonna.

Writer/director Nora Ephron, famous for When Harry Met Sally... (1989), created the witty, simple romantic comedy Sleepless in Seattle (1993) (a re-make of An Affair to Remember (1957)), with Tom Hanks and Meg Ryan as the perfect couple brought together by a talk-radio program. Ephron also directed the fantasy comedy Michael (1996), with John Travolta as the atypical title character - an angel (with wings) who drank, smoked, swore, and lived in Iowa. After her success in The Silence of the Lambs (1991), actress Jodie Foster became a producer and director for Egg Pictures, releasing her directorial debut film Little Man Tate (1991) and then a comedy about family relationships on Thanksgiving weekend titled Home for the Holidays (1995).

Fried Green Tomatoes at the Whistle Stop Cafe (1991), adapted from Fannie Flagg's popular women's novel, starred Jessica Tandy as an elderly storyteller in a nursing home and Kathy Bates as an emotionally-repressed housewife who found strength and independence through the recollections. Another feminist film The Joy Luck Club (1993), an adaptation of Amy Tan's novel with the theme of mother-daughter relationships, told of four mothers from China (who met weekly to play Mah-Jongg) whose daughters were born in America. The First Wives Club (1996) was about three friends (Goldie Hawn, Bette Midler, and Diane Keaton) who decided to get revenge on their unfaithful ex-husbands.

As a sign of the times, the Alien films (1979, 1986, 1992, and 1997) spotlighted a self-reliant female heroine - Lieut. Ellen Ripley (Sigourney Weaver). [The popular protagonist was killed off in the third installment, David Fincher's Alien3 (1992), but then brought back through cloning in the fourth film, Alien Resurrection (1997)].

Action-Thrillers Dominate the 90s:

There seemed to be a significant shift toward action films in the 90s - with their requisite speed, kinetic hyper-action and, of course, violence. Most of the biggest and most popular films were not dialogue-based and character-driven. One of the greatest summer smashes of the decade was Andrew Davis' The Fugitive (1993) - a spin-off from the celebrated 1963-67 TV series (with David Janssen), with Harrison Ford as the wrongly-convicted surgeon Dr. Richard Kimble, in flight from a dogged US Marshal (Tommy Lee Jones) and in pursuit of a one-armed man. Brian De Palma's big-budget, slick summer blockbuster Mission: Impossible (1996), derived from the popular 1960s TV series, was financially successful given its Tom Cruise star power and huge marketing campaign. Sylvester Stallone was in the title role of Demolition Man (1993), in pursuit of Wesley Snipes in futuristic San Angeles.

During the 1993 filming of the martial arts action-thriller adapted from a comic book, The Crow (1994), Bruce Lee's actor son Brandon was killed in an accident on the set. Hong Kong action director John Woo proved that he could make mainstream Hollywood films with the high-powered Broken Arrow (1996), his second US film, and filled with Woo's trademark action sequences. He also directed Face/Off (1997), a brilliantly-acted film with stars John Travolta as an FBI agent and Nicolas Cage as the villain - who both swapped faces after plastic surgery.

Jan de Bont's action disaster-thriller Speed (1994) was just as its title suggested - a fast-paced, out-of-control tale about a mad bomber (Dennis Hopper) vs. an LA SWAT team cop (Keanu Reeves), and a runaway suburban LA bus (driven by Sandra Bullock) wired to explode if it slowed below 50 mph. The vacation-cruise sequel Speed 2: Cruise Control (1997) paled in comparison. The producers of Top Gun (1986) made their last film The Rock (1996) - Michael Bay's successful action thriller with an all-star cast set on the island prison Alcatraz, with renegade Marines (led by Ed Harris) and a task force teaming Sean Connery and Nicolas Cage.

After the 1995 successes of Braveheart (1995) and Apollo 13 (1995) - a true story about a near-space disaster (with Tom Hanks as astronaut Jim Lovell) - Ron Howard teamed with Mel Gibson for the suspenseful crime-kidnapping thriller Ransom (1996), a remake of the 1950s film Ransom (1956) starring Glenn Ford.


Culture war

The term culture war is a loan translation (calque) of the German Kulturkampf ('culture struggle'). In German, Kulturkampf, a term coined by Rudolf Virchow, refers to the clash between cultural and religious groups in the campaign from 1871 to 1878 under Chancellor Otto von Bismarck of the German Empire against the influence of the Roman Catholic Church. [3] The translation was printed in some American newspapers at the time. [4]

1920s–1980s: Origins

In American usage, "culture war" may imply a conflict between those values considered traditionalist or conservative and those considered progressive or liberal. This usage originated in the 1920s, when urban and rural American values came into closer conflict. [5] This followed several decades of immigration to the United States by people whom earlier European immigrants considered 'alien'. It was also a result of the cultural shifts and modernizing trends of the Roaring '20s, culminating in the presidential campaign of Al Smith in 1928. [6] In subsequent decades during the 20th century, the term was published occasionally in American newspapers. [7] [8]

The expression would join the vocabulary of U.S. politics in 1991 with the publication of Culture Wars: The Struggle to Define America by James Davison Hunter, who redefined the American notion of "culture war." Tracing the concept to the 1960s, [9] Hunter perceived a dramatic realignment and polarization that had transformed U.S. politics and culture, including the issues of abortion, federal and state gun laws, immigration, separation of church and state, privacy, recreational drug use, LGBT rights, and censorship. The perceived focus of the American culture war and its definition have taken various forms since then. [10]

1991–2001: Rise in prominence

James Davison Hunter, a sociologist at the University of Virginia, introduced the expression again in his 1991 publication, Culture Wars: The Struggle to Define America. Hunter described what he saw as a dramatic realignment and polarization that had transformed American politics and culture.

He argued that on an increasing number of "hot-button" defining issues—abortion, gun politics, separation of church and state, privacy, recreational drug use, homosexuality, censorship—there existed two definable polarities. Furthermore, not only were there a number of divisive issues, but society had divided along essentially the same lines on these issues, so as to constitute two warring groups, defined primarily not by nominal religion, ethnicity, social class, or even political affiliation, but rather by ideological world-views.

Hunter characterized this polarity as stemming from opposite impulses, toward what he referred to as Progressivism and as Orthodoxy. Others have adopted the dichotomy with varying labels. For example, Bill O'Reilly, a conservative political commentator and former host of the Fox News Channel talk show The O'Reilly Factor, emphasizes differences between "Secular-Progressives" and "Traditionalists" in his 2006 book Culture Warrior. [11] [12]

Historian Kristin Kobes Du Mez attributes the 1990s emergence of culture wars to the end of the Cold War in 1991. She writes that Evangelical Christians viewed a particular Christian masculine gender role as the only defense of America against the threat of communism. When this threat ended upon the close of the Cold War, Evangelical leaders transferred the perceived source of threat from foreign communism to domestic changes in gender roles and sexuality. [13]

During the 1992 presidential election, commentator Pat Buchanan mounted a campaign for the Republican nomination for president against incumbent George H. W. Bush. In a prime-time slot at the 1992 Republican National Convention, Buchanan gave his speech on the culture war. [14] He argued: "There is a religious war going on in our country for the soul of America. It is a cultural war, as critical to the kind of nation we will one day be as was the Cold War itself." [15] In addition to criticizing environmentalists and feminism, he portrayed public morality as a defining issue:

The agenda [Bill] Clinton and [Hillary] Clinton would impose on America—abortion on demand, a litmus test for the Supreme Court, homosexual rights, discrimination against religious schools, women in combat units—that's change, all right. But it is not the kind of change America wants. It is not the kind of change America needs. And it is not the kind of change we can tolerate in a nation that we still call God's country. [15]

A month later, Buchanan characterized the conflict as about power over society's definition of right and wrong. He named abortion, sexual orientation and popular culture as major fronts—and mentioned other controversies, including clashes over the Confederate flag, Christmas, and taxpayer-funded art. He also said that the negative attention his "culture war" speech received was itself evidence of America's polarization. [16]

The culture war had significant impact on national politics in the 1990s. [10] The rhetoric of the Christian Coalition of America may have weakened President George H. W. Bush's chances for re-election in 1992 and helped his successor, Bill Clinton, win re-election in 1996. [17] On the other hand, the rhetoric of conservative cultural warriors helped Republicans gain control of Congress in 1994. [18]

The culture wars influenced the debate over state-school history curricula in the United States in the 1990s. In particular, debates over the development of national educational standards in 1994 revolved around whether the study of American history should be a "celebratory" or "critical" undertaking and involved such prominent public figures as Lynne Cheney, Rush Limbaugh, and historian Gary Nash. [19] [20]

2001–2014: Post-9/11 era

A political view called neoconservatism shifted the terms of the debate in the early 2000s. Neoconservatives differed from their opponents in that they interpreted problems facing the nation as moral issues rather than economic or political issues. For example, neoconservatives saw the decline of the traditional family structure as a spiritual crisis that required a spiritual response. Critics accused neoconservatives of confusing cause and effect. [21]

During the 2000s, voting for Republicans began to correlate heavily with traditionalist or orthodox religious belief across diverse religious sects. Voting for Democrats became more correlated to liberal or modernist religious belief, and to being nonreligious. [22] Belief in scientific conclusions, such as climate change, also became tightly coupled to political party affiliation in this era, causing climate scholar Andrew Hoffman to observe that climate change had "become enmeshed in the so-called culture wars." [23]

Topics traditionally associated with culture war were not prominent in media coverage of the 2008 election season, with the exception of coverage of vice-presidential candidate Sarah Palin, [24] who drew attention to her conservative religion and created a performative climate change denialism brand for herself. [25] Palin's defeat in the election and subsequent resignation as governor of Alaska caused the Center for American Progress to predict "the coming end of the culture wars," which they attributed to demographic change, particularly high rates of acceptance of same-sex marriage among millennials. [26]

2014–present: Broadening of the culture war

While traditional culture war issues, notably abortion, continue to be a focal point, [27] the issues identified with culture war broadened and intensified in the mid-late 2010s. Journalist Michael Grunwald says that "President Donald Trump has pioneered a new politics of perpetual culture war" and lists the Black Lives Matter movement, U.S. national anthem protests, climate change, education policy, healthcare policy including Obamacare, and infrastructure policy as culture war issues in 2018. [28] The rights of transgender people and the role of religion in lawmaking were identified as "new fronts in the culture war" by political scientist Jeremiah Castle, as the polarization of public opinion on these two topics resemble that of previous culture war issues. [29] In 2020, during the COVID-19 pandemic, North Dakota governor Doug Burgum described opposition to wearing face masks as a "senseless" culture war issue that jeopardizes human safety. [30]

This broader understanding of culture war issues in the mid-late 2010s and 2020s is associated with a political strategy called "owning the libs." Conservative media figures employing this strategy, prominently Ben Shapiro, emphasize and expand upon culture war issues with the goal of upsetting liberal people. According to Nicole Hemmer of Columbia University, this strategy is a substitute for the cohesive conservative ideology that existed during the Cold War. It holds a conservative voting bloc together in the absence of shared policy preferences among the bloc's members. [31]

A number of conflicts about diversity in popular culture occurring in the 2010s, such as the Gamergate controversy, Comicsgate and the Sad Puppies science fiction voting campaign, were identified in the media as being examples of the culture war. [33] Journalist Caitlin Dewey described Gamergate as a "proxy war" for a larger culture war between those who want greater inclusion of women and minorities in cultural institutions versus anti-feminists and traditionalists who do not. [34] The perception that culture war conflict had been demoted from electoral politics to popular culture led writer Jack Meserve to call popular movies, games, and writing the "last front in the culture war" in 2015. [35]

These conflicts about representation in popular culture re-emerged into electoral politics via the alt-right and alt-lite movements. [36] According to media scholar Whitney Phillips, Gamergate "prototyped" strategies of harassment and controversy-stoking that proved useful in political strategy. For example, Republican political strategist Steve Bannon publicized pop-culture conflicts during the 2016 presidential campaign of Donald Trump, encouraging a young audience to "come in through Gamergate or whatever and then get turned onto politics and Trump." [37]

Some observers in Canada have used the term "culture war" to refer to differing values between Western versus Eastern Canada, urban versus rural Canada, as well as conservatism versus liberalism and progressivism. [38]

Nevertheless, Canadian society is generally not dramatically polarized over immigration, gun control, drug legality, sexual morality, or government involvement in healthcare: the main issues at play in the United States. In all of those cases, the majority of Canadians, including Conservatives, would support the "progressive" position in the United States. In Canada a different set of issues creates a clash of values. Chief among these are language policy in Canada, minority religious rights, pipeline politics, indigenous land rights, climate policy, and federal-provincial disputes.

"Culture war" is a relatively new phrase in Canadian political commentary. It can still be used to describe historical events in Canada, such as the Rebellions of 1837, Western Alienation, the Quebec sovereignty movement, and various Aboriginal conflicts, but it is more relevant to current events such as the Grand River land dispute and the increasing hostility between conservative and liberal Canadians. The phrase has also been used to describe the Harper government's attitude towards the arts community. Andrew Coyne termed this negative policy towards the arts community "class warfare." [39]

Interpretations of Aboriginal history became part of the wider political debate sometimes called the "culture wars" during the tenure of the Liberal–National Coalition government of 1996 to 2007, with the Prime Minister of Australia John Howard publicly championing the views of some of those associated with Quadrant. [40] This debate extended into a controversy over the presentation of history in the National Museum of Australia and in high-school history curricula. [41] [42] It also migrated into the general Australian media, with major broadsheets such as The Australian, The Sydney Morning Herald and The Age regularly publishing opinion pieces on the topic. Marcia Langton has referred to much of this wider debate as "war porn" [43] and as an "intellectual dead end". [44]

Two Australian Prime Ministers, Paul Keating (in office 1991–1996) and John Howard (in office 1996–2007), became major participants in the "wars". According to Mark McKenna's analysis for the Australian Parliamentary Library, [45] John Howard believed that Paul Keating portrayed Australia pre-Whitlam (Prime Minister from 1972 to 1975) in an unduly negative light, while Keating sought to distance the modern Labor movement from its historical support for the monarchy and for the White Australia policy by arguing that it was the conservative Australian parties which had been barriers to national progress. He accused Britain of having abandoned Australia during the Second World War. Keating staunchly supported a symbolic apology to Australian Aboriginals for their mistreatment at the hands of previous administrations, and outlined his view of the origins and potential solutions to contemporary Aboriginal disadvantage in his Redfern Park Speech of 10 December 1992 (drafted with the assistance of historian Don Watson). In 1999, following the release of the 1998 Bringing Them Home Report, Howard passed a Parliamentary Motion of Reconciliation describing treatment of Aborigines as the "most blemished chapter" in Australian history, but he refused to issue an official apology. [46] Howard saw an apology as inappropriate, as it would imply "intergenerational guilt"; he said that "practical" measures were a better response to contemporary Aboriginal disadvantage. Keating has argued for the eradication of remaining symbols linked to colonial origins: including deference for ANZAC Day, [47] for the Australian flag and for the monarchy in Australia, while Howard supported these institutions. Unlike fellow Labor leaders and contemporaries, Bob Hawke (Prime Minister 1983–1991) and Kim Beazley (Labor Party leader 2005–2006), Keating never traveled to Gallipoli for ANZAC Day ceremonies. In 2008 he described those who gathered there as "misguided". [48]

In 2006 John Howard said in a speech to mark the 50th anniversary of Quadrant that "Political Correctness" was dead in Australia but: "we should not underestimate the degree to which the soft-left still holds sway, even dominance, especially in Australia's universities". Also in 2006, Sydney Morning Herald political editor Peter Hartcher reported that Opposition foreign-affairs spokesman Kevin Rudd was entering the philosophical debate by arguing in response that "John Howard is guilty of perpetrating 'a fraud' in his so-called culture wars … designed not to make real change but to mask the damage inflicted by the Government's economic policies". [49]

The defeat of the Howard government in the Australian Federal election of 2007 and its replacement by the Rudd Labor government altered the dynamic of the debate. Rudd made an official apology to the Aboriginal Stolen Generation [50] with bi-partisan support. [51] Like Keating, Rudd supported an Australian republic, but in contrast to Keating, Rudd declared support for the Australian flag and supported the commemoration of ANZAC Day; he also expressed admiration for Liberal Party founder Robert Menzies. [52] [53]

Subsequent to the 2007 change of government, and prior to the passage, with support from all parties, of the Parliamentary apology to indigenous Australians, Professor of Australian Studies Richard Nile argued: "the culture and history wars are over and with them should also go the adversarial nature of intellectual debate", [54] a view contested by others, including conservative commentator Janet Albrechtsen. [55] The Liberal Party parliamentarian Christopher Pyne indicated an intention to re-engage in the history wars. [56]

According to political scientist Constance G. Anthony, American culture war perspectives on human sexuality were exported to Africa as a form of neocolonialism. In her view, this began during the AIDS epidemic in Africa, with the United States government first tying HIV/AIDS assistance money to evangelical leadership and the Christian right during the Bush administration, then to LGBTQ tolerance during the administration of Barack Obama. This stoked a culture war that resulted in (among other things) the Uganda Anti-Homosexuality Act of 2014. [57]

Zambian scholar Kapya Kaoma notes that because "the demographic center of Christianity is shifting from the global North to the global South", Africa's influence on Christianity worldwide is increasing. American conservatives export their culture wars to Africa, Kaoma says, particularly when they realize they may be losing the battle at home. US Christians have framed their anti-LGBT initiatives in Africa as standing in opposition to a "Western gay agenda", a framing that Kaoma finds ironic. [58]

North American and European conspiracy theories have become widespread in West Africa via social media, according to a 2021 survey by First Draft News. COVID-19 misinformation, New World Order conspiracy thinking, QAnon, and other conspiracy theories associated with culture war topics are spread by American, pro-Russian, French-language, and local disinformation websites and social media accounts, including those of prominent politicians in Nigeria. This has contributed to vaccine hesitancy in West Africa, with 60 percent of survey respondents saying they were unlikely to try to get vaccinated, and to an erosion of trust in institutions in the region. [59]

Several media outlets have described the Law and Justice party in Poland, [60] Viktor Orbán in Hungary, Aleksandar Vučić in Serbia, and Janez Janša in Slovenia as igniting culture wars in their respective countries by encouraging fights over LGBT rights, legal abortion, and other topics. [61] In the United Kingdom, the Conservative Party have similarly been described as attempting to ignite culture wars in regard to "conservative values" under the tenure of Prime Minister Boris Johnson. [62] [63] [64] [65] Sunder Katwala has observed that British conservatives import American culture wars issues, hoping that the conflicts will benefit conservatives in British elections as they have in American elections. [66]

Since the time that James Davison Hunter first applied the concept of culture wars to American life, the idea has been subject to questions about whether "culture wars" names a real phenomenon, and if so, whether the phenomenon it describes is a cause of, or merely a result of, membership in groups like political parties and religions. Culture wars have also been subject to the criticism of being artificial, imposed, or asymmetric conflicts, rather than a result of authentic differences between cultures.

Validity

Researchers have differed about the scientific validity of the notion of culture war. Some claim it does not describe real behavior, or that it describes only the behavior of a small political elite. Others claim culture war is real and widespread, and even that it is fundamental to explaining Americans' political behavior and beliefs.

Political scientist Alan Wolfe participated in a series of scholarly debates with Hunter in the 1990s and 2000s, claiming that Hunter's concept of culture wars did not accurately describe the opinions or behavior of Americans, who, Wolfe claimed, were more united than polarized. [67]

A meta-analysis of opinion data from 1992 to 2012 published in the American Political Science Review concluded that, contrary to the common belief that political party and religious membership shape opinion on culture war topics, opinions on culture war topics instead lead people to revise their political party and religious orientations. The researchers view culture war attitudes as "foundational elements in the political and religious belief systems of ordinary citizens." [68]

Artificiality or asymmetry

Some writers and scholars have said that culture wars are created or perpetuated by political special interest groups, by reactionary social movements, by dynamics within the Republican party, or by electoral politics as a whole. These authors view culture war not as an unavoidable result of widespread cultural differences, but as a technique used to create in-groups and out-groups for a political purpose.

Political commentator E. J. Dionne has written that culture war is an electoral technique to exploit differences and grievances, remarking that the real cultural division is "between those who want to have a culture war and those who don't." [22]

Sociologist Scott Melzer says that culture wars are created by conservative, reactive organizations and movements. Members of these movements possess a "sense of victimization at the hands of a liberal culture run amok. In their eyes, immigrants, gays, women, the poor, and other groups are (undeservedly) granted special rights and privileges." Melzer writes about the example of the National Rifle Association, which he says intentionally created a culture war in order to unite conservative groups, particularly groups of white men, against a common perceived threat. [69]

Similarly, religion scholar Susan B. Ridgely has written that culture wars were made possible by Focus on the Family. This organization produced conservative Christian "alternative news" that began to bifurcate American media consumption, promoting a particular "traditional family" archetype to one part of the population, particularly conservative religious women. Ridgely says that this tradition was depicted as under liberal attack, seeming to necessitate a culture war to defend the tradition. [70]

Political scientists Matt Grossmann and David A. Hopkins have written about an asymmetry between the US's two major political parties, saying the Republican party should be understood as an ideological movement built to wage political conflict, and the Democratic party as a coalition of social groups with less ability to impose ideological discipline on members. [71] This encourages Republicans to perpetuate and to draw new issues into culture wars, because Republicans are well equipped to fight such wars. [72]


The 90s: The Decade That Doesn’t Fit?

Unlike most other eras, the notion of 90s music is hard to pin down. Oddball and eclectic, the decade defies easy categorisation, but it’s this cross-pollination of sounds that left a boundary-breaking legacy that remains today.

In A Hard Day’s Night, the exceptional madcap 1964 film starring The Beatles, a reporter asks Ringo Starr, “Are you a mod or a rocker?” She’s referring to the long-warring British musical subcultures, also captured with anxious sincerity a decade later in The Who’s Quadrophenia. The Beatles’ drummer replies with the rather deft portmanteau, “Um, no, I’m a mocker.” The joke being: there’s no way you could be both.

But, 30 years later, in the broad-stroked soundscape that was the 90s music industry, such posturing would look preposterous. The beauty of that decade was that you could be mod, rocker, hip-hop explorer, R&B fan, and country fan – all at the same time. Because the notion of what popular music was had shifted so radically.

Along came grunge

The biggest curveball that 90s music threw us was, of course, grunge. In the lead-up to its inflection point (Nirvana’s Nevermind), guitar-based music roughly fell into three categories: alternative rock, classic-rock standbys, and an already-dimming hair-metal scene. The genre was so adrift that 1989 marked the curious year in which Jethro Tull won the best hard-rock/metal Grammy.

Still, at that time, the impact of MTV as the arbiter of youth culture could not be overstated. The video for Nirvana’s “Smells Like Teen Spirit” quietly premiered on 120 Minutes, the network’s late-night stepchild, and was almost exotic in its defiance of the channel’s visual conventions. It was dark, cynical, and so squarely “I don’t give a f__k” in a way that the industry’s self-aware harder rock acts fundamentally were not. But what makes Nirvana such a great microcosm of 90s music was that their sound was not singular in scope. It referenced everything from punk to garage rock to indie pop to country and blues.

Heavy metal didn’t disappear; it just reconfigured itself. The more formidable acts (Guns N’ Roses, Metallica, Aerosmith) transcended fads, becoming stadium bands. Still, for the most part, rock fans diverted their attention to grunge, with Nevermind and its follow-up, In Utero, serving as a gateway to other bands related to the scene: former labelmates Mudhoney, the metal-inspired Soundgarden, classic-rockers-in-the-making Pearl Jam, and the gloomier Alice In Chains. Not to mention non-Seattle groups Bush, Stone Temple Pilots, and a pre-art rock Radiohead – all essentially distillations of the above.

Grunge was resoundingly male-dominated. Nevertheless, Hole (fronted by Kurt Cobain’s wife, Courtney Love, a provocateur with a propensity for stage-diving) managed to benefit greatly from grunge’s popularity. The group’s breakthrough album, the presciently named Live Through This, dropped in 1994, a mere week after Cobain’s death. Celebrity Skin, its 1998 follow-up, ended up being their best-selling album.

Girls to the front

Most female-fronted rock bands didn’t chart as well, but they did deal in a cultural currency that produced a vibrant feminist-rock scene. Hole drew attention to Love’s contemporaries, including Bikini Kill, Babes In Toyland, Bratmobile, and, later, Sleater-Kinney. Then there was L7. All flying-V riffs, head-banging hair, and “screw you” lyrics, L7 (along with Mudhoney) helped pioneer grunge before grunge broke. And after it did, the group’s 1992 album, Bricks Are Heavy, won acclaim for skillfully straddling the line between the grunge, alternative, and riot grrrl worlds.

Towards the decade’s end, a rise of feminism (and female spending power) in 90s music would trickle up the pop charts. This led to an explosion of multi-platinum singer-songwriters: Sarah McLachlan, Alanis Morissette, Sheryl Crow, Lisa Loeb, Paula Cole, Fiona Apple, Jewel, and the lone woman of color, Tracy Chapman. All of the above (less Morissette) also appeared on the inaugural Lilith Fair tour, McLachlan’s answer to Lollapalooza. It became the best-selling touring festival of 1997.

Counterculture goes mainstream

The larger impact of grunge on 90s music was that it normalized what was once deemed countercultural. Suddenly, middle-of-the-road music listeners were nudged towards exploring what was once considered the domain of indie-music fans, who initially viewed these newcomers as interlopers. Sonic Youth – idols to countless punk bands, including Nirvana, who had opened for them in Europe just before Nevermind exploded – were finally getting radio and MTV airplay. Pixies and R.E.M., already highly respected in the underground, also grew their fanbases, alongside like-minded newcomers such as Pavement, Elliott Smith, Weezer, and Beck.

Meanwhile, the louder alt.rock scene assumed the space left by heavy metal. Industrial music’s Nine Inch Nails and Marilyn Manson, rap-rock’s Rage Against the Machine and Faith No More, the funk-centric Red Hot Chili Peppers and Primus, as well as the transcendent rock of The Smashing Pumpkins and Jane’s Addiction – all capitalized on the new thirst for angst. In this new environment, even a reissue of “Mother,” by the dystopian goth-metal beast Glenn Danzig, became a hit. Perry Farrell, Jane’s Addiction’s eccentric frontman, became a nexus for this phenomenon in 90s music when he created the then-quixotic Lollapalooza festival (its name a Webster dictionary deep cut meaning “extraordinarily impressive”) in the auspicious year of 1991.

After a decade of jock-versus-nerd narratives, being weird became cool, with grunge’s influence permeating into the aesthetics of fashion. Movies such as Cameron Crowe’s Seattle-centric Singles, Ben Stiller’s Reality Bites, and Allan Moyle’s Empire Records jumped on board to celebrate the virtues of outsiders.

As the trajectory of 90s music continued to be reshaped by grunge, the genre itself began to peter out by the middle of the decade. Some influential bands struggled with catastrophic substance-abuse issues. Others grew disenchanted with becoming part of the establishment they had worked so hard to subvert. The progenitors that did survive – Soundgarden and Pearl Jam, for instance – switched up their sounds. The latter went a step further: they simply stopped the machine by refusing to make music videos. And in an even more gutsy move, Pearl Jam refused to work with the events behemoth Ticketmaster.

The rise of Britpop

In the UK, grunge’s chart-takeover of the early 90s created a backlash in the form of Britpop. It’s no coincidence that Blur’s second, sound-defining album was titled Modern Life Is Rubbish (or that its alternate title was Britain Versus America). The Cool Britannia movement hearkened back to the 60s and the fertile music scene it cultivated, referencing music legends such as The Jam, The Kinks, and The Who.

Blur led the way for 90s music in the UK, albeit in fierce competition with their genre-defining peers Suede, whose just-as-buzzy self-titled debut emerged in 1993. By 1994, Blur had released the seminal Parklife, and a whole scene coalesced around it, yielding some exceptional albums: Pulp’s quick-witted Different Class, Elastica’s indie-cool self-titled LP, Supergrass’ gleefully poppy I Should Coco, and new rivals Oasis’ no-frills rock Definitely Maybe. Bad blood between Blur and Oasis infamously underscored 1995’s Battle Of Britpop, an unofficial singles competition in which both groups released a track on the same day. A modern take on mods versus rockers, the contest generated press coverage that was nothing short of dizzying, framing it as a tug-of-war between middle-class and working-class bands.

In the end, Blur’s “Country House” outsold Oasis’ “Roll With It”. But within a year, Oasis went on to achieve staggering international fame and even broke America, which had eluded Blur. That fame culminated in two sold-out shows at Knebworth Park, England’s largest-ever outdoor concert. It was a mixed bag: the event also marked the rapid decline of Britpop, which, like grunge, had reached saturation point. Death-knell theories include Oasis’ overexposure and in-band fighting; Blur making a lo-fi album; and even the Spice Girls co-opting and diluting a Brit-centric image for global fame.

Assuming the rock’n’roll mantle

Back in the US, post-grunge acts such as Collective Soul, Candlebox, Goo Goo Dolls, Creed, Silverchair, and Incubus assumed rock’s mantle, pushing the genre towards a less destructive style of brooding. In retort (and thanks to angst fatigue), an assortment of colorful ska and pop-punk acts – No Doubt, Blink-182, Green Day, and Rancid – shot up the charts. Notably, the untimely death of singer Brad Nowell helped Sublime’s self-titled album move more than five million CDs by the end of the decade. There was longevity in that bright sound, which ensured success for many of those bands into the next decade.

A technological shift

Going back to 1991, there was also one pivotal music-industry development, above and beyond grunge, that indelibly shifted music tastes for decades. This was the year that Billboard updated charts to reflect actual SoundScan sales figures. Up until that point, chart rankings were determined by the projections of record-store clerks and managers. Those “guesstimations” were frequently biased in genre and did not always reflect public consumption. Doing away with that almost immediately made the charts more genre-diverse.

Teen-pop confections, a resilient market draw, never went away. Fans of Backstreet Boys and NSYNC – and, later, Britney Spears and Christina Aguilera – continued to make a significant dent in sales. And the stalwart adult-contemporary demographic made megastars out of Kenny G, Whitney Houston, Michael Bolton, and Céline Dion. Then things got interesting.

Earthier offerings such as Hootie & The Blowfish and Blues Traveler seemed to pop up out of nowhere. Tejano legend Selena, whose runaway success had once been confined to the Latin market, began appearing on mainstream charts. And Garth Brooks became an unlikely bellwether of things to come. His 1991 album, Ropin’ The Wind, released mere months after the implementation of SoundScan, marked the first time a country artist had hit No.1 on the Billboard 200 album chart.

Newcomers Billy Ray Cyrus and Tim McGraw soon followed, as did a palpable uptick in interest in established artists (George Strait, Reba McEntire, Alan Jackson, Vince Gill, and Clint Black). And, in 1995, thanks to Shania Twain’s massive, multi-platinum The Woman In Me, country-pop became its own female-fronted genre, also dominated by Dixie Chicks, Faith Hill, and LeAnn Rimes.

Hip-hop gets soulful

But Billboard’s new accounting actually had its greatest impact on R&B and hip-hop, revealing the two genres’ growing relationship with one another. The 90s kicked off with New Jack Swing in full effect, its most effective purveyors being Bell Biv DeVoe, Al B. Sure!, Keith Sweat, and Boyz II Men. As New Jack Swing waned, R&B embraced a soul-and-groove sound typified by Janet Jackson, D’Angelo, Erykah Badu, Usher, Toni Braxton, and Mary J Blige.


The 1990s: American Pop Culture History

The 1990s were a really intense decade. First, we start off with a war that was about as one-sided as they get: the first Iraq War (AKA Desert Storm).

Then we get the birth of “grunge” as Seattle takes over the way everyone dresses and listens to music.

Gangsta rap also had a huge impact on the way people dressed, talked and acted.

Sports became even more popular than they were before, but unfortunately my Cleveland Browns decided to move to Baltimore.

There were tons and tons of cheesy TV shows and sitcoms, but we can’t forget about what was probably the best sitcom of all time: Seinfeld.

The 90s were a solid decade, with great movies like Pulp Fiction, great music like Radiohead and great cars like the Dodge Viper.

Just for fun, you can check out this goofy list of 1991 slang terms & phrases.

Learn a whole lot more about the 1990s below. Our in-depth profiles of various categories like fashion, sports and cars take you deep into the decade.


1990s in music

Popular music in the 1990s saw the continuation of teen pop and dance-pop trends which had emerged in the 1970s and 1980s. Hip hop also grew and remained highly successful throughout the decade, as the genre's golden age continued. Aside from rap, reggae, contemporary R&B, and urban music in general remained extremely popular; urban music in the late 1980s and 1990s often blended with styles such as soul, funk, and jazz, resulting in popular fusion genres such as new jack swing, neo-soul, hip hop soul, and g-funk.

As in the 1980s, rock music remained very popular in the 1990s, but in contrast to the new wave and glam metal scene that had dominated the previous decade, grunge, [1] Britpop, industrial rock, and other forms of alternative rock emerged and took over as the most popular rock styles of the decade. Punk rock, ska punk, and nu metal, amongst others, also attained a high level of success at different points throughout the years.

Electronic music, which had risen in popularity in the 1980s, grew highly popular in the 1990s; house and techno from the 1980s rose to international success in this decade, as did new electronic dance music genres such as rave, happy hardcore, drum and bass, intelligent dance music, and trip hop. In Europe, techno, rave, and reggae music were highly successful, [2] while also finding some international success. The decade also featured the rise of contemporary country music as a major genre, a rise that had begun in the 1980s. [3]

The 1990s also saw a resurgence of older styles in new contexts, including third wave ska and swing revival, both of which featured a fusion of horn-based music with rock music elements.

Reflecting on the decade's musical developments in Christgau's Consumer Guide: Albums of the '90s (2000), music critic Robert Christgau said the 1990s were "richly chaotic, unknowable", and "highly subject to vagaries of individual preference", yet "conducive to some manageable degree of general comprehension and enjoyment by any rock and roller." [4]

In December 1999, Billboard magazine named Mariah Carey as the Artist of the Decade in the United States. [5] In 1999, Selena was named the "top Latin artist of the '90s" and "best-selling Latin artist of the decade" by Billboard, for her fourteen top-ten singles in the Top Latin Songs chart, including seven number-one hits. [6] The singer also had the most successful singles of 1994 and 1995, "Amor Prohibido" and "No Me Queda Más". [7]

