From Snow White to Brown Skin: Media Studies and Disney
Movies

Movies[1]

Are 3-D Effects Creating Two-Dimensional Films?

In 2009, many moviegoers were amazed by the three-dimensional (3-D) film Avatar. Avatar grossed over $1.8 billion in theaters worldwide, $1.35 billion from 3-D sales alone. Following in that vein, dozens of other movie studios released 3-D films, resulting in lesser box office successes such as Alice in Wonderland, Clash of the Titans, and Shrek Forever After. Many film reviewers and audiences seemed adamant—3-D movies were the wave of the future. However, could this eye-popping technology actually ruin our moviegoing experience? Brian Moylan, a critic for Gawker.com, argues that it already has. The problem with 3-D, he says, is that “It is so mind-numbingly amazing that narrative storytelling hasn’t caught up with the technology. The corporate screenwriting borgs are so busy trying to come up with plot devices to highlight all the newfangled whoosiwhatsits—objects being hurled at the audience, flying sequences, falling leaves, glowing Venus Flytraps—that no one is really bothering to tell a tale.”

James Cameron, director of Avatar, agrees. “[Studios] think, ‘what was [sic] the takeaway lessons from Avatar? Oh you should make more money with 3-D.’ They ignore the fact that we natively authored the film in 3-D, and [they] decide that what we accomplished in several years of production could be done in an eight-week (post-production 3-D) conversion [such as] with Clash of the Titans.” Cameron makes the following point: While films such as Avatar (2009) and Beowulf (2007) were created exclusively for 3-D, many other filmmakers have converted their movies to 3-D after filming was already complete. Clash of the Titans is widely criticized because its 3-D effects were quickly added in postproduction.

What effect does this have on audiences? Aside from complaints of headaches and nausea (and the fact that some who wear glasses regularly find it uncomfortable or even impossible to wear 3-D glasses on top of their own), many say that the new technology simply makes movies look worse. The film critic Roger Ebert has repeatedly denounced the technology, noting that movies such as The Last Airbender look like they’re “filmed with a dirty sheet over the lens.” [5] 3-D technology can cause a movie to look fuzzier, darker, and generally less cinematically attractive. However, movie studios find 3-D films attractive for another reason. Because seeing a movie in 3-D is considered a “premium” experience, consumers are expected to pay higher prices, and with the increasing popularity of IMAX 3-D screenings, tickets may surpass $20 per person. This gives 3-D films an economic advantage over 2-D ones, as audiences are willing to pay more to see them.

The recent 3-D boom has often been compared to the rise of color film in the early 1950s. However, some maintain that it’s just a fad. Will 3-D technology affect the future of filmmaking? With a host of new 3-D technologies for the home theater being released in 2010, many are banking on the fact that it will. Director James Cameron, however, is unsure of the technology’s continuing popularity, arguing that “If people put bad 3-D in the marketplace they’re going to hold back or even threaten the emerging of 3-D.” What is important, he maintains, is the creative aspect of moviemaking—no technology can replace good filmmaking. In the end, audiences will determine the medium’s popularity. Throughout the history of film, Technicolor dyes, enhanced sound systems, and computer-generated graphics have boasted huge box-office revenues; however, it’s ultimately the viewers who determine what a good movie is and who set the standard for future films.

The History of Movies

The movie industry as we know it today emerged in the late 19th century from a series of technological developments: the creation of photography, the discovery of the illusion of motion by combining individual still images, and the study of human and animal locomotion. The history presented here begins at the culmination of these developments, when the idea of the motion picture as an entertainment industry first took hold. Since then, the industry has seen extraordinary transformations, some driven by the artistic visions of individual participants, some by commercial necessity, and still others by accident. The history of the cinema is complex, and for every important innovator and movement listed here, others have been left out. Nonetheless, after reading this section you will understand the broad arc of the development of a medium that has captured the imaginations of audiences worldwide for over a century.

The Beginnings: Motion Picture Technology of the Late 19th Century

While the experience of watching movies on smartphones may seem like a drastic departure from the communal nature of film viewing as we think of it today, in some ways the small-format, single-viewer display is a return to film’s early roots. In 1891, the inventor Thomas Edison, together with William Dickson, a young laboratory assistant, came out with what they called the kinetoscope, a device that would become the predecessor to the motion picture projector. The kinetoscope was a cabinet with a window through which individual viewers could experience the illusion of a moving image. A perforated celluloid film strip with a sequence of images on it was rapidly spooled between a lightbulb and a lens, creating the illusion of motion. The images viewers could see in the kinetoscope captured events and performances that had been staged at Edison’s film studio in West Orange, New Jersey, especially for the Edison kinetograph (the camera that produced kinetoscope film sequences): circus performances, dancing women, cockfights, boxing matches, and even a tooth extraction by a dentist.

As the kinetoscope gained popularity, the Edison Company began installing machines in hotel lobbies, amusement parks, and penny arcades, and soon kinetoscope parlors—where customers could pay around 25 cents for admission to a bank of machines—had opened around the country. However, when friends and collaborators suggested that Edison find a way to project his kinetoscope images for audience viewing, he apparently refused, claiming that such an invention would be a less profitable venture. Because Edison hadn’t secured an international patent for his invention, variations of the kinetoscope were soon being copied and distributed throughout Europe. This new form of entertainment was an instant success, and a number of mechanics and inventors, seeing an opportunity, began toying with methods of projecting the moving images onto a larger screen. However, it was the invention of two brothers, Auguste and Louis Lumière—photographic goods manufacturers in Lyon, France—that saw the most commercial success. In 1895, the brothers patented the Cinématographe (from which we get the term cinema), a lightweight film projector that also functioned as a camera and printer. Unlike the Edison kinetograph, the Cinématographe was lightweight enough for easy outdoor filming, and over the years the brothers used the camera to shoot well over 1,000 short films, most of which depicted scenes from everyday life. In December 1895, in the basement lounge of the Grand Café on the Boulevard des Capucines in Paris, the Lumières held the world’s first-ever commercial film screening, a sequence of about 10 short scenes, including the brothers’ first film, Workers Leaving the Lumière Factory, a segment lasting less than a minute and depicting workers leaving the family’s photographic instrument factory at the end of the day.

Believing that audiences would get bored watching scenes that they could just as easily observe on a casual walk around the city, Louis Lumière claimed that the cinema was “an invention without a future,” but demand for motion pictures grew at such a rapid rate that soon representatives of the Lumière company were traveling throughout Europe and the world, showing half-hour screenings of the company’s films. While cinema initially competed with other popular forms of entertainment—circuses, vaudeville acts, theater troupes, magic shows, and many others—eventually it would supplant these various entertainments as the main commercial attraction. Within a year of the Lumières’ first commercial screening, competing film companies were offering moving-picture acts in music halls and vaudeville theaters across Great Britain. In the United States, the Edison Company, having purchased the rights to an improved projector that it called the Vitascope, held its first film screening in April 1896 at Koster and Bial’s Music Hall in Herald Square, New York City.

Film’s profound impact on its earliest viewers is difficult to imagine today, inundated as many of us are by video images. However, the sheer volume of reports about early audiences’ disbelief, delight, and even fear at what they were seeing suggests that viewing a film was an overwhelming experience for many. Spectators gasped at the realistic details in films such as Robert Paul’s Rough Sea at Dover, and at times people panicked and tried to flee the theater during films in which trains or moving carriages sped toward the audience. Even the public’s perception of film as a medium was considerably different from the contemporary understanding; the moving image was an improvement upon the photograph—a medium with which viewers were already familiar—and this is perhaps why the earliest films documented events in brief segments but didn’t tell stories. During this “novelty period” of cinema, audiences were more interested in the phenomenon of the film projector itself, so vaudeville halls advertised the kind of projector they were using (for example, “The Vitascope—Edison’s Latest Marvel”) [10], rather than the names of the films.

By the close of the 19th century, as public excitement over the moving picture’s novelty gradually wore off, filmmakers were also beginning to experiment with film’s possibilities as a medium in itself (not simply, as it had been regarded up until then, as a tool for documentation, analogous to the camera or the phonograph). Technical innovations allowed filmmakers like Parisian cinema owner Georges Méliès to experiment with special effects that produced seemingly magical transformations on screen: flowers turned into women, people disappeared with puffs of smoke, a man appeared where a woman had just been standing, and other similar tricks.

Not only did Méliès, a former magician, invent the “trick film,” which producers in England and the United States began to imitate, but he was also the one to transform cinema into the narrative medium it is today. Whereas before, filmmakers had only ever created single-shot films that lasted a minute or less, Méliès began joining these short films together to create stories. His 30-scene Trip to the Moon (1902), a film based on a Jules Verne novel, may have been the most widely seen production in cinema’s first decade. However, Méliès never developed his technique beyond treating the narrative film as a staged theatrical performance; his camera, representing the vantage point of an audience facing a stage, never moved during the filming of a scene. In 1912, Méliès released his last commercially successful production, The Conquest of the Pole, and from then on, he lost audiences to filmmakers who were experimenting with more sophisticated techniques.

The Nickelodeon Craze (1904–1908)

One of these innovative filmmakers was Edwin S. Porter, a projectionist and engineer for the Edison Company. Porter’s 12-minute film, The Great Train Robbery (1903), broke with the stage-like compositions of Méliès-style films through its use of editing, camera pans, rear projections, and diagonally composed shots that produced a continuity of action. Not only did The Great Train Robbery establish the realistic narrative as a standard in cinema, it was also the first major box-office hit. Its success paved the way for the growth of the film industry, as investors, recognizing the motion picture’s great moneymaking potential, began opening the first permanent film theaters around the country. Known as nickelodeons because of their 5-cent admission charge, these early motion picture theaters, often housed in converted storefronts, were especially popular among the working class of the time, who couldn’t afford live theater. Between 1904 and 1908, around 9,000 nickelodeons appeared in the United States. It was the nickelodeon’s popularity that established film as a mass entertainment medium.

The “Biz”: The Motion Picture Industry Emerges

As the demand for motion pictures grew, production companies were created to meet it. At the peak of nickelodeon popularity in 1910, there were 20 or so major motion picture companies in the United States. However, heated disputes often broke out among these companies over patent rights and industry control, leading even the most powerful among them to fear fragmentation that would loosen their hold on the market. Because of these concerns, the 10 leading companies—including Edison, Biograph, Vitagraph, and others—formed the Motion Picture Patents Company (MPPC) in 1908. The MPPC was a trade group that pooled the most significant motion picture patents and established an exclusive contract between these companies and the Eastman Kodak Company as a supplier of film stock. Also known as the Trust, the MPPC’s goal was to standardize the industry and shut out competition through monopolistic control. Under the Trust’s licensing system, only certain licensed companies could participate in the exchange, distribution, and production of film at different levels of the industry—a shut-out tactic that eventually backfired, leading the excluded, independent distributors to organize in opposition to the Trust.

The Rise of the Feature

In these early years, theaters were still running single-reel films, which came at a standard length of 1,000 feet, allowing for about 16 minutes of playing time. However, companies began to import multiple-reel films from European producers around 1907, and the format gained popular acceptance in the United States in 1912 with Louis Mercanton’s highly successful Queen Elizabeth, a three-and-a-half-reel “feature” starring the French actress Sarah Bernhardt. As exhibitors began to show more features—as the multiple-reel film came to be called—they discovered a number of advantages over the single-reel short. For one thing, audiences saw these longer films as special events and were willing to pay more for admission, and because of the popularity of the feature narratives, features generally enjoyed longer runs in theaters than their single-reel predecessors. Additionally, the feature film gained popularity among the middle classes, who saw its length as analogous to the more “respectable” entertainment of live theater. Following the example of the French film d’art, U.S. feature producers often took their material from sources that would appeal to a wealthier and better-educated audience, such as histories, literature, and stage productions.

As it turns out, the feature film was one factor that brought about the eventual downfall of the MPPC. The inflexible structuring of the Trust’s exhibition and distribution system made the organization resistant to change. When Vitagraph, a movie studio and Trust member, began to release features like A Tale of Two Cities (1911) and Uncle Tom’s Cabin (1910), the Trust forced it to exhibit the films serially in single-reel showings to keep with industry standards. The MPPC also underestimated the appeal of the star system, a trend that began when producers chose famous stage actors like Mary Pickford and James O’Neill to play the leading roles in their productions and to grace their advertising posters. Because of the MPPC’s inflexibility, independent companies were the only ones able to capitalize on two important trends that were to become film’s future: multiple-reel features and star power. Today, few people would recognize names like Vitagraph or Biograph, but the independents that outlasted them—Universal, Goldwyn (which would later merge with Metro and Mayer), Fox (later 20th Century Fox), and Paramount (the later version of the Lasky Corporation)—have become household names.

Hollywood

As movie-going increased in popularity among the middle class, and as feature films began keeping audiences in their seats for longer periods of time, exhibitors found a need to create more comfortable and richly decorated theater spaces to attract their audiences. These “dream palaces,” so called because of their often lavish embellishments of marble, brass, gilding, and cut glass, not only came to replace the nickelodeon theater but also created the demand that would lead to the Hollywood studio system. Some producers realized that the growing demand for new work could only be met if films were produced on a regular, year-round schedule. However, this was impractical under the existing system, which often relied on outdoor filming and was predominantly based in Chicago and New York—two cities whose weather conditions prevented outdoor filming for a significant portion of the year. Various companies attempted filming in warmer locations such as Florida, Texas, and Cuba, but the place where producers eventually found the most success was a small, industrial suburb of Los Angeles called Hollywood. Hollywood proved to be an ideal location for a number of reasons. Not only was the climate temperate and sunny year-round, but land was plentiful and cheap, and the location allowed close access to a number of diverse topographies: mountains, lakes, desert, coasts, and forests. By 1915, more than 60 percent of U.S. film production was centered in Hollywood.

The Art of Silent Film

While the development of narrative film was largely driven by commercial factors, it is also important to acknowledge the role of individual artists who turned it into a medium of personal expression. The motion picture of the silent era was generally simplistic in nature: acted in exaggerated, overly animated movements to engage the eye, accompanied by live music played by musicians in the theater, and narrated through written titles that created a mood and told the story. Within the confines of this medium, one filmmaker in particular emerged to transform the silent film into an art and to unlock its potential as a medium of serious expression and persuasion. D. W. Griffith, who entered the film industry as an actor in 1907, quickly moved to a directing role in which he worked closely with his camera crew to experiment with shots, angles, and editing techniques that could heighten the emotional intensity of his scenes. He found that by practicing parallel editing, in which a film alternates between two or more scenes of action, he could create an illusion of simultaneity. He could then heighten the tension of the film’s drama by alternating between cuts more and more rapidly until the scenes of action converged. Griffith used this technique to great effect in his controversial film The Birth of a Nation, which will be discussed in greater detail later on in this chapter. Other techniques that Griffith employed to new effect included panning shots, through which he was able to establish a sense of scene and to engage his audience more fully in the experience of the film, and tracking shots, or shots that traveled with the movement of a scene, which allowed the audience—through the eye of the camera—to participate in the film’s action.

MPAA: Combating Censorship

As film became an increasingly lucrative U.S. industry, prominent industry figures like D. W. Griffith, slapstick comedian/director Charlie Chaplin, and actors Mary Pickford and Douglas Fairbanks grew extremely wealthy and influential. Public attitudes toward stars and toward some stars’ extravagant lifestyles were divided, much as they are today: On the one hand, these celebrities were idolized and imitated in popular culture, yet at the same time, they were criticized for representing a threat, on and off screen, to traditional morals and social order. And much as it does today, the news media liked to sensationalize the lives of celebrities to sell stories. Comedian Roscoe “Fatty” Arbuckle, who worked alongside future icons Charlie Chaplin and Buster Keaton, was at the center of one of the biggest scandals of the silent era. When Arbuckle hosted a marathon party over Labor Day weekend in 1921, one of his guests, the actress and model Virginia Rappe, was rushed to the hospital, where she later died. Reports of a drunken orgy, rape, and murder surfaced. Following World War I, the United States was in the middle of significant social reforms, such as Prohibition. Many feared that movies and their stars could threaten the moral order of the country. Because of the nature of the crime and the celebrity involved, these fears became inextricably tied to the Arbuckle case. Even though autopsy reports ruled that Rappe had died from causes for which Arbuckle could not be blamed, the comedian was tried (and acquitted) for manslaughter, and his career was ruined.

The Arbuckle affair and a series of other scandals only increased public fears about Hollywood’s impact. In response to this perceived threat, state and local governments increasingly tried to censor the content of films that depicted crime, violence, and sexually explicit material. Deciding that they needed to protect themselves from government censorship and to foster a more favorable public image, the major Hollywood studios organized in 1922 to form an association they called the Motion Picture Producers and Distributors of America (later renamed the Motion Picture Association of America, or MPAA). Among other things, the MPAA instituted a code of self-censorship for the motion picture industry. Today, the MPAA operates a voluntary rating system: producers can submit a film for review and receive a rating designed to alert viewers to the film’s age-appropriateness, while the system still protects filmmakers’ artistic freedom.

Silent Film’s Demise

In 1925, Warner Bros. was just a small Hollywood studio looking for opportunities to expand. When representatives from Western Electric offered to sell the studio the rights to a new technology they called Vitaphone, a sound-on-disc system that had failed to capture the interest of any of the industry giants, Warner Bros. executives took a chance, predicting that the novelty of talking films might be a way to make a quick, short-term profit. Little did they anticipate that their gamble would not only establish them as a major Hollywood presence but also change the industry forever.

The pairing of sound with motion pictures was nothing new in itself. Edison, after all, had commissioned the kinetoscope to create a visual accompaniment to the phonograph, and many early theaters had orchestra pits to provide musical accompaniment to their films. Even the smaller picture houses with lower budgets almost always had an organ or piano. When Warner Bros. purchased Vitaphone technology, it planned to use it to provide prerecorded orchestral accompaniment for its films, thereby increasing their marketability to the smaller theaters that didn’t have their own orchestra pits. In 1926, Warner debuted the system with the release of Don Juan, a costume drama accompanied by a recording of the New York Philharmonic Orchestra; the public responded enthusiastically. By 1927, after a $3 million campaign, Warner Bros. had wired more than 150 theaters in the United States, and it released its second sound film, The Jazz Singer, in which the actor Al Jolson improvised a few lines of synchronized dialogue and sang six songs. The film was a major breakthrough. Audiences, hearing an actor speak on screen for the first time, were enchanted. While radio, a new and popular entertainment, had been drawing audiences away from the picture houses for some time, with the birth of the “talkie,” or talking film, audiences once again returned to the cinema in large numbers, lured by the promise of seeing and hearing their idols perform. By 1929, three-fourths of Hollywood films had some form of sound accompaniment, and by 1930, the silent film was a thing of the past.

“I Don’t Think We’re in Kansas Anymore”: Film Goes Technicolor

Although the techniques of tinting and hand painting had been available methods for adding color to films for some time (Georges Méliès, for instance, employed a crew to hand-paint many of his films), neither method ever caught on. The hand-painting technique became impractical with the advent of mass-produced film, and the tinting process, which filmmakers discovered would interfere with the transmission of sound in films, was abandoned with the rise of the talkie. However, in 1922, Herbert Kalmus’ Technicolor company introduced a dye-transfer technique that allowed it to produce a full-length film, The Toll of the Sea, in two primary colors. However, because only two colors were used, the appearance of The Toll of the Sea (1922), The Ten Commandments (1923), and other early Technicolor films was not very lifelike. By 1932, Technicolor had designed a three-color system with more realistic results, and for the next 25 years, all color films were produced with this improved system. Disney’s Three Little Pigs (1933) and Snow White and the Seven Dwarfs (1937) and films with live actors, like MGM’s The Wizard of Oz (1939) and Gone With the Wind (1939), experienced early success using Technicolor’s three-color method.

Despite the success of certain color films in the 1930s, Hollywood, like the rest of the United States, was feeling the impact of the Great Depression, and the expenses of special cameras, crews, and Technicolor lab processing made color films impractical for studios trying to cut costs. Therefore, it wasn’t until the end of the 1940s that Technicolor would largely displace the black-and-white film.

Rise and Fall of the Hollywood Studio

The spike in theater attendance that followed the introduction of talking films changed the economic structure of the motion picture industry, bringing about some of the largest mergers in industry history. By 1930, eight studios produced 95 percent of all American films, and they continued to experience growth even during the Depression. The five most influential of these studios—Warner Bros., Metro-Goldwyn-Mayer, RKO, 20th Century Fox, and Paramount—were vertically integrated; that is, they controlled every part of the system as it related to their films, from production to release, distribution, and even exhibition. Because they owned theater chains worldwide, these studios controlled which movies exhibitors ran, and because they “owned” a stock of directors, actors, writers, and technical assistants by contract, each studio produced films of a particular character.

The late 1930s and early 1940s are sometimes known as the “Golden Age” of cinema, a time of unparalleled success for the movie industry; by 1939, film was the 11th-largest industry in the United States, and during World War II, when the U.S. economy was once again flourishing, two-thirds of Americans were attending the theater at least once a week. Some of the most acclaimed movies in history were released during this period, including Citizen Kane and The Grapes of Wrath. However, postwar inflation, a temporary loss of key foreign markets, the advent of television, and other factors combined to bring that rapid growth to an end. In 1948, the case of United States v. Paramount Pictures—mandating competition and forcing the studios to relinquish control over theater chains—dealt the final devastating blow from which the studio system would never recover. Control of the major studios reverted to Wall Street, where the studios were eventually absorbed by multinational corporations, and the powerful studio heads lost the influence they had held for nearly 30 years.

Post–World War II: Television Presents a Threat

While economic factors and antitrust legislation played key roles in the decline of the studio system, perhaps the most important factor in that decline was the advent of television. Given the opportunity to watch “movies” from the comfort of their own homes, the millions of Americans who owned a television by the early 1950s were attending the cinema far less regularly than they had only a few years earlier. In an attempt to win back diminishing audiences, studios did their best to exploit the greatest advantages film held over television. For one thing, television broadcasting in the 1950s was all in black and white, whereas the film industry had the advantage of color. While producing a color film was still an expensive undertaking in the late 1940s, a couple of changes occurred in the industry in the early 1950s that made color not only more affordable but also more realistic in its appearance. In 1950, as the result of antitrust legislation, Technicolor lost its monopoly on the color film industry, allowing other providers to offer more competitive pricing on filming and processing services. At the same time, Kodak came out with a multilayer film stock that made it possible to use more affordable cameras and to produce a higher quality image. Kodak’s Eastmancolor option was an integral component in converting the industry to color. In the late 1940s, only 12 percent of features were in color; however, by 1954 (after the release of Kodak Eastmancolor), more than 50 percent of movies were in color.

Another clear advantage on which filmmakers tried to capitalize was the sheer size of the cinema experience. With the release of the epic biblical film The Robe in 1953, 20th Century Fox introduced the method that would soon be adopted by nearly every studio in Hollywood: a technology that allowed filmmakers to squeeze a wide-angle image onto conventional 35-mm film stock, thereby increasing the aspect ratio (the ratio of a screen’s width to its height) of their images. This wide-screen format increased the immersive quality of the theater experience. Nonetheless, even with these advancements, movie attendance never again reached the record numbers it experienced in 1946, at the peak of the Golden Age of Hollywood.

Mass Entertainment, Mass Paranoia: HUAC and the Hollywood Blacklist

The Cold War with the Soviet Union began in 1947, and with it came the widespread fear of communism, not only from the outside but equally from within. To counter this perceived threat, the House Un-American Activities Committee (HUAC) commenced investigations to locate communist sympathizers in America, who were suspected of conducting espionage for the Soviet Union. In the highly conservative and paranoid atmosphere of the time, Hollywood, the source of a mass-cultural medium, came under fire in response to fears that subversive, communist messages were being embedded in films. In November 1947, more than 100 people in the movie business were called to testify before the HUAC about their and their colleagues’ involvement with communist affairs. Of those investigated, 10 in particular refused to cooperate with the committee’s questions. These 10, later known as the Hollywood Ten, were fired from their jobs and sentenced to serve up to a year in prison. The studios, already slipping in influence and profit, were eager to cooperate in order to save themselves, and a number of producers signed an agreement stating that no communists would work in Hollywood.

The hearings, which recommenced in 1951 with the rise of Senator Joseph McCarthy’s influence, turned into a kind of witch hunt as witnesses were asked to testify against their associates, and a blacklist of suspected communists evolved. Over 324 individuals lost their jobs in the film industry as a result of blacklisting (the denial of work in a certain field or industry) and HUAC investigations.

Down with the Establishment: Youth Culture of the 1960s and 1970s

Movies of the late 1960s began attracting a younger demographic, as a growing number of young people were drawn in by films like Sam Peckinpah’s The Wild Bunch (1969), Stanley Kubrick’s 2001: A Space Odyssey (1968), Arthur Penn’s Bonnie and Clyde (1967), and Dennis Hopper’s Easy Rider (1969)—all revolutionary in their genres—that displayed a sentiment of unrest toward conventional social orders and included some of the earliest instances of realistic and brutal violence in film. These four films in particular grossed so much money at the box office that producers began churning out low-budget copycats to draw in a new, profitable market. [40] While this led to a rise in youth-culture films, few of them saw great success. However, the new liberal attitudes toward depictions of sex and violence in these films represented a sea change in the movie industry that manifested in many movies of the 1970s, including Francis Ford Coppola’s The Godfather (1972), William Friedkin’s The Exorcist (1973), and Steven Spielberg’s Jaws (1975), all three of which saw great financial success. [41]

Blockbusters, Knockoffs, and Sequels

In the 1970s, with the rise of work by Coppola, Spielberg, George Lucas, Martin Scorsese, and others, a new breed of director emerged. These directors were young and film-school educated, and they contributed a sense of professionalism, sophistication, and technical mastery to their work, leading to a wave of blockbuster productions, including Close Encounters of the Third Kind (1977), Star Wars (1977), Raiders of the Lost Ark (1981), and E.T.: The Extra-Terrestrial (1982). The computer-generated special effects that were available at this time also contributed to the success of a number of large-budget productions. In response to these and several earlier blockbusters, movie production and marketing techniques also began to shift, with studios investing more money in fewer films in the hopes of producing more big successes. For the first time, the hefty sums producers and distributors invested didn’t go to production costs alone; distributors were discovering the benefits of TV and radio advertising and finding that doubling their advertising costs could increase profits as much as three or four times over. With the opening of Jaws, one of the five top-grossing films of the decade (and the highest-grossing film of all time until the release of Star Wars in 1977), Hollywood embraced the wide-release method of movie distribution, abandoning the release methods of earlier decades, in which a film would debut in only a handful of select theaters in major cities before it became gradually available to mass audiences. Jaws was released in 600 theaters simultaneously, and the big-budget films that followed came out in anywhere from 800 to 2,000 theaters nationwide on their opening weekends.

The major Hollywood studios of the late 1970s and early 1980s, now run by international corporations, tended to favor the conservative gamble of the tried and true, and as a result, the period saw an unprecedented number of high-budget sequels—as in the Star Wars, Indiana Jones, and Godfather films—as well as imitations and adaptations of earlier successful material, such as the plethora of “slasher” films that followed the success of the 1978 thriller Halloween. Additionally, corporations sought revenue sources beyond the movie theater, looking to the video and cable releases of their films. Introduced in 1975, the VCR became nearly ubiquitous in American homes by 1998, with 88.9 million households owning the appliance. Cable television’s growth was slower, but ownership of VCRs gave people a new reason to subscribe, and cable subsequently expanded as well. And the newly introduced concept of film-based merchandise (toys, games, books, etc.) allowed companies to increase profits even more.

The 1990s and Beyond

The 1990s saw the rise of two divergent strands of cinema: the technically spectacular blockbuster with special, computer-generated effects and the independent, low-budget film. The capabilities of special effects were enhanced when studios began manipulating film digitally. Early examples of this technology can be seen in Terminator 2: Judgment Day (1991) and Jurassic Park (1993). Films with an epic scope—Independence Day (1996), Titanic (1997), and The Matrix (1999)—also employed a range of computer-animation techniques and special effects to wow audiences and to draw more viewers to the big screen. Toy Story (1995), the first fully computer-animated film, and those that came after it, such as Antz (1998), A Bug’s Life (1998), and Toy Story 2 (1999), displayed the improved capabilities of computer-generated animation.[45] At the same time, independent directors and producers, such as the Coen brothers and Spike Jonze, enjoyed increased popularity, often for lower-budget films that audiences were more likely to watch on video at home. A prime example of this is the Academy Awards ceremony honoring the films of 1996, when independent films dominated the Best Picture category. Only one movie from a big film studio was nominated—Jerry Maguire—and the rest were independent films. The growth of both independent movies and special-effects-laden blockbusters continues to the present day. You will read more about current issues and trends and the future of the movie industry later on in this chapter.

Movies and Culture

Movies Mirror Culture

The relationship between movies and culture involves a complicated dynamic; while American movies certainly influence the mass culture that consumes them, they are also an integral part of that culture, a product of it, and therefore a reflection of prevailing concerns, attitudes, and beliefs. In considering the relationship between film and culture, it is important to keep in mind that, while certain ideologies may be prevalent in a given era, not only is American culture as diverse as the populations that form it, but it is also constantly changing from one period to the next. Mainstream films produced in the late 1940s and into the 1950s, for example, reflected the conservatism that dominated the sociopolitical arenas of the time. However, by the 1960s, a rebellious youth culture began to emerge in opposition to the dominant institutions, and these anti-establishment views soon found their way onto the screen—a far cry from the attitudes most commonly represented only a few years earlier.

In one sense, movies could be characterized as America’s storytellers. Not only do Hollywood films reflect certain commonly held attitudes and beliefs about what it means to be American, but they also portray contemporary trends, issues, and events, serving as records of the eras in which they were produced. Consider, for example, films about the September 11, 2001, terrorist attacks: Fahrenheit 9/11, World Trade Center, United 93, and others. These films grew out of a seminal event of the time, one that preoccupied the consciousness of Americans for years after it occurred.

Birth of a Nation

In 1915, director D. W. Griffith established his reputation with the highly successful film The Birth of a Nation, based on Thomas Dixon’s novel The Clansman, a pro-segregation narrative about the American South during and after the Civil War. At the time, The Birth of a Nation was the longest feature film ever made, at almost 3 hours, and contained huge battle scenes that amazed and delighted audiences. Griffith’s storytelling ability helped solidify the narrative style that would go on to dominate feature films. He also experimented with techniques such as the close-up, the jump cut, and parallel editing, which helped make the film an artistic achievement. Griffith’s film found success largely because it captured the social and cultural tensions of the era. As American studies specialist Lary May has argued, “[Griffith’s] films dramatized every major concern of the day.” In the early 20th century, fears about recent waves of immigrants had led to certain racist attitudes in mass culture, with “scientific” theories of the time purporting to link race with inborn traits like intelligence and other capabilities. Additionally, the dominant political climate, largely a reaction against populist labor movements, was one of conservative elitism, eager to attribute social inequalities to natural human differences. According to a report by the New York Evening Post after the film’s release, even some Northern audiences “clapped when the masked riders took vengeance on Negroes.” However, the outrage many groups expressed about the film is a good reminder that American culture is not monolithic, that there are always strong contingents in opposition to dominant ideologies.

While critics praised the film for its narrative complexity and epic scope, many others were outraged and even started riots at several screenings because of its highly controversial, openly racist attitudes, which glorified the Ku Klux Klan and blamed Southern blacks for the devastation of the war and its aftermath. Many Americans joined the National Association for the Advancement of Colored People (NAACP) in denouncing the film, and the National Board of Review eventually cut a number of the film’s racist sections. However, it’s important to keep in mind the attitudes of the early 1900s. At the time, the nation was divided, and Jim Crow laws and segregation were enforced. Nonetheless, The Birth of a Nation was the highest-grossing movie of its era. In 1992, the film was classified by the Library of Congress among the “culturally, historically, or aesthetically significant films” in U.S. history.

“The American Way”

Until the bombing of Pearl Harbor in 1941, American films after World War I generally reflected the neutral, isolationist stance that prevailed in politics and culture. However, after the United States was drawn into the war in Europe, the government enlisted Hollywood to help with the war effort, opening the federal Bureau of Motion Picture Affairs in Los Angeles. Bureau officials served in an advisory capacity on the production of war-related films, an effort with which the studios cooperated. As a result, films tended toward the patriotic and were produced to inspire feelings of pride and confidence in being American and to clearly establish that America and its allies were forces of good. For instance, the critically acclaimed Casablanca paints a picture of the ill effects of fascism, illustrates the values that heroes like Victor Laszlo hold, and depicts America as a place for refugees to find democracy and freedom.

These early World War II films were sometimes overtly propagandist, intended to influence American attitudes rather than present a genuine reflection of American sentiments toward the war. Frank Capra’s Why We Fight films, for example, the first of which was produced in 1942, were developed for the U.S. Army and were later shown to general audiences; they delivered a war message through narrative. As the war continued, however, filmmakers opted to forego patriotic themes for a more serious reflection of American sentiments, as exemplified by films like Alfred Hitchcock’s Lifeboat.

Youth versus Age: From Counterculture to Mass Culture

In Mike Nichols’s 1967 film The Graduate, Dustin Hoffman, as the film’s protagonist, enters into a romantic affair with the wife of his father’s business partner. However, Mrs. Robinson and the other adults in the film fail to understand the young, alienated hero, who eventually rebels against them. The Graduate, which brought in more than $44 million at the box office, reflected the attitudes of many members of a young generation growing increasingly dissatisfied with what they perceived to be the repressive social codes established by their more conservative elders. This baby boomer generation came of age during the Korean and Vietnam wars. Not only did the youth culture express cynicism toward the patriotic, prowar stance of their World War II–era elders, but they displayed a fierce resistance toward institutional authority in general, an anti-establishmentism epitomized in the 1967 hit film Bonnie and Clyde. In the film, a young outlaw couple sets out on a cross-country bank-robbing spree until they’re killed in a violent police ambush at the film’s close. Bonnie and Clyde’s violence provides one example of the ways films at the time were testing the limits of permissible on-screen material. The youth culture’s liberal attitudes toward formerly taboo subjects like sexuality and drugs began to emerge in film during the late 1960s. Like Bonnie and Clyde, Sam Peckinpah’s 1969 Western The Wild Bunch displays an early example of aestheticized violence in film. The wildly popular Easy Rider (1969)—containing drugs, sex, and violence—may owe a good deal of its initial success to liberalized audiences. And in the same year, Midnight Cowboy, one of the first Hollywood films to receive an X rating (in this case for its sexual content), won three Academy Awards, including Best Picture. As the release and subsequently successful reception of these films attest, what at the decade’s outset had been countercultural had, by the decade’s close, become mainstream.

The Hollywood Production Code

When the MPAA (originally MPPDA) first banded together in 1922 to combat government censorship and to promote artistic freedom, the association attempted a system of self-regulation. However, by 1930—in part because of the transition to talking pictures—renewed criticism and calls for censorship from conservative groups made it clear to the MPPDA that the loose system of self-regulation was not enough protection. As a result, the MPPDA instituted the Production Code, or Hays Code (after MPPDA director William H. Hays), which remained in place until 1967. The code, which according to motion picture producers concerned itself with ensuring that movies were “directly responsible for spiritual or moral progress, for higher types of social life, and for much correct thinking,” was strictly enforced starting in 1934, putting an end to most public complaints. However, many people in Hollywood resented its restrictiveness. After a series of Supreme Court cases in the 1950s regarding the code’s restrictions on freedom of speech, the Production Code grew weaker until it was finally replaced in 1967 with the MPAA rating system.

MPAA Ratings

As films like Bonnie and Clyde and Who’s Afraid of Virginia Woolf? (1966) tested the limits on violence and language, it became clear that the Production Code was in need of replacement. In 1968, the MPAA adopted a ratings system to identify films in terms of potentially objectionable content. By providing officially designated categories for films that would not have passed Production Code standards of the past, the MPAA opened a way for films to deal openly with mature content. The ratings system originally included four categories: G (suitable for general audiences), M (equivalent to the PG rating of today), R (viewers under age 16 admitted only with an accompanying adult), and X (equivalent to today’s NC-17).

The MPAA rating system, with some modifications, is still in place today. Before release in theaters, films are submitted to the MPAA board for a screening, during which advisers decide on the most appropriate rating based on the film’s content. However, studios are not required to have the MPAA screen releases ahead of time—some studios release films without an MPAA rating at all. Commercially, less restrictive ratings are generally more beneficial, particularly in the case of adult-themed films that have the potential to earn the most restrictive rating, the NC-17. Some movie theaters will not screen a movie that is rated NC-17. When filmmakers receive a more restrictive rating than they were hoping for, they may resubmit the film for review after editing out objectionable scenes.

The New War Film: Cynicism and Anxiety

Unlike the patriotic war films of the World War II era, many of the films about U.S. involvement in Vietnam reflected strong antiwar sentiment, criticizing American political policy and portraying war’s damaging effects on those who survived it. Films like Dr. Strangelove (1964), M*A*S*H (1970), The Deer Hunter (1978), and Apocalypse Now (1979) portray the military establishment in a negative light and dissolve clear-cut distinctions, such as the “us versus them” mentality, of earlier war films. These, and the dozens of Vietnam War films that were produced in the 1970s and 1980s—Oliver Stone’s Platoon (1986) and Born on the Fourth of July (1989), and Stanley Kubrick’s Full Metal Jacket (1987), for example—reflect the sense of defeat and lack of closure Americans felt after the Vietnam War and the emotional and psychological scars it left on the nation’s psyche. A spate of military and politically themed films emerged during the 1980s as America recovered from defeat in Vietnam, while at the same time facing anxieties about the ongoing Cold War with the Soviet Union. Fears about the possibility of nuclear war were very real during the 1980s, and some film critics argue that these anxieties were reflected not only in overtly political films of the time but also in the popularity of horror films, like Halloween and Friday the 13th, which feature a mysterious and unkillable monster, and in the popularity of the fantastic in films like E.T.: The Extra-Terrestrial, Raiders of the Lost Ark, and Star Wars, which offer imaginative escapes.

Movies Shape Culture

Just as movies reflect the anxieties, beliefs, and values of the cultures that produce them, they also help to shape and solidify a culture’s beliefs. Sometimes the influence is trivial, as in the case of fashion trends or figures of speech. After the release of Flashdance in 1983, for instance, torn T-shirts and leg warmers became hallmarks of the fashion of the 1980s. However, sometimes the impact can be profound, leading to social or political reform, or the shaping of ideologies.

Film and the Rise of Mass Culture

During the 1890s and up until about 1920, American culture experienced a period of rapid industrialization. As people moved from farms to centers of industrial production, urban areas began to hold larger and larger concentrations of the population. At the same time, film and other methods of mass communication, such as advertising and radio, developed; their messages concerning tastes, desires, customs, speech, and behavior spread from these population centers to outlying areas across the country. The effect of early mass-communication media was to wear away regional differences and create a more homogenized, standardized culture.

Film played a key role in this development, as viewers began to imitate the speech, dress, and behavior of their common heroes on the silver screen. In 1911, the Vitagraph company began publishing The Motion Picture Magazine, America’s first fan magazine. Originally conceived as a marketing tool to keep audiences interested in Vitagraph’s pictures and major actors, The Motion Picture Magazine helped create the concept of the film star in the American imagination. Fans became obsessed with the off-screen lives of their favorite celebrities, like Pearl White, Florence Lawrence, and Mary Pickford.

American Myths and Traditions

American identity in mass society is built around certain commonly held beliefs, or myths, about shared experiences, and these American myths are often disseminated through or reinforced by film. One example of a popular American myth, one that dates back to the writings of Thomas Jefferson and other founders, is an emphasis on individualism—a celebration of the common man or woman as a hero or reformer. With the rise of mass culture, the myth of the individual became increasingly appealing because it provided people with a sense of autonomy and individuality in the face of an increasingly homogenized culture. The hero myth finds embodiment in the Western, a film genre popular from the silent era through the 1960s, in which the lone cowboy, a seminomadic wanderer, makes his way in a lawless, and often dangerous, frontier. An example is 1952’s High Noon. From 1926 until 1967, Westerns accounted for nearly a quarter of all films produced. In other films, like Frank Capra’s 1946 movie It’s a Wonderful Life, the individual triumphs by standing up to injustice, reinforcing the belief that one person can make a difference in the world. And in more recent films, hero figures such as Indiana Jones, Luke Skywalker (Star Wars), and Neo (The Matrix) have continued to emphasize individualism.

Social Issues in Film

As D. W. Griffith recognized nearly a century ago, film has enormous power as a medium to influence public opinion. Ever since Griffith’s The Birth of a Nation sparked strong public reactions in 1915, filmmakers have been producing movies that address social issues, sometimes subtly and sometimes very directly. More recently, films like Hotel Rwanda (2004), about the 1994 Rwandan genocide, or The Kite Runner (2007), a story that takes place in the midst of a war-torn Afghanistan, have captured audiences’ imaginations by telling stories that raise social awareness about world events. And a number of documentary films directed at social issues have had a strong influence on cultural attitudes and have brought about significant change.

In the 2000s, documentaries, particularly those of an activist nature, were met with greater interest than ever before. Films like Super Size Me (2004), which documents the effects of excessive fast-food consumption and criticizes the fast-food industry for promoting unhealthy eating habits for profit, and Food, Inc. (2009), which examines corporate farming practices and points to the negative impact these practices can have on human health and the environment, have brought about important changes in American food culture. Just six weeks after the release of Super Size Me, McDonald’s took the supersize option off its menu, and since 2004 it has introduced a number of healthy food options in its restaurants. Other fast-food chains have made similar changes.

Other documentaries intended to influence cultural attitudes and inspire change include those made by director Michael Moore. Moore’s films present a liberal stance on social and political issues such as health care, globalization, and gun control. His 2002 film Bowling for Columbine, for example, addressed the Columbine High School shootings of 1999, presenting a critical examination of American gun culture. While some critics have accused Moore of producing propagandistic material under the label of documentary because of his films’ strong biases, his films have been popular with audiences, with four of his documentaries ranking among the highest-grossing documentaries of all time. Fahrenheit 9/11 (2004), which criticized the second Bush administration and its involvement in the Iraq War, earned $119 million at the box office, making it the most successful documentary of all time.

  1. This chapter is adapted from The Saylor Foundation’s Media and Culture, which itself was adapted without attribution.

This text is licensed under a CC BY 4.0 license, except where noted.