A Brief History of Sound Effects

Sound effects have come a long way since the days of banging coconut halves together to simulate horse hooves. Now, just a few clicks away from this blog, you can access extraordinarily detailed, high-quality recordings of a real stallion – or almost anything else you can dream of, from underwater sounds to bank vault doors. The tools and resources for making sound effects are also more advanced and accessible than ever before, allowing sound artists to do their jobs more creatively and efficiently.

But to get to this point, sound effects have gone through a long evolution. Join us as we chart a course through the fascinating history of sound effects, including landmark innovations and technologies that have advanced the art from the earliest days of live performance to the current golden age of technology.

 

In the Beginning…

Sound effects began as clever ways to simulate real-world sounds for theatrical performances. By 3000 B.C., people in modern-day China and India had begun incorporating sound into performance, a tradition carried on by the Greeks and Romans, who built special machines to create the sounds of thunder and earthquakes conjured by the gods. Early devices included a “thunder machine” that dropped brass balls onto stretched hides and a “wind machine” consisting of a rotating wheel draped with fabric.

Mechanical sound effects continued to advance throughout the medieval and Renaissance periods as theater moved indoors and audiences expected a more immersive experience. By the 18th century, all manner of clever devices had been invented, including a “rain box” that functioned like a massive rainstick and a new type of thunder machine involving large balls rolled through a long trough hidden in the ceiling.

While experiments in sound recording had been in progress since the mid-19th century, recorded sound effects didn’t make their debut until 1890, when a baby’s cry was played on a phonograph in a London theater.

 

Radio and Talkies

By the early 20th century, radio broadcasting technology had matured enough to make radio plays a mainstream form of entertainment. When consumer radios began popping up in households, radio plays saw a surge in popularity – and without visuals to accompany the story, sound effects became more important than ever. In radio plays, sound effects artists performed live along with the cast, manipulating props to add a sense of immersion and make the story more believable. These techniques were so effective that during the 1938 broadcast of H. G. Wells’ The War of the Worlds, some listeners were convinced there was a real alien invasion happening.

Thanks to sound-on-film technology, motion pictures began to feature synchronized sound in the 1920s, and these so-called “talkies” changed everything. Not only did this advancement finally make dialogue audible, it opened up new possibilities for sound effects as well. For the next several decades, pioneers like Jack Foley developed techniques for creating sound effects in film that are still used today. In fact, we still refer to recording footsteps, character movements and props as “Foley.”

"In radio plays, sound effects artists performed live along with the cast, manipulating props to add a sense of immersion and make the story more believable. These techniques were so effective that during the 1938 broadcast of H. G. Wells’ The War of the Worlds, some listeners were convinced there was a real alien invasion happening."

 

An ARP 2600 analog synthesizer with grey patch cables plugged in.
Sound designer Ben Burtt combined his voice with an ARP 2600 synthesizer to create the sounds of R2-D2 for Star Wars. (Photo by Esa Kotilainen)

 

The Golden Age of Analog Sound Effects

As the art of filmmaking and the technology behind it matured, so did the art of sound. While the industry began to tinker with experimental widescreen formats, sound engineers developed stereo and multichannel audio formats such as Disney’s “Fantasound,” created for Fantasia in the late 1930s. Freed from the restrictions of mono soundtracks, sound artists took advantage of these formats to create more immersive sound effects.

In 1958, the British Broadcasting Corporation opened its cutting-edge Radiophonic Workshop, giving its audio engineers carte blanche to explore the possibilities of synthesizers, tape loops and other electronic trickery for sound design and music. The best-known product of the workshop was Delia Derbyshire’s electronic realization of the Doctor Who theme, one of the first television themes created entirely with oscillators and tape manipulation, full of strange new sounds that fit the science fiction program.

The ‘60s saw synthesizers gain acceptance in mainstream music, and sound effects creators soon followed suit. Sound designers started using modular synthesizers made by Moog, Buchla and others, as well as electronic instruments like the theremin, to conjure otherworldly sounds for space, fantasy, and horror films.

No discussion of sound effects history is complete without mentioning Star Wars, the science-fantasy epic that set a new standard for both visual and sound effects. The first film in the series introduced some of the most iconic sound effects in film history, including the synthesized robotic “voice” of R2-D2, the pew-pew of laser blasters, and the unforgettable hum and crackle of lightsabers.

 

 

The Digital Age

By the 1980s, digital audio technology had entered mainstream use, marking another paradigm shift in the world of sound effects. Although movies were still shot on film for the most part, digital recorders and processors had begun to show up at sound stages, sound design studios and dub stages. The first Digital Audio Workstations (DAWs) were dedicated hardware units, later exemplified by the Akai DD1500 and Fairlight MFX, which gave sound effects editors unprecedented control compared to splicing analog tape.

Digital audio also presented unique sound design opportunities. Digital samplers like the E-mu Emulator made it easy to record, edit, process, store and recall sound effects, freeing sound designers to let their creativity take over. Another sound design revolution came in the form of FM synthesis, made popular by the Yamaha DX7 keyboard, which builds complex, evolving timbres by using one oscillator to modulate the frequency of another. Digital reverbs such as the Lexicon 224 also entered use, making it possible to place sounds in realistic-sounding virtual spaces and create new types of echo effects.
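To make the idea concrete, here is a minimal two-operator FM sketch in Python (an illustrative example only, not a model of the DX7; the frequencies, modulation index and envelope are arbitrary values chosen for demonstration):

```python
# Minimal two-operator FM synthesis sketch (illustrative values, not a DX7 model).
# A modulator oscillator varies the phase of a carrier, producing sidebands that
# give FM its characteristic bell- and metal-like timbres.
import numpy as np
import wave

SAMPLE_RATE = 44100
DURATION = 2.0  # seconds
t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

carrier_freq = 220.0    # Hz
modulator_freq = 330.0  # a 3:2 ratio gives an inharmonic, bell-like spectrum
mod_index = 4.0         # modulation depth; higher values sound brighter

envelope = np.exp(-3.0 * t)  # exponential decay, like a struck bell

modulator = np.sin(2 * np.pi * modulator_freq * t)
signal = envelope * np.sin(2 * np.pi * carrier_freq * t + mod_index * envelope * modulator)

# Write a 16-bit mono WAV so the result can be auditioned
samples = (signal / np.max(np.abs(signal)) * 32767).astype(np.int16)
with wave.open("fm_bell.wav", "wb") as wav_file:
    wav_file.setnchannels(1)
    wav_file.setsampwidth(2)
    wav_file.setframerate(SAMPLE_RATE)
    wav_file.writeframes(samples.tobytes())
```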

However, digital audio wasn’t without its drawbacks. Sixteen-bit recording offered roughly 96 dB of dynamic range and no forgiveness above full scale, so engineers had to leave generous headroom or risk harsh digital clipping on loud sounds like explosions. Digital Audio Tape, introduced in 1987, was notably prone to dropouts and other reliability problems, and eventually became all but obsolete. In the world of surround sound, 1987 also marked the first use of the 5.1 format – which is still widely used today.
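To put that 16-bit limitation in perspective, each bit of resolution adds roughly 6 dB of theoretical dynamic range (real-world figures are lower once converter noise and dither are factored in). A quick back-of-the-envelope calculation:

```python
# Rough theoretical dynamic range per bit depth: about 6.02 dB per bit.
import math

for bits in (16, 24):
    dynamic_range_db = 20 * math.log10(2 ** bits)  # ratio of full scale to 1 LSB
    print(f"{bits}-bit PCM: ~{dynamic_range_db:.0f} dB theoretical dynamic range")

# Prints roughly 96 dB for 16-bit and 144 dB for 24-bit. Today's 32-bit float
# recorders go further still, because the floating-point exponent can represent
# levels far above and below nominal full scale without clipping.
```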

Toward the end of the 1980s, software-based DAWs entered the market. One of the first to gain industry support was Digidesign’s Sound Designer, which evolved into Pro Tools in the early 1990s, heralding a new era of sound effects creation. DAWs continued to mature throughout the 1990s, while hardware developed to support higher sample rates and 24-bit recording for greater detail and dynamic range. As a result, sound effects libraries went digital, making them easier than ever to access. In the late 1990s, hard drive-based portable recorders such as the Zaxcom Deva made it far more practical for field recordists to capture sound effects outside the studio.

There was also an explosion in plugin technology around this time, putting all manner of effects into the sound artist’s toolbox. From simple EQ and compressor plugins to reverbs, delays, doppler effects, and emulations of analog gear, plugins became indispensable to sound effects creators and editors. Steinberg’s Virtual Studio Technology (VST) format, introduced in 1996, is still the most popular format for plugins today.

"By the 1980s, digital audio technology had entered mainstream use, marking another paradigm shift in the world of sound effects."

 

A close-up of a digital mixing console or control surface with a screen displaying Dolby Atmos mixing software.
Dolby Atmos debuted in 2012, bringing more immersion to the art of sound effects. (Photo by Nichollas Harrison)

 

A New Era of Sound

The turn of the millennium brought even more DAWs to the market, with Steinberg releasing its post-production-oriented Nuendo software in 2000, around the same time the free and open-source DAW Audacity was introduced. In 2002, Apple acquired Emagic and went on to develop its Logic software into Logic Pro, followed in 2006 by Cockos’ REAPER, a feature-packed, affordable DAW that put advanced sound design tools into the hands of amateurs everywhere. Along with Pro Tools, all of these DAWs are still in widespread use today.

Throughout the late 2000s and early 2010s, Dolby was busy developing Dolby Atmos, an immersive audio format that is now supported by cinemas, streaming services, and home theater systems worldwide. The premiere of Brave in 2012 marked the first theatrical Dolby Atmos installation, giving audiences a taste of the exciting potential of immersive sound effects. Netflix threw its support behind Atmos in 2017, adding it to the official deliverables specifications for original content and encouraging more creators to make use of the technology.

Field recordists rejoiced in 2018 when Rode released the NT-SF1 SoundField Ambisonic microphone, a convenient way to capture a three-dimensional soundfield with full spatial information. The same year, Zoom released the H3-VR, a self-contained Ambisonic mic and recorder that made spatial audio more affordable than ever. Higher-order Ambisonic microphones entered the market soon after, increasing spatial resolution for even more immersive recordings.
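To give a sense of what “full spatial information” means in practice, here is a minimal sketch of generic first-order Ambisonic encoding in Python (the standard AmbiX/SN3D math for placing a mono signal at a given direction, not the proprietary processing inside either microphone):

```python
# First-order Ambisonic encoding sketch (AmbiX channel order W, Y, Z, X; SN3D).
# A mono source is spread across four spherical-harmonic channels that together
# describe its direction anywhere on the full sphere around the listener.
import numpy as np

def encode_first_order(mono: np.ndarray, azimuth_deg: float, elevation_deg: float) -> np.ndarray:
    """Return a (4, N) array of W, Y, Z, X channels for a mono source."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono                              # omnidirectional component
    y = mono * np.sin(az) * np.cos(el)    # left/right figure-eight
    z = mono * np.sin(el)                 # up/down figure-eight
    x = mono * np.cos(az) * np.cos(el)    # front/back figure-eight
    return np.stack([w, y, z, x])

# Example: a one-second 440 Hz tone placed 45 degrees to the left, slightly elevated
sample_rate = 48000
t = np.arange(sample_rate) / sample_rate
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
b_format = encode_first_order(tone, azimuth_deg=45.0, elevation_deg=10.0)
print(b_format.shape)  # (4, 48000) -- one channel per spherical-harmonic component
```

A decoder can later render those four channels to stereo, binaural, 5.1, Atmos beds or virtually any other speaker layout, which is what makes Ambisonic recordings so flexible for sound effects work.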

 

Here and Now

Today, we find ourselves in a golden age of sound effects, with an ever-growing library of sounds at our fingertips and rapidly advancing audio technologies that allow us to use them more creatively than ever. DAWs have evolved through dozens of iterations, advanced sound design tools are opening up new possibilities, 32-bit floating-point recording has given us nearly unlimited dynamic range, and spatial audio technologies like Dolby Atmos and Ambisonic microphones are now in widespread use. With all of the sound libraries, techniques, and technologies developed over the last hundred years, the only limit to the art of sound effects is our collective imagination.

To learn more about the cutting edge of sound effects, check out our blog to see how sound effects are made and how to use them, watch tutorials on sci-fi and fantasy sound design, and pick up techniques from experienced sound artists.

At Pro Sound Effects, we partner with award-winning sound artists like Mark Mangini (Dune) and Richard King (Inception) as well as top library recordists and sound designers to make the highest quality sounds available to all. Creators everywhere now have access to everything from historic archive recordings to the freshest sonic ingredients captured by celebrated masters of the craft. Explore our library and bring your ideas to life.


Dante Fumo is a Midwest-based sound designer, editor, and mixer via Super Natural Sound. When he’s not doing that or writing about sound, Dante composes instrumental and electronic music using spatial audio.

 
