Cinema has been refining the art of visual storytelling since the art form’s infancy, and the movie-making industry has always been at the cutting edge of new technology. Movies have evolved to encompass groundbreaking techniques that produce mind-blowing effects, and cinematic technique now advances so quickly that movie fans are shocked and amazed anew on a yearly basis.
These days the movie industry has taken to remaking old classics with contemporary twists, utilizing the most advanced visual tricks and technological wizardry to update well-loved stories: from “Alice in Wonderland” to “The Wizard of Oz,” and the upcoming Disney movie starring Angelina Jolie, “Maleficent.” In movies like these, we’re seeing the product of hard work from artists, directors, writers, editors, sound techs and special effects teams, which renders modern movies almost unrecognizable from the products of earlier, less technologically advanced times. What was once a gradual, subtle advancement over decades has become a sprint – and cinematic evolution looks set to continue accelerating. From “The Matrix,” “The Lord of the Rings,” and “Toy Story,” to “Monsters, Inc.” and “Avatar,” our generation has been witnessing the evolution of technology on-screen.
How did we get from silent black and white movies, to talkies, to IMAX 3D showstoppers? The following list outlines ten of the most important changes over the timeline of cinema, which have altered audiences’ expectations and set a new standard of excellence in the film industry.
10. Rotoscoping – 1915
Rotoscoping has been around since the dawn of moviemaking, used for animation and eventually seeping into music videos as well. It was the invention of pioneer Max Fleischer, who patented his technique in 1915. Known as one of the forefathers of animation, along with Walt Disney, Fleischer put his rotoscoping technique to best use in his Betty Boop cartoons, as well as his screen adaptation of Popeye and the original Superman cartoons of the 1940s.
Rotoscoping means tracing over live-action frames to simplify the animation process, and its evolution has made a lasting and profound impact on the medium. It made possible “Snow White and the Seven Dwarfs” in 1937, The Beatles’ “Yellow Submarine,” the animated “Lord of the Rings” of 1978, and the lightsaber effects in the original Star Wars films. It was from “Snow White” onwards that the rotoscope was used as a way of studying human and animal motion, paving the way for motion capture.
9. Vitaphone – 1920s
We all know that before surround sound and today’s sophisticated cinema audio, movies started out silent. It wasn’t until the mid-20s that “talkies” were introduced to the viewing public in movie houses, for a mere 25 cents a ticket. In fact, to call those movies silent is a bit of a misnomer – live music, from full orchestras to organs or just a solitary piano, accompanied the film from a pit in front of the screen.
Talkies evolved in response, in part, to the changing tastes of the audience, who were blown away by the first use of the Vitaphone, championed by Warner Bros. The Vitaphone was used to make the first part-talking musical, “The Jazz Singer,” in 1927. Viewers called it “miraculous” and “more real,” and so the Vitaphone technology – which recorded sound on a wax disc that projectionists used to synchronize with the film – became a fixture of the movie industry at the time.
8. Technicolor – 1930s
We all know that movies originally began in black and white, an aesthetic still employed by filmmakers today for an old-timey, more artistic feel. Even in the black-and-white days it was possible to add color, but it took a while for the practice to become refined enough to be standard. Initially there were a variety of ways to do so: hand-coloring each frame of film, stenciling, tinting and toning. By the end of the 20s, almost two dozen companies held color patents for film, but most of the methods were tedious and too expensive to employ over the length of a feature, and the results were not particularly natural.
Technicolor, all the while, had been refined – since its birth in 1916 – by an engineering firm with no prior ties to the film industry. The Technicolor Corporation’s original approach, a two-strip additive process that mixed two colors on the screen to approximate the spectrum, was tweaked to success in the 30s. It became a three-strip subtractive process that required new technology, including a special camera through which three strips of film ran, each recording a different part of the color spectrum. It was a genuine breakthrough for the film industry. Technicolor was generously employed in the 1939 original “The Wizard of Oz,” one of the most universally appreciated and enduring movies of all time, which set the tone for color films to come.
7. VistaVision – 1954
VistaVision, developed by Paramount, offered audiences a larger picture by providing an alternative to the anamorphic process used by CinemaScope. VistaVision refined the quality of the flat widescreen image by running the 35mm negative horizontally through the camera gate, exposing a larger area of film to produce a finer-grained projection. Although CinemaScope could do much of what VistaVision did (and the latter became obsolete as finer-grained film stocks eventually reached the market), VistaVision found a second life in special effects work and was a major influence on formats to come: IMAX and OMNIMAX, taking a note from their predecessor, also run film sideways.
In its heyday, many films were shot in VistaVision, though none perhaps as famous as Alfred Hitchcock’s “Vertigo”; Hitchcock shot many of his movies in the format. VistaVision was later most notably employed for high-resolution special effects shots in the original Star Wars films, where Lucas’ team – known now as Industrial Light & Magic – used VistaVision cameras to help create some of their complex, cutting-edge effects.
6. Blue screen – 1959
A blue screen is used to film an actor in a setting that doesn’t actually exist: the screen makes it possible for filmmakers to add a separately filmed background or special effects in post-production. The original blue screen technique was crude, with some objects appearing to glow, making the final product look anything but realistic. Eventually, though, the technique was refined by Petro Vlahos, whose “composite color photography” patent was submitted in 1959 and first employed on the big screen in that year’s classic, “Ben-Hur.”
The influence of the blue screen in cinema is everywhere, perhaps most ambitiously in the grand-scale experimental film “Sky Captain and the World of Tomorrow,” which wasn’t a box office hit but was the first movie to be wholly shot on blue screen. Director Kerry Conran and a team of over 100 filmmakers created multi-layered backgrounds to which the actors were added in post-production.
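The core idea behind blue-screen work survives today as digital chroma keying. As a minimal illustrative sketch – not any studio’s actual compositing pipeline – the following Python/NumPy snippet replaces strongly blue foreground pixels with pixels from a separately supplied background; the blue-dominance threshold is an arbitrary value chosen for the example.

```python
import numpy as np

def chroma_key(foreground, background, threshold=1.3):
    """Composite a foreground over a background by masking out
    strongly blue pixels, in the spirit of blue-screen matting.

    foreground, background: HxWx3 uint8 RGB arrays, same shape.
    A pixel counts as 'screen' when its blue channel exceeds both
    the red and green channels by the given threshold factor.
    """
    fg = foreground.astype(np.float32)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Screen mask: blue clearly dominates red and green.
    mask = (b > threshold * r) & (b > threshold * g)
    out = foreground.copy()
    out[mask] = background[mask]   # swap in the new backdrop
    return out
```

Real keyers also soften the matte edge and suppress blue spill on the actor, which is exactly the “glow” problem the early crude technique suffered from.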
5. CGI & Photorealistic CGI – 1973
Computer-generated imagery, or CGI, made a massive impact on movies and moviegoing audiences in 1982 with the special effects in the original “Tron,” the first time the technique had been used on such a large-scale project. But CGI had been around before that, first appearing on screen in the 1973 western sci-fi thriller “Westworld,” starring Yul Brynner: the otherworldly cowboy’s robo-vision – Brynner’s pixelated POV – was created through digital image processing by a computer graphics duo behind the scenes. On a side note, more transformational work appeared in the movie’s sequel, “Futureworld,” where the first 3D computer imagery made a brief appearance with Peter Fonda’s hand and face.
CGI eventually evolved into “photorealistic CGI,” first used in 1985 for Lucasfilm’s stained-glass knight in “Young Sherlock Holmes.” As computers progressed, so did CGI; the technique is heavily present in 1991’s “Terminator 2: Judgment Day,” whose 3D CGI character, the T-1000 (played by Robert Patrick), could pass through metal bars and rebuild himself from a pool of liquid metal. Then in 1993, a new bar for CGI was set by Steven Spielberg’s “Jurassic Park”: where costly and labour-intensive animatronics had been used before, CGI stepped in to create other-worldly creatures on-screen.
4. Bullet time (or time slice) – 1990s
Like many of the influential techniques employed in the film industry, this one had been around for years before being retooled to make an unforgettable imprint on viewers’ imaginations. Bullet time – trademarked by Warner Bros. and popularized by the Matrix series of 1999–2003 – is also referred to as “time slice,” and its roots reach all the way back to the 19th century, when Eadweard Muybridge analyzed the movement of horses by capturing a galloping horse one frame at a time as it passed his cameras. This “freeze motion” effect was later used in music videos, video games and TV commercials, while slow-motion fight scenes and CGI reached the big screen in movies like 1999’s “Blade.” But the Matrix movies combined all three – CGI, freeze motion, and digitally tweaked slow motion – to develop amazing bullet-dodging effects. The influence has been huge, especially in the video game industry, and can be seen in other movies including “Charlie’s Angels,” “Spider-Man,” “The One,” and “The Green Hornet.”
3. Motion capture – 2000s
Before Andy Serkis’ unforgettable turn as Gollum in Peter Jackson’s “Lord of the Rings” trilogy, the idea of copying human motion for animated characters was, as noted above, originally pursued through rotoscoping. But Jackson charged through the existing limitations. Before Gollum, motion capture had mostly been employed in the medical industry and in video games; Jackson developed the technique for cinema, recording Serkis’ movement and translating that information into an unforgettable digital 3D computer-animated character. In 2005’s “King Kong,” the technology was pushed further to marry live action and facial capture for even more stunning realism.
2. Pixar’s PhotoRealistic RenderMan – 1995
Pixar, the world-renowned CGI production company based in California, has earned no fewer than 26 Oscars for its work in CGI animation. Pixar’s CGI techniques are built on its PhotoRealistic RenderMan technology, which has revolutionised animation to an unprecedented degree. Pixar’s first full-length computer-animated movie, “Toy Story,” came out in 1995 and swept the world into its adventurous, buddy-comedy embrace. The same technology was used in other blockbusters, from “Titanic” to the Star Wars prequels, and in 2001 Pixar’s RenderMan software developers received an Academy Award of Merit for their advancement of motion picture rendering.
1. 3D Conversion – 2010s
It’s unsurprising that James Cameron would be the one to take 3D to the next level in 2009’s “Avatar,” given a history with the technology dating back to 1989’s “The Abyss,” which featured the first computer-generated 3D character. The technique had been employed long before the 80s, though – old-fashioned 3D cinematography used red- and green-coloured lenses to help fool the eyes, and thrilled audiences in 1953’s “The Charge at Feather River.” One of the main differences between the 3D of yesteryear and the 3D of today is simple: digital projection. But Cameron also employed a number of innovative techniques in “Avatar” to create stunning levels of layered imagery. While he relied on CGI, as 3D movies had in the past, he also utilized motion-capture suits – onesies covered in sensors that track body movement – with actors playing out scenes on a performance capture stage several times bigger than any used in Hollywood to that point.
Cameron also improved on the suits themselves, adding skull caps fitted with cameras to monitor facial expressions more precisely. The director also developed a virtual monitor that allowed him to watch the motion capture results as they were being filmed, rather than after the computer had finished rendering the images. And he pushed the envelope further with a filming rig more advanced than any used before, allowing the cinematographer to capture two images simultaneously and provide the illusion of depth. It’s safe to say Cameron’s advances spawned the rebirth of the format we see today, with most big movies now offered both in regular view and in 3D. Cameron continues to be a leader in the field, promoting techniques such as shooting and projecting movies at faster frame rates. He has been quoted as saying, “If watching a 3D movie is like looking through a window, then [with this] we’ve taken the glass out of the window and we’re staring at reality.”
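The old coloured-lens trick can still be demonstrated digitally. As a toy sketch – assuming nothing about Cameron’s actual pipeline – this Python/NumPy function fuses a stereo pair into a red/cyan anaglyph: the left-eye view goes into the red channel and the right-eye view into green and blue, so tinted lenses deliver a different image to each eye and the brain reads the disparity between the two as depth.

```python
import numpy as np

def anaglyph(left, right):
    """Fuse a stereo pair (HxWx3 uint8 RGB, same shape) into a
    red/cyan anaglyph frame: red comes from the left-eye view,
    green and blue from the right-eye view."""
    out = right.copy()           # keep right-eye green and blue
    out[..., 0] = left[..., 0]   # overwrite red with the left eye
    return out
```

Modern digital 3D replaces this colour trick with polarized or shuttered glasses, which is why today’s projection keeps full colour for both eyes.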