Kendra Coburn is a mathematics junior and Mustang News columnist. The views expressed in this column do not reflect the viewpoints and editorial coverage of Mustang News.
Computer-generated imagery, or CGI, first appeared on-screen in the 1976 film “Futureworld,” where brief shots of a robotic hand and face were replaced with computer-rendered images. Since then, the technology has proliferated in Hollywood, sneaking into almost every major studio release of the past few years. As CGI usurps practical effects in most filmmakers’ production process, the question becomes increasingly relevant: are digital effects actually better than their practical counterparts?
Intuition might answer this question in favor of CGI. After all, the sheer ubiquity of computer-generated images in films has made most modern movie-goers accustomed to their place on the silver screen. To be sure, CGI has vastly expanded the scope of possibilities for filmmakers. Hugely successful movies like “Avatar” (2009) and “Doctor Strange” (2016) simply wouldn’t have been possible without the aid of computer imagery.
However, I am here to argue in favor of traditional, practical effects. Although both CGI and practical effects have their place in modern filmmaking, I believe that films which favor a practical approach over a digital one are better made and more memorable than their digitally driven equivalents, for two major reasons.
The first reason is cost versus quality. CGI often eats up a significant portion of a film’s budget. The fourteen highest-budget films of 2014 were all special effects-driven spectacles, a trend typified by the CGI-heavy “The Hobbit: The Desolation of Smaug” (2013) and its roughly $250 million budget. Because these effects are so expensive, other aspects of a film are shortchanged, such as the script, the cinematography and even the acting. By simple economics, investing more money in one part of a film means investing less in the others. I believe the cost-efficiency of practical effects allows filmmakers to make holistically better films. Some of the more dated films of the 20th century still hold up by virtue of their superior scripts and cinematography.
The second reason is known as the Weta Effect, the paradoxical phenomenon of a computer-rendered image being too realistic. CGI images often have a “shiny,” unnaturally smooth quality to them. A good example is the eponymous character in “Avengers: Age of Ultron” (2015). Despite Ultron’s hyper-realistic appearance on-screen, our brains detect minuscule details about the character that are “off,” such as shadows falling in the wrong places or reflective surfaces not bouncing light back the way they should. These may seem like small details, but they trigger an instinctual response in our subconscious, leaving us with the common feeling that something isn’t quite right about a CGI character. Practical effects do not have this problem: because they are built with real, physical materials, they obey the laws of physics by default.
Perhaps the argument I defend here makes me sound like a movie purist who sits in a basement watching old films and dismissing new movies before even seeing them. I assure you, this is not the case. The 21st century has seen some remarkable cinematic achievements. However, I am concerned by the rate at which filmmakers and movie-goers alike are hopping on the CGI bandwagon. By taking a step back and analyzing how the movies we choose to see are made, we can all enrich our movie-going experience.