CGI, or ‘Computer-Generated Imagery’, is a technique used by filmmakers to digitally add elements to films that would not be possible in real life or would be too difficult to shoot. While on the surface CGI seems like a great leap forward for film and a truly positive example of the move into a new technological landscape for filmmakers, it is not always well received. CGI can come across as tacky, and it is often overused to the point that a film is no longer immersive, defeating the technique’s purpose.
Use of ‘bad’ CGI can be found at both ends of the filmmaking spectrum. Blockbusters like “Planet of the Apes” treat CGI as an essential part of their process and rely heavily on it. This can pull audiences out of a film: blending CGI with live action is difficult to get perfect, and small mistakes can break the illusion, damaging the enjoyment of the film as a whole.
On the other hand, using CGI can be a bad decision for directors of low-budget films, as professionally executed CGI is expensive and films with smaller budgets simply can’t afford it. Unlike the faults of CGI in higher-budget films, the failure of CGI in indie films is far less subtle and has a much more devastating effect. Poorly done CGI costs a film all of its credibility and immersion, so filmmakers working with smaller budgets should think very carefully about whether CGI is necessary or beneficial to their story.
Despite this, CGI is a useful tool that filmmakers would not previously have had such easy access to, and they should make the most of it. Before doing so, however, they must ask themselves whether it is going to achieve its desired effect or merely ruin the immersion of the film.