All you need to know about digital special effects
The term special effects, or VFX (for Visual Effects), covers all the animation techniques used to create an illusion within a cinematographic work (feature film, 3D film, video game, etc.). Their purpose is to give movement to action and to simulate objects, characters, or even phenomena that do not exist in reality or cannot be filmed during the shoot. The term is also used for the first experiments carried out by Georges Méliès.
VFX draws on various techniques applied across the three stages of filmmaking: pre-production, shooting, and post-production. Among the trades involved, we can notably mention:
- Makeup,
- Costumes,
- Models,
- 3D imagery,
- Rotoscoping,
- Sound effects,
- Compositing,
- etc.
You should also know that special effects appear across a multitude of media:
- Feature film,
- Cinema advertising,
- 3D animated film,
- Video game,
- TV show,
- Television advertising,
- Website spot,
- etc.
Three methods to make special effects
If you want to integrate a 3D image into a video clip, we speak of 3D keying. It consists of combining, within the same image, several elements filmed separately, or 3D objects. Keying is most often carried out in post-production, during editing, to integrate sets or obtain specific visual effects, but it can also be done live, during a broadcast, for example.
We also speak of rotoscoping, which consists of tracing over a filmed character, frame by frame, so that their shapes and actions carry over into a 3D animated film. This lends a certain realism to the dynamics of the characters' movements. Among the cult works that made use of rotoscoping are Snow White and the Seven Dwarfs, Alien 3, Prince of Persia, etc.
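As a rough illustration of the frame-by-frame idea behind rotoscoping (which, in practice, is done by hand), the sketch below extracts a per-frame silhouette mask by comparing each frame against a clean background plate. The function name and threshold are illustrative assumptions, not part of any production pipeline.

```python
import numpy as np

def roto_masks(frames, background, threshold=30):
    """Crude per-frame silhouette extraction: an automated stand-in
    for the manual tracing done in rotoscoping. Pixels that differ
    enough from a clean background plate are marked as the subject."""
    masks = []
    for frame in frames:
        # Sum the per-channel absolute difference against the plate.
        diff = np.abs(frame.astype(int) - background.astype(int)).sum(axis=-1)
        masks.append(diff > threshold)
    return masks

# Tiny synthetic example: a bright 3x3 "character" moves one pixel per frame.
bg = np.zeros((8, 8, 3), dtype=np.uint8)
frames = []
for x in range(3):
    f = bg.copy()
    f[2:5, x:x + 3] = 255          # the moving subject
    frames.append(f)

masks = roto_masks(frames, bg)     # one boolean mask per frame
```

Each mask isolates the character's shape in that frame, which is exactly the information a rotoscope artist traces by hand.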
The essential tool in the world of VFX is the green (or blue) screen. A green screen makes it possible to digitally integrate the background chosen by the director during the keying phase. This gives free rein to the writers' imagination, since the sets can be synthesized and added during post-production. This type of background was used, for example, during the filming of The Lord of the Rings, to bring to life certain cataclysmic environments of the trilogy.
Digital special effects: understanding the movie theater
Many have wondered how George Lucas created an entire galaxy for his legendary Star Wars. One thing is certain: the special effects, pioneering for their time, contributed largely to the trilogy's success!
Contrary to what we might think, special effects are present in every movie, from fantasy films to romantic comedies. Most of the time, however, we simply don't notice them. How are they made? How can we identify them? Here is a look at the different techniques used to push the limits of what is possible.
What are special effects? A look at the definitions.
First of all, let's start with some definitions. The term "special effect" groups together all the techniques used in the audiovisual industry to create the illusion of action and to simulate objects, characters, environments, or phenomena (meteorological, sound, etc.) that do not exist in reality or cannot be filmed during the shoot. Because their goal is to appear as real as possible, they blend into reality and the viewer does not notice them.
We can nevertheless distinguish two types of special effects:
- Mechanical special effects (or SFX, for Special Effects) are tangible effects created manually for the film and used directly on set, such as makeup (yes, Harry Potter's scar is a special effect), scale models, artificial rain, etc.
Mechanical effects: makeup in Planet of the Apes (1968)
- Visual or digital effects (or VFX, for Visual Effects) are techniques by which the image is digitally created or altered because doing so mechanically would be too complicated or impossible (see Voldemort's reptilian nose).
Digital Effects: Planet of the Apes (2011)
But concretely, how is it done? A look at the techniques.
As we have seen, not all special effects are digital. There was even a time when everything was done mechanically or with camera effects. 2001: A Space Odyssey, Kubrick's famous film, is a very good example of a science-fiction film shot without recourse to digital technology.
Nevertheless, it is undeniable that the development of information technology has allowed the emergence of new techniques, revolutionary for the film industry. So let's take a closer look at creating digital effects.
Motion capture: the art of creating living beings
Motion capture makes it possible to create living beings by recording the movements of a subject (a person most of the time, but sometimes an animal or an object) and reproducing them in a virtual environment. This is how the famous Lord of the Rings creature, Gollum, came to life on screen. The data source in this case was actor Andy Serkis: numerous sensors placed on his body made it possible to capture his movements and facial expressions, then to transcribe them virtually. Thus, Gollum's movements appear real.
There are different types of motion capture. The most used in cinema is optical capture. Reflective markers placed on the actor's body reflect infrared rays emitted by special motion-capture cameras, and these reflections make it possible to record the movement. Once the movement is recorded, it is digitally processed to map the creature's physique onto the actor's performance.
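The per-camera detection step of optical capture can be sketched in a few lines: threshold the bright infrared reflection of a marker and take its centroid. This is a toy, single-marker version under assumed names and values; real systems detect dozens of markers and triangulate them across several cameras.

```python
import numpy as np

def marker_position(ir_frame, threshold=200):
    """Locate one reflective marker in a 2D infrared frame by
    thresholding the bright reflection and taking its centroid.
    Only the per-camera detection step is shown here."""
    ys, xs = np.nonzero(ir_frame > threshold)
    if len(xs) == 0:
        return None                 # no marker visible in this frame
    return (xs.mean(), ys.mean())   # sub-pixel (x, y) centroid

# Synthetic infrared frame: the marker reflects as a 2x2 bright patch.
frame = np.zeros((10, 10), dtype=np.uint8)
frame[4:6, 7:9] = 255
pos = marker_position(frame)        # → (7.5, 4.5)
```

Repeating this for every marker, every camera, and every frame yields the raw data that is then triangulated into 3D trajectories and retargeted onto the creature.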
Matte painting: the creation of sets
Matte painting is a technique that allows you to create a setting or modify a landscape according to the needs of the film, then integrate it into a filmed scene. This technique has not always been digital: originally, it consisted of painting scenery on large glass plates in order to integrate them into the shot.
Today, all genres of film use matte painting. Thanks to the advent of digital technology, the sets created look more and more real, and it is increasingly difficult for the viewer to spot the special effect. Matte painting allows directors to create imaginary universes or to reconstruct an environment: starting from a shoot against a green or blue screen, the painting is digitally incorporated into the scene during post-production. This is how 1960s Dallas could be reconstructed to reproduce JFK's assassination scene in the film Jackie.
Keying: all you need to know about the famous green screen
Keying allows an actor filmed against a green or blue screen to be embedded in a setting (often created using the matte painting technique). Using software, keying makes it possible to combine objects, actors, and sets filmed separately, as well as digitally created 3D objects, into a single image.
Why green, why blue? These two colors are the furthest from human skin tones, which helps create a clean cut. In video, and particularly in cinema, directors tend to use green: video camera sensors are more sensitive to green, so the cutout is all the more precise. Blue, on the other hand, is more difficult to use since it requires more light. The choice of background depends on these factors, as well as on the colors present on set. For example, if an actor filmed against a blue screen wears blue pants, he may become invisible!
All the green or blue parts of the image are used to create a keying signal: pixels that are green or blue take the value 1, the rest the value 0. This produces a black-and-white matte, black for the filmed elements and white for the green or blue areas. New elements can then be integrated wherever the matte is white.
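The matte described above translates almost directly into code. Below is a minimal NumPy sketch with an arbitrary rule for deciding which pixels count as "green"; production keyers use far more sophisticated soft mattes with partial transparency.

```python
import numpy as np

def green_matte(frame):
    """Binary matte as described above: green pixels -> 1, rest -> 0."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    # Illustrative rule: bright green that clearly dominates red and blue.
    return ((g > 150) & (g > r + 50) & (g > b + 50)).astype(np.uint8)

def composite(frame, background):
    """Keep the filmed elements (matte == 0), replace the green
    areas (matte == 1) with the new background."""
    m = green_matte(frame)[..., None]   # add channel axis to broadcast over RGB
    return frame * (1 - m) + background * m

# 2x2 example: left column is green screen, right column is the "actor" (grey).
fg = np.array([[[0, 255, 0], [128, 128, 128]],
               [[0, 255, 0], [128, 128, 128]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 10, dtype=np.uint8)
out = composite(fg, bg)   # green pixels become background, grey pixels stay
```

The left column ends up showing the new background while the actor pixels pass through untouched, which is exactly the 1/0 keying signal at work.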
Thus, synthetic images can replace certain stand-ins used during filming. Take the example of Game of Thrones: when Daenerys interacts with her dragon, a green object represents the creature on set, and it is only in post-production that the dragon comes to life. Having a reference object, and sometimes even a reference actor (who will be replaced in post-production), helps the actors build their performance and the director execute the right camera movements.
The digital double: recreating an actor
The digital double is a visual effect that recreates an actor's physique, movements, and expressions. It makes it possible to obtain a shot that an actor cannot or will not perform (a dangerous stunt, for example).
Several steps are necessary to create a digital double. First, the actor's facial expressions are captured using the technique of performance capture. This technique is much the same as motion capture, but since it is restricted to the face, the movements captured are more precise. Next, motion capture records the actor's body attitudes. The actor then passes through a sphere formed by thousands of LEDs so that his physique can be captured precisely and reproduced identically in digital form. Once these steps are complete, the actor's lines are recorded. Assembling these different stages makes it possible to animate the actor digitally.
This technique also makes it possible to resurrect actors. It was used in particular to finish the filming of Furious 7 following the death of Paul Walker. Using images from the previous films, post-production studio Weta Digital created a digital double of the actor. Walker's two brothers served as stand-ins during filming: having the same build, they acted with sensors on their faces, which allowed Paul's digital face to be transposed onto theirs.
So, with a computer, can anyone create a visual effect? A look at the work of the computer graphics artist.
Obviously, the special effects do not fall from the sky. They also do not appear directly in the cameraman's lens during filming. When are they created and added to the movie and above all, by whom?
The post-production phase
A film is made in three phases: pre-production, shooting, and post-production. The pre-production phase helps identify the special effects needed for the film and design them according to the director's vision. However, it is only in post-production (after editing) that they are incorporated into the film. This is the work of the computer graphics artist.
Once the film has been edited, the computer graphics artist is responsible for adding the special effects. He incorporates matte paintings into the scenes, adds digital doubles and other virtual characters, amplifies mechanical special effects (explosions, for example), etc. All this assembly work relies on a technique called compositing: superimposing all the film's special effects to produce a single shot. Once this shot has been obtained, the artist applies the cleaning technique, which erases the inconsistencies present in the scene (for example, the reflection of the camera in a mirror). The film's final touch is then color grading, that is, the treatment of the image's colors and brightness. This allows the different effects to blend into the same shot coherently, so that it feels as real as possible.
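The layer-stacking and grading steps can be sketched with the standard "over" operation and a simple gain adjustment. The function names and values here are illustrative assumptions, with colors normalized to the 0-1 range; real compositing packages offer far richer per-node controls.

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Standard 'over' operation: lay one layer on top of another,
    weighted by the foreground's alpha (transparency) channel."""
    a = fg_alpha[..., None]          # add channel axis to broadcast over RGB
    return fg_rgb * a + bg_rgb * (1 - a)

def grade(rgb, gain=1.0, lift=0.0):
    """Very simple color grade: scale brightness, then lift shadows."""
    return np.clip(rgb * gain + lift, 0.0, 1.0)

# Stack layers bottom-to-top (filmed plate, then an effect layer), then grade
# the assembled shot so the effect and the plate share the same look.
plate = np.full((2, 2, 3), 0.2)           # dim filmed background
explosion = np.full((2, 2, 3), 1.0)       # bright effect element
alpha = np.zeros((2, 2))
alpha[0, 0] = 1.0                         # effect covers one pixel only
shot = over(explosion, alpha, plate)
final = grade(shot, gain=0.5)             # darken everything uniformly
```

Because the grade is applied after the layers are merged, the effect and the plate darken together, which is what makes the composite read as a single coherent shot.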
How compositing works: a series of layers
The question remains: can anyone create their own special effects? The answer is "yes, but..."
Visual effects studios rely on different software packages to create their special effects. The best known (and most used) are Nuke, After Effects, Photoshop, and Blackmagic Design's Fusion. Each of these programs is suited to certain types of special effects: Photoshop, for example, is mainly used for creating matte paintings, while Nuke is used more for compositing. Thus, anyone who owns one of these programs can venture into the field of special effects.
However, let's not kid ourselves: the visual effects of an amateur film will not be as impressive as those of large productions. Logical, you will tell me. Within a studio, each graphics artist has a specialty; whoever creates a matte painting will not handle the compositing, and so on. The creation of large-scale special effects requires training and real expertise.
In addition, to speed up their work, visual effects studios rely on render farms. These are clusters of computers in which a main computer (the "server") oversees the distribution of tasks to the other computers (the "compute nodes"), which calculate the rendering of the synthetic images. The more computing time available, the more precise the final rendering of the scene. Here again, anyone can buy, or even build, a render farm. However, the most powerful ones (those used by professional studios) are extremely expensive, hence their better final renders.
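The server/compute-node division of labor can be mimicked in miniature: a coordinator fans frame numbers out to workers and collects the rendered results. Here, threads stand in for the farm's machines, and `render_frame` is a placeholder for an expensive ray-traced render; the names are illustrative, not any studio's actual tooling.

```python
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame_number):
    """Stand-in for an expensive render of a single frame; a real
    node would write an image file and return its path."""
    return f"frame_{frame_number:04d}.exr"

def render_farm(frame_numbers, workers=4):
    """Toy version of a farm's server: distribute frames to worker
    'nodes' (threads here; real farms use separate machines) and
    collect the finished frames back in order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_frame, frame_numbers))

shots = render_farm(range(3))
# → ["frame_0000.exr", "frame_0001.exr", "frame_0002.exr"]
```

Because frames are independent of one another, rendering parallelizes almost perfectly, which is exactly why studios scale farms to hundreds of nodes.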
Conclusion
All these examples show that digital technology has greatly changed the world of cinema. However, mechanical special effects are still present: makeup and models remain essential, and some actors, such as Tom Cruise, refuse to use a digital double and continue to perform their own stunts. "All digital" still has its limits: the only films made entirely digitally today are animated films. Despite this, digital effects continue to improve, and directors, from Spielberg to amateur short filmmakers, can now bring their wildest dreams to life. Indeed, the spread of digital technology means simplified access to editing software, video-processing applications, and tutorials.
Author: Vicki Lezama