Whether you want to make it in the larger professional world, get started at home, or just need a refresher, these are the terms to know. While pretty much all of the definitions come straight from textbooks, I'll be providing more insight on the terms and how they apply.
A cut that uses the motion of something on screen as a distraction to hide the cut and create a smooth transition. One example of this is in Jaws, where Spielberg used the motion of people crossing the camera to change the shot.
Really this is something that happens a lot and can be used in a variety of ways. Basically, anything that passes in front of the lens and blocks off the background can be used to hide the cut. This is used A LOT in sci-fi movies, with spaceships zooming past, etc.
Action safe area
A region of the screen where elements are guaranteed to be visible. The action safe area is bigger than the title safe area because it is less important for moving elements to be visible at all times. This is for compatibility with older CRT TVs that did not display the full area of the image.
With the digital age upon us, too many video editors have the attitude that the ‘Safe Areas’ are no longer necessary. Not true. Even for things you’re putting online you need to be aware of the aspect ratio your final product will be viewed at. Then you can adjust your own ‘Safe Areas’ to accommodate your viewing format. This ensures that nothing of your video gets cut off unintentionally.
ADR (Automated Dialogue Replacement)
The process of re-recording actors' dialogue in a studio and syncing it to their moving lips on screen as if it were recorded on set. This is usually performed when dialogue was recorded poorly or to change certain lines.
In fact, this is becoming far more common than the definition suggests. There are plenty of times when they love the actor…but hate his voice, and use ADR to replace it altogether. It requires a lot of patience and a strong ear for this to work successfully.
Alpha channel
The opacity channel in an image. This controls the opacity level of a given pixel, with 0% being fully transparent and 100% being fully opaque.
If you’re doing manual fades, or need one image to hover over the other (for keying purposes) adjusting the Alpha Channel is the way to do it.
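Under the hood this is simple linear math. Here's a minimal sketch in Python (the function name and the 0.0–1.0 alpha scale are my own illustration, not any particular editing package's API):

```python
def blend_over(fg, bg, alpha):
    """Composite a foreground sample over a background sample.

    alpha runs from 0.0 (fully transparent) to 1.0 (fully opaque),
    matching the 0%-100% range in the definition above.
    """
    return fg * alpha + bg * (1.0 - alpha)

# A manual fade is just this blend with alpha ramped over time:
fade = [blend_over(255, 0, a) for a in (1.0, 0.5, 0.0)]  # 255.0, 127.5, 0.0
```

Every fade, dissolve and keying operation your software performs ultimately reduces to per-pixel blends like this one.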
Assistant editor
An editor that handles the logistical aspects of editing such as synchronizing sound to picture, cutting in temporary (“temp”) sound effects and music, and overseeing the creation of optical effects (“opticals”) such as titles and fades.
Unfortunately for most of us freelancers, this is a luxury. We're stuck doing all of this work on top of regular editing. However, if you find yourself specializing in this area, any editor would gladly take you on to help out. It's a great way to fill up your resume.
Black and code
Tapes that have been pre-recorded with blank data (a black screen and timecode) before they are used. Another word for striped stock.
All right, while this is done well before Post, it is vitally important. If you don't black out a tape (if you're still using them), then any time you turn off the camera you'll create a timecode break, meaning a new and separate timecode will start. So when you're editing you may run into two different clips that share the same timecode. This can cause all manner of problems when filling in the gaps and retaining your footage.
Burnt-in Timecode (BITC)
Footage with timecode permanently displayed on the image (“burnt in”). This is normally used during offline editing to correctly match shots for the online edit.
CGI (Computer-Generated Imagery)
The process of generating and animating elements in a computer to be composited into a scene as if the elements were present in the scene as it was shot.
This is such a common practice now in films, that there’s hardly anything out that doesn’t feature CG in some form or fashion. Getting it to work on a small budget is much harder, but with LOTS of patience and work, anyone can get it to work in their projects. Just be prepared to invest a lot of time into it.
Channel
One of several components used to make up an image. RGB images are made up of red, green and blue channels, with an optional alpha channel for transparency.
Understanding channels becomes very important when it comes to the color correction of scenes. By manipulating the individual channels you can create dynamic color schemes and be able to quickly fix many problems.
Chroma key filter
A filter that allows a selected color in an image to be made transparent. Commonly used for green screen photography.
And that's really the best/only use for it. Chroma Key will remove the color that you select…but it's not perfect. You'll still need to fine-tune the edges of the colors you've removed. Otherwise your green/blue screen project is going to look amateurish and awful.
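To make the idea concrete, here's a deliberately crude single-pixel keyer sketch (the function name and tolerance value are my own illustration; real keyers use softer, edge-aware falloff, which is exactly why the manual edge clean-up above matters):

```python
def chroma_key(pixel, key=(0, 255, 0), tolerance=90):
    """Return an RGBA pixel: alpha 0 where the pixel sits within
    `tolerance` of the key colour, 255 (opaque) everywhere else.
    A hard cut like this is what produces the rough, jagged edges
    you then have to fine-tune by hand."""
    distance = sum(abs(c - k) for c, k in zip(pixel, key))
    alpha = 0 if distance <= tolerance else 255
    return pixel + (alpha,)
```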
Clipping
Peaks in the luminance signal that have been cut off at a certain value to prevent them from exceeding the limits of the video system.
Typically you don't want your whites to be clipped. This can cause digital artifacts (distortion) and even washout in the scenes where it's occurring. You'll need to color correct the whites in order to bring them down to the 'legal' levels so they won't be clipped off.
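For 8-bit broadcast video, the 'legal' range is commonly reference black at 16 and reference white at 235 (per ITU-R BT.601). A sketch of the clamp itself:

```python
def clip_luma(samples, black=16, white=235):
    """Clamp 8-bit luma samples to broadcast-legal levels.
    Anything above `white` gets flattened to the same value, which is
    exactly the detail-destroying clipping you want to avoid."""
    return [max(black, min(white, v)) for v in samples]
```

The point is that clipped peaks are gone for good; color correcting the whites down *before* they hit the limit preserves the detail.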
Color Grading/Color Correction
The process of altering or enhancing the color of an image to seamlessly blend cuts together in a scene, remove unintended mistakes (e.g. overexposure of highlights) or for creative effect.
I'll tell you now…nearly every scene in your project needs some sort of color correction. Documentaries not so much, but any narrative project should get it. It doesn't always have to be big changes, but the main thing you want to ensure is that the colors from scene to scene (no matter the location) feel 'similar'. If each scene is slightly off in its colors and highlights, it gives a disjointed feeling and looks amateurish. Good color correction can help fix errors in lighting (NOT ALL OF THEM) and even give some of that 'film look'.
Color space
A mathematical model of color. Color spaces differ in how they model color – for example, RGB creates color from mixtures of red, green and blue, while CMYK creates color from mixtures of cyan, magenta, yellow and black. Broader models such as Y, R-Y, B-Y incorporate other factors in order to model color more accurately for a specific display or recording device.
Compositing
The process of combining multiple elements shot separately (images, movie clips, CGI, etc.) into a final image or sequence to give the impression they were all shot at the same time.
This is actually a fairly intensive process, as adjusting one of the elements typically means an adjustment to all. Timing is very important here, and nudging a clip even a couple frames may make the difference between believability and total crap. Take your time with this process and it’ll get you far.
Compression
The process of removing information from a file or video signal in order to reduce its size or transmission rate. The aim is to reduce storage, transmission and processing costs whilst retaining optimal quality.
This is very important in the digital age we're in now. Always be mindful of the storage limits you're allowed on whatever you're outputting your project to (web, DVD, etc.): the more storage you have, the less compression you'll actually need, which means the higher quality it can be. If you've got tons of storage space, there's no need to compress as much as possible; ease off and let the quality rise. Pay attention to it and adjust your settings accordingly.
Continuity
The process of maintaining the consistency of the plot, characters, time period, objects, places and events of the film in order to maintain the audience's suspension of disbelief.
This is something that should be done on set during filming, but it’s important to keep an eye out for it in Post as well. Especially when you start flipping images around to get the right shot, you need to keep continuity in mind. You don’t want a ball flying in from the right in one shot, to be mysteriously coming in from the opposite side in the next.
Continuity report
A detailed list of occurrences during the shooting of a scene with the aim of tracking, and therefore compensating for, any changed elements that may affect continuity.
This is crucial, and unfortunately as editors we won't always get one. It's still very important to ask for one before starting, though; even if they have one, they may forget to give it to you.
Cross Fade (audio transition)
Lowering the volume of the preceding audio clip whilst raising the volume of the following clip at the same time, with the aim of easing abrupt transitions between the two.
This is essentially the audio equivalent of a Cross Dissolve (or Dissolve, covered later). It's about as basic as it gets for transitioning between scenes and cuts. Be careful though, as most of the default settings on these transitions can be too quick, resulting in an audio 'jump'. Simply dragging out the number of frames in the transition will smooth it over.
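The math behind a linear cross fade is just two opposing volume ramps. A toy sketch (lists of samples stand in for audio clips; the function name is my own):

```python
def cross_fade(outgoing, incoming):
    """Linearly fade out `outgoing` while fading in `incoming` over
    their overlap.  Both lists are the overlap region and must be the
    same length; a longer overlap means a gentler transition (which is
    why stretching the default duration smooths the 'jump')."""
    n = len(outgoing)
    out = []
    for i, (a, b) in enumerate(zip(outgoing, incoming)):
        t = i / (n - 1) if n > 1 else 1.0  # ramps 0.0 -> 1.0 across the overlap
        out.append(a * (1.0 - t) + b * t)
    return out
```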
Crosscutting
Cutting to another scene or set of events whilst an existing scene is taking place, to give the impression that they are both taking place at the same time.
This is common practice and it’s in this style of editing that you control the overall pacing of the film. Cutting too soon, or waiting too long in between crosscuts can result in poor pacing and confusion amongst the audience watching.
Cut
To move abruptly from one shot to another angle or scene.
This is the whole purpose of editing! You’re cutting the frames in order to tell a cohesive story. If you can’t understand this…better stay out of Post.
Cutaway
Cutting to a shot of something other than the main focus of the scene. This can be used to hide an edit or give significance to a particular object or hand movement, for example.
The easiest example of this would be a moment where an actor is sitting on a couch: the camera focuses on them, but then you cut to a close-up of the picture he's holding in his hands. It can help break the tension if needed, give more detail to the story without having to use dialogue, and can mask an edit if you want to switch to another angle of him on the couch. These are very important, and understanding how they work will save a lot of frustration.
Dialogue editor
A sound editor that focuses purely on dialogue. His job is to assemble, synchronize and edit the dialogue in a production, with the aim of producing the clearest dialogue possible for the sound editor to work with.
Again, this is assuming you're editing with a team, and not just by yourself. Sound is very important, so if you are working alone, you'll need to spend almost as much time working with the sound as you do the picture. There's a simple adage: “Problems with the picture can be excused as artistic or stylistic in movies; there is NO excuse for bad audio.”
Diegetic sound
Music or sound effects that appear to emanate from the world of the film. This is in contrast to the music score for example, which accompanies the movie but clearly does not come from within it.
This really ties in to how you're going to make your world sound 'real'. These are simple things, like hearing glasses clink at a bar with a radio or TV going in the background. This is where attention to detail comes in handy. The more realistic the sound of a setting is, the higher quality the film will appear.
Difference matte
A matte extraction technique that separates a subject from its background using color differences between the two.
This is kind of similar to the chroma key (or green screen) extraction, but is used regardless of the background. Basically it's pulling a subject out from the rest of the background even when you're not using a green/blue screen. This requires much more patience and way more attention to detail.
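One simplified way this can work is by comparing the shot against a 'clean plate' of the background photographed without the subject. A per-pixel sketch (function name and threshold are my own illustration):

```python
def difference_matte(shot_px, clean_px, threshold=40):
    """Return matte alpha for one pixel: opaque (255) where the shot
    differs enough from the same pixel in a clean plate of the
    background alone, transparent (0) where it matches.  Picking the
    threshold is where the patience and attention to detail come in."""
    diff = sum(abs(a - b) for a, b in zip(shot_px, clean_px))
    return 255 if diff > threshold else 0
```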
Dissolve
Merging of one shot into another by gradually decreasing the opacity of the first shot over time until it is completely transparent.
This is the most basic of all video transitions (like the Cross Fade for audio), and there are various kinds of dissolves one can use. The most simple (and widely used) is the Cross Dissolve. The important thing to remember is to use the transition that makes the most sense for your story and the established style of the film. Throwing in something random can seem out of place to the audience, and using something that just seems 'cool' may look amateurish.
Down-convert
Converting from a higher quality format to a lower one. Obviously there will be a great loss in quality, so keep that in mind.
Drop Frame Timecode
Timecode that is modified to remain in sync when 29.97 fps NTSC video is broadcast at 30 fps. In order to retain accuracy, the first two timecode frames of every minute are dropped, with the exception of every tenth minute. Note that only the timecode references are lost; not the actual frames themselves.
Technical stuff that you need to keep in mind when you’re exporting your work. You always need to be aware of what format your final project will be viewed at.
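The drop-frame arithmetic described above can be sketched directly. This converts a running frame count into drop-frame timecode, skipping frame labels 00 and 01 at the start of each minute except every tenth minute (the function name is my own; the counting rules are the standard ones):

```python
def drop_frame_tc(frame):
    """Convert a frame count to 29.97 fps drop-frame timecode.
    Only timecode labels are skipped; no actual frames are discarded."""
    per_min = 30 * 60 - 2            # 1798 labels in a dropped minute
    per_10min = per_min * 10 + 2     # 17982 labels per ten-minute block
    tens, rem = divmod(frame, per_10min)
    if rem < 30 * 60:                # first minute of each block keeps 00/01
        minutes, f = tens * 10, rem
    else:
        minutes = tens * 10 + 1 + (rem - 30 * 60) // per_min
        f = 2 + (rem - 30 * 60) % per_min
    hh, mm = divmod(minutes, 60)
    ss, ff = divmod(f, 30)
    return "%02d:%02d:%02d;%02d" % (hh, mm, ss, ff)

# After 00:00:59;29 the next label jumps straight to 00:01:00;02.
```

This skipping is why exactly one hour of 29.97 fps footage lands back on an even timecode hour.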
Dropout
A brief loss of signal that results in a “blank” area of video or audio, or adds excess noise to an image.
If you come across this in your footage, might as well just cut it out. You won’t really be able to salvage or fix it. That’s where cutaways come in handy.
Effects animation
The animation of non-character elements such as explosions, smoke, rain, etc.
These are the elements you’ll use when it comes to Compositing.
Effects stock
Film stock optimized purely for shooting visual effects footage. It has very fine grain to allow easier compositing.
If you don't already, it's a good idea to build a library of effects stock. The easiest way to do this is to simply hold on to all the effects you create. Put them on a hard drive so you can bring them back up as needed. Also talk with other editing friends or visit some online forums. People will gladly share with you.
Eyeline match
The process of making sure that an actor's eyes are looking at a creature or element that will be inserted later. This is normally achieved by a grip holding a pole in the desired position but some sets have more elaborate methods.
This is something that should be handled on the set. If not, it can make your post-production hectic and painful. However, even if handled correctly on the set, you still have to make sure your composited element matches the eyeline that’s already established.
Fade out
The process of causing an image to gradually disappear into darkness.
Essentially it’s a type of dissolve, but it doesn’t lead into more footage. It leads to black. Be careful where you use it though, as fading to black randomly can defeat pacing. As with all things you do in editing, make sure it fits in with the story you’re telling.
Field
Interlaced video is split into two fields: one comprising the odd-numbered scan lines, and the other comprising the even lines. There are 60 fields for every second of NTSC video.
Filter
A software add-on to simulate a given effect upon the footage. Common filters include blurs, de-grain and color correction.
Everyone loves playing around with filters. They can give off cool effects and just look neat. The problem is, some editors get too focused on how 'cool' a filter may look, and totally forget that it doesn't really fit in with the rest of the project. Don't let your filters and effects dictate how your project gets edited.
Flex file
A computer-generated file that establishes the relationship between timecode, keykode and often audio timecode. Flex files are important in the online editing process. They have the file extension .flx.
This is mostly about housekeeping and backing up the work you've done. Flex files save all the data you have…as data; they won't save the actual assets themselves. This also works well when collaborating with editors in other cities: you can simply transfer the Flex file (since it's much smaller than the entire project) to the other editor, and they can get to work using their own copies of the assets. It saves space and time.
Foley
The process of recording sound effects on-the-fly as the picture plays. Sound effects are often created from everyday household objects.
Again, this is an entirely separate job, but on small projects you may get stuck with it. It’s important to then use the footage you’ve edited and scenes that are as close to finished as possible. You don’t want to be recording sound effects for shots that are going to get cut; it’s just a waste of time.
FPS (Frames Per Second)
The number of frames played every second. The standard film frame rate is 24 fps, with NTSC video at 29.97 fps and PAL video at 25 fps. Shooting higher than these rates will result in slow-motion footage, and shooting lower will result in fast motion. It's also referred to as the frame rate.
Before you get started editing, you need to know what frame rate your footage was captured at. You don't want to edit 24fps footage in a 29.97fps timeline. It will cause all kinds of problems, especially when it comes to exporting.
Frame
A single image that represents the movie at a given point in time. When several of these images are played in sequence, they give the impression of motion.
Freeze frame
The repetition of a single frame of footage to give the effect that the action has stopped or that the audience is looking at a still image.
This is a great tool for those moments where you need to drag out a shot for just a few more frames but, due to the way it was filmed, don't have the footage. Be very careful with it though. Holding too long becomes very obvious, and unless that's what you're trying to do (such as when a narrator stops the action, Dukes of Hazzard style), it'll look awkward. Also try not to freeze on a moving action; you'll get frame distortion and motion blur.
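In cut terms it's just repeating one frame in place. A toy sketch (strings stand in for frames; the function name is my own):

```python
def freeze_frame(frames, at, hold):
    """Hold frame `at` for `hold` extra frames, the trick for
    stretching a shot that ran a few frames short."""
    return frames[:at + 1] + [frames[at]] * hold + frames[at + 1:]
```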
Gain
An increase in signal amplification. It also results in an increase in signal noise.
This deals with audio. Sometimes dialogue isn't recorded perfectly and you need to bring the sound up a bit. You can do that by adjusting the decibel level or the gain. However, bumping it up will give you more static and white noise. Conversely, you can bring the gain down to cut down on excess noise, but the audio quality might go down with it.
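Gain in decibels maps to a simple amplitude multiplier. A sketch (the function name is my own; the 20·log10 relationship for amplitude is standard):

```python
def apply_gain(samples, db):
    """Scale audio samples by a decibel gain.
    +6 dB roughly doubles the amplitude (and raises the noise floor
    with it); -6 dB roughly halves it."""
    factor = 10.0 ** (db / 20.0)
    return [s * factor for s in samples]
```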
Gamma
A measurement of the level of midtones in an image. Adjusting the gamma adjusts the level of the midtones while leaving the blacks and whites untouched.
This is something you’ll play with while doing color correction. Once you get your whites and blacks set to the level you want…leave them alone! Then you can adjust the midtones without having to mess up the black/white adjustments you already made.
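You can see why the endpoints stay put from the math: a gamma adjustment is a power curve, and 0 and 1 map to themselves under any exponent. A sketch for 8-bit values (function name is my own illustration):

```python
def adjust_gamma(value, gamma):
    """Remap an 8-bit value's midtones with a power curve.
    0 and 255 map to themselves, so the blacks and whites you have
    already set stay untouched; gamma > 1 lifts the midtones,
    gamma < 1 lowers them."""
    return round(255 * (value / 255.0) ** (1.0 / gamma))
```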
Garbage matte
A matte designed to tell the computer which areas of an image to ignore or remove.
In simple terms, you’re cropping out parts of the image. You can use this in order to replace something in the background (like the sky or a building). Matting requires a lot of time and patience. Adjusting the edges of your matte frame-by-frame is really the only way to ensure it does what you want it to, and still looks great.
GIGO (Garbage In, Garbage Out)
A phrase referring to the fact that the footage you output can only be as good as the footage you input. If you shoot poor quality footage, you’re going to get a poor quality output.
I threw this in here because it's something all filmmakers need to be aware of. Despite the old motto “Fix it in Post!”, not everything can be. Be sure to get the best footage you can before you start editing, because by then it's likely too late to change it.
Green Screen Compositing
The process of making all green elements in an image transparent and placing a different background underneath. Commonly used for TV weather forecasts and placing live characters in CG environments.
All right, well that does it for A-G. I know it’s a lot to take in but we’ve still got a ways to go! Be sure to keep checking back as we finish up our alphabet of Post-Production terms.