Video Editing, Part IV
Any noticeable, abrupt, or undesirable change in audio or video during a production is referred to as a technical continuity problem.
We tend to accept some technical continuity problems; others we don't.
News and documentaries are often shot under drastically different conditions and so we tend to accept such things as changes in video color balance or audio ambiance between scenes.
But in dramatic productions we don't want technical inconsistencies diverting our attention from the storyline.
In this type of production the medium (television) should be totally "transparent" so there's nothing to get in the way of the message (the story).
Audio Continuity Problems
Audio continuity problems can be caused by a wide range of factors, including shot-to-shot variations in microphone distance, acoustics, audio levels, and background sound.
In single-camera production most of these inconsistencies may not be easy to detect on location. It's only when the various shots or takes start to be assembled during editing that you discover the problem.
As you cut from one scene to another you may discover that the talent suddenly seems to move closer or farther away from the mic, or that the level or type of background sound changes (passing traffic, the hum of an air conditioner, or whatever).
Some problems can be mitigated with the skilled use of graphic equalizers or reverberation units. Changes in background sound can sometimes be masked by adding a bed of sound, such as music or street noise, to the audio track.
As in most of life, it's easier to avoid problems than to fix them -- assuming there even is a way to fix them.
Things to Be Alert For
First, be aware that mics used at different distances reproduce sounds differently. This is due to changes in surrounding acoustics, as well as the fact that specific frequencies diminish over distance.
Although some expensive directional mics will minimize the effect of distance, most mics exhibit proximity or presence effects.
A good pair of padded earphones placed on top of a set of well-trained ears can detect these differences.
With the increased reliability of wireless mics, many production facilities are equipping actors with their own personal mics. The distance of the mic -- it's generally hidden in the person's clothes -- can't change, and because of the proximity of the mic, background sounds tend to be eliminated. The guidelines covered earlier for using personal mics should be kept in mind here.
Finally, you need to watch for changes in background sounds. For example, the sound of a passing car or a motorcycle may abruptly appear or disappear when you cut to a shot that was recorded at a different time.
Even if an obvious background sound doesn't disappear, its level may change when you cut from one person to another. This may be due to differences in microphone distance coupled with the level adjustments needed to compensate for the different strength of voices.
A scenic waterfall might make a beautiful background for an interview, but the sound of the falling water could create major audio problems, especially in a single-camera interview or a dramatic production.
Audio technicians will typically want to keep the camera or audio recorder running for a minute or so after an interview so that the ambient sound of the location can be recorded. In other settings this is referred to as room tone.
You may need to use this audio to cover a needed moment of silence or just to give an even and consistent "bed" of sound behind a segment. Low-level audio from a sound effect CD can also be used in this way.
Continuity Issues in Background Music
Music can smooth the transition between segments and create overall production unity.
Background music should add to the overall mood and effect of the production without calling attention to itself. The music selected should match the mood, pace and time period of the production.
Vocals should be avoided when the production contains normal (competing) dialogue.
Ideally, the beginning of a musical selection should coincide with the start of a video segment and end as the segment ends. In the real world, this almost never happens, at least without a little production help.
To a limited degree you can electronically speed up and slow down instrumental segments with digital editing equipment, especially if the music is not well known.
Fading music out "midstream" so it concludes at the end of a video segment creates a continuity issue of its own. One solution is to back time the music.
If the music is longer than the video, during editing you can start the music a predetermined amount of time before starting the video. You can then fade in the music as the video starts. This will be less noticeable if the segment starts with narration and the music is subtly brought in behind it.
If you calculate things correctly, the music and the video will both end at the same time.
Let's assume, for example, that a music selection is two minutes and 40 seconds long and the video is only two minutes long.
By starting (or cuing in) the audio 40 seconds before the video and fading it in with the start of the video, they should both end together.
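The back-timing arithmetic above is simple enough to sketch in a few lines of code. The following is a minimal illustration (the function and timecode format are our own, not part of any editing system): it converts m:ss durations to seconds and computes how far ahead of the video the music must be cued in so both end together.

```python
def to_seconds(timecode: str) -> int:
    """Convert an m:ss duration string (e.g. "2:40") to total seconds."""
    minutes, seconds = timecode.split(":")
    return int(minutes) * 60 + int(seconds)

def cue_in_offset(music_length: str, video_length: str) -> int:
    """Seconds before the video start at which to cue in the music
    so the music and video end together. Assumes the music is at
    least as long as the video."""
    offset = to_seconds(music_length) - to_seconds(video_length)
    if offset < 0:
        raise ValueError("Music is shorter than the video; back timing won't work.")
    return offset

# The example from the text: 2:40 of music behind 2:00 of video.
print(cue_in_offset("2:40", "2:00"))  # -> 40
```

Starting the music 40 seconds early and fading it in as the video begins gives the result described above: both end at the same moment.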
As we'll see later, all of this is fairly easy when you are using a nonlinear, computer-based editing system. (Everything is visible on the computer screen's timeline.) With linear editing the process takes a bit more work and planning.
Video Continuity Problems
Video has its own continuity problems; for example, changes in color balance, luminance, and subject position between shots.
Intercutting scenes from cameras with noticeably different color characteristics (color balance) in a dramatic production will immediately be apparent to viewers.
To alleviate this problem all cameras should be carefully color-balanced and compared before a production.
Once cameras are color balanced and matched, an electronic test pattern with all of the primary and secondary colors is often recorded at the beginning of the recording. This has traditionally been used to color balance the video playback. However today, many systems can electronically adjust color from the recording's integrated color reference signal.
When such shots are compared side by side, several things change, especially skin tones and color balance.
As we've noted, a vectorscope and a waveform monitor are both built into the software of professional nonlinear editing systems. These systems allow you to change the basic color balance of scenes.
However, in trying to match different video sources, subtle differences between some colors may be impossible to correct satisfactorily. This is why the initial color balancing of cameras is so important.
Sophisticated nonlinear video editing systems offer a wide range of color balance and luminance settings for making these adjustments.
Intercut shots like these can also present another type of continuity problem.
Cutting directly from a close-up to a two-shot would be a problem if the position of the woman's head changed between the shots. Using a cutaway or insert shot between the two could solve this problem.
© 1996 - 2017,
All Rights Reserved.
Use limited to direct, unmodified access from CyberCollege® or the InternetCampus®.