Integration

Integrating elements from disparate sources is a key skill in compositing.

Matching color

When two live-action elements from different sources need to match, the workflow is fairly simple. The footage that requires matching is termed the 'target', whilst the 'source' is the footage it should match. Though the target will receive the majority of the attention, it is entirely conceivable that the source will also need to be tweaked. Color matching is particularly important in keeping the face of an actor consistent within a scene.

The color match workflow is simple enough, and assumes that any major color cast has already been fixed with a color correction.

Matching lightness and saturation
  • Match first the lightness of the footage. Lightness values are best evaluated in the Viewer by pressing the 'Y' key whilst the mouse is hovering over the Viewer. The Grade node is the best tool for matching lightness: first adjust the white point and black point using the 'whitepoint' and 'blackpoint' sliders, then the gamma using the 'gamma' slider.
  • Then match the saturation. If the source and target still don't match, a slight tweak of the Saturation node will probably fix it (see the scripted sketch below this list).
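For reference, the same adjustments can be scripted with Nuke's Python API. This is a minimal sketch only: the node name 'Target' and all the numeric values are placeholder assumptions, found in practice by eye, while 'blackpoint', 'whitepoint', 'gamma' and 'saturation' are the standard knobs of the Grade and Saturation nodes.

 import nuke
 # Grade then Saturation on the footage to be matched. The values are
 # placeholders; in practice they are dialled in by eye while sampling
 # lightness ('Y' over the Viewer) on both source and target.
 target = nuke.toNode('Target')              # assumed name of the target Read node
 grade = nuke.nodes.Grade(inputs=[target])
 grade['blackpoint'].setValue(0.02)          # match the darkest values first
 grade['whitepoint'].setValue(0.95)          # then the brightest values
 grade['gamma'].setValue(1.1)                # finally nudge the midtones
 sat = nuke.nodes.Saturation(inputs=[grade])
 sat['saturation'].setValue(0.9)             # usually only a slight tweak is needed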

There are (at least) three ways to match hue. All of them require that match regions first be identified: a region of the source footage that should match a region of the target, yet doesn't.

Hue matching 'by the numbers'.

This technique identifies the numerical values of the source match region and adjusts the target match region until its values correspond.
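In Nuke the values can be read with the Viewer's colour sample tools, or scripted. The sketch below is an illustration under stated assumptions: the Read nodes 'Source' and 'Target' and the region coordinates are hypothetical, while nuke.sample() is the standard call for averaging pixel values over an area.

 import nuke
 # Print the average RGB of a match region in the source and the target.
 # Node names and region coordinates are placeholders for illustration.
 source = nuke.toNode('Source')
 target = nuke.toNode('Target')
 x, y, dx, dy = 850, 420, 20, 20   # centre and size of the match region (pixels)
 for node in (source, target):
     r = nuke.sample(node, 'rgba.red',   x, y, dx, dy)
     g = nuke.sample(node, 'rgba.green', x, y, dx, dy)
     b = nuke.sample(node, 'rgba.blue',  x, y, dx, dy)
     print(node.name(), round(r, 4), round(g, 4), round(b, 4))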

Hue matching 'by eye'.

Targeting the same match regions as the preceding technique, this method employs a 'by eye' match. Color is difficult to match by eye, but grey values can be matched quite effectively. Hence the individual channels of the target match region may be matched reliably (a channel is, after all, a greyscale image).

  • This method requires that the source and target match regions are in very close proximity to each other. A CopyRectangle node can be used to copy the match region from the target image and lay it on top of the source image.
  • Cycle through the R, G and B channels in the Viewer. By making an adjustment in a ColorLookup or Grade node, the greyscale channels may be matched to each other (a sketch of this setup follows the list).
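A scripted version of that setup might look like the sketch below. It assumes the node names 'Source' and 'Target', that the CopyRectangle node's region knob is called 'area', and that the A input carries the target patch; treat those details as assumptions and check the node itself.

 import nuke
 # Lay a rectangle of the (graded) target over the source so the two match
 # regions sit next to each other, then adjust the Grade while cycling the
 # R, G and B channels in the Viewer.
 source = nuke.toNode('Source')
 target = nuke.toNode('Target')
 grade = nuke.nodes.Grade(inputs=[target])       # the correction being dialled in
 copy = nuke.nodes.CopyRectangle()
 copy.setInput(0, source)                        # B input: the source footage
 copy.setInput(1, grade)                         # A input: the graded target patch
 copy['area'].setValue([850, 420, 950, 520])     # x, y, r, t of the patch (assumed knob)
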
'Two point' hue matching
The preceding methods match one set of values from one image to those of another. With this method two sets of values can be matched: a light and a dark. It can be more effective than the first methods, but it is not as straightforward:
  • Using the 'blackpoint' and 'whitepoint' parameters of the Grade node, set the dark and light points in the target image. This will move those points to black and white and make the image look strange, but wait...
  • The reason these points were set to black and white is that they are now much more easily moved to new values (moving a 0 or 1 to arbitrary RGB values is easier than moving one arbitrary set of values to another). Using the lift and gain parameters of the Grade node, these new black and white values can be moved to positions sampled from the source image (a scripted sketch follows this list). Part B of the videos linked below covers this method very well.
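Written out as a script, the two-point logic looks like the sketch below. The dark and light values are placeholders that would in practice be sampled from the footage (for instance with nuke.sample() as above); 'blackpoint', 'whitepoint', 'black' (lift) and 'white' (gain) are the standard Grade knobs, while the 'Target' node name is an assumption.

 import nuke
 # Two-point match: pin the target's dark and light samples to the source's.
 # All sampled values below are placeholders.
 target_dark,  target_light = [0.05, 0.06, 0.08], [0.82, 0.80, 0.78]
 source_dark,  source_light = [0.04, 0.05, 0.09], [0.88, 0.85, 0.80]
 grade = nuke.nodes.Grade(inputs=[nuke.toNode('Target')])
 for i in range(3):
     grade['blackpoint'].setValue(target_dark[i], i)   # target dark point -> black
     grade['whitepoint'].setValue(target_light[i], i)  # target light point -> white
     grade['black'].setValue(source_dark[i], i)        # lift: black -> source dark
     grade['white'].setValue(source_light[i], i)       # gain: white -> source light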

The method is shown in this excellent video from The Foundry.

Matching grain

One of the big issues in compositing is matching the appearance of the various inputs that come into Nuke. It is ironic that in the process the compositor is often obliged to 'damage' the visual information: to add blur, motion jitter or grain.

The Grain node, with its 'Size', 'Irregularity' and 'Intensity' parameters, is simple enough to understand. To use it:

  1. Have at hand a source clip whose grain you want the target clip to match.
  2. Identify a region in each clip (source and target) that is uniform and a matching mid grey.
  3. Zoom in close to these regions and arrange your workspace so that they can easily be compared side by side.
  4. Go through each channel individually and, by adjusting the Grain node's parameters, get the grain of the target to match that of the source. This is done per channel because the grain differs for each channel (see the sketch after this list).
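If the setup is to be scripted, the Grain node's per-channel knob names are not assumed here, since they vary; the sketch below simply builds the node on an assumed 'Target' clip and lists its actual knob names so the 'Size', 'Irregularity' and 'Intensity' controls can be found and then dialled in by eye.

 import nuke
 # Add grain to the clean target so it can be matched, channel by channel,
 # against the source clip's grain. 'Target' is an assumed node name.
 target = nuke.toNode('Target')
 grain = nuke.nodes.Grain(inputs=[target])
 # The Size / Irregularity / Intensity controls exist per channel; print the
 # node's real knob names before setting any of them by script.
 for name in sorted(grain.knobs()):
     print(name)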

Matching motion blur

Matching motion blur usually requires no more than activating motion blur on the match move. This is done by changing the 'motionblur' parameter from zero to one (or higher, for more samples). The amount of motion blur may be increased by increasing the 'shutter' parameter.
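A scripted equivalent, assuming the match move lives in a Transform node named 'MatchMove' (the name is an assumption; 'motionblur' and 'shutter' are standard knobs on Transform-family nodes):

 import nuke
 # Enable motion blur on the match-move Transform.
 matchmove = nuke.toNode('MatchMove')       # assumed node name
 matchmove['motionblur'].setValue(1)        # 0 = off; 1 or more = number of samples
 matchmove['shutter'].setValue(0.5)         # a longer shutter gives more blur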