Keying

  Welcome to the world of Blue Screen / Green Screen compositing! Once the exclusive domain of Hollywood special effects artists, blue screen compositing has expanded to include video and computer imaging. There are many mysteries to the successful execution of a blue screen composite, and considerable confusion about what a blue screen composite actually is.

What is Blue Screen Imaging?

    (First a note about terminology: when I first wrote this page in 1995, the common term in use was "blue screen" compositing. Since then the vogue has shifted to calling it "green screen". The broader term from the days of film optical effects is "traveling matte composite", but that has fallen out of favor. For now, this page will mainly refer to the process as blue screen, but almost everything here applies to green screen effects too, except where noted.)
Creating a blue screen composite image starts by photographing a subject in front of an evenly lit, bright, pure blue (or green) background. The compositing process, whether photo-chemical or digital, replaces all the blue in the picture with another image, known as the background plate.
    Blue screen composites can be made optically for still photos or movies, with dedicated real-time hardware for live video, and digitally using software to composite still and motion images. Until the 1990s most blue screen compositing for films was done optically, and all television composites were done using analog real-time hardware.
    In addition to blue, other colors can be used. Green has become the most common, and red has occasionally been used for special purposes.
    Another term for Blue Screen is Chroma-Key, which is strictly a television process. A more sophisticated television process is Ultimatte, which is also the name of the company that manufactures Ultimatte equipment. Ultimatte has been the ultimate in video compositing for 20 years. With an Ultimatte unit it is possible to create composites that include smoke, transparent objects, different shades of blue, and shadows. Ultimatte now makes software that works with other programs to create digital mattes, either as a standalone program or as a filter for programs such as Photoshop and After Effects from Adobe.

How does Chroma Key work?

    The Chroma Key process is based on the luminance key. In a luminance key, everything in the image over (or under) a set brightness level is "keyed" out and replaced by either another image or a color from a color generator. (Think of a keyhole or a cookie-cutter.) For example, a title card with white-on-black titles is prepared and placed in front of a camera. The camera signal is fed into the keyer's foreground input, and the background video is fed into the keyer's background input. The level control knob on the keyer is adjusted until all the black on the title card is replaced by the background video. The white letters now appear over the background image.
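The level-control logic above can be sketched in a few lines of Python. This is a minimal illustration, not any real keyer's API: the function name, the frame format (rows of RGB tuples in [0, 1]), and the 0.5 threshold are all assumptions for the example.

```python
def luminance_key(foreground, background, threshold=0.5):
    """Replace foreground pixels darker than `threshold` with background pixels.

    Frames are lists of rows of (r, g, b) tuples with values in [0, 1],
    mimicking a white-on-black title card keyed over background video.
    """
    out = []
    for fg_row, bg_row in zip(foreground, background):
        row = []
        for fg_px, bg_px in zip(fg_row, bg_row):
            # Rec. 601 luma weights approximate perceived brightness
            luma = 0.299 * fg_px[0] + 0.587 * fg_px[1] + 0.114 * fg_px[2]
            row.append(fg_px if luma >= threshold else bg_px)
        out.append(row)
    return out
```

Turning the `threshold` knob up or down is exactly what the level control on a hardware keyer does.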
    Luminance keying works great with titles, but not so great for making live action composites. When we want to key people over a background image, problems arise because people and their clothing have a wide range of luminance tones. Hair, shoes and shadow areas may be very dark, while eyes, skin highlights and shirt collars can approach 100% white. Those areas might key through along with the backdrop.
    Chroma Key creates keys on just one color channel. Broadcast and high-end consumer cameras use three independent sensors, one for each primary color -- Red, Green and Blue. Many cameras can output these RGB signals separately from the composite video signal. So the original chroma key was probably created by feeding the blue channel of a camera into a luminance keyer. This works, sort of, but soon manufacturers created dedicated chromakeyers that could accept all three color channels, plus the background composite signal and the foreground composite signal. This made it possible to select any color for the key and fine-tune the selection of the color tint, chroma level and luminance level.
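A single-channel chroma key can be sketched the same way. This is a crude stand-in for what dedicated chromakeyers do (they offer separate tint, chroma and luminance controls); the `dominance` parameter here is an invented simplification of those controls:

```python
def chroma_key(foreground, background, dominance=0.2):
    """Key out pixels whose blue channel dominates red and green.

    Where blue exceeds both other channels by more than `dominance`,
    the background shows through; elsewhere the foreground is kept.
    """
    out = []
    for fg_row, bg_row in zip(foreground, background):
        row = []
        for (r, g, b), bg_px in zip(fg_row, bg_row):
            is_screen = (b - max(r, g)) > dominance
            row.append(bg_px if is_screen else (r, g, b))
        out.append(row)
    return out
```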
    As keyers became more sophisticated, with finer control of the transition between background and foreground, the effect became less obvious and jarring. Today's high-end keyers can make a soft key that is essentially undetectable. Some of the best modern Special Effects Generator Switchers from Grass Valley Group, Sony, and others can create composites rivaling the performance of a dedicated Ultimatte unit. (Though they are not as good at removing blue spill, working through water or fabric, etc.)

Why Blue? Can't other colors be used?

    Red, green and blue channels have all been used, but blue has been favored for several reasons. Blue is the complementary color to flesh tone--since the most common color in most scenes is flesh tone, the opposite color is the logical choice to avoid conflicts. Historically, cameras and film have been most sensitive to blue light, although this is less true today.
    Green has its own advantages, beyond the obvious one of greater flexibility in matting with blue foreground objects.  Green paint has greater reflectance than blue paint, which can make matting easier. Also, video cameras are usually most sensitive in the green channel, and often have the least noise in that channel. A disadvantage is that green spill is almost always objectionable and obvious even in small amounts, while blue can sometimes slip by unnoticed.
    The background color usually reflects onto the foreground talent, creating a slight blue tinge around the edges. This is known as blue spill, and it looks far less objectionable than the green spill a green screen produces.
    Traditionally, a single camera was used as the Chroma Key camera. This creates a problem on three-camera sets: the other cameras can see the blue screen. The screen must be integrated into the set design, and it is easier to design around a bright sky blue than an intense green or red. However, modern Special Effects Generators (usually just called "Switchers" in the US, and more accurately called "Vision Mixers" in the UK and elsewhere) can accommodate multiple camera sources, whether as RGB analog or SDI video inputs.

Roto

Rotoscoping is the process of manually altering film or video footage one frame at a time. The frames can be painted on arbitrarily to create custom animated effects like lightning or light-sabres, or traced to create realistic traditional style animation or to produce hold-out mattes for compositing elements in a scene.

As a VFX artist, you are primarily creating motion graphics or visual effects. Without a thorough knowledge of rotoscoping and how it fits into the modern digital pipeline, you are limiting just how far you can take an effect or design.

The art of rotoscoping changed considerably with the introduction of digital tools such as Commotion, Digital Fusion (DF), Shake, Combustion (C3) and After Effects (AE). With a thorough knowledge of rotoscoping, digital artists can create better live-action or CG composites as well as amazing visual effects. Various rotoscoping techniques are covered below, including matte creation, effects painting, paint touchup, digital cloning, and motion tracking as well as a brief history of the craft.


Historical overview of rotoscoping


Fleischer Studios

A true pioneer of animation, Max Fleischer produced the Popeye and Betty Boop animated series, as well as the animated features "Gulliver's Travels" and "Mr. Bug Goes to Town." With his brother Dave, he founded Fleischer Studios in the early 1920s, which offered a less sentimental animated vision of the world than the rival Disney studio. Perhaps most importantly, Fleischer invented the rotoscope, a device that changed the look of animation forever.

Born in Vienna, Austria, in 1883, Max Fleischer immigrated with his family to America at the age of four. His artistic skills were quickly recognized, and instead of attending public high school he opted for the Art Students League in New York. While attending school he landed his first job at the Brooklyn Daily News, where he worked as an assistant in the cartoon department. Within a few years, he was a full-time staff artist with his own comic strip. He then moved on to Popular Science Monthly, which sparked a life-long fascination with machinery and inventions. While working at this magazine, Fleischer began working on his plans to create the rotoscope.

Early animated films were crude, jerky and difficult to look at. They were not very popular and were tolerated only as a curiosity. Max Fleischer aimed to change this by inventing a device that would allow animators to project live-action film onto the glass of an animation stand. The animators could then place paper on the stand and trace the live-action footage one frame at a time. This device, named the Rotoscope, was patented by Max Fleischer in 1917.

In a 1920 New York Times interview, Fleischer said, "An artist, for example, will simply sit down and, with a certain character in mind, draw the figures that are to make it animated. If he wants an arm to move, he will draw the figure several times with the arm in the positions necessary to give it motion on the screen. The probability is that the resulting movement will be mechanical, unnatural, because the whole position of his figure's body would not correspond to that which a human body would take in the same motion. With only the aid of his imagination, an artist cannot, as a rule, get the perspective and related motions of reality."

The rotoscope, though, allowed animators to work from a filmed image, which gave them the guidance they needed to create more graceful and realistic movement on screen. "It was beautiful to watch, rather than very annoying to watch," Fleischer said.

The first cartoons the Fleischers created using the Rotoscope were the Koko the Clown series; they went on to use it in Betty Boop and Popeye. Though they used rotoscoping to create the main characters, they continued to rely on traditional rubber-hose style animation in their cartoons. The Fleischers pioneered other traditional animation principles in their studio which changed the face of modern animation, right up to today. Most animators at the time used the technique of "Straight Ahead Action": they would simply start drawing their sequences at the beginning and work straight ahead to the end. The Fleischers used another technique called "Pose to Pose" animation, in which the animators would produce the main extreme poses, or keyframes, then fill in the in-betweens. The difference was that the Fleischers would have assistants draw the in-betweens while the lead animators moved on to create more keyframes. Though this eventually led to labor problems and striking workers at Fleischer Studios, the practice is still used today by traditional cel animation companies, and has been translated into the automatic "tweening" processes found in computer-based animation tools.

Disney

During the 1930s, the Fleischers found themselves in an on-going competition with another animator -- Walt Disney. The Fleischers and Disney constantly raced one another to each new milestone in animation -- first sound cartoon, first color cartoon, and first feature. But according to Max Fleischer’s son, Richard Fleischer, Max and Dave often came in second, largely because the studio behind them, Paramount, didn't offer the support they needed.

Walt Disney also turned to rotoscoping, for “Snow White”. At the time, Fleischer considered suing Disney for patent violation, but in doing preliminary research, his attorneys discovered that before Fleischer's patent, a company in Wilkes-Barre, Pa., had created a device similar to the rotoscope. The company, Bosworth, Defresnes and Felton, had never patented it, so Fleischer actually was entitled to sue, but he evidently lost interest in pursuing the Disney case after hearing about the earlier machine.

The movements of Snow White herself were acted out by a high school student named Marjorie Belcher, later known as dancer Marge Champion. Initially, Disney intended to use Belcher's movements as a guide for the dancing in the cartoon, but soon he opted to use it more extensively. This was partly because the animators otherwise used themselves and their own facial expressions as the basis for their characters' faces, Disney explained. "The artists looking at themselves in a mirror sometimes were not so successful, because they were bad actors and would do things in a stiff way," he wrote.

Nevertheless, some of the Disney animators looked down on the idea of rotoscoping. One of them, Don Graham, derided the technique as a "crutch" for artists who lacked the skill to do their work on their own. Another, Grim Natwick, said that even when the artists used the device, they used it only as the basis for their work, adding heavy elaboration and even changing the proportions of the original filmed figures. "We went beyond rotoscope," he said.

But rival animator Walter Lantz criticized the look of the rotoscoped work in "Snow White." In press materials for his own project, "Aladdin and the Wonderful Lamp," Lantz declared he would use the rotoscope only for timing because of what he saw as its limitations, especially in Disney's film. "This literal system resulted in two faults -- a jittering movement that contrasted with the fluidity of the animals, and the fact that the human characters were too accurate to be seen beside the caricatures," he said.

Yet rotoscoping did help the artists on "Snow White" maintain a consistency that might otherwise have been impossible. On earlier animated shorts, each character was done by a single animator; as a result, the characters had a unity of style. Because "Snow White" was so extensive, however, more than one artist had to work on each character. Working from live-action footage offered them the best way to create a cohesive look.

Analog Rotoscoping for Visual Effects

While the technique is useful for animation, rotoscoping eventually became an important tool for visual effects in general. From the 1940s through the 1960s, Ub Iwerks, a well-known animator, turned to effects work, where he pioneered the use of the rotoscope on films such as Alfred Hitchcock's "The Birds" (1963).

Rotoscoping in visual effects was used primarily to make holdout mattes. "You frequently want to composite different elements into the same shot to create that shot," explained Tom Bertino, who was head of Industrial Light & Magic’s rotoscoping department from 1987-93. "By using the tracing to create black mattes, you can hold out certain elements."

For example, Bertino imagines a scene of an explosion behind two people on-screen, where the explosion is added after the fact. "You could print the explosion over the frame. But you'd also cover up the people," he said. "You'd need to isolate them with the rotoscope." To make a traditional holdout matte, a rotoscope artist would trace the figures that had to be isolated onto an animation cel. The outline traced onto the cel then would be filled in with black paint, so that it would block the appropriate section of the frame. "You create a solid black matte," Bertino said. This black matte then could "hold out" the part of the explosion image where the two people would appear, so that when the two images were printed together, the people would appear to be in front of the explosion.

Rotoscoping also could be used to stabilize a shaky film image. To do stabilization, each film frame was rotoscoped onto an alignment chart. A comparison of the charts allowed changes in position to be tracked from frame to frame. Using this information, an optical copy of the film could be made, with the printer offsetting the shifts in each frame's movement.
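The same bookkeeping can be sketched digitally. Assuming the per-frame drift offsets have already been measured (the digital equivalent of comparing the alignment charts), stabilization is just shifting each frame by the inverse of its drift. The wrap-at-the-edges shift below is a toy simplification of what the optical printer actually did, and the function names are invented for the example:

```python
def shift_frame(frame, dx, dy):
    """Shift a frame's pixels by (dx, dy), wrapping at the edges."""
    h, w = len(frame), len(frame[0])
    return [[frame[(y - dy) % h][(x - dx) % w] for x in range(w)]
            for y in range(h)]

def stabilize(frames, offsets):
    """Cancel each frame's measured drift.

    offsets[i] is the (dx, dy) drift of frame i relative to frame 0,
    as an operator would read it off the alignment charts; shifting
    by the inverse restores registration.
    """
    return [shift_frame(f, -dx, -dy) for f, (dx, dy) in zip(frames, offsets)]
```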

Bertino said people underestimate the difficulty of rotoscoping during the photochemical era: "It was a painstaking process. There were so many moving parts to the rotoscope camera, and so many places for things to get out of hand." Rather than being a refuge for the unskilled artist, he added, rotoscoping was a demanding craft. "The rotoscoper had to be a skilled animator to make the line follow through. That's actually something that plagued some early uses of the rotoscope as a special effects tool -- without actual animators to handle it, it could get jittery."

Good rotoscope artists were very precise about their work. "It was so exacting," Bertino said. "It's almost like -- I don't know if you’ve ever seen those incredibly detailed Chinese tapestries that they made in the monasteries generations ago. They finally stopped making them because the artisans would go blind. I'm surprised that more rotoscopers didn't go that route."

Jack Mongovan, a paint and rotoscope supervisor at ILM, began his career in traditional rotoscoping and has been working in the field for 19 years. He remembers working in rooms that were completely dark except for the light coming out of the projector. The rotoscope artists were at the mercy of the painters who would later fill in their outlines, and who could with a few stray brushstrokes outside the outline make the image suddenly jittery. "I would never go back to traditional for anything," Mongovan said.

Digital rotoscoping for Visual Effects
Today, rotoscoping is done in the computer, using programs such as Shake, FFI and Pinnacle Commotion. The shift to computer-based rotoscoping began in the early 1990s with a program called Colorburst, an image-editing tool like Photoshop that later evolved into Matador. "When computers became prodigiously viable around here, right after the 'Terminator 2'/'Jurassic Park' era, we realized that the computer had great capabilities for this," Bertino said. "It obviously became much simpler."


Mongovan said that today, one rotoscope artist can do the same amount of work that eight used to do, and in one quarter of the time. This is often because in traditional rotoscoping, each frame had to be drawn individually. The computer, on the other hand, can use the previous frame as a basis, which means most of the drawing may already be done.

Rotoscoping software works using splines, which are a series of points connected by a line or curve. These splines are adjusted from frame to frame, so that they continue to conform to whatever shape the artist is tracing. Because rotoscoping software includes the tools to paint an image, rotoscope artists now find themselves doing a lot of paint work as well. "Rotoscoping is becoming the lesser part of what we do," Mongovan said. "We do so much more painting." Painting might mean taking someone out of a shot, or replacing a sky, or painting out the tennis balls used as visual effects tracking markers.
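The frame-to-frame reuse comes from keyframing the spline's control points and interpolating between them, so the artist only corrects the frames where the shape drifts. As a rough sketch (real roto tools interpolate full spline parameters, often non-linearly; this linear version is just for illustration):

```python
def tween_points(key_a, key_b, t):
    """Linearly interpolate spline control points between two keyframes.

    key_a and key_b are lists of (x, y) control points at the two
    keyframes; t is the position between them, from 0.0 to 1.0.
    """
    return [(ax + (bx - ax) * t, ay + (by - ay) * t)
            for (ax, ay), (bx, by) in zip(key_a, key_b)]
```

This is the same "pose to pose" idea the Fleischers used, with the computer drawing the in-betweens.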

Some skills remain necessary, including a sense of what is important. "One of the hardest things for people to do in our department is to realize that they're looking at a very zoomed-up plate," Mongovan said. Also, he pointed out, a movie audience will see an image for only 1/24th of a second, too short a time to register flaws that may torture the artists. More important is consistency. "I tell people, 'You can paint that first frame wrong, just keep it wrong all the way through.'"

That kind of understanding is key, Bertino agreed. "The secret to good rotoscoping has always been -- regardless of what it's used for -- an educated eye and good judgment as to what to include and what to leave out," he said. "Most people think the rotoscope is very literal -- you trace what's there, and that's it. It's possible to put too much detail and confuse matters. You need to have that sense for judicious editing. That hasn't changed at all. And not everybody's got that."

Summary of Roto Tools

After Effects

After Effects was the first tool to bring professional compositing, motion graphics and effects functionality to the desktop. It was originally developed by CoSA, then acquired by Aldus, which in turn was acquired by Adobe. After Effects had very limited rotoscoping tools in earlier versions, with only one rotospline and no paint tools, but this is slowly changing. Version 4 added multiple rotosplines for cutting mattes, version 5 added vector paint, and version 6.5 added cloning tools and tracker advancements (we still haven't tested these improvements). It still lacks b-splines as well as the realtime roto performance found in more advanced roto tools like Commotion. Tip: Red Giant Software offers a Commotion-to-AE roto import plugin.

Flint/Flame/Inferno/Fire/Smoke

Discreet’s Advanced Systems line, which includes Flint, Flame, Inferno, Fire, and Smoke, runs on SGI workstations, with prices ranging from $60,000 to over $500,000. These products offer a complete post-production solution, including very powerful and fast rotoscoping tools. The painting and cloning tools are top notch, with excellent brushes and advanced features including brush-based warping. The rotosplining functionality is excellent, though not quite up to par with Commotion due to a lack of b-splines and the inability to play a spline over a moving image in realtime. Tracking is very fast and very accurate. Many facilities using Discreet’s Advanced Systems offload roto work to Macs and PCs running Commotion, Shake or Combustion.

Combustion

In 1997, Discreet acquired Paint and Effect from Denim Software. Paint offered a vector-based painting and cloning system for Mac and PC, while Effect offered compositing capabilities. Discreet redesigned the interfaces to make the applications more Discreet-like, and merged the two applications into Combustion. Along the way, it also replaced some of the core functionality, such as keying, color correction, and tracking, with the same tool set found in Discreet’s Advanced Systems. Combustion 2.0 added additional Advanced Systems features, including the same rotosplines found in Flame. Combustion 3.0 took the product even further with an edit operator, Flash output and much more, most significantly a flow-diagram UI that many users feel more comfortable working with. Combustion rotospline files can be opened directly in the larger Inferno/Flame/Flint products.

Curious gFx Pro

gFx is a relatively new product for Mac OS X. Unlike other paint programs it is designed around a strong user interface that fully embraces moving footage, so it can import, composite, track, or stabilize footage easily. The spline shapes cannot yet be exported, and the product does not fully import Photoshop files while maintaining their structure, but this is planned for an upcoming release. The product does have specialist wire-removal tools and a very friendly, interactive user interface. One of Curious's founders is the man behind Parallax, and it shows in the depth of tools already available: 16-bit raster paint with an excellent brush engine, b-spline rotosplines with an excellent transform-points UI, motion blur on splines, grouping of splines, selective edge feathering (i.e. advanced gradients), and more.

Digital Fusion

Digital Fusion started in Sydney before its developer moved to Toronto, Canada. At one stage a version of Fusion was provided with Alias 3D, but today eyeon has gained one of the strongest positions in NT/Windows desktop compositing. eyeon has two main products: Digital Fusion and DFX+.
Digital Fusion 4 is eyeon’s flagship product and marks the ninth major release of this powerful compositor. DFX+ 4 is the 8-bit expandable version of eyeon’s image processing software, Digital Fusion. DFX+ is based on the architecture of DF4 and offers a number of significant enhancements over its predecessor, DFX, including the flexible flow, superior character generation, PSD import into separate layers for animation, and more.
Since Shake's move away from NT/Windows, DF has provided a powerful, cost-effective solution.

Shake
Shake has three options for roto: QuickPaint, QuickShape, and RotoShape. QuickPaint is a procedural paint package inside Shake. You can paint frame by frame and then view in realtime, or paint with interpolation. Because all the paint elements can be animated over time, it is a reasonable roto tool. QuickShape is a basic roto tool, now almost completely overshadowed by RotoShape. RotoShape allows variable edge softness and logical operations between roto shapes. The rotos in RotoShape are classic spline shapes with complex parent-child relationships and velocity-based motion blur. For complex rotoscoping this gives very accurate results. Both RotoShape and QuickPaint can use Shake's 2D trackers. It is worth noting that because Shake uses a node-based workflow, it is possible to paint or roto through a track or image transform.

Photoshop

The most ubiquitous graphics application in the world was probably the first digital rotoscoping tool to be used in film and video post production. Though Photoshop was initially intended for still images, it can work with motion by importing frames one at a time or importing filmstrip files from video applications. Photoshop’s brush engine is the benchmark everyone else strives for, and gives excellent control when using pressure sensitive Wacom tablets. The biggest drawback is a lack of a realtime preview of sequential frames. You will not know how well your cloning is working out until you play back your clip in realtime at full resolution. After painting numerous frames in Photoshop, the sequence must be brought back into an editing or compositing application such as Final Cut Pro to see realtime playback. This is a painfully slow way of working. And since it isn’t intended for video, it lacks travelling matte capabilities and motion tracking.

Other older products:
Commotion

Developed by Industrial Light and Magic visual effects supervisor Scott Squires, Commotion was used for years at ILM before Squires formed Puffin Designs and released it to the public. Commotion, then called Flipbook, was often seen at ILM and mistakenly referred to as the “secret ILM motion version of Photoshop”. Though Commotion looked very similar to Photoshop in some respects, its interface and tools were designed for moving images, and it was the first desktop tool to offer realtime RAM-based playback. This realtime core functionality was the foundation for all of the roto tools added as the product developed. Advanced roto tools include raster-based paint, spatial and temporal cloning, wire-removal tools, auto-paint, unlimited bezier and natural cubic b-splines, motion blur on rotosplines, and a very fast and accurate motion tracker. Commotion quickly became the de facto roto tool in the industry, replacing Matador in most post facilities. Puffin Designs was acquired by Pinnacle Systems in 2000, but sadly development has stopped: most if not all of the original developers have long since left, and no real work has been done on the product in the last three years. Importantly, Commotion curves can be exported and imported into After Effects; see AE above.


Matador

Matador was originally developed by British developer Parallax, and acquired by Avid along with Parallax’s compositing application Illusion. Available only on the SGI platform and priced around $15,000, Matador was one of the first digital rotoscoping tools to gain wide acceptance in the film post-production pipeline. Matador started as a tool made for editing still images, so many of the tools used for motion work were not well thought out. It provides excellent matte creation tools including b-splines, motion tracking, and a full set of painting and cloning tools, with full 16-bit/channel support. Avid stopped development of Matador in the late 1990s. The original developers tried to spin it off into a new company called “Blue”, but that never took off.
New roto tools have been incorporated into the Softimage XSI compositor in v4, but contrary to popular belief, these are not Matador.

Aura

Newtek is mostly known for its 3D application LightWave. Aura was a stand-alone paint application designed for film and video. It never became widely accepted in the industry, and is mostly used by LightWave users to finesse 3D renders. Advanced features include a 16-bit/channel paint engine and auto-paint. Newtek has now stopped supporting the program, and as of June 2003, with LightWave 3D 7.5, Newtek offers DFX+ at no additional cost.

Roto DV

Originally developed as a product named “Roto” by a failed start-up company called Post Digital, Roto DV was acquired by Radius, which later changed its name to Digital Origin and was then acquired by Media100. Though it was called Roto, it didn’t actually have very sophisticated roto tools, and the ones that were genuinely promising never made it into the shipping product. Media100 has no information about the product on its website, so we assume it is no longer developed or supported.


Rotoscoping in the modern post-production pipeline

Effects Painting

Effects Painting is generally used to quickly add new elements to a scene. Instead of creating elaborate particle effects in 3D simulation software like Maya, many effects can be done faster by a skilled artist using a paintbrush or airbrush in a paint application. Effects like lightning or light-sabres can be painted one frame at a time. More advanced roto tools offer auto-paint capabilities which allow you to record brush strokes and then play them back over a selected range of frames. Some roto applications also allow you to add jitter to the brushes, as well as add the ability to paint the stroke out over time.

There are two types of paint engines used in modern graphics applications: bitmap (also known as raster) and vector. Raster paint engines are destructive in the sense that they replace the pixels being painted over with the color from the paint stroke. Photoshop, Commotion, and Flame are raster-based applications. This is a very fast way of working, since the frame is immediately updated and the results can be played back in realtime without rendering. Vector-based paint engines, like Illustrator, Shake, After Effects Vector Paint, or Combustion, use points and splines to define a brush stroke and do not destroy the underlying pixels. This non-destructive process allows you to edit paint strokes at any time, though you pay a price in speed, since the strokes need to be rendered before they can be previewed in realtime. The other disadvantage is that hundreds of channels of spline information will be created even if you do not plan on using them.
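The destructive/non-destructive distinction can be sketched like this, with frames as plain lists of pixel values (the function names and the stroke format are invented for the example, not any application's API):

```python
# Raster (bitmap) paint: the stroke overwrites pixels in place, so the
# original values are destroyed, but playback needs no re-rendering.
def raster_stroke(frame, pixels, color):
    for x, y in pixels:
        frame[y][x] = color
    return frame

# Vector paint: strokes are stored as editable data and rendered on
# demand, leaving the source frame untouched (non-destructive).
def render_vector_strokes(frame, strokes):
    out = [row[:] for row in frame]      # copy, preserving the source
    for stroke in strokes:
        for x, y in stroke["pixels"]:
            out[y][x] = stroke["color"]
    return out
```

Editing a vector stroke later just means changing the stroke data and re-rendering; with the raster stroke, the original pixels are gone.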

Cloning/Paint Touch-Up

Most paint work done in the rotoscoping process is used for touching up film or video footage. This includes removing wires and rigs, removing logos, dust busting, scratch removal, etc. In these circumstances, the roto tool must be able to provide temporal and spatial cloning. Spatial cloning takes pixels from one position in the frame and paints them onto another position in the same frame. Photoshop’s rubber stamp tool is an example of spatial cloning. Temporal cloning allows you to paint pixels from one frame in a sequence onto another frame. Commotion’s Super Clone tool is an example of temporal cloning. A good roto tool should provide both options together, so users can offset position and frame number at once. Other cloning tools include wire-removal tools, which allow you to draw a line to zip out a wire. Typically, wire-removal tools clone pixels from a specified distance on either side of the line, then smear the outside pixels together to cover up the wire or scratch. More advanced wire-removal tools add advanced cloning techniques to the process. For example, Commotion looks at a specified number of pixels on either side of the line, flips those pixel values, then cross-dissolves to cover up the wire.
There are also excellent specialist plugin tools for wire removal, such as The Foundry's Furnace plugins for Shake and the tools built into discreet's inferno and flame.
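The two cloning modes can be sketched as block copies on grayscale frames (lists of rows). The function names are illustrative, and real tools of course add brush softness, alignment, and sub-pixel sampling.

```python
def spatial_clone(frame, src_xy, dst_xy, size):
    """Copy a size x size block from one position to another on the SAME
    frame (the idea behind Photoshop's rubber stamp)."""
    sx, sy = src_xy
    dx, dy = dst_xy
    # Snapshot the source block first so overlapping regions stay correct.
    block = [[frame[sy + j][sx + i] for i in range(size)] for j in range(size)]
    for j in range(size):
        for i in range(size):
            frame[dy + j][dx + i] = block[j][i]
    return frame

def temporal_clone(frames, src_frame, dst_frame, xy, size):
    """Copy a block from the same position on a DIFFERENT frame, e.g. to
    paint out a wire using a clean neighbouring frame."""
    x, y = xy
    for j in range(size):
        for i in range(size):
            frames[dst_frame][y + j][x + i] = frames[src_frame][y + j][x + i]
    return frames
```

A tool that offsets both position and frame number would simply combine the two: sample at (src_frame, src_xy) and paint at (dst_frame, dst_xy).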

Matte creation (Keying, Rotosplining, Painting)

Creating hold-out mattes, sometimes referred to as masks or alpha channels, is a major part of the compositing process. A matte is a grayscale clip used to stencil portions of the footage over the background. Anything in the black area will be obscured, and anything in the white area will show through (in some systems, such as Avid, this is reversed). Any gray area in the matte will be semi-transparent. Roto artists are expected to cut precise mattes with consistent edges that do not chatter. If the matte is sloppy, the shot will look fake; the best compositor will produce unacceptable work if provided with poor mattes. Mattes can be created with three different techniques: extraction, rotosplining, and painting. Most situations call for a combination of all three.
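The black/white/gray behaviour described above is just a per-pixel blend. Here is a minimal sketch, assuming grayscale images as lists of rows and a matte normalized to the 0.0-1.0 range:

```python
def composite(fg, bg, matte):
    """Blend foreground over background using a grayscale matte:
    white (1.0) shows the foreground, black (0.0) shows the background,
    and gray values give semi-transparency."""
    h, w = len(fg), len(fg[0])
    return [[fg[y][x] * matte[y][x] + bg[y][x] * (1.0 - matte[y][x])
             for x in range(w)]
            for y in range(h)]
```

A matte value of 0.5 therefore yields an even mix of foreground and background, which is exactly why gray edges read as semi-transparent.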

Extraction is the process of procedurally generating a black and white matte. This can be done by shooting an element against a blue or green screen, then using a color keyer to knock out the specified color. Sometimes bluescreens are not practical, and in these cases other types of extractions need to be performed. Luminance keying can extract a matte based on the luminance values of the source; either dark or light areas can be extracted into a matte. An image can be de-saturated and then leveled to create a high-contrast matte. Sometimes it is better to start with one of the color channels to create an extraction: it is always a good idea to check each color channel to see how the contrast looks, then pick the best one to level into a high-contrast matte. The Shift Channels filter in AE or Commotion can shift one of these color channels into the alpha channel, which can then be leveled into the final matte. Another type of extraction is difference keying, which generates a matte based on the differences between two clips.
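Two of these extraction methods are simple enough to sketch directly. This is a hedged toy version on grayscale frames: real keyers add edge softness, spill suppression, and adjustable falloff rather than the hard thresholds used here.

```python
def luminance_matte(frame, threshold, extract_bright=True):
    """Hard luminance key: pixels past the threshold become white (1.0),
    everything else black (0.0). Flip extract_bright to key dark areas."""
    if extract_bright:
        test = lambda v: v >= threshold
    else:
        test = lambda v: v < threshold
    return [[1.0 if test(v) else 0.0 for v in row] for row in frame]

def difference_matte(frame, clean_plate, tolerance):
    """Difference key: white wherever the frame differs from a clean
    plate (the same shot without the subject) by more than the tolerance."""
    return [[1.0 if abs(a - b) > tolerance else 0.0
             for a, b in zip(frame_row, clean_row)]
            for frame_row, clean_row in zip(frame, clean_plate)]
```

The channel-extraction trick mentioned above works the same way: treat the best-contrast color channel as the grayscale input to `luminance_matte`, then level the result.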

Rotosplining is the process of creating vector shapes to manually cut an element out of its background. These shapes can be re-positioned on various keyframes, and the software will interpolate the in-betweens. The process isn't as automatic as an extraction, but at least the computer can interpolate some of the frames for you. Good roto tools offer multiple rotosplines with the ability to keyframe each shape separately. By using multiple splines, complex elements can be cut out from their background. For example, an actor running would have separate shapes for the hand, forearm, upper arm, chest, torso, thigh, shin, and so on. By breaking the shapes down into smaller elements, it is much faster to set the keyframes by moving the shape rather than individual points, and the software will interpolate much more accurately. Commotion has the most advanced rotosplining tools on the market. Most applications use bezier splines for their rotosplines, which require tweaking both the points and the handles. Commotion has bezier splines, but the real power is in the B-splines, which are much easier to control. B-splines, also called natural splines, do not have the handles found on beziers. Instead, they always create a curved surface depending on how far apart the points are. The points default to an average tolerance, and can be interactively changed to loosen or tighten the curve. B-splines are consistently faster and easier to work with than beziers. Commotion also has the ability to play multiple shapes in realtime over the background footage. This allows you to quickly preview how your shapes are animating compared to the source footage.
Other important functions found in Commotion’s rotosplining tools include directional feathering, unlimited splines, color coding and naming splines, motion blurred mattes based on direction and velocity of the splines, a curve editor for fine tuning the motion between keyframes, rotating and scaling splines and selected points, global position offsets, and composite previews.
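The in-betweening described above can be sketched as interpolation of a shape's control points between two keyframes. Real roto tools interpolate along spline curves and velocity graphs rather than linearly, but the idea is the same; this function and its point-list representation are illustrative assumptions.

```python
def interpolate_shape(key_a, key_b, t):
    """Linearly interpolate a shape's control points between two keyframes.
    key_a and key_b are lists of (x, y) points; t is 0.0 at the first
    keyframe and 1.0 at the second."""
    return [((1 - t) * ax + t * bx, (1 - t) * ay + t * by)
            for (ax, ay), (bx, by) in zip(key_a, key_b)]
```

This also shows why breaking a figure into many small shapes helps: when each shape moves roughly rigidly, moving the whole shape per keyframe gives the interpolator far less to get wrong than dragging individual points.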

Mattes can also be generated with paint tools. This is generally the last resort, as painted mattes tend to produce inconsistent results because every frame must be painted by hand. Auto-paint functionality can help with this consistency problem, but for the most part, painting mattes should be left for final tweaking of an extracted or rotosplined matte. Advanced rotoscoping tools offer the ability to paint mattes directly into the alpha channel while continuing to see an overlay of your RGB channels. This is sometimes referred to as a Mask Overlay, or QuickMask, and is crucial for painting complex mattes.

Motion Tracking
Motion tracking is a computer-based process that analyzes a pattern of pixels in a clip and follows it, with sub-pixel accuracy, to find its exact coordinates on each frame. There are two primary uses for motion tracking: stabilization and match moving.

Once a motion tracker knows where a feature is on every frame, it can re-position the image on every frame in the opposite direction to counteract camera shake. This stabilization process works extremely well in most cases. Tracking one point lets you stabilize position. Adding a second tracker lets the software compare the relative positions of the two trackers, which can also stabilize rotation and/or scale.
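Single-point stabilization reduces to a simple calculation: for each frame, offset the image by the opposite of the feature's drift from its reference position. A minimal sketch, assuming the tracker has already produced a list of (x, y) positions per frame:

```python
def stabilize_offsets(track, reference_frame=0):
    """Given the tracked (x, y) position of a feature on each frame,
    return the per-frame offset that moves the image back so the
    feature stays locked at its reference-frame position."""
    rx, ry = track[reference_frame]
    return [(rx - x, ry - y) for (x, y) in track]
```

Applying each offset to its frame puts the tracked feature back at (rx, ry) on every frame, which is exactly the counteracting move described above. Two-point stabilization extends this by also solving for the rotation and scale that keep the pair's relative geometry constant.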


The second use for motion tracking is match moving. If you need to add a logo to a car door, you can track the handle on the door, then apply that data to the logo on another layer. As mentioned above, a second tracker can be added to match-move a logo that needs to rotate and/or scale. If the perspective changes, four-point tracking can be used; each tracker is then assigned to a corner of a CornerPin filter applied to the image.
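One-point match moving is the mirror image of stabilization: instead of cancelling the tracked motion, you add it to the new layer. A minimal sketch (the four-point/corner-pin case would solve a full perspective transform instead and is not shown here):

```python
def match_move(track, logo_anchor):
    """Given a tracked feature's (x, y) per frame and the logo's position
    on the first frame, return the logo's position on every frame so it
    appears locked to the tracked feature."""
    x0, y0 = track[0]          # feature position on the first frame
    ax, ay = logo_anchor       # where the logo sits on the first frame
    return [(ax + (x - x0), ay + (y - y0)) for (x, y) in track]
```

With a second tracker, the same idea extends to rotation and scale by comparing the two tracked points' angle and separation frame by frame.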

Serious roto tools need motion tracking to help automate tedious processes, as well as to produce convincing results. Motion trackers should allow you to track 1, 2, or 4 points simultaneously. Advanced trackers, like the one found in Commotion, allow unlimited point tracking and give access to the tracked data in text format so it can easily be used in other applications (Commotion can export plain text, as well as data formatted for AE, Flame, Digital Fusion, Electric Image, and other apps). Motion trackers should also allow you to apply the tracking data to rotosplines, and to individual points on a rotospline, for automated matte creation, as well as attaching tracker data to paint and cloning tools. Most importantly, the motion tracker has to be accurate. Flame, Shake, Digital Fusion, and Commotion have the fastest, most accurate trackers.

VFX Tips

The history of VFX in Indian films dates back to the silent era. While Hollywood films were experimenting with and executing VFX, India followed closely and caught on. Dada Saheb Phalke's silent movie Kaliamardhan, from 1919, is one such example, with amazing work. Later came movies like Paadhala Bhairavi and Maya Bazaar, which took VFX to greater heights. With the advent of computer graphics in Hollywood in the 1970s, Indian films aspired to similar work, and in the 1980s India's first computer graphics facilities emerged, led by Prasad Video Digital, which later gave rise to Prasad EFX. Prasad EFX imported the first film scanners and recorders to India and thus pioneered the digital image revolution there. The recent film "Krrish" had Hollywood VFX supervisor Craig Mumma and was executed entirely in India by Prasad EFX.