Saturday 10 April 2010

Film vs. Videotape

"Ultimately, digital image acquisition will definitely replace film for TV production. You don't need a crystal ball to know that."

-Ed Nassour, senior VP of postproduction,
20th Century Fox Television

New Insights in an Ongoing Debate

Which is better: film or videotape?

The fact is, each is superior in a number of ways; it depends on your needs.

At the same time we need to acknowledge the fact that much of the information about the "inferiority of video" is no longer valid. Even so, old beliefs persist.

First, let's look at the advantages of film.

Advantages of Film

The TV Production modules concentrate on television, so here we tend to emphasize the advantages of video. Even so, film has some advantages.

After more than 100 years of film production, a rich and highly sophisticated tradition has grown up around film. Unlike video production where newcomers may quickly find themselves functioning as camerapersons and even in some cases as directors, the feature film tradition typically involves long, highly competitive apprenticeships.

Less motivated people tend to drop out in favor of those who are more talented, persistent, and dedicated.

Because of the rich heritage of film, the production and postproduction processes have not suffered from a lack of talent or supporting industries. In Southern California alone there are thousands of companies that specialize in various aspects of film production.

"We have a big industry and a lot of vested interests, and it takes a long time to adapt.... I don't think we will be doing much of anything on film very soon; most of the things that used to be the advantages of film are gone."
-Dean Devlin, Director of Photography

Comparing the closing credits of a major film feature with those of a typical video production provides some measure of the differences that still exist between the two media. (Try sitting through the closing credits of Pearl Harbor or Ratatouille!)

For decades, film has enjoyed rather consistent worldwide standards. A 16mm film can be broadcast on any of the world's broadcast systems, regardless of the broadcast standard, and a 35mm film can be shown in almost any theater in the world.

Video, on the other hand, has not only progressed through numerous tape formats, but there are now a half-dozen incompatible broadcast standards being used in various parts of the world. For producers with an eye on international distribution, film has for decades been the obvious choice.

The line between the two production approaches has now become a bit blurry. Today, many productions start out on film, and using a DI (Digital Intermediate) step the film is immediately transferred to video for subsequent postproduction work.

Assuming that the production has to end up on film, the video is then converted back to film as a final step. But as theaters continue to convert to digital projectors, this "final step" is no longer necessary in many theaters.

Relative Equipment Durability

It is commonly assumed that film equipment is more durable than digital video equipment -- primarily because film cameras are mechanically much simpler.

Although this may be true, professional video equipment--especially with the new solid-state recording media--is now being successfully used under the most extreme conditions.

However, when stored and used in extreme temperatures, color film stock will suffer problems such as color shifts.

Technical Quality Compared

It is commonly believed that the quality of 35mm motion picture film as viewed on television is better than video. If we are talking about the artistic differences, then film may still have an advantage for the historical reasons we've noted.

Although artistic differences between film and videotape are difficult to measure, purely technical differences are not. This brings us to the following statement.

    If production conditions are controlled and if comparisons are made solely on the basis of sharpness and color fidelity, the best 35mm film will be slightly inferior to the best video, assuming the latest professional-quality video equipment is used and the final result is broadcast.

As controversial as this statement might be with some film people, the reason becomes obvious when the production process for each medium is traced.

First, it is important to realize that if a signal from a video camera is recorded with the highest-quality recording process, no discernible difference will be noted between the picture coming from the camera and the picture that is later electronically reproduced.

With film intended for broadcast the process is far more complex.

First the image is recorded on negative film. Typically, the original negative film is then used to make a master positive, or intermediate print. From the master positive a "dupe" (duplicate) negative is created, and from that a positive release print is made. This adds up to a minimum of three generations.

At each step things happen: color and quality variations are introduced by film emulsions and processing, there is a general optical degradation of the image, and the inevitable accumulation of dirt and scratches on the film surface starts.
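This generation-to-generation degradation compounds. A minimal sketch of the idea, where the 10% loss of fine detail per printing generation is an assumed figure for illustration only, not a measured value:

```python
# Illustrative model: quality loss compounds across film printing generations.
# The 10% loss per generation is a hypothetical figure, chosen only to show
# how three generations of printing multiply the degradation.

def detail_retained(generations: int, retention_per_generation: float = 0.90) -> float:
    """Fraction of original fine detail surviving after n printing generations."""
    return retention_per_generation ** generations

# Camera negative -> master positive -> dupe negative -> release print
# is a minimum of three printing generations.
for n in range(4):
    print(f"after {n} generation(s): {detail_retained(n):.1%} of detail retained")
```

Under that assumed 90% retention per step, the three-generation release print keeps only about 73% of the negative's fine detail.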

After all of these steps, the film release print is projected into a video camera to convert it to an electronic signal, which is where the video signal started out in the first place.

There is also this: Unlike video, film is based in a mechanical process. As the film goes through the gate of a camera and projector there is the inevitable loss of perfect registration. This is easy to see when you sit close to a large motion picture screen and note the ever-so-slight variations in the placement of sharp (primarily horizontal) lines. This is often referred to as judder, and it results in a slight blurring of projected film images.

To understand more of the film-video sharpness difference we must bear several other factors in mind. Film is theoretically capable of resolving several times more detail than standard video.

But, since film loses much of its sharpness on its route from film camera to television camera, electronic image enhancement is routinely used to restore lost sharpness when the film is converted to video. Although image enhancement sharpens the overall look of the film image, once lost, subtle details cannot be enhanced back into existence.

At the same time video is becoming capable of resolving ever-greater levels of fine detail. Eastman Kodak has announced a CCD chip capable of resolving 16,777,216 pixels per square inch, which is double the resolution of standard 35mm film.

But the sharpness of video isn't necessarily a plus.

Many people think the slightly softer look of film is actually one of its advantages. For one thing, the soft ambiance surrounding the film image is subconsciously if not consciously associated with "Hollywood filmmaking."

There are also subtle tonal and color changes with film, which, while not representing the true values of the original subject matter, are subconsciously associated with film and its historical heritage.

At the same time, the slightly sharper image of video is associated with news and the live coverage of events, subject matter that is very much in contrast to the normal fare of feature films.

Coping With Brightness Ranges

Until recently, video cameras simply could not handle the brightness range of film. (Remember, 30:1 is the maximum brightness range for many home receivers.)

If film exposure is carefully controlled, a bright window in the background of a scene, for example, will not adversely affect the reproduction of surrounding tones.

As a result of early experience with professional tube-based video cameras, many producers concluded that film had a major advantage over video. And, at that point, it clearly did.

But times have changed. One video camera (the Phantom 65) demonstrated at the 2008 NAB convention can handle a 10,000,000:1 contrast ratio -- or 23 f-stops of exposure latitude in the same scene.

In a demonstration the camera was able to clearly see the burning filament in a clear, lit 500-watt bulb and, at the same time, reproduce background objects.
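The 23-stop figure follows directly from the quoted contrast ratio: each f-stop represents a doubling of light, so the stop count is the base-2 logarithm of the brightness ratio. A quick check:

```python
import math

def stops_from_contrast(ratio: float) -> float:
    """Each f-stop doubles the light, so stops = log2(brightness ratio)."""
    return math.log2(ratio)

# The 10,000,000:1 ratio quoted above works out to about 23 stops:
print(f"{stops_from_contrast(10_000_000):.1f} stops")   # ~23.3
# For comparison, the ~30:1 range of many home receivers is under 5 stops:
print(f"{stops_from_contrast(30):.1f} stops")           # ~4.9
```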

The Red One video camera (shown here) from the Red Digital Cinema Camera Company has a 4K resolution, exceeding the best broadcast HDTV.

Other digital cameras being used in production are Sony's F35 and F23, Panavision's Genesis, and Arri's Arriflex D-21.

This graphic shows the relative pixel resolution of several ultra-high definition formats.


There is also a less obvious difference between film and video.

With NTSC television the film-to-video conversion process requires some technical "fancy footwork" that results in the introduction of almost subliminal effects associated with the film image on TV.

NTSC video is transmitted at 30 frames per second, while the frame rate for film is 24 per second. (The machine shown on the right converts film images to video.)

Because there is no nice, neat math associated with dividing 30 by 24, the only way to make the conversion is to regularly scan some film frames twice.

This results in a subtle high-speed jitter, a type of artifact that has become associated (if only subconsciously) with the film image on TV.
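The cadence described above, commonly called "3:2 pulldown," can be sketched in a few lines: alternating film frames are scanned for two and then three interlaced video fields, so one second of 24-frame film yields the 60 fields that make up 30 video frames.

```python
def pulldown_fields(film_frames: int) -> list:
    """Assign interlaced video fields to film frames in the 3:2 pulldown
    cadence: alternating film frames get 2 fields, then 3 fields."""
    fields = []
    for i in range(film_frames):
        fields.extend([i] * (2 if i % 2 == 0 else 3))
    return fields

fields = pulldown_fields(24)   # one second of film
print(len(fields))             # 60 fields = 30 interlaced video frames
print(fields[:10])             # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]
```

The frames scanned for three fields are the source of the subtle jitter: one of their fields repeats across two successive video frames.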

With the SECAM and PAL broadcast standards used in non-NTSC countries the conversion process is easier. Both of these video systems operate at 25 frames per second -- very close to the 24 fps used in film. The 1 fps difference is almost impossible to detect, so adjusting the film camera or projector rate to 25 fps is a common solution.
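The side effect of the 25 fps approach is a roughly 4% speedup, which slightly shortens running time:

```python
# Running 24 fps film at 25 fps for PAL/SECAM speeds everything up by ~4%.
SPEEDUP = 25 / 24   # ~1.0417

def pal_runtime(film_minutes: float) -> float:
    """Runtime of a 24 fps film when played back at 25 fps."""
    return film_minutes / SPEEDUP

print(f"{SPEEDUP - 1:.1%} faster")   # 4.2% faster
print(f"{pal_runtime(90):.1f} min")  # a 90-minute film runs ~86.4 min
```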

DI - the Intermediate Digital Step

By 2005, major motion pictures were using the advantages of a digital intermediate (DI) as a step between the color negative film shot in the camera and the final release print copied for use in theaters. (Here, we are talking about films made for theatrical release.)

Scanning the film into digital form provides much more control over color correction and artistic color changes.

Of course, once in digital form, special effects are much easier and less expensive than with film.

Uncompressed Video

One of the quality compromises involved in HDTV has been the need to compress the signal.

However, as the cost of digital recording and storage has decreased we are seeing some production facilities move to uncompressed (4:4:4, 10 bit) video recording and editing. Silence Becomes You, released in 2005, was billed as the world's first uncompressed 4:4:4 feature production--shot with a video camera and later converted to film.

Once this approach is more widely adopted, we'll see a major jump in image quality and post-production speed and economy, making the switch to HDTV even more attractive.
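A back-of-the-envelope calculation shows why uncompressed recording was long impractical. The 1920x1080 resolution and 24 fps frame rate below are assumptions for illustration; the production cited may have used different parameters.

```python
# Data rate for uncompressed 4:4:4, 10-bit video.
# Resolution and frame rate here are illustrative assumptions.

def uncompressed_rate_mbps(width, height, fps, bits_per_sample=10, channels=3):
    """Data rate in megabits per second for uncompressed video (4:4:4 means
    no chroma subsampling, so all three channels are at full resolution)."""
    return width * height * channels * bits_per_sample * fps / 1e6

rate = uncompressed_rate_mbps(1920, 1080, 24)
print(f"{rate:.0f} Mb/s")                    # ~1493 Mb/s
print(f"{rate / 8 * 60 / 1000:.1f} GB/min")  # ~11.2 GB per minute
```

At roughly 11 GB per minute of footage, the falling cost of storage is clearly what makes this approach viable.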

Digital Cinema

So-called digital cinema or e-cinema (electronic cinematography) is rapidly gaining ground, especially since it is becoming almost impossible for most theater patrons to distinguish between it and film.

E-cinema is now preferred by many independent "filmmakers," and major "film" competitions now have more entries on video than on film.

The major weakness in the move to digital cinema has been with projectors. But, the latest generation is based on projector imagers with a 4-megapixel resolution--twice that of the previous generation of projectors. The detail possible with these projectors exceeds that of 35mm film projection.

Now the major stumbling block for digital cinema is the great initial investment in equipment--the projector and the associated computer. However, once this investment is made, major savings can be realized.

Directors of Photography in film often resist moving to video equipment because "everything is different." It can take decades to move up to a Director of Photography position, and old habits and patterns of thinking are difficult to break.

For this reason, video camera manufacturers have made some of their cameras resemble the operation of film cameras.

The video camera shown here uses standard 35mm motion picture lenses.

This means that directors of (film) photography do not have to abandon all that they have learned about the lenses.

Previously, we mentioned the almost subliminal effect that the NTSC film-to-video process creates. To make video look even more like film, even this "double-step" effect (resulting from the extra film fields being regularly added) can be electronically created. In fact, everything, right down to electronically generated random specks of "dust," can be added to the video image! (For a time -- and for questionable reasons -- video was being made to look like bad film by adding a host of electronic scratches, dirt, and even flash frames.)

This extreme step aside, the first practical step in creating a "film look" with video is the use of filters. A number of filters are commonly used to make video look like film (if that's your goal).

Film also can have a more saturated color appearance. With sophisticated video equipment this can be simulated by adjusting the color curves in a video editor. It can also be addressed in postproduction by channeling video through computer programs such as Photoshop CS3, After Effects, or Chroma Match.

By softening the image to smudge the digital grid of video, and reducing the contrast, you can take additional steps to make video look like film.
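The softening and contrast-reduction steps described above can be sketched with simple image arithmetic. This is a minimal illustration on a synthetic test pattern, not a production grading workflow; the blur kernel size and contrast factor are arbitrary illustrative choices.

```python
import numpy as np

def soften(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Soften with a simple box blur to smudge the hard digital grid."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def reduce_contrast(img: np.ndarray, factor: float = 0.8) -> np.ndarray:
    """Pull values toward mid-gray to tame video's harder contrast."""
    return 0.5 + (img - 0.5) * factor

# A hard-edged test pattern: left half black, right half white (values 0..1).
frame = np.zeros((8, 8))
frame[:, 4:] = 1.0

graded = reduce_contrast(soften(frame))
print(graded.min(), graded.max())  # the full 0..1 range is now compressed
```

Real grading tools apply the same ideas with far more sophisticated curves and spatial filters.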

Of course, the question is why would you want to degrade the quality of one medium to match another?

Possibly it's a matter of what people get used to. When people first heard high-fidelity audio, they didn't like it. After listening to music and voice for decades on low quality radio and phonograph speakers, they had become used to this as "the standard" in audio quality, and anything else--even something much better--didn't sound right.

The feature film 28 Days Later, released in mid-2003, was shot with video equipment and did very well at the box office.

By 2007, a number of feature films had been shot in high-definition video and then transferred to 35mm film for release in theaters.


Single-Camera, Multiple-Camera Production Differences

Purely technical considerations aside, the primary underlying difference between film and video lies in the way each is shot.

Film is normally shot in a single-camera style, and video is normally shot in the studio using a multiple-camera production approach.

In film each scene can be carefully set up, staged, lit, rehearsed, and shot. Generally, a number of takes are made of each scene and the best one is edited into the final production. As they strive for perfection in today's high-budget feature film productions, some directors re-shoot scenes many times before they are satisfied. (Possibly the record is held by one well-known film director who reportedly shot the same scene 87 times.)

Quite in contrast, video is generally shot with several cameras covering several angles simultaneously. Instead of lighting being optimized for one camera angle, it must hold up for three or more camera angles at the same time. This means that it's generally lit in a rather flat manner, which sacrifices dimension and form. And, with the exception of single-camera production, multiple takes in video are not the rule.

By replacing film with videotape and speeding the production process, George Lucas saved at least $3 million on the 2002 Attack of the Clones.

--Larry Thorpe, Senior VP, Sony Electronics

Film and Videotape Costs

The minute-for-minute cost of 16mm and 35mm film and processing is hundreds of times more than the cost of broadcast-quality video recording.

And, unlike film, tape is reusable, which results in even greater savings.


Offsetting the savings with video is the initial cost of video equipment.

Depending on levels of sophistication, the initial investment in video production and postproduction equipment can easily be ten times the cost of film equipment.

The cost of maintaining professional videotape equipment is also greater -- although this is changing with the adoption of computer disk and solid-state recording.

On the other hand, there is a substantial cost savings in using video for postproduction (special effects, editing, etc.). As we've noted, for these and other reasons film productions intended for television are routinely transferred to videotape. This transfer can take place as soon as the film comes out of the film processor.

Reversal of the negative film to a positive image, complete with needed color correction, can be done electronically as the film is being transferred to videotape or computer disk. From this point on all editing and special effects are done by the video process. The negative film is then locked away in a film vault and kept in perfect condition.

Even for film productions intended for theatrical release, major time and cost savings can be realized by transferring the film to videotape for editing. Once edited, the videotape is then used as a "blueprint" for editing the film.

Will Video Replace Film?

So will video or digital imaging soon replace film for primetime TV production?

Yes, eventually, just as it will eventually replace film in motion picture work. The move is well underway in both areas. Aesthetic issues aside, the transition is being driven by pure economics.

In 2009, the majority of production done for TV was shot digitally.

