Up Against the Wall
Posted Feb 1, 2003

Today's professional digital studios, from Hollywood to the hinterlands, are humming with projectors, and few are old-fashioned film models. How are studios using digital projectors, and whose are they using?

Getting into a real Hollywood studio is harder than getting into the Pentagon—I should know; I've been in both places. Hollywood's desire for secrecy comes not from the need to protect national secrets, but from the need to keep the "competition" in the dark—well, maybe that's the basis for Washington's tight-lipped policy too—and keep the other guys guessing.

Anyway, the result is that no one wants to talk on the record about how they do things or what they use to do it with, even with something as innocuous as digital projectors used to preview studio-developed projects. But I got the scoop anyway.

The Rat Pack's Stack
So what kinds of digital projectors are used in Hollywood? Well, at one large studio that favors smiling rodents—I can't say their name since they don't want to endorse anyone's products unless they get paid a lot—I saw just about every kind of projector, from small conference-room models to the latest in large-venue projectors. That's right, the happy rat people use projectors in their meetings with PowerPoint presentations just like everyone else does. They also use their projectors to review and preview the progress of all their ongoing animation projects. Since the final result will be seen on the big screen, they need to see how it's going to look beforehand. One animator told me that they constantly run digital progress reports on the big screen, so most of their screening rooms are outfitted with cool projectors.

So what kinds of projectors are at the happy rat palace? Like I said, all kinds—whatever is hot. I've seen just about every model there and, as you can imagine at a place where money flows like well water, they have the latest and greatest. They favor high resolution—the higher the better—and they like big, bright projectors. So I imagine that when JVC started shipping their QXGA (the "Q" is for quad, or four times XGA's 1024x768 pixel resolution) LCD projector early this year, the rat people got one or two or more in place. I hope that they got a discount off the advertised price of $225,000, since that's major money, but who really cares—it's Hollywood.

JVC's latest and greatest projector (the DLA-QX1G) lives up to its hype as the highest-resolution projector in the world. It has a 7000-lumen, 1000:1-contrast-ratio optical engine consisting of three of JVC's home-grown 1.3-inch diagonal reflective LCD panels, each with QXGA resolution—2048x1536 pixels per panel, for a three-panel total of about 9.4 million pixels. When I saw it operating, I thought the HD video images from JVC's QXGA projector really did appear to rival those of 35mm film (as they bragged in their press release), and that's why JVC developed it—for the electronic cinema market and for all the people working hard in Hollywood to make those movies. The best that the DLP electronic cinema crowd can do today is SXGA resolution—1280x1024 pixels on each panel, for only a little more than 3.9 million pixels total. TI may think that SXGA is sufficient, but I've heard several Hollywood people complain that SXGA doesn't give them the kind of resolution they want. So maybe JVC's 9.4 million pixels will—and according to JVC, their projector meets the 2000-lines-of-resolution standard for digital cinema set by the SMPTE DC 28.8 study group and American Society of Cinematographers (ASC) members.
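The pixel arithmetic behind that comparison is easy to check. A quick back-of-the-envelope sketch (plain arithmetic only; the resolutions come from the text above, the helper function is mine):

```python
# Three-panel pixel totals for the two resolutions under discussion.
def total_pixels(width, height, panels=3):
    """Total imaging pixels across all panels of a three-chip projector."""
    return width * height * panels

qxga = total_pixels(2048, 1536)   # JVC DLA-QX1G: 9,437,184 (~9.4 million)
sxga = total_pixels(1280, 1024)   # DLP cinema:   3,932,160 (~3.9 million)
print(qxga, sxga, round(qxga / sxga, 1))  # prints 9437184 3932160 2.4
```

So the QXGA machine puts roughly 2.4 times as many pixels on screen as the best SXGA-based DLP units.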

Why won't SXGA work well? For one thing, besides not even coming close to the SMPTE standard discussed earlier, SXGA has an oddball aspect ratio: 5:4 instead of the well-accepted computer-standard 4:3. JVC tries to get around this in their smaller projectors by offering SXGA-plus at 1365x1024 pixels, which yields the standard 4:3 aspect ratio instead of the weird 5:4. But that 4:3 aspect doesn't help provide clean 16:9 HD or DVD images, or the even wider aspect ratio required by "real" movies, which use a 2.35:1 image that is more than twice as wide as it is tall. Only a few companies make projectors with native 16:9 aspect ratios (like Sanyo's PLV-70HT), probably because, as with SXGA, the 16:9 market is so very small compared to the "unwashed" masses of XGA business projectors.

Even many of the companies who use SXGA resolution in electronic cinema don't start with 16:9 chips. They either use "anamorphic" projection lenses with standard-aspect-ratio chips—which means the lens stretches and distorts the 5:4 or 4:3 image on the LCD or DLP to match the super-wide 2.35:1 screen format—or they merely use a narrow portion of the available active area and throw the rest away. The anamorphic-lens method results in bright but distorted pixels, while the throw-out-the-extra-pixels method results in a dimmer but more accurate on-screen image. Of course, if you start that process with JVC's bright QXGA projector, you can give up some light, some pixels, or both without the result suffering in comparison to the other units.
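To see why the throw-away approach costs so much light, consider squeezing a 2.35:1 image out of a 5:4 SXGA panel. A rough sketch (the panel size and 2.35 target come from the text; the arithmetic is mine, ignoring any active-area margins a real projector might reserve):

```python
# Letterbox method: use the panel's full width but only as many rows
# as a 2.35:1 image needs; the remaining rows simply go dark.
W, H = 1280, 1024            # SXGA panel
rows_used = int(W / 2.35)    # 544 rows out of 1024
area_used = rows_used / H    # ~0.53: nearly half the pixels (and light) wasted

# Anamorphic method: use every row, but the lens must stretch each
# pixel horizontally by 2.35 / (W / H) to reach the 2.35:1 shape.
stretch = 2.35 / (W / H)
print(rows_used, round(area_used, 2), round(stretch, 2))  # prints 544 0.53 1.88
```

Either way something gives: the letterbox route discards about 47% of the panel, while the anamorphic route keeps every pixel but distorts each one to 1.88 times its native width.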

Hollywood people still use a lot of the latest and greatest SXGA projectors as well. I know because I've watched them shop for projectors and I've seen them in their screening rooms. They like the big, bright 5000-lumen-and-up SXGA units like NEC's "TriDigital" HD10K digital cinema projector as well as NEC's SX6000DC. Both of these projectors are built around three of TI's standard 0.9-inch SXGA-resolution DMD chips (not the much more expensive, electronic-cinema-enhanced "Black" chips). The SX6000 uses a 1000-watt xenon lamp to make about 5,000 lumens, and the HD10K uses a 2000-watt lamp to make around 8,000 lumens. The base prices for these units are $79K and $125K respectively—roughly a third to a half of the big JVC's price—but NEC's TriDigital circuits work hard to make up for their lack of pixels.

Most digital cinema demos I've seen rely upon cartoons—excuse me, animated movies—to demonstrate their wild color gamuts. However, live-action films are more difficult than cartoons because people need to look human. In addition, another big difference between live action movies and video (such as TV sit-coms) is that movies are generally shot a lot darker. I hate to characterize broadly, but television uses bright, well-lit scenes, whereas movies, which are designed to be watched in darkened theaters, use darker scenes with many more shades of gray shoved down in the darker sections. The result is that TV-video gamma curves are a lot different from digital cinema gamma curves.
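The gamma difference can be illustrated with a toy comparison: push the same dim signal through a brighter video-style power curve and a darker film-style one. The exponents here are illustrative assumptions of mine, not any broadcaster's or studio's actual numbers:

```python
# Apply a simple power-law "gamma" transfer curve to a normalized signal.
def apply_gamma(signal, gamma):
    return signal ** gamma

shadow = 0.10                       # a dark-scene signal level (0..1 scale)
video = apply_gamma(shadow, 2.2)    # brighter, TV-style rendering
film = apply_gamma(shadow, 2.6)     # darker, shadow-rich film-style rendering
print(round(video, 4), round(film, 4))  # prints 0.0063 0.0025
```

The steeper film-style curve drives the same shadow signal to less than half the video curve's output, which is exactly where all those extra shades of gray get packed.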

NEC's TriDigital processing consists of three special digital processing circuits. The first two, the "ColorBit" pre-processor and the "Wide ColorBit" post-processor, are designed to manipulate a digital movie's color gamut, skin tones, and gray scales so that the on-screen result mimics film's color response. The third piece of NEC's technology, called the "Deep BlackBit Decoder," was designed to extend and map the projector's dynamic range to film's black levels and dynamic range. In addition, these algorithms seem to be not just static processes that are set up once and forgotten; there appears to be some kind of dynamic, adaptive behavior as well. The result is that the light scenes are bright enough and the dark scenes are dark with wonderful detail, and yet when there's a bright flash on screen, it's so much brighter that it makes you squint.

Which works better in the digital studio: NEC's TriDigital or JVC's QXGA? Or even the true "digital cinema" projectors? Hard to tell. I haven't been able to get them all in the same room, but I have critically observed them all under similar conditions, and they are all very impressive. I think the ultimate winner of this hypothetical shoot-out will depend greatly upon the source material. I've seen the JVC projector running an HD digital tape recording made directly from an HD digital camera, and the on-screen image looked three-dimensional and alive. Maybe that is because JVC uses 10-bit digital color processing along with 12-bit gamma correction. However, the NEC uses very similar processing bit depth and delivers very similar on-screen performance with similar gamma settings. I've also watched digital movies—telecined film transfers—on both technologies, again with similar results. In fact, I liked the direct HD video camera images best on either projector. In general, I thought that the telecined film transfers (unless done from master negatives) lacked contrast and depth in comparison to the freshly shot HD images.

The SDI Initiative
Provided that you've got the right projector in your digital studio, the input signal is the real key, and one very important item there is the interface card. If you want the best image quality for a movie being created on your PC, you'll benefit from a Serial Digital Interface (SDI) card that lets you drive an SDI signal directly from your computer workstation into a high-definition tape recorder, or directly into a big projector like the JVC, the NEC, or even a lower-priced Sanyo unit with an optional SDI board. That's right—the SDI input is an optional, additional-cost item on most projectors.

If you're into "reality shows"—and who isn't these days—then the SDI option board is for you. Sanyo's $8795 option board, the POA-MD08HD, plugs into one of the two "digital" slots on the biggest Sanyo projectors (like the "Professional" UXGA 1600x1200 PLC-UF10 model) and provides a tremendous improvement in video image quality—so much so that the on-screen image gains an almost 3D-like quality. I've seen many 3D demos over the years—screenings that usually involve wearing weird cardboard glasses to watch snakes pop out of the wall. A big Sanyo projector fitted with the SDI option will not make snakes pop out of the wall. However, the images are so clear and defect-free, and have such noticeably greater bit depth than the standard input cards fed the same signal in analog form, that the on-screen images look very lifelike and, well, almost 3D.

The big Sanyo UF10 is a 7700-lumen projector based upon three 1.8-inch diagonal LCDs, each with 1600x1200 pixels. Four 200-watt lamps—800 watts in total—drive the UF10 to over 7,000 lumens, along with an advertised contrast ratio of 700:1. The UF10 carries an MSRP of $59,995 without lens, so budget $3K to $5K more for lenses along with the SDI input card's $8795—which puts you back about $70K in total. The Sanyo's price is certainly a few dollars less than a similarly equipped projector from NEC—such a deal; more resolution for less money.

One of the reasons that the SDI option board does such a good job is the HD source material, which is hard to come by, but that's another story. The serial data formats supported by the option board include most of the SMPTE 260M through 274M high-definition TV standards, and as you can imagine, you need source material created to those standards in order to use the SDI board. But however you get those signals—whether off the air, from an HD camera, or from your workstation—an SDI input board does an even better job with the digital version than can ever be done with a normal three-wire "analog" component version of the same signal. And forget about even trying to use composite video signals here—they'll be useless to you if you want the real quality that these projectors can deliver.

Anyway, as its name implies, the SDI board takes a 1.483- or 1.485-Gbps serial bitstream carrying 10 bits of data for each of the red, green, and blue channels, and converts that serial data into 10 bits per color of parallel data (along with syncs) that can be fed into the projector's digital backplane at a high-definition "base-band" data rate of about 75 MHz. The Sanyo's on-board video processor does little else besides allow for the standard contrast and color adjustments before feeding the eight most significant bits of that digital bitstream into the corresponding red, green, and blue LCDs. That's right, Sanyo's LCDs are only 8 bits. I wish there were more—like the big JVC's and NEC's 10 to 12 bits into the imager—and I wish that the Sanyo's 10-bit gamma processing could replace those lost bits. But forget about that: with a true high-definition, 10-bit digital video signal, no additional fooling around needs to be done—just use as much of the signal as you can. The digital signal is supposed to be good to start with, and if you added in some processing, the image could be degraded. All that's required is a pure path into the projector, and that's what an SDI input board provides.
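What "feeding the eight most significant bits" amounts to can be sketched in a couple of lines (the helper name is hypothetical; the shift-by-two is just standard bit truncation):

```python
# Keep only the 8 most significant bits of a 10-bit color sample:
# the 1024 input levels collapse onto 256 output levels, with four
# neighboring input codes landing on each output code.
def msb_truncate_10_to_8(sample: int) -> int:
    assert 0 <= sample <= 1023, "expects a 10-bit code"
    return sample >> 2

# Four adjacent 10-bit codes become indistinguishable at 8 bits:
print([msb_truncate_10_to_8(v) for v in (512, 513, 514, 515)])  # prints [128, 128, 128, 128]
```

That collapse is exactly the fine shadow detail an 8-bit imager gives up relative to a 10-bit one.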

That may not seem like much effort, but even with an 8-bit projector like the big Sanyo, it amounts to a significant improvement over watching the exact same signal in analog form going into the standard five-BNC input block through the three-color component lines. I split the signal coming off the back of one of Sanyo's HD tape players—the HVD-M100, which outputs both three-wire component and one-wire SDI signals—and ran those two signals into identical SXGA-resolution XF30 projectors. Sanyo's SDI board gave a very significant advantage in small-detail contrast, color, and clarity over the standard board. If the analog form of the HD signal is used, the digital data first has to be converted into analog data (in the tape playback machine), with some loss of accuracy, as always happens with D-to-A conversions. Then the analog data goes into the standard five-BNC video processing board, where it gets filtered some more and then converted back into digital form again—at the standard 10-bit depth—before going to the 8-bit LCDs on the way to the big screen. You would think that would be okay, but no matter how well the D-to-A and A-to-D conversions are done, the result will be no better than what a copy of a copy allows.
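The copy-of-a-copy effect is easy to model. This toy simulation—entirely my own, with an assumed 0.1% RMS analog noise figure, not a measurement of any real deck or projector—quantizes a 10-bit signal, converts it to "analog," adds a little noise, and redigitizes it:

```python
import random

def dac(code):                  # 10-bit code -> "analog" level in [0, 1]
    return code / 1023.0

def adc(level):                 # "analog" level -> nearest 10-bit code
    return min(1023, max(0, round(level * 1023)))

random.seed(1)                  # reproducible toy run
original = [random.randrange(1024) for _ in range(10000)]
# One D-to-A / A-to-D round trip with a small amount of analog noise
# (assumed 0.1% RMS, roughly one LSB at 10 bits).
copied = [adc(dac(c) + random.gauss(0, 0.001)) for c in original]
errors = sum(1 for a, b in zip(original, copied) if a != b)
print(f"{errors} of {len(original)} samples changed after one generation")
```

Even with noise of only about one least-significant bit, thousands of samples shift by a code or two per generation—and each further conversion compounds the damage, which is the whole argument for keeping the path digital end to end.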

Anyone who's ever stood in front of a copy machine knows that the second- or third-generation copy is not as sharp and crisp as the first. So if you want your HD movies to look as good as they should, with the potential to propel your viewing experience into the "Third Dimension," then spend some money and get some of those "pure channel" SDI boards so that you can see the original HD material as it is supposed to be seen.

Studio to Cinema
More expensive projectors like NEC's HD4K and JVC's QX1G accept more bits—10, 11, or 12 per color, depending upon where and how you count—while others, like Sanyo's big guns, can only let eight bits all the way through to the imagers. Most HD signals (and the D5 tape format) can carry 10 bits or more per color, and the SDI standard strives to maintain all those bits. Three-wire component signals are obviously analog signals—nothing wrong with analog—but those analog signals need to be converted into digital somewhere behind the plugs on the projector. Every time you re-convert a signal, you lose definition.
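The practical difference those bits make is in how many distinct shades each channel can reproduce; it's plain powers of two:

```python
# Distinct shades per color channel at each bit depth, and the total
# number of mixable R/G/B colors (shades cubed).
def shades(bits):
    return 2 ** bits

for bits in (8, 10, 12):
    print(f"{bits}-bit: {shades(bits)} shades/channel, "
          f"{shades(bits) ** 3:,} total colors")
```

Going from 8 to 10 bits quadruples the shades per channel (256 to 1,024), which is where the smoother gradients and deeper-looking shadows come from.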

So, if you want the ultimate in digital cinema for your studio, think about getting one of TI's digital cinema projectors to use for viewing the dailies—like one of the DLP digital cinema projectors from Christie, the DCP-H or DCP-I. These projectors carry along some extra bits—up to 15 or so "color processing" bits per color—for the ultimate in bit depth and virtual reality. But in spite of all those bits, they're still just SXGA-resolution machines, and along with a lack of pixels, those units use the traditional film projector's unwieldy (and huge, 7000-watt) lamp house for light, plus an anamorphic lens to squeeze the SXGA imager's 5:4 aspect into filmic shape.

The Christie projectors (which are priced below JVC's ultimate machine) also match film's 24 frames per second—something that the big, higher-resolution Sanyo projectors supposedly cannot do—and they provide more contrast, up to 1350:1 according to their specifications, due to their use of TI's newest "Black" DMD chips. NEC's TriDigital processing is supposed to make the more expensive "Black" chips unnecessary, and NEC also provides 24fps processing, as does the big JVC. However, one thing that "true" digital cinema projectors like the Christie units do provide is more color saturation. I've measured the optical systems in most of these projectors—including TI's digital cinema units—and I've found far more color saturation in the digital cinema systems.

If you need the ultimate in color saturation along with the maximum bit depth and 24fps processing, look no further than Christie or the other TI cinema partners. But if you want the best in pixel resolution, you'll need to go elsewhere. Perhaps Hollywood has still not chosen its ultimate projector, and I guess that's why the folks in the Hollywood rat race like to keep shopping around so much.

Companies Mentioned in this Article

Christie, www.christiedigital.com

JVC America, www.jvc.com

NEC Projectors, www.nec-pj.com

Sanyo Fisher Company, www.sanyo.com

Texas Instruments, www.ti.com