Gamma, HDBaseT, Projectors
I notice that most projectors don't have a spot-on gamma of 2.2. They are either a little above or under the ideal 2.2. Is this less important than color temperature and contrast ratio?
For one thing, there will always be some minor variation from one model to another and even one sample to another. Also, many (though certainly not all) displays have a gamma control that lets you select one of several values, and a few even let you "draw" a gamma curve. As for the importance of gamma, let me defer that discussion for a moment.
For those who don't know this term, gamma is a number that describes how a display responds to the brightness information in the input signal. It is often said to determine how quickly the display "comes out of black" as the brightness information in the input signal increases from its minimum to maximum value. The higher the gamma, the more slowly the brightness of the display increases along with the brightness information in the input signal. Among other things, this affects shadow detail, that is, how much detail you can see in dark areas of the image.
You might think that a gamma value of 1 would cause the display to increase its brightness in a linear manner, which would be true if the original video was captured in the same way. But video brightness is not captured in a linear manner. Why not? Because the electron guns in CRT displays (which comprised virtually all video displays until recently) do not respond to the input voltage in a linear manner. Instead, they produce less current at low voltages and more current at higher voltages, as shown by the solid curve in the diagram above.
In order to end up with a linear increase in CRT brightness, the video brightness had to be captured with an inverse gamma-correction curve, as shown by the dotted curve in the diagram. When brightness is captured in this way, it compensates for the natural response curve of CRT electron guns, and the result is a linear increase in the display's brightness.
These days, virtually no displays use electron guns, so they have no inherent gamma curve. However, capturing video with an inverse gamma-correction curve is still ubiquitous, so modern displays must simulate a CRT's gamma curve so that the increase in brightness from the screen is linear.
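The round trip described above can be sketched with a simple power-law model. This is a hypothetical illustration, not any standard's exact transfer function (real standards such as Rec. 709 add a small linear segment near black); the function names and the 2.2 value are assumptions for the sketch.

```python
# Simplified gamma pipeline: inverse gamma correction at capture,
# gamma applied by the display, linear light in and out.
GAMMA = 2.2  # assumed display gamma for this sketch

def encode(linear):
    """Camera side: apply inverse gamma correction (exponent 1/2.2)."""
    return linear ** (1.0 / GAMMA)

def decode(signal):
    """Display side: apply gamma 2.2, simulating a CRT's response."""
    return signal ** GAMMA

scene = 0.5                 # linear scene brightness, normalized 0-1
signal = encode(scene)      # ~0.73: the signal stores mid-gray "brighter"
screen = decode(signal)     # the display's gamma undoes the correction
print(round(screen, 6))     # back to 0.5, a linear end-to-end response
```

The point is that neither stage is linear on its own; only the combination of the capture curve and the display curve yields a linear result on screen.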
I wouldn't say that gamma is less important than color temperature and contrast ratio, but these three elements are somewhat different in character. There's a well-defined "correct" color temperature (6500K), and the contrast ratio should simply be as high as possible, but the best gamma value is somewhat subjective.
Objectively, the best display-gamma value is the inverse of the gamma-correction value used to capture the video image. This is supposedly standardized at a value of 2.2 for video, but not all video is captured at precisely that value, and there's usually no way to tell what value was used to capture a particular piece of video.
On the subjective side, the best-looking gamma setting depends on the specific content; for example, a dark movie might look better with a lower gamma, whereas a bright movie might look better with a higher gamma. And most 3D movies are helped by a lower gamma because they look so dim compared with 2D. Also, some viewers prefer a generally higher gamma value than others. In any event, the subjective difference between, say, 2.2 and 2.4 is pretty subtle.
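To put a rough number on how subtle that difference is, here's a quick comparison of on-screen brightness for the same signal at gamma 2.2 versus 2.4, again using the simplified power-law model (values normalized 0 to 1; a hypothetical calculation, not a measurement of any particular display):

```python
# Compare display output at two gamma settings for a midtone and a shadow.
for signal in (0.5, 0.1):
    out_22 = signal ** 2.2
    out_24 = signal ** 2.4
    print(f"signal {signal}: gamma 2.2 -> {out_22:.3f}, gamma 2.4 -> {out_24:.3f}")
```

At mid-gray the two settings differ by only a few percent of full brightness, while deep shadows are darkened proportionally more at 2.4, which is why the choice mostly affects shadow detail rather than the overall look.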
HDMI vs. HDBaseT
There have been several stories lately about HDBaseT and how great it is compared to HDMI. The website for the organization behind this connection scheme has a chart that compares HDBaseT to HDMI 1.4, DiiVA, and DisplayPort 1.2. The chart says that HDBaseT can carry 10.2Gbps of audio/video data but is capable of scaling up to 20Gbps. Similarly, it says that HDBaseT can accommodate Ethernet speeds up to 100Mbps but can scale to gigabit.
What does it mean when they say that HDBaseT is "capable of scaling"? Normally, I think of scaling as "adding more of something" like a server or CPU, not a wire. I couldn't find a clear answer on Google.
You're right that "scaling" can mean "adding more to something," though it can also mean "subtracting part of something," as in downscaling a 1080p image to 720p. In the case of HDBaseT, the organization behind it claims that the transmitting and receiving circuitry can currently handle 10.2Gbps of uncompressed audio/video and 100Mbps of Ethernet traffic (the same as HDMI 1.4), but the underlying technology can handle 20Gbps A/V and gigabit Ethernet, respectively. This fits perfectly with the definition of "adding more to something": in this case, more bandwidth to the connection circuitry.
HDBaseT looks interesting: it carries audio, video (including 3D and 4K), Ethernet, USB, and control signals on a single, inexpensive CAT5/6 cable, which can be a lot longer than an HDMI cable. Even more impressive, it can supply up to 100W of power to the receiving device over the same cable; for example, it could power a remote TV. This seems like a great idea as A/V devices become more network-oriented, and it's supported by LG, Samsung, Sony Pictures, and Valens Semiconductor.
Still, HDMI is so entrenched at this point, I doubt that any competing technology has a chance of supplanting it, at least in the A/V market. Then there's the issue of copy protection, which is not mentioned in any of the articles I've read so far. Of course, none of the movie studios will support any digital-connection scheme without robust copy protection, though with Sony Pictures onboard, I have to assume they've addressed this issue. For more on HDBaseT, see today's news item on HomeTheatermag.com.
Is Bigger Not Better?
I'm thinking of getting a projector, maybe an Optoma, with a movie screen that hangs from the ceiling instead of a plasma or LCD. I know I'll need blackout shades on the windows, but with a projector, I can get an image of 80+ inches. My concern is how much will the picture quality suffer on high-def content if I go with a projector vs. a 65-inch plasma or LCD?
If you get a good 1080p projector and screen, and you have complete control of ambient light, the picture quality should not suffer much at all. It's true that the larger the image size, the more you will see any defects in the picture, and smaller pictures tend to look sharper at the same resolution. But there's nothing like a big projected image to draw you into the story being told on the screen. I've seen plenty of really large projected images that look spectacular, so I wouldn't worry about it.
In general, Optoma projectors are excellent performers and offer superb value, but some of its low-end models do not provide lens shift. This severely limits your placement options, at least without resorting to keystone correction, which I avoid at all costs because it noticeably degrades the image quality. So make sure the model you get has horizontal and vertical lens shift. Another option is Epson, which also makes excellent, high-value projectors, especially models with the "UB" (Ultra Black) designation.
If you have a home-theater question, please send it to email@example.com.