Friday, May 19, 2006

The Less Discussed Complications of HDMI, HDCP and DVI in our HD world

The following article is my attempt to understand this technology and its complications. It is RAW (Researching As I Write) and comments as to mistakes are greatly appreciated.

Buying an LCD TV today seems a daunting task, in an era where the world's laws are reeling from an onslaught of digital content piracy, new distribution methods and usage paradigms for video content, ever more powerful consumer electronics, and a new renaissance for Digital Rights Management after a multitude of failed attempts. It was for me. Three factors inspired the research culminating in this blogpost:

  1. My company has become one of the major players for LCD panels and PC VGA cards in Singapore
  2. I had to buy 2 LCD TVs in the past month for personal usage, and embarrassingly, I did it blindly despite the fact that we sell thousands upon thousands of LCD panels.
  3. Most of all, an excellent forum thread in Hardwarezone titled 'Discussion on LCDTV vs LCD monitor'
Don't let the acronym-heavy title intimidate you. The results of my research pissed me off, but letting the acronyms scare you away is playing right into the hands of the powers that be. Just to prime you up, here's a very simple glossary I found at about.com:


HDCP - HIGH-BANDWIDTH DIGITAL CONTENT PROTECTION

HDCP stands for High-Bandwidth Digital Content Protection and was developed by Intel Corporation. This link explains how it works, Electronics Engineer version. MT - One comment about HDCP - it's Content Protection, not Copy Protection. What's the difference? Copy protection prevents copying/recording and all its consequences like time shifting, pausing, etc., which is stupid, as these functions are DESIRED and USEFUL. Content Protection does not prevent copying of content, but allows the Licensor and Licensee of the Content Protection scheme to control the usage of this content - putting the control of who can view this content, at what quality level to show it, what devices can view this content, what devices can record this content, how much of this content is permitted to be stored, etc., in the hands of the device manufacturer, who can do whatever he wants within the terms of his HDCP license. This is an essential but subtle difference - it makes the decisions of the designer of an HDCP-licensed product FINAL, and not subject to second-guessing by hackers etc. until HDCP is indeed irretrievably broken. And it enables future changes to the rules, without needing to change the entire content protection scheme. So, CONTROL, not PREVENTION.
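To make the control-not-prevention point concrete, here's a toy Python sketch. This is purely my own illustration, not the real HDCP protocol or its policy format - the operation names and policy dictionary are made up:

```python
# Toy model of "control, not prevention": a licensed device consults a
# policy set by the content owner instead of flatly blocking everything.
# (Illustrative only; real HDCP policies look nothing like this dict.)

def allowed(policy: dict, operation: str) -> bool:
    """Return whether an operation is permitted under the content's policy.
    Anything not explicitly granted is denied."""
    return policy.get(operation, False)

# Hypothetical policy a disc or broadcast might carry.
policy = {
    "display_full_res": True,    # show at full resolution on approved paths
    "display_downsampled": True, # lower-quality display always permitted
    "record": False,             # licensed recorders must refuse
    "time_shift": True,          # pausing/time-shifting still allowed
}

print(allowed(policy, "record"))      # recording denied by the content owner...
print(allowed(policy, "time_shift"))  # ...but time-shifting is permitted
```

The point: nothing is "prevented" outright - each capability is a switch the licensor controls.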


DVI - DIGITAL VISUAL INTERFACE

DVI was created by the Digital Display Working Group, and stands for Digital Visual Interface. It allows for a high speed uncompressed connection between a digital television, personal computer, and other DVI-based consumer electronics devices. The input is something like you’d find on the back of your computer. One big benefit of DVI is the uncompressed transfer of high definition video.

While you don’t see it when you receive HD programming, it goes through a conversion from the source to the set-top box to your screen. Usually, component cables are used to transfer the red-blue-green signal. The advantage of DVI is that it only requires one cable to transfer the red-blue-green signal, and the speed it transfers an image is significantly faster than the analog component cables, which benefits the overall viewing experience on DLP, Plasma, and LCD televisions.

Combined with HDCP, DVI was the standard for digital television until a few years ago when HDMI was introduced.

HDMI - HIGH-DEFINITION MULTIMEDIA INTERFACE

HDMI stands for High-Definition Multimedia Interface, and like DVI, it allows for the uncompressed data transfer of video between a digital TV and HDMI-enabled consumer electronics devices. The big difference between HDMI and DVI is that HDMI transfers the video and audio signal. DVI only carries the video signal.

According to the HDMI’s official Web site, the advantages of HDMI are:
1) The highest quality video seen and audio heard
2) Fewer cables behind the TV means less mess and confusion-free connection
3) Automatically configures remote controls of devices connected by HDMI
4) Automatically adjusts video content to most effective format
5) HDMI is compatible with DVI, which means it will allow connection to PCs

Because it combines the audio and video signal, HDMI has tremendous support from the MPAA. It was created by some of the heavyweights in the consumer electronics industry - Hitachi, Matsushita, Philips, Silicon Image, Sony, Thomson, and Toshiba. The HDMI input is similar to a USB connector on a PC.

HD

High Definition video, comprising a picture higher in resolution than what we were using for the past 30+ years. The only variants worth considering are 720p, 1080i and 1080p, where the numbers signify the number of horizontal lines, and p and i signify progressive and interlaced line drawing patterns, a legacy from the old electron-gun scanning paradigm of traditional TVs.

Now that you have the basic official stand, here are the dirty secrets from my perspective.

First of all, if you're buying an LCD TV today, there's a lot of bullshit which is going to hit your face courtesy of salespeople, forum posts and magazine articles, and worst of all, official positions which don't tell you the whole story. Well, what I write here may be bullshit too, misinformed and ill-researched. So read this article AND many other articles on the web. Google is your friend.

Keep it simple if DVD quality is good enough

There is one kind of guy I can simplify life for, just instantly! If you:

1) are NOT the kind of guy who downloads video torrent files, and are not planning to connect your PC to your LCD TV, and yet
2) need to buy an LCD TV right now, because your old CRT TV is failing, and
3) need the ability in your life sometime within the next 3 years to play all those HD broadcasts on your TV in DVD quality at least,
4) Want to play all your DVDs at the best possible resolution the DVD can garner, and
5) Firmly believe that DVD-quality is good enough

then keep your selection process simple. Get ANY widescreen 16:9 LCD TV with a component input. With these sets, you'll be equipped to enjoy DVD-quality video without issues and without needing to worry about anything else. And they can get really cheap if you don't ask for stuff like DVI, HDMI, HDCP etc. Many of these may be stock closeouts, so you'll get a great bargain without trying hard.

Why not worry about HD? Why is my recommendation above devoid of any digital video interface, cutting through all the HDMI/HDCP/DVI complications? Simple: LCD TV, DVD player and set-top box manufacturers, and broadcasters, are not going to eschew the component interface. They DARE NOT. If they take out all the composite, S-Video and component connections - NOBODY WILL BUY THEIR TV OR BOX. And it costs them next to nothing to implement these interfaces anyway. So you'll still get HD in some way - the component, S-Video and composite RCA interfaces will still be there, and you can play future HD content over the component interface at least in DVD quality, assured. You're covered for playing any HD stream in at LEAST DVD quality anytime from now to a foreseeable future 3 to 5 years out. Case closed.

The rest of this blogpost will focus on the digital interfaces, with more on how they have been engineered TO UNDERPERFORM THE TECHNOLOGY'S POTENTIAL, and less on what they CAN do. Suffice to say, these interfaces, DVI and HDMI, can do MUCH MORE than is reasonably demanded of them. DVI today, with its dual-link capability, can support stratospheric resolutions like 2560 x 1600 WQXGA - so there are NO technical limitations in these interfaces which we can reasonably push against within a foreseeable timeframe of 3 or perhaps 5 years.
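A quick back-of-envelope check on why WQXGA needs dual-link: a single DVI link tops out at a 165 MHz pixel clock, and 2560x1600 at 60 Hz blows well past that. The 10% blanking overhead below is my own rough assumption, not a spec figure:

```python
# Rough pixel-clock estimate showing why 2560x1600@60 needs dual-link DVI.
# Single-link DVI is capped at a 165 MHz pixel clock.

SINGLE_LINK_MAX_HZ = 165_000_000  # single-link DVI pixel clock ceiling

def pixel_clock(width, height, refresh, blanking_overhead=1.10):
    """Estimate the pixel clock; real timings add blanking intervals,
    approximated here by a 10% overhead (an assumption, not a spec value)."""
    return int(width * height * refresh * blanking_overhead)

print(pixel_clock(2560, 1600, 60) > SINGLE_LINK_MAX_HZ)  # True: dual-link needed
print(pixel_clock(1280, 720, 60) > SINGLE_LINK_MAX_HZ)   # False: 720p fits easily
```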

Most of all, I'll now speak about things close to the hearts of people who want to connect PCs to LCD TVs and use them at their highest resolution - people who treat their cable, terrestrial and satellite feeds as secondary, and want to use their LCD TVs as new media devices: displaying their Windows desktops, high resolution web pages, photos, very high resolution videos, or all of these windows combined on one screen.

OK, the way things are going, here are the pitfalls not usually discussed, but you can find them on the web.

1) Video Degradation because of HDCP, or the lack of it

If the FCC (US government) and MPAA (Hollywood guys) have their way, any television displaying a program encoded with HDCP but not connected through DVI or HDMI might be degraded - meaning a high definition signal of 1080i will be automatically converted to a lower resolution. Matthew Torres of About.com thinks the signal may be blocked outright, but I think that's never going to happen - you'd still at least get DVD resolution. In a limited way, I've already been proven right: on an HD DVD or Blu-ray disc, if the manufacturer or publisher of the disc has set the HDCP protection flag to ON and the player is connected to a non-HDCP-compliant LCD TV, the player will output a downsampled signal of 540p. (More good news: Sony has declared that they will NOT set the downsample flag, and a US court has already struck down the FCC's broadcast flag regulations.)

It is clear that sometime in the near future, bad things (from the perspective of a pixel geek) are going to happen if you use a TV not equipped with HDCP to view HDCP-encoded video. If you MUST view your official, legal content - future high definition video discs, HD cable, HD digital terrestrial broadcasts, HD satellite feeds - at the very highest resolutions it was transmitted at, AT THE EXPENSE OF COMPUTER VIDEO FEEDS, you MUST get an LCD TV with the evil HDCP embedded in its heart. There is no alternative short of breaking the HDCP scheme with some third party device like Spatz-Tech's DVIMAGIC, which might be illegal. What's more, breaking the HDCP scheme is not as straightforward as obtaining the keys, as compromised keys can be revoked as part of the HDCP scheme.
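The downsample behaviour I described above is simple enough to sketch. This is illustrative logic only, modelling what the disc's protection flag does to a non-compliant display, not actual player firmware:

```python
# Sketch of the protection-flag downsample: when the disc sets the flag
# and the display is not HDCP-compliant, the player outputs 540p instead
# of the full-resolution signal. (Illustrative logic, not real firmware.)

def player_output(content_lines: int, flag_on: bool, tv_hdcp: bool) -> int:
    if flag_on and not tv_hdcp:
        return 540           # constrained output to the non-compliant TV
    return content_lines     # full resolution over an approved path

print(player_output(1080, flag_on=True, tv_hdcp=False))   # 540: downsampled
print(player_output(1080, flag_on=True, tv_hdcp=True))    # 1080: chain approved
print(player_output(1080, flag_on=False, tv_hdcp=False))  # 1080: e.g. Sony's stance
```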

Still, if you do get an HDCP-enabled LCD TV, you are signing on to HDCP's draconian terms. Note that HDCP licensees all sign agreements with the HDCP licensor, Digital Content Protection, LLC, to LIMIT the capabilities of their products. Some of these agreements are private and some public, and some limitations may be more draconian than others. The money you spend on the technology is already substantial, and paying MORE for HDCP just to have it cripple the technology you purchased is pretty much a bad deal.

2) The HD-Ready Label - without it, can I play HD?

Yes you can. On January 19, 2005, the European Industry Association for Information Systems (EICTA) announced that HDCP is a required component of the European "HD ready" label. However, HDCP is NOT technically essential for the playback of HD content, and the label ignores HD content not encrypted with HDCP - for example, titles from Sony, PC-generated HD video without HDCP, or any HD video in general not employing HDCP, such as high quality videos from Usenet or some peer-to-peer download.

There are gems of LCD TVs without the HD-Ready logo, which might give you (the pixel geek) better technology without HDCP in it. If you exclusively play torrent files, you need not care about any HD-Ready label nor HDCP. Just get an LCD TV with DVI, not HDMI-only (discussed below).

3) If you buy a HDCP compliant LCD TV, there is a possibility that your HDCP keys may be revoked sometime, anytime.

Heck, the original agreement is here ... I don't know how up to date it is. The plain English version is here, and the excerpt below is reworded in the context of LCD TVs.

For instance, let's assume that you've purchased an LCD TV with HDCP. Everything is going fine until one day it's no longer working with some new HD-DVD discs you just bought and some channels you're watching on your HD cable. What happened is that your cable box just used some signals, within the architecture of HDCP, to invalidate the keys used by your LCD TV. From that point on, your cable box and HD-DVD player will treat your LCD TV as a rogue device. As such, they will not allow it to play HDCP content, or at best will allow it only at lower resolutions.

Why did this happen? HDCP has the capability to revoke `compromised keys'. Say the HDCP keys of your LCD TV, which are essential for its proper operation, have been extracted by some hacker in Brazil - the HDCP licensor can initiate a procedure to invalidate those keys on almost EVERY HDCP device in the world, online or not, by a myriad of methods including the signal from broadcasters of HDCP-embedded content and HD-DVDs.
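A toy model of that revocation mechanism: the source device carries a revocation list (delivered via broadcasts and discs in the real scheme) and refuses full service to any sink whose key appears on it. The key strings here are entirely made up:

```python
# Toy model of HDCP key revocation. In the real scheme the revocation
# list rides along inside protected content; key values below are fake.

REVOCATION_LIST = {"KEY-BR-HACKED-0042"}  # hypothetical compromised key

def authenticate(sink_key: str) -> str:
    """What a source device decides about a display's key."""
    if sink_key in REVOCATION_LIST:
        return "rogue: refuse or degrade output"
    return "ok: full resolution"

print(authenticate("KEY-BR-HACKED-0042"))  # your TV, after the Brazil hack
print(authenticate("KEY-MY-TV-1234"))      # an untouched device
```

Note the nasty property: the revocation list reaches your living room whether your TV is online or not, because it arrives with the content itself.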

In short, if you buy an HDCP-compliant LCD TV, you MAY be worse off than not buying into HDCP at all if your keys are revoked. Fancy that!

Simple lesson - if you read about some easily hackable LCD TV model on the forums, avoid that. Yes, this is a paradigm shift from buying DVD players - where you'd go for the most easily hackable model. In the HDCP world, eventually that LCD TV would act worse than a non HDCP box. Your legal remedies are unclear, but now that you're forewarned, why would you want to place yourself in that position?

4) You may not be able to feed a high resolution signal into a HDCP LCD TV

So far, everything I have posted points toward the only LCD TV worth buying being one with HDCP built in. Don't jump the gun. HDCP is evil to some people.

What if you feed a video signal with no HDCP whatsoever into an HDCP-capable LCD TV? What about a high resolution 1600x960 desktop? Can you feed it into the HDCP LCD TV? With partial aid from about.com I have compiled the feed-display matrix below and the issues involved:

Computer to LCD-TV connection matrix. The computer is a May 2006 machine built with readily available parts, with a VGA card either with or without an HDCP license (2 matrices below), using both a DVI (digital) and VGA (analog) output. Assume that you have media player software with HDCP support and the requisite drivers.


With a non-HDCP compliant VGA card

                                 | HDCP LCD TV                                   | non-HDCP LCD TV
HDCP content, digital output     | Downsampled (VGA card is non-compliant)       | Downsampled (VGA card is non-compliant)
non-HDCP content, digital output | OK                                            | OK
HDCP content, analog output      | Degraded by the computer (HDCP bans rendering full resolution to analog) | Degraded by the computer (HDCP bans rendering full resolution to analog)
non-HDCP content, analog output  | OK                                            | OK

With an HDCP compliant VGA card

                                 | HDCP LCD TV                                   | non-HDCP LCD TV
HDCP content, digital output     | OK                                            | Downsampled (LCD TV is non-compliant)
non-HDCP content, digital output | OK                                            | OK
HDCP content, analog output      | Degraded by the computer (HDCP bans rendering full resolution to analog) | Degraded by the computer (HDCP bans rendering full resolution to analog)
non-HDCP content, analog output  | OK                                            | OK



What does all this mean? A lot. We can deduce some rules from this, and some parts of the conclusion are based on these 2 matrices.

Rule 1: HDCP is adverse to analog
Every time HDCP sees analog, it degrades the resolution of the HDCP content.

Rule 2: HDCP needs all devices in the chain to be HDCP compliant for full resolution
If you break the chain, HDCP degrades the resolution.

Rule 3: HDCP does not affect non-HDCP content
This might be obvious, but it took the most time in my research. Some forum posts I encountered gave me a sinister suspicion that HDCP would be adverse to high resolution non-HDCP content as a measure to plug the torrent hole. Continued research and my own testing disproved this suspicion.
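The three rules above condense into one decision function. This is my own summary of the matrices, not anything from the HDCP spec - a "chain" here is just the list of devices between source and panel:

```python
# The three rules as one function: HDCP content only gets full resolution
# over an all-digital, all-compliant chain. (My condensation of the
# matrices above, not spec language.)

def output_resolution(content_hdcp, digital_link, chain_compliant,
                      full_res=1080, degraded=540):
    if not content_hdcp:
        return full_res   # Rule 3: non-HDCP content is unaffected
    if not digital_link:
        return degraded   # Rule 1: HDCP is adverse to analog
    if not all(chain_compliant):
        return degraded   # Rule 2: one non-compliant link degrades everything
    return full_res

print(output_resolution(True, True, [True, True]))   # 1080: clean chain
print(output_resolution(True, True, [True, False]))  # 540: broken chain
print(output_resolution(True, False, [True, True]))  # 540: analog hop
print(output_resolution(False, False, [False]))      # 1080: torrents don't care
```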

5) HDMI will be fully backward compatible with DVI, but ...

Aside from HDCP, all DVI-only LCD TVs today will be supported, video-wise, by HDMI sources now and in the future. Read that carefully though - video-wise. The multi-channel audio of HDMI will not be supported in its native form, but there will surely be a stereo downmix - in LCD TVs and playback devices, all present HDMI outputs have a corresponding analog audio output, and all present DVI inputs have a corresponding analog audio input. So audio shouldn't be too much of an issue.

Quoting from HDMI site's FAQ:

Is HDMI backward-compatible with DVI (Digital Visual Interface)?
Yes, HDMI is fully backward-compatible with DVI using the CEA-861 profile for DTVs. HDMI DTVs will display video received from existing DVI-equipped products, and DVI-equipped TVs will display video from HDMI sources.
The big BUT - what's worse about using an HDMI connector (compared to a DVI connector) to link an LCD TV with a PC is this: HDMI supports only 720p and 1080i/p, as stated here:

What types of video does HDMI support?
HDMI has the capacity to support existing high-definition video formats (720p, 1080i, and 1080p/60). It also has the flexibility to support enhanced definition formats such as 480p, as well as standard definition formats such as NTSC or PAL.

This is brought to a head by Solano in Hardwarezone, who points out in the forum thread: in the scenario where you use your computer's DVI graphics port, convert it easily (just a pin remap) to HDMI, and connect the computer to your LCD TV's HDMI port - if your screen has a resolution like 1366x768 (common), and you pump a standard computer output like 1280x720 into it, the LCD TV has to scale it to 1366x768, and you get horrible fuzziness. Here is his text, quoted verbatim:

In case you don't know, both DVI and HDMI support two formats: computer format and video format. Computer format goes by 1280x720 or 1920x1080. Video format goes by 720p or 1080i. These two formats are not compatible. Okay maybe I should rephrase it - What TV makers do is making their HDMI port NOT to support the computer format at all. So the LCD TV is always expecting a video format from the HDMI connector. So you will have to output a video format though DVI, which can be done actually. But for 1366x768 LCD TV neither 720p nor 1080i matches its native resolution, so the TV's scaler start working and gives you a fuzzy picture, which is exactly what IceShelterX has got into. This fuzzy picture, although is digital, is much worse than the analog VGA connection, so I don't even consider it as workable.

What Solano says I have found corroborated in various other sources on the web, and in my opinion it is true. As a consequence, in text-heavy computer applications - most applications other than games - the screen looks horrible on an LCD display. LCD displays are razor sharp at their native resolution, but if the image has to be scaled to another resolution, it looks horrible.
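The fuzziness boils down to simple arithmetic: 720p scaled onto a 1366x768 panel means a non-integer scale factor, so every source pixel gets smeared across fractional panel pixels instead of mapping 1:1:

```python
# Why scaling 720p onto 1366x768 looks fuzzy: the scale factor is not an
# integer in either dimension, so pixels can't map cleanly.

def scale_ratio(src: int, dst: int) -> float:
    return dst / src

print(scale_ratio(1280, 1366))  # ~1.067 horizontally: non-integer
print(scale_ratio(720, 768))    # ~1.067 vertically: non-integer
print(scale_ratio(1366, 1366))  # 1.0: native resolution, razor sharp
```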

When you connect a computer to an LCD screen through the HDMI port, WHY can't you choose 1366x768? Because HDMI doesn't support it, at least not on what we can buy currently. When you connect a PC to an LCD TV, the LCD TV supplies Extended Display Identification Data (EDID) to tell the PC's graphics card the capabilities of the monitor, and the 1366x768 native resolution won't be among the supported resolutions because HDMI doesn't support it.
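Here's a sketch of what the graphics driver effectively does with EDID: it only offers resolutions that both the display advertises and the card can drive. The mode lists below are illustrative, not read from any real device:

```python
# Sketch of EDID-driven mode negotiation: the driver offers only the
# intersection of what the display advertises and what the card can do.
# Mode lists are illustrative, not from a real device.

def offered_modes(edid_modes, driver_modes):
    return sorted(set(edid_modes) & set(driver_modes))

hdmi_edid = [(1280, 720), (1920, 1080)]               # HDMI port: video formats only
dvi_edid  = [(1280, 720), (1366, 768), (1920, 1080)]  # an honest DVI port
pc_modes  = [(1024, 768), (1280, 720), (1366, 768)]   # what the card offers

print((1366, 768) in offered_modes(hdmi_edid, pc_modes))  # False: never offered
print((1366, 768) in offered_modes(dvi_edid, pc_modes))   # True: native match
```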

Can we bypass EDID? Sure! Current NVidia (and perhaps ATI) drivers allow you to bypass the EDID: uncheck "Hide resolutions that this monitor does not support", then just select from the big long list. BUT, that doesn't mean the monitor will accept the custom resolution - timings etc. have to be specified in addition to the resolution, and even if everything were specified correctly, the firmware and design of the LCD panel or your graphics card may have critical bugs which prevent the custom resolution from working. So you're back to square one.

Can we connect the PC's DVI port to the LCD TV's DVI port? Sure, if it has a DVI port - and if the DVI port reports the EDID data correctly as 1366x768, which it should unless the manufacturer of the LCD TV intentionally sabotages it. From what I've seen, DVI ports on LCD TVs have faithfully supplied the correct EDID to date.

So, let's use the DVI port! What's the catch? From reading the spec sheets of several LCD TVs, on sets equipped with both HDMI and DVI ports, the DVI port is the DVI-I type - meaning digital AND analog. Why, I don't know; I always thought DVI-I mattered for outputs, not inputs. And LCD TVs which have both HDMI and DVI ports put HDCP support on the HDMI port, not the DVI port. This is a double whammy - HDCP doesn't like analog, and the manufacturer doesn't put HDCP on the DVI. This means that if you connect your computer to the DVI-I port of an LCD TV which has both HDMI/HDCP and DVI ports, your computer isn't going to be able to play HDCP content on this LCD TV in full resolution.

What about VGA? VGA is analog, and if HDCP sees analog, it's going to degrade the HDCP video to a lower resolution. And analog is slightly fuzzier than a digital interface. On the other hand, the EDID should be correct, and the full 1366x768 resolution would be offered to your computer without the need to scale. So, this is not a perfect solution either.

Is there a perfect solution? Happily, yes, at least in theory. Certain LCD TVs designed for no-nonsense corporate use, like the LG L3200TF, have DVI/HDCP inputs. I find this kind of monitor few and far between, mainly because of the industry's push towards HDMI for products meant for the general home user, for reasons unknown (I can only speculate that the industry pushes HDMI to the consumer and DVI for industrial use as internal corporate politics, to segment the market between their consumer electronics and industrial/computer electronics departments). Anyway, the DVI/HDCP LCD TV looks like the perfect solution for now. And somehow, they're cheaper.

So why don't the LCD manufacturers make 1280x720 panels to exactly match HDMI's resolution? I don't know. Really. Do any of you LCD manufacturers wanna tell me? Email me, all names will be kept confidential.

Update 22 May 2006 - Solano has some suggestions on why the LCD manufacturers do not make 1280x720 anymore here.

A summary of his suggestions: 1280x720 LCD TVs did exist in the past, but they are no longer in vogue, and a lone Toshiba 27" survives at that resolution. When Sharp came up with a non-matching resolution like 1366x768, everybody followed suit into that marketing black hole. He also mentions that since most future Blu-ray content would be 1080i, the manufacturers figured that since that video content would have to be rescaled anyway, they might as well rescale it to 768 lines rather than 720, for more detail (MT - but that screws PC users, as I have stated). And the most compelling point Solano makes in his excellent post is that most TV is shown overscanned, and with 768 lines we can now see the entire video, since 768 is 6.7% more than 720, fully accommodating the traditional 5% overscan. MT - All this is nice for traditional settings, but today, when more and more people want to usefully connect their computers to LCD TVs, the `enhancements' of upping 720 to 768 make that very difficult. The best compromise is to use the DVI interface. Is it that difficult to sacrifice HDMI in favour of DVI when you lose nothing?
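Solano's overscan arithmetic checks out, by the way:

```python
# 768 lines exceed 720 by about 6.7%, enough headroom for the
# traditional ~5% overscan Solano describes.
extra = (768 - 720) / 720
print(round(extra * 100, 1))  # 6.7 (percent more lines than 720p)
print(extra >= 0.05)          # True: covers a 5% overscan margin
```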


6) Is HDMI going to be THE standard for the next 5 years?

Looks like the answer is NO. A new standard called the Unified Display Interface (UDI) is poised to take over the torch from HDMI. Another new standard by VESA called DisplayPort, backed by pretty much the same names, is also competing. With big names like Intel, Apple, nVidia, ATI, Dell, LG and Samsung backing one or both of these standards, it looks like HDMI may have very serious competition in the near future. How near? How does 2007 sound?

7) Are graphics cards available as of May 2006 capable of supporting HDCP?

There's a class action suit against ATI at present, where it's claimed that `ATI has begun to revise its website materials to delete reference to video cards being HDCP ready or compliant'. A respected PC hardware site, Firingsquad.com, has an article on the great HDCP fiasco, where they claim that none of ATI's or nVidia's cards support HDCP.

8) Can a driver upgrade for a graphics card make it HDCP compliant?

There is an implicit assumption that no driver is trustworthy, and `upstream authentication', which detects drivers cheating by masquerading HDCP compliance, was not included in the original HDCP specification.

I do not know whether, failing a fix for upstream authentication, a graphics card in a PC is even allowed to be HDCP compliant. A companion specification has been defined, however, so watch this space.

The keys have to be stored in a secure location, and it is unclear whether current graphics cards have a secure location within the chip to store these keys. Drivers are definitely not a secure location. The 2002 EDN article states (probably outdated, but I lack a better link):

Adding an HDCP-capable DVI output to a product such as a graphics adapter or set-top box is a similar exercise. You still need to add the interface silicon, the nonvolatile key memory, and the DVI connector.

Looks like a mere driver upgrade won't be able to support HDCP. nVidia seems to say, in the Firingsquad.com article on the HDCP fiasco, that BOARD manufacturers are responsible for building in HDCP compliance. So that's another indication that if the board manufacturers are oblivious to the dangers and pitfalls of HDCP, or are not HDCP licensees, the keys are not implemented, and there's no HDCP on any existing graphics board today.

More confirmation here:

An ATI representative said: “People will not be able to turn on HDCP through a software patch since the HDCP keys need to be present during the manufacturing. We are rolling out HDCP through OEMs at this time but we have not finalized our retail plans yet.”

Well, what about NVIDIA? They were actually very direct: “The boards themselves must be designed with an extra chip when the board is manufactured. The extra chip stores a crypto key, and you cannot retrofit an existing board after the board is produced.”
What was the problem? It's clear to me that the whole HDCP scheme requires a lot of reading to understand. I spent an entire day, and I'm not exactly foreign to this field. End users, big company marketers, salespeople - EVERYBODY - did not bother to spend this DAY. Well, this blogpost takes just a few minutes to read, and I hope it's clear. Now you know.

In Conclusion ...

The most striking thing I found was the surprising crap about HDMI. As a video standard, HDMI is not superior to DVI, or vice versa. By HDMI's own compatibility commitment, HDMI sources have to work with DVI TVs and DVI sources with HDMI TVs. Whatever HDMI can do, DVI can do. But whatever DVI can do (for example, support a wide range of PC resolutions), the HDMI interface does not yet do - even though it could. So right now, DVI is the superset and HDMI merely a subset of DVI, at least in terms of video.

I managed to go through all the stuff, and surprisingly, I only have 3 recommendations.

  1. For people who think DVD quality is good enough and want an LCD TV for space and aesthetic considerations, and do not plan to attach a PC to it, buy any LCD TV you want.
  2. For people who want to connect their PC to an LCD TV, in addition to connecting HDCP devices like HD DVD players and future set top boxes, get a HDCP compliant LCD TV with a HDCP-compliant DVI port. Most probably this LCD TV would not have HDMI, and you won't miss it.
  3. For people who do not plan to play any HDCP content, but need to play HD content which is not HDCP-encoded, get a non-HDCP LCD TV with a DVI input. They're cheaper. It won't hurt to buy an HDCP LCD TV though, besides hitting your wallet.
Any clarifications, please leave a comment below.

Time has come for Phones to kick Point and Shoot butt


I always chose phones based on functionality, and my present phone is a Nokia Communicator 9500.

It's the last of the line though. My new phones will be chosen by megapixel count. The time has come when a good 3.2 megapixel autofocus handphone with a xenon flash will kick point-and-shoot butt - the Sony Ericsson K800i - which even comes with an image stabilizer (whether it's digital or optical remains to be seen, but ... that ain't important to me).

Dell finally will release an AMD machine, but ....

Ok, they'll let customers buy AMD Opteron-based servers at the end of the year. Guess the negotiations with Intel took too long. But this smacks of big-company-slow-bureaucracy syndrome. Makes them seem out of touch. Why so much sacrifice, when clearly Intel's new chips are pretty much gonna kick AMD's butt real soon?

Of course, I may be missing something.