[mythtv-users] Does mythtv support 1080p?

Michael T. Dean mtdean at thirdcontact.com
Thu Jan 12 23:35:07 UTC 2006


Steve Adeff wrote:

>On Wednesday 11 January 2006 14:21, Michael T. Dean wrote:
>  
>
>>Chris Lynch wrote: 
>>
>>>You can get 1080p working - I have it up and running on my box.  The
>>>biggest thing to keep in mind is a CPU powerful enough to play back
>>>1080p (3Ghz+ is a good idea I think), a video board that can output
>>>the resolution and has DVI on it      
>>>
>>Only if your TV supports 1080p over DVI/HDMI.  Many 1080p TV's only
>>support 1080p over VGA because a single-link DVI connection only
>>supports 1080i and a dual-link DVI connection is required for 1080p.  
>>
>not true, DVI Single Link is more than capable of carrying 1080p. Why some 
>tv's won't take 1080p over DVI I don't know...
>
Sorry, I was leaving out a lot of details because I didn't feel like 
typing up the whole story...

Note that different display technologies have different requirements, 
such as blanking intervals or other overhead requirements, which may 
reduce the bandwidth available for signal transfer (or, put another way, 
the bandwidth required to transmit a signal for a given resolution and 
refresh rate may be different for different devices).  To drive a 
1080p60 display using VESA's Generalized Timing Formula requires a 
dual-link connection.  However, it is possible to reduce the blanking 
interval to allow for more data to be sent over a single-link 
connection, making it possible to support 1080p60 via single-link (i.e. 
by using a 5% blanking interval, as you might for an LCD display).
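The arithmetic can be sketched roughly as follows. This is a simplified illustration, not the full VESA GTF/CVT formulas: the exact totals below (2576x1118 for a GTF-style timing, 2080x1111 for a reduced-blanking timing) are approximate figures for 1920x1080 at 60Hz.

```python
# Rough pixel-clock arithmetic for 1080p60 over DVI.
# Timing totals are approximations of GTF and reduced-blanking modes,
# not outputs of the actual VESA formulas.

SINGLE_LINK_MAX_MHZ = 165.0  # single-link DVI TMDS clock limit

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (active + blanking) * refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# GTF-style timing for 1920x1080 @ 60 Hz (generous blanking):
gtf = pixel_clock_mhz(2576, 1118, 60)        # ~172.8 MHz

# Reduced-blanking timing (roughly a 5% blanking interval, as used
# for fixed-pixel displays such as LCDs):
reduced = pixel_clock_mhz(2080, 1111, 60)    # ~138.7 MHz

print(f"GTF timing:       {gtf:6.1f} MHz ->",
      "fits single-link" if gtf <= SINGLE_LINK_MAX_MHZ else "needs dual-link")
print(f"Reduced blanking: {reduced:6.1f} MHz ->",
      "fits single-link" if reduced <= SINGLE_LINK_MAX_MHZ else "needs dual-link")
```

So the GTF timing blows past the 165MHz single-link limit, while the reduced-blanking timing squeezes under it.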

So, although it's possible to drive a 1080p60 display using single-link 
DVI with a 5% blanking interval, doing so requires almost 140MHz of the 
165MHz bandwidth provided by single-link DVI.  Because this is near the 
top-end of the bandwidth, many factors can cause a degradation of 
display quality, including less than perfect TMDS transmitters in the 
video card (and many of these exist, BTW--for example, some of the 
non-discrete TMDS transmitters in NVIDIA GPUs can max out at 135MHz), 
poor-quality cables, cable length (i.e. using a 2m cable, the display 
may look fine, but may be unacceptable with a 3m cable--of course, 
that's because of poor design decisions made by the Digital Display 
Working Group when they created the DVI spec, but that's another story), 
and other factors.
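To put numbers on that headroom (using the approximate ~138.7MHz reduced-blanking pixel clock and the 135MHz integrated-transmitter limit mentioned above):

```python
# Headroom check: reduced-blanking 1080p60 vs. link and transmitter limits.
PCLK_1080P60_RB = 138.65   # MHz, reduced-blanking pixel clock (approx.)
SINGLE_LINK_MAX = 165.0    # MHz, single-link DVI spec limit
WEAK_TMDS_MAX   = 135.0    # MHz, e.g. some integrated NVIDIA transmitters

# ~84% of the available single-link bandwidth is consumed:
print(f"Uses {PCLK_1080P60_RB / SINGLE_LINK_MAX:.0%} of single-link bandwidth")

# A 135 MHz transmitter falls short of even the reduced-blanking clock:
print("135 MHz TMDS sufficient for 1080p60:",
      PCLK_1080P60_RB <= WEAK_TMDS_MAX)
```

In other words, even the best case leaves only about 16% of margin, and a card with a 135MHz integrated transmitter can't drive 1080p60 at all.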

Now, this is pure conjecture, but I would presume that many of the 
companies manufacturing 1080p displays that retail for $4,000+ decided 
that rather than trying to explain all these factors to the user (who just 
wants a pretty picture after paying all that money), it would be a lot 
easier to require a cabling format that would "always" work.  Since VGA 
cables have much more headroom than single-link DVI, and since dual-link 
DVI is more expensive than single-link DVI, and since dual-link DVI 
requires support from both the display device and the video card, and 
since a single-link DVI cable will plug into a dual-link DVI connection 
(so the user thinks, "It fits, so it should work"), companies may have 
decided that the cost of providing for 1080p over DVI outweighs the 
benefit.  When you factor in the marketing "checkbox" approach (DVI 
connector: check; VGA connector: check), the benefit becomes even 
smaller (as only people who understand the issues in the first place 
would see the benefit of 1080p60 over dual-link DVI).  And, finally, 
when you consider the quality of today's A/D and D/A converters...

So, to correct my previous generalization, I should have said, "A 
dual-link DVI connection is required to properly and reliably drive a 
1080p60 display."

For more details: http://www.ddwg.org/lib/dvi_10.pdf

Mike
