[mythtv-users] Artifacts on screen with DVI to HDMI cable

Michael T. Dean mtdean at thirdcontact.com
Mon Jun 25 15:08:44 UTC 2007


On 06/25/2007 09:35 AM, ryan patterson wrote:
>
> On 6/25/07, Michael T. Dean <mtdean at thirdcontact.com> wrote:
>>
>> On 06/24/2007 11:05 PM, David Cramblett wrote:
>>>
>>> This turned out to be the MSI FX5200 Video card.  It just
>>> couldn't run either the 15' cable length or it doesn't like the
>>> DVI->HDMI cable, I don't know which. However, 15' VGA -> RGB
>>> Component worked fine.
>>
>> DVI cables were designed to be cheap to manufacture, rather than
>> being designed to carry the DVI signal (SDI), so they are actually
>> far more sensitive to long cable runs than are VGA cables or
>> component cables. HDMI cables are typically even worse (because--in
>> addition to having the same design flaw as DVI cables--the cables
>> tend to be smaller gauge).
>>
>> Some cable manufacturers do different things to work around the
>> inherent weakness of the design (often adding significantly to the
>> cost of the cables).  Whether their techniques work, I couldn't
>> tell you--I'm also using VGA happily.  TTBOMK, no cable
>> manufacturer has actually created a "proper" cable for DVI/HDMI.
>> (SDI is designed to be carried over coaxial wires, but DVI/HDMI
>> uses twisted pair.  I don't know of any source of DVI/HDMI cables
>> using coaxial wires.)
>>
>
> Sorry Mike you are 100% wrong.  SDI signaling is not the same as DVI
> or HDMI signaling.  TMDS signaling used for DVI and HDMI is designed
> for twisted pair cabling not coaxial.

True (and that will teach me to try to simplify so as not to post a
novel).  The whole story is that when the professional video industry
moved to digital signals, it decided to use a standard, SDI, which was
designed to be run over coaxial cable.  The tight impedance control
that coaxial cable provides allows even uncompressed video (i.e. an
order of magnitude or more data than compressed video) to be carried
across hundreds of feet with no loss of information.
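To put rough numbers on that "order of magnitude" claim, here's a
back-of-the-envelope comparison.  The frame size, bit depth, and MPEG-2
bitrate below are my own assumptions for a typical SD setup, not
figures pulled from any spec:

    # Rough comparison: uncompressed SD video payload vs. a typical
    # MPEG-2 stream.  All constants are assumptions for illustration.

    width, height = 720, 480      # SD active picture (assumed)
    fps = 30                      # nominal frame rate (assumed)
    bits_per_sample = 10          # 10-bit sampling (assumed)
    samples_per_pixel = 2         # 4:2:2 = luma + half-rate chroma

    uncompressed_bps = width * height * samples_per_pixel \
                       * bits_per_sample * fps
    mpeg2_bps = 6e6               # ~6 Mbit/s typical SD MPEG-2 (assumed)

    print("uncompressed: %.0f Mbit/s" % (uncompressed_bps / 1e6))  # ~207
    print("compressed:   %.0f Mbit/s" % (mpeg2_bps / 1e6))         # ~6
    print("ratio:        %.0fx" % (uncompressed_bps / mpeg2_bps))  # ~35x

And that's just the active picture, before blanking and ancillary data
are added to the serial stream.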

Silicon Image, Inc. decided that--rather than use a tried-and-true
technology--it would create a new standard for digital video that was
cheaper to implement, but based on many of the ideas of SDI.  This
standard, TMDS, while designed to work over twisted-pair cables, does
not account for the fact that even the /best/ twisted-pair cables can
only control impedance to about +/- 10%.
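To see why that tolerance matters, consider the reflection at an
impedance mismatch: the reflection coefficient is (Zcable - Znominal) /
(Zcable + Znominal), so roughly half the percentage error comes back at
you on every transition.  The nominal impedances and the coax tolerance
below are my assumptions, and this ignores connectors, skin effect, and
frequency dependence:

    # Toy reflection-coefficient comparison (illustrative only).
    # Assumes ~100 ohm nominal differential impedance for TMDS,
    # ~75 ohm for video coax, and a ~2% tolerance for good coax.

    def reflection(z_nominal, tolerance):
        z_worst = z_nominal * (1 + tolerance)   # worst-case cable impedance
        return (z_worst - z_nominal) / (z_worst + z_nominal)

    print("twisted pair, +/-10%%: %.1f%% reflected"
          % (100 * reflection(100, 0.10)))      # ~4.8%
    print("coax,         +/-2%%: %.1f%% reflected"
          % (100 * reflection(75, 0.02)))       # ~1.0%

At gigabit-per-second bit rates, those reflections eat directly into
the eye opening at the receiver.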

Up to some length of cable, TMDS (and, therefore, DVI/HDMI) works
exactly as designed (and that length actually depends on the cable
itself, as well as on the TMDS transmitter and receiver).  At some
point, however, bit errors start to occur.  And, since DVI/HDMI don't
use error correction (they're meant to transfer data in real time),
these uncorrectable bit errors affect the display quality.  At first,
isolated bit errors cause pixel dropouts (often called "sparklies").
Make the cable a little longer and the errors become frequent enough
to affect neighboring pixels, creating blocks of errors.  Make it
slightly longer still and the bit errors become so pervasive that not
enough of the bitstream can be recovered to create an image at all.
That point is called the "digital cliff" (once you hit the cliff, it
just doesn't work).
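A toy model shows the shape of that progression.  This is not real TMDS
eye-diagram math; the exponential error-rate curve and every constant
below are made up purely to illustrate the dropouts -> blocks -> cliff
behavior:

    # Toy "digital cliff" model (illustrative only).  Assumes the bit
    # error rate grows exponentially past some critical cable length.

    import math

    PIXELS_PER_FRAME = 1280 * 720
    BITS_PER_PIXEL = 24

    def bit_error_rate(length_ft, critical_ft=15.0, steepness=1.5):
        if length_ft <= critical_ft:
            return 1e-12                      # effectively error-free
        return min(0.5,
                   1e-9 * math.exp(steepness * (length_ft - critical_ft)))

    for length in (10, 16, 20, 25, 30):
        ber = bit_error_rate(length)
        bad_fraction = 1 - (1 - ber) ** BITS_PER_PIXEL
        print("%2d ft: BER=%.1e, ~%.0f bad pixels/frame"
              % (length, ber, bad_fraction * PIXELS_PER_FRAME))

Run it and you see essentially zero bad pixels at and just past the
critical length, a scattering of sparklies a few feet later, large
blocks of the frame after that, and then the whole frame gone--the
cliff.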

However, TMDS was designed as a cheap way of transferring digital
signals that would be sufficient for /most/ uses in the average
consumer's setup (where that setup typically does /not/ extend beyond
the entertainment center cabinet).

> Using coaxial cables and BNC
> jacks for a TMDS signal would be absurd.

IMHO, it's impossible to make that claim without also specifying the
context in which those cables would be used.  Going from an STB (one
that's actually "on top of" the TV) to the TV, it would be absurd (i.e.
overly expensive).  Going from a Myth box in one room, up to the attic,
into another room, and down to a projector, it may not be so
absurd--especially if the user couldn't find any DVI/HDMI cable that
could handle that length without bit errors.  Still, it would be
expensive.

However, had the design required coaxial cable, it probably wouldn't
have been so expensive (through the magic of volume production).  After
all, we use a /lot/ of cheap coaxial cables today (RG-6, RG-59,
composite, S-Video, component video, ...).  Unfortunately, the industry
was looking for something inexpensive in the short term, and the
consumer gets to pay for it (through inferior design) over the long
term.

Unfortunately, because the connectors for DVI/HDMI have already been
designed--and because the connectors themselves also affect impedance
control--just hacking your own coaxial cables into DVI/HDMI connectors
wouldn't likely solve the problem.  So, instead, users of DVI/HDMI just
get to live with its problems.  Most will never see those problems, but
those who are trying to do more than a "typical" setup may.

> There is no inherent design
> weakness in the DVI/HDMI cable designs.

Except lack of impedance control.  And my entire point was that DVI/HDMI
were designed to be cheap and were /not/ designed for long cable runs.
If you have information that disproves this point, please let me know,
because then I'd have a citation showing that DVI/HDMI can be proven
(through empirical evidence) to be broken designs (i.e. unable to meet
their design goals).

> Also the video part of the
> HDMI standard is 100% compatible with DVI.  So there are no issues
> when manufacturers make HDMI to DVI cables.  The digital signals used
> in DVI/HDMI are much better suited to longer cable runs then VGA or
> component signals.

Right, I never said otherwise.  However, because of the additional wires
required to carry the audio portion of an HDMI signal, HDMI cables
typically use inferior (smaller gauge) wires, making them even more
susceptible to bit errors over long runs.  The DVI->HDMI adapter
generally isn't an issue (although it does affect impedance control,
too, so...).
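For a rough sense of what smaller gauge costs you: conductor resistance
scales with the inverse square of the wire diameter, so dropping a few
AWG sizes more than doubles the loss per foot.  The particular gauges
below are my assumptions (real cables vary), and at TMDS frequencies
skin-effect loss makes things worse still:

    # Rough gauge comparison (illustrative; the AWG sizes are assumed,
    # and this only considers DC resistance).

    AWG_DIAMETER_MM = {24: 0.511, 28: 0.320}   # standard AWG diameters

    def relative_resistance(awg_thin, awg_thick):
        # resistance per unit length scales as 1 / diameter^2
        return (AWG_DIAMETER_MM[awg_thick] / AWG_DIAMETER_MM[awg_thin]) ** 2

    print("28 AWG vs 24 AWG: %.1fx the resistance per foot"
          % relative_resistance(28, 24))        # ~2.5x

Thinner conductors also attenuate more of the high-frequency content
the receiver needs to recover the clock and data, which is one reason a
long run with thin cable can fall over sooner than the same run made
with heavier cable.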

Mike

