[mythtv-users] Struggling with Xwindows DVI to HDTV 1080i
Dan Christian
dac at x.cx
Tue Jan 3 03:36:49 UTC 2006
On Thursday 29 December 2005 17:52, Steve Adeff wrote:
> the NVidia component output uses an HDTV output chip so it will work. It's
> interlaced over DVI that doesn't since it uses the normal monitor output.
> It's a known bug that NVidia won't fix since relatively few people use it.
> Apparently one of the older (search the list archives) drivers does support
> it though. Some TV's will accept a 1920x540p signal from the DVI and
> display it as 1080i though, so that's worth a try too.
Have you actually gotten 1080i to work with component? Can you send the
Xorg.conf file?
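For reference, here's a minimal sketch of the Device section I've been
experimenting with. The TVStandard and TVOutFormat option names come from the
nvidia driver README; the exact values accepted vary by driver version, so
treat this as a starting point rather than a known-good config:

```
Section "Device"
    Identifier  "nvidia0"
    Driver      "nvidia"
    # Ask the driver's TV encoder for 1080i on the component jacks.
    # "HD1080i" and "COMPONENT" are the names in the nvidia README;
    # verify them against your driver version.
    Option      "TVStandard"  "HD1080i"
    Option      "TVOutFormat" "COMPONENT"
EndSection
```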
I tried to get it to work from my 6600GT, but couldn't get a usable picture.
The picture would jump up and down by a line on every frame (as if the
interlacing wasn't happening right). Both component and DVI behaved the same.
My eyes started to water after about 5 minutes. I haven't found anyone with a
usable 1080i setup.
Do you know which drivers worked? Which list archive were you referring to:
nvidia or MythTV?
Any CRT that takes 1080i should also take 540p; the only differences are in
the timing. Of course, LCD and plasma sets have to digitize and rescale
everything, so they might act differently.
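To illustrate the timing relationship, here are sketch modelines. The 1080i
numbers are adapted from the standard SMPTE 274M timing; the 540p line is my
own derivation (same pixel clock and horizontal timing, vertical totals
halved) and hasn't been verified on real hardware:

```
# 1920x1080 interlaced, 74.25 MHz pixel clock (SMPTE 274M timing)
ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 +hsync +vsync Interlace
# 1920x540 progressive: identical pixel clock and horizontal timing,
# vertical totals roughly halved, so the line rate the set sees is the same
ModeLine "1920x540"   74.25  1920 2008 2052 2200   540  542  547  562 +hsync +vsync
```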
Thanks,
-Dan