[mythtv-users] Help with interlaced DVI output configuration

Chris Weiland hobbiticus at gmail.com
Wed Mar 21 17:07:37 UTC 2007

I've searched and searched and can't seem to find a definitive answer to my
questions, so hopefully the all-knowing body of the mythtv discussion list
will know.

My Situation:
I've got a 32" Samsung LCD TV that has a native resolution of 720p.  I have
a cable box with an HDMI output that I've hooked into my TV's HDMI input in
the past, which resulted in a very clean and crisp picture.  I guess I can't
be 100% sure that it was outputting in 1080i, but it says it was.  My myth
box is set up to run at the 720p resolution, outputting over the DVI
interface on my geforce 6200, through a DVI-HDMI cable, into the HDMI input
on the TV.  As you might suspect, playing 1080i content without
deinterlacing is filled with interlacing artifacts.
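For context, the relevant part of my xorg.conf for the current 720p setup
looks roughly like this (identifiers and the mode name are illustrative; the
driver may pick the mode from EDID on its own):

```
Section "Screen"
    Identifier   "Screen0"
    Device       "nvidia0"
    Monitor      "SamsungLCD"
    DefaultDepth 24
    SubSection "Display"
        Depth  24
        Modes  "1280x720"
    EndSubSection
EndSection
```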

My Problem:
I'm very sensitive to even the slightest visual artifacts, and none of the
built-in deinterlacing methods in mythtv are as good as I would like them to
be.  So, logic along with searching through lots of mailing lists suggests
that if I can output 1080i from my myth box, the (much better) TV
deinterlacer/scaler will kick in and give me the same crisp and clear images
that I get when I hook up my cable box directly.  This should theoretically
also save the CPU cycles that myth is currently spending on deinterlacing and
scaling.

The Solution:
Output in 1080i! (and turn off mythtv's deinterlacing)

So, first of all, am I correct in everything I've mentioned above?

I'm not exactly sure how to go about doing this.  My TV's EDID info says
that it will accept 1080i sources, and the nvidia
driver seems to have the 1080i modes built in.  However, if I try to use the
1080i modes, the X log complains that the vsync is out of range.  This is
"true" because the TV wants 60 Hz, but the mode is listed as 120 Hz.  So, I
turn off the vertical sync mode validation.  Now it says that the 1080i
screen size is bigger than the LCD's native resolution.  Fine, so I turn off
the DFP native resolution mode validation.
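For the record, the two validation overrides I'm describing are done with the
nvidia driver's ModeValidation option (token names are from the driver README
for my version; worth double-checking against yours):

```
Section "Screen"
    ...
    # skip the vertical refresh range check and the DFP native
    # resolution check that were rejecting the 1080i modes
    Option "ModeValidation" "NoVertRefreshCheck, NoDFPNativeResolutionCheck"
    # log the reason each mode is accepted or rejected
    Option "ModeDebug" "true"
EndSection
```

ModeDebug makes the X log explain exactly which check a mode fails, which is
how I found the two checks above.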

Now it works!  Except not.  X accepts the 1080i video mode, and the change
is apparent on my TV because it is only showing the top half of the screen,
and the horizontal scan lines are out of sync.  It's a little hard to
explain, but it's almost like instead of drawing lines 0,1,2,3,4... it's
drawing lines 1,0,3,2,5,4..., except making each scan line twice as big, so
the screen only shows half of the picture.  I couldn't think of what to do
after that.

I also see that I can define a custom modeline.  I had to do a lot of mode
validation disabling to get this to work too.  I have to admit that I haven't
played with this too much, but I can't get the driver to like anything that
I create.  I think once I finally got X to start using my modeline, all I
got was a blank screen.  That's where I gave up.
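For reference, the standard CEA-861 timing for 1920x1080i at a 60 Hz field
rate is usually quoted as the modeline below.  I can't confirm the nvidia
driver will accept it over DVI (that's exactly what I'm asking), so treat it
as a starting point rather than a known-good line:

```
# 74.25 MHz pixel clock, 60 Hz fields / 30 Hz frames
ModeLine "1920x1080_60i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 \
         +hsync +vsync Interlace
```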

I've seen elsewhere that interlaced modes over the DVI output stopped
working after driver version 6629, but that driver doesn't support the 6200
card.  However, this was from a post back in 2005, so hopefully that's not
the case anymore.  I've also seen some people say to enable TwinView with
only one display device connected for some reason, but no one ever explained
what settings they used for that.  Others have said that I need to turn off
some vsync settings in the nvidia-settings program.  I haven't investigated
these leads myself yet, so hopefully someone can point me in the right
direction before I waste time on dead ends.
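If anyone has tried the TwinView trick with a single display, my best guess at
what it looks like is something like the following, but these options are pure
speculation on my part, so please correct me:

```
Section "Screen"
    ...
    # enable TwinView even though only one display is connected
    Option "TwinView"  "true"
    # request the interlaced mode explicitly via MetaModes
    Option "MetaModes" "1920x1080_60i"
EndSection
```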