[mythtv-users] Any hope for 1080i content at 1080i with nVidia?

Tom Dexter digitalaudiorock at hotmail.com
Fri Aug 24 15:29:46 UTC 2007

>From: Preston Crow
>On Thu, 2007-08-23 at 17:54 -0700, Seth Daniel wrote:
> > If I just use
> > the EDID directly I would get a 'virtual' 1920x1080 screen where I could
> > just view the top half (the first 540 lines) of the screen.  This
> > appears to be a driver issue.  It used to work with the 7xxx and 8xxx
> > driver (except that XV playback hard locked the system), but the 9xxx
> > (and beyond) driver broke it.
>Yup.  I'm using the latest 8xxx, and it seems to work fine for me with
>the EDID timings for 1080i (using DVI->HDMI on a 5200FX card).
>I'm not having the XV playback problem.
>It does occasionally switch to the wrong video mode, but resetting it
>(CTRL-ALT-BACKSPC) fixes that.
>XvMC works fine for the video, but now I'm getting a subtle audio
>stutter, and I haven't found any settings that help.  [With my Athlon
>2500+, I need XvMC for 1080i to avoid pauses; 720p doesn't need it.]
>I'm not using deinterlacing, and I haven't noticed a problem with 1080i
>(or 480i, for that matter), but perhaps I'm just not that sensitive to

If I try to use any of the built-in 1080i modes I only get the top 540 lines 
of the screen as well.  I think this bug is discussed here:


...and they are apparently aware of it.  I wonder whether, in addressing 
this, they might happen to fix interlacing as well.

It kills me to hear that the 8xxx drivers actually handle 1080i correctly.  
My frontend has only PCI-E and, as far as I can see, there are no chipsets 
available on PCI-E that are supported by 8xxx drivers.  I'm not about to 
replace my frontend machine because nVidia hasn't addressed this.

For the time being, I've settled on using 1080i output (with a custom mode 
line) and using Viktor's patch here:


...to allow bob deinterlacing, and that made a huge difference...not as good 
as it would look with the driver working correctly and no deinterlacing at 
all, but very good.  By the way...someone on avsforum.com noticed that the 
line in that patch that reads:

if (mode_line.flags && 0x010) // #define V_INTERLACE 0x010

...should actually be using a bitwise operator:

if (mode_line.flags & 0x010) // #define V_INTERLACE 0x010

...otherwise it would be true if any flags at all happened to be set 
(though it didn't cause me any problems before I changed that).  I'm still 
baffled as to why MythTV reports a 60Hz refresh rate as 30 when the output 
is interlaced, which is what that patch changes in order to allow the use 
of bob.


More information about the mythtv-users mailing list