[mythtv-users] Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes)

Tom Dexter digitalaudiorock at gmail.com
Sat Dec 29 21:50:35 UTC 2007


On Dec 26, 2007 11:17 PM, Alex Halovanic <halovanic at gmail.com> wrote:
> My understanding was that for most purposes X doesn't really distinguish
> between a progressive and interlaced display; it outputs a 1920x1080 picture
> at ~ 60fps and then the hardware device handles sending the interlaced
> picture to the tv, presumably by simply discarding half the field lines for
> each frame.
>
> Most of the time when you just display an interlaced video in myth, the
> fields being displayed for the video don't match up with the fields getting
> sent to the device.  Applying interlacing to a 60fps video effectively gives
> you about 30fps for moving objects; applying another round of interlacing,
> introduced by the unsynced display, makes it more like 15fps and very
> juddery for motion.
>
> Bob2x deinterlaces the picture by showing each field as a full 1080-line
> frame at 60fps.  It does this by taking a single field (1920x540) per
> output frame, alternating odd and even, and stretching it vertically to
> 1920x1080 (making each field line 2 pixels high instead of 1).  This causes
> some very slight bounciness in thin elements or at the edge of motion, as
> some pixels rapidly switch between showing an edge and showing the space
> adjacent to it.
>
> Even with just displaying static elements, you still necessarily get this
> bounciness on a 1080i display (check out some of the menu elements in the
> myth main menus for example).  I think that any bounciness introduced by
> bobbing is covered up by the inherent bounciness in the 1080i display, even
> when the fields are being displayed in reverse order, and you're basically
> now just interlacing a progressive 60fps video, instead of reinterlacing an
> already interlaced video.  Therefore: a perfect 1080i display at the expense
> of a lot of apparently unnecessary CPU work.
>

Thanks for the explanation, Alex.  That makes sense.  Even with bob I
find the CPU usage very acceptable; it actually uses less CPU than
kernel deinterlacing.  I'm not using XvMC...for some reason it uses
more CPU than just using libmpeg2 on my system.
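
For anyone following along, here's roughly what bob2x is doing per
incoming frame as you describe it.  This is just my own illustrative
sketch in Python (using numpy for the frame buffer), not the actual
MythTV filter code:

    import numpy as np

    def bob2x(frame):
        """Split one 1080x1920 interlaced frame into two output frames,
        each built from a single field line-doubled back to full height.
        Purely illustrative; assumes a greyscale numpy array."""
        top_field = frame[0::2, :]      # 540 lines
        bottom_field = frame[1::2, :]   # 540 lines
        # Stretch each 540-line field back to 1080 lines by repeating
        # every line, i.e. each field line becomes 2 pixels tall.
        return [np.repeat(field, 2, axis=0)
                for field in (top_field, bottom_field)]

    # Showing the two results back to back turns a 30fps interlaced
    # stream into 60fps of full-height frames.

That line doubling is also where the slight bounciness comes from: a
detail that's only one line tall exists in just one of the two fields,
so it flickers as the fields alternate.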

I'd still love to know why MythTV reports that refresh rate as 30.
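My only guess (just arithmetic from the standard 1080i timing; I
haven't checked the MythTV source) is that it's counting complete
frames rather than fields:

    # Standard 1080i timing: 74.25 MHz pixel clock, 2200 total pixels
    # per scanline, 1125 total lines per complete frame.
    pixel_clock = 74.25e6
    htotal, vtotal = 2200, 1125

    frame_rate = pixel_clock / (htotal * vtotal)   # ~30 full frames/sec
    field_rate = 2 * frame_rate                    # ~60 fields/sec

    print(frame_rate, field_rate)                  # 30.0 60.0

So 30 and 60 can both describe the same mode, depending on whether the
code counts frames or fields.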

> Incidentally, I finally figured out how I got a perfect 1080i picture without
> bob.  It absolutely required the double refresh rate patch, otherwise the
> video drifts in and out of sync every few seconds even with everything else
> identical.  I've watched the CBS broadcast of the Buffalo-Cleveland NFL game
> for over 30 minutes and 100,000 interlaced frames now and it's never once
> lost sync.  I'd appreciate it if others could test this, especially with
> display devices other than the onboard component out (such as DVI with
> the predefined 1920x1080_60 tv modeline).
>
> My set-up:
> -XV video sync enabled in nvidia-settings
> -OpenGL video sync enabled (I require both to stop tearing, ymmv)
> -Use Video As Timebase enabled (this is crucial or else the fps drifts all
> over the place)
> -No zoom or overscan in Mythfrontend's playback settings, at least vertically
> (there's no point trying to sync things if their sizes aren't identical)
> -The double refresh rate patch applied to SVN (0.20 will probably do fine as
> well)
>
> Obviously you can't be dropping frames from a bad recording or a loaded CPU or
> hard disk, and I'm not sure if it will work if you're stuck with a station
> that is broadcasting a combination of interlaced and progressive frames.
> When you start playing the recording, there's a good chance it won't be in
> sync.  If it's not, pause and unpause the video, check any motion for judder,
> and keep pausing and unpausing until it starts up in sync.  I think there's
> about a 50% chance of it being in sync whenever it starts or restarts
> playback, so it shouldn't take more than 5 pause-unpauses to get it right.
> Now, sit back and watch the perfect 1080i picture, resisting the urge to
> pause or skip around and throw things back out of sync ;)
>
>
> -Alex

It appears I was able to duplicate your test there.  I have a 1080i RP
CRT connected via DVI.  I'm still using a modeline, since I'm on the
100.14.11 driver and the built-in modes give me that half-screen bug.
The only setting you listed that I had never tried is 'use video as
timebase'.  I've never been crazy about using video as the timebase,
only because video is much more forgiving about dropping stuff than
audio is.

I think you're correct...when watching 1080i shows that don't mix
frame rates (CBS shows, for example, unlike NBC, which seems to mix
them on dramas and comedies all the time), it does seem to stay in
sync once it gets there, if you just let it play.  I was curious
though...you're not suggesting that's a viable way to use myth, are
you? :D  If I were going to abandon pausing and time shifting, I'd
just flick my DVI switch over to my old Samsung HD receiver to get
true 1080i :D.

An interesting test nonetheless.  I don't know if you've ever seen
this, but a user on the nVidia Linux forum noticed that, when
outputting 1080i with the nVidia Linux drivers, the following:

nvidia-settings -q RefreshRate

...reports 60.05 even though xorg sees it as 59.9.  That very well may
be at the heart of the whole problem of interlacing drifting in and
out of sync at fairly steady intervals.  Apparently it reports that
60.05 regardless of your modeline.  Jeez...I really wish those folks
at nVidia would act like we're alive, you know?
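
Just to put numbers on it: if the output really were running at 60.05Hz
while everything else assumed 59.9, the fields would slip through a full
in-and-out-of-sync cycle every few seconds, which lines up with the
"every few seconds" drift you described.  A quick back-of-the-envelope
(my own arithmetic, nothing measured from the driver):

    reported = 60.05   # what nvidia-settings -q RefreshRate claims
    modeline = 59.9    # what xorg thinks the mode is

    # If the real scanout rate differed from the assumed rate by this
    # much, video and display fields would drift through one complete
    # cycle every 1/|difference| seconds.
    beat_period = 1.0 / abs(reported - modeline)
    print(beat_period)   # ~6.7 seconds per cycle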

Thanks again.

Tom

