[mythtv-users] Bob de-interlace at 1080i

Tom Dexter digitalaudiorock at hotmail.com
Sat Aug 4 21:39:57 UTC 2007

A few days ago on my Gentoo frontend, I made myself a patched ebuild of 
MythTV 0.21.1 SVN 13344 including this patch of Viktor's:


...which changes the refresh rate reported for interlaced content being 
output in interlaced mode by X, in order to allow bob de-interlacing to be 
used.  I've read in a number of places (including the above closed ticket) 
that you shouldn't use bob with interlaced displays.  I take that with a 
grain of salt, especially given the reality that we shouldn't have to use 
_any_ deinterlacing with interlaced displays to begin with (if only nVidia 
would get that straightened out).
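For anyone unfamiliar with what bob actually does: it splits each interlaced 
frame into its two fields and displays them one after the other, line-doubled, 
which doubles the effective frame rate.  A rough sketch (not MythTV's actual 
code, just an illustration with a frame modeled as a list of scanlines):

```python
def bob_deinterlace(frame):
    """Split an interlaced frame into two line-doubled field frames.

    frame: list of scanlines (top field on even lines, bottom on odd).
    Returns two output frames, one per field, shown in sequence --
    hence twice the frame rate of the interlaced source.
    """
    top_field = frame[0::2]      # even scanlines
    bottom_field = frame[1::2]   # odd scanlines
    # Line-double each field back to full frame height.
    first = [line for line in top_field for _ in range(2)]
    second = [line for line in bottom_field for _ in range(2)]
    return first, second

# A 4-line interlaced frame: lines a/c are the top field, b/d the bottom.
f1, f2 = bob_deinterlace(["a", "b", "c", "d"])
# f1 == ["a", "a", "c", "c"], f2 == ["b", "b", "d", "d"]
```

Since each field becomes its own displayed frame, 1080i at 60 fields per 
second needs a 60Hz output rate for bob to work, which is exactly what the 
patch makes possible.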

First of all, this is the _best_ that 1080i content has ever looked on my 
MythTV system...no question about it.  Even the non-HD content on stations 
broadcasting at 1080i looks much better.  Not only that, but bob seems to 
use slightly less CPU than kerndeint on my frontend.  Obviously it doesn't 
affect 720p broadcasts, as de-interlacing gets disabled...they look great as 
they always did.  The patch has had no ill effects whatsoever.

Anyway...all that aside...does anyone know why 1080i content being output as 
1080i at 60Hz gets reported as 30Hz (a refresh interval of 33333) by the 
frontend rather than 60Hz (an interval of 16666) in the first place?  That one 
just didn't make sense to me.  If there were some important reason for it, I'd 
expect the patch to cause problems, and it certainly doesn't appear to.
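For reference, those numbers are refresh periods in microseconds, so the 
conversion to Hz is just the reciprocal (a quick sanity check, assuming the 
frontend reports the interval in microseconds as the values suggest):

```python
def interval_us_to_hz(interval_us):
    """Convert a refresh interval in microseconds to a rate in Hz."""
    return round(1_000_000 / interval_us)

print(interval_us_to_hz(33333))  # 30 -- what the frontend reports for 1080i
print(interval_us_to_hz(16666))  # 60 -- what a 60Hz field rate would be
```

So the frontend is reporting the 30Hz *frame* rate of the interlaced stream 
rather than the 60Hz *field* rate the display is actually running at, which 
is presumably why bob was being refused before the patch.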


