[mythtv] Interlaced vs. Progressive

Isaac Richards ijr at case.edu
Sun Feb 26 06:00:30 UTC 2006


On Sunday 26 February 2006 00:14, David Asher wrote:
> I've been following the development of the Internal video player closely
> because I'd love to punt xine for displaying MythVideo stuff -- esp.
> with the advent of DVD menu support.
>
> But every time I've used it on an .avi (or DVD .iso), the picture
> quality has been terrible.  I never bothered to look into why until
> today.  It turns out that it appears to be entirely attributable to the
> internal player thinking progressive video is interlaced -- and applying
> the deinterlacer to it.  If I disable the deinterlace in TV playback
> options the quality is quite good.
>
> It appears the "culprit" is detectInterlace(), which simply says (for
> non-720-line video) that if the fps is <= 45 it must be interlaced.  These
> videos all report ~23.97fps (the DVD reports 29.97fps).  Of course there is
> the following comment too:
>
>         // The scanning mode should be decoded from the stream, but if it
>         // isn't, we have to guess.
>
> So, obviously, I need to figure out why progressive/interlaced isn't
> being "decoded from the stream".
>
> I'm happy to look into coding an improved interlace detector -- but I
> don't have the slightest idea where to start.  Anybody willing to point
> me in the right direction?  Are there other players that do it better
> which I could learn from?  I suppose I could resort to adding a key to
> toggle deinterlacing from the player...  That's pretty icky, though.
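
For reference, the check being described boils down to roughly the following.
This is a paraphrase of the quoted description, not the actual
detectInterlace() code, and only kScan_Detect is a name confirmed above;
kScan_Progressive and kScan_Interlaced are assumed siblings in the same enum:

    enum FrameScanType { kScan_Detect, kScan_Progressive, kScan_Interlaced };

    static FrameScanType GuessScan(int video_height, float fps)
    {
        if (video_height == 720)    // 720-line material: assume progressive
            return kScan_Progressive;
        if (fps > 45)               // very high frame rates: progressive too
            return kScan_Progressive;
        return kScan_Interlaced;    // ~23.97fps .avi and 29.97fps DVDs land here
    }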

Check avformatdecoder.cpp, where it calls SetVideoParams() (two places).  Both 
call sites are passing kScan_Detect, but they _could_ be telling it what the 
video's scan type actually is.  In the AVFrame that avcodec_decode_video() 
fills in, there are two fields, 'interlaced_frame' and 'top_field_first', 
which give that info.  If you changed those calls to pass that data in to 
SetVideoParams(), it'd probably work properly.
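
A rough sketch of what I mean, reusing the kScan_* names from the snippet
above and assuming the decoded AVFrame is available wherever SetVideoParams()
gets called (the helper and the frame variable name are just illustrative,
not an actual patch):

    #include "avcodec.h"  // AVFrame; exact path depends on the ffmpeg headers

    // Map the per-frame flags from avcodec_decode_video() onto a scan type,
    // falling back to the old guessing when nothing has been decoded yet.
    static FrameScanType ScanFromFrame(const AVFrame *frame)
    {
        if (!frame)
            return kScan_Detect;
        return frame->interlaced_frame ? kScan_Interlaced : kScan_Progressive;
    }

    // Then, at the two call sites in avformatdecoder.cpp, something like:
    //   SetVideoParams(..., ScanFromFrame(&decoded_frame));
    // 'top_field_first' could be passed along as well so the deinterlacer
    // knows the field order.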

Isaac

