[mythtv] Interlaced vs. Progressive

David Asher david.asher at caviumnetworks.com
Mon Feb 27 15:21:45 UTC 2006


Ok, I spent some time looking into this yesterday.

One call to SetVideoParams is in InitVideoCodec.  It looks like this 
would be a great place to use something other than kScan_Detect, but 
that would require decoding some video frames during init.  I started 
thinking about that and realized that there is no reason why the 
interlaced status of a video stream needs to be the same all the way 
through.  For instance, can't an ATSC transmission switch to 480i for a 
little while and then back to 720p?  I don't think it happens in practice 
with any of my stations, but I believe it could.  Also, certain DVDs are 
known to do this, aren't they?

So, it occurred to me that it would be better for the interlacer to be 
turned on and off in GetFrame based on the current avcodec_decode_video 
results.  Does this seem reasonable?  It looks like it would require some 
changes to how NuppelVideoPlayer handles the deinterlacer.  I wanted to 
check before trying to implement it to make sure I'm not running down 
the wrong path.
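
Roughly what I have in mind, as a sketch only -- m_lastScan is a member 
I'd have to add, I'm assuming the scan enum (FrameScanType) has 
progressive/interlaced values alongside kScan_Detect, and I'm writing the 
SetVideoParams() argument list from memory, so check avformatdecoder.cpp 
for the real signature:

    // Sketch: inside AvFormatDecoder::GetFrame, after decoding a
    // picture.  width/height/fps/keyframedist/aspect are assumed to
    // be whatever GetFrame has already computed.
    int gotpicture = 0;
    AVFrame mpa_pic;

    avcodec_decode_video(context, &mpa_pic, &gotpicture,
                         pkt->data, pkt->size);

    if (gotpicture)
    {
        // Use what the codec actually saw instead of guessing from fps.
        FrameScanType scan = mpa_pic.interlaced_frame ?
            kScan_Interlaced : kScan_Progressive;

        if (scan != m_lastScan)  // hypothetical member tracking state
        {
            // Re-issue the video params so NuppelVideoPlayer can turn
            // the deinterlacer on or off for subsequent frames.
            GetNVP()->SetVideoParams(width, height, fps,
                                     keyframedist, aspect, scan);
            m_lastScan = scan;
        }
    }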

Thanks,

David.

Isaac Richards wrote:
> On Sunday 26 February 2006 00:14, David Asher wrote:
>   
>> I've been following the development of the Internal video player closely
>> because I'd love to punt xine for displaying MythVideo stuff -- esp.
>> with the advent of DVD menu support.
>>
>> But every time I've used it on an .avi (or DVD .iso), the picture
>> quality has been terrible.  I never bothered to look into why until
>> today.  It appears to be entirely attributable to the
>> internal player thinking progressive video is interlaced -- and applying
>> the deinterlacer to it.  If I disable the deinterlace in TV playback
>> options the quality is quite good.
>>
>> It appears the "culprit" is detectInterlace, which simply says (for
>> non-720 video) that if the fps is <= 45 it must be interlaced.  These
>> videos all report ~23.97fps (the DVD reports 29.97fps).  Of course
>> there is the following comment too:
>> following comment too:
>>
>>         // The scanning mode should be decoded from the stream, but if it
>>         // isn't, we have to guess.
>>
>> So, obviously, I need to figure out why progressive/interlaced isn't
>> being "decoded from the stream".
>>
>> I'm happy to look into coding an improved interlace detector -- but
>> don't have the slightest idea where to start.  Anybody willing to point
>> me in the right direction?  Are there other players that do it better to
>> learn from?  I suppose I could resort to adding a key to toggle deinterlace
>> from the player...  That's pretty icky, though.
>>     
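
(For anyone following along: the guess in detectInterlace amounts to 
roughly the following.  This is a paraphrase from memory, not the actual 
source.)

    // Paraphrase of the detectInterlace() heuristic: anything that
    // isn't 720-line video and runs at <= 45fps is assumed interlaced.
    if (height != 720 && fps <= 45.0f)
        scan = kScan_Interlaced;
    else
        scan = kScan_Progressive;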
>
> Check avformatdecoder.cpp, where it calls SetVideoParams() (two places).  Both 
> are calling it with kScan_Detect, and they _could_ be telling it what the 
> video style is.  In the AVFrame struct filled in by avcodec_decode_video, 
> there are two fields, 'interlaced_frame' and 'top_field_first', which give 
> that info.  If you got it to use that data when calling SetVideoParams, it'd 
> probably work properly.
>
> Isaac
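
For reference, here's how I'd expect those two AVFrame fields to map to 
a scan type -- again just a sketch, and I'm guessing that 
kScan_Intr2ndField is the right value for the bottom-field-first case:

    // Map libavcodec's per-frame flags to a Myth scan type.
    // interlaced_frame and top_field_first are ints in AVFrame.
    static FrameScanType ScanFromFrame(const AVFrame &pic)
    {
        if (!pic.interlaced_frame)
            return kScan_Progressive;

        // top_field_first tells the deinterlacer which field comes
        // first; bottom-field-first is my guess for kScan_Intr2ndField.
        return pic.top_field_first ?
            kScan_Interlaced : kScan_Intr2ndField;
    }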