[mythtv-users] Correctly behaving 576i (or 480i) from vdpau-capable chip?

Mark Kendall mark.kendall at gmail.com
Sun May 23 12:28:06 UTC 2010

On 23 May 2010 20:13, Paul Gardiner <lists at glidos.net> wrote:
> Hi,
> New TV!! So I want to build a new front end, mainly to be able
> to connect via HDMI. For the next year or so, I don't foresee
> having any HD content to play, so I don't need vdpau, but I'd
> like to build a vdpau-capable system for future proofing.
> For now though, with the entirely SD content (PAL 576i), I'd rather let
> the TV do all the processing (deinterlacing and scaling), and
> not use vdpau at all. I'm planning on using software decoding,
> the "2 x Interlaced" software deinterlacer, with X set up to
> a 576i mode. The TV accepts that mode via HDMI.
> So what I'd like to know, is whether the vdpau-capable chips
> can be set up to run that way. I know some chips/drivers
> don't support interlaced modes. In the past there were problems
> with nVidia cards and interlaced modes (although possibly
> only if using XVMC).
> Anyone know what the current situation is?

As far as I'm aware there shouldn't be any issue with any recent
nvidia card/driver combination - though I haven't really tested for a
while.
Your more likely problem is whether or not the 2x Interlaced filter
works with your new TV. In my testing I found it was very much
hardware dependent whether the TV consistently synced to the input
when using it. If it doesn't, you simply get a lottery: perfect
playback 50% of the time on start-up, pressing pause/unpause to try
to get it in sync, and then occasionally losing sync again whenever
there is a small interruption to playback.
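For what it's worth, getting X into a 576i mode usually comes down to giving the driver an interlaced modeline and selecting it for the screen. A minimal xorg.conf sketch, assuming the standard CEA 720x576i@50 PAL timing and an HDMI-connected TV (the section identifiers here are illustrative, and your TV's EDID may require driver-specific mode-validation options on top of this):

```
Section "Monitor"
    Identifier "TV"
    # Standard PAL 576i timing: 13.5 MHz pixel clock, 50 Hz field rate, interlaced
    ModeLine "720x576i" 13.500 720 732 795 864 576 580 586 625 -hsync -vsync Interlace
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "TV"
    SubSection "Display"
        Modes "720x576i"
    EndSubSection
EndSection
```

Check the X log after restarting to confirm the mode wasn't rejected during mode validation.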



More information about the mythtv-users mailing list