[mythtv-users] Stupid question - deinterlacing

Tom Dexter digitalaudiorock at gmail.com
Sun Feb 22 16:04:05 UTC 2009


On Sun, Feb 22, 2009 at 1:34 AM, Jean-Yves Avenard <jyavenard at gmail.com> wrote:
> Hi
>
> 2009/2/22 Phil Wild <philwild at gmail.com>:
>> Perhaps this is a stupid question. I have a 1080p plasma panel, and
>> watching the OTA 1080i signal, the picture looks really good. To me,
>> that implies that the TV has a good quality deinterlacer built in.
>> I don't know if it is possible, but could mythtv switch the video
>> signal of the graphics card to match the OTA stream and leave it to
>> the TV to do the work of making the picture look as good as it can
>> on the display?
>
> I tried that.
>
> I set my Sony Bravia TV to 1080i and fed it a 1080i signal from
> MythTV after turning off all deinterlacing; the result looked like
> crap.
>
> It was as if no deinterlacing had happened at all.
>
> Some people commented that there was an issue with nvidia cards not
> outputting an interlaced signal properly ...
>
> Jean-Yves

Yup...anyone with a native 1080i display (like my Hitachi RPCRT) can
tell you that with any Linux nVidia driver newer than some ancient
version (8183, I think) it's impossible to get correct 1080i output
of 1080i content without enabling deinterlacing in MythTV.  A common
symptom on a 1080i display is that the picture looks great for 10 or
15 seconds, then the fields drift out of sync, causing unspeakable
motion blur.
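
For anyone who wants to try it anyway, the setup people have been
attempting looks roughly like this.  Take it as a sketch only: the
modeline is the standard CEA-861 1080i60 timing, and the section
identifiers and the ExactModeTimingsDVI option are just the usual
suggestions that get passed around on the forums, so adjust for your
own card and connection:

  Section "Monitor"
      Identifier "TV"
      # standard CEA-861 1080i60 timing: 74.25 MHz pixel clock,
      # 2200x1125 total, interlaced (60 fields / 30 frames per second)
      ModeLine "1920x1080_60i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
  EndSection

  Section "Device"
      Identifier "nvidia0"
      Driver     "nvidia"
      # ask the driver to use the modeline timings verbatim over DVI/HDMI
      Option     "ExactModeTimingsDVI" "TRUE"
  EndSection

  Section "Screen"
      Identifier "Screen0"
      Device     "nvidia0"
      Monitor    "TV"
      DefaultDepth 24
      SubSection "Display"
          Depth 24
          Modes "1920x1080_60i"
      EndSubSection
  EndSection

In my experience the mode validates and X claims to be running it,
but you still get the field-sync drift described above.  That's
exactly the point: the config isn't the problem, the driver is.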

Since proper 1080i output would obviously work on a native 1080i
display, that tells me the driver is doing something very wrong
there...something that would just as surely screw up a 1080p TV's
deinterlacer.
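
For the 1080p folks, the practical workaround is to run X at a
progressive mode and let MythTV do the deinterlacing, so the TV never
sees an interlaced signal at all.  Roughly (the menu path and
deinterlacer names below are from 0.21's playback profiles as I
remember them, so treat this as a sketch):

  Utilities/Setup -> Setup -> TV Settings -> Playback
      -> Playback Profiles -> edit the active profile
      -> set the deinterlacer to "Bob (2x)" (cheap, needs Xv),
         or "Linear blend" / "Kernel" if Bob shimmers on your panel

and give X a plain progressive modeline (1920x1080 at 60p, or 720p if
the card can't keep up).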

1080i TV owners have been griping about this on the nVidia Linux forum
for literally years without a peep out of nVidia.  If this is in fact
what's behind the problem 1080p owners are having, I hope some of them
take up the fight.

Tom

