[mythtv-users] 1080i to 1080p Deinterlacing on Backend was Raspberry Pi now ships with 512MB RAM

Joseph Fry joe at thefrys.com
Wed Oct 17 16:07:32 UTC 2012


>> > On 10/16/2012 11:00 PM, Michael T. Dean carved the following into a
>> > picnic table:
>> >> And, yes, a system that's transcoding everything to H.264 (versus
>> >> storing it in its native recording format) does make for an extremely
>> >> power hungry system.
>> >>
>> > You have just touched on something I have looked at a few times and
>> > never successfully figured out.  I already have a powerful backend
>> > (Xeon E3) with power to spare because it does other things like
>> > hosting VMs for building MythTV, cloud testing, etc.  It can commflag
>> > at 1200+ fps on 1080i content.  What I have noticed is that 1080i
>> > content looks pretty crappy no matter which deinterlacer I use on the
>> > frontends (mixture of ION, ION2, 9400, GT430 all running VDPAU).  720p
>> > content looks better and Blu-Ray looks stunning without even making
>> > the frontends break a sweat.
>> >
>> > Is there a way to take my 1080i recordings and just deinterlace them
>> > on the backend?  I don't care about transcoding out the commercials,
>> > don't care what format they end up in, don't care about disk space, as
>> > long as I can use whatever deinterlace algorithm looks best to my eyes
>> > when it's played.  Maybe I don't understand the limitations of 1080i
>> > deinterlacing, but it seems that with a good enough transcode, the end
>> > result should be somewhere between 720p and Blu-Ray.
>>
>> This would only be better than deinterlacing on playback if it allowed
>> you to use a better-quality deinterlacer while transcoding.  However,
>> you're unlikely to find a deinterlacer better than the ones VDPAU
>> provides.  So, it's likely that there's some other problem (perhaps even
>> that your system isn't using the deinterlacer you think).
>>
> It is entirely possible (and seemingly more likely as we discuss it) that
> I don't understand what I am doing with the VDPAU settings. Your argument
> does make sense.  I was working on the assumption that given sufficient
> computing power, ffmpeg could do a 2-pass deinterlace that was better than
> what realtime VDPAU is capable of on a 9400.  If that assumption is flawed,
> then, no problem: I will just go back to trying to get a setting that looks
> good on the frontend.
>
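
On Michael's point about whether the frontend is really using the
deinterlacer you selected: a quick sanity check (just a sketch; the exact
log wording varies by MythTV version, and if your build logs to a file by
default, grep that file instead) is to start mythfrontend from a terminal
with playback logging enabled and watch what it reports while a 1080i
recording plays:

    # filter the playback log for deinterlacer messages
    mythfrontend -v playback 2>&1 | grep -i deint

If that shows a one-field or bob deinterlacer rather than the advanced one
you picked, that alone would explain 1080i looking soft.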

I don't think your assumption is flawed... however, there is a bit more
going on than I think you realize.

If you were to pre-deinterlace a file, you would be transcoding it in the
process: essentially rendering each frame, deinterlacing it, and then
recompressing it.  So let's say you have the best deinterlacer you could
possibly imagine and get the best possible progressive frames... you still
have to compress them into a new video file, and that is where your effort
is wasted, because every recompression introduces new artifacts.  In the
end, the resulting video may actually look worse than your current
interlaced recording played through a real-time deinterlacer.
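
That said, if you did want to experiment with deinterlacing on the backend,
something along these lines is the usual starting point (just a sketch: the
filenames are examples, the quality settings would need tuning, and it
assumes an ffmpeg build with libx264):

    # yadif=1 emits one progressive frame per field (50/60 fps output),
    # libx264 at a low CRF keeps the recompression loss small, and the
    # audio stream is copied through untouched
    ffmpeg -i recording-1080i.mpg -vf yadif=1 \
        -c:v libx264 -preset slow -crf 18 \
        -c:a copy recording-1080p.mkv

The point stands, though: even at a low CRF you are layering new compression
artifacts on top of whatever the broadcast already had.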

If your 9400 doesn't deinterlace well, you may consider not deinterlacing
1080i material at all.  My TV looks almost as good without deinterlacing
(on 1080i content only), even though my card is outputting a 1080p signal;
I'm not sure how or why that works, but it does.  You can also try having
your video output switch to 1080i by setting separate modes for playback in
the setup and letting your TV use its built-in deinterlacing (see
http://www.mythtv.org/wiki/User_Manual:JudderFree).
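
If you go the 1080i-output route, it is worth first checking that your
driver actually exposes an interlaced 1080 mode to switch to.  A rough
sketch (the HDMI-0 output name is just an example, and the mode naming
varies by driver):

    # list advertised modes; interlaced 1080 usually shows up with an
    # "i" suffix, or with an Interlace flag in the verbose listing
    xrandr | grep -i 1080
    xrandr --verbose | grep -i interlace

If the mode is there, the separate playback video-mode settings described
on that wiki page let mythfrontend switch to it only while video is playing.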