[mythtv-users] high cpu utilisation with bob deinterlacer

Bill Williamson bill at bbqninja.com
Sun Feb 22 00:19:06 UTC 2009


On Sun, Feb 22, 2009 at 10:20 AM, Adam <adam at fastmail.com.au> wrote:

> Hi,
> Does anyone know why some of the deinterlacers (such as BOB) cause
> extremely high Xorg cpu usage when playing livetv or a recording inside
> mythtv?
>
> My system is as follows: Intel Core 2 Duo 2.4GHz, 2GB RAM, GeForce
> 9600GT with HDMI out, Hauppauge Nova-T-500 DVB-T card.
>
> When playing back a 1080 stream (with the BOB deinterlacer) the cpu
> utilisation is typically:
> mythfrontend - ~50%
> X - ~95%
>
> If I use the linear deinterlacer the cpu usage is more like:
> mythfrontend - ~45%
> X - ~5%
>
> Although live tv does play back perfectly with the bob deinterlacer, I'm
> not happy about the high cpu usage that results when watching tv.
>
> I'm running mythtv-0.21_p18314-r1, 32-bit gentoo with kernel 2.6.27-r8,
> xfce 4.4.3 and nvidia-drivers-177.82. Since experiencing this problem I
> have tried adding the UseEvents option to my xorg.conf under the nvidia
> device section and I have tried upgrading my nvidia drivers to 188.29.
>
> I look forward to hearing any suggestions.
>

It's because of what they do.

Linear blend is "dumb" and simply blends the two fields of each frame
together into one progressive frame.  It is about the simplest possible
deinterlacer.

Bob splits each interlaced frame into its two fields, line-doubles each
field, and displays them in sequence, doubling the output framerate.
That takes noticeably more CPU, especially since you're pushing at least
2x the amount of data through X.
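The difference can be sketched in a few lines of Python (not MythTV code, just an illustration, with a frame modelled as a list of scanlines):

```python
# Illustrative sketch only: one interlaced frame = list of scanlines,
# each scanline a list of pixel values.

def linear_blend(frame):
    # Linear blend: average each line with the one below it.
    # One output frame per input frame -- output framerate unchanged.
    out = []
    for i in range(len(frame)):
        below = frame[min(i + 1, len(frame) - 1)]
        out.append([(a + b) / 2 for a, b in zip(frame[i], below)])
    return [out]  # a single progressive frame

def bob(frame):
    # Bob: split the frame into its even and odd fields, then
    # line-double each field. Two full-height output frames per
    # input frame -- framerate and pixel traffic both double.
    frames = []
    for parity in (0, 1):
        doubled = []
        for line in frame[parity::2]:
            doubled.append(line)
            doubled.append(list(line))  # duplicate each field line
        frames.append(doubled)
    return frames
```

For every input frame, bob hands X two full-height frames instead of one, which is why the X process (doing the scaling and blitting) burns so much more CPU than with linear blend.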
