[mythtv-users] high cpu utilisation with bob deinterlacer
Adam
adam at fastmail.com.au
Sun Feb 22 00:39:20 UTC 2009
Bill Williamson wrote:
> On Sun, Feb 22, 2009 at 10:20 AM, Adam <adam at fastmail.com.au
> <mailto:adam at fastmail.com.au>> wrote:
>
> Hi,
> Does anyone know why some of the deinterlacers (such as BOB) cause
> extremely high Xorg cpu usage when playing livetv or a recording
> inside
> mythtv?
>
> My system is as follows: Intel Core 2 Duo 2.4GHz, 2GB RAM, GeForce
> 9600GT with HDMI out, Hauppauge Nova-T-500 DVB-T card.
>
> When playing back a 1080 stream (with the BOB deinterlacer) the cpu
> utilisation is typically:
> mythfrontend - ~50%
> X - ~95%
>
> If I use the linear deinterlacer the cpu usage is more like:
> mythfrontend - ~45%
> X - ~5%
>
> Although live tv does play back perfectly with the bob
> deinterlacer, I'm
> not happy about the high cpu usage that results when watching tv.
>
> I'm running mythtv-0.21_p18314-r1, 32-bit gentoo with kernel
> 2.6.27-r8,
> xfce 4.4.3 and nvidia-drivers-177.82. Since experiencing this
> problem I
> have tried adding the UseEvents option to my xorg.conf under the
> nvidia
> device section and I have tried upgrading my nvidia drivers to 188.29.
>
> I look forward to hearing any suggestions.
>
>
> It's because of what they do.
>
> Linear blend is "dumb" and simply blends two frames together. It is
> the simplest possible deinterlacer.
>
> Bob doubles the framerate and then doubles the linecount of each
> frame. This takes a bit of cpu to do, especially since you're
> outputting at least 2x the amount of data.
>
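To make the difference concrete, here is a rough sketch (not MythTV's actual implementation) of what the two approaches do to an interlaced frame, treating a frame as a list of scanlines. Bob splits the frame into its two fields and line-doubles each one, so you output two full-height frames per input frame; linear blend just averages adjacent lines into a single output frame:

```python
def bob_deinterlace(frame):
    """Split an interlaced frame into top/bottom fields and line-double
    each field, yielding two full-height frames (2x the frame rate)."""
    top = frame[0::2]      # even scanlines = top field
    bottom = frame[1::2]   # odd scanlines = bottom field

    def line_double(field):
        out = []
        for row in field:
            out.append(row)       # original field line
            out.append(row[:])    # repeated to restore full height
        return out

    return line_double(top), line_double(bottom)


def linear_blend(frame):
    """Blend each scanline with the one below it; one output frame,
    same size as the input."""
    out = []
    for i in range(len(frame)):
        j = min(i + 1, len(frame) - 1)   # clamp at the last line
        out.append([(a + b) // 2 for a, b in zip(frame[i], frame[j])])
    return out
```

Note that bob emits twice as many frames, each built from a half-height field, which is where the extra per-frame work and the doubled output bandwidth come from.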
Since you mentioned this point I'll add that I have a mythtv client
with a 3GHz Core 2 Duo processor, 4GB of RAM and a GeForce 8800GT.
The CPU usage I get on this PC while watching an HD stream is
typically:
mythfrontend - 25%
X - 4%
That's a massive difference considering the CPU is only slightly faster.