[mythtv-users] Very variable CPU usage from X using bob deinterlace

junk junk at giantblob.com
Sun Jan 28 22:39:24 UTC 2007


Paul Bender wrote:
> junk wrote:
>   
>> Hi,
>>
>> I'm experiencing very high and very variable CPU usage from X during DVB 
>> playback with the 'bob' deinterlacer - sometimes X uses a few percent of 
>> the CPU, sometimes it uses nearly 90 percent. I do have cpufreq turned 
>> on with the ondemand governor, but this is not enough to explain the 
>> variation I'm seeing (as much as a hundredfold difference in load). 
>> There is no apparent difference in playback quality when the load 
>> changes and the load changes up and down while watching a single 
>> program. Switching deinterlacing off or choosing a different 
>> deinterlacing algorithm results in consistent CPU load.
>>
>> Does anyone else see this? Is it expected behaviour? I understand that 
>> bob is CPU intensive, but it seems odd that it sometimes takes 
>> practically no CPU and other times it nearly swamps my machine.
>>
>> My setup is:
>>    AMD Athlon 64 3000+ (1800MHz), 2GB memory, NForce motherboard (6500, 
>> I think?), using the built-in TV-Out to a PAL TV
>>    Gentoo 2006.1
>>    Linux 2.6.19.2, (vanilla kernel, not gentoo kernel)
>>    modular X.Org server (from stable portage)
>>    recent proprietary NVidia drivers (emerge ~x86)
>>    recent MythTV (from SVN a few days ago)
>>    'Standard' output (not XvMC, as it seems to crash X and/or the frontend)
>>
>> Thanks,
>> -- jeek
>>     
>
> Yes, I have seen this behavior. Adding
>
> Option "UseEvents" "true"
>
> to the nvidia device section in xorg.conf solved the problem.

Thanks - that seems to have fixed it.
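
In case it helps anyone else searching the archives, the change is just one extra line in the nvidia Device section of xorg.conf; it ends up looking something like the sketch below (the Identifier here is only an example, keep whatever name your existing xorg.conf already uses):

Section "Device"
    Identifier  "nvidia0"        # example name, match your own config
    Driver      "nvidia"         # proprietary NVidia driver
    Option      "UseEvents" "true"
EndSection

Restart X after editing for the option to take effect.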

-- jeek
