On 12/20/06, <b class="gmail_sendername">Steven Adeff</b> <<a href="mailto:adeffs.mythtv@gmail.com">adeffs.mythtv@gmail.com</a>> wrote:<div><span class="gmail_quote"></span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
On 12/19/06, foo bar <<a href="mailto:foobum@gmail.com">foobum@gmail.com</a>> wrote:<br>> On 12/17/06, J. Miller <<a href="mailto:mythtv.org@elvenhome.net">mythtv.org@elvenhome.net</a>> wrote:<br>> > I've been fighting the scourge of myth causing X to consume 100% of the
<br>> > CPU when Xv Sync to Vblank is turned on in nvidia-settings. Everything<br>> > works just fine with it unchecked but that is the only sync setting that<br>> > gives me a perfect picture with no tearing. Myth's native OpenGL sync
<br>> > still tears a little. The problem turned out to be the modeline I was<br>> > using. It was a 1368x768 56Hz modeline which apparently triggered the<br>> > issue. I switched to a 1360x768 60Hz modeline and now everything works
<br>> > peachy. X still starts consuming large amounts of CPU occasionally but<br>> > it quickly drops back down to the 2-4% range. I'm not 100% sure that it<br>> > was the refresh but a modeline change did the trick. Hopefully posting
<br>> > this here will help point others that have the same problem in a new<br>> > direction.<br>><br>> NVIDIA have added an option in (I think) 9626 to poll() instead of busy-waiting<br>> for vblank, which fixes the high CPU usage when sync to vblank is switched
<br>> on. It's called "UseEvents".<br><br>interesting, you mean in the nvidia-settings Xv section?<br><br>I'll have to give this a try.<br>--<br>Steve<br></blockquote></div><br>No - in the device section of
xorg.conf<br><br>e.g.<br><br><pre>Section "Device"
    ...
    Option "UseEvents" "True"
EndSection</pre><br>
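<br>For reference, a fuller sketch of what such a Device section might look like (the Identifier value here is an illustrative placeholder, not from the original message; only the UseEvents option is the change being discussed):<br><br><pre>Section "Device"
    Identifier "NVIDIA GPU"         # placeholder identifier
    Driver     "nvidia"             # the NVIDIA proprietary driver
    Option     "UseEvents" "True"   # wait on events instead of busy-waiting for vblank
EndSection</pre><br>Note that you will need to restart the X server for changes to xorg.conf to take effect.<br>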