[mythtv-users] nvidia 177.82 pegs mythfrontend when xscreensaver activates during paused program?

Adam Stylinski kungfujesus06 at gmail.com
Tue Jan 27 00:10:04 UTC 2009


Version 180 of the nvidia drivers recently caused me to have terribly
jittery framerates.  It was most noticeable on higher resolution videos
(1080i), but was still noticeable on the lower resolution progressive
scan streams (720p) as well.  Nvidia needs to stop breaking things :(.

On Sun, Jan 18, 2009 at 11:32 AM, Tom Dexter <digitalaudiorock at gmail.com> wrote:

> On Sun, Jan 18, 2009 at 1:45 AM, Yeechang Lee <ylee at pobox.com> wrote:
> > Tom Dexter <digitalaudiorock at gmail.com> says:
> >> Has anyone experienced the freezes described here using the 180.22
> >> drivers:
> >>
> >> http://www.nvnews.net/vbulletin/showthread.php?t=123912
> >>
> >> Except for the fact that I get a black screen rather than the
> >> corrupted video described there, this is very much like what I'm
> >> running into with 177.82.
> >
> > I use 173.14 and have seen the hangs (of the type you describe, with a
> > black screen) on my frontend . . .
> >
> >> I neglected to mention in my original post that I'm using xv-blit
> >> output with opengl sync enabled.
> >
> > . . . only when I enable OpenGL sync in mythfrontend (I also use
> > xv-blit, as most MythTV users do). Enabling it reliably causes a hang
> > within an hour or two. I can't figure out what triggers it, exactly,
> > but am pretty sure I've seen it outside playback (such as while
> > looking over a list of upcoming movies).
> >
> > Disabling OpenGL sync is no hardship; as I've mentioned in a few
> > other messages over the past few years, RTC works just as well as
> > OpenGL does in my experience.
> >
>
> Wow...that's definitely the culprit.  I enabled RTC in my kernel,
> disabled OpenGL timing in MythTV, and made sure it was in fact using
> RTC.  After that I was even able to go back to the 177.82 driver and
> could not reproduce the xscreensaver freeze.  With that driver I tried
> enabling OpenGL timing again, and it only took two tries to get the
> paused video and xscreensaver to send the CPU into orbit.
>
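For anyone curious what "using RTC" means in practice: as far as I
understand it, the RTC timing method simply programs the legacy /dev/rtc
device for periodic interrupts and blocks on read() to pace frame display,
rather than waiting on the driver's OpenGL vertical sync.  Below is a
rough, untested sketch of that idea, not MythTV's actual code; 1024 Hz is
just a commonly suggested rate, and anything above 64 Hz normally needs
root or a raised /proc/sys/dev/rtc/max-user-freq:

    /* Rough sketch: pace frames off RTC periodic interrupts.
     * Assumes the legacy /dev/rtc interface; rates above 64 Hz
     * normally need root or a raised /proc/sys/dev/rtc/max-user-freq. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/rtc.h>

    int main(void)
    {
        unsigned long data;
        int i, fd = open("/dev/rtc", O_RDONLY);

        if (fd < 0) { perror("open /dev/rtc"); return 1; }

        /* 1024 interrupts per second gives roughly 1 ms timing granularity. */
        if (ioctl(fd, RTC_IRQP_SET, 1024) < 0) { perror("RTC_IRQP_SET"); return 1; }
        if (ioctl(fd, RTC_PIE_ON, 0) < 0)      { perror("RTC_PIE_ON");   return 1; }

        for (i = 0; i < 1024; i++) {
            /* Each read() blocks until the next interrupt; the upper
             * bytes of 'data' count interrupts since the last read. */
            if (read(fd, &data, sizeof(data)) < 0) { perror("read"); break; }
            /* ...decide here whether it's time to show the next frame... */
        }

        ioctl(fd, RTC_PIE_OFF, 0);
        close(fd);
        return 0;
    }

The real player code is of course more involved, but that's the basic
mechanism as I understand it.
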
> I certainly can't see any difference in picture quality using RTC.  It
> could be my imagination, but if anything it almost seems to look
> better.  What's more, it seems to have corrected a small issue I've
> had for a long time: the combination of bob x2 with 1080i output that
> I've always used had a minor issue with some shows (dramas mostly) on
> NBC...after the commercials, which often run at a different frame rate
> than the show, I'd get a momentary video speed-up for a second.  Using
> RTC it doesn't do that...amazing.
>
> I actually feel much better about using RTC anyway, as the less
> dependent I am on nVidia's proprietary crap the better.  That's one of
> the reasons I don't use XvMC.
>
> 1000 thanks for the suggestion.  This has worked out just great all
> the way around.
>
> By the way...what's up with Device Drivers -> Real Time Clock vs.
> Device Drivers -> Character devices -> Enhanced Real Time Clock
> Support (the one that Myth apparently uses) in the kernel config?  I
> can't tell you how confused that had me.
>
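If it helps: as far as I can tell, the "Enhanced Real Time Clock Support"
entry under Character devices is the old-style driver that provides the
/dev/rtc interface used above, while the separate "Real Time Clock" menu
is the newer RTC class framework that shows up as /dev/rtc0; I may be
wrong about which one a given MythTV build ends up using.  A quick way to
sanity-check what your kernel actually exposes, and what interrupt rate
unprivileged users are allowed, is just to dump the usual proc entries:

    /* Quick check of the RTC bits the kernel exposes.  Paths are the
     * usual ones on 2.6 kernels; adjust if yours differ. */
    #include <stdio.h>

    static void dump(const char *path)
    {
        FILE *f = fopen(path, "r");
        int c;

        if (!f) { printf("%s: not present\n", path); return; }
        printf("--- %s ---\n", path);
        while ((c = fgetc(f)) != EOF)
            putchar(c);
        fclose(f);
    }

    int main(void)
    {
        dump("/proc/driver/rtc");                /* RTC driver status */
        dump("/proc/sys/dev/rtc/max-user-freq"); /* max Hz for non-root users */
        return 0;
    }

If I remember right, the usual advice is to raise max-user-freq to 1024
so mythfrontend can use RTC timing without running as root.
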
> Tom
> _______________________________________________
> mythtv-users mailing list
> mythtv-users at mythtv.org
> http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
>