[mythtv-users] Nvidia just released 8756.

Steven Adeff adeffs.mythtv at gmail.com
Mon Apr 10 15:57:26 UTC 2006


On 4/10/06, Jesse Guardiani <jesse at wingnet.net> wrote:
> Steven Adeff wrote:
> > On 4/10/06, David Asher <david.asher at caviumnetworks.com> wrote:
> >
> >> My bad.  Please ignore me.
> >>
> >> For some reason, even though I installed the 8756 kmdl, Xorg was
> >> still running 8178.
> >>
> >> With 8756, OpenGL Vsync IS fixed on my 5200-based card.
> >>
> >> David.
> >>
> >>
> >
> > Good news. Time to recompile MythTV with OpenGL Vsync support.
> > Now I'm questioning whether I should try running XvMC too, or not...
> >
>
> If you can live without Bob deint (i.e. you don't mind the deint
> artifacts, or you run an interlaced modeline), then XvMC works
> perfectly for me in SD and HD.
>
> However, if you enable Bob then all bets are off: I get all kinds
> of problems with Bob enabled.
>
> I currently run XvMC for SD and HD on a progressive-scan LCD TV and
> I'm very happy with it. I'd prefer to have Bob work correctly, but I
> can live with the artifacts in exchange for the reduced CPU
> utilization. My 2.93GHz Celeron D won't play back 1080i without
> XvMC anyway.
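For what it's worth, the "recompile with OpenGL Vsync support" step is just a matter of configure flags. On my 0.19-era source tree it looks roughly like the following (flag names can change between MythTV versions, so check ./configure --help on your tree first):

```
# Build MythTV with XvMC and OpenGL vsync enabled.
# Flag names are from a 0.19-era tree; confirm with ./configure --help.
cd mythtv
./configure --enable-xvmc --enable-opengl-vsync
make
sudo make install
```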

Jesse,
 I take it your email refers to using XvMC?

I seem to need Bob (or some sort of deint) for 1080i output even for
1080i content (possibly because I'm not using OpenGL Vsync?). I'll
enable both OpenGL Vsync and XvMC in my build and play around with it.
I've got Myth set up to output 1080i content at 1080i (my TV's native
resolution) and 720p content at 720p. I'd run 720p at 1080i instead,
except that Myth/X doesn't seem to like that: I get all sorts of
stuttering, weird spikes in CPU usage, etc. Once I have the new build
I'll try all sorts of combinations of these to see how things work
out. I'd love to cut my CPU usage for HD with XvMC, provided things
remain stable and I can live with a black-and-white OSD menu for now.
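For anyone setting up the same thing: the 1080i mode I'm driving is basically the standard SMPTE 1080i60 timing. In xorg.conf it would look something like the sketch below (the Identifier is a placeholder and the numbers are the common published timing, so verify them against your own TV's EDID before using them):

```
Section "Monitor"
    Identifier "TV"    # placeholder name; match your Screen section
    # Standard SMPTE 1080i60 timing: 74.25 MHz pixel clock, interlaced.
    # Verify against your display's EDID before relying on it.
    Modeline "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 Interlace +HSync +VSync
EndSection
```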


--
Steve

