[mythtv-users] Nvidia just released 8756.

Steven Adeff adeffs.mythtv at gmail.com
Mon Apr 10 21:56:52 UTC 2006


On 4/10/06, Jesse Guardiani <jesse at wingnet.net> wrote:
> Steven Adeff wrote:
> > On 4/10/06, Jesse Guardiani <jesse at wingnet.net> wrote:

<snip>

> > I seem to need BOB (or some sort of deint) for 1080i output even of
> > 1080i content (possibly because I'm not using OpenGL Vsync?).
>
> Not sure what you mean. I'm running an nvidia 6200 AGP -> DVI-D -> 32"
> LCD TV @ 1360x768.
> So I'm treating my TV as a monitor. I don't "need" Bob. I don't "need"
> any deinterlacing. Some people are annoyed by deinterlacing artifacts,
> but they don't bother me or my family much.
>
> Whether you're in the category of people who need Bob is
> probably up to you.

For 1080i content, if I don't use a deint filter, I get back-and-forth
flicker where it looks like the previous frame is being played after
the current one. Turning on Bob (the first filter I tried) fixes this,
though if the TV and my video output were synced, it shouldn't be
needed.

Like I said, I need to play around to see why this is happening...
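For anyone curious what Bob actually does to kill that flicker, here's a toy sketch (not MythTV's actual code): it splits each interlaced frame into its two fields and line-doubles each field into a full frame, so the fields display in capture order instead of fighting each other.

```python
# Toy sketch of Bob deinterlacing (not MythTV's implementation):
# an interlaced frame carries two fields woven together; Bob pulls
# them apart and line-doubles each one into its own full frame.
def bob_deinterlace(frame):
    """frame: list of scanlines; even lines = top field, odd = bottom."""
    top = frame[0::2]      # even scanlines (top field)
    bottom = frame[1::2]   # odd scanlines (bottom field)
    double = lambda field: [line for line in field for _ in (0, 1)]
    return double(top), double(bottom)

frame = ["T0", "B0", "T1", "B1"]  # 4-line toy frame
first, second = bob_deinterlace(frame)
print(first)   # ['T0', 'T0', 'T1', 'T1']
print(second)  # ['B0', 'B0', 'B1', 'B1']
```

Without something like this, both fields get shown at once, which is the "frame before played after" look I was describing.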


>
> >  I'll
> > enable both OpenGL Vsync and XvMC in my build and play around with it.
> > I've got Myth setup to run 1080i content at 1080i (which my TV is
> > native at) and 720p at 720p output to my tv.
>
> OK. That's complicated. I just scale everything in software to 1360x768.
> I'm not sure how that
> would affect you.

I tried scaling 720p to 1080i, but I get horrible video and audio
stutter, so I just have Myth output 720p as 720p, which solves the
problem. Again, this may be something that OpenGL or switching to the
DVI cable will fix.
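If anyone wants to try the OpenGL route before I do: with the NVIDIA proprietary driver (my assumption here), the driver can be told to sync GL buffer swaps to the display's vertical blank, which is what Myth's "OpenGL VSync" option relies on. Something like this before launching the frontend:

```shell
# Assumes the NVIDIA proprietary driver: ask it to sync OpenGL
# buffer swaps to the display's vertical blank interval.
export __GL_SYNC_TO_VBLANK=1
echo "__GL_SYNC_TO_VBLANK=$__GL_SYNC_TO_VBLANK"
# then start the frontend as usual, e.g.:
#   mythfrontend
```

No promises it cures the stutter, but it's a cheap thing to rule out.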

--
Steve


More information about the mythtv-users mailing list