[mythtv-users] Nvidia just released 8756.

Jesse Guardiani jesse at wingnet.net
Mon Apr 10 16:16:25 UTC 2006


Steven Adeff wrote:
> On 4/10/06, Jesse Guardiani <jesse at wingnet.net> wrote:
>   
>>> Good news. Time to recompile MythTV with OpenGL Vsync support.
>>> Questioning whether I should try and run XvMC now too, or not...
>>>
>> If you can live without Bob deint (i.e. you don't mind the deint
>> artifacts or you run an interlaced modeline) then it works perfectly
>> for me in SD and HD.
>>
>> However, if you enable Bob then all bets are off. I get all kinds of
>> problems with Bob enabled.
>>
>> I currently run XvMC for SD and HD on a progressive-scan LCD TV and
>> I'm very happy with it. I'd prefer to have Bob work correctly, but I
>> can live with the artifacts in trade for the reduced CPU utilization.
>> My 2.93GHz Celeron D won't play back 1080i without XvMC anyway.
>>
>
> Jesse,
>  I take it your email refers to using XvMC?
>   

Yes.

> I seem to need Bob (or some sort of deint) for 1080i output even for
> 1080i content (possibly because I'm not using OpenGL Vsync?).

Not sure what you mean. I'm running an nvidia 6200 AGP -> DVI-D -> 32"
LCD TV @ 1360x768. So I'm treating my TV as a monitor. I don't "need"
Bob. I don't "need" any deinterlacing. Some people are annoyed by
deinterlacing artifacts, but they don't bother me or my family much.
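
In case it helps, the idea on my end is just to pin X to the panel's
native 1360x768 mode over DVI. Here's a rough sketch of the relevant
xorg.conf bits. The identifiers are placeholders, ExactModeTimingsDVI
is only needed if the driver substitutes its own DVI timings, and you
should sanity-check everything against the nvidia driver README rather
than treating it as a copy of my exact config:

  Section "Monitor"
      Identifier "LCDTV"
      # If the TV's EDID doesn't advertise 1360x768, generate a
      # modeline with gtf (e.g. "gtf 1360 768 60") and add it here.
  EndSection

  Section "Device"
      Identifier "nvidia0"
      Driver     "nvidia"
      # Keep the driver from quietly picking a "close enough" DVI mode
      Option     "ExactModeTimingsDVI" "true"
  EndSection

  Section "Screen"
      Identifier   "Screen0"
      Device       "nvidia0"
      Monitor      "LCDTV"
      DefaultDepth 24
      SubSection "Display"
          Depth 24
          Modes "1360x768"
      EndSubSection
  EndSection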

Whether or not you fall into the category of people who need Bob is
probably something only you can decide.


>  I'll
> enable both OpenGL Vsync and XvMC in my build and play around with it.
> I've got Myth setup to run 1080i content at 1080i (which my TV is
> native at) and 720p at 720p output to my tv.

OK. That's complicated. I just scale everything in software to
1360x768, so I'm not sure how that would affect you.
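
On the build side, for what it's worth, it should only take a couple of
configure switches. A minimal sketch, assuming the flag names from the
0.19-era configure script (run ./configure --help in your checkout to
confirm the exact spelling before relying on them):

  # from the top of the mythtv source tree
  ./configure --enable-xvmc --enable-opengl-vsync
  make
  make install   # as root

You still have to switch the corresponding options on in the frontend's
playback settings afterwards, of course.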

-- 
Jesse Guardiani
Programmer/Sys Admin
jesse at wingnet.net


