[mythtv-users] 720x480 interlaced modeline for nvidia/svideo
Louie Ilievski
loudawg at comcast.net
Fri Mar 10 16:47:51 UTC 2006
On Friday 10 March 2006 07:40, Nick Bartos wrote:
> So why did you go back to the nvidia card if the 350 works so well? If it
> would be better, I am definitely willing to buy another card and put it
I got a 5200 card because, at the time, Xv support was still very immature and I
couldn't watch videos in MythVideo: it couldn't keep up, and the image wasn't
being scaled properly with Xv. Also, I wanted to try MythGame and had read that
the 350 doesn't cut it for things like emulation (ironically, I STILL haven't
set up MythGame :-P ). On top of that, MythMusic's GOOM would not be possible on
the 350. The decoder was also a little sketchy back when I was using it; doing
really fast FF would usually cause the screen to get all colorful and the
decoder to lock up.
So, just the combination of many things drove me to try out a 5200. I was
building a new system anyway, and just decided to get fresh components and
try something new...everyone on the lists said they were using one anyway.
The 5200 definitely has its advantages. Especially if you're trying to
drive a new HD TV using DVI or VGA, the 350 is not an option at all (of
course, I don't have a TV with that capability yet).
> in. How is the 350 output compared to using the nvidia + bob deinterlace.
> Is there a noticeable difference?
The 350 output is simply stunning. It beats the nvidia output hands down, any
day. The picture is perfect. Not much more to say there.
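Since the thread subject asks about a 720x480 interlaced modeline, here is a
sketch of what such an entry might look like in the Monitor section of
xorg.conf. The timings below are illustrative, based on standard NTSC/BT.601
numbers (13.5 MHz pixel clock, 858x525 totals); the exact porch and sync values
may need tweaking for a particular card and TV, and note that the nvidia binary
driver's TV-out path may ignore custom modelines entirely:

```
Section "Monitor"
    Identifier "TV"
    # Illustrative 720x480 interlaced (NTSC-ish) modeline:
    # name       clock  hdisp hss  hse  htot vdisp vss vse vtot flags
    ModeLine "720x480i" 13.50  720  736  800  858   480 486 492 525 Interlace
EndSection
```

The mode would then be listed by name ("720x480i") in the Modes line of the
relevant Display subsection in the Screen section.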