[mythtv-users] nVidia FX5200 - recommendations please

Brian Wood beww at beww.org
Mon May 1 16:33:13 UTC 2006


On May 1, 2006, at 9:43 AM, James Kaufman wrote:

> On Thu, Apr 20, 2006 at 03:10:12AM -0600, Brian Wood wrote:
>>
>> On Apr 20, 2006, at 3:03 AM, mythtv-users at spam.dragonhold.org wrote:
>>
>>> On Thu, Apr 20, 2006 at 02:41:08AM -0600, Brian Wood wrote:
>>>> I have tried cards from a 5200 to a 6800 and have not noticed any
>>>> difference, with the VGA output feeding a 32" LCD, and I don't use
>>>> XvMC.
>>> There is a world of difference between the VGA output and the TV
>>> output. Friends of mine using a PC under a 36" widescreen TV ended
>>> up ditching their nVidia card and going back to a Matrox one
>>> because the TV out of the nVidia card was so bad.
>>>
>>> Admittedly that was about 5 years ago, so I'd hope things have
>>> improved a lot - but it's something to be aware of.
>>>
>>> I'm certainly waiting until we get our new HDTV (delivered this
>>> evening *YAY*) before I go back to using my MythTV box, simply
>>> because the quality of the PAL output is too bad for the menus...
>>> It's fine for watching programs, but the menus have flickering
>>> edges on all the sharp lines.
>>>
>>
>> No argument here. I was using the S-Video output and it was so
>> (relatively) crappy that I changed my entire living room furniture
>> arrangement so I could connect using VGA. I haven't tried DVI yet,
>> as my satellite receiver's direct output is using that for HD; my
>> Myth system is strictly SD at this time.
>>
>> I had assumed that video cards all had pretty much the same quality
>> NTSC/PAL outputs, because TV-out seems like an afterthought to the
>> card makers. If some card on the market has markedly better TV
>> output, then by all means it should be used if you're going that
>> route, and the fact should be made well known to the Myth community.
>
> What is '(relatively) crappy'? I've noticed on my old analog 27" NTSC
> TV that the TV out from my FX-5200 is darker than live TV. Live TV
> looks brighter, more vibrant. The FX-5200 looks washed-out and dingy.
> (And yes, I did try tweaking MythTV's controls to adjust brightness,
> contrast, etc.)
>

The "(relative) crappiness" was mostly in resolution, not in the  
"1024x768" sort of sense but in the "sharpness" of the image. Using  
the VGA output/input I could read the text in a console window or web  
browser screen. It was not as noticeable playing video full-screen,  
because SD video doesn't have that much detail to start with.

"Dark" and "washed out and dingy" are typical descriptions of low  
video levels. I would think that Myth's adjustments would not be the  
place to correct such a problem, it would be better dealt with using  
the video driver's controls. If you're using the "proprietary binary  
blob" then "nvidia-settings" gives you controls that would have far  
more effect than Myth's settings.
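
For example (a sketch from memory - the attribute names and ranges
may differ with your driver version, so check "nvidia-settings -q
all" on your system first), you can query and raise the levels from
the command line without touching Myth at all:

   # see the current color-correction values
   nvidia-settings -q Brightness -q Contrast -q Gamma

   # bump brightness and contrast a bit for the TV-out
   # (Brightness and Contrast range roughly -1.0 to 1.0)
   nvidia-settings -a Brightness=0.1 -a Contrast=0.1

These assignments only last for the X session, so once you find
values you like you'd want to put them in a startup script.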

Using nvidia-settings I was able to get a composite output signal
that looked pretty much like an internal tuner image on my set,
except for the slight artifacts that accompany any MPEG-compressed
analog video (the very slight "nylon stocking" effect, not the
motion artifacts).
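
For the record, the TV encoder itself is set up in xorg.conf rather
than at runtime. A minimal sketch of the relevant Device-section
options (the exact names and values are in the nvidia driver README;
COMPOSITE vs. SVIDEO depends on your cabling, and the TV standard on
your region):

   Section "Device"
       Identifier  "nvidia0"
       Driver      "nvidia"
       Option      "TVStandard"  "NTSC-M"
       Option      "TVOutFormat" "COMPOSITE"
   EndSection

Overscan can then be trimmed at runtime with something like
"nvidia-settings -a TVOverScan=0".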

Most people overdrive both the video and the chroma levels on their
home TVs, and "consumers" are sometimes dissatisfied with a truly
properly set up monitor, just because they are used to over-saturated
colors and smashed blacks and whites. I learned a long time ago not
to "correct" these problems on friends' sets; they often want it "the
way they like it", even when it is very "wrong".

