[mythtv-users] Nvidia vs. ATI video cards

Bruce Markey bjm at lvcm.com
Fri Jun 25 22:21:19 EDT 2004


Clyde Stubbs wrote:
> On Fri, Jun 25, 2004 at 12:49:58PM -0700, Bruce Markey wrote:
> 
>>'I like that'. However, I do want to point out the factual
>>differences between using an nVidia card with the 4363 driver
>>and ATI with the GATOS "devel" branch for tvout.
> 
> 
> Hmm, most of your comments are accurate, but there are a few
> subjective comments, and some incorrect or maybe just out
> of date info. Indulge me...

I will indulge and respond.

>>and compile xfree86 source. nVidia has tvout support by just
>>running the install script. Note that this is just an annoyance
> 
> 
> Didn't work for me, for reasons discussed elsewhere.

I won't go looking for those reasons and don't know what particular
problems you, or any other individual, encountered. I do know that
hundreds (thousands?) of other people have configured their nVidia
cards to work correctly. I also know that dozens (hundreds?) of
people didn't grasp that they needed a specific CVS branch and
X11 source to make GATOS do tvout. Either way, these are not
output quality issues but installation frustration issues. Agreed?

>>GATOS supports 800x600 and 640x480 only. The nVidia driver also
>>supports 1024x768 if the card model is capable (and most are).
> 
> 
> True, but somewhat irrelevant since you can't get 1024x768 with
> anything close to standard PAL or NTSC timings.

Here's a key concept that most people don't grasp. On TV output,
there is not a one-to-one relationship between the number of
pixel rows in X and the number of scanlines used for display.
There are always ~480 scanlines viewable in a TV signal. A 640x480
X display is normally underscanned to fit inside the cowl of your
TV set, so it is shown on, maybe, scanlines 20-460. The X data is
scaled and smoothed to fit 480 rows onto 440 lines (hint: if you
fill a terminal window with "=======" you will see that the lines
have different thicknesses and densities in different places). The
same goes for 800x600 scaled to fit on 440 TV scanlines.
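
To make that concrete, here is a tiny sketch (not the driver's actual
scaler, which also filters and smooths; the 440-line figure is just
the example above). It maps 480 X rows onto 440 visible scanlines and
prints which scanlines end up carrying two source rows:

/* Rough sketch, not the driver's actual algorithm: map 480 X rows
 * onto 440 visible TV scanlines with nearest-line rounding and
 * count how many source rows land on each scanline.  The uneven
 * counts are why rows of "=====" look thicker in some places
 * than others. */
#include <stdio.h>

#define SRC 480   /* X display rows (try 600 for an 800x600 mode) */
#define DST 440   /* visible TV scanlines after underscan */

int main(void)
{
    int hits[DST] = {0};
    int y, line;

    for (y = 0; y < SRC; y++)
        hits[y * DST / SRC]++;       /* which scanline this X row hits */

    for (line = 0; line < DST; line++)
        if (hits[line] != 1)
            printf("scanline %3d carries %d source rows\n",
                   line, hits[line]);
    return 0;
}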

The 1024x768 mode really is that size as far as X is concerned, but
it does look blurry because of the limited resolution of an NTSC TV.

>>may come just before or just after the vsync causing a twitch in
>>motion (watch the crawler on CNN). nVidia has vsync information
> 
> 
> I get perfectly smooth video on the Radeon, and yes, I use
> the ticker as a measure of that.

See, I almost quoted some stats but noooo... that would have
been overkill and too geeky ;-). Now I guess I have to =).

There is some test code built into Myth called the jitterometer
that generates statistics used in testing the video playback loop.
If you run "mythfrontend -v playback" you'll see lots of goo.
Every few seconds there is a long line that shows the standard
deviation of the frame timing intervals. With no special features
turned on, the stddev will normally be in the 5000-6000us range.
With Jitter Reduction, 3000-4000us. With nVidia polling on tvout,
this can be as low as 40-200us(!). YMMV and other factors can push
it into the hundreds, but it is quantifiably oodles closer to
perfect timing.

[On a computer monitor, these numbers will still be in the thousands
because of the differences between monitor sync timing and NTSC
timing. It updates at the right time for the monitor independently
of the expected time for NTSC ...don't worry about it ;-]
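
For reference, the stat is nothing exotic: it's the standard
deviation of the intervals between frame displays, measured in
microseconds against the expected ~33367us NTSC frame time. Here's
a stand-alone sketch of that calculation (not Myth's jitterometer
code; real timestamps would come from gettimeofday() around the
frame-display call, the ones below are made up):

/* Sketch: standard deviation of frame-to-frame intervals, in
 * microseconds.  The timestamps are fake, roughly one NTSC frame
 * time apart, just to make the example self-contained.
 * Link with -lm. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    long stamps[] = { 0, 33360, 66750, 100090, 133470 };
    int n = sizeof(stamps) / sizeof(stamps[0]) - 1;   /* # intervals */
    double sum = 0.0, sumsq = 0.0, mean, stddev;
    int i;

    for (i = 0; i < n; i++) {
        double d = (double)(stamps[i + 1] - stamps[i]);
        sum   += d;
        sumsq += d * d;
    }
    mean   = sum / n;
    stddev = sqrt(sumsq / n - mean * mean);
    printf("mean interval %.0fus  stddev %.0fus\n", mean, stddev);
    return 0;
}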

Having spent a great deal of time on the code to make it almost
acceptable but certainly imperfect, I'm a little uneasy about you
(and someone else earlier today) telling me that your output is
"perfectly smooth". I could sit with you in front of your set
tuned to CNN and point out the flaws and explain why they happen
for over an hour, but you wouldn't like it so let's not do this =).

This may have been a point where you thought that I was being
subjective, but I can not only see that it is smoother, I can
statistically prove that it is smoother. Sorry for the overkill
this time around ;-).

>>GATOS doesn't support all of the Xv picture settings controls;
>>most notably, many cards don't support Hue adjustments.
> 
> 
> Mine certainly does.

I had one card that did but three that didn't. If you use xvinfo,
there are other things that nVidia supports and ATI does not, but
since Myth doesn't use these it's moot. However, I did experiment
with the XV_ITURBT_709 attribute for a while.
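
If anyone wants to check what their own port exposes, the attributes
that xvinfo reports can also be listed through the Xv client library.
Here's a bare-bones sketch (error handling mostly omitted) that
prints each port's attribute names and ranges, so you can see
whether XV_HUE or XV_ITURBT_709 show up on your card:

/* Sketch: list Xv port attributes, roughly what xvinfo shows.
 * Link with -lXv -lXext -lX11. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    unsigned int nadaptors, i;
    XvAdaptorInfo *ai;

    if (!dpy || XvQueryAdaptors(dpy, DefaultRootWindow(dpy),
                                &nadaptors, &ai) != Success)
        return 1;

    for (i = 0; i < nadaptors; i++) {
        unsigned long p;
        for (p = 0; p < ai[i].num_ports; p++) {
            XvPortID port = ai[i].base_id + p;
            int nattr, a;
            XvAttribute *attrs = XvQueryPortAttributes(dpy, port, &nattr);

            printf("port %lu (%s):\n", (unsigned long)port, ai[i].name);
            for (a = 0; a < nattr; a++)
                printf("  %-20s  %d .. %d\n", attrs[a].name,
                       attrs[a].min_value, attrs[a].max_value);
            if (attrs)
                XFree(attrs);
        }
    }
    XvFreeAdaptorInfo(ai);
    XCloseDisplay(dpy);
    return 0;
}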

>>rank them for color reproduction, my nVidia cards would be 1-4 and
>>the ATI cards 5-8.
> 
> 
> My experience is the opposite. The Radeon produces more natural colour
> and better sharpness than the two nVidias I tried
> (one nForce2 embedded, one MX440 AGP card). This was a bonus - I switched
> primarily because of driver issues.

I used the "Video Essentials" DVD test patterns and optical
filters (as well as dead reckoning and common sense ;-) to come
to this conclusion. There is no way that I know of to quantify
this, which is why I said that this could be subjective. However,
I really do believe that (not just feel like =) all three of the
nVidia cards reproduced the colors in the test patterns more
accurately than either of the two ATI cards I used in side-by-side
tests on the same screen last year. I can extrapolate that the
cards I didn't test would give similar results, as I can see that
they are similar to other cards of the same vendor during playback.

If you have any other test tools or test patterns that can show
how these compare, please let me know.

--  bjm

