[mythtv-users] [ATrpms-users] Confused: why did video performance DECREASE (up to 70%) after upgrading video card by 2 generations??? Please validate my analysis

Jeffrey J. Kosowsky atrpms at kosowsky.org
Sun Dec 7 22:11:20 UTC 2008

For those of you following my videocard saga, I replaced my 6+ year
old GeForce4 Ti 4600 card [Leadtek Ultra A250 GF4600 128 MB with
VIVO/TVO] (which was top of the line at the time) with a new eVGA
GeForce 6200 AGP card with 256 MB.

However, the 6200 is SIGNIFICANTLY slower on several benchmarks and
only at parity in mythtv (and I think a detailed look at the specs
explains why -- see below).

Given that the old card was 6 years old and several generations older
than even the 6200, I was expecting some reasonably improved
performance just from Moore's law...

Instead I got the following:

                  GF4600            GeForce 6200
                  ______            ____________________________
glxgears          4700 fps          1360 fps (decrease by 70%!!!)
quake3 demo       203 fps           74 fps (decrease by 64%)

In mythtv, the visual performance is about the same, even though
mythfrontend used 80% of CPU on HD programs with the older 4600
vs. about 40% with the newer 6200. In both cases the video lags the
sound, with the sound pausing every few seconds to wait for the
"slower-motion" video to catch up, which makes the audio jerky.

Also with the older card when viewing HD programs (but not SD
programs) I got a lot of "NVP: prebuffering pause" errors in addition
to the "WriteAudio: buffer underruns" that occur with each sound
pause. I assume this is consistent with the fact that CPU usage is
becoming a bottleneck on the 4600 for HD.

I tried the GeForce 6200 with the 96.xx, 173.xx, and 177.xx drivers
without any difference.
The GF4600 only works with the 96.xx drivers.

Both cards have glx working with direct rendering.
Both are AGP 4x with SBA and Fast Writes turned on (same behavior with
them off, too).
Both have XvMC on (though XvMC seems to affect only CPU usage and does
not improve or worsen the video itself with the 6200).
Both are using an identical xorg.conf (with UseEvents=True).
Both have NvAGP=1 (though changing to the kernel agpgart didn't make
any difference).
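For concreteness, the relevant bit of that shared xorg.conf looks
roughly like this (a sketch, not the full file; the Identifier is a
hypothetical name, and only the options mentioned above are shown):

```
Section "Device"
    Identifier "Videocard0"        # hypothetical name
    Driver     "nvidia"
    Option     "UseEvents" "True"
    Option     "NvAGP"     "1"     # 1 = NVIDIA's internal AGP driver
EndSection
```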

So, it seems to me that either the 6200 is a bum card or I must be
missing some configuration to unleash its power. 

- Is it really possible that the older 4600 is better than the 6200 for
  the limited graphics demands of older games like quake3 and demos like
  glxgears?
- Why isn't the reduced CPU load for mythtv on the 6200 translating into
  better real-time video playback?
- Is there a "bottleneck" in terms of getting data into/off the video
  card that would explain why video lags even with total CPU usage at
  only about 60%?

If I had to guess, it seems like I traded a CPU bottleneck with the
4600 card (with the CPU maxed out in live HD TV) for a video bandwidth
bottleneck with the 6200.

Any suggestions or should I just return the 6200? Would any other AGP
card do any better?

Here is a spec comparison of the two boards:

                  GF4600            GeForce 6200
                  ______            ____________________________
Memory            128 MB DDR        256 MB DDR
AGP               4X                8X (I can only use 4X on my motherboard)
RAMDAC            350 MHz           400 MHz
Memory bandwidth  10.4 GB/sec       3.2 GB/sec
OpenGL            1.3 ICD           Full OpenGL ICD
Clock (MHz)       300 core          350 core
                  650 memory        532 memory

So, maybe the problem is that while the GPU may be faster and more
sophisticated on the 6200, its actual memory bandwidth is roughly 3x
lower (consistent with the glxgears and quake3 numbers). Maybe that
also explains why, even though CPU load goes down under HD, I don't get
any effective improvement in actual video playback, given the similar
RAMDAC, memory, and core GPU speeds?
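To sanity-check the bandwidth theory: peak memory bandwidth is just bus
width times the effective (DDR) memory clock. A minimal sketch -- the
bus widths are my assumptions (128-bit for a Ti 4600, 64-bit for a
typical budget 6200 AGP board), not vendor-confirmed for these exact
cards:

```python
def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s, using the vendors' decimal GB."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# GF4600: 128-bit bus at 650 MHz effective -> 10.4 GB/s, matching its spec
print(peak_bandwidth_gb_s(128, 650))

# 6200 (assumed 64-bit) at the quoted 532 MHz effective -> ~4.3 GB/s;
# the 3.2 GB/sec in the spec table would instead imply 400 MHz effective
print(peak_bandwidth_gb_s(64, 532))
print(peak_bandwidth_gb_s(64, 400))
```

Either way the math comes out, the 6200 board ends up with roughly a
third of the 4600's memory bandwidth, which is at least consistent with
the ~70% glxgears drop in a way the modest core-clock difference is not.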
