[mythtv-users] High-Definition Playback performance data
John Kuhn
kuhn at razorsys.com
Mon Apr 4 15:57:39 UTC 2005
Is this using Xv or XvMC?
--John
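For reference, a common way to tell the two apart: plain Xv is what the X server advertises (visible via `xvinfo`), while XvMC on the NVIDIA binary driver of that era was typically enabled by pointing the XvMC wrapper at NVIDIA's library. The file path and library name below are a sketch from memory and may differ by driver version or distribution:

```
# /etc/X11/XvMCConfig -- a single line naming the XvMC library to load
# (assumes the NVIDIA binary driver; library name may vary by version)
libXvMCNVIDIA_dynamic.so.1
```

With that file in place, selecting an XvMC-based decoder in MythTV's playback settings (instead of libmpeg2) offloads motion compensation and iDCT to the GPU; running `xvinfo` only confirms that Xv adaptors exist at all, not that XvMC is in use.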
>I've lost track of the thread where I said something about trying a GeForce
>6200 in place of a GeForce 4 MX with an Athlon XP 1800 system, to see if that
>would give any significant playback performance boost with High Definition
>recordings. Well, the answer is no, not really, but a bit of a cpu upgrade
>sure helped. Here's a quick summary of my findings today (from memory though,
>wish I'd actually written it all down). In all cases, I'm using libmpeg2 for
>decode.
>
>Athlon XP 1800 w/AGP GF4MX, 6629 nvidia driver:
>
>-Able to play back 720p, CPU around 85% used.
>-Able to play back 1080i without a deint filter, CPU around 95% used.
>-Stutter every second or two on 1080i with the deint filter on, CPU
>completely pegged.
>
>Athlon XP 1800 w/AGP GF4MX, 7174 nvidia driver:
>-Able to play back 720p, CPU around 70% used.
>-Able to play back 1080i without deint, CPU around 85% used.
>-Stutter every few seconds on 1080i w/deint filter on, CPU pegged.
>
>Athlon XP 1800 w/AGP GeForce 6200, 7174 nvidia driver:
>-Maybe slightly better, but no significant difference from the GF4MX.
>
>Athlon XP 2600 w/AGP GF4MX, 7174 nvidia driver:
>-Able to play back 720p, can't remember CPU usage.
>-Able to play back 1080i w/no deint, can't remember CPU usage.
>-Able to play back 1080i w/kernel deint filter enabled, CPU usage fluctuates
>between 80 and 95%.
>
>I've decided to stick with the 2600 and GF4MX, and my secondary Myth box can
>now handle 1080i programming without a problem. I'll get the missing CPU
>utilization numbers one of these days...
>
>_______________________________________________
>mythtv-users mailing list
>mythtv-users at mythtv.org
>http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users