[mythtv-users] Slowest processor for software HDTV decoding

Seth Heckard seth.lists at gmail.com
Thu Apr 28 04:04:17 UTC 2005


On 4/27/05, Joe Barnhart <joebarnhart at yahoo.com> wrote:
> I think you also have to specify the final output size.  I get the
> impression that 1280x720p output is easier than 1920x1080i.  Note, the
> SOURCE material could be in either format, but the size of the DISPLAY
> resolution is what I'm talking about here (i.e. the "modeline" you're
> using).

Ideally it would be at the native resolution (my TV has HDMI in so a
720p or 1080i resolution would be preferable, although 720p would be
upconverted on the TV), but I haven't had any luck with 1080i
modelines with the nvidia drivers.
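For reference, the 1080i modeline most often posted for this (based on the
standard CEA-861 1080i60 timings, 74.25 MHz pixel clock) looks like the
following -- whether the nvidia driver will actually accept it is a
separate question, and your TV may want slightly different timings:

```
# Standard 1920x1080 interlaced (1080i60) timings, full-frame values:
# 74.25 MHz pixel clock, 2200 total pixels/line, 1125 total lines
ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync
```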

> I use a 1080i modeline and I could not use an Athlon XP 2800 and get
> satisfactory output.  I had to go to a 3GHz P4 with hyperthreading to
> get the headroom you speak of.  (Recently I discovered that 512M wasn't
> enough, either. I have two HD-3000 cards, and comm flagging was killing
> me at 512M.)

Interesting.  I had pretty much ruled out an Athlon XP, although I
would prefer to go that route if it could work as well.  I can't ever
see myself getting dual HDTV cards though; I barely use the dual
tuners I have now simultaneously.
