[mythtv-users] Seeking Advice

Joshua Gottlieb joshuagottlieb at gmail.com
Wed Jul 25 17:38:21 UTC 2007

On 7/25/07, Michael T. Dean <mtdean at thirdcontact.com> wrote:
> On 07/25/2007 12:23 PM, Joshua Gottlieb wrote:
> > Ok, so I'm a noob at all this.  Now I have tried to do a good deal of
> > reading on the forums and whatnot, trying to get a feel for the
> > recommended hardware, etc.
> >
> > I'm trying to build a box from scratch, specifically for mythtv.
> > Originally I had planned on getting a Hauppauge PVR-350 so I could
> > have the MPEG2 decryption on the card.
> That's actually decoding.  Oh, and, IMHO, putting MPEG-2 decoding on an
> external card is an anachronism.  Decoding standard-definition TV in
> real-time has been an easy-enough task for general-purpose CPUs since
> we hit the 800MHz mark or so.  Today, you can't really buy a CPU that
> can't handle it.
> And, in the event you get a "low-power" alternative processor (some of
> the VIA's or whatever) and you want some "help" with decoding, you can
> do so with XvMC--which is a function of the video card and its drivers.
> >   This seemed like the ideal solution and I could get very low end
> > hardware as all the work would be done by the card.  However, then I
> > saw various comments about how the card is being phased out, soon to
> > be not supported etc, etc...
> And (again, IMHO) a complete and total waste of $50 to $100.  (Says
> someone who wasted his money and bought a PVR-350 in 2003, on which he
> used the video decoder for about 4 hours before deciding it was a
> waste.  After that, I used the PVR-350 as if it were a PVR-150 or a
> PVR-250.)
> > So I started doing more research and noticed that there are a few
> > small form factor motherboards that have TV-outs and built-in MPEG2
> > decoding, so I thought maybe that would be a good path, but there
> > didn't seem to be many folks who had used the newer hardware.
> My recommendation is to get an NVIDIA video card and a PVR-150 (or
> multiple PVR-150's).  The NVIDIA drivers work very well (for
> proprietary, binary-only drivers, that is) and support XvMC on any
> relatively recent GPU.  I'd recommend something from the GeForce 6000
> series (e.g. the GF6200), since the GeForce 4 series was end-of-lifed (no
> longer supported by current drivers) within the last year and the
> GeForce 5000 series is probably next on the chopping block, so the
> oft-recommended GF5200 is likely to be EOL'ed "soon."  (Wouldn't it be
> nice if NVIDIA published a "Linux driver support roadmap"?)

What about motherboards with built-in NVIDIA 6100 video?  Would these work,
or should I just get a dedicated video card?
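
Also, on the XvMC route Mike mentioned: if I understand the docs correctly,
with the NVIDIA binary driver you tell the player which XvMC backend library
to load via /etc/X11/XvMCConfig (a one-line file containing just the library
name).  A minimal sketch of what I think that looks like, assuming the
proprietary NVIDIA driver is installed -- please correct me if I'm wrong:

```
libXvMCNVIDIA_dynamic.so.1
```

Then a quick playback test with something like
`mplayer -vo xvmc -vc ffmpeg12mc somefile.mpg` should show whether MPEG-2
decode is actually being offloaded to the GPU.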

> > Anyway, what I'm trying to do is get advice from the experienced folks
> > on here to setup a good solid mythtv system.
> Basically, the problem is that if you offload MPEG-2 decoding from the
> CPU, it limits what you can do with video.  Using the PVR-350's built-in
> decoder means you have to use the PVR-350's TV out--for video /and/ for
> GUI.  It does not (and cannot) support OpenGL (which is the future of
> the MythTV UI).  It also only supports 720x480 resolution and, IMHO, you
> can get a much better result using a video card running at 800x600 or
> 1024x768 with the TV-out chip sampling from that picture.  (And, BTW,
> regardless of what people may tell you, there's no equivalent of 1:1
> pixel mapping for SDTV using a TV out.)
> Mike
> _______________________________________________
> mythtv-users mailing list
> mythtv-users at mythtv.org
> http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users