<br><br><div><span class="gmail_quote">On 7/25/07, <b class="gmail_sendername">Michael T. Dean</b> <<a href="mailto:mtdean@thirdcontact.com">mtdean@thirdcontact.com</a>> wrote:</span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
On 07/25/2007 12:23 PM, Joshua Gottlieb wrote:<br>> Ok, so I'm a noob at all this. Now I have tried to do a good deal of<br>> reading on the forums and what not trying to get a feel for the<br>> recommended hardware etc.
<br>><br>> I'm trying to build a box from scratch, specifically for mythtv.<br>> Originally I had planned on getting a Hauppauge PVR-350 so I could<br>> have the MPEG2 decryption on the card.<br><br>That's actually decoding. Oh, and, IMHO, putting MPEG-2 decoding on an
<br>external card is an anachronism. Decoding standard-definition TV in<br>real-time has been an easy-enough task for general-purpose CPUs since<br>we hit the 800MHz mark or so. Today, you can't really buy a CPU that
<br>can't handle it.<br><br>And, in the event you get a "low-power" alternative processor (some of<br>the VIA chips or whatever) and you want some "help" with decoding, you can<br>do so with XvMC--which is a function of the video card and its drivers.
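To put a rough number on "easy enough," here's a back-of-envelope sketch; the cycles-per-pixel figure is a deliberately generous assumption for illustration, not a measurement:

```python
# Rough sketch: why real-time SD MPEG-2 decode is cheap for a modern CPU.
# The cycles-per-pixel cost is an illustrative assumption, not a benchmark.
width, height, fps = 720, 480, 29.97      # NTSC SD frame size and rate
pixels_per_second = width * height * fps  # ~10.4 million pixels/s
cycles_per_pixel = 50                     # generous assumed decode cost
cycles_needed = pixels_per_second * cycles_per_pixel
print(f"~{cycles_needed / 1e6:.0f} MHz worth of cycles")  # ~518 MHz
```

Even with that padded estimate, you land well under the 800MHz mark, with headroom left for the OS and the UI.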
<br><br>> This seemed like the ideal solution and I could get very low end<br>> hardware as all the work would be done by the card. However, then I<br>> saw various comments about how the card is being phased out, soon to
<br>> be not supported etc, etc...<br><br>And (again, IMHO) a complete and total waste of $50 to $100. (Says<br>someone who wasted his money and bought a PVR-350 in 2003, on which he<br>used the video decoder for about 4 hours before deciding it was a
<br>waste. After that, I used the PVR-350 as if it were a PVR-150 or a<br>PVR-250.)<br><br>> So I started doing more research and noticed that there are a few<br>> small form factor motherboards that have TV-outs and built-in MPEG2
<br>> decryption, so I thought maybe that would be a good path, but there<br>> didn't seem to be very many folks having used the newer hardware.<br><br>My recommendation is to get an NVIDIA video card and a PVR-150 (or
<br>multiple PVR-150s). The NVIDIA drivers work very well (for<br>proprietary, binary-only drivers, that is) and support XvMC on any<br>relatively recent GPU. I'd recommend something from the GeForce 6000<br>series (
e.g., the GF6200) since the GeForce 4 series was end-of-lifed (no<br>longer supported by current drivers) within the last year and the<br>GeForce 5000 series is probably next on the chopping block, so the<br>oft-recommended GF5200 is likely to be EOL'ed "soon." (Wouldn't it be
<br>nice if NVIDIA published a "Linux driver support roadmap"?) </blockquote><div><br>What about motherboards with built-in nvidia 6100 video? Would these work, or should I just get a dedicated video card?<br>
</div><br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">> Anyway, what I'm trying to do is get advice from the experienced folks<br>
> on here to setup a good solid mythtv system.<br><br>Basically, the problem is that if you offload MPEG-2 decoding from the<br>CPU, it limits what you can do with video. Using the PVR-350's built-in<br>decoder means you have to use the PVR-350's TV out--for video /and/ for
<br>GUI. It does not (and cannot) support OpenGL (which is the future of<br>the MythTV UI). It also only supports 720x480 resolution and, IMHO, you<br>can get a much better result using a video card running at 800x600 or
<br>1024x768 with the TV-out chip sampling from that picture. (And, BTW,<br>regardless of what people may tell you, there's no equivalent of 1:1<br>pixel mapping for SDTV using a TV out.)<br><br>Mike<br>_______________________________________________
<br>mythtv-users mailing list<br><a href="mailto:mythtv-users@mythtv.org">mythtv-users@mythtv.org</a><br><a href="http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users">http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
</a><br></blockquote></div><br>
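As a footnote on the XvMC suggestion quoted above: with NVIDIA's binary drivers of this era, the XvMC wrapper library is selected by a one-line file, /etc/X11/XvMCConfig, after which you pick an XvMC-capable decoder in MythTV's TV playback settings. A minimal sketch, assuming the NVIDIA binary driver is installed (the exact library name can vary with driver version and distro packaging):

```
# /etc/X11/XvMCConfig -- a single line naming the NVIDIA XvMC library
libXvMCNVIDIA_dynamic.so.1
```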