<div class="gmail_quote">On Thu, Jan 22, 2009 at 11:03 AM, Cool Frood <span dir="ltr"><<a href="mailto:aaranya%2Bmythtv@gmail.com">aaranya+mythtv@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
On Thu, Jan 22, 2009 at 10:47 AM, Travis Tabbal <<a href="mailto:travis@tabbal.net">travis@tabbal.net</a>> wrote:<br>
><br>
><br>
> On Thu, Jan 22, 2009 at 8:35 AM, Cool Frood <<a href="mailto:aaranya%2Bmythtv@gmail.com">aaranya+mythtv@gmail.com</a>><br>
> wrote:<br>
>><br>
>> Hi there,<br>
>><br>
>> I have a MythTV setup currently running with an NVidia FX 5200 that<br>
>> pushes DVI to a 1680x1050 Acer LCD. I'm planning to get a 1080p HDTV.<br>
>> I have a bunch of questions:<br>
>><br>
>> 1. Is it possible to do 1080p without DVI-D? FX 5200 doesn't do DVI-D<br>
>> (I think). Can I do 1080p over VGA?<br>
>> 2. Can I run an HDTV at 1680x1050? It took me some effort to coax<br>
>> X.org into doing that resolution over DVI but I suspect that an HDTV<br>
>> will be less forgiving of clocks and frequencies.<br>
>> 3. If nothing works, what is a good card to do 1920x1200 to an HDTV in<br>
>> Linux? I assume it would require a DVI-D to HDMI cable.<br>
><br>
><br>
> 1080p is 1920x1080, so yes, I believe VGA can support that. The question is,<br>
> can your TV support that over its VGA input (if it has one, they are<br>
> stupidly rare these days).<br>
><br>
> Ditch the 5200 and get a more modern NVidia card. They have modes for HDTV<br>
> built into the driver now. So you can tell it you want "HD1080p" and it will<br>
> output the right timings and such. I'm doing it now for 720p. Works on 6150<br>
> and 8200 mobo chipsets, should work on any 6000+ series card with the newer<br>
> drivers. Perhaps older, I don't really know which cards are required. You<br>
> can use DVI-HDMI cables or get a card with HDMI output. My 8200 based<br>
> motherboard has an on board HDMI that works just fine for me.<br>
><br>
> Even if you don't use it now, I'd get a card that can support VDPAU. They<br>
> are reasonably cheap and have you covered for the future should you want to<br>
> use them that way. My 8200 based system is using VDPAU for mplayer and it<br>
> works very well.<br>
><br>
> _______________________________________________<br>
> mythtv-users mailing list<br>
> <a href="mailto:mythtv-users@mythtv.org">mythtv-users@mythtv.org</a><br>
> <a href="http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users" target="_blank">http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users</a><br>
><br>
><br>
<br>
Thanks for the information! I must be living under a rock because I<br>
thought HDMI wasn't really supported well on Linux. In particular,<br>
what would be a good (i.e., cheap and well supported) card to use?<br></blockquote></div><br>Just check Newegg or somewhere like that for a card that meets your specs. There are cards in the $30-$40 range that meet most people's needs. I believe all of the NVidia 8000 series and newer cards support VDPAU hardware acceleration, and they are currently the most recommended cards for Myth. I think your confusion about HDMI may come from HDCP, the content encryption that can run over HDMI for protected sources like Blu-ray discs; as far as I know, that isn't supported at all in Linux. Since you will just be playing unencrypted material from MythTV, you won't have to worry about HDCP at all. <br>
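<br>For the archives, here is a rough sketch of what the relevant xorg.conf pieces tend to look like for 1080p with the NVidia binary driver. The identifiers ("HDTV", "Screen0") are placeholders, and with recent drivers the TV's EDID normally supplies valid timings, so an explicit ModeLine usually isn't needed; check the driver README for the options your version actually supports:<br>

```
Section "Monitor"
    Identifier  "HDTV"
    # No explicit ModeLine: recent NVidia drivers validate modes
    # against the display's EDID-reported timings.
EndSection

Section "Screen"
    Identifier   "Screen0"
    Monitor      "HDTV"
    DefaultDepth 24
    SubSection "Display"
        Depth  24
        Modes  "1920x1080"
    EndSubSection
EndSection
```

<br>Note that names like "HD1080p" belong to the driver's TV-out support (e.g. Option "TVStandard" "HD1080p" in the Device section); over DVI or HDMI you request an ordinary mode like "1920x1080" and the driver works out the timings.<br>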