<br><br><div class="gmail_quote">On Tue, Apr 29, 2008 at 7:23 AM, Ma Begaj <<a href="mailto:derliebegott@gmail.com">derliebegott@gmail.com</a>> wrote:<br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
2008/4/28 <<a href="mailto:zoiks2004-ivtv@yahoo.com">zoiks2004-ivtv@yahoo.com</a>>:<br>
<div class="Ih2E3d">> After 4.5 years doing myth in SD I'm getting ready to<br>
> go high-def. Unfortunately, my research has so far<br>
> indicated that high-def myth is a combination of<br>
> crapshoots and settling for less. Not a fault of Myth,<br>
> per se, more of a lack of support by GPU suppliers,<br>
> nvidia and Intel included.<br>
><br>
> That said, I'd really like to know what people have<br>
> working for full 1080i hi-def playback, on a 1080x1920<br>
> native display. Specifically, the combination of:<br>
><br>
> 1) video card<br>
> 2) television<br>
> 3) link between video card and television<br>
><br>
<br>
1) Nvidia 6200
2) Vizio 32" (Circuit City version)
3) DVI->HDMI converter @ 1280x768

also

1) Nvidia 6150 (Asus onboard video)
2) Optoma HD80 1080p projector
3) DVI->HDMI @ 1080p/60

I think modelines and whatnot are mainly a thing of the past. The nvidia drivers are better and the TVs are better. Both my TVs work by just plugging them in and using the nvidia-settings GUI to autodetect the settings. This also worked on my father's five-year-old 1080i Sony Wega HDTV.
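
If you do end up needing to pin a mode by hand (older driver, flaky EDID, etc.), something like the xorg.conf snippet below is usually all it takes. This is only a sketch using standard nvidia driver options; the 1920x1080/1280x720 modes are illustrative, not copied from either of my boxes:

    Section "Device"
        Identifier  "nvidia0"
        Driver      "nvidia"
        # Trust the display's EDID rather than hand-written modelines
        Option      "UseEDID" "true"
    EndSection

    Section "Screen"
        Identifier   "screen0"
        Device       "nvidia0"
        DefaultDepth 24
        SubSection "Display"
            Depth   24
            # The driver validates these against the EDID and drops any
            # mode the display doesn't support; the first valid one wins.
            Modes   "1920x1080" "1280x720"
        EndSubSection
    EndSection

nvidia-settings can also write the X configuration file out for you once the layout looks right in the GUI, so you rarely have to touch this by hand.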

Mitchell