[mythtv-users] Nvidia TV Encoder not listing any support HD modes
Skitals
hondacrxsi at gmail.com
Sat Mar 22 20:43:48 UTC 2008
Craig Whitmore wrote:
>
>
> On Sat, 2008-03-22 at 16:24 -0400, Justin Nolan wrote:
>> I'm giving up trying to output 1080i via DVI->HDMI without all types
>> of weird judder, so I've turned to tv-out/component video. I've never
>> seen any difference between Component and DVI on my Sony 1080i RPTV, so
>> it shouldn't be any loss. However, I've run into a major hurdle: I
>> can't get any HD resolutions working w/ TV out. My graphics card is a
>> XFX 7200GS, and I'm using a nvidia 7-pin HDTV breakout cable plugged
>> into the TV out port (gfx card manual says the port supports 4, 7, and
>> 9 pin cables/adapters).
>
> I am using a DVI -> HDMI cable into a 50" Sony RPS. It's as easy as using
> the built-in modes for the nvidia drivers, for example:
>
> SubSection "Display"
> Depth 24
> Modes "1920x1080_50i"
> EndSubSection
>
> Thanks
>
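(For anyone following this via the archive: that Modes line lives inside a
Screen section of xorg.conf. A minimal sketch of the surrounding context,
with placeholder identifiers that must match your own Device and Monitor
sections:

Section "Screen"
    Identifier   "Screen0"      # placeholder
    Device       "Videocard0"   # placeholder, match your Device section
    Monitor      "Monitor0"     # placeholder, match your Monitor section
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080_50i"
    EndSubSection
EndSection
)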
Do you need to use a deinterlacer to get decent-looking video playback? It's
my understanding there is a longstanding bug in the nvidia driver that
causes video playback to stutter/tear when outputting at 1080i. That can be
"corrected" by using a deinterlacer, but I'm trying to avoid that so I can
output a true 1080i signal.
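For the record, the component/TV-out route I'm attempting goes through the
nvidia driver's TV-out options in the Device section rather than a Modes
line. A sketch of what I mean (identifier is a placeholder; the option
values are from the nvidia driver README, and whether the 7200GS's TV
encoder actually accepts HD1080i is exactly what I can't get working):

Section "Device"
    Identifier "Videocard0"    # placeholder
    Driver     "nvidia"
    # TV-out hints for the nvidia driver
    Option     "TVStandard"  "HD1080i"
    Option     "TVOutFormat" "COMPONENT"
EndSection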
--
View this message in context: http://www.nabble.com/Nvidia-TV-Encoder-not-listing-any-support-HD-modes-tp16226990s15552p16227159.html
Sent from the mythtv-users mailing list archive at Nabble.com.