[mythtv-users] Nvidia TV Encoder not listing any supported HD modes

Skitals hondacrxsi at gmail.com
Sun Mar 23 12:39:24 UTC 2008




Bonj wrote:
> 
> Justin Nolan wrote:
>> I'm giving up trying to output 1080i via DVI->HDMI without all kinds of
>> weird judder, so I've turned to TV-out/component video. I've never seen
>> any difference between component and DVI on my Sony 1080i RPTV, so it
>> shouldn't be any loss. However, I've run into a major hurdle: I can't
>> get any HD resolutions working with TV-out. My graphics card is an XFX
>> 7200GS, and I'm using an NVIDIA 7-pin HDTV breakout cable plugged into
>> the TV-out port (the card's manual says the port supports 4-, 7-, and
>> 9-pin cables/adapters).
>> 
>> I'm using the latest driver from NVIDIA, and my xorg.conf is as follows:
>> 
>>> Section "Monitor"
>>>     Identifier     "Generic Monitor"
>>>     HorizSync       30.0 - 130.0   
>>>     VertRefresh     50.0 - 160.0
>>>     Option         "DPMS"
>>> EndSection
>>>
>>> Section "Device"
>>>     Identifier     "nVidia Corporation G72 [GeForce 7300 SE]"
>>>     Driver         "nvidia"
>>> EndSection
>>>     
>>> Section "Screen"
>>>     Identifier     "Default Screen"
>>>     Device         "nVidia Corporation G72 [GeForce 7300 SE]"
>>>     Monitor        "Generic Monitor"
>>>     DefaultDepth    24
>>>     Option         "UseDisplayDevice" "TV"
>>>     Option         "TVOutFormat" "COMPONENT"
>>>     Option         "TVStandard" "HD1080i"
>>>     SubSection     "Display"
>>>         Depth       24
>>>         Modes      "1920x1080"
>>>     EndSubSection
>>> EndSection
>> 
>> According to the Xorg log, the driver doesn't complain about the HD1080i
>> TV standard. But then:
>> 
>>> (WW) NVIDIA(0): No valid modes for "1920x1080"; removing.
>>> (WW) NVIDIA(0):
>>> (WW) NVIDIA(0): Unable to validate any modes; falling back to the default mode
>>> (WW) NVIDIA(0): "nvidia-auto-select".
>>> (WW) NVIDIA(0):
>>> (II) NVIDIA(0): Validated modes:
>>> (II) NVIDIA(0): "nvidia-auto-select"
>>> (II) NVIDIA(0): Virtual screen size determined to be 800 x 600
>> 
>> What actually happens on screen is that for the first few minutes
>> it's a garbled mess of flickering red lines. After that, the desktop
>> appears as a small 800x600 window in the center of the screen,
>> surrounded by black. My TV shows grey bars on the left and right of
>> everything except an HD signal, and the bars are gone here, so the set
>> really is receiving a 1080i signal, just with only an 800x600 image
>> inside it.
>> 
>> And here's the part that's really making me pull my hair out: the
>> NVIDIA TV encoder on the card isn't reporting any supported HD modes:
>> 
>>> (--) NVIDIA(0): Connected display device(s) on GeForce 7300 SE/7200 GS at
>>> (--) NVIDIA(0): PCI:1:0:0:
>>> (--) NVIDIA(0): NVIDIA TV Encoder (TV-0)
>>> (--) NVIDIA(0): NVIDIA TV Encoder (TV-0): 400.0 MHz maximum pixel clock
>>> (--) NVIDIA(0): TV encoder: NVIDIA
>>> (II) NVIDIA(0): TV modes supported by this encoder:
>>> (II) NVIDIA(0): 1024x768; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI,
>>> (II) NVIDIA(0): PAL-N, PAL-NC
>>> (II) NVIDIA(0): 800x600; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N,
>>> (II) NVIDIA(0): PAL-NC
>>> (II) NVIDIA(0): 720x576; Standards: PAL-BDGHI, PAL-N, PAL-NC
>>> (II) NVIDIA(0): 720x480; Standards: NTSC-M, NTSC-J, PAL-M
>>> (II) NVIDIA(0): 640x480; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N,
>>> (II) NVIDIA(0): PAL-NC
>>> (II) NVIDIA(0): 640x400; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N,
>>> (II) NVIDIA(0): PAL-NC
>>> (II) NVIDIA(0): 400x300; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N,
>>> (II) NVIDIA(0): PAL-NC
>>> (II) NVIDIA(0): 320x240; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N,
>>> (II) NVIDIA(0): PAL-NC
>>> (II) NVIDIA(0): 320x200; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N,
>>> (II) NVIDIA(0): PAL-NC
>> 
>> What would cause no HDTV modes to be listed? I've been struggling with 
>> this for going on 8 hours, so any input is much appreciated. Thanks!
> 
> I use a 7200GS outputting via component breakout
> cable to a 76cm widescreen Sony CRT at 1080i. At 1080i I experience 
> combing effects without deinterlacing turned on. This is a side effect 
> of using an interlaced mode and cannot be avoided without deinterlacing.
> I have only recently upgraded my box to this hardware, so I'm still 
> playing with the deinterlacers. The current setting is the "one field" 
> deinterlacer, which gets rid of the combing, but isn't the best visually 
> IMHO. I didn't have much time to play with it before the box got
> commandeered for actual TV watching, so I have yet to experiment further,
> but I can say that XvMC with Bobx2 didn't look real flash either... it
> must be a problem with either my XvMC setup or the driver/card itself,
> but it seemed to have excessive buffering pauses.
> 
> Anyway, below is my xorg.conf:
> 
> Section "Monitor"
>      Identifier     "Generic Monitor"
>      Option         "DPMS"
> EndSection
> 
> Section "Device"
>      Identifier     "Generic Video Card"
>      Driver         "nvidia"
>      Option         "NvAGP" "1"
>      Option         "DPI" "100x100"
>      Option         "UseEvents" "1"
>      Option         "AddARGBVisuals" "1"
>      Option         "AddARGBGLXVisuals" "1"
>      Option         "NoLogo" "1"
>      Option         "UseDisplayDevice" "TV"
>      Option         "TVOutFormat" "COMPONENT"
>      Option         "TVStandard" "HD1080i"
> EndSection
> 
> Section "Screen"
>      Identifier     "Default Screen"
>      Device         "Generic Video Card"
>      Monitor        "Generic Monitor"
>      DefaultDepth    24
>      SubSection     "Display"
>          Depth       24
>          Modes      "1920x1080" "1280x720" "1024x768" "720x480" "800x600" "640x480"
>      EndSubSection
> EndSection

It's really disappointing that deinterlacing is still necessary. What I
really want to know is, how does the cable box do it? You can hook a cable
box up to any TV, interlaced or progressive, with any cable--DVI, DVI to
HDMI, component, etc.--and it looks perfect. I'm capturing a stream that
should be bit-for-bit identical to what the cable box is playing, yet it
doesn't look as good and deinterlacing is required. Granted, the former is
probably a consequence of the latter.
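
As a sanity check on the "bit-for-bit identical" part, I may at least confirm
what's actually in one of the recordings. This is just a rough idea (the file
name below is made up, and it assumes ffmpeg is installed):

    ffmpeg -i /var/lib/mythtv/recordings/sample_recording.mpg 2>&1 | grep Video

If the capture really does match the cable feed, that should report something
like an mpeg2video stream at 1920x1080.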

What's just as frustrating is that I know I've tried that exact xorg.conf,
letter for letter; someone posted it as an example on another forum, if I'm
not mistaken, and I got the same result--a garbled mess. Even if I don't end
up using component, I want to get to the bottom of why my 7200GS's TV
encoder reports support for only SD TV standards. It's really irking me.
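
The next thing I plan to try is turning on the driver's mode-validation
logging, so the log says exactly why "1920x1080" gets thrown out. Below is
only a rough sketch of my Device section: the ModeDebug option is something I
found in the NVIDIA README, so double-check that it exists in your driver
version, and note that I've moved the TV-out options into the Device section
the same way Bonj has them above:

Section "Device"
    Identifier     "nVidia Corporation G72 [GeForce 7300 SE]"
    Driver         "nvidia"
    # Dump detailed mode-validation results into the X log
    Option         "ModeDebug" "true"
    # Same TV-out options as before, just in the Device section
    # like Bonj's config above
    Option         "UseDisplayDevice" "TV"
    Option         "TVOutFormat" "COMPONENT"
    Option         "TVStandard" "HD1080i"
EndSection

After restarting X, something like

    grep -A 3 '1920x1080' /var/log/Xorg.0.log

should show why each candidate mode was rejected, which might at least tell
us whether it's the encoder or the mode timings that the driver doesn't like.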

Anyway, thanks a lot for the reply. I guess I can just add this to the list
of excuses for why I need to upgrade my TV to 1080p, although I imagine I'd
have all the same headaches. But again, I really want to know what the cable
box does, and how we can mimic it.

-- 
View this message in context: http://www.nabble.com/Nvidia-TV-Encoder-not-listing-any-support-HD-modes-tp16226990s15552p16235529.html
Sent from the mythtv-users mailing list archive at Nabble.com.


