[mythtv-users] HD Woes

Nate Crosno nate at crosno.net
Mon Jan 29 17:12:44 UTC 2007

Steven Adeff wrote:
> Assuming you're using the 9xxx series drivers,
> try removing your 1080i modeline and use
> Modes "1280x720_60" "1920x1080_60i"
> this should use ModePool versions which will have the proper refresh rate.
> Then turn on Xv Vsync in nvidia-settings and use the Standard decoder.

I am indeed using the latest nvidia driver; sorry, I forgot to mention
that!  I have a hard time judging the right amount of info to post:
too little and it's useless, too much and no one will bother to read
through the whole thing.
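For anyone following along, here's how I read that suggestion as an
xorg.conf fragment (the section and identifier names are from my own
config, so treat this as a sketch rather than a drop-in):

```
Section "Screen"
        Identifier "Screen0"
        Device     "Card0"
        Monitor    "Monitor0"
        DefaultColorDepth 24
        SubSection "Display"
                Depth 24
                # No hand-built 1080i Modeline; let the driver's
                # ModePool supply these named modes, which should
                # carry the proper refresh rates.
                Modes "1280x720_60" "1920x1080_60i"
        EndSubSection
EndSection
```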

Try as I might, I could never get 1080i working over DVI.  When I
started X in verbose mode ("startx -- -logverbose 5" or something like
that), there was no 1080i mode listed in the ModePool.  After seeing
your post, I thought maybe the Modeline of the same name in my config
was keeping it from showing, so I commented that modeline out and
tried again -- still no go.  In my log, I get:
No valid modes for "1920x1080_60i"; removing.
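For reference, this is roughly how I've been checking what lands in
the ModePool (the log path is from my box; the first sample line below
is made up for illustration, the second is the actual message from my
log):

```shell
# Start X with extra logging (run from a console, not under X):
#   startx -- -logverbose 5
# Then scan the server log for ModePool activity; on my box it is:
#   grep -i 'modepool\|1920x1080' /var/log/Xorg.0.log
# Demonstrated here against a sample log fragment (first line is
# illustrative only; second is the real message I get):
sample='(II) NVIDIA(0): Adding mode "1280x720_60" to ModePool.
(WW) NVIDIA(0): No valid modes for "1920x1080_60i"; removing.'
printf '%s\n' "$sample" | grep -i 'modepool\|1920x1080'
```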

The 1080i modeline that I showed before was one that I built using the
EDID information from the xorg log:

(--) NVIDIA(0): DPMS Capabilities            :
(--) NVIDIA(0): Prefer first detailed timing : Yes
(--) NVIDIA(0): Supports GTF                 : No
(--) NVIDIA(0): Maximum Image Size           : 0mm x 0mm
(--) NVIDIA(0): Valid HSync Range            : 15.0 kHz - 46.0 kHz
(--) NVIDIA(0): Valid VRefresh Range         : 59 Hz - 61 Hz
(--) NVIDIA(0): EDID maximum pixel clock     : 80.0 MHz
(--) NVIDIA(0):
(--) NVIDIA(0): Established Timings:
(--) NVIDIA(0):   640  x 480  @ 60 Hz
(--) NVIDIA(0):
(--) NVIDIA(0): Detailed Timings:
(--) NVIDIA(0):   1920 x 1080 @ 60 Hz
(--) NVIDIA(0):     Pixel Clock      : 74.25 MHz
(--) NVIDIA(0):     HRes, HSyncStart : 1920, 2008
(--) NVIDIA(0):     HSyncEnd, HTotal : 2052, 2200
(--) NVIDIA(0):     VRes, VSyncStart : 540, 542
(--) NVIDIA(0):     VSyncEnd, VTotal : 547, 562
(--) NVIDIA(0):     H/V Polarity     : +/+
(--) NVIDIA(0):     Extra            : Interlaced
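For what it's worth, this is how I mapped those EDID fields onto the
Modeline I built (the vertical numbers in the log are per field, so I
doubled them for the interlaced full frame; this is my reading of it,
not gospel):

```
# Pixel clock 74.25 MHz; horizontal timings used as-is:
#   1920 2008 2052 2200
# Vertical timings are per-field (540 542 547 562); doubled to
# full-frame values for an interlaced mode: 1080 1084 1094 1124
Modeline "1920x1080_60i" 74.25  1920 2008 2052 2200  1080 1084 1094 1124  Interlace +HSync +VSync
```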

This motherboard also has component out, and after much tinkering, I got
that to do 1080i out.  I had to set the desktop to 1920x1080 in order for
it to fill the screen (with overscan).  I then tried playing a 1080i
clip in Myth with both the 'Standard' and 'libmpeg2' decoders, with
deinterlacing off.  This had the same CPU usage as before and played
back with pauses every couple of seconds.  Also, the content still
looked like it needed deinterlacing.  Am I misunderstanding something?
I thought that when outputting 1080i I should leave deinterlacing off.
I was hoping to take load off the Myth box by letting the TV's scaler
do some of the work.

Here are some xorg.conf details from that test:

Section "Device"
        Identifier  "Card0"
        Driver      "nvidia"
        Option      "NoLogo" "true"
        Option      "coolbits" "1"
        Option      "RenderAccel" "true"
        VendorName  "All"
        BoardName   "All"
        Option      "XvmcUsesTextures" "false"
        Option      "NvAgp" "0"
        Option      "UseEvents" "True"
        Option      "UseDisplayDevice" "TV"
        Option      "TVOutFormat" "COMPONENT"
        Option      "TVStandard" "HD1080i"
        # It wouldn't work without the above line
        Option      "UseEDID" "FALSE"
        Option      "TVOverScan" "0"
EndSection

Section "Screen"
        Identifier        "Screen0"
        Device            "Card0"
        Monitor           "Monitor0"
        DefaultColorDepth 24
        SubSection "Display"
                Depth  24
                Modes  "1920x1080"
        EndSubSection
EndSection

I admit that I have not searched enough for info about the component-out
settings.  I found a few more things to try -- mostly taking out the
"TVOutFormat" line above and double-checking the various vblank settings.
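In case it helps anyone poking at the same knobs, the vblank settings I
mean are the ones exposed by nvidia-settings; something like the
following should query and set them from the command line (the attribute
names are my best recollection for the 9xxx-era drivers, so verify them
against the full query output first):

```
# List everything and pick out the sync-related attributes:
#   nvidia-settings -q all | grep -i sync
# Turn on OpenGL vblank syncing and Xv vblank syncing:
#   nvidia-settings -a SyncToVBlank=1
#   nvidia-settings -a XVideoTextureSyncToVBlank=1
#   nvidia-settings -a XVideoBlitterSyncToVBlank=1
```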

In the meantime, I throw myself on the mercy of the list and hope for a
kind soul to offer up any other possible ideas.

Time to go do some real work and stop messing with the TV for a while.
