[mythtv-users] Modeline adjustments?

Steven Adeff adeffs.mythtv at gmail.com
Fri Aug 11 18:32:23 UTC 2006


On 8/11/06, Tim Scholl <tascholl at gmail.com> wrote:
> On 8/10/06, Steven Adeff <adeffs.mythtv at gmail.com> wrote:
> > On 8/10/06, Morten Rønseth <morten.ronseth at webfx.no> wrote:
> > >
> > >  Tim Scholl wrote:
> > > But looking closer I still do not see modelines for
> > >
> > >  "1920x1080_60i" or  "1280x720_60"
> > >
> > >  Any suggestions?
> > >
> > >
> > >  Please enclose a verbatim (unedited) version of xorg.conf.
> > >
> > >
> > >  Cheers,
> > >
> > >
> > >  -Morten
> > >
> > >
> > > Tim
> > >
> > >
> > >
> > > On 8/10/06, Tim Scholl <tascholl at gmail.com> wrote:
> > > >
> > > > I downloaded and installed 1.0-8762
> > > >
> > > > NVRM: loading NVIDIA Linux x86 Kernel Module   1.0-8762  Mon May 15 13:06:38 PDT 2006
> > > >
> > > > It updated my XF86Config-4 with the new modeline:
> > > > ModeLine       "1920x1200" 193.2 1920 2048 2256 2592 1200 1201 1204 1242 +hsync -hsync
> > > >
> > > > But when I look at the XFree86.0.log
> > > >
> > > > --snip--
> > > > (II) Setting vga for screen 0.
> > > > (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
> > > > (==) NVIDIA(0): RGB weight 888
> > > > (==) NVIDIA(0): Default visual is TrueColor
> > > > (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
> > > > (**) NVIDIA(0): Enabling RENDER acceleration
> > > > (II) NVIDIA(0): NVIDIA GPU GeForce FX 5200 at PCI:1:0:0
> > > > (--) NVIDIA(0): VideoRAM: 262144 kBytes
> > > > (--) NVIDIA(0): VideoBIOS: 04.34.20.27.02
> > > > (II) NVIDIA(0): Detected AGP rate: 8X
> > > > (--) NVIDIA(0): Interlaced video modes are supported on this GPU
> > > > (--) NVIDIA(0): Connected display device(s) on GeForce FX 5200 at PCI:1:0:0:
> > > > (--) NVIDIA(0):     Samsung (CRT-0)
> > > > (--) NVIDIA(0): Samsung (CRT-0): 350.0 MHz maximum pixel clock
> > > > (II) NVIDIA(0): Assigned Display Device: CRT-0
> > > > (WW) NVIDIA(0): No valid modes for "1920x1200"; removing.
> > > > (II) NVIDIA(0): Validated modes:
> > > > (II) NVIDIA(0):     "1024x768"
> > > > (II) NVIDIA(0):     "800x600"
> > > > (II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
> > > > (WW) NVIDIA(0): No size information available in CRT-0's EDID; cannot compute
> > > > (WW) NVIDIA(0):     DPI from EDID.
> > > > (**) NVIDIA(0): DPI set to (56, 65); computed from "DisplaySize" Monitor section option
> > > > (--) Depth 24 pixmap format is 32 bpp
> > > > (II) do I need RAC?  No, I don't.
> > > > --snip--
> > > >
> > > > Since I will be connecting this through a 9A60 to my HDTV, don't I also need to specify interlaced?
> > > >
> >
> > please, please bottom post (check my sig for mailing list etiquette).
> >
> > Here's some of my xorg.conf for my frontend that uses my PCX5300 (an
> > FX5200 grafted onto a PCIe card) VGA output through my 9A60 to my
> > HDTV with driver 8756...
> >
> > Section "Device"
> >     Identifier  "fx5200"
> >     Driver      "nvidia"
> >     Option      "RenderAccel"       "1"
> >     Option      "RandRRotation"     "1"
> >     Option      "IgnoreDisplayDevices"      "TV"
> >     Option      "Coolbits"  "1"
> >     VendorName  "All"
> >     BoardName   "All"
> >     BusID       "PCI:5:0:0"
> > EndSection
> >
> > Section "Screen"
> >     Identifier     "Sanyo"
> >     Device         "fx5200"
> >     Monitor        "HDTV"
> >     DefaultDepth    24
> >     Option         "NoLogo" "False"
> >     Option         "IgnoreDisplayDevices" "TV"
> >     Option         "RandRRotation" "True"
> >     Option         "DPI" "148 x 148"
> >     Option "ConnectedMonitor" "CRT"
> >     Option "UseDisplayDevice" "CRT-0"
> >     SubSection     "Display"
> >         Depth       24
> >         Modes      "1920x1080_60i" "1280x720_60"
> >     EndSubSection
> > EndSection
> >
> > I have no modelines for "1920x1080_60i" or "1280x720_60".
> >
> >
> Steven,
>
> Thanks for the information. Do you have a "Monitor" section? That is where
> my modes are defined. Could you post it?

I have no modelines defined in my Monitor section. That was my point in
my first post: the modes are defined as part of the driver package. I'm
assuming the NVIDIA folks realized that people were connecting their
cards to HDTVs (and SDTVs as well) and built in modelines for these
specific uses based on the HDTV output standards.
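
If it turns out the driver doesn't provide them for your card/display
combination, you should be able to define them by hand in your Monitor
section. Something like the following ought to be close (these are the
nominal SMPTE 274M / CEA-861 timings, not something I copied from a
working config, so treat the sync ranges as a guess and check the numbers
against what your set actually accepts):

Section "Monitor"
    Identifier   "HDTV"
    # guesses - set these to what your display really handles
    HorizSync    30.0 - 70.0
    VertRefresh  50.0 - 60.0
    # 1080i60: 74.25 MHz pixel clock, 2200x1125 total frame, interlaced
    ModeLine "1920x1080_60i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 interlace +hsync +vsync
    # 720p60: 74.25 MHz pixel clock, 1650x750 total frame, progressive
    ModeLine "1280x720_60"   74.25 1280 1390 1430 1650  720  725  730  750 +hsync +vsync
EndSection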

As for PAL/NTSC, I think they did create default modelines for SDTV
resolutions as well, but I'm not sure; it's something better asked on
the nvnews.com forums (the unofficial-official NVIDIA Linux driver
forum).


-- 
Steve
Before you ask, read the FAQ!
http://www.mythtv.org/wiki/index.php/Frequently_Asked_Questions
then search the Wiki, and this list,
http://www.gossamer-threads.com/lists/mythtv/
Mailinglist etiquette -
http://www.mythtv.org/wiki/index.php/Mailing_List_etiquette

