[mythtv-users] Need help with Modeline/1080i
mlists
mlists at dressler.ca
Tue Oct 2 00:12:21 UTC 2007
Hi all,
I'm sure this has been hashed out a hundred times already but this is
driving me nuts.
I have an Nvidia 5200... it works great over the VGA connection, but I want to
use DVI to HDMI on my TV, a Daytek 37" LCD. The TV is also hooked up to an
HD cable receiver, and 1080i from that looks great -- nice, crisp and clear --
better than my VGA connection.
I turned off my machine and connected the DVI-to-HDMI cable, but now I get
nothing on the monitor at all. My Xorg log has:
(II) NVIDIA(0): NVIDIA GPU GeForce FX 5200 (NV34) at PCI:1:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 131072 kBytes
(--) NVIDIA(0): VideoBIOS: 04.34.20.87.00
(II) NVIDIA(0): Detected AGP rate: 8X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce FX 5200 at PCI:1:0:0:
(--) NVIDIA(0): PYX 26LCDTV (DFP-0)
(--) NVIDIA(0): PYX 26LCDTV (DFP-0): 135.0 MHz maximum pixel clock
(--) NVIDIA(0): PYX 26LCDTV (DFP-0): Internal Single Link TMDS
(II) NVIDIA(0): Assigned Display Device: DFP-0
(WW) NVIDIA(0): No valid modes for "1360x768"; removing.
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "1280x768"
(II) NVIDIA(0): "800x600"
(II) NVIDIA(0): "640x480"
(II) NVIDIA(0): Virtual screen size determined to be 1280 x 768
(--) NVIDIA(0): DPI set to (56, 60); computed from "UseEdidDpi" X config option
(==) NVIDIA(0): Disabling 32-bit ARGB GLX visuals.
How do I get 1080i set up on this?
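In case it helps to see what I've been trying: here's a sketch of the kind of custom modeline section I've been experimenting with in xorg.conf. The timing numbers are the standard CEA-861 1080i values (74.25 MHz pixel clock, which is well under the 135 MHz limit the driver reports for this DFP); the driver options come from my reading of the nvidia README, so treat them as guesses rather than a known-working config:

```
Section "Monitor"
    Identifier "DaytekTV"
    # Standard CEA-861 1920x1080 interlaced timing at 60 Hz fields:
    # 74.25 MHz clock, htotal 2200, vtotal 1125, interlaced
    ModeLine "1920x1080_60i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 Interlace +HSync +VSync
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Videocard0"
    Monitor    "DaytekTV"
    # Ask the driver to use the modeline timings verbatim over DVI
    # (from the nvidia driver README; untested on this setup)
    Option "ExactModeTimingsDVI" "TRUE"
    SubSection "Display"
        Modes "1920x1080_60i" "1280x768" "800x600" "640x480"
    EndSubSection
EndSection
```

If the driver still rejects the mode, the log's "No valid modes" warnings should say which validation check failed.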
Thanks
Norm