[mythtv-users] optimizing nvidia tv-out for widescreen rear projection tv connected via s-video

James Buckley xanium4332 at googlemail.com
Fri May 18 16:29:25 UTC 2007


On 18/05/07, Daniel Agar <daniel at agar.ca> wrote:
>
> > Now that I've been using Myth for a while, I'm starting to wonder if I
> > can get better picture quality.  Not that it's bad; it seems pretty
> > darned nice, really.  But one just never knows.
> >
> > My TV is a 46" Sony widescreen rear projection TV, and the only
> > connections available on my FE are s-video and composite (the TV has
> > DVI, component, etc.).  I have a feeling that s-video will be the
> > limiting factor for me.  My card is an nVidia Corporation NV34 [GeForce
> > FX 5200] (rev a1).
> >
> > One other thing I've noticed, which perhaps someone can chime in on: I
> > route most of my video devices through my receiver/amp, and for whatever
> > reason the nVidia driver/card will only initialize the TV-out if I'm
> > connected directly to the TV (I don't have to be on that particular
> > input, and the TV doesn't even need to be on) or if the receiver is
> > turned on with that input selected.  I'm guessing that the card/driver
> > does something to attempt to detect the TV, which fails when it's hooked
> > up to my receiver.
> >
> >
> > The pertinent xorg.conf sections are below; I left out my other monitor
> > config, since it's rarely hooked up and I don't think it has any effect
> > on things.  I'm of course open to any and all suggestions.
> >
> > tia.
> >
> > Section "Monitor"
> >     Identifier  "TV"
> >     HorizSync   30-50
> >     VertRefresh 60
> >     Option "UseEdidDpi" "FALSE"
> >     Option "DPI" "100 x 100"
> > EndSection
> >
> > Section "Device"
> >     Identifier  "nvidia1"
> >     Driver      "nvidia"
> >     Option      "NoLogo"        "true"
> >     VideoRam    128000
> >     Option      "TVStandard"    "NTSC-M"
> >     Option      "TVOutFormat"   "SVIDEO'
> > #    Option      "TVOverScan"    "0.6"
> >     Option      "TwinView"
> >     Option      "TwinViewOrientation"   "Clone"
> >     Option      "ConnectedMonitor"      "TV"
> >     # Insert Clocks lines here if appropriate
> > EndSection
> >
> >
> > Section "Screen"
> >     Identifier  "TVOut"
> >     Device      "nvidia1"
> >     Monitor     "TV"
> >     Option      "TVStandard"    "NTSC-M"
> >     Option      "TVOutFormat"   "SVIDEO"
> >     DefaultDepth 32
> >     Subsection "Display"
> >         Depth       24
> >         Modes       "1024x768" "800x600" "640x480"
> >         ViewPort    0 0
> >     EndSubsection
> >     Subsection "Display"
> >         Depth       32
> >         Modes       "1024x768" "800x600" "640x480"
> >         ViewPort    0 0
> >     EndSubsection
> > EndSection
> >
> >
> >
> >
> This probably isn't the suggestion you're looking for, but you might
> consider swapping that FX 5200 for one that has DVI on it. I was using an
> s-video connection on my 37" LCD, and the quality difference when I
> switched to DVI was incredible, completely worth the $40 spent on the
> card.
>
> Other than that I don't see any major changes you could make to your
> xorg.conf; it's about the same as the config I use on another SD frontend.
> You might consider adjusting the overscan with the nvidia-settings tool,
> which adjusts in real time.
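
On the overscan point: nvidia-settings can also be driven from the command
line once you know roughly what you want.  Just a sketch (assuming the set
shows up as TV-0; the exact attribute name and valid range depend on the
driver version, so check the output of "nvidia-settings -q all" first):

    nvidia-settings -a TVOverScan[TV-0]=15

Once you settle on a value you like, the commented-out "TVOverScan" option
in your Device section should let you make it stick across restarts.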
I second this; DVI/HDMI gives an awesome picture, and in some circumstances
allows one-to-one pixel mapping.
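
If you do end up going the DVI route, here's a rough idea of the xorg.conf
change involved, just as a sketch.  It assumes the set accepts 720p over
DVI and that the new card's DVI output is the only display connected; the
identifiers are made up, and your TV's manual/EDID is the real authority
on which modes it will take:

    Section "Monitor"
        Identifier  "TV-DVI"
        # Over DVI the driver can read modes from the TV's EDID, so the
        # HorizSync/VertRefresh guesses needed for s-video go away.
    EndSection

    Section "Screen"
        Identifier   "TVOut-DVI"
        Device       "nvidia1"
        Monitor      "TV-DVI"
        DefaultDepth 24
        SubSection "Display"
            Depth   24
            Modes   "1280x720"
        EndSubSection
    EndSection

Driving a fixed-pixel display at its native resolution like this is what
gets you the one-to-one mapping; over s-video the TV encoder always
rescales to NTSC, so there's a hard ceiling on quality no matter what you
tune.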
