[mythtv-users] Sony Bravia, Modlines & Intel chipset!

Douglas Wagner douglasw0 at gmail.com
Wed Apr 25 21:41:56 UTC 2007


Not sure how much help I can be, but let's give a few things a shot.

First off, your best debugging tool will be starting X at a higher log level
so that more information goes out to the log files.  The trick to doing this
has been posted all over these boards and I don't have it handy, but in
essence you're starting X with a command-line switch that raises the log
level to 5 or 6, I think.  Afterwards, in your /var/log directory you'll see
an Xorg.0.log (or something close to that) which should at that point print
out a whole POTLOAD of information about your X configuration.
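
For reference, I believe the usual invocation is roughly this (shut down your
display manager first; the exact switch and log file name may differ on your
distro):

    # start X by hand with a more verbose server log
    startx -- -logverbose 6

    # then read the result
    less /var/log/Xorg.0.log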

My GUESS is that your TV is trashing your attempted modeline due to EDID.
All recent TVs (or at least the LCD ones like I have) seem to tell the video
subsystem what they can and can't support.  For instance, my Sharp 37"
has a documented pixel size of something like 1344x768.  However, every
modeline even close to that makes my TV choke and spit it back out.  It only
likes 480p (640x480), 720p (1280x720), and 1080i (1920x1080 interlaced), and
1080i ONLY if I explicitly tell the video subsystem that using interlaced
modes is OK, despite the fact that the TV has that modeline listed in EDID.
There's also a really weird resolution of 1440x480 or something like that in
there, and I have no idea why.
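
If you want to see exactly what your set is claiming, and your distro carries
the read-edid tools, something along these lines should dump the EDID block
in readable form (run as root; the package and tool names are an assumption
on my part):

    # fetch the EDID block from the display and decode it
    get-edid | parse-edid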

The thing is, when nVidia loads its drivers it first goes to EDID and asks
"What's available?", and EDID returns the set of modes the TV says it can
handle.  X then THROWS OUT anything that doesn't match those
modelines...it won't even try them.  On my TV, forcing it (see below) to use
a modeline that isn't in EDID results in the TV going "blue screen" on me
and simply not syncing up.

The reason you should probably try to run X at a higher log level is that at
level 5 or 6 (I forget which) it starts actually reporting what EDID says the
TV can handle, and X will report on a modeline-by-modeline basis what it's
doing and, if a mode is rejected, it will actually describe why (typically
because it's not reported in EDID.)
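
Once you have a verbose log, a quick grep is usually enough to see the
verdict on each mode, something like this (the exact wording of the messages
varies by driver version):

    # pull out the EDID dump and the per-modeline accept/reject messages
    grep -iE 'edid|mode' /var/log/Xorg.0.log | less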

You have a couple of options if you find this to be the case.  1) You can
abide by the EDID modes like I have.  I didn't try to force my monitor into a
1:1 pixel ratio and instead just live with fuzzy web browsing, since Myth and
TV playback work just fine.  2) You can in essence tell the nVidia drivers to
knock off looking at EDID and just try what you want them to try.  There are
at least 5 - 10 different options to turn off parts or the entirety of the
EDID checks that happen when X boots, all documented in NVIDIA's
documentation that ships with the driver.
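
I won't swear to the exact option names from memory (check them against the
README for your driver version), but the xorg.conf changes usually look
something like this sketch:

    Section "Monitor"
        Identifier  "TV"
        # hand-specified 720p timing (standard CEA 1280x720 @ 60 Hz);
        # gtf or cvt can generate alternatives for you
        Modeline "1280x720_60"  74.25  1280 1390 1430 1650  720 725 730 750  +hsync +vsync
    EndSection

    Section "Device"
        Identifier  "nvidia0"
        Driver      "nvidia"
        # heavy-handed: ignore EDID entirely
        Option      "UseEDID" "false"
        # or, more surgically, keep EDID but stop it vetoing extra modes
        # (verify the ModeValidation tokens in NVIDIA's README):
        # Option    "ModeValidation" "AllowNonEdidModes"
    EndSection

    Section "Screen"
        Identifier  "Screen0"
        Device      "nvidia0"
        Monitor     "TV"
        SubSection "Display"
            Modes   "1280x720_60"
        EndSubSection
    EndSection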

Using a VGA connector to your TV, however, may not be your best option.  If
your graphics card can handle it, I would HIGHLY recommend you go with
another solution.  Your BEST option would be DVI->HDMI on your TV.  If that
isn't possible (either your graphics card doesn't have a DVI out or your TV
doesn't have an HDMI in) then you want to go with the following:

Component video (the red, green and blue YPbPr cables)
S-Video (looks somewhat like a PS/2 connector)
Composite (the single yellow video cable, usually bundled with red and white audio plugs)

In that order.  From component video and above you can get HD resolution;
below component I don't believe HD resolution is possible (someone can
correct me on that if I'm wrong.)

In essence the documentation you're reading may actually be correct...you
may very well NOT be able to get a better resolution on your TV via the VGA
output than 1024x768.

--Douglas Wagner


On 4/25/07, Paul <thannet at gmail.com> wrote:
>
> I have been messing around with this for at least a week, following
> numerous links to numerous dead ends! My problem is that I am new to
> Linux and Myth, and really suffer from an understanding problem.
>
> So here goes:-
>
> I have a small Fujitsu Siemens desktop (nice and quiet). Installed Ubuntu
> Feisty (7.04), installed the Myth frontend (the backend is already running
> fine on another machine).
>
> Connected up to a monitor, all works well at 1280x1024.  Moved the unit to
> the TV and plugged in via the VGA socket, and I cannot get anything other
> than 1024x768. After further googling I found out that the default
> resolutions of the Intel graphics do not include the native Bravia
> resolution of 1280x768 at 60Hz. So off I went and researched the chipset,
> and lo and behold found a program called 915resolution. Following the
> instructions, I replaced one of the modes with the required resolution and
> restarted X; sadly it still did not show up as an option in resolutions!
>
> OK, at this stage I read a link about modelines, and now this really is
> getting beyond my very limited understanding.
>
> Can any kind soul please help, this is driving me nuts.
>
> Paul.
>