[mythtv-users] How I got great quality TV-out on my nVidia MX4000

Joe Votour joevph at yahoo.com
Thu Mar 10 04:31:28 UTC 2005


I'll preface this by saying that I'm not a graphics
guru.  However, I have been doing graphics-related
work off and on for the last ten years (although
nothing too fancy, mostly 2D stuff).

The key to MythTV (or any program, really) being able to render a
display without tearing or choppiness really comes down to two things:
1. Knowing when the vertical sync occurs, and,
2. Being able to react to the vertical sync event in a timely manner

With regard to (1):
Basically, the way an image is displayed on a CRT television or
monitor (I don't know about projection units or the like) is that it
is drawn left to right, top to bottom, by an electron beam sometimes
called the "raster".  When the raster goes outside the viewable area,
you are said to be in the vertical blanking interval (VBI).  For the
purposes of this discussion, we'll ignore the fact that some TV
signals carry Closed Captioning data during the VBI on the input
signal.

Generally speaking, while you are within the VBI, you may update any
portion of the screen you want and see no tearing or artifacts
(because the raster isn't drawing it).  It is extremely important
that you update the screen fully during the VBI or, at the very
least, update each part of the screen before the raster draws it
(otherwise, you will see tearing/jitter).

If your output driver/software has no way of knowing when it is
within the VBI, then you can't hope to accomplish this, and the odds
are very good that you will see tearing or jitter.

As Isaac said in his response, DRM (the Direct Rendering Manager) is
one method.  The OpenGL vsync extension is another.
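
For illustration only (this isn't MythTV's actual code, just a
minimal C sketch that assumes libdrm is installed, that the kernel
exposes a DRM device node such as /dev/dri/card0, and that you link
with -ldrm), waiting for the next vertical blank through DRM looks
roughly like this:

    #include <fcntl.h>
    #include <stdio.h>
    #include <xf86drm.h>

    int main(void)
    {
        int fd = open("/dev/dri/card0", O_RDWR);  /* assumed device node */
        if (fd < 0) {
            perror("open /dev/dri/card0");
            return 1;
        }

        drmVBlank vbl;
        vbl.request.type = DRM_VBLANK_RELATIVE;  /* count from "now"...     */
        vbl.request.sequence = 1;                /* ...wait for next vblank */

        if (drmWaitVBlank(fd, &vbl) == 0) {
            /* We are now at the start of the VBI; this is the moment
               to update/flip the frame, before the raster re-enters
               the visible area. */
            printf("woke in vblank #%u at %ld.%06ld\n",
                   vbl.reply.sequence,
                   vbl.reply.tval_sec, vbl.reply.tval_usec);
        } else {
            perror("drmWaitVBlank");
        }
        return 0;
    }

The equivalent on the OpenGL route is the GLX_SGI_video_sync
extension (glXGetVideoSyncSGI/glXWaitVideoSyncSGI), which gives you a
retrace counter to wait on.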

Now, on to (2):
Knowing when the VBI occurs isn't enough - you must be able to get
control of the CPU while you're still in the VBI to be completely
effective (again, otherwise you might be updating the screen while
the raster is drawing it).

The biggest problem MythTV faces is that Linux is not a real-time
operating system (this problem is not limited to Linux; Windows
suffers the same fate).  While the 2.6 kernel has a much improved
scheduler (I was actually at the Embedded Systems Conference today,
discussing some scheduler-related issues my employer is having on one
of their products with MontaVista, LynuxWorks and TimeSys, to see if
any of them can help us), it is still not real-time (like VxWorks).
So, although you might know that you're in the VBI, your task might
not have gotten its time-slice from the kernel yet.  This means that
the precious time you should be spending drawing goes to another
task/thread, and by the time you get your time-slice, you might
already be out of the VBI.

Now, back before computers (and their OSes) became so complicated, I
used to be able to get perfect, tear-free displays on my Commodore 64
with the VIC-II chip, all in software.  How, you might ask?  Well,
first of all, I wrote my code in 6510 assembly language and chained
it off of the IRQ (interrupt request).  Secondly, the VIC-II had a
register which, when read, would give you the scan line (raster line)
the chip was currently drawing; writing to that register let you
trigger an interrupt whenever the chip hit the desired scan line.

Thus, I was able to do such complicated things as splitting the
screen into five parts and making changes to part two while part four
was being drawn by the raster.
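
For the curious, here is the same trick rendered as C (a rough
sketch: it compiles with the cc65 toolchain targeting the C64, the
register addresses are the real VIC-II ones, but the busy-wait loop
stands in for the raster-interrupt chaining described above):

    #include <stdint.h>

    #define VIC_RASTER   (*(volatile uint8_t *)0xD012U) /* raster line, low 8 bits */
    #define BORDER_COLOR (*(volatile uint8_t *)0xD020U) /* border colour register  */

    /* Spin until the beam reaches the given scan line.  A
       raster-compare IRQ, as in the original 6510 code, does the
       same job without burning CPU. */
    static void wait_for_raster(uint8_t line)
    {
        while (VIC_RASTER != line)
            ;
    }

    int main(void)
    {
        for (;;) {
            wait_for_raster(100);  /* beam is mid-screen...            */
            BORDER_COLOR = 2;      /* ...repaint below it (red)        */
            wait_for_raster(200);  /* beam is near the bottom...       */
            BORDER_COLOR = 6;      /* ...switch back below it (blue)   */
        }
        return 0;  /* never reached */
    }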

Unfortunately, we don't have such control over regular
Linux...

(As an aside, cards like the PVR-350 handle this in their hardware,
as part of the video decoder chip; that's why they get such a clean
image.  I suspect that XvMC also helps in this regard, but I really
don't know anything about XvMC, so I might be talking out of my ass
on that one.)

-- Joe

--- Jeroen Brosens <jeroen at fotoniq.nl> wrote:
> > On Wednesday 09 March 2005 02:06 pm, Jeroen Brosens wrote:
> >> I'd like to add that I'd like to challenge the MythTV dev-people
> >> to review the Xv/XvMC code regarding the handling of vsync while
> >> using bobdeint.
> >>
> >> One needs hardware with OpenGL support to have a vsync to get
> >> bobdeint working without going out of sync now and then (horrible
> >> jittering occurs), and that isn't good news for users of a
> >> barebone with built-in graphics that can't support that, like
> >> myself (using an ASUS Pundit).
> >
> > Well, how do you expect it to know when to flip the buffers if the
> > video card can't tell it accurately?  Magic?
> >
> > Isaac
> 
> No Isaac, just something other than GL vsync. I am not venting my
> frustrations upon people either, rather just stirring up some new
> ideas on this. After all, this is MythTV, Linux, Open Source...
> where meeting challenges is the fun of everything! I also could have
> installed Windows MCE to be 'just a regular user on the safe side',
> but I want to be able to participate in meeting the challenges where
> I can.
> 
> What I understand now is that all of the devs use nVidia; can you
> agree that this diminishes compatibility regarding video-related
> functionality? I am not a C++ developer, you know; if I were, I
> would have tried to fix the problems myself, but I can't.
> 
> Now, on topic: am I talking plain nonsense when I ask whether the
> VBI device can be used for vsyncing? What I know is that it is used
> for teletext data and 'walks' in sync with the video fields, so
> maybe that is an alternative to using GL vsync.
> 
> -- Jeroen