[mythtv-users] Myth scaling/deinterlacing vs Faroudja

Brad Templeton brad+myth at templetons.com
Sat Jun 24 19:46:45 UTC 2006


On Sat, Jun 24, 2006 at 11:36:50AM -0600, John P Poet wrote:
> 
> Unfortunately, your video card gets in the way.  There are very few
> video cards that can be fed an interlaced video stream.  When you
> output interlaced, the video card is actually taking in a progressive
> frame, and splitting it into two fields to send to the tv.

What do you mean by "video card _fed_ an interlaced video stream"?

Video cards (as opposed to capture cards) output video streams; they
are not fed them.   Capture cards all handle capturing interlaced streams.

In the earliest days of PCs, you typically could only get 1024x768 by
going interlaced.  Due to that legacy, most video cards still support
interlaced output, from what I recall.   1080i mode on your video card
gives you a 1920x1080 frame buffer.   The video card spits out the
even lines from that buffer in one (half) frame, and the odd lines in the next.
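To make that concrete, here is a rough sketch (plain Python, with a
made-up list-of-rows frame representation, not real driver code) of
what the card does in 1080i mode: each scanout takes every other line
of the progressive frame buffer, alternating parity.

# Sketch: an interlaced scanout pulling fields out of a progressive
# frame buffer.  A "frame" here is just a list of 1080 rows of pixels;
# the field parity flips on every scanout.

def scanout_field(frame_buffer, parity):
    """Return every other line: parity 0 -> even lines, 1 -> odd lines."""
    return frame_buffer[parity::2]

# 1080 rows of fake pixel data standing in for the card's frame buffer.
frame_buffer = [[y * 1920 + x for x in range(1920)] for y in range(1080)]

even_field = scanout_field(frame_buffer, 0)  # 540 even lines, one (half) frame
odd_field = scanout_field(frame_buffer, 1)   # 540 odd lines, the next one
assert len(even_field) == len(odd_field) == 540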

The TV then takes this and maps it back into its own frame buffer.  Digital
TVs all have a frame buffer, and it's typically rasterized progressively
to the output device, be it a DLP or LCD panel.   In the old days a TV would
be an analog device, and the lines were rasterized directly onto phosphor.
The odd half-frame was drawn just one scan line below the even half-frame,
with a gap between each line.  Persistence of the phosphor kept the prior
half-frame on screen while the current half-frame was being displayed.

But as noted, that's not how digital TVs work.
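On the digital side, the rough equivalent (a hypothetical sketch, not
any particular set's firmware) is a weave: the TV interleaves the two
most recent fields back into one progressive frame in its own buffer
before driving the panel.

# Sketch: a digital TV "weaving" two incoming fields back into one
# progressive frame in its internal frame buffer.

def weave(even_field, odd_field):
    """Interleave two fields: even output lines come from the even field,
    odd output lines from the odd field."""
    frame = [None] * (len(even_field) + len(odd_field))
    frame[0::2] = even_field
    frame[1::2] = odd_field
    return frame

# Tiny example: a 4-line picture split into two 2-line fields and rebuilt.
assert weave(["line0", "line2"], ["line1", "line3"]) == \
       ["line0", "line1", "line2", "line3"]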


So my understanding is, if you set your video card at 1080p (59.94 fps)
then the frame buffer in your video card will be spit out, 60 times a
second, to be copied into the frame buffer of your TV, which is effectively
what you want.

> Now, if I understand things correctly, you can trick the system.
> Myth's BoB deinterlacer sends out data at double-rate --- first the
> "top" field, then the "bottom" field.  So, if you set your modeline to
> 1920x540p (60Hz), most TVs will think they are receiving 1920x1080i
> (30Hz), and deinterlace the two "fields".

My understanding is the bob deinterlacer is trying to emulate what the
old phosphor TV did.   The transmitted signal is 60 fps of odd, even, odd,
even fields, so a "full" frame only arrives at 30 fps.   However, a bob
deinterlacer transmits 60 full frames per second, each one composed of
the current half-frame (either odd or even) and the most recent other
half-frame.   A smarter bob deinterlacer (known as bob+weave) still
gives 60 full frames per second, but tries to be clever about interpolating
among the half-frames to be smoother.  It takes the most CPU.
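In pseudo-code terms (a hypothetical Python sketch of the behaviour
described above, not Myth's actual filter), bob looks roughly like this:
one full-height output frame per incoming field, built from the current
field plus the most recently seen field of the opposite parity.

# Sketch of the bob-style deinterlace described above: fields arrive at
# 60 per second (odd, even, odd, ...), and for each incoming field we
# emit one full-height frame.

def bob_deinterlace(fields):
    """fields: iterable of (parity, lines) in arrival order, where
    parity 0 = even/top lines and 1 = odd/bottom lines.
    Yields one full frame per incoming field (60 fps out for 60 fields/s in)."""
    last = {0: None, 1: None}
    for parity, lines in fields:
        last[parity] = lines
        other = last[1 - parity]
        if other is None:
            other = lines          # no opposite field seen yet: line-double
        frame = [None] * (len(lines) * 2)
        frame[parity::2] = lines
        frame[1 - parity::2] = other
        yield frame

# Tiny example: two 2-line fields of a 4-line picture.
frames = list(bob_deinterlace([(0, ["e0", "e2"]), (1, ["o1", "o3"])]))
assert len(frames) == 2                       # one output frame per field
assert frames[1] == ["e0", "o1", "e2", "o3"]  # current odd + previous even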


If you keep your TV in 1080p mode, myth will be doing this deinterlace
for you.   If you run your TV in 1080i mode and switch your video card,
the TV will do it.  But you'll have to switch your video mode with
each video, since you don't want to display 720p video in an interlaced
mode.  For 720p video, you will either run the TV and video card at
1080p, and let xvideo scale up the 720p to 1080p, or run the TV and
video card at 720p and let the TV scale up the 720p with its own hardware
scaler.   Both of these should produce similar results.
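For what it's worth, the scaling step itself (whether xvideo on the card
or the TV's scaler does it) just maps each 1920x1080 output pixel back to
a 1280x720 source pixel.  Real scalers filter properly; this
nearest-neighbour sketch (hypothetical Python, tiny stand-in frame) only
shows the idea.

# Sketch: upscaling a 720p frame to 1080p.  Real scalers use bilinear or
# bicubic filtering; nearest-neighbour just shows the 2:3 mapping
# (720/1080 == 1280/1920).

def scale_nearest(frame, out_w, out_h):
    in_h, in_w = len(frame), len(frame[0])
    return [[frame[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

# Tiny stand-in frame with the same 2:3 ratio (2x4 scaled to 3x6).
small = [[(y, x) for x in range(4)] for y in range(2)]
big = scale_nearest(small, out_w=6, out_h=3)
assert len(big) == 3 and len(big[0]) == 6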

For menu mode, you will probably want to run in 1080p.   When you
push aside mythtv to use the display as a computer display (for browser,
email) you will want to be at 1080p for sure.

So you will either be switching modes all the time, or leaving it at 1080p
and letting myth do the conversions to that mode.

