[mythtv] IVTV VBI reading

Bruce Markey bjm at lvcm.com
Fri Jan 26 20:07:24 UTC 2007


Hans Verkuil wrote:
> On Wednesday 24 January 2007 00:42, Bruce Markey wrote:
> > Daniel Kristjansson wrote:
> > > On Fri, 2007-01-19 at 12:21 +0100, Hans Verkuil wrote:
> > >> Note regarding ivtv VBI support: it is flaky in all current ivtv
> > >> versions. Basically when VBI capturing is on it is possible for
> > >> MPEG or VBI data to turn up in the wrong stream. This is a
> > >> firmware bug for which only the current ivtv subversion trunk code
> > >> contains a workaround. This code will become available Real Soon
> > >> Now for kernels 2.6.18 and up. It is extremely unlikely that it
> > >> will ever be backported to older kernels since it required a huge
> > >> interrupt/DMA rewrite in ivtv.
> > >>
> > >> Due to these problems I would recommend that for ivtv VBI is
> > >> turned off by default in MythTV if possible.
> >
> > But thousands of people use it every day right now. Telling
> > them that they are not allowed to use VBI because you say it
> > is imperfect blows this way out of proportion.
> 
> Just to clarify: for ivtv-based cards the MythTV VBI setting should be 
> off by default (i.e. after installing MythTV from scratch) for driver 
> version <0.10.0. You should of course always be able to turn it on if 
> you want.

I'm lost. I've been looking for any VBI setting "for ivtv-based cards"
and I'm not finding one. There is no reference to VBI in "./configure
--help". The only thing that I know of to 'turn off VBI' is:

mysql> select * from settings where value like '%VBI%';
+-----------+---------------------+----------+
| value     | data                | hostname |
+-----------+---------------------+----------+
| VbiFormat | NTSC Closed Caption | NULL     |
+-----------+---------------------+----------+

which is set from mythtv-setup, General, second page. This turns
off VBI for all cards on all hosts regardless of card type, and it
disables Closed Captions for all frontends even when the information
is already stored in the recordings.
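
For reference, the same change can be made at the database level.
This is a sketch only; 'None' is my assumption for the value
mythtv-setup writes when that option is selected, so verify it
(and back up the table) first:

-- Disable VBI globally by changing the VbiFormat setting.
-- 'None' is assumed to be the value mythtv-setup writes for
-- the disabled choice; verify against your version first.
UPDATE settings SET data = 'None' WHERE value = 'VbiFormat';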


> It can record it perfectly,

I disagree.

>  but scaling does introduce a slight amount of 
> ghosting. Whether this is a driver, firmware or hardware bug is 
> something that I need to look into one of these days. 

This seems to be a severe issue if only the extreme maximum
resolution is considered acceptable. A 720x480 recording file with
the same relative bitrate as 480x480 would be 50% larger for no (or
very little) visible difference.
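
The arithmetic is easy to check from the same mysql prompt (pure
arithmetic, nothing MythTV-specific):

-- Pixels per frame at 720x480 vs. 480x480; at the same
-- bits-per-pixel rate the file grows by the same factor.
SELECT (720 * 480) / (480 * 480) AS size_ratio;  -- returns 1.5000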

> > Lastly, as has been covered many, many times here over the years,
> > NTSC/PAL is not very high resolution. As the electron beam sweeps
> > across the screen, the analog resolution of broadcast equipment
> > and TV sets is in a range equivalent to 300-500 vertical scan
> > lines.
>
> Out of curiosity, is this also true when you record
> with S-Video from a DVD player, for example?

Yes. The NTSC output is an analog waveform with no digital
characteristics, but that isn't even the point. The signal directly
from a DVD player may be cleaner, and depending on the TV set, you
may be able to get a little more out of the law of diminishing
returns by using a higher number of samples per scan line as the
(now analog) signal is re-sampled. However, the limitation is not
just the input signal but also the output of the display device.

> > http://www.ntsc-tv.com/ntsc-index-04.htm

If you looked at this page, you may have been distracted by the
animation demonstrating the Kell factor.

There are a couple of ways of looking at this, but there are limits
to how much detail a beam flying across a screen can render. The
example I like is to imagine a black picture with a vertical line
in the middle that is extremely bright and extremely narrow. On a
TV screen, the line will be no brighter than the maximum brightness
of the set. When the beam is on the line, the line will be as wide
as the beam itself. Further, as the beam approaches the line, the
power needs to go from nothing to maximum, and it takes some time
and distance for the amplitude to change. As it leaves the line, it
again takes time to ramp down.

Both the width of the beam and the time it takes to change limit
the horizontal resolution. Even a 'good' large TV tube isn't going
to be much better than 400 lines of resolution, though this is a
vague comparison because the signal is analog. However it is
measured, you could have thousands of digital samples per scan
line, but the picture can't go from black to white to black to
white in less than the width of the beam, or even show distinct
vertical lines at near the width of the beam.
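
To put a rough number on that: the textbook estimate of horizontal
resolution is 2 x luma bandwidth x active line time, corrected for
the 4:3 aspect ratio. Using the standard NTSC figures of 4.2 MHz
and about 52.6 microseconds (textbook values, not measurements of
any particular set):

-- Lines of horizontal resolution per picture height for NTSC:
-- two resolvable lines per signal cycle, scaled to picture height.
SELECT 2 * 4.2 * 52.6 / (4 / 3) AS tv_lines;  -- roughly 330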

> Well, a 720x480 MPEG stream has 1) the correct display ratio,

You mean square pixels? Four-thirds of 480 is 640, and 720 is
nowhere near 4:3 for PAL either. I know of nothing for which this
is the correct display ratio.
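
A quick sanity check on the numbers, again just arithmetic: on a
4:3 display, 720x480 implies non-square pixels, while 640x480
gives exactly square ones.

-- Pixel aspect ratio = display aspect / storage aspect.
SELECT (4 / 3) / (720 / 480) AS par_720x480,  -- about 0.89, not square
       (4 / 3) / (640 / 480) AS par_640x480;  -- exactly 1.0, square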

> 2) can easily be burned to DVD without need for resizing.  

1) 704x480 has been used as a standard resolution for DVDs, and
any of several resolutions can be written to DVD.

2) I can set the bt8x8 chip on a bttv card to output 720 samples
per scan line, but that driver isn't broken at all the other
resolutions. Therefore, I don't have to generate huge files, with
no benefit, just to work around a bug. As computed above, a 720x480
recording with the same relative bitrate as 480x480 would be 50%
larger, and that doesn't even account for the differences between
MPEG-2 and MPEG-4.

3) Well beyond 99% of the TV shows recorded by MythTV are never
burned to DVD; I would be surprised if the rate were as high as 1
in 100,000, or if as many as 1% of MythTV users have ever burned a
DVD. Regardless of the statistics, the possibility of burning a DVD
does not negate the existence of the bug, and this sounds more like
an attempt to rationalize than a legitimate benefit.

> The default bitrate 
> set by ivtv is sufficient for good quality encoding at that resolution.

Now you've piqued my curiosity. What is this default bitrate,
and why is it a good choice?

--  bjm

