[mythtv-users] Hardware for Best TV-Out?

Cory Papenfuss papenfuss at juneau.me.vt.edu
Wed Oct 27 12:06:11 UTC 2004


> They do line-lock to the original source; they do not change the timing.  The
> generated sub-carrier will be off by whatever percentage your sync is off.
> (It is not a TBC at all.  If you want a quality transcoding TBC(*) it's more
> like $4k off the shelf, but much cheaper on eBay.  I have one with component
> in, but I don't have an RGB one to try.)  You are expected to provide the
> Harmonic unit with correctly timed video to begin with (that's why they say
> broadcast-quality in :).  If you are not "reasonably close" it will not work
> at all.  If you are _only_ close then the output will also be only close, but
> the frequency relationships among the components will be the correct.  That's
> what you are paying for.

 	Since you say it uses a PLL like I was thinking of doing, it 
really does boil down to how accurate your HSYNC is.  I'm sure that the 
lock-in range of the PLL in their circuit won't sync beyond a certain 
(hopefully narrow) range.  Even so, an HSYNC of 15.75kHz gives 
227.5*15.75kHz = 3.583125MHz, a far cry from the intended +-10Hz 
subcarrier tolerance around the nominal 3.579545MHz.
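To put numbers on that (a back-of-the-envelope sketch; the constants are the standard NTSC figures, and the arithmetic just restates the 227.5 multiplication above):

```python
# Back-of-the-envelope: how tight HSYNC must be for a PLL-derived
# subcarrier to stay within the +/-10 Hz NTSC tolerance.
CYCLES_PER_LINE = 227.5
FSC_NOMINAL = 3_579_545.0                   # Hz, nominal NTSC color subcarrier
H_NOMINAL = FSC_NOMINAL / CYCLES_PER_LINE   # ~15734.27 Hz, nominal line rate

fsc_tolerance = 10.0                             # Hz, the +/-10 Hz spec
h_tolerance = fsc_tolerance / CYCLES_PER_LINE    # HSYNC error gets multiplied by 227.5
ppm = fsc_tolerance / FSC_NOMINAL * 1e6          # same tolerance as a fraction

print(f"HSYNC must stay within +/-{h_tolerance:.3f} Hz ({ppm:.1f} ppm)")
```

That works out to roughly +/-0.044 Hz on the line rate, i.e. about 2.8 ppm, which is why a merely "reasonably close" HSYNC isn't good enough.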

>
> I think the comment about regenerating all sync and timing signals is slightly
> misleading.  It is _true_ (I traced the circuit to see what they were doing;
> to their credit they do not sand the part numbers off the ICs like some)
> but it is more a matter of convenient use of the components they chose than
> necessity.

 	Market-speak at its best.

> They could have created the locked sub-carrier and used the sync
> as provided, but by the time they had done the former they already had both
> vertical and horizontal sync available as a side effect.  Using the original
> sync at that point would have required a more complicated circuit, plus they
> might have had to worry about propagation delays.  And of course, the input
> sync pulses might not have quite the right widths or such.

 	So they did run it through a frequency multiplier based on the 
horizontal sync, then?  The trouble with that is that your HSYNC's 
tolerance is now 227.5 times more important to get right.

>
> (*) Pet-peeve: have you noticed that some TBC manufacturers make you pay
> extra for the "transcoding" feature beyond what you pay for the input and
> output options?  Given that they are basically rendering to a frame buffer
> and generating video from that it's hard to see how a TBC could fail to
> transcode unless they go out of their way to prevent such behavior when you
> haven't paid for it...

 	I guess it depends on whether it's trying to account for a 
single frame here and there being out of sync, or a truly repetitive 50/60 
framerate difference (I'm assuming you mean PAL/NTSC transcoding).  The 
former is easy to do without much smarts, and a dropped or repeated frame 
every few seconds won't be noticed.  A constant 5/6 framerate ratio should 
at least be given some more thought.
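A minimal sketch of the "dumb" approach (my own illustration, not any particular device's algorithm): converting 60 frames/sec to 50 by dropping one frame out of every six, which is where the 5/6 ratio shows up.

```python
# Crude 60 -> 50 rate conversion: drop one frame out of every six.
# This is the "without much smarts" approach; smarter converters
# would interpolate between frames instead.
def drop_one_in_six(frames):
    return [f for i, f in enumerate(frames) if i % 6 != 5]

one_second_in = list(range(60))          # 60 input frames
one_second_out = drop_one_in_six(one_second_in)
print(len(one_second_out))               # 50 frames out: the 5/6 ratio
```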

> | 	OK... true enough.  It isn't "real" NTSC as in broadcast-quality
> |NTSC.
>
> "Real" NTSC will have the defined timing relationships.  Broadcast-quality
> will be "real" and also get the color encoding matrix right, have a highly
> stable clock, etc.

 	Strictly speaking, I agree that "real" is standards-quality NTSC 
with all the timing relationships you mentioned referenced to one master 
clock.  The difference is how "real" your mythtv box needs to be.  With a 
sufficiently dumb TV (or more likely, anything but a super-advanced TV), 
it'll work with a pretty sloppy signal.  It's not something you'll try to 
use with studio-editing equipment, but for a one-off, isolated system 
it'll be fine.  From what I understand, regular consumer VCRs and DVD 
players often produce rather horribly non-standards-compliant signals 
(Macrovision notwithstanding).

> I don't know exactly what terminology they are using.  Historically a
> synchronous system was one where multiple sources were synchronized to
> the same master time base.  Moreover, every source was not only at the
> same point in the field but was in the same field of the four-field
> sequence.  This used to be very important to broadcast studios and entire
> networks so they could do seamless cuts between sources without any video
> storage.  These days it is fairly easy to interpose a TBC whose output is
> synchronous and external sources can be asynchronous.  But a single NTSC
> signal unto itself is always synchronous.

 	... except for the discrepancy that we're talking about. 
Without either storage or phase-locking the two devices together, it's not 
possible to get truly standards-compliant NTSC from a setup like we're 
talking about.  With a COTS VGA card, your choices are:
1. AD724 async (like I'm currently doing)
 	3.58MHz subcarrier accurate in frequency, but not phase-locked to
 	HSYNC
2. AD724 sync with PLL (like Harmonic does, apparently)
 	3.58MHz subcarrier will be off by the same fractional error as 
HSYNC is from the standard 3.579545E6/227.5 = 15.734kHz.  The subcarrier 
will be phase-locked to HSYNC, however.
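To quantify the trade-off between the two options, here's a sketch assuming (hypothetically, for illustration only) that the card's HSYNC is off by 100 ppm:

```python
# Option 1 vs. option 2, with an assumed 100 ppm HSYNC error.
CYCLES_PER_LINE = 227.5
FSC_NOMINAL = 3_579_545.0                   # Hz, nominal subcarrier
H_NOMINAL = FSC_NOMINAL / CYCLES_PER_LINE   # standard line rate, ~15.734 kHz

hsync_error_ppm = 100.0                     # hypothetical card error
h_actual = H_NOMINAL * (1 + hsync_error_ppm * 1e-6)

fsc_async = FSC_NOMINAL                 # option 1: crystal-accurate, phase unlocked
fsc_pll = h_actual * CYCLES_PER_LINE    # option 2: phase-locked, frequency tracks HSYNC

print(f"PLL subcarrier frequency error: {fsc_pll - FSC_NOMINAL:.0f} Hz")
```

With those numbers the PLL option lands about 358 Hz off the nominal subcarrier, far outside the +/-10 Hz tolerance, but it keeps the 227.5-cycles-per-line phase relationship that option 1 gives up.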

> You would be making the VGA your "studio master" clock and slaving the
> AD724 to it.  That might work, but I'm not sure how stable that clock
> is or whether it is even always available.

 	Right... it would be nice, but it appears that the "Dot Clock" 
signal on the VGA feature connector isn't always available.  I don't know 
how stable it's guaranteed to be (as far as ppm drift, etc.), but it *is* 
programmable via the modeline.

> That's what the Harmonic product does.  I don't think it is entirely trivial
> to do well...

 	What I was thinking of was to run a PLL off the HSYNC that could 
be programmed via DIP switches or something.  Then for a modeline like:
ModeLine     "coryntsci" 14.318 720 760 824 910 480 484 492 525 interlace
it could be set to re-multiply the HSYNC by 910, thus reproducing the 
14.318MHz clock.  That's 4*fsc and could be fed right into the AD724.  Of 
course it depends on X setting the clock *exactly* to 14.318MHz to get a 
truly accurate subcarrier frequency.  Now that I think about it, however, 
the 910 shouldn't have to change... even if you doubled the modeline to 
28.6MHz and 1440x480 resolution.  It's the *time* that matters, and 
that'll stay constant.  What the DIP switches would allow you to do is fix 
the signal if the card wouldn't do exactly the right dotclock.  Hrm... 
910/4 = 227.5.  Coincidence?  I think not... :)
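The modeline arithmetic checks out; a quick sanity-check sketch of the numbers it implies:

```python
# Sanity-check the timing implied by:
# ModeLine "coryntsci" 14.318 720 760 824 910 480 484 492 525 interlace
dot_clock = 14.318e6     # Hz, as the modeline rounds it (exact 4*fsc is 14.31818 MHz)
htotal = 910             # total dot-clock periods per scan line
vtotal = 525             # total scan lines per frame

h_rate = dot_clock / htotal     # line rate, ~15.734 kHz
fsc = dot_clock / 4             # subcarrier, if the clock really is 4*fsc
cycles_per_line = htotal / 4    # 910/4 = 227.5, the magic NTSC number

print(f"line rate {h_rate:.3f} Hz, fsc {fsc:.0f} Hz, {cycles_per_line} cycles/line")
```

Note that the rounded 14.318 figure puts fsc at 3579500 Hz, about 45 Hz below the nominal 3579545 Hz, which illustrates just how little rounding the +-10Hz tolerance allows; the 227.5 cycles-per-line ratio, though, holds no matter what the actual clock is.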

>
> | 	Much harder... actually almost impossible without resampling.
>
> You don't need to resample to make the clock *relationships* correct.

 	Again, take your pick on which accuracy is more important... the 
color subcarrier frequency, or its 227.5-cycles-per-line relationship to 
HSYNC.

> | 	That's a PAL-ism, no?  I thought NTSC didn't flip the SC phase.
>
> No, it's an NTSC-ism.  (PAL uses a 90 degree shift.)  It's built into the
> timing.  Each scan line comprises exactly 227.5 cycles of the color sub-
> carrier.  With an odd number of half-cycles per scan line the initial sub-
> carrier phase flips each line.  And it gets better.  With an odd number of
> scan lines per frame the phase also flips per frame.  This is why there are
> actually _four_ distinct NTSC field types rather than the two that you often
> hear about.

 	Gotcha... so it's a 180 degree flip as per the 227.5 multiple, 
rather than 90 for PAL.
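The per-line and per-frame flips fall straight out of the 227.5 figure; a small sketch (my own illustration) of the four-field bookkeeping:

```python
# Subcarrier phase (in cycles, mod 1) at the start of each line and frame.
CYCLES_PER_LINE = 227.5
LINES_PER_FRAME = 525

def line_start_phase(n):
    return (n * CYCLES_PER_LINE) % 1.0

def frame_start_phase(f):
    return (f * LINES_PER_FRAME * CYCLES_PER_LINE) % 1.0

# Odd number of half-cycles per line -> 180 degree flip every line:
print([line_start_phase(n) for n in range(4)])    # [0.0, 0.5, 0.0, 0.5]
# 525 * 227.5 = 119437.5 half-cycle offset per frame -> the pattern only
# repeats every two frames, i.e. every four fields:
print([frame_start_phase(f) for f in range(4)])   # [0.0, 0.5, 0.0, 0.5]
```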

> | 	I must admit that I've really only tried my circuit on two TV's.
> |A more advanced TV might try to do fancy comb filtering to try to extract
> |*all* the Y/C info from the CBVS signal.
>
> It really isn't just a question of comb filters.  I've seen some evidence to
> suggest that my XBR100 uses the sub-carrier to adjust the fine tuning.  Time
> code sensitive devices (editors, etc.) use the phase of the sub-carrier to
> know where they are in the four-field sequence.  I don't know what VCRs might
> do...
>
 	Again... very few people need to connect their mythtv box to 
studio-editing hardware.  If you need that, it's pretty tough to expect a 
$35 vid card to pull it off.

> |I would find it interesting to know how good TVOUT's
> |NTSC is.
>
> All the ones I've tried (and I've been trying since the ATI EGA Wonder) look
> pretty bad... to me...

 	Me too... that's why I built mine.  I think that most of the 
quality loss comes from the scaling and temporal interpolation that needs 
to be done to rectify the scanrate differences.  Laying down a 
standards-compliant NTSC raster should be easy, since they've got their 
own clocks and multipliers to work with.  Use a master clock at Fsc (or a 
multiple) and count everything else from that.

> Ah, but there are so many other ways to screw up the signal.  The color
> encoding itself isn't trivial, and you really need to limit the bandwidth
> of the luma channel without distorting it too much.  There's only so much
> left to spend on the TV-out function of a retail $35 card.
>
> 				Dan Lanciani
> 				ddl at danlan.*com

 	The AD724 seems to address all of these pretty well from what I 
can see.  Its biggest drawback is also a feature: it allows for a 
free-running oscillator.  Throw a multiply-by-910 PLL on it to generate 
the subcarrier from HSYNC and I think we're both happy.


 	How the *HELL* did we get so far off-topic on the mythtv-users 
list anyway?  Sorry, everybody, but maybe this thread enlightened some 
folks as to why most TVOUT doodads suck so horribly.  Even if you are 
willing to work with 1/2 VGA speeds, it's *STILL* difficult!

-Cory

