[mythtv-users] Hardware for Best TV-Out?
papenfuss at juneau.me.vt.edu
Tue Oct 26 12:53:51 UTC 2004
> Harmonic Research makes some of the less expensive (though not cheap) units.
> I have their component model, but not the RGB one.
From their website:
"If you can furnish broadcast-quality RGB or Component signals, then these
units can generate broadcast-quality video*. All sync and timing signals
are regenerated from the input reference and include a true phase locked..."
"The inputs must be analog, and they must be reasonbly close to the video
scan rates (15,750 Hz for NTSC, 15,625 Hz for PAL)."
That still doesn't quite add up. I'll buy that they can make a
spec-compliant raster, but without frame (and in this case, *line*)
locking to the original source, it would seem that you'd have to have
sampling/resampling to get around the slight frequency mismatch. If you
didn't, a vertical bar on the computer screen would look slightly diagonal
on the TV.
A quick calculation: The datasheet says 15.75kHz, but NTSC is 15.734kHz.
After 480 lines of being off by that much, a vertical bar is delayed by
30us. That's about 1/2 of the width of the screen. (In fact, my gut
tells me that's the luma/chroma frequency interleaving rearing its head
again and is intentionally off by 1/2 a line)
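That arithmetic checks out. A quick sketch in Python (the only inputs are the datasheet's nominal 15.750 kHz and the standard NTSC color line rate of 4.5 MHz / 286):

```python
# Accumulated horizontal-timing skew between a free-running 15.750 kHz
# raster and the true NTSC color line rate.
f_datasheet = 15_750.0        # Hz, the datasheet's nominal rate
f_ntsc = 4_500_000 / 286      # Hz, ~15,734.27 Hz for color NTSC

dt_per_line = 1 / f_ntsc - 1 / f_datasheet   # period mismatch per scan line
skew = 480 * dt_per_line                     # drift after 480 visible lines

print(f"per-line mismatch: {dt_per_line * 1e9:.1f} ns")
print(f"after 480 lines:   {skew * 1e6:.1f} us")   # roughly 30 us
```

Against a ~63.5 us line period (of which ~52 us is active video), ~30 us of accumulated skew is indeed about half the screen width.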
> |> How did you phase lock the color sub-carrier to the existing sync?
> | Easy... I didn't... :)
> Ok, but then it's not "real" NTSC. You can get an off-the-shelf version
> like that for ~$200, still with a reasonably good color encoder matrix.
> Harmonic makes those too.
OK... true enough. It isn't "real" NTSC as in broadcast-quality
NTSC. I'm just reading through the datasheet one more time, and I do see
some references to "synchronous" video systems vs. "asynchronous" video
systems. You're referring to the former, and I'm using the latter (the
datasheet does talk about possible delays in the colorburst causing
"visual artifacts in some high-end video systems" when using an
asynchronous system).
Now that you've successfully gotten me off on this tangent, I'm
looking into the feasibility of using the 14.318MHz clock that's on the
VGA card. The AD724 datasheet mentions using this for a synchronous
system, and I might be able to get ahold of it via the "VGA Feature
Connector". That would greatly simplify the circuit
construction/components/tuning since it wouldn't even need a crystal then
(and would also be more accurate NTSC). If that doesn't work, a PLL
programmed to reconstruct P/Q times the dotclock from the HSYNC signal
might be a very viable alternative. Of course if you change the modeline,
you'd have to reprogram your circuit (Let's see... 912 in binary for the
DIP switches is...)
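The clock arithmetic behind both ideas can be sanity-checked with a short sketch (the 912 horizontal total is just the modeline example above; the rest is standard NTSC math):

```python
# Standard NTSC clock relationships: the 14.318 MHz crystal is exactly
# four times the color subcarrier, and the subcarrier is 455/2 times
# the horizontal rate.
f_h = 4_500_000 / 286        # line rate, ~15,734.27 Hz
f_sc = 455 / 2 * f_h         # color subcarrier, ~3.579545 MHz
f_xtal = 4 * f_sc            # ~14.318182 MHz, the VGA-card crystal

print(f"f_sc   = {f_sc:.2f} Hz")
print(f"f_xtal = {f_xtal:.2f} Hz")

# A PLL multiplying HSYNC back up to the dotclock needs the modeline's
# horizontal total as its multiplier; 912 is the example figure above.
htotal = 912
print(f"PLL multiplier {htotal} = {htotal:b} in binary")
```

So the feature-connector clock really is an exact integer multiple of the subcarrier, which is what makes the synchronous approach attractive.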
> But it isn't just a matter of frequency. In a real NTSC signal all the
> components are locked to the same clock, being derived from (probably a
> multiple of) the color subcarrier. It's much harder to reproduce this
> locked relationship after the fact since the highest frequency you have to
> work with is the horizontal sync.
Much harder... actually almost impossible without resampling. If
you read through the datasheet, it mentions some very interesting (and
vague) information about the delay filters present in the signal path.
Also something about auto-tuning. I don't think they're trying to
adjust this, but it's still an interesting read.
> Depends on what you are trying to achieve. There are two issues. In the
> abstract you can mess up (technical term :) the distribution of power across
> the frequency spectrum of the signal. I don't remember the exact consequences
> of even failing to flip the sub-carrier phase on each scan line, but it can
> at least slightly degrade the picture quality.
That's a PAL-ism, no? I thought NTSC didn't flip the SC phase.
> On the practical side, various
> gadgets depend on the exact timing relationships in various ways and to
> varying degrees. The more sophisticated the device, the more it may depend.
> My old Zenith tube set really didn't care. My XBR100 is pretty finicky. The
> problems can be subtle. Nobody documents their exact dependence on the NTSC
> timing because, after all, it is supposed to be correct if it is NTSC...
> So the choice is between doing it right once or checking compatibility with
> each target device (and hoping that you aren't cheating yourself by missing
> a subtle problem). Now if you have only the one target device, it may
> not matter much.
I must admit that I've really only tried my circuit on two TV's.
A more advanced TV might try to do fancy comb filtering to try to extract
*all* the Y/C info from the CVBS signal. The frequency interleaving of
the Y and C does depend on the relationship between the subcarrier and
multiples of the horizontal frequency. Each harmonic is only 60 (30?) Hz
wide, spaced 15.7kHz apart. I'd imagine that you'd have to get to fairly
high harmonics before there'll be Y/C interference due to a frequency
mismatch. You'd have most of the signal by then.
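That interleaving can be illustrated numerically (a sketch; it only uses the standard relationship f_sc = 227.5 x f_h):

```python
f_h = 4_500_000 / 286    # NTSC line rate, ~15,734 Hz
f_sc = 227.5 * f_h       # color subcarrier is an odd multiple of f_h/2

# Luma energy clusters at integer multiples of f_h; chroma sidebands at
# f_sc + m*f_h land at half-integer multiples, i.e. exactly halfway
# between the luma harmonics -- which is what a comb filter exploits.
for m in (-1, 0, 1):
    print(f"chroma sideband at {(f_sc + m * f_h) / f_h:.1f} x f_h")

# The half-cycle remainder also means each scan line contains 227.5
# subcarrier cycles, so the subcarrier phase inverts line-to-line.
print(f"subcarrier cycles per line: {f_sc / f_h}")
```

The last line is also relevant to the phase-flip question earlier: NTSC's subcarrier phase does invert on successive lines, but as a side effect of the 227.5-cycle relationship rather than the deliberate V-axis switching PAL does.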
> In any case, I stand by my statement that generating real NTSC (especially
> after the fact) is not trivial.
> Dan Lanciani
> ddl at danlan.*com
Absolutely. I stand by my statement that my circuit works very
well for me... :) I would find it interesting to know how good TVOUT's
NTSC is. Since it has the advantage of having some hardware clocks
available, I'd bet it's pretty good. It's the scaler and temporal
resampling that screw the picture quality (as well as crap physical
construction on low-end boards).