Re: [mythtv-users] Hardware for Best TV-Out?

Flo Kohlert flokohlert at
Mon Oct 25 20:55:39 UTC 2004

> -----Original Message-----
> From: mythtv-users-bounces at [mailto:mythtv-users-
> bounces at] On Behalf Of Cory Papenfuss
> Sent: Monday, October 25, 2004 1:34 PM
> To: Discussion about mythtv
> Subject: Re: [mythtv-users] Hardware for Best TV-Out?
> > Keep in mind that a "real" (yet still not broadcast quality) RGB to NTSC
> > encoder is going to cost ~$500--and that is for an encoder *only* with
> > no rate conversion.  Not that I think a high quality encoder would really
> > help from what I can see of nVidia's output.  A transcoding TBC might help.
> > Unfortunately, mine has the component input option rather than RGB so I
> > can't try it. :(
> >
>  	It cost me about $10 in parts to build one.  No rate conversion,
> just VGA->NTSC encoding (color subcarrier with UV modulation, etc).  Aside
> from having to set the VGA card up for 480i timings, (read: no console,
> only X), it works great.
> > I'd love to get an answer on this as well.  As it stands I'm about ready to
> > give up on the project on the theory that what I want cannot be done.
> > Because my source is ATSC the PVR-350 is not supported for output, and from
> > what I've seen generic video card TV-Out technology has not advanced much
> > since my last exposure.
>  	I have gotten an email from a guy at NVIDIA who says that so long
> as you use a standard size for the TVOUT, the driver will try to bypass
> the scaler on the card.  It basically hard-codes the modeline for a
> standard mode (e.g. 720x480) to do this, and is probably why people have
> been unsuccessful tweaking modelines with nvidia cards.  I haven't played
> with it much myself, though.
> > The problem is that generating a reasonable quality NTSC signal is
> > not trivial (TV Typewriter Cookbook to the contrary notwithstanding :).
> > You can't easily tack on cheap TV-Out functionality to a VGA card and get
> > quality results.
>  	Check out the AD724 chip.  A few passive components, one chip, and
> a crystal is all you need.
> -Cory
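For reference, the "480i timings" mentioned above amount to an interlaced X modeline. A sketch using standard BT.601/NTSC numbers (13.5 MHz pixel clock, 858x525 total, ~15.734 kHz line rate); the porch and sync positions here are typical values, not tuned for any particular card:

```
# XF86Config Modeline:  name   clock   hdisp hss hse htot   vdisp vss vse vtot   flags
Modeline "720x480i"  13.500  720 736 799 858  480 486 492 525  interlace -hsync -vsync
```

With the interlace flag this yields 13.5e6 / 858 / 525 * 2, about 59.94 fields per second, i.e. the NTSC field rate.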

I'm currently using a GF4 MX with a self-made VGA-to-RGB converter
( which gives me superb quality on the frontend
and good quality when watching recordings. I'm still investigating why my
recordings/live TV are not quite as sharp. It seems to me that there is no
interlaced output when I watch TV or recordings, although I don't get
stuttering on the CNN ticker as heavy as I did on the card's S-Video output.
I'll try to put some official test patterns on it via MPEG and MythGallery;
maybe I can then describe the problem more clearly.

Maybe someone can help me here, as I clearly do have interlaced output on the
frontend. What is the difference between frontend output and Watch-TV output?
It even seems to look better in the video EPG. Should I try to compile
without XV and/or OpenGL vsync (I think I won't get any reasonable output on
my P3-700 without XV)?
I have no deinterlacing or other filters activated; playback is full PAL
resolution, 720x576, and the X resolution I put on my TV is 768x576
(shouldn't that be right due to square/non-square pixel issues?). So can
anyone explain the difference between frontend and movie watching, please?
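On the square/non-square pixel question: for a 4:3 display with 576 active lines, a square-pixel raster needs 576 * 4/3 = 768 pixels per line, which is why 768x576 is the square-pixel PAL resolution, while BT.601 sampling uses 720 slightly-wider-than-square pixels. The arithmetic, as a quick illustrative check:

```python
# Square-pixel width for a 4:3 picture with 576 active lines (PAL)
active_lines = 576
display_aspect = 4 / 3
square_pixel_width = active_lines * display_aspect
print(square_pixel_width)  # 768.0

# BT.601 PAL samples 720 pixels per line instead, so each pixel is
# slightly wider than square (pixel aspect ratio ~ 768/720)
pixel_aspect = square_pixel_width / 720
print(round(pixel_aspect, 4))  # 1.0667
```

So scaling 720x576 material onto a 768x576 mode is the expected way to get geometrically correct output on a 4:3 set.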
