[mythtv-users] Ultimate Mythtv vid card, part II

papenfuss at juneau.me.vt.edu
Mon Feb 9 18:17:04 EST 2004


> From when I was using the TV out on an nVidia card, ISTR that the TV 
> output hardware does some deinterlacing which reduces the quality of the 
> output.  Basically that's good for showing computer generated stuff (i.e. 
> the menus on Myth) since it reduces flicker, but not so good for the 
> actual telly.

	Right... the chips (Conexant or something) have differing levels of 
"flicker reduction," not really deinterlacing.  The highest flicker-reduction 
setting amounts to throwing away every other field, so I guess that is sort of 
deinterlacing.  The nvtv software supposedly lets you tweak that and keep a 
good, flicker-full output for the best video.  I, personally, would rather 
fudge the UI a bit (anti-aliased, multi-scanline horizontal lines, for one) so 
it looks acceptable without flicker reduction.  

> 
> The nVidia cards also won't let you set the native PAL resolution of 
> 768x576 (at least with the closed source drivers), so you have to run in 
> 800x600 and let it scale the picture slightly (so you capture at 768x576, 
> scale up to 800x600 and then scale down to 768x576 again, which is rather 
> silly and can't do the quality much good).

	You should be able to code up a modeline for anything you want (save 
the interlace).  It's not a direct link to the output on the TV, but it'll let 
you overscan, etc.
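
	For what it's worth, the arithmetic behind such a modeline is simple; 
here's a quick sketch in Python (the blanking values are assumptions to tweak 
for your particular card and TV, not gospel):

    # Deriving a 768x576 @ 50 Hz (progressive) modeline from first principles.
    h_active, h_front, h_sync, h_back = 768, 24, 72, 80   # assumed blanking
    v_active, v_front, v_sync, v_back = 576, 1, 3, 45     # assumed blanking
    refresh = 50.0                                         # PAL rate

    h_total = h_active + h_front + h_sync + h_back         # 944
    v_total = v_active + v_front + v_sync + v_back         # 625
    dotclock = h_total * v_total * refresh / 1e6           # ~29.5 MHz

    h1 = h_active + h_front          # hsync start
    h2 = h1 + h_sync                 # hsync end
    v1 = v_active + v_front          # vsync start
    v2 = v1 + v_sync                 # vsync end
    print('Modeline "768x576" %.2f  %d %d %d %d  %d %d %d %d'
          % (dotclock, h_active, h1, h2, h_total, v_active, v1, v2, v_total))
    # -> Modeline "768x576" 29.50  768 792 864 944  576 577 580 625

Shuffling the front/back porches around is how you push the picture into 
overscan.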

> 
> > well as out.  Now, I'm pretty sure I shot myself in the foot.  First off, the
> > second head won't do over 1024x768 or something ridiculously low, and secondly,
> 
> ISTR the second head on dual-head cards is actually driven off the bit of 
> the chipset that was designed to drive the LCD screens in laptops, so it's 
> really not designed to do nice resolutions.

	Of course I find that out *after* I buy it and see how bad it is.  
The irritating thing isn't the resolution so much as the quality of the video 
even at that resolution... very fuzzy.


> 
> > - Here's a good one...if there's a card that supports both XVideo and 
> > Sync-on-green, wouldn't it be possible to write a dummy XVideo colorspace 
> 
> If you google around a bit then I'm sure you'll find some circuits that 
> will inject the sync signals into the green signal (you should be able to 
> do it with a couple of transistors)... if you feel competent with a 
> soldering iron that is...

	Yes, but at that point I might as well build a transcoder to begin with.  
I'm just about ready to do that, but I'd rather not have to.

> 
> > "transformation" to go YUV->VUY?  The thought here is to do component out 
> > without needing an external transcoder box.  Put the 'Y' on green with 
> composite sync, then put the 'U' and 'V' on red/blue... just a VGA-BNC cable 
> > and the TV's a VGA monitor.
> 
> Sounds like a nice idea, but isn't the YUV->RGB conversion matrix 
> hard-wired into the hardware? (not sure on this one)...  if it is then 
> you'd have to do the conversion in software, which would suck.
> 
	IIRC, MPEG data is in YUV anyway... thus the hardware colorspace 
conversion from YUV->RGB for VGA.  Why do the conversion twice (three times, 
actually)?  If you think about it, running MPEG video to a progressive TV 
changes color spaces like this:

MPEG (YUV) -> VGA (RGB) -> YUV (transcoder to TV) -> RGB (guns on TV).  

	Whew... seems silly to me!  Hopefully TVs with VGA ports will become 
more popular.
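
	For reference, each of those hops is just a fixed matrix (plus offsets) 
applied per pixel.  A minimal sketch of one YUV->RGB step, using the standard 
BT.601 full-range coefficients (nothing MythTV- or nVidia-specific):

    # The conversion the chain above keeps repeating: YUV (YCbCr) -> RGB.
    def yuv_to_rgb(y, u, v):
        r = y + 1.402 * (v - 128)
        g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
        b = y + 1.772 * (u - 128)
        return tuple(int(max(0, min(255, round(c)))) for c in (r, g, b))

    print(yuv_to_rgb(76, 85, 255))   # roughly pure red: (254, 0, 0)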

-Cory

 -- 
*************************************************************************
* The prime directive of Linux:  					*
* 	- learn what you don't know, 					*
* 	- teach what you do.						*
* 						(Just my 20 USm$)	*
*************************************************************************



