[mythtv-users] best quality capture card for software encoding?

Cory Papenfuss papenfuss at juneau.me.vt.edu
Tue Feb 28 18:08:14 UTC 2006


> I would sure love some elaboration on this point if someone is willing!
>
 	My previous posts elaborated on this already.

> I've recently been swayed by a lot of anecdotal information to the
> belief that deinterlacing *should* improve the quality of my playback
> (especially fast motion video, like sporting events).  I have yet to
> conduct any real experimentation to figure out the best settings though.
>  While I'm willing to experiment, it would be great to have a little
> more solid background understanding about the fundamental concept of why
> one should deinterlace a signal destined for an interlaced display.
>
 	For displaying on a TV, deinterlacing and "flicker-filtering" are 
CRAP!  They should *NOT* have to be done.  The fact that deinterlacing 
improves quality in some (many?) circumstances is due to errors or poor 
choices elsewhere in the signal path.  Proper playback should have the 
frame refresh of the video playback synchronized with the vertical 
refresh of the TVOUT.  The framebuffer should be set up to produce 
NTSC/PAL-compatible resolutions (for NTSC, that means 525 total lines, 
480ish visible, interlaced).  Anything other than this and the TVOUT 
chip has to do scaling in time (refresh frequency) and in size 
(primarily the number of vertical scanlines).  That scaling introduces 
bogus artifacts, which are often made less offensive if the video is 
first deinterlaced (introducing yet more bogus artifacts) before being 
played.
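 	As a concrete sketch, this is the sort of interlaced NTSC modeline 
people use for that.  The 13.5 MHz dotclock, 858 total pixels/line, and 
525 total lines are the standard BT.601 numbers; the exact sync 
positions below are a guess and usually need tweaking for a particular 
TVOUT chip:

    # 13.5e6 / 858 = 15.734 kHz line rate; / 525 = 29.97 frames/s,
    # i.e. 59.94 fields/s -- real NTSC timing, no temporal scaling.
    ModeLine "720x480i" 13.5  720 739 801 858  480 488 494 525  Interlace -HSync -VSync

With a mode like that, playback can (in principle) put each decoded 
field onto the matching display field instead of resampling anything.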

 	Note that this is a necessary, but not sufficient, condition for 
correct, synchronized output.  Just setting your video mode to "640x480" 
or "720x480" does not guarantee that the fancy-schmancy NVIDIA 
SuperSkaler-Whizbang 9000 chip doesn't assume you want a 3.72% 
underscan.  That would add another level of scaling into the mix.  It 
may be smart and give you exactly what you ask for, or it may not; 
without detailed specs on the hardware and driver, one cannot know for 
sure.
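 	As an example of what those "detailed specs" buy you: the NVIDIA 
binary driver of this era exposes an xorg.conf option for the TV 
encoder's overscan correction.  The option name and range here are from 
memory, so treat this as a sketch and check your driver's README:

    Section "Device"
        Identifier "nv0"
        Driver     "nvidia"
        # 0.0 = no overscan compensation, i.e. no extra scaling pass.
        # Larger values rescale the image to hide the screen edges.
        Option     "TVOverScan" "0.0"
    EndSection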

> What really caught my eye is the comment about scaling and possible
> interference with deinterlacing.  This is a new one for me.  I hope
> someone can explain a bit more.  Which cards do this scaling?  How do
> you avoid it (without, e.g., buying something like an eden board as
> mentioned above) if you've got one of these scaling cards?
>
 	Scaling is done by video cards in two places relevant to MythTV:
- Xv scaling... converts the decoder's YUV output to RGB, and rescales 
the video to match the VGA raster (typically 800x600); see the sketch 
below.
- TVout chip... see the rant above.
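 	For reference, the colorspace half of that Xv step is just the 
textbook BT.601 conversion; nothing MythTV-specific, and the values 
assume 8-bit studio-swing YCbCr in, full-swing RGB out:

    R = 1.164*(Y - 16) + 1.596*(Cr - 128)
    G = 1.164*(Y - 16) - 0.813*(Cr - 128) - 0.391*(Cb - 128)
    B = 1.164*(Y - 16) + 2.018*(Cb - 128)

The resize half is the part that interacts badly with interlaced 
material: a vertical rescale that ignores field structure smears the 
two fields of each frame together, which is exactly the artifact that 
deinterlacing then papers over.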

> I've been reading up everything I can on this subject, but short of
> pursuing an engineering degree, I'm afraid it continues to seem like a
> lot of voodoo!

 	Bottom line is that there is only so much sane, rational thought 
that can be applied to the concept of "good TVOUT" within the context of 
MythTV.  Short of the really expensive, genlocked, synchronous decoding 
and playback gear that a TV studio might have, everything else is a 
shortcut and a fudge to the specs.  The consumer hardware that one 
builds mythboxes out of almost by definition violates lots of these 
rules.  That's why what works for someone might not work for others, and 
why counterintuitive things like enabling deinterlacing when using a TV 
on TVOUT *might* make things look better: the output was extra-bad to 
begin with, and now it's just less extra-bad.

-Cory

-- 

*************************************************************************
* Cory Papenfuss                                                        *
* Electrical Engineering Ph.D. candidate, graduate student              *
* Virginia Polytechnic Institute and State University                   *
*************************************************************************


