[mythtv-users] DVI v. Component

Cory Papenfuss papenfuss at juneau.me.vt.edu
Thu Aug 12 07:28:58 EDT 2004


 	Scaling does have the potential to exacerbate any resolution 
differences, but my experience has been that the real problem you describe is 
due to mediocre TV video out on most video cards.  The scaling, blurriness, 
interlacing, and flicker filtering are all typically done at the final stage 
of the card's TV out... often on a second chip.
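
 	To illustrate, flicker filtering is basically a vertical low-pass run 
over each frame at scanout.  A rough sketch in Python (the 3-tap weights here 
are a common textbook choice for illustration, not what any particular chip 
uses):

    import numpy as np

    def flicker_filter(frame):
        """Blend each scanline with its neighbors (3-tap vertical low-pass).
        Tames interlace flicker on sharp horizontal edges, but visibly
        softens real video.  Edge lines wrap here purely for brevity."""
        above = np.roll(frame, 1, axis=0)   # scanline above
        below = np.roll(frame, -1, axis=0)  # scanline below
        return 0.25 * above + 0.5 * frame + 0.25 * below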

 	As an example, consider the GeForce2 MX400 I bought two years ago or so. 
I wanted the potential for video in/out on it, so I got one that said it did 
both.  I was unpleasantly surprised when I actually started using it to find how 
crappy the quality was.  It turns out they used a crappier video output chip that 
also had video in.  Marketspeak said it was great, and only by looking at the 
datasheet on the chip could you see that it was limited to 240 lines of output. 
Basically worthless, but I didn't discover that until long after I'd bought it.

 	That's the main reason I use my funky VGA->NTSC transcoder box.  Any 
VGA port can output high-quality video at TV resolutions (or else it would be 
an even crappier computer card).  I bypass the flicker filtering, since its only 
use is to make still computer-generated text/graphics look better.  Video... 
especially captured NTSC video... should be displayed without flicker filtering 
to minimize processing and distortion.

 	WRT scaling, that's also why I use a 720x480 modeline on the funky 
VGA->NTSC transcoder box... no need to scale anything while viewing max-quality 
ivtv or DVD video.  Now, if only I could get pseudo-genlocking to force 
vertical sync between the capture and output cards.  Don't some NVIDIA driver 
thingies do that and support interlaced output now?
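
 	For anyone wanting to try the same, a 720x480 interlaced modeline built 
on standard BT.601/NTSC timing (13.5 MHz dot clock, 858 total clocks per line, 
525 total lines, 29.97 Hz frames) looks something like this... treat the porch 
numbers as a starting point, since they may need tweaking for a given 
transcoder:

    ModeLine "720x480i" 13.500 720 736 799 858 480 486 492 525 Interlace -HSync -VSync

That keeps ivtv's native 720x480 frames pixel-for-pixel all the way out.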

-Cory

*************************************************************************
* Cory Papenfuss                                                        *
* Electrical Engineering Ph.D. Graduate Student                         *
* Virginia Polytechnic Institute and State University                   *
*************************************************************************


On Thu, 12 Aug 2004, Adam Clarke wrote:

> Hello
>
> This is exactly what I thought. However, my experiments have found you really 
> need to take into account any scaling done to the video between the 
> source (normally the video output card) and the display it's shown on. If 
> video is scaled it tends to look horrible; it tends to be blurry. For 
> interlaced displays, flicker fixing on the video output also makes a 
> difference to quality: within a GUI, flicker fixing on the output is great 
> and makes text easier to read, but flicker fixing while displaying video 
> tends to give the output a much softer look.
>
> Adam
>
> Cory Papenfuss wrote:
>
>>     My understanding of DVI is that it's a multifaceted spec.  It can be 
>> digital, which I suppose would be best, and probably affected by the 
>> broadcast flag for new equipment.  It can also carry analog (RGB), in which 
>> case it's basically the same as component.  Some people argue that RGB is 
>> better than component, but it's rather academic and depends on what the 
>> original source is.  Most digital video is encoded as YUV rather than RGB, 
>> but the quality wouldn't be degraded unless it's converted from RGB to 
>> component in analog (e.g. an A960 "transcoder").  I guess one could argue 
>> that it's slightly reduced quality.  I'd rank it something like:
>> 
>> RF-modulated composite < Composite < S-vid \
>>     < Component < RGsB < RGBHV = DVI analog < DVI digital
>> 
>> Whew! :)
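>> 
>> FWIW, the YUV mentioned above (really YCbCr) is roughly the BT.601 
>> matrix.  A quick sketch in Python, just to illustrate the conversion an 
>> analog transcoder round trip implies (coefficients per BT.601):
>> 
>>     def rgb_to_ycbcr(r, g, b):
>>         # ITU-R BT.601 weights; chroma on a -0.5..0.5 scale
>>         y  =  0.299 * r + 0.587 * g + 0.114 * b
>>         cb = -0.169 * r - 0.331 * g + 0.500 * b
>>         cr =  0.500 * r - 0.419 * g - 0.081 * b
>>         return y, cb, cr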
>> 
>> -Cory
>> 
>> On Wed, 11 Aug 2004 gabe at mustbethemoney.com wrote:
>> 
>>> All this talk of video quality (and it has been an education; I have
>>> learned a hell of a lot about how it all works) leaves me with just one
>>> question regarding DVI in/output.  This has not been mentioned much, but I
>>> am assuming that if the TV has a DVI input and you have a card with DVI
>>> output, that is the best possible signal, better than component?  From my
>>> understanding, this is a pure digital signal that the TV can then convert
>>> to an analog image, whereas a component output converts the digital signal
>>> to a very high quality analog signal carrying the chrominance and luminance
>>> information.  Is this correct?  And if so, are we then talking about:
>>> RF < Composite < S-Vid < Component < DVI ???
>>> 
>>> Also, I know this has been mentioned before, but how will DVI be affected
>>> by the broadcast flag?  Will TVs with DVI inputs sold after June next year
>>> not accept an input that does not have the flag?

