[mythtv-users] Does it matter which Hauppauge tv-card i buy???
Cory Papenfuss
papenfuss at juneau.me.vt.edu
Sat Sep 15 12:43:36 UTC 2007
On Fri, 14 Sep 2007, jdonohue654-mythtv at yahoo.com wrote:
>>> My experience has been quite different ... 350 xv
>>> output is far superior to anything I've managed to
>>> get with my 5200.
>>
>> In what way? Fewer MPEG artifacts? Fewer interlace
>> artifacts? What sort
>> of display are you driving? How are you driving it?
>
> Sony Trinitron WEGA SD 36". The 5200 looks good for
> talk shows, news, sitcoms, etc. where there isn't much
> camera movement and I can disable deinterlacing.
> Mostly I watch sports, and without deint the picture
> is jagged. With deint the picture is fuzzy and just
> not right. I find both options very distracting.
>
> The past couple of years I've tried every combination
> of nvidia settings and deint possible, but have never
> got the great picture others have described.
>
	I'm jumping into this late, but I'm assuming you're using
composite or s-video out of the 5200? The problem is that video cards with
TV-out are inherently crap... not because a good TV signal is fundamentally
impossible, but because of the way the TV-out is implemented. The
frequencies, interlacing, and resolution needed to generate a TV signal are
completely different from what sits in the card's VRAM for output to a
monitor. Horizontal, vertical, and temporal rescaling all have to be done,
and in most cases they're done in a proprietary fashion that may well be
crappy.
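	To put numbers on the mismatch, here's some back-of-the-envelope
arithmetic (Python, using the standard published NTSC figures; the exact
values a particular TV-out chip uses may differ):

# Standard BT.601/NTSC timing: 13.5 MHz pixel clock, 858 clocks per
# scanline (720 visible), 525 lines per frame (480 visible), interlaced.
pixel_clock_hz  = 13.5e6
clocks_per_line = 858
lines_per_frame = 525

h_freq_hz = pixel_clock_hz / clocks_per_line   # ~15734 Hz
frame_hz  = h_freq_hz / lines_per_frame        # ~29.97 Hz
field_hz  = 2 * frame_hz                       # ~59.94 Hz (two fields/frame)

print("horizontal: %.3f kHz" % (h_freq_hz / 1e3))
print("frame: %.2f Hz, field: %.2f Hz" % (frame_hz, field_hz))

# Compare to plain 640x480 at 60 Hz VGA: ~31.5 kHz horizontal, progressive
# scan... roughly double the line rate and no interlacing, so the TV-out
# chip has to rescale horizontally, vertically, and temporally.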
>>> Granted, the 350 can't use opengl, but I can't
>>> imagine that would matter since the output is already
>>> nearly indistinguishable from live tv.
>>
>> OpenGL is more for "Eye Candy", which seems to
>> matter to some folks and
>> not to others.
>>
>
> I thought that the next major release will have some
> hardware/opengl-based deint, though I may have
> misunderstood something. Though, like I mentioned, I
> can't imagine I'd be losing much since the output I
> get from the 350 is almost indistinguishable from live tv.
	OpenGL has nothing to do with it. The difference is that the 350
is generating the video specifically for NTSC hardware... not as an
afterthought and add-on to the primary monitor output. Before I went to
component on a new TV, I built a circuit to connect directly to the VGA
port. It required a modeline to generate *exact* NTSC timings, and it
worked and looked beautiful. Trouble is the signal is interlaced, 29.97 Hz
vertical and 15.734 kHz horizontal at 720x480... basically regular 640x480
VGA at half the scan rate, and many video cards refused to do it.
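	For the record, the kind of modeline I'm talking about looks
roughly like this (standard BT.601 timing; the sync/porch numbers are
approximate and not every card or driver will accept it):

ModeLine "720x480i" 13.500  720 736 800 858  480 486 492 525  Interlace -HSync -VSync

That works out to 13.5 MHz / 858 = 15.734 kHz horizontal and
15.734 kHz / 525 = 29.97 Hz per interlaced frame.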
-Cory
--
*************************************************************************
* Cory Papenfuss, Ph.D., PPSEL-IA *
* Electrical Engineering *
* Virginia Polytechnic Institute and State University *
*************************************************************************