[mythtv-users] Nvidia chipset motherboard recommendations

Joey Morris rjmorris.list at zoho.com
Wed Nov 13 04:01:10 UTC 2013


Joseph Fry <joe at thefrys.com> wrote on Sat, Nov 09, 2013 at 10:11:23PM -0500:
> On Sat, Nov 9, 2013 at 3:21 PM, Raymond Wagner <raymond at wagnerrp.com> wrote:
> > As long as you never want to do complex deinterlacing of HD. The chip will
> > do decoding and playback just fine, since that's all performed in a common
> > ASIC; the deinterlacing, however, is performed within the graphics shaders,
> > and a GT210 is lacking in that respect.
> 
> The desire to output a 1080p signal to your television is often
> misplaced.  Most people are not very sensitive to interlacing
> artifacts, which is why the use of interlacing survived the digital
> transition.  You may be best served by simply not deinterlacing at
> all.
> 
> A quality TV will typically produce good interlaced output, considering
> that over half of all broadcasts are interlaced (480i or 1080i).  I
> recommend outputting 1080i in its native format and seeing whether you're
> happy with it before spending extra money on a card that can do
> 2x Advanced Temporal deinterlacing for you.

What is the state of support for sending interlaced output to the TV? I was
under the impression that it wasn't easy (and maybe impossible in some cases) to
configure X and/or MythTV to do that.
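For what it's worth, one common approach is to hand X an interlaced modeline explicitly via xrandr. A minimal sketch using the standard CEA-861 1080i/60 timing; the output name `HDMI-0` is an assumption here, so check `xrandr -q` for the real one on your box, and whether the driver accepts the mode at all varies by card and driver:

```shell
# Standard CEA-861 1080i/60 timing: 74.25 MHz pixel clock,
# 2200x1125 total, interlaced.
xrandr --newmode "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 interlace +hsync +vsync

# Attach the mode to the TV's output (HDMI-0 is an assumed name).
xrandr --addmode HDMI-0 "1920x1080i"

# Switch to it.
xrandr --output HDMI-0 --mode "1920x1080i"
```

If that sticks, MythTV will simply render into the interlaced framebuffer and the TV does its own deinterlacing; if the driver rejects or silently drops the mode, that would match the "maybe impossible in some cases" impression.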


