[mythtv-users] What renderer, interlacer for CrystalHD card

Tom Flair tom at graniteskies.net
Thu Dec 9 17:13:26 UTC 2010

On Sat, Oct 16, 2010 at 1:55 PM, Raymond Wagner <raymond at wagnerrp.com> wrote:

> On 10/16/2010 14:45, dave at 0bits.com wrote:
>> Got my CrystalHD card and recompiled with trunk/svn. All went well, and
>> I've set up the decoder as CrystalHD in the settings, but I'm unsure what I
>> should use for the renderer and the primary/secondary deinterlacer. My
>> understanding is that the CrystalHD can do deinterlacing as well, but there
>> doesn't seem to be an option for it, or do I just set it to 'none'?
> The CrystalHD is a decoder, not an output card.  It is a replacement for
> the 'standard' decoder.  You need to use Xv or OpenGL just like before.

I've pretty much used the defaults since I started with Myth, so forgive
me if this is an obvious question.

Assuming I want to use the CrystalHD to decode anything 720p and above... I
should have the "Match criteria" set to >= 1280 x 720, correct?

If I wanted it to do everything, it would be <= 1920 x 1088?
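To illustrate the question, here is a minimal sketch of how a two-sided resolution match criterion could behave, assuming it compares both width and height against the reference values. The helper names are hypothetical; this is not MythTV's actual profile-matching code.

```python
# Hypothetical sketch of resolution match criteria -- NOT actual MythTV code.
# Assumes the criterion compares both width and height against the reference.

def matches_ge(w, h, ref_w=1280, ref_h=720):
    """True when the stream is at least the reference size (>= 1280 x 720)."""
    return w >= ref_w and h >= ref_h

def matches_le(w, h, ref_w=1920, ref_h=1088):
    """True when the stream is at most the reference size (<= 1920 x 1088).

    1088 rather than 1080 appears because some HD streams are coded with
    the height rounded up to a multiple of 16."""
    return w <= ref_w and h <= ref_h

# 720p and 1080i/p clear the ">= 1280 x 720" bar; SD (720x480) does not.
print(matches_ge(1280, 720))   # 720p  -> True
print(matches_ge(1920, 1080))  # 1080  -> True
print(matches_ge(720, 480))    # SD    -> False

# "<= 1920 x 1088" catches everything from SD up through 1080.
print(matches_le(720, 480) and matches_le(1920, 1080))  # -> True
```

Under that assumption, ">= 1280 x 720" routes 720p and above to the CrystalHD profile, and "<= 1920 x 1088" would cover every common resolution including SD.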
