[mythtv-users] De-interlacing. Work well?
Matthias Thyroff
lists at thyroff.net
Tue Aug 15 14:11:08 UTC 2006
On Tuesday, 15 August 2006 at 15:49, list at onnow.net wrote:
> I am wondering if many people use this feature in Myth and what the
> feedback is.
>
> I have two front ends: an AMD 3400 and a Celeron 2.4, both with built-in
> NVIDIA graphics. One feeds an LCD TV and the other an LCD projector.
>
> I see there are Linear, Kernel, and Bob 2X framerate options.
> Bob requires Xv or XvMC.
> Which is best? Can I use Bob with a built-in NVIDIA chip?
>
> Some info and feedback would be great.
> Increased quality (I am on analog cable) is always appreciated.
>
> Mark
I have been wondering about that too; people with different setups (CRT, LCD,
HD-Ready, HD or SD TV) seem to get different results.
You should search the archive for other users' experiences. In my case (SD
digital SAT TV into an HD-Ready LCD TV via VGA) "Bob" gave me a bad flicker,
but I am quite happy with "Kernel" (with XvMC). I suppose that if a better
option exists for me but I cannot see the difference on screen, I don't need
to care.
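For what it's worth, here is my rough understanding of what the three options
do, as an illustrative Python sketch. This is not MythTV's actual code, and
the kernel weights are invented for illustration; a frame is just a list of
scanlines, each a list of grey values:

    # Illustrative sketch of the three deinterlacer families.
    # Not MythTV's implementation; kernel weights below are made up.

    def linear(frame):
        """Linear: each output line is the average of the lines above
        and below, blending both fields into one progressive frame."""
        h = len(frame)
        return [[(a + b) // 2
                 for a, b in zip(frame[max(y - 1, 0)],
                                 frame[min(y + 1, h - 1)])]
                for y in range(h)]

    def kernel(frame, weights=(1, 2, 4, 2, 1)):
        """Kernel: like Linear but with a wider weighted vertical
        filter, so it keeps more detail. Weights are illustrative."""
        h, r, s = len(frame), len(weights) // 2, sum(weights)
        out = []
        for y in range(h):
            taps = [frame[min(max(y + d - r, 0), h - 1)]
                    for d in range(len(weights))]
            out.append([sum(w * line[x] for w, line in zip(weights, taps)) // s
                        for x in range(len(frame[0]))])
        return out

    def bob(frame):
        """Bob 2X: split the frame into its two fields and line-double
        each, producing two output frames per input frame."""
        frames = []
        for first in (0, 1):                 # top field, then bottom
            doubled = []
            for line in frame[first::2]:
                doubled += [line, line]      # repeat each field line
            frames.append(doubled)
        return frames

    frame = [[y * 10 + x for x in range(4)] for y in range(6)]  # toy frame
    print(len(bob(frame)))  # 2: two frames out for one in

Bob's doubled output rate is why it needs Xv or XvMC, and since each of its
output frames is built from only half the scanlines, it can shimmer on some
displays, which is presumably the flicker I saw.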
What are your impressions of the different options?
Matthias