[mythtv-users] Anyone know WHY I have to deinterlace when using a 1080i tv?
Patrick Ouellette
pat at flying-gecko.net
Mon Nov 20 19:29:52 UTC 2006
Howdy Campers,
I've noticed that I have to leave de-interlacing enabled when watching
MythTV (live or recorded) to get a good picture. The crazy part is that
my display device is a 1080i-format television (CRT, not DLP/LCD).
If I turn off de-interlacing, the picture is jumpy/ragged, just as I
would expect interlaced video to look on a progressive display.
The TV is a 5-year-old Samsung that does 480p and 1080i (ONLY), driven
by an Nvidia 6600 card via its component outputs. I have been using
XvMC, but I also recall noticing this effect without XvMC.
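For reference, the TV-out part of my xorg.conf looks roughly like the
sketch below (quoting from memory, so treat the identifier and exact
values as approximate):

    Section "Device"
        Identifier "GeForce6600"                  # placeholder name
        Driver     "nvidia"
        Option     "UseDisplayDevice" "TV"        # route output to the TV encoder
        Option     "TVOutFormat"      "COMPONENT" # YPbPr component jacks
        Option     "TVStandard"       "HD1080i"   # 1080i over component
    EndSection

As far as I can tell, TVStandard and TVOutFormat are the options the
Nvidia driver README documents for component HD output on these cards.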
I'm pretty sure the video card is in 1080i mode since thin lines on the
Myth menus flicker.
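(The mode X actually programmed should also show up in
/var/log/Xorg.0.log, so I can double-check there rather than trusting
the flicker.)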
The real question: is this a Myth issue, an Nvidia driver issue, or
something else entirely?
Pat
--
Patrick Ouellette pat at flying-gecko.net
kb8pym at arrl.net Amateur Radio: KB8PYM
Living life to a Jimmy Buffett soundtrack
"Crank the amp to 11, this needs more cowbell - and a llama wouldn't hurt either"