[mythtv-users] What do I have to do to get HD working?

Daniel Kristjansson danielk at cuymedia.net
Sat Apr 15 18:33:43 UTC 2006


On Sat, 2006-04-15 at 13:38 -0400, Steven Adeff wrote:
> On 4/15/06, Joe Votour <joevph at yahoo.com> wrote:
> > But does that mean you use any deinterlacing at
> > all?  I suspect not, if you're running the TV
> > interlaced.
> 
> You'd think so, and from what people say OpenGL vsync is supposed to
> let me skip it, but I get really bad frame flicker if I don't use
> Bob, with or without OpenGL vsync enabled. In fact (I still have to
> test this), I think if I disable it in MythTV and use the GL vsync
> from nvidia-settings it actually works that way, but I need to play
> some more. This change has been both a pleasure and a pain for me.
> I'm playing with 1080i, XvMC, and OpenGL vsync all at the same time,
> and it seems like something in the whole equation doesn't quite work
> right; at times I feel like going back to 7676 and running in 720p
> again...

To avoid deinterlacing you need to match the monitor's refresh rate
and resolution exactly to the video. For 1080i that means running
full screen with a 1080i modeline (i.e. 30 frames per second,
interlaced *), with no MythTV overscan and no time stretch.
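
For the curious, here is a rough sketch of what such a modeline looks
like in xorg.conf, using the standard SMPTE 1080i timings (the mode
name is just a label I picked; verify the timings against what your
display actually accepts before using them):

  # 1920x1080 interlaced: 74.25 MHz pixel clock, 2200x1125 total,
  # 30 frames (60 fields) per second
  Modeline "1920x1080_30i" 74.250  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync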

Also, if you are displaying on an LCD or plasma rather than a CRT,
the monitor is almost certainly+ deinterlacing internally anyway.

* This one catches a lot of people: you cannot use a 60 frame per
  second interlaced modeline unless you have an extremely smart
  deinterlacer in your monitor. You also don't want a 30 frame per
  second modeline for normal non-TV use, so this really means you
  have to use MythTV's XRandR support (see the sketch after these
  notes) unless this is a dedicated MythTV box.

+ There is at least one plasma that does interlaced display, but the
  quality improvement from avoiding deinterlacing entirely is minimal
  at 1920x1080 on screens smaller than about 40".
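
What MythTV's XRandR support does under the hood is switch the video
mode for you when playback starts. Assuming your X server actually
exports the interlaced mode through XRandR, you can test the switch
by hand with something like this (note that some drivers report the
60 Hz field rate rather than the 30 Hz frame rate, so check the
output of the query first):

  $ xrandr -q                  # list the sizes/rates X will let you switch to
  $ xrandr -s 1920x1080 -r 30  # switch to the interlaced mode by hand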

-- Daniel


