[mythtv-users] XvMC and CVS

dean vanden heuvel deanv at cox.net
Fri Sep 10 19:19:30 EDT 2004


once again, thanks for the help...

I now have 2 modes that look reasonable:
1) 1080i Sw decode, no deinterlace
2) 1080i XvMC, bob deinterlace

BTW, both of these modes are using the Nvidia (6111) interlace mode, so the frame rate should not be a problem.  In fact, I believe the bob deinterlace and the driver's interlace mode may just be cancelling each other out: bob builds two full frames from each interlaced frame by doubling every field line [I think], and the driver's interlace mode then sends only every other line each 1/60 second, undoing the bob, if I understand it.  If that's correct, there should be no framerate issue with my TV.
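
To convince myself of that cancellation argument, here's a toy sketch (my own illustration, not anything from Myth's source) of a naive line-doubling bob followed by an interlaced scanout; the scanout hands back exactly the field that went in:

    import numpy as np

    # Toy illustration: naive "bob" followed by an interlaced scanout.
    h, w = 8, 4                      # tiny frame, values are just line markers
    frame = np.arange(h * w).reshape(h, w)

    top_field    = frame[0::2]       # even lines, shown first
    bottom_field = frame[1::2]       # odd lines, shown 1/60 s later

    def bob(field):
        # Naive bob: repeat every field line to build a full-height frame.
        return np.repeat(field, 2, axis=0)

    # Interlaced scanout: the display only draws alternate lines each 1/60 s.
    shown_top    = bob(top_field)[0::2]     # even lines of the bobbed frame
    shown_bottom = bob(bottom_field)[1::2]  # odd lines of the bobbed frame

    print(np.array_equal(shown_top, top_field))       # True
    print(np.array_equal(shown_bottom, bottom_field)) # True

So at least for the simple line-doubling case the two steps really do undo each other; whatever interpolation Myth's bob actually does, the result should come out close to the original fields.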

SW decode with no deinterlace looks good also.  Can you recommend a denoiser?

Finally, I'm still not sure whether using my (TV-native) 720p mode, coupled with SW decode and deinterlace (kernel probably, since bob indeed "bobs" with SW decode), is better or worse than the 1080i.  Maybe you have an opinion?  And perhaps it is already covered by your recommendation regarding the TV's filters...  If I extend your theory to include scaling, then the best picture should come from a 720x480i mode (matching what I capture), and then let the TV SCALE and DEINTERLACE.  The problem there is that a 480i mode uses TOO LOW a scan rate for the TV, and I can't seem to get doublescan to work.  Ever tried doublescan?
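
For reference, the only way I know to ask X for doublescan is the DoubleScan flag on a modeline; a sketch of what I mean (back-of-envelope timings for roughly 60 Hz, not a tested mode for this particular set) would be:

    # untested sketch: 720x480 with every line scanned twice (DoubleScan)
    # vertical refresh = 54.054 MHz / (858 * 525 * 2) = ~60 Hz
    # horizontal rate  = 54.054 MHz / 858             = ~63 kHz
    ModeLine "720x480ds"  54.054  720 736 808 858  480 489 495 525  DoubleScan -HSync -VSync

Doubling every line pushes the horizontal rate up to ~63 kHz in that sketch, so whether that is still inside the set's range is another question.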

-dvh



On Fri, 10 Sep 2004 12:17:50 -0400 (EDT)
Daniel Thor Kristjansson <danielk at mrl.nyu.edu> wrote:

> On Wed, 8 Sep 2004, dean vanden heuvel wrote:
> 
> ]Thanks for the reply...
> ]
> ]It is the video output itself that is the problem.  When I enable XvMC
> ]and disable interlace (from the frontend), and use a 1920x1080i output
> ](which you are correct, my TV can handle), the video looks very
> ]*blocked*, sort of like each pixel is VERY large.  Diagonal lines are
> ]truly stepped, etc.
> 
> That does read like deinterlacing interacting badly with the resampler
> in the DLP. I'll make a patch for you when I get a chance, look out for
> it on the dev list.
> 
> ]In the past, I was using XvMC (no deinterlace, as none was possible in
> ]0.15) driving the 1080i, resulting in a reasonable picture.  However,
> ]it seemed like scaling from the recorded 480i up to 1080i, only to have
> ]my TV scale back down to 720p (its native display mode) could be
> ]causing some degradation.  I figured that using Myth to scale from 480i
> ]to 720p, thus doing NO SCALING in my TV might improve things.  It may
> 
> Having XvMC convert to progressive mode is probably a bad solution if
> your projector can handle 1080i; the filters in a projector or TV will
> usually do a better job than the point sampling in XvMC (which will
> always make diagonal lines jagged at full resolution). It is really
> intended for displays not capable of such high resolutions, and/or
> with poor scaling hardware.
> 
> ]have (I cannot really tell because it takes too long to change from one
> ]to the other), but the bob deinterlace (without XvMC) makes the picture
> ]jumpy and movement in the XvMC mode seems to be slightly jerky, so I
> ]thought a return to 1080i might be in order...thus my question.
> 
> That's probably because of the processors in your projector or your
> computer struggling with the framerate. Are you getting any warnings on
> the console with "mythfrontend -v playback"? (There have also been
> reports of OpenGL vsync not working well with 5x.xx series nvidia
> drivers.)
> 
> -- Daniel
> 

