[mythtv-users] pvr350 hurting my TV?

Louie Ilievski loudawg at comcast.net
Sat Mar 18 21:47:40 UTC 2006

> It did seem that using the pvr350 mpeg2 decoder was giving me better
> picture quality, but I think that was due to the fact that I had to set
> the aspect ratio to "fill" during playback to get it to fill the screen.
> Once I found out that I could just set the aspect ratio in the recording
> options to "square" it seems to be a lot better (leaving the playback
> aspect ratio at "off").  Is that how you are doing it?

I never touched the aspect ratio with my setup.  It is just "off", and the 
overscan seems to make it fill the screen perfectly (compared to a straight 
signal into my TV, it looked pretty much identical).

> Also I got a binary version of ivtvdev_drv.o because I couldn't seem to
> get the source that I found to compile on my distro (ubuntu 5.10).  What
> did you use?  I thought it might be a bit better if I compiled it on my
> distro.

I've always used the binary version.  Just make sure you're using the latest 
one, since there seem to be multiple versions out there.  The official, latest 
one can be downloaded from http://dl.ivtvdriver.org/xdriver/0.10.6/
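For what it's worth, once the binary module is dropped into X's drivers 
directory, selecting it is just a Device section in the X config.  A minimal 
sketch, assuming a typical 2006-era ivtv setup (the identifier is made up, and 
the framebuffer node depends on what ivtv-fb registered on your box):

```
Section "Device"
    # Hypothetical identifier; use whatever name your Screen section references
    Identifier  "PVR350 TVout"
    Driver      "ivtvdev"
    # Framebuffer created by the ivtv-fb module -- may not be fb0 on your system
    Option      "fbdev" "/dev/fb0"
EndSection
```

Check dmesg or /proc/fb after loading ivtv-fb to see which fb device the card 
actually got.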

> Also the overscan seems to be a bit excessive but it doesn't appear that
> you can adjust that.  How do you get around the osd going off the screen?
> I suppose I could just use another OSD or modify the xml myself but I
> would think there would be a better way.

In my experience the overscan has been perfect, but yes, you probably will get 
a little clipping on the OSD.  I haven't really played with this issue much 
since it cuts off just a little bit anyway.  I don't have a solution for this 
right now short of modifying the OSD theme.  I think a good feature in myth 
(and I swear I saw something about this a long time ago) would be to have it 
use the GUI dimensions that were set to determine OSD position and size.  I 
suppose this would work if you had the "Use GUI size for playback" option 
set, but then it's scaling the video and I don't want that.  There may be 
some trick I'm not aware of.

> This has got me a little curious though if similar quality could have been
> achieved using the nvidia framebuffer instead of the usual X driver.  I
> probably won't mess with it though (at least for now), since this seems to
> be going pretty good.

And this is what always plagues me time and time again  :-)  I honestly 
wouldn't bother.  I think my experience alone is enough justification.  I've 
literally spent days and days and days on this over the past year.  I'm just 
way too persistent sometimes, and in this case it did not pay off.

MAYBE it's possible that the hardware on these nvidia boards is capable of 
doing what we want, but the driver isn't flexible enough to achieve this and 
we have no way of changing it since it's closed source.  If you haven't read 
Cory Papenfuss's rant on the subject, which he mentioned in another one of 
your threads, you really should.  He makes some very good points.  The thread 
was called "when to deinterlace".
