[mythtv-users] Better looking playback?

Michael T. Dean mtdean at thirdcontact.com
Mon Oct 31 20:27:45 EST 2005


MacNean Tyrrell wrote:

>On 10/31/05, Michael T. Dean <mtdean at thirdcontact.com> wrote:
>  
>
>>MacNean Tyrrell wrote:
>>
>>>Ok, so I'm using a bitrate of 5500 with a max of 8,000, and kernel
>>>deinterlacing (which is best?). It looks better, but it could be better.
>>>It's probably because the TV is so big. We'll see.
>>>
>>That's probably about the best you can get. You're limited by a) NTSC
>>(as already discussed) and b) the electronics in the PVR-250. In
>>theory, it may be possible to get a slightly better picture with a
>>PVR-150 (it has newer and better electronics), but the amount of
>>improvement is probably not worth the cost to you. You may find that
>>you get about the same quality at 4000/6500 or even 4000/5500 (even with
>>the additional bitrate, the other limitations have probably maxed out
>>your quality before using the available bits).
>>
>>I've been recording at 2200/9800 (it never actually uses that headroom;
>>about 1 1/3 times the average bitrate is a more appropriate max, but 9800
>>is within the valid range and I was too lazy to change it), and my
>>recordings looked great on my 27" NTSC piece of junk TV, but I can
>>definitely see artifacts on my new 67" 1920x1080 DLP. I'm impressed
>>that it doesn't look worse than it does with the amount of
magnification.  However, I haven't changed my bitrate; it's motivation to
get my HDTV antenna/cabling/amps/splitter/4xHD-3000 rig in place.
>>(Which I won't be doing tonight. Handing out candy to
>>trick-or-treaters... :( )
>>
>Amen to that. Too bad Newegg had to ship my stuff separately. Still waiting on
>my new case. So how do I get better resolution coming out of my 5200? Right
>now I believe it's at 1024x768. How would I go about outputting 1920x1080 to
>my TV?
>
You just need a modeline in your X config (XF86Config or xorg.conf).  
(Note, however, that if you have a 1280x720 TV, you're probably better 
off using a resolution of 1280x720, and it might not even accept a higher 
resolution over VGA/DVI.  Even though your TV can accept a 720p or 
1080i signal, it's either a 1280x720 set or a 1920x1080 set.  If you 
bought it before July of this year, it's a 1280x720 set--the 1080p 
Digital Micromirror Device (DMD) is brand new this year.)
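
For example (just a sketch, not something I've tested on your hardware: 
these are the standard CEA 720p/1080i timings, the "TV" identifier is 
made up, and your set's manual may list slightly different numbers), the 
Monitor section could get something like:

  Section "Monitor"
      Identifier "TV"
      # Standard CEA timings; check the set's manual and adjust if needed.
      # 1280x720 @ 60 Hz progressive (720p)
      ModeLine "1280x720"   74.25  1280 1390 1430 1650  720  725  730  750 +hsync +vsync
      # 1920x1080 @ 60 Hz interlaced (1080i)
      ModeLine "1920x1080i" 74.25  1920 2008 2052 2200 1080 1084 1094 1125 Interlace +hsync +vsync
  EndSection

(Depending on the driver, you may also have to convince it not to throw 
out modes the TV's EDID doesn't advertise; try the modeline by itself 
first.)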

> Also, would a higher resolution make those artifacts worse or better?
>  
>
Depends on whose scaler is better, Myth's or your TV's.  I would have 
Myth output the TV's native resolution primarily because you have 
control over all sorts of stuff in Myth, but the TV just does whatever 
it thinks is appropriate.  Then again, I'm a bit of a control freak.
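
FWIW, here's roughly what I mean (again, just a sketch; the Device and 
Screen identifiers are placeholders, so use whatever names are already 
in your config, and the mode name has to match the ModeLine above):

  Section "Screen"
      Identifier   "Screen0"
      Device       "Videocard0"   # your existing card identifier
      Monitor      "TV"
      DefaultDepth 24
      SubSection "Display"
          Depth 24
          # List the TV's native mode first so X starts in it;
          # keep a fallback in case the set rejects the timing.
          Modes "1280x720" "1024x768"
      EndSubSection
  EndSection

IIRC, Myth also has a setting in the Appearance setup to use a separate 
video mode for TV playback, so you can leave the GUI at one resolution 
and have playback switch to the TV's native mode.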

>If worse, would going to 800x600 be better? And I'm thinking about picking up
>that 6-dollar cable that's RGB to component and trying it out in the
>bedroom. Hope it works. And 67", nice; only 52" here.
>  
>
Yeah.  I was actually planning on the 61", but buying from a reputable 
dealer meant getting the 67" (they didn't carry the 61").  On the bright 
side, I seem to have gotten a good deal through a mistake.

Mike


