[mythtv-users] native resolution vs 1080i for HDTV and SD on 1280x768 lcd?

Steve Adeff adeffs at gmail.com
Thu Nov 17 11:02:23 EST 2005


On Thursday 10 November 2005 15:19, mrwester wrote:
> Hi-
>
> I'm contemplating adding an HDTV tuner and have a few quick questions,
> but first, some brief background. Current myth setup is FC4 and 0.18.1
> on an AMD 2900, 512MB RAM, 500GB of hard disks, 1xPVR-350, 1xPVR-500, a
> Chaintech FX5200, and an Olevia 30" LCD TV that has DVI, VGA, and S-Video
> inputs as well as HDTV-ready (1080i, 720p, 480i) YPbPr and YCbCr
> component inputs. DVI at HDTV resolutions is not an option. Native
> resolution of the panel is 1280x768. I'm currently running the TV-out of
> the PVR-350 for my myth needs and am very happy with it. Because I don't
> archive many shows and have plenty of spare drive space, I capture at
> 720x480 resolution at the default or higher bitrate. Picture quality is
> comparable to cable straight into the tuner of the TV.
>
> I know that the PVR-350 won't work for HDTV, so I've begun tinkering
> with the Nvidia card. I can run the FX5200 at the LCD's native
> resolution of 1280x768 via DVI, and mythtv itself, the menus, etc. look
> fantastic, much sharper than the PVR-350. Playback of recordings or live
> TV from the PVRs is a different story. No matter what deinterlacer I try
> within mythtv, the CNN ticker stutters every few seconds and is blocky;
> it just doesn't quite measure up to the PVR-350's TV-out. Probably
> watchable overall, but not perfect and difficult to tolerate when I've
> got the PVR-350 in the box. I've tried running the TV-out of the FX5200,
> using the flicker filter (Nvidia hardware de-interlacing?), and the
> overall quality is even worse. I'm running the latest 7676 drivers
> without XvMC. I'm not 100% sure why the DVI output stutters on the 5200
> FE/BE, as I can play to a remote Pundit-R frontend using its ATI-based
> VGA out to an LCD monitor without any stuttering of CNN, but that's a
> whole other story/question...
>
> If I do add a HDTV tuner (planning to use OTA or QAM, not cable tv provided
> external box) to the setup, I'll need to use the FX5200. I've searched and
> googled for a few days now and perhaps since my questions are so basic,
> I've not found the answers. My questions are: Should I output via DVI at
> native resolution?

Yes, always when possible. For example, my HDTV converts everything to 
1080i, but I can't find a DVI 1080i modeline that will work with my 
GeForce, so I'm forced to output at 720p. I might eventually get a newer 
GeForce with an HDTV output chip and use that, but I'll probably end up 
buying a new TV before that happens.
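
For reference, the standard broadcast timings translate into X modelines 
roughly like these (I'm outputting 720p with something close to the first 
line; the 1080i line is just the textbook SMPTE timing, and it's exactly 
the sort of thing that refuses to work on my card, so treat it as a 
starting point, not a known-good line):

  # 1280x720p at 60Hz (SMPTE 296M timing)
  Modeline "1280x720_60"   74.25 1280 1390 1430 1650  720  725  730  750 +HSync +VSync
  # 1920x1080i, 60Hz fields / 30Hz frames (SMPTE 274M timing)
  Modeline "1920x1080_60i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync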

> Or should I get an external transcoder box to process 
> the DVI or VGA out to component 1080i or 720p and let the tv scale to
> 1280x768? What about one of the Nvidia 6200 or 6600 cards that have the
> HDTV component outputs? 

See above. They work well if needed. Quieter aftermarket GPU fans can be 
purchased as well if the noise bothers you that much.
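
If you do go that route, the driver gets pointed at the component jacks 
with Device-section options along these lines (this is straight from the 
nvidia driver README; I haven't run one of these cards myself, so 
consider it untested):

  Section "Device"
      Identifier "nvidia0"
      Driver     "nvidia"
      # pick the format your TV expects on its component inputs
      Option     "TVStandard"  "HD1080i"      # or "HD720p" / "HD480p"
      Option     "TVOutFormat" "COMPONENT"
  EndSection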

> They have a fan (a minus), but since these cards are now cheaper than a
> transcoder box, they may make more sense if the component outputs work
> in Linux. And finally, when you have an SD/HDTV myth setup, do you watch
> your SD content over the 1080i connection and let the video card scale?

From what I can tell there's no way to have Myth vary its output mode 
depending on the source resolution. If it did/does, and your HDTV 
supports 480p natively (i.e. doesn't scale it to another resolution), 
then it would be best to output SD at 480p and HD at its native HD 
resolution.
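
If that feature exists or ever shows up (it would presumably switch modes 
through XRandR), the X side would just need both resolutions available, 
something like the following (the mode names here are made up, and each 
would need a matching Modeline defined in the Monitor section):

  Section "Screen"
      Identifier "screen0"
      Device     "nvidia0"
      SubSection "Display"
          Depth  24
          # HD mode first (the default), SD mode available for switching
          Modes  "1280x720_60" "720x480_60"
      EndSubSection
  EndSection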


Steve

> or would it be best to use 720p and use deinterlacing in mythtv? (I 
> will only be getting a few HD channels, most of my viewing is still SD) Or
> is there another completely different way to do this and I've just missed
> it? Thanks,
>
> Mike

