[mythtv-users] native resolution vs 1080i for HDTV and SD on 1280x768 lcd?

mrwester mrwester at gmail.com
Thu Nov 10 15:19:58 EST 2005


Hi-

I'm contemplating adding an HDTV tuner and have a few quick questions, but
first, some brief background. My current Myth setup is FC4 and MythTV 0.18.1
on an AMD 2900, 512 MB RAM, 500 GB of hard disks, 1x PVR-350, 1x PVR-500, a
Chaintech FX5200, and an Olevia 30" LCD TV that has DVI, VGA, and S-Video
inputs as well as HDTV-ready (1080i, 720p, 480i) YPbPr and YCbCr component
inputs. DVI at HDTV resolutions is not an option. The panel's native
resolution is 1280x768. I'm currently running the TV-out of the PVR-350 for
my Myth needs and am very happy with it. Because I don't archive many shows
and have plenty of spare drive space, I capture at 720x480 at the default or
higher bitrate. Picture quality is comparable to cable straight into the TV's
tuner.

I know that the PVR-350 won't work for HDTV, so I've begun tinkering with the
NVIDIA card. I can run the FX5200 at the LCD's native 1280x768 over DVI, and
MythTV itself (the menus, etc.) looks fantastic, much sharper than the
PVR-350. Playback of recordings or live TV from the PVRs is a different
story. No matter which deinterlacer I try within MythTV, the CNN ticker
stutters every few seconds and is blocky; it just doesn't measure up to the
PVR-350's TV-out. Probably watchable overall, but not perfect, and difficult
to tolerate when I've got the PVR-350 in the box. I've also tried running the
TV-out of the FX5200 with the flicker filter (NVIDIA hardware deinterlacing?),
and the overall quality is even worse. I'm running the latest 7676 drivers
without XvMC. I'm not 100% sure why DVI stutters on the 5200 FE/BE, since I
can play to a remote Pundit-R frontend, using its ATI-based VGA out to an LCD
monitor, without any stuttering of CNN, but that's a whole other
story/question...
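In case it's useful context, here is a minimal sketch of the xorg.conf
sections I'd expect for driving the panel at its native 1280x768 over DVI.
The option names are taken from the nvidia driver README; the identifiers are
placeholders, so adjust to your setup:

```
Section "Device"
    Identifier "FX5200"
    Driver     "nvidia"
    # Let the driver read the panel's EDID, and have the GPU (not the
    # panel) handle any scaling so the DVI signal stays at native res.
    Option     "UseEDID" "true"
    Option     "FlatPanelProperties" "Scaling = native"
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "FX5200"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1280x768"
    EndSubSection
EndSection
```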

If I do add an HDTV tuner (planning to use OTA or QAM, not a cable-provided
external box), I'll need to use the FX5200. I've searched and googled for a
few days now, and perhaps because my questions are so basic, I've not found
the answers. My questions are:

1. Should I output via DVI at native resolution? Or should I get an external
transcoder box to convert the DVI or VGA out to component 1080i or 720p and
let the TV scale to 1280x768?

2. What about one of the NVIDIA 6200 or 6600 cards that have HDTV component
outputs? They have a fan (a minus), but since these cards are now cheaper
than a transcoder box, they may make more sense if the component outputs work
in Linux.

3. When you have an SD/HDTV Myth setup, do you watch your SD content over the
1080i connection and let the video card scale? Or would it be best to use
720p and deinterlace in MythTV? (I will only be getting a few HD channels;
most of my viewing is still SD.)

Or is there another completely different way to do this that I've just
missed? Thanks,
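P.S. If the component outputs on the 6-series cards do work under Linux, my
understanding (hedged; based on the nvidia driver README, not tested by me)
is that they'd be selected with Device-section options something like this:

```
Section "Device"
    Identifier "GeForce6200"
    Driver     "nvidia"
    # Treat the display as a TV, pick the component (YPbPr) connector,
    # and request a 1080i signal; "HD720p" should also be a valid value.
    Option     "ConnectedMonitor" "TV"
    Option     "TVOutFormat"      "COMPONENT"
    Option     "TVStandard"       "HD1080i"
EndSection
```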

Mike