[mythtv-users] What's easier to convert/decode? 720p or 1080i?

Joe Barnhart joebarnhart at yahoo.com
Tue Mar 15 00:21:25 UTC 2005


--- Art Morales <bioart at gmail.com> wrote:

> I have a Samsung DLP tv that displays natively at
> 720p.  Would it be better to send the signal as
> 1080i and have the machine convert 720p to 1080i?

If your set is 720p native, by all means use a 720p
modeline.  I have a 1080i native set, and the CPU
requirements are significantly higher for 1080i
content than for 720p content.  So the content
resolution, not just the modeline, is driving the
higher CPU usage you are seeing on your setup.
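
For reference, a 720p modeline would look something
like the one below.  This is the standard CEA-861
timing for 1280x720 at 60 Hz (74.25 MHz pixel
clock); treat it as a starting point, since your
particular set may want slightly different values:

```
# 1280x720 @ 60 Hz -- standard CEA-861 720p timing
# clock  hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal
Modeline "1280x720" 74.25  1280 1390 1430 1650  720 725 730 750 +hsync +vsync
```

Put it in the Monitor section of your XF86Config /
xorg.conf and add "1280x720" to the Modes line of
the Screen section so X actually selects it.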

I predict your picture quality will suffer greatly if
you use a 1080i modeline and have the content
upconverted in myth and then downconverted by the set.
I know that 720p content looks pretty marginal at
times on my set, and I think it is because the nvidia
video card (and its driver) does the scaling and does
a poor job of it.



