[mythtv-users] High end, state of the art Myth Frontend

Andre Newman mythtv-list at dinkum.org.uk
Thu Sep 19 11:45:10 UTC 2013


On 19 Sep 2013, at 00:18, Michael T. Dean <mtdean at thirdcontact.com> wrote:

> On 09/18/2013 04:36 PM, Joseph Fry wrote:
>>>> No, 1080i60 has exactly as many pixels as 1080p30.
>>> I'm not talking about 1080p30.
>>> 
>>>>  the number of pixels
>>>> required for 1080p60 wouldn't fit into the 19.39Mbps bandwidth available
>>>> with sufficient picture quality
>>> Citation needed.
>>> 
>>> This statement is extremely hard for me to accept.
>> Because it's not entirely true.
> 
> And is not what I said.  What I actually said--before some over-zealous clipping--was:
>>>> the number of pixels required for 1080p60 wouldn't fit into the 19.39Mbps bandwidth available with sufficient picture quality (using the MPEG-2 encoders available at the time the standard was made), 1080p60 wasn't an option. 
> 
> Where the missing, "using the MPEG-2 encoders available at the time," part is a very important part, and seems to be pretty much what Joseph is saying, below, too.
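To put rough numbers on the pixel-count and bandwidth argument above, here is a quick back-of-the-envelope sketch (my own arithmetic, not from the thread; it assumes raw 8-bit 4:2:0 video at ~12 bits per pixel, and uses the 19.39 Mbit/s ATSC figure quoted above):

```python
# Raw pixel-rate arithmetic for the formats under discussion.
# Assumption (mine, for illustration): 8-bit 4:2:0 sampling, ~12 bits per raw pixel.

ATSC_MBPS = 19.39  # MPEG-2 transport stream budget quoted in the thread

def pixels_per_second(width, height, rate, interlaced=False):
    """Raw luma-sample rate; an interlaced field carries half the lines."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

formats = {
    "1080i60": pixels_per_second(1920, 1080, 60, interlaced=True),
    "1080p30": pixels_per_second(1920, 1080, 30),
    "1080p60": pixels_per_second(1920, 1080, 60),
    "720p60":  pixels_per_second(1280, 720, 60),
}

for name, pps in formats.items():
    raw_mbps = pps * 12 / 1e6      # ~12 bits per raw pixel (assumed)
    ratio = raw_mbps / ATSC_MBPS   # compression needed to fit the ATSC budget
    print(f"{name}: {pps/1e6:6.1f} Mpix/s, raw = {raw_mbps:7.1f} Mbit/s, "
          f"needs about {ratio:5.1f}:1 compression")

# 1080i60 and 1080p30 carry identical pixel rates; 1080p60 doubles them.
assert formats["1080i60"] == formats["1080p30"]
assert formats["1080p60"] == 2 * formats["1080i60"]
```

This confirms the "exactly as many pixels" point: 1080i60 and 1080p30 both come to about 62 Mpixels/s, while 1080p60 doubles that, so the raw data would need roughly twice the compression ratio to fit the same 19.39 Mbit/s channel (whether the *encoded* quality suffers proportionally is exactly what the EBU material below disputes).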

There was also the culture of interlaced: it was all just a natural progression from 480i or 576i to 1080i, and most normal broadcast people thought progressive video was some wacky oddball thing that only computer people liked, for reasons unknown.

I still meet broadcast "engineers" who don't really know the difference between i & p; back then very few did! Very few channel execs have the slightest clue even today.

I had the head of Sports TV for Sky Germany tell me that they shot all the Bundesliga matches in both 720p and 1080i and there was no difference. I later found out he'd been shown the output of a cross converter, as all their cameras at the time could only output 1080i. Of course, they now bring some 3G (1080p50) back too, though I'm not quite sure what it's used for.

When the TV reviewers started labelling 720p as not proper HD, no one wanted to be seen launching an HD station that wasn't proper HD.

> 
>> Within the limits that the ATSC allows, it's true.  Theoretically,
>> compressing a 1080p/60 video would use twice the bandwidth of
>> compressing a 1080i/60 video at the same level of quality.  In reality
>> that's not true because a) file size is not directly related to
>> resolution, b) the higher amount of detail would allow a greater
>> percentage of compression without a perceptible difference in quality.
>> 
>> But the fact of the matter is, compressing a 1080p/60 signal will
>> result in a larger bandwidth file than compressing 1080i/60 to the
>> same perceptible quality level.

Sorry Mike, replying to Joseph here:

Incorrect: the EBU have been running a trade roadshow for many years disproving this commonly held belief.

This is not a proper paper from the EBU (I am not permitted to pass on the studies it derives from), but it covers the concept quite well, I think, and the engineers I saw watching this (if they understood what they were watching) left quite amazed. Sadly, I was told that most visitors told the EBU they had the labels wrong :-(

http://tech.ebu.ch/docs/techreview/trev_308-hdtv.pdf

From the emission section (referring to the effect of high compression on 1080p50 versus 1080i50 and 720p50):

"However, at the lower bitrates (i.e. <10...13 Mbit/s) the 1080p/50 encoder becomes more overloaded with information, depending on the content, and this information overload appears to become the dominant factor affecting quality. The impairments with high compression are not as bad as those for 1080i/25 but more visible than 720p/50."


>>  You're starting with double the data,

Which is irrelevant; please see the techreview doc above.

"1080p/50 provides more information in the spatio-temporal domain [than 1080i] and encoders can conduct the compression more efficiently."

>> and if you try to compress it into the same bandwidth using the same
>> compression settings, you will introduce more artifacts.
>> 
>> That said, by changing the compression used, and subsequently the
>> capabilities of the decoder, you could definitely compress a 1080p
>> signal into the same bandwidth without perceptible loss of quality.
>> But that would mean that the receiver would need to support
>> compression outside what the ATSC allows.  Which is why ATSC 2 is
>> being introduced.  It adds h264 compression and 1080p/60 to the
>> standard (among other things).
> 
> Mike
> _______________________________________________
> mythtv-users mailing list
> mythtv-users at mythtv.org
> http://www.mythtv.org/mailman/listinfo/mythtv-users
