[mythtv-users] Concert footage and video artifacts
Michael T. Dean
mtdean at thirdcontact.com
Fri Nov 18 18:25:44 UTC 2016
On 11/18/2016 12:33 PM, Ian Evans wrote:
> On Nov 17, 2016 4:24 PM, "Jay Foster" wrote:
>> On 11/17/2016 12:40 PM, Ian Evans wrote:
>>> This isn't a pressing issue, but more a "just curious, learn something
>>> everyday" type of thing.
>>>
>>> Sometimes when watching a music awards show or concert footage, the
>>> picture can go crazy if there's a quick change of lighting or a fast
>>> strobing effect. Suddenly the picture goes from the crystal-clear
>>> bronze glow of the Zildjian cymbals and chrome on the drum set to a
>>> blocky image worthy of an 8-bit video game emulator.
>>>
>>> Is that how it's coming from the source? Is it a temporary deinterlacing
>>> issue caused by elves? Will the right video card prevent it?
>>>
>>> Curious minds want to know. :-)
>>>
>> Those are compression artifacts, usually due to insufficient bandwidth
>> to encode the video. They're typically introduced by the local
>> broadcaster, most of which choose quantity (i.e., more subchannels)
>> over quality (i.e., a higher bit rate).
> Oddly I usually see it on the Grammys, on an OTA channel with zero
> subchannels.
Some of the "celebration" scenes in awards shows, etc.--especially those
with tons of flashing/moving lights and glitter "snow" being shot into
the air or similar--require tremendously high bitrates to encode so that
they look correct. (I wouldn't be surprised if, for some such scenes,
transmitting only intra-coded pictures (I-frames--basically
fully-specified images), without even using the "bit-saving" predicted
(P-frames/delta-frames) or bi-predictive (B-frames) pictures, would take
fewer bits.) While encoders try to account for change-heavy scenes using
variable-bitrate encoding (giving a higher bitrate to more
complex/faster-changing scenes, as required), the maximum bitrate limits
applied are generally below what these "chaos" scenes require.
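To make that concrete, here's a toy leaky-bucket simulation in Python,
loosely in the spirit of MPEG's VBV buffer model. Every number in it is
invented purely for illustration; real broadcast encoders are far more
sophisticated:

# Toy model: why a hard bitrate cap bites exactly during "chaos" scenes.
# All numbers are made up for illustration.

MAX_RATE = 19.4e6 / 30   # bits arriving per frame at ~19.4 Mbps, 30 fps
BUFFER_SIZE = 4e6        # decoder buffer the encoder can "borrow" against
buffer_bits = BUFFER_SIZE

# Bits each frame *wants* for clean coding (hypothetical): talking heads
# are cheap; 30 frames of confetti/strobe in the middle are not.
frames_wanted = [300_000] * 60 + [4_000_000] * 30 + [300_000] * 60

for wanted in frames_wanted:
    buffer_bits = min(buffer_bits + MAX_RATE, BUFFER_SIZE)  # channel refill
    spent = min(wanted, buffer_bits)  # can't spend bits that never arrived
    buffer_bits -= spent
    if spent / wanted < 0.5:          # well under what the scene needed
        print(f"wanted {wanted:>9,} bits, got {spent:>9,.0f} -> blocky mess")

The cap doesn't hurt at all until a scene's demand exceeds what the
channel plus the buffer can supply; then quality falls off a cliff,
which is exactly when the confetti starts flying.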
The maximum bitrate limits may be applied to account for a variety of
things beyond just making room for subchannels--including keeping the
bitrate low enough that decoders "in the wild" can generally keep up
with the decoding. Not every TV in use today has a 2016-class decoder:
my ~2005 HLR-6768W, for example, struggles and drops frames even at low
bitrates, though that's no worry for me, since MythTV is the decoder I'm
using, and it does fine thanks to a good frontend. So, in general, most
broadcasters decide that a short stretch of garbage video during chaotic
scenes is good enough. I don't think I've ever seen any such chaos scene
that looked good, regardless of source/broadcaster/decoder/display/...
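If you're curious how the bits are actually being spent in one of your
own recordings, ffprobe (part of FFmpeg, assuming you have it installed)
can dump per-frame picture types and sizes. A small Python wrapper along
these lines tallies them (the recording path is hypothetical, and field
names can shift a bit between FFmpeg versions):

import subprocess
from collections import defaultdict

# Hypothetical path; point this at one of your own recordings.
recording = "/var/lib/mythtv/recordings/example.ts"

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "frame=pict_type,pkt_size",
     "-of", "default=noprint_wrappers=1", recording],
    capture_output=True, text=True, check=True).stdout

sizes = defaultdict(list)
cur = {}
for line in out.splitlines():
    if not line:
        continue
    key, _, val = line.partition("=")
    cur[key] = val
    if len(cur) == 2:                 # collected both fields for a frame
        if cur["pkt_size"].isdigit():
            sizes[cur["pict_type"]].append(int(cur["pkt_size"]))
        cur = {}

for ptype, vals in sorted(sizes.items()):
    print(f"{ptype}: {len(vals):6d} frames, "
          f"avg {sum(vals) / len(vals) / 1024:7.1f} KiB")

On typical broadcast MPEG-2 you'll see I-frames averaging several times
the size of the P-frames, with B-frames smaller still, which is why
skipping the P/B machinery entirely is normally such a terrible trade.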
The funny thing is that these scenes looked just fine in our analog
transmission formats (which, really, sent only full frames--the
equivalent of all intra-coded pictures). Digital TV broadcast uses lossy
compression and, therefore, has issues with some types of scenes/video.
Dark scenes are also a huge challenge for our digital TV encoding
formats--which is funny, because dark became popular (ref Christopher
Nolan/Batman Begins :) about the time the US (and, really, much of the
world) was making the change to digital. And, no, I'm not saying we
should go back to analog transmission--just that we did have to make
some sacrifices to fit 60M pixels/sec of images into our 6 (or 8) MHz
channel allotments.
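For scale, here's the back-of-the-envelope arithmetic (assuming
1080-line video at 30 frames/sec and 8-bit 4:2:0 sampling, i.e. ~12
bits/pixel raw):

# Rough numbers behind "60M pixels/sec into a 6 MHz channel".
pixels_per_sec = 1920 * 1080 * 30   # ~62M pixels/sec
raw_bps = pixels_per_sec * 12       # 8-bit 4:2:0 ~= 12 bits/pixel raw
atsc_bps = 19.39e6                  # total ATSC payload of a 6 MHz channel

print(f"raw video:    {raw_bps / 1e6:7.0f} Mbps")
print(f"ATSC payload: {atsc_bps / 1e6:7.2f} Mbps")
print(f"needed squeeze: roughly {raw_bps / atsc_bps:.0f}:1, before sharing")
print("the channel with audio, PSIP tables, and any subchannels")

That roughly 40:1 squeeze is why the encoder has to throw information
away somewhere, and "somewhere" tends to be the least predictable
scenes.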
FWIW, the MPEG (Moving Picture Experts Group--the committee, not any
particular format) has actually noticed these limitations, too, and is
working to improve its video-coding standards to account for such
weaknesses in the existing standard. Some of those improvements should
make it to our broadcast standards in the next 50+ years or so. :)
Mike