[mythtv-users] Getting 1080p from comcast cable boxes (Was VUDU)

Robert McNamara robert.mcnamara at gmail.com
Wed Feb 25 01:47:01 UTC 2009


On Tue, Feb 24, 2009 at 5:28 PM, Brad Templeton
<brad+myth at templetons.com> wrote:
>
>>> Note that when providing movies, 1080p is no particular technical
>>> challenge.   Movies are shot at 24fps, and doing them at 60 fields/second
>>> interlaced is actually a terrible shame, though that is what is done
>>> on the broadcast and many cable/satellite channels.
>>>
>>> I was quite pleased to learn that some of the cable channels,
>>> such as AMC-HD, provide their movies at 1080 p, 30fps.   This is
>>> good because it means a smaller file and no deinterlacing required
>>> to watch or transcode.
>>>
>>>
>>
>> Well, it's actually 1080*i* at 60 fields per second/30 frames per
>> second.  Obviously no way for me to show that your local headend isn't
>> doing deinterlace themselves, but AMC HD, at least, broadcasts in
>> 1080i.
>>
>>
>>>
>>> The comcast cable box spits out mp2 over the firewire.  Does anybody
>>> know if Comcast actually sends mp2 over their QAM channels, or
>>> do they send mp4 and then have the cable-box up-transcode it to mp2
>>> to fit the spec (and then I re-transcode it down to mp4 again if I
>>> will not be watching it for a while, sigh).
>>>
>>
>> The entire system is in MPEG-2.
>>
>> Robert
>> _______________________________________________
>> mythtv-users mailing list
>> mythtv-users at mythtv.org
>> http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
>>
>
> The AMC-HD recordings I get out of firewire are definitely progressive
> (or at least identified as such by mythtv) and at 30 fps.

Can you dd out a short sample and upload it somewhere?
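Short of uploading a sample, one way to check what the box is actually emitting is to look at the progressive_sequence bit in the MPEG-2 sequence extension header of a captured chunk.  A rough sketch (not MythTV code, just the relevant bit of the MPEG-2 bitstream syntax; point it at a few megabytes dd'd from a recording):

```python
# Sketch: read the progressive_sequence flag from an MPEG-2 video stream.
# Scans for the extension start code (0x000001B5); a Sequence Extension is
# identified by a first nibble of 0001, and progressive_sequence is the bit
# 13 bits into the extension payload (after the 4-bit id and 8-bit
# profile_and_level_indication).
EXT_START = b"\x00\x00\x01\xb5"

def progressive_sequence_flag(data):
    """True/False for the first Sequence Extension found, else None."""
    pos = data.find(EXT_START)
    while pos != -1:
        payload = data[pos + 4:pos + 6]
        if len(payload) == 2 and (payload[0] >> 4) == 0b0001:
            return bool((payload[1] >> 3) & 1)
        pos = data.find(EXT_START, pos + 1)
    return None

# usage: progressive_sequence_flag(open("sample.mpg", "rb").read(1 << 20))
```

If that flag comes back 0, the stream is coded interlaced no matter what the player reports.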

>
> I am curious as to how you know that AMC-HD broadcasts in 1080i?   I
> don't doubt it could be true, but it seems rather odd since the
> alternative makes a lot of sense.  AMC-HD does nothing but movies which
> are 24fps.   The cable and satellite companies, if they are able to
> receive it, would be well served to get 1080p 24fps from the source
> channels, since that is the easiest thing for them to mogrify into
> whatever format they want to use, even for them to do the telecine to
> get 1080i.   It's the easiest thing to transcode, to downres etc. and it
> makes a smaller bitstream than a telecine'd video.   However, since
> 24fps is not a standard rate, I think, for many devices, I can see them
> doing that at 30fps, which does not make a much bigger mpeg2 since the
> dup frames take up very little space.

Like I said, 60 fields, 30 frames.  Telecine (and inverse telecine)
has nothing to do with 1080i vs. 1080p; it has to do with frame rates.
If they *were* doing an inverse telecine locally, you would be getting
a 24-frame signal, not a 30-frame signal (although I understand that's
not the point of your query).
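For reference, 2:3 ("3:2") pulldown is just a field-repeat pattern, and inverse telecine just drops the repeats; a toy sketch of the idea:

```python
# Toy sketch of 2:3 pulldown: 4 film frames become 10 video fields,
# i.e. 24 fps film -> 60 fields/s (30 frames/s) video.
# Fields are modeled as (frame, field-parity) pairs.
def telecine(frames):
    fields = []
    for i, f in enumerate(frames):
        n = 2 if i % 2 == 0 else 3        # alternate 2 fields, 3 fields
        for k in range(n):
            fields.append((f, k % 2))     # 0 = top field, 1 = bottom field
    return fields

def inverse_telecine(fields):
    # Drop the duplicated fields, recovering the original 24 fps frames.
    seen, frames = set(), []
    for f, _ in fields:
        if f not in seen:
            seen.add(f)
            frames.append(f)
    return frames

film = ["A", "B", "C", "D"]               # 4 frames of 24 fps film
video = telecine(film)                    # 10 fields = 1/6 second at 60 fields/s
assert inverse_telecine(video) == film
```

Note that resolution never enters into it, which is the point above: pulldown is purely a frame-rate conversion.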

No nationally syndicated channel is broadcasting 1080p.  They are all
SD, 720p, or 1080i.  Channels which broadcast in 720p generally do so
at 60 frames per second to take advantage of smoother motion (sports
channels, most commonly).  Channels at 1080i do so because they tend
to have less motion and want to take advantage of greater resolution
(dramas).  Of the national networks, CBS and NBC are at 1080i; Fox
and ABC are 720p.

My answer comes from direct experience working in the cable industry.
This is the reason it was such a "big deal" when Dish said they were
going to start providing 1080p movies.  They were amongst the first
"broadcasters" to announce such a move.

I'm not saying it's not possible that your local headend is doing some
jiggery-pokery to deinterlace a 1080i signal, but I *am* saying from
firsthand experience that the AMC-HD feed is 1080i.


>
> You suggest AMC-HD sends out 1080i to Comcast, and they inverse telecine
> it and send it out over QAM as 1080p, and then have their cable boxes
> convert it to 1080i and 720p to drive HDTVs?    Could be true but makes
> a lot less sense than the other.

Again, inverse telecine is the process of taking a film-sourced 30- or
60-frame signal and "de-converting" it back to a 24-frame signal.  It
has nothing to do with resolution or interlacing.

>
> Of course, we myth users would love it if they just gave us 24fps files
> of movies.  mpeg4s even better.   I am told the satellite companies use
> mpeg4 natively.
>

A mix of MPEG-2 and MPEG-4 right now, yes.  They are switching to
h.264 over the course of time.

>
> I have noticed a fair bit of difference in the bitrates used by various
> stations.  OTA ATSC stations tend to run 11 to 13 megabits of mpeg2.
> My local PBS, sadly, does 1080i at 9 megabits, which is way too low, and
> it looks terrible.  (They try to air 3 other SD channels on the same
> multiplex.  Greedy.)
>
> AMC-HD takes about 9 megabits and looks good (because it is 30fps
> movies).   SciFi HD which I record a lot seems to run as high as 16
> megabits.
>
> I have also seen 1080p 30fps on Universal HD.   FXHD seems to air in
> 720p.  I guess we should make a chart of these.
>
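For scale, the bitrates quoted above translate into recording sizes roughly like this (transport-stream overhead ignored):

```python
# Rough storage cost of MPEG-2 recordings at the bitrates discussed above.
def gib_per_hour(mbps):
    # megabits/s -> bits/hour -> bytes -> GiB
    return mbps * 1e6 * 3600 / 8 / 2**30

for name, mbps in [("PBS 1080i", 9), ("OTA ATSC", 12), ("SciFi HD", 16)]:
    print(f"{name}: ~{gib_per_hour(mbps):.1f} GiB/hour")
```

So a 9-megabit channel costs a bit under 4 GiB/hour, while a 16-megabit one runs closer to 7.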

Uni HD is another 1080i channel.  Again, from direct experience with
the feed I can guarantee that it's interlaced.  Once again, I'm not
trying to be anti-social and say that it's not *possible* that there's
something being done locally, just that a) it would be useless (and
almost deceptive) as the source is interlaced and b) if true, it's
certainly the exception and not the rule.

I'm not trying to be snarky in asking for samples, I genuinely would
like to see what if any changes have been made to the stream, and if
there is a misconception, explain how it might have happened.

Robert

