[mythtv-users] wrong bitrate detected

Marco Nelissen marcone at xs4all.nl
Fri Mar 10 18:09:20 UTC 2006


>Hi,
>
>I have some low-bitrate recordings from my local PBS station (KQED San
>Francisco) in digital SD that I recorded off Comcast cable using an
>HD-3000.  Half an hour's worth of recording is about 800MB, so I calculated
>the bitrate to be about 1800 kbps.

If a half hour is 800 megabytes, then wouldn't that be 800*1024*8 kilobits
for 1800 seconds, or 3640 kilobits per second?
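The arithmetic is easy to check; here it is spelled out, using Python purely as a calculator (the 800 MB and half-hour figures are from the original post):

```python
# 800 MB over half an hour, using 1024-based units throughout.
megabytes = 800
seconds = 30 * 60                    # half an hour = 1800 seconds
kilobits = megabytes * 1024 * 8      # MB -> KB -> kilobits
print(int(kilobits / seconds))       # -> 3640 kilobits per second
```

That is roughly twice the 1800 kbps figure quoted above, which suggests the original calculation dropped a factor somewhere (likely bits vs. bytes).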

>  MythTV is having problems playing back
>these recordings.  It plays as if in slow motion, and the sound is
>distorted too--shifted to a lower frequency.  In the frontend log I found
>that MythTV thought the recording was at 81 kbps.
>
>   stream: start_time: 8234.631 duration: 82875.702 bitrate=81 kb/s

I wouldn't be surprised if kb/s actually meant kilobytes per second. Not
everyone is equally consistent with the use of 'k' vs 'K', 'b' vs 'B', etc.

It would be interesting to chop a few minutes off the start of the file,
play the remainder, and see what it says about the bitrate then.
Most bitrate-detection problems I've come across are with variable-bitrate
streams, where the application only looks at the first few frames to
detect the bitrate.

Marco
