[mythtv] Channel change time

Janne Grunau janne-mythtv at grunau.be
Sun Jun 6 19:50:52 UTC 2010


On Sun, Jun 06, 2010 at 12:21:02PM +0100, James Courtier-Dutton wrote:
> On 5 June 2010 13:01, James Courtier-Dutton <james.dutton at gmail.com> wrote:
> > Hi,
> >
> > I have not improved anything yet, but this is what I have found.
> > 1) With my DVB card, the frequency tuning is quick.
> > Time from channel change request to first TS packet is 711ms.
> > 2) The ringBuffer then waits for 2028ms before it lets any TS
> > packets be processed.
> > So, a PAT is not even processed until 2028ms after the first TS packet
> > from the newly tuned channel.
> > For this entire 2028ms, TS packets are just dropped.
> >
> > I believe this is not the ideal way to do things.
> > Is there any reason to delay the ringBuffer initialisation at all. Why
> > not initialise the ringBuffer at channel frequency tune point?
> > I think this would shave 2 seconds off the DVB channel switch time.
> >
> 
> I have found out where the 2 second delay is.
>     MPEGStreamData *streamData = NULL;
>     if (HasFlags(kFlagWaitingForSignal) &&
>         !(streamData = TuningSignalCheck()))
>     {
>         return;
>     }
> 
> So maybe this long channel change time is caused by the
> "TuningSignalCheck()" function call.

Does it take 2000ms for a single function call, or 2000ms until
TuningSignalCheck() returns non-NULL?

The former looks like a bug; the latter is just the signal-monitoring
delay. For digital channels the signal monitor looks for a correct
PAT+PMT+SDT+?VCT before it decides that the tuning was successful. Could
you check the repetition rates for those tables? 2000ms seems high.

Another delay before writing data starts is waiting for a video keyframe.

Janne


More information about the mythtv-dev mailing list