[mythtv-users] Network tuners vs. USB Tuners

Stephen Worthington stephen_agent at jsw.gen.nz
Mon Nov 15 02:40:44 UTC 2021


On Sun, 14 Nov 2021 16:54:39 -0500, you wrote:

>I have a production remote backend with a Hauppauge WinTV-quadHD PCIe card
>tuner.  I'm very pleased with it. I also have a test FE/BE combo built with
>a NUC and it uses a HDHomeRun Connect Quatro.  Both tuners are connected to
>the same attic antenna via a splitter.  I'm so close to the TV broadcast
>towers that I usually get between 90 and 100% signal strength.
>
>Occasionally, I get recordings from the HDHR tuner that have pixelation
>throughout the recording, maybe 1 or 2 a minute.  However, on the
>WinTV-quadHD tuner the same program recorded at the same time will have no
>issues for hours.
>
>I'm guessing this problem is tuner related, since over the years
>SiliconDust has had to update the firmware in the Quatro to fix issues with
>reception with certain TV stations.
>
>So my question is what about using the Hauppauge WinTV quadHD USB tuner. It
>requires USB3 which my 11th gen Intel Core i7 NUC has.  Is that tuner as
>good as the PCIe quad tuner??
>
>Jim A

Recording of DVB/ATSC signals is very simple - just tune the correct
frequency then receive the digital packets that arrive on that
frequency.  Some tuners (e.g. the HDHR) will also then allow selection in
the tuner of which digital packet streams are to be passed on to the
CPU, which reduces the CPU workload.  But in any case, all the CPU has
to do is to select which packet streams form part of the channel to be
recorded, and then just write those streams to disk.  The packets are
not modified in any way, and the CPU will not even read their contents
except to check for validity.  It is not like analogue signals where
there can be problems in the signal that cause trouble with the video
or audio on playback.  With digital recording, either a valid packet
is received and written to disk, or there is a gap in the recording
where no valid packets were received (packets may not have been
received at all, or they may have been invalid and been dropped).
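
The recording loop described above can be sketched roughly as follows.  This is a minimal illustration, not MythTV's actual code; the PID values are hypothetical (real ones come from the channel's PMT), but the structure - validate each 188-byte packet, keep only the wanted streams, write packets to disk untouched - is the idea.

```python
import io

TS_PACKET_SIZE = 188
WANTED_PIDS = {0x0000, 0x0100, 0x0101}  # hypothetical: PAT, video, audio

def record(ts_stream, out_file):
    """Copy wanted, valid transport stream packets to disk unmodified."""
    while True:
        pkt = ts_stream.read(TS_PACKET_SIZE)
        if len(pkt) < TS_PACKET_SIZE:
            break                      # end of input
        if pkt[0] != 0x47:
            continue                   # bad sync byte: drop packet
        if pkt[1] & 0x80:
            continue                   # transport_error_indicator set: drop
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        if pid in WANTED_PIDS:
            out_file.write(pkt)        # packet written with no modification
```

Note the CPU never decodes the payload here - it only checks validity and the PID, exactly as described above.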

There can be problems at the start of playing back a recording if the
playback code is not doing the right things.  Until the header packets
that tell the decoder how to decode things have all been seen, the
decoder may interpret the contents of the packets incorrectly and
display funny things or play bad sound.  What should be happening is
that until the decoder is fully configured by reading the necessary
header packets, it should not start displaying anything or producing
sound.  The transmitter sends the header packets frequently, but the
actual interval between repeats of the headers can differ for each
transmitter or each channel.  Some channels choose a much longer
period between header packet retransmissions so that the extra
bandwidth can carry more of the actual channel data, giving a higher
bitrate and so a higher quality signal.  The tradeoff is that it takes
much longer for recording to start on such a channel - the tuner code
has to wait for the
headers to arrive.  We have one channel here in New Zealand where the
headers are around 7 seconds apart - all of the others are less than 2
seconds.
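
The "wait for the headers" behaviour can be sketched as a simple gate: nothing should be emitted until the configuration tables (the PAT, then the PMT it points to) have been seen.  This is an illustration only - the `pmt_pid_from_pat` argument stands in for real PAT payload parsing, which is omitted here.

```python
def pid_of(pkt):
    """Extract the 13-bit PID from a transport stream packet header."""
    return ((pkt[1] & 0x1F) << 8) | pkt[2]

class HeaderGate:
    PAT_PID = 0x0000

    def __init__(self):
        self.pmt_pid = None    # learned from the PAT
        self.ready = False

    def feed(self, pkt, pmt_pid_from_pat=None):
        """Return True once enough headers have arrived to start decoding.
        pmt_pid_from_pat stands in for real PAT parsing (an assumption)."""
        pid = pid_of(pkt)
        if pid == self.PAT_PID and pmt_pid_from_pat is not None:
            self.pmt_pid = pmt_pid_from_pat
        elif pid == self.pmt_pid:
            self.ready = True  # PMT seen: decoder is now configured
        return self.ready
```

On a channel with headers 7 seconds apart, a gate like this can stay closed for up to 7 seconds after tuning - which is exactly the startup delay described above.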

Since the video and audio (and other things such as subtitles) are all
separate streams, they have to be coordinated so that they are
synchronised on playback.  This is where most of the playback problems
come from.  The playback code will normally find that there is extra
video data at the start of a recording with no matching audio data, or
vice versa, and needs to discard the extra data until it has both
streams available.  As well, in the streams, if they are compressed
(and they almost always are), it will not be able to decompress them
until it sees the internal stream data (usually another type of header
or special frame) that allows it to start decompressing and decoding.
The point at which that happens in each stream will vary quite a bit,
and data from streams that are already decodeable needs to be
discarded until all streams are decodeable.  Then after that, more
data needs to be discarded until all streams have data that applies to
the same time in the recording, so that the streams can be synchronised.
Only after all streams are decodeable and have data for the same
timestamp should the decoder start the actual playback to the user.
But this does not always happen properly, and sometimes video gets
played back when it is not actually able to be decoded properly, hence
you get artifacts like a green colour wash, or missing bits.
Similarly, you can get nasty audio artifacts too.  The proper fix for
these things is to fix the decoder, but decoders are complicated and
it is difficult to cover all the various problems they have to deal
with, and most will not be perfect.  So what happens is that some
combination of things in the recorded data at the start of recordings
for a channel triggers bad display or bad audio until a new set of
headers for all the streams is seen and the
decoder can correct itself.  It is usually one or two particular
channels that will do it, and keep on doing it until something
changes, such as the studio producing the channel alters the settings
it uses to transmit with, or your decoder gets an update that fixes
it.
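
The synchronisation step described above - trim each stream's leading data until every stream has frames at a common starting timestamp - can be sketched like this.  The frames and PTS values here are made up for illustration; real players work with decoded frames tagged with presentation timestamps from the stream.

```python
def sync_start(streams):
    """streams: dict of name -> list of (pts, frame), sorted by pts.
    Discard each stream's leading frames so all streams start at the
    latest of the per-stream first timestamps, then playback can begin."""
    start = max(frames[0][0] for frames in streams.values())
    return {name: [(pts, f) for pts, f in frames if pts >= start]
            for name, frames in streams.items()}
```

For example, if the video starts at PTS 0 but the first audio frame is at PTS 40, the video frames before PTS 40 are discarded and both streams begin together at 40.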

In your particular case, where you likely have very high signal
levels, it is entirely possible that the HDHR is actually too
sensitive and is getting overloaded.  So what I would recommend is to
try reducing the signal level it is receiving.  The easy way to do
that is to add another two way splitter in its aerial path, with one
output of that splitter going to the HDHR and the other unused.  That
will reduce the signal it is receiving to a little less than half what
it is currently getting.  Tuners operate over a very wide range of
signal levels, so if the HDHR is getting too much signal, halving it
should leave a level still at the high end of the tuner's usable
range, but no longer enough to overload it.
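
The "a little less than half" figure can be checked with the usual dB arithmetic.  A typical two-way splitter has about 3.5 dB of loss per output port (an assumed nominal value - check the part's datasheet), which works out to:

```python
def db_to_power_ratio(db_loss):
    """Convert a loss in dB to the fraction of power that gets through."""
    return 10 ** (-db_loss / 10)

ratio = db_to_power_ratio(3.5)
# ratio is about 0.45, i.e. a little less than half the signal power
```

An ideal lossless split would be exactly 3 dB (half the power); the extra ~0.5 dB is the splitter's own insertion loss.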

As to USB tuners, I have found that USB is usually more problematic
than PCIe or networking for tuners, as USB cables just tend to cause
trouble.  I used USB tuners for many years, and every so often a USB
tuner would have a problem.  I eventually worked out that it was the
cables, and so I started using the USB tuners on extension cables that
allowed them to be laid down on a flat surface and the cables taped in
place with parcel tape.  That made them significantly more reliable,
but even then, eventually there would be a problem where I would have
to unplug the cable and replug it to get the tuner working again.  So I
solved my problem by buying PCIe tuners - I have two 8 tuner cards in
my production MythTV box, one for DVB-T(2) and one for DVB-S(2).
Hazards for USB cables include bumping the box they are plugged into,
bumping the tuner, earthquakes (we get quite a few of them in New
Zealand), vacuuming nearby, the cat, the baby, ...  We have an older
home with wooden tongue and groove floors and the house is on piles.
So there is a little flex in the floors, and on one occasion I just
walked past the MythTV box and the floor flexing moved the table the
box sits on just enough that a USB tuner stopped working - I could see
it happen at that moment.
