[mythtv-users] Most tuners in a single machine

Brian Wood beww at beww.org
Fri Mar 3 20:17:08 UTC 2006


On Mar 3, 2006, at 12:29 PM, Chad wrote:

>> Did you connect the amp at the source (i.e. antenna or cable
>> company's feed) instead of the destination (i.e. where you would be
>> amplifying the noise rather than the signal)?
>>
>> Mike
>
> To add to that slightly, if you *are* getting a crap signal and
> amplifying it, you are amplifying (and paying for) a crap signal.  In
> my neck of the woods, and it seems to hold true for most cable
> providers, you can call them and have them check the signal quality at
> the feed into your home.  For me (Comcast in the SLC, Utah area), if
> it's bad at the point where it comes into the house, the call is free.
> If it's bad from the wall to the TV, the call costs me, but the
> "technician" will work with me for a while to clear up the noise, or
> at the very least help me diagnose where the problem lies (usually a
> bad cable or connector).  And if it's not in a wall, they will usually
> cook you up a new set of cables onsite to replace your existing ones.
> Yeah, 45-100 bucks for new cables is expensive, so I usually check to
> make sure it's not a problem on my side of the wall before making a
> call.
>

There are FCC requirements for the signal strength delivered to a
customer by a CATV system:

A minimum of 0.0 dBmV on any one channel

No more than a 12 dB difference between any two channels on the system

No more than a 3 dB difference between any two adjacent channels

No more than 5% "hum" or low-frequency AM noise on the system

No more than a 12 dB variation in any 24-hour period.

Composite Triple Beat or "Cross-Mod" components at least 48 dB below
the signal level.
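
Purely as an illustration, here is a small Python sketch (my own, not
anything from the FCC rules or MythTV) that checks a set of made-up
per-channel readings in dBmV against the limits above; it treats
consecutive channel numbers as "adjacent", which is a simplification:

# Sketch: check measured CATV channel levels (dBmV) against the limits
# described above. Channel numbers and readings are made up.

# channel number -> measured visual carrier level in dBmV
levels = {2: 4.1, 3: 3.0, 4: 5.8, 5: 6.2, 6: 4.9}

problems = []

# A minimum of 0.0 dBmV on any one channel
for ch, lvl in levels.items():
    if lvl < 0.0:
        problems.append(f"channel {ch} is below 0.0 dBmV ({lvl} dBmV)")

# No more than a 12 dB spread between any two channels on the system
if max(levels.values()) - min(levels.values()) > 12.0:
    problems.append("more than 12 dB spread across the system")

# No more than a 3 dB difference between adjacent channels
# (simplified: consecutive channel numbers are treated as adjacent)
for ch in sorted(levels):
    if ch + 1 in levels and abs(levels[ch] - levels[ch + 1]) > 3.0:
        problems.append(f"channels {ch}/{ch + 1} differ by more than 3 dB")

print("\n".join(problems) if problems else "levels look OK")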

There are, however, some "catches":

Technically these standards only apply to "broadcast signals", so  
theoretically channels like HBO and CNN do not count.

The requirements specify the signal "delivered to the television set
input terminals", so if a set-top box is being used, whatever gain
and/or AGC the box has can work in the CATV operator's favor. These
standards were written before the common use of set-top boxes
(converters).

Most TV sets are a bit noisy with a 0 dBmV input signal, and are
happier with around 10 dBmV. With 50 or more channels on a system you
can get into beat-product problems in the tuner with more than 15 or
20 dBmV per channel.

I've read that "capture cards require more signal than TV sets",
which is nonsense; capture cards use the same tuner modules that are
used in TVs.

The bottom line is that most cable operators try to deliver an
average of around 10 dBmV to the reference point of the home, usually
the grounding block or terminal box. A two-way splitter loses about
3.5 dB per leg, so a four-way split loses about 7 dB, plus the loss
of the cable from the splitter to the sets.

Thus most cable feeds can be split at most four ways before noise
problems become really apparent, assuming a good feed from the
provider and average cable lengths.
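
To put rough numbers on that, here is a quick Python sketch of the
level budget. The 10 dBmV feed and the 3.5 dB per two-way splitter leg
are the figures above; the roughly 5 dB of loss per 100 feet of RG-6
at the upper cable frequencies is my own ballpark assumption:

import math

# Sketch of a CATV level budget: start from the level at the grounding
# block and subtract splitter and cable losses.

def level_at_set(feed_dbmv, ways, cable_ft, loss_per_100ft=5.0):
    """Rough level at the TV/tuner input after an N-way split."""
    # An N-way split built from 2-way splitters costs about 3.5 dB per
    # doubling: 2-way ~3.5 dB, 4-way ~7 dB, 8-way ~10.5 dB.
    splitter_loss = 3.5 * math.log2(ways) if ways > 1 else 0.0
    cable_loss = loss_per_100ft * cable_ft / 100.0
    return feed_dbmv - splitter_loss - cable_loss

feed = 10.0  # dBmV at the grounding block, per the figure above
for ways in (1, 2, 4, 8):
    lvl = level_at_set(feed, ways, cable_ft=50)
    print(f"{ways}-way split, 50 ft of cable: about {lvl:+.1f} dBmV")

With those (assumed) numbers a 4-way split lands right around 0 dBmV
and an 8-way split drops below it, which is where the "at most four"
rule of thumb comes from.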

Any amplifier should be placed as close to the source (the cable
system) as possible. These amplifiers *must* be of the push-pull type
and capable of handling 50 or more channels. A single-ended VHF
antenna amplifier designed for at most 6 or 8 channels will be
totally swamped by a typical CATV signal.

All this refers to "analog" signals; with digital signals things
start changing a lot :-)

Anyway, just my $0.02USD.

