[mythtv-users] Using Cable-tv certain channels fuzzy

John T. Nissley jnissley at nissley.org
Tue Jul 4 02:12:01 UTC 2006


From: mythtv-users-bounces at mythtv.org
[mailto:mythtv-users-bounces at mythtv.org] On Behalf Of Brian Wood
Sent: Monday, July 03, 2006 8:29 PM
To: Discussion about mythtv
Subject: Re: [mythtv-users] Using Cable-tv certain channels fuzzy

On Jul 3, 2006, at 7:01 PM, Daniel Leaberry wrote:

On 7/3/06, Brian Wood <beww at beww.org> wrote:


On Jul 3, 2006, at 5:58 PM, Daniel Leaberry wrote:

> This is a mystery to me so I figured I'd let others offer help.
>
> I receive basic cable (about 20 channels) at my house and use a
> pvr-500 to record with a dedicated backend in the garage. 1 week 
> ago channels 7-13 began to show moderate amounts of noise. I
> changed nothing. Today I had the technician come out and check my
> line. The signals are fine, the filters are fine everything seems
> fine on their end. The setup has been running flawlessly for over 2
> months.
>
> Things I've tried:
>
> 1) replacing the cable between the pvr-500 and the splitter (one
> end goes to the cable modem) 
> 2) fine tuning the frequencies using ivtv
> 3) rebooting
>
> Things I suspect might have something to do with it:
>
> 1) The backend runs in the garage which is un-airconditioned and 
> typically between 80-95 degrees. Maybe the heat affects only
> certain channels? I would think it would add noise to all the
> channels.
>
> My setup is as follows:
>
> *Dedicated backend running dual p3's and pvr-500. lspci -v of the 
> tuner is as follows
>
> 02:09.0 Multimedia video controller: Internext Compression Inc
> iTVC16 (CX23416) MPEG-2 Encoder (rev 01)
>         Subsystem: Hauppauge computer works Inc. WinTV PVR 500 (2nd 
> unit)
>         Flags: bus master, medium devsel, latency 64, IRQ 7
>         Memory at f4000000 (32-bit, prefetchable) [sizedM]
>         Capabilities: [44] Power Management version 2
> 
> I'm using ivtv 0.4.5 on kernel 2.6.15 (gentoo).
>
> Just fishing for possible issues. I would love to hookup a tv and
> see if it's just the card but I don't own one (The frontend
> connects to a 21" monitor). I'll have to ask the neighbor and see 
> if I can borrow one.
>

Interesting that you mention channels 7 - 13, as that is what's known
as the "high band" VHF channels. You could not have picked those
numbers at random unless you knew about frequency allocation, or you 
have a problem directly related to frequency.

I know nothing about frequency allocation (I'm glad you do!)

	In fact, if you have only 20 or 22 channels, and are putting them
	on a wire in frequency order, 7 - 13 would be the highest in
	frequency of all.

	I know that sounds strange, but the actual order would be:

	2 - 6 (low band)
	14 - 22 (mid-band)
	7 - 13 (high-band)
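
For reference, the band edges behind that ordering come from the standard
US CATV channel plan. A small Python table of them (textbook values, not
anything measured on this system):

# Approximate band edges for the standard US CATV channel plan,
# listed in ascending frequency order -- which is why channels
# 7 - 13 sit above the mid-band channels 14 - 22 on the wire.
catv_bands = [
    ("low band",  "2 - 6",   54.0,  88.0),   # MHz
    ("mid-band",  "14 - 22", 120.0, 174.0),
    ("high band", "7 - 13",  174.0, 216.0),
]

for name, channels, lo, hi in catv_bands:
    print(f"{name:9}  channels {channels:7}  {lo:3.0f} - {hi:3.0f} MHz")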



This definitely seems to be it. I didn't mention it, but channels 14-22
have a slight amount of noise. I wasn't sure if it was just me not
remembering what a clear channel looked like, but put in this context
the whole thing makes sense.

	Coaxial cable attenuates RF energy at a rate that increases with
	frequency (roughly as the square root of the frequency). In fact, a
	piece of cable that has 10 dB of loss at channel 2 (54 MHz) will
	have 20 dB of loss at channel 13 (about 216 MHz).

	Cable loss also increases with the temperature of the cable, and
	proportionally with frequency, so as a cable heats up channels
	7 - 13 will be affected the most. Channel 13 will be affected twice
	as much as channel 2.

	So if you were experiencing problems due to either your garage or
	your cable system in general heating up, it would be expected that
	channels 7 - 13 would be affected most noticeably.

	Taking the noise floor as a constant (which it is not, but let's
	make things easy), a reduction in signal level would result in a
	degradation of the signal-to-noise ratio, and a noisy picture under
	low or marginal signal conditions.

	Thus it would not "add noise to all the channels" equally, but to
	7 - 13 more so than the others, exactly what you are seeing.

	Cable techs are famous for saying "everything's fine at their end"
	when it is not. If you are splitting the signal several times, you
	could well be down enough that the increased loss due to high
	temperature would be visible.
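
To put rough numbers on the quoted explanation, here is a short Python
sketch. It assumes dB loss scales with the square root of frequency
(which matches the 10 dB / 20 dB example above) and uses the common rule
of thumb of roughly 1% more loss per 10 degrees F; both are generic
approximations, not measurements of this particular drop.

import math

def cable_loss_db(freq_mhz, loss_at_ch2_db=10.0, temp_rise_f=0.0):
    # dB loss grows roughly as sqrt(frequency), and by about
    # 1% per 10 degrees F of temperature rise (rule of thumb).
    base = loss_at_ch2_db * math.sqrt(freq_mhz / 55.25)
    return base * (1.0 + 0.001 * temp_rise_f)

# Visual carrier frequencies (MHz) for a few channels on the wire.
channels = {"2": 55.25, "6": 83.25, "14": 121.25, "22": 169.25,
            "7": 175.25, "13": 211.25}

for ch, freq in channels.items():
    cool = cable_loss_db(freq)
    hot = cable_loss_db(freq, temp_rise_f=25.0)   # garage 25 F hotter
    print(f"ch {ch:>2}: {cool:4.1f} dB normally, {hot:4.1f} dB when hot "
          f"(+{hot - cool:.2f} dB)")

The extra loss from the temperature rise lands hardest on 7 - 13, which
is exactly the symptom described.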


This is a local community cable system, so I like to think they'd care a
little more. They pulled out something that looked like a Fluke (for
those familiar with networking) and plugged the coax in to check it. I
only have one splitter. It's a two-way splitter with one run to the
cable modem and one run to the pvr-500. Granted, it looks like the
cheapest splitter I've ever seen.

I think you've solved my mystery, Brian. I'm going to try a different
splitter, and maybe put the box in the living room for a few days to
test the air-conditioning.

The device that "looked like a Fluke" was a Signal Level Meter
(sometimes wrongly called a "field strength meter"). It measures the
level of an RF TV signal in dBmV, which is decibels relative to one
millivolt across 75 ohms. This used to be known as "dBj" (J for
Jerrold) in the 1950s, before the standard was adopted officially. dBj
or dBmV was initially measured with the Jerrold model 704 meter, a
beast that contained lead-acid batteries and vacuum tubes and was great
fun to haul up a telephone pole. There was great rejoicing when it was
replaced by the model 727, which was transistorized but still weighed a
ton.
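
If the unit is unfamiliar: 0 dBmV is one millivolt RMS across 75 ohms,
and every 20 dB is a factor of ten in voltage. A two-function Python
sketch of the conversion (an illustration, not anything from the meter
itself):

import math

def mv_to_dbmv(millivolts):
    # dBmV = 20 * log10(voltage in millivolts), referenced to 75 ohms
    return 20.0 * math.log10(millivolts)

def dbmv_to_mv(dbmv):
    return 10.0 ** (dbmv / 20.0)

print(mv_to_dbmv(1.0))   # 0.0   -> the 0 dBmV reference level
print(dbmv_to_mv(10.0))  # ~3.16 -> +10 dBmV is about 3.2 mV at the tuner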

 

Jerrold, if you're interested, was a company involved in the early
construction and supply of CATV systems, and was named for its founder
Milton Jerrold Shapp, who later became the Governor of Pennsylvania.
Jerrold was bought out by General Instrument in the 1980s.

Interestingly, Fluke now makes signal level meters.

 

Without giving the entire standards for signal strength (which involve
variation over a 24-hour period, adjacent channels being allowed less
delta than non-adjacent ones, and other specs), I will say that
nominally a cable system should deliver at least 0 dBmV (that's a
"zero"), although most TV sets and capture cards are happier with
something closer to 7 - 10 dBmV.

Systems also have problems with "tilt", the difference in level between
the lowest and the highest frequency channel, caused mainly by the
differential attenuation of the cable. Cable companies deal with this
by using equalizers and tilt-control amplifiers of various designs,
often "dual pilot" designs with the AGC responding to signals at both
ends of the spectrum.
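
A made-up worked example of tilt (the levels are hypothetical, just to
show the arithmetic):

# Hypothetical meter readings (dBmV) at the subscriber tap.
level_ch2 = 8.0    # channel 2, 55.25 MHz (lowest frequency carried)
level_ch13 = 2.0   # channel 13, 211.25 MHz (highest frequency carried)

tilt_db = level_ch2 - level_ch13
print(f"tilt across the lineup: {tilt_db:.1f} dB")   # 6.0 dB of tilt
# An equalizer or tilt-control amplifier is set to flatten most of this out.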

 

A two-way splitter will have a loss of 3.5 dB to each leg, so it would
be nice if you could put 10 - 14 dBmV into the splitter input.
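
A quick sanity check of that arithmetic (my numbers, using the figures
above):

SPLITTER_LOSS_DB = 3.5           # typical 2-way splitter, per output leg
NOMINAL_MIN_DBMV = 0.0           # rough system minimum mentioned above
HAPPY_RANGE_DBMV = (7.0, 10.0)   # where most tuners/capture cards are comfortable

for level_in in (0.0, 7.0, 12.0):
    level_out = level_in - SPLITTER_LOSS_DB
    verdict = ("comfortable" if level_out >= HAPPY_RANGE_DBMV[0]
               else "usable" if level_out >= NOMINAL_MIN_DBMV
               else "below the nominal minimum")
    print(f"{level_in:5.1f} dBmV in -> {level_out:5.1f} dBmV on each leg ({verdict})")

# With only 0 dBmV at the drop, each leg is already below the nominal
# minimum; with 10 - 14 dBmV in, both legs stay in comfortable territory.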

 

There is a very good chance that the device you think is a splitter is
not. It is more likely to be a "directional coupler", as they are often
used to get more signal to the cable modem than a two-way splitter
would, and to isolate the modem from problems. DCs have very little
loss on the "through" leg, less than a dB in some cases, and from 8 to
35 dB of loss on the "tap" leg. They provide much better isolation
between devices than hybrid splitters, and thus more protection to the
modem from trouble elsewhere on your system. They are also less
dependent on proper termination of the ports to provide adequate
isolation.
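
To see how the numbers differ, here is the same arithmetic for an
assumed DC with a 9 dB tap (often marked "DC-9"; the part actually
installed could have a different tap value), with the modem on the
low-loss through leg as described above:

DROP_LEVEL_DBMV = 10.0   # assumed level arriving at the house

# Two-way splitter: roughly 3.5 dB to each output.
splitter_modem = DROP_LEVEL_DBMV - 3.5
splitter_tuner = DROP_LEVEL_DBMV - 3.5

# Directional coupler, assumed "DC-9": about 1 dB through, 9 dB on the tap.
dc_modem = DROP_LEVEL_DBMV - 1.0   # through leg feeds the cable modem
dc_tuner = DROP_LEVEL_DBMV - 9.0   # tap leg feeds the PVR-500

print(f"splitter: modem {splitter_modem:.1f} dBmV, tuner {splitter_tuner:.1f} dBmV")
print(f"DC-9:     modem {dc_modem:.1f} dBmV, tuner {dc_tuner:.1f} dBmV")
# The DC gives the modem more signal and better isolation, at the cost
# of a lower level on the tap leg feeding the tuner.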

 

All such devices are made extremely cheaply as they are bought by the
thousands by cable companies.

 

So much for CATV 101, quiz tomorrow morning :-)

 

I hate to state the obvious, but have you tried connecting a regular TV
to the cable that is connected to your PVR card? That way you will be
able to tell whether the signal going into the PVR card is actually any
good. I had a cable guy tell me for two months that the cable was fine
even though I had a snowy picture. I switched to satellite, kept the
same TV, and now my picture is crystal clear.
