[mythtv-users] Confusion about Output Cards.

Nick Ellson grimm at nickellson.com
Fri Mar 17 18:06:42 UTC 2006



It certainly sounds like I will have some trial-and-error testing in front 
of me. The system will be ready this next Thursday, in a 4U chassis that 
must go in the garage. Getting (analog) cable to it is easy; it comes into 
the house near my cabinet. But it is 60 feet to the back of my Sony Wega 
under the house, so I will have to figure out how best to get playback 
done.

I see expensive HDMI cables meeting my length requirements, but if the 
internal capacitance degrades the signal...
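As a back-of-the-envelope check on that worry, here is a sketch that treats 
the cable as a lumped RC low-pass formed by the driver's source impedance 
and the cable's total capacitance. The pF-per-foot and source-impedance 
numbers below are assumptions for illustration, not specs for any 
particular cable:

```python
import math

def rc_cutoff_hz(source_ohms, cable_pf_per_ft, run_ft):
    """-3 dB cutoff of the lumped RC low-pass formed by the driver's
    source impedance and the cable's bulk capacitance.
    (Rough model; a properly terminated line behaves better than this.)"""
    c_farads = cable_pf_per_ft * run_ft * 1e-12
    return 1.0 / (2 * math.pi * source_ohms * c_farads)

# Illustrative numbers (assumptions, not measurements):
# ~20 pF/ft is in the ballpark for cheap coax; 75 ohms is a common
# video source impedance.
for run_ft in (6, 60):
    print(f"{run_ft:3d} ft -> {rc_cutoff_hz(75, 20, run_ft) / 1e6:.1f} MHz")
```

Even with generous numbers, the -3 dB point drops tenfold going from a 
6 ft to a 60 ft run, which is why long unequalized runs look soft. A 
properly terminated transmission line fares better than this lumped model 
suggests, but the trend with length is the same.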

I already have one RCA video drop from my TV-out on my GeoVision home 
security system, and that looks just like the VGA screen, but then it is a 
security camera after all ;)

I will have S-Video output available on the G-Force card I will be using. 
I could hit my local Norvac/Greybar and find some quality cable that meets 
the conductor/shield requirements to make an HCC from wall to wall.

I tried testing with the 2.4 GHz video transmitters from X10/Radio Shack. 
Heh. The video was pretty good for small spans of time, but you have to 
curb your hunger for microwaved popcorn! ;) I don't think I saw more than 
3-7 contiguous pixels of real video when the wife fired up the TV dinner.

Now, what I am curious about is this idea of a remote client for playback. 
Can Myth TV integrate a larger capture system and have a smaller playback 
PC do the output?

Nick


-- 
Nick Ellson
CCDA, CCNP, CCSP, CCAI,
MCSE 2000, Security+, Network+
Network Hobbyist, VFR Private Pilot.


On Thu, 16 Mar 2006, Michael T. Dean wrote:

> On 03/16/2006 12:04 PM, Meatwad wrote:
>
>> Jesse Guardiani wrote:
>>
>>
>>> Steven Adeff wrote:
>>>
>>>> as well, don't expect a long run of any video cable to look great. You
>>>> may want to look into a putting together a remote frontend or see if
>>>> RokuMyth will work for you.
>>>>
>>> Is there actually signal loss over a digital cable? I would have thought
>>> it would be all or nothing...
>>>
>> Yes. Of most interest to us is the capacitance inherent in all
>> multiconductor cable, which increases as the run gets longer.
>> Signal is signal, be it analog or digital. Analog will decrease in
>> quality as the cable run gets longer. So will digital, but the receiver
>> will not care that the signal is degraded - up to a point. When that
>> threshold is reached, the digital receiver will not sync up and will
>> simply stop functioning, hence the "all or nothing".
>>
>>
>
> Although with video, you may get bit errors that prevent the decoding of
> individual frames or blocks within your video, so you see
> "artifacts"--and the number and frequency of artifacts increases
> (because of the increasing number/frequency of bit errors) until the
> stream is so corrupt it cannot be decoded at all.
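That gradual-then-sudden failure is easy to see in a toy model: flip bits 
at a given error rate and count how many frames a hypothetical per-frame 
error corrector can still save. The frame size and correctable-error count 
here are made-up numbers for illustration, not any real codec's:

```python
import random

random.seed(1)

FRAME_BITS = 2000    # bits per frame (toy value)
CORRECTABLE = 4      # errors our hypothetical ECC can fix per frame

def frames_ok(ber, n_frames=500):
    """Fraction of frames still decodable at a given bit-error rate."""
    ok = 0
    for _ in range(n_frames):
        errors = sum(random.random() < ber for _ in range(FRAME_BITS))
        ok += errors <= CORRECTABLE
    return ok / n_frames

for ber in (1e-4, 1e-3, 5e-3):
    print(f"BER {ber:g}: {frames_ok(ber):.0%} of frames decode")
```

At low error rates nearly everything decodes and you see only occasional 
artifacts; past the threshold almost nothing does - the "digital cliff" 
the quoted paragraphs describe.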
>
> See also
> http://www.gossamer-threads.com/lists/mythtv/users/138046#138046 (and
> the link within it) for reasons why digital is not necessarily better
> than analog (and, why analog is not necessarily better than digital).
> In the post, I talked about the problems with HDMI and DVI cables (= why
> long runs won't work) and the article at the link within talks primarily
> about component versus DVI/HDMI, but the same holds true for VGA versus
> DVI/HDMI...
>
> Mike
>
> _______________________________________________
> mythtv-users mailing list
> mythtv-users at mythtv.org
> http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
>