[mythtv-users] MythTV looks grainy on big screen TV

Brian Wood beww at beww.org
Thu May 25 01:37:25 UTC 2006


On May 24, 2006, at 6:37 PM, moogie wrote:

>
> MythTV box:
> PVR-350 with straight Comcast coaxial connection (No
> set top box, Limited Basic and Expanded Basic
> Service.)
> nVidia FX 5200 with VGA, svideo and DVI
> DVI-HDMI cable connects to
> Sony KDF60WF655 LCD Rear Projection HDTV with HDMI
>
> MythTV has all the channels that I get from Comcast.
> Verified this by comparing channels to an old
> (non-Myth) TV in another room.
>
> Program Guide, Recordings, ... all working fine.
>
> Got the xorg.conf working just right.
> The only Modeline reported by the xorg log file is used in
> the xorg.conf:
> Modeline  "1920x1080" 74.250 1920 2008 2052 2200 1080
> 1084 1094 1124 interlace +hsync +vsync
>
> Experimented with nvidia-settings and
> loaded the ".nvidia-settings-rc" in ".xsession" with:
>    nvidia-settings --load-config-only &
>    startxfce4
>
>
> The video stream and text quality are a bit grainy.
> Not pixelated, not choppy.
> This is true for every channel and on screen
> displays like CNN News Ticker, Myth overlays, etc.
>
> What else can I do to improve this?
>
> This Sony has several inputs:
> The 1st HDMI input is mentioned above => PVR-350
> tuner.
> The 3rd coax input is (also) connected to Comcast
> directly => Sony native tuner.
>
> When you compare the same channel on the 1st and
> 3rd inputs, it is obvious that the channel on the
> 1st (HDMI) input is grainy.

If the Myth overlays are noisy/grainy you may have a connection
problem. The suggestions below relate mostly to the cable signal, but
noise in that signal can sometimes make the overlays look bad as well.

Check your record profiles. Even when you are watching live TV you  
are "recording". Increasing the bitrate(s) will improve the video  
quality, at the cost of storage space of course.
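
On a PVR-350 the MPEG-2 bitrates live in the recording profiles,
which you can edit in the frontend under Setup -> TV Settings ->
Recording Profiles. If you'd rather peek at them from a shell,
something like this should work against the stock mythconverg
database (a sketch; the exact parameter names, e.g. mpeg2bitrate,
vary a bit between MythTV versions):

   mysql -u mythtv -p mythconverg -e "
     SELECT p.name AS profile, c.name AS param, c.value
       FROM recordingprofiles p, codecparams c
      WHERE p.id = c.profile
        AND c.name LIKE '%bitrate%';"

Bumping the average/max bitrates up (via the frontend is the safest
way to change them) and re-testing is a quick way to see how much of
the grain is just the encoder.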

Make sure your capture card is getting adequate signal, in other  
words you don't have multiple splitters feeding it. Some folks have  
stated here that they believe capture cards require more signal than  
a regular TV set and, while I do not particularly subscribe to that  
idea, there might be something to it, one never knows :-)
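
For a rough idea of what splitters cost you: a typical 2-way splitter
drops about 3.5 dB per output leg, and the losses add up as you
cascade them:

   one 2-way splitter:             about -3.5 dB
   two 2-way splitters in series:  about -7.0 dB
   one 4-way splitter:             about -7.0 dB per leg

If the Myth box is at the end of a couple of splits, try moving it to
a stronger leg (or behind a small drop amplifier) and see whether the
grain changes.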

Make sure you have good quality cable feeding your card, and that the  
connectors are properly applied (ie: not crimped with pliers or your  
teeth). Make sure the cables are no longer than necessary.

Make sure your capture card is using the correct frequency table
(STD, IRC or HRC), as a slightly mis-tuned channel can be noisy. I
have never had to fine-tune individual channels, but some people here
have reported that doing so was necessary to get good pictures.
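
If you want to rule the frequency table in or out without going
through mythtv-setup each time, you can tune the card directly and
grab a short clip to compare. A sketch using the ivtv-tune utility
from the ivtv tools (the device node and channel number are examples,
and the table names may differ slightly with your ivtv version):

   # Tune channel 23 with the standard US cable table,
   # then grab ~10 MB of the card's MPEG-2 output:
   ivtv-tune -d /dev/video0 -t us-cable -c 23
   dd if=/dev/video0 of=std.mpg bs=64k count=160

   # Repeat with the HRC table and compare the two clips:
   ivtv-tune -d /dev/video0 -t us-cable-hrc -c 23
   dd if=/dev/video0 of=hrc.mpg bs=64k count=160

MythTV's own table is set in mythtv-setup's General screen (it ends
up stored as the FreqTable setting in the database, if I remember
right).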

These are all obvious things, but you'd be surprised how many people  
miss them.

While I can tell the difference between my Myth output and a direct
analog cable signal, most people can't; the two are very close.
Certainly going through the Myth system should not degrade the
picture enough to cause complaints.

