[mythtv] hardware questions

Ray mythtv-dev@snowman.net
Tue, 31 Dec 2002 16:06:10 -0700


On Tue, Dec 31, 2002 at 12:15:16PM -0600, m0j0.j0j0 wrote:
> > 
> > The file sizes are definitely going to be larger than mpeg4 at similar
> > resolutions, but the quality seems pretty darn good.  You also need a
> > reasonably hefty cpu to decode full-resolution mjpeg in real time (although
> > there may be some optimizations to be done in the decoding software).  TV-Out
> > was a pain to set up, but I'm reasonably happy with it now.
> 
> Do you think the lower CPU load from the hardware MJPEG encoding
> significantly outweighs the extra load added by doing real-time
> decoding? If in the end using a G200 doesn't really lower my CPU load
> much, I'd probably be better off buying a nice, simpler-to-set-up tuner,
> right?

Well, I only have Myth running on a dual Celeron 500 box (and temporarily on
a Duron 650), and I have to use decimation 2 (basically 352x240, but it looks
much better than software encoding at that res), so I can only speculate on
what cpu would be enough to decode at full res.  I'm guessing an Athlon 850
or so would be just good enough, but I also suspect there's some room for
improvement in the mjpeg decoding codec.  Even if it takes an Athlon 1GHz,
that would still be as low as or lower than what it takes to do software
mpeg4, especially when recording one show while fast-forwarding through
another.  Eventually I plan to add a second encoder card, so the difference
will be even greater then.

> 
> Can you clarify "reasonably happy" regarding the TV-Out? I'm wondering
> if I'm better off just buying a VGA-TO-NTSC converter than messing
> around getting TV-Out card like the G200 working.

Up until today, my TV and monitor were dropping into power-save mode when I
first started mythfrontend and when it would start a scheduled recording (but
not when starting live TV).  Changing resolutions would bring them back, and
setting "vdo_enable=0" in the script that loads the marvel drivers seems to
have fixed that permanently.
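For reference, the change amounts to one module option in the driver-load
script.  This is only a sketch from memory; the module name below is
illustrative (check what your marvel driver package actually installs), but
the "vdo_enable=0" option is the one that mattered:

```
# Load the Marvel capture module with video output disabled at load
# time; this stopped the monitor/TV dropping into power-save mode.
# (Module filename is a placeholder -- substitute your driver's.)
insmod marvel.o vdo_enable=0
```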

My TV still has a slight greenish or yellowish tint, especially in darker
scenes.  Initially this was much worse (and the screen was too dark with
not enough contrast), which I've mostly fixed using the maven-prog tool (see
the HOWTO posted to this list earlier).  This tool lets you set many of the
registers that affect TV-Out (contrast, hue, etc), but documentation for it
is limited and getting it right requires a lot of trial and error.  The
green tint was present under Windows too, so this isn't a Linux issue.  I
recently noticed another register that seems to help a lot, but now I've got
to go back and re-adjust the others.

I only have good modelines for the 800x600 and 640x480 resolutions.  If I
set X up to allow 1024x768 as well (which I want for some games) and just
switch to 800x600 for Myth, then Myth still produces a 1024x768 picture.
This is probably fixable, but I haven't gotten around to it.  If I remove
the 1024x768 modeline completely, then Myth is fine.
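In XF86Config terms, the workaround is just to leave 1024x768 out of the
Modes line for the TV-Out screen.  A sketch (the Identifier/Device/Monitor
names here are made-up placeholders; use whatever your config already
defines, with your own tested modelines):

```
# Listing only the modes with known-good modelines means X never
# offers 1024x768, so Myth can't end up there.
Section "Screen"
    Identifier "TV Screen"
    Device     "Marvel G200"
    Monitor    "TV"
    SubSection "Display"
        Depth 24
        Modes "800x600" "640x480"   # 1024x768 deliberately omitted
    EndSubSection
EndSection
```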

I also have just a tiny bit of noise on the TV, which may well be a poor
cable, but you really have to get up close to the TV to see it.

The bottom line is that I'm confident I'll get it perfect eventually, but
all the time I've spent tweaking the TV-Out stuff would have been better
spent on other things.  A good NTSC converter probably would have cost me
only $100 or so, and a crappy one that would have given me the quality I
have today would have cost maybe $40.  My business has been a little slow
this month, but not THAT slow.

-- 
Ray