[mythtv] Trading CPU for disk (capture using low cpu codec?)

Scott and Jill Gargash mythtv-dev@snowman.net
Thu, 12 Dec 2002 08:29:06 -0700

> Bruce Markey wrote:
> > Scott and Jill Gargash wrote:
> >
> > But there are good reasons to trade IO for CPU cycles beyond saving
> > $$$s.  The less you need to do in real-time the better, especially if
> > you don't want to have howling fans in your living room. 
> I agree about the fans but I think the more you get done in
> real-time the better. 

We may just have to agree to disagree, but demanding realtime
performance from an application running on an arbitrary end-user
system with a non-realtime OS seems risky.  This doesn't mean you
wouldn't like to get as much done in real-time as possible, but
you shouldn't require it.  And even with an RTOS, you'd still like to
keep as much out of the realtime domain as possible.  It makes the
system much easier to get right and to validate.   

That doesn't mean you don't want to run as fast as possible, which
might even be real-time, but it would be nice to require as little true
realtime operation as possible.  I don't like dropped frames, and I'd
like to be able to use my computer for other things while it's
capturing.
Of course, it's not like this is a medical device or anything.  A
dropped frame won't kill anyone. 

> However, I first want to say that I
> think the best solution is Linux support for tuner cards with
> on-board compression. 

I waver on this.  Certainly at any particular point in time, custom hw
will be more efficient, but as soon as a new codec comes out (DIVX vs
MPEG-2) all of the custom hardware is useless.  Software can be
upgraded.
FWIW, the general consensus among the HTPC people (avsforum.com) is
that if you have the CPU, software decoding of DVDs gives better image
quality than HW decoding.  

> My neighborhood computer stores
> don't sell CPUs <1.2GHz anymore and these are fast enough to
> easily encode good MPEG4 files. 

Have you seen the new mini-ITX motherboards with the built-in ~900 MHz
low-power processor?  They don't require any active cooling.  Very
neat, and they would make a perfect settop box, but while they don't
have the CPU to do high quality realtime encoding, they should have
the CPU to do high quality decoding in realtime. 

> Good stuff but if we're talking about re-encoding to do very
> good motion compression wouldn't we be better off trying to
> do it on CPUs faster than 500MHz? $51 buys a CPU ~3X faster
> (http://www.pricewatch.com/).

Processing always expands to fill the time available.  You can always
use more cycles.  And, to put it another way, if I record 4 hours of TV
a day, that still leaves 20 hours of idle CPU time.  Regardless of the
speed of my processor, why not use that time to get the best quality
for a given amount of space?  And if you're going to reencode, you'll
want the source material (now 1 generation removed from the actual
source) to be as close to the original as possible.  And I can easily
imagine an acausal compression algorithm, one that looks ahead at
frames that haven't been displayed yet.  You can't do that in
real-time regardless of how fast your CPU is.
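To make the acausal point concrete, here's a toy Python sketch of the
idea behind two-pass encoding (the function name and the proportional
allocation rule are my own illustration, not any real codec's
internals): pass one measures the complexity of every frame in the
whole recording, pass two spreads the bit budget accordingly.  A
realtime encoder can't do this because it never sees future frames.

```python
def two_pass_bit_allocation(frame_complexities, total_bits):
    """Toy acausal (two-pass) rate control.

    Pass 1 (done elsewhere) has measured a complexity score for every
    frame in the *entire* recording.  Pass 2, below, hands each frame
    a share of the total bit budget proportional to its complexity,
    so hard scenes get more bits and easy scenes fewer.
    """
    total = sum(frame_complexities)
    return [total_bits * c / total for c in frame_complexities]

# One simple scene (complexity 1) and one busy scene (complexity 3):
# the busy scene gets three quarters of the budget.
budget = two_pass_bit_allocation([1, 3], 8)
```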

> > And the amount of video processing can be greater than just
> > compression/decompression. Is anyone else using a progressive scan
> > monitor for playback? Some deinterlacing routines can be very compute
> > intensive. 
> MythTV uses a linear blend filter by default that can be
> turned on or off in the settings file. Look for "Deinterlace".
> This uses MMX and doesn't seem to impact CPU usage very much.

Check out deinterlace.sourceforge.net.  It may have improved in
efficiency since I last tried it (about a year ago), but back then I
couldn't get the higher-quality deinterlacing routines to run in
realtime on a (Windows-based) P800 doing nothing but the deinterlace
code.  That certainly seemed expensive.
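For reference, the linear blend filter mentioned above is about the
cheapest deinterlacer there is: each output scanline is just the
average of a line and the one below it, which is why the MMX version
barely registers on the CPU.  A rough Python sketch of the idea (my
own illustration on grayscale rows, not MythTV's actual MMX code):

```python
def linear_blend_deinterlace(frame):
    """Blend each scanline with the scanline below it.

    frame is a list of rows, each row a list of grayscale pixel
    values.  Averaging adjacent lines smears the comb artifacts that
    interlaced fields produce on a progressive display.  The last row
    has no line below it, so it is blended with itself (unchanged).
    """
    height = len(frame)
    out = []
    for y in range(height):
        below = frame[min(y + 1, height - 1)]
        out.append([(a + b) // 2 for a, b in zip(frame[y], below)])
    return out
```

The heavier routines in the deinterlace project (motion-adaptive,
edge-directed interpolation) decide per pixel how to interpolate,
which is where the realtime cost comes from.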