[mythtv] improving quality of capture/playback
Fri, 6 Dec 2002 11:40:31 -0500
> But now I want more. My in-laws have a Tivo (and a big screen tv) so I
> have seen the quality of the video. It seemed as good as regular tv to
> me (or good enough anyway).
Unfortunately, I have never seen a real-life TiVo, so take my opinion below
with a grain of salt. My only experience is comparing my TV's quality
directly against my MythTV quality.
> The capture that I get with my Pinnacle pctv pro however, while definitely
> watchable, has obviously gone through compression. How can I get a
> picture approaching the quality of Tivo? Is it a hardware limitation or
> software/CPU or both? Will it cost mucho $$$ to get there or can I just
> tweak some settings and or upgrade CPU? Perhaps just adjusting screen
I don't know what you're running hardware-wise, but I'm running a (probably
overkill) AthlonXP 1700+ for my MythTV machine, which sits at about 50% CPU
during normal TV watching. I 'nice'd the mythfrontend startup with a value
of -15 so the other server tasks don't interfere with my TV watching or
recordings.
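For reference, the renicing above can be done at launch time from Python as
well as from a shell. A minimal sketch (the `spawn_niced` helper is my own,
purely illustrative; negative niceness like the -15 here requires root):

```python
import os
import subprocess

def spawn_niced(cmd, niceness):
    """Launch cmd with its niceness adjusted before exec.

    Negative values (higher priority, like the -15 used for mythfrontend)
    need root; positive values (lower priority) work for any user.
    """
    return subprocess.Popen(cmd, preexec_fn=lambda: os.nice(niceness))

# As root, roughly equivalent to my setup:
# spawn_niced(["mythfrontend"], -15)
```

From a shell, the same effect is just `nice -n -15 mythfrontend`.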
So now to comment on quality. There are two categories I use to judge that:
1) Resolution/Quality Of Compression
This category includes the blurriness of lines, the appearance of
compression artifacts, etc.
On my machine, I am running mpeg4 with a resolution of 480x480. This gives
me the absolute best balance of quality vs. performance (and it keeps
everything at 50% usage). The picture appears very clean, contains a
tolerable amount of artifacting (see the next category for more comments on
artifacting), and I get reasonable disk usage from that (about 1.5 GB/hr).
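To put that disk figure in perspective, here's a quick back-of-the-envelope
check (my own arithmetic, not from any MythTV docs; I'm assuming 2^30 bytes
per GB and NTSC's 30000/1001 frame rate):

```python
# Average bitrate implied by ~1.5 GB per hour of recording.
gb_per_hour = 1.5
bits_per_hour = gb_per_hour * 2**30 * 8
avg_bps = bits_per_hour / 3600
print(f"average bitrate: {avg_bps / 1e6:.2f} Mbit/s")  # ~3.58 Mbit/s

# Bits the codec can spend per pixel at 480x480, 30000/1001 fps.
pixels_per_sec = 480 * 480 * (30000 / 1001)
print(f"bits per pixel: {avg_bps / pixels_per_sec:.2f}")  # ~0.52
```

Half a bit per pixel is comfortable territory for mpeg4, which matches the
"clean picture, tolerable artifacting" result.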
My only beef is that it is tough to actually attain as perfectly sharp an
image as live TV has. Of course, in most of my testing, I check xawtv
full-screen before MythTV full-screen to see the difference between
compressed and uncompressed sources. I don't see much blurring being added
by the compression, and blame most of the sharpness loss on the TV-out side.
Also, I had made an earlier post regarding "motion blur" happening with
compression. Apparently there are some strange things that happen when you
combine a 29.97fps source with a 30fps capture medium. Also, same goes for
that same 29.97fps source being displayed at 30fps. To make matters worse,
take a cartoon, originally 25fps, moved to 29.97fps, and then recorded at
30fps. That's where that wonderful motion blur begins to come into play.
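A sketch of the cadence math behind that blur (my own arithmetic; NTSC's
"29.97" is really 30000/1001 fps):

```python
# Capturing a 30000/1001 fps source at a flat 30 fps: the capture gains
# one frame on the source every 1/(30 - 30000/1001) seconds, so a frame
# gets duplicated or blended roughly twice a minute.
src_fps = 30000 / 1001
cap_fps = 30.0
secs_per_extra_frame = 1 / (cap_fps - src_fps)
print(f"one duplicated/blended frame every {secs_per_extra_frame:.1f} s")  # ~33.4

# A 25 fps cartoon pulled up to ~29.97 fps needs ~5 extra frames per
# second, i.e. roughly every 5th source frame is repeated or blended.
cartoon_fps = 25.0
frames_between_repeats = cartoon_fps / (src_fps - cartoon_fps)
print(f"a repeat roughly every {frames_between_repeats:.1f} source frames")  # ~5.0
```

So the 30 fps mismatch alone is subtle, but the 25-to-29.97 conversion
repeats a frame five times a second, which is exactly where the motion
blur becomes visible.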
Some comments regarding "real tv sync" were made on the list and I think
that would be a great thing to look into (but probably not very realistic
for the majority of us). Anyway, combine all that bad stuff with a motion
compensating compression medium and you get the worst of it all. In the
end, though, this is all somewhat pointless to worry about, since most
cable signals aren't that clean to begin with.
Unfortunately, I am just confined to a tuner, so I can't really comment
about the uglies I would imagine you'd get from digital cable and digital
satellite. Taking an original source that has already been compressed at
some point (MPEG-2?) and then recompressing it will also give you some fun
results, I would imagine.
2) Quality Of TV-Out and V4L
Here is where a lot of the magic has to happen. I have seen absolutely
fabulous results from merely tweaking my TV-out. Also, make sure you're
running your TV-out as close to the resolution of your TV input as possible.
No sense adding pixel scaling artifacts into the mix! (as well as more cpu
needed to scale...)
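That no-scaling advice can be checked mechanically. A small sketch (the
`scale_factors` helper is mine, purely illustrative):

```python
def scale_factors(capture, tvout):
    """Per-axis scale factors from capture resolution to TV-out mode.

    Anything other than 1.0 means pixels get interpolated on that axis,
    which adds both softness and CPU cost.
    """
    (cw, ch), (tw, th) = capture, tvout
    return tw / cw, th / ch

# A 480x480 capture on a 640x480 TV-out mode: horizontal interpolation
# only; a 480x480 mode would need none at all.
print(scale_factors((480, 480), (640, 480)))
```

The closer both factors are to 1.0, the fewer scaling artifacts end up in
the mix.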
When I first installed mythTV, I thought "this is as expected. completely
unwatchable compressed computer tv. yuck." The colours were washed out,
the compression artifacts were jumping right out at me, and everything was
so damned blurry!! ATI used to really boast about their crazy filtering to
get rid of the interlaced images and make everything look purdy on monitors.
Unfortunately, relaying that same signal out through your TV is just plain
nasty. You suddenly lose all your sharpness.
So I dealt with it. I worked really hard with xawtv open and the settings
panel tweaking everything while switching my tv back and forth between tv
and video (on the same channel, of course). I thought I had it as close as
I could get it.
So then I decided to try and get better tvout quality and picked up a
vga->tv scan converter. I didn't really see enough of an improvement to
justify an additional C$150 on it. So I finally found nvtvout (as a result
of a posting on this list) and was happy to see all sorts of crazy tweaking
I could take part in. This made me (the very nitpicky guy that I am) quite
happy. I fired it up and mucked around for the better part of 2 hours.
I was especially happy to see that I could actually decrease the built in
flicker filter and boost the sharpness on the nvtvout! Not to mention the
ability to set overscan on the display!! So now, with xawtv running, I can
say that the tv is about as close to the source as it'll get.
So once I tweaked *all* the colour settings, the blacks are as black as I
want, the colours are nice and vivid, and everything appears hunky dory.
When I fired MythTV up again, watching TV was fantastic. Everything looked
very close to the original source and I actually spend a lot more time in
mythTV watching tv than watching normal tv (which now only occurs when a
recording is in progress, and I want to watch a different show).
A side effect of using the nvtvout app is that my mame display is much nicer
now, too. I can tell the difference between orange and yellow bubbles in
puzzle bobble now!! :D
Anyway, I hope this might help your decision out a bit. I'm happy I didn't
go the route of trying to find the "perfect hardware" out there and got it
to work pretty damned well with the setup I've had sitting here for a year.
My trusty GeForce2MX400 card (Asus V7100Pro) with my ATI TV Wonder will
likely remain as parts of the myth machine spec I would like to come up with
in the future.
Maybe we should toss information like this into the docs somewhere. I'm
sure there are a lot of people looking to find information on how to tweak
their display quality without springing for either a) an HDTV or b) a
broadcast-quality VGA->TV scan converter.