[mythtv-users] Re: Re: Best video capture resolution for output to TV?

Joseph H. Fry joe at thefrys.com
Sat May 17 21:59:49 EDT 2003


Question: would it be correct to assume that if you are using an MPEG
video source (i.e. digital cable or satellite) and you set your system to
encode at the same bitrate and resolution that your receiver is decoding,
you will get a signal equal to the output of your receiver, other than
interference in the analog portion between the receiver and your Myth
system?  Has anyone tried to find out what settings the MPEG stream is
broadcast at and replicated them to see if that offers an improvement
over custom settings?

I know it seems silly, but I have noticed that decoding an MP3 to an
uncompressed .wav file and then re-encoding it can significantly reduce
the quality of the second MP3 compared to the original.  However, just
for kicks, I once encoded a file, decoded it, and re-encoded it using the
original's settings, and got an exact duplicate of the final copy; even
the file size was the same. (Of course, this is digital to digital to
digital.)

So perhaps any imperfections that the original MPEG compression
introduces on the broadcaster's end can sometimes be amplified when
re-encoding with different compression settings, whereas they may be
merely duplicated when compressing with the exact same settings. (Of
course, this is now digital to analog to digital.)

Sorry to be so wordy; I just didn't want to have to clarify later.

Joe

-----Original Message-----
From: mythtv-users-bounces at snowman.net
[mailto:mythtv-users-bounces at snowman.net] On Behalf Of Bruce Markey
Sent: Friday, May 16, 2003 10:49 PM
To: Discussion about mythtv
Subject: Re: [mythtv-users] Re: Re: Best video capture resolution for
output to TV?

Allen T. Gilliland IV wrote:
>>Good point (if I understood you correctly =). The bit
>>rate also has an impact on how much detail is preserved
>>during compression. In testing I found that given a
>>medium res and medium bit rate, raising the bitrate
>>improved the picture quality more than raising the
>>resolution to hit a target file size.
> 
> 
> Actually, the bitrate is exactly what controls the
> quality of your recording.  Basically, uncompressed
> video = video at max bitrate.  Essentially, when you
> compress your videos you can think of it in terms of
> compressing each frame individually.  The bitrate
> controls how much data or storage space you are
> willing to commit to a single second of video.  So if
> your bitrate is 3000 kbit/s @ 30 fps, then you are
> using 100 kbits per frame.
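
[The per-frame arithmetic in that last sentence can be sketched in a
couple of lines of Python, using only the numbers given above:]

```python
# Per-frame bit budget: total bitrate divided by frames per second.
def kbits_per_frame(bitrate_kbps, fps):
    return bitrate_kbps / fps

# Example from the quoted text: 3000 kbit/s at 30 fps.
print(kbits_per_frame(3000, 30))  # 100.0 kbits per frame
```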

It sounds like what you are describing is M-JPEG. MPEG2
and MPEG4 encode the differences from one frame to the
next, so the bitrate can be a small fraction of the actual
data in a frame. If there is very little motion, a low bit
rate usually gives you a frame pretty close to the
original. However, if there is a lot of motion and the
differences are greater than can fit into the bitrate, the
changes are simplified, which causes artifacts. For a given
target file size, if there is a lot of motion, lower res
and a higher scaled bitrate will look better. With little
motion, higher res and a lower scaled bitrate will look
better. The Myth default of 2200 is pretty low, so I think
most people will see a bigger improvement by raising the
bitrate than by raising the resolution.
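
[To see roughly why an MPEG bitrate can be a small fraction of the raw
frame data, here is a back-of-the-envelope Python sketch. The 480x480
capture size and YUV 4:2:0 sampling (12 bits per pixel) are assumptions
for illustration, not figures from this thread:]

```python
# Raw video data rate for planar YUV 4:2:0, in kbit/s.
def raw_kbps(width, height, fps, bits_per_pixel=12):
    # YUV 4:2:0 averages 12 bits per pixel (8 luma + 4 chroma).
    return width * height * fps * bits_per_pixel / 1000

raw = raw_kbps(480, 480, 30)   # 82944.0 kbit/s uncompressed
ratio = raw / 2200             # vs. the Myth default of 2200 kbit/s
print(round(ratio, 1))         # about 37.7:1 compression
```

When motion is low, frame-to-frame differences are tiny and that ratio is
easy to hit; when motion is high, the encoder has to throw detail away to
stay inside the bit budget, which is the artifacting described above.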

--  bjm

_______________________________________________
mythtv-users mailing list
mythtv-users at snowman.net
http://lists.snowman.net/cgi-bin/mailman/listinfo/mythtv-users


