[mythtv] ffmpeg SWSCALE!

Yeasah Pell yeasah at schwide.com
Thu Aug 31 01:54:00 UTC 2006


Michael T. Dean wrote:
>
> True, but if you're processing the images correctly, you're not making 
> up information (nor are you "upsampling"--i.e. 1 pixel becomes 4 or 9 or 
> whatever).  Notice, however, the "If done right" in my original statement.
>
> A digital image defines a set of samples, which can be displayed as is 
> (i.e. a 1:1 pixel mapping) to /approximate/ the original image, or can 
> be used with a proper reconstruction algorithm to recreate the original 
> image /function/ from which the samples were taken (where the image 
> function is a continuously-defined function, as opposed to the digital 
> samples in the image itself).  If the image function is recreated, it's 
> possible to resample the function for the desired number of pixels.  In 
> doing so, no information is "made up"--instead, the information that's 
> encoded into the original samples is just better applied.
>
> And, as mentioned in reply to your post in the thread, "Graphics card 
> recomendation" ( 
> http://www.gossamer-threads.com/lists/mythtv/users/220344#220344 --mine 
> will appear somewhere below there once the archive catches it), a 
> display device needs at least 2x the pixels on each axis to fully 
> represent the information in a given image.  So, a 720x480 DVD needs at 
> least 1440x960 pixels.  And, a 720p TV has 1280x720 pixels, so you'd 
> need a 1080p TV (with 1920x1080 pixels) to fully appreciate the images 
> on the DVD.
>
> (BTW, the reply in the other thread is one which your laughter prompted 
> me to actually send, after it sat in my Drafts folder for a day and a half.)
>
> In the real world, it's impossible to recreate the image function 
> exactly from the samples that are taken--primarily because in the real 
> world, signals are rarely band-limited.  Nonetheless, there is 
> additional information in the image that can be extracted through an 
> appropriate reconstruction process (and can result in a much better 
> image given 4x the pixels for output).  Once you've reached a 
> sufficiently-higher resolution than the image resolution (somewhere 
> around 2x the pixels in each dimension), you will no longer see an 
> improvement in picture quality.  At this resolution, you've used all the 
> available information in the signal and eventually you will begin to 
> bring out the artifacts/aliasing in the image.  But, as mentioned, 
> 1920x1080 works quite well for DVDs...
>   
This is very interesting; for some reason I've never thought of video as 
an array of samples taken from a continuous image. It never would have 
occurred to me that video at resolution n x m would not be optimally 
viewed on a display with exactly that resolution, but when you think of 
it as the sampling of a two-dimensional continuous signal, it makes 
perfect sense.
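
To convince myself, here's a rough numpy/scipy sketch of that idea -- 
purely illustrative, nothing to do with MythTV or swscale internals: 
resampling a band-limited set of samples to a higher rate reproduces 
the underlying function rather than making anything up.

import numpy as np
from scipy.signal import resample

# One "scanline" of a band-limited image: a couple of sinusoids, both
# well below the Nyquist limit of the 32-sample version.
def f(t):
    return np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)

n = 32
t_n = np.arange(n) / n          # positions of the stored samples
samples = f(t_n)                # the discrete samples we actually keep

m = 128                         # resample to 4x as many points
t_m = np.arange(m) / m
recon = resample(samples, m)    # Fourier-based (sinc-like) reconstruction

# The resampled points land on the original continuous function to within
# rounding error -- the information was already in the 32 samples.
print(np.max(np.abs(recon - f(t_m))))   # on the order of 1e-15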

It's pretty easy conceptually in the world of audio. You start with a 
continuous analog signal, and when you digitize it you take discrete 
points along that signal. That's all well and good, but if you were to 
play it back by just stair-stepping an output signal according to the 
values you digitized, there would be all sorts of high-frequency 
aliasing on all those hard edges of the stair-steps. So conceptually 
the signal is run through a low-pass filter (not necessarily literally; 
there are many ways to do this) designed to pass everything up to the 
Nyquist frequency and reject anything above it -- the result is a close 
approximation of the original smooth signal, without the jagged steps.
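
A quick numeric check of that (hypothetical numbers, same numpy/scipy 
sketch style as above): the stair-stepped output really does carry 
energy above the original Nyquist frequency, and a band-limited 
resample of the same samples doesn't.

import numpy as np
from scipy.signal import resample

fs = 8000                           # original sample rate (Hz)
t = np.arange(fs) / fs              # one second of signal
x = np.sin(2 * np.pi * 440 * t)     # 440 Hz tone, well below Nyquist (4 kHz)

up = 4                              # pretend the output stage runs 4x faster
stair = np.repeat(x, up)            # zero-order hold: the "stair-step" output
smooth = resample(x, len(x) * up)   # low-pass (band-limited) reconstruction

def energy_above(sig, rate, cutoff):
    """Fraction of spectral energy above 'cutoff' Hz."""
    spec = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(len(sig), 1.0 / rate)
    return spec[freqs > cutoff].sum() / spec.sum()

print(energy_above(stair, fs * up, fs / 2))    # a couple of percent
print(energy_above(smooth, fs * up, fs / 2))   # essentially zero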

But a monitor doesn't really have any such low-pass filtering -- it's an 
array of square (or rectangular) pixels with hard edges. As a result, 
the high-frequency aliasing is (and must be) fully present in the 
displayed image. At lower resolutions you can see this aliasing easily; 
and even though it is more noticeable in lower-resolution images, 
there's just as much aliasing in higher-resolution images -- it's simply 
harder to see (use a loupe if you need to :-)

In any case, the aliasing can be removed the same way it is in the 
audio world: by stripping out the high-frequency components above the 
Nyquist rate of the image. However, since pixel-based display devices 
are discrete by definition, there is no way to do this without 
displaying the result on a discrete device with much higher resolution, 
essentially using the extra resolution to remove the high-frequency 
aliasing that would otherwise be unavoidable.
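
In picture terms the same sketch looks something like this: pixel 
replication is the two-dimensional stair-step, while resampling to the 
higher output resolution through a reconstruction filter is the 
low-pass step. (An idealized Fourier resample stands in for the filter 
here -- real scalers use kernels like bicubic or Lanczos, but the 
principle is the same.)

import numpy as np
from scipy.signal import resample

# A synthetic band-limited 480x720 "frame": one vertical and one
# horizontal spatial frequency, both far below the sampling limits.
y = np.arange(480) / 480
x = np.arange(720) / 720
frame = np.sin(2 * np.pi * 30 * y)[:, None] * np.cos(2 * np.pi * 50 * x)[None, :]

# Hard-edged pixel replication: every source pixel becomes a 2x2 block.
# This is exactly the stair-step case -- the block edges are the aliasing.
blocky = np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

# Band-limited reconstruction to 960x1440: low-pass resampling along
# each axis, using the extra output resolution to avoid the hard edges.
smooth = resample(resample(frame, 960, axis=0), 1440, axis=1)

# The smooth version matches a direct 960x1440 sampling of the same
# underlying pattern; the blocky one is noticeably off.
Y = np.arange(960) / 960
X = np.arange(1440) / 1440
target = np.sin(2 * np.pi * 30 * Y)[:, None] * np.cos(2 * np.pi * 50 * X)[None, :]
print(np.max(np.abs(smooth - target)))   # ~1e-13
print(np.max(np.abs(blocky - target)))   # around 0.3-0.4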

What I'm not convinced of, though, is that the visual perception of the 
aliasing is a major factor at the kinds of resolutions we're talking 
about. The frequency of the aliasing is by definition higher than that 
of the sampled image (here, higher frequency means a smaller displayed 
object), and assuming you have set up your screen size correctly (such 
that your eye's limit of detail resolution matches that of the display 
device), you wouldn't even see the higher-frequency aliasing. In fact, I 
would suggest that a correctly set-up system *does* have a low-pass 
filter: your eyeball.
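
A back-of-the-envelope check of that, assuming the usual rule of thumb 
of roughly 1 arcminute of resolution for normal (20/20) vision and a 
made-up 42" 16:9 panel (about 0.93 m wide):

import math

ACUITY_RAD = math.radians(1.0 / 60.0)   # ~1 arcminute, assumed 20/20 acuity

def matched_distance(panel_width_m, horizontal_pixels):
    """Viewing distance at which one pixel subtends one arcminute."""
    pitch = panel_width_m / horizontal_pixels     # pixel pitch in metres
    return pitch / math.tan(ACUITY_RAD)

# Hypothetical 42" 16:9 panel, roughly 0.93 m wide.
print(matched_distance(0.93, 1280))   # ~2.5 m for a 720p panel
print(matched_distance(0.93, 1920))   # ~1.7 m for a 1080p panel

Sit at (or beyond) that distance and anything finer than one pixel -- 
including the aliasing riding on the pixel structure -- falls below 
what the eye can resolve, which is all I mean by the eyeball being the 
low-pass filter.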

Of course, people's eyes vary, and one is not always able to get the 
ideal screen size or viewing distance for various real-world reasons. 
But if the choice is between setting up a display with 4x the pixels of 
the source content and setting up a correctly sized and distanced 
screen, I would think the latter would inevitably be more cost-effective 
(I don't even want to know what the price of a display with 4x the 
pixels of 1080p would be -- 3840x2160? ha!)

-y

