[mythtv] ffmpeg SWSCALE!
Yeasah Pell
yeasah at schwide.com
Thu Aug 31 02:25:13 UTC 2006
Daniel Kristjansson wrote:
> On Wed, 2006-08-30 at 21:54 -0400, Yeasah Pell wrote:
>
>> Michael T. Dean wrote:
>>
>> But a monitor doesn't really have any such low pass filtering -- it's an
>> array of square (or rectangular) pixels, with hard edges.
>>
> You mean an LCD, a CRT monitor has many RGB components for each pixel,
> and the electron beam is in fact filtered.
>
>
Right, CRTs do have some amount of filtering built in, though its
character isn't necessarily exactly what you want -- it depends on the
shape of the pixel sub-elements and the dot pitch of the monitor
relative to the resolution of the image displayed. But I don't know many
people who still have a CRT as their main TV viewing device, so I wasn't
really thinking about that. :-) Most new displays (front and rear
projectors of various sorts other than CRT, direct LCDs and plasmas) are
discrete.
>> As a result,
>> the high-frequency aliasing is (and must be) fully present in the
>> displayed image. Depending on how high the resolution is, you can see
>> this aliasing easily -- but even though it is more noticeable in lower
>> resolution images, there's still just as much aliasing in higher
>> resolution images, it's just harder to see (use a loupe if you need :-)
>>
>
> You are confusing aliasing with other properties of the display.
> Aliasing is the spurious _low_ frequency you see when the number of
> samples is insufficient for the high frequencies in the signal.
> If there is no aliasing present in the transmitted signal, you
> will never get aliasing when you display the image so long as
> you do not re-sample the image. Of course, if you do re-sample
> the image, in theory you need to double the display resolution
> relative to the source image to fully resolve it, and in
> practice you need more than that. If you don't
> have such a high resolution display you need to bandlimit
> the image before scaling, i.e. you need to blur it.
>
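The quoted point about bandlimiting before scaling can be seen in a toy
1-D sketch of my own (illustrative, not from the thread): decimating a
tone that lies above the new Nyquist limit without filtering produces a
full-strength low-frequency alias, while even a crude 2-tap blur
attenuates it before decimation.

```python
import math

# Toy signal standing in for one image row: a cosine at 0.4
# cycles/sample is representable at the original rate (Nyquist = 0.5)
# but NOT after 2:1 decimation (new limit = 0.25 cycles/original-sample).
N = 64
f = 0.4
x = [math.cos(2 * math.pi * f * n) for n in range(N)]

# Naive 2:1 decimation: the 0.4-cycle tone folds to a 0.2-cycle tone
# at full amplitude -- a spurious low frequency, i.e. aliasing.
naive = x[::2]
alias = [math.cos(2 * math.pi * 0.2 * m) for m in range(len(naive))]
err = max(abs(a - b) for a, b in zip(naive, alias))
print(f"naive decimation matches a 0.2-cycle alias to within {err:.1e}")

# Bandlimit (blur) first: even a crude 2-tap average attenuates the
# tone by cos(pi*f) ~= 0.31 before decimating, shrinking the alias.
blurred = [(x[n] + x[n + 1]) / 2 for n in range(N - 1)]
filtered = blurred[::2]
print(f"peak after blur + decimate: {max(abs(v) for v in filtered):.3f}")
```

A proper scaler would use a better low-pass kernel than a 2-tap box,
but the mechanism is the same: blur first, then drop samples.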
The term "aliasing" is pretty overloaded; it can mean virtually anything
depending on the context. Granted, in video the word usually means
something else, but I was talking about the aliasing present in an
unfiltered signal reconstructed from discrete samples, i.e. in the
signal processing sense. I wasn't talking about the characteristics of
display devices at all, but rather thinking of them as idealized
discrete devices (which LCDs, plasmas, etc. approximate quite closely).
The question I was addressing is simply "do I need a greater resolution
discrete output device than my source signal in order to get the best
image possible from the source", not "is my CRT better than your LCD" or
anything else related to the specifics of any display device.
The answer I came up with is "if you have a properly set up viewing
distance given your eyes and the size of the display device, you do not."
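To make the "aliasing in an unfiltered reconstruction" idea concrete,
here is a small sketch of my own (the numbers are illustrative): model
an idealized discrete display as a zero-order hold, emitting each
sample as a flat rectangle. A DFT of the held signal then shows
spectral "image" copies of a low-frequency tone at high frequencies --
energy the source never contained, present purely because the
reconstruction is unfiltered.

```python
import cmath
import math

N, L = 32, 8          # source samples, hold factor ("pixel" width)
f_bin = 3             # source tone sits in DFT bin 3
src = [math.cos(2 * math.pi * f_bin * n / N) for n in range(N)]
held = [s for s in src for _ in range(L)]   # zero-order hold: repeat L times

M = N * L
def dft_mag(x, k):
    """Magnitude of DFT bin k of sequence x (length M)."""
    return abs(sum(v * cmath.exp(-2j * math.pi * k * n / M)
                   for n, v in enumerate(x)))

base = dft_mag(held, f_bin)        # the intended tone
image = dft_mag(held, N - f_bin)   # first spectral image, at bin N - f_bin
print(f"tone: {base:.1f}, first image: {image:.1f} "
      f"({100 * image / base:.0f}% of the tone)")
```

The hold's sinc-shaped rolloff attenuates the images but does not
remove them, which is why they are "fully present in the displayed
image" unless the viewing distance (or an optical blur) filters them
out.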