[mythtv-users] [OT] LCD TV or LCD Monitor for Front-end

John Pilkington J.Pilk at tesco.net
Thu Feb 2 15:37:00 UTC 2012


On 31/01/12 20:45, Michael T. Dean wrote:
<snip>
>
> Now, because LCD is a terrible choice for video, the industry decided to
> come up with approaches to compensate for LCD's display characteristics
> to allow them to use LCD for video.  The first generation of
> TVs to compensate for LCD characteristics would display the pixel at
> full brightness for 1/2 (or some other fraction) of a frame duration and
> then display blackness for 1/2 of the duration (so, for a 60fps video,
> you'd see the pixel at full brightness for 1/120th of a second and at
> black for 1/120th).  However, this reduced the brightness of the display
> significantly, so some manufacturers started to display full brightness
> for 1/2 of a frame, then a dimmed version of the same pixel for 1/2 of a
> frame.  This reduced the brightness, but not as much as using full
> black.  Both of these approaches work very well, and due to persistence
> of vision, look very "natural."  Unfortunately, because there was a
> brightness loss (and because brightness in a store display is what sells
> a /lot/ of LCD screens) some manufacturers got the idea that it would be
> better to invent new pixels--i.e. try to interpolate pixel positions
> between frames--and display those at full brightness (so you got 1/120th
> second of the pixel specified in the video, then 1/120th second of the
> pixel they expected would have existed if the video were 120fps).  Some
> have taken it farther to try to get to 240Hz (inventing 3 pixels for
> every one specified in the video).
>
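
Just to pin down the arithmetic: here's a rough sketch in Python (the
60 fps source and the 2x/4x panel rates come straight from the
description above; the names are only illustrative):

SOURCE_FPS = 60                    # frame rate of the incoming video
frame_duration = 1.0 / SOURCE_FPS  # 1/60 s of display time per frame

# Black-frame insertion: half the frame time bright, half black (or
# half bright, half dimmed), i.e. 1/120 s each for a 60 fps source.
bright = frame_duration / 2
black = frame_duration / 2
print("BFI: %.2f ms bright + %.2f ms black" % (bright * 1e3, black * 1e3))

# Motion interpolation: a 120 Hz panel shows the real frame plus one
# invented frame; a 240 Hz panel shows the real frame plus three,
# all at full brightness.
for panel_hz in (120, 240):
    invented = panel_hz // SOURCE_FPS - 1
    sub_frame_ms = 1000.0 / panel_hz
    print("%d Hz panel: 1 real + %d invented frames, %.2f ms each"
          % (panel_hz, invented, sub_frame_ms))

Nothing deep, but it makes the trade-off obvious: BFI halves the lit
time, interpolation keeps all of it.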

<more snip>

>
> Mike

OK, so that's why in Europe we see TV displays labelled as 100 Hz or 
200 Hz on the up-from-basic LCD models.  How does this affect 
deinterlacers and 
other Myth frontend goodies on 720x and 1080x?  I suppose the display 
box still gets its drive at 50 Hz - or am I wrong about that?  Any good 
references that aren't obscured by brand-name puffery?
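
If I'm right about that, the only thing the frontend actually controls
is the mode it drives out over HDMI/VGA; the panel's 100/200 Hz
processing happens after the fact.  A quick way to see what X is really
sending, assuming an X session with xrandr available (a rough sketch,
nothing Myth-specific):

import subprocess

# xrandr marks the mode currently being driven with a '*', so this just
# shows the resolution/refresh the frontend is sending; any 100/200 Hz
# processing the panel does internally won't show up here.
out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "*" in line:
        print("currently driving:", line.strip())

Whatever that reports is all the deinterlacers ever see; the extra
frames are invented downstream in the TV.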

It still seems to me that if the manufacturer's solution is good for an 
OTA digital input, it should be good for an Ethernet input too, but I 
suspect that may not be the commercial view.  Any evidence on that?

John P
