[mythtv-users] Graphics card recommendation
Brad DerManouelian
myth at dermanouelian.com
Thu Aug 31 01:40:39 UTC 2006
On Aug 30, 2006, at 5:59 PM, Michael T. Dean wrote:
> On 08/28/06 08:35, Steven Adeff wrote:
>
>> On 8/28/06, Dylan Semler <dylan.semler at gmail.com> wrote:
>>
>>> On 8/27/06, Steven Adeff <adeffs.mythtv at gmail.com> wrote:
>>>
>>>> On 8/27/06, Dylan Semler <dylan.semler at gmail.com> wrote:
>>>>
>>>>> My highest ambitions would have me take the two monitors apart,
>>>>> perhaps do a little machining, and rig them up so they look like
>>>>> one very big, very wide monitor. Though I guess the chances that
>>>>> this is even possible are very, very, very small.
>>>>>
>>>> just get a bigger monitor/HDTV.
>>>>
>>> I haven't looked extensively, but it seems that any monitors I'd
>>> find in the 2200x1200 range would be prohibitively expensive.
>>>
>> But you don't need that resolution for Myth... at most you could use
>> 1920x1080 (1080p).
>>
>
> Or, really, somewhere around 3840x2160 would be an appropriate output
> resolution for a 1080i/p input... In addition to the fact that (as
> previously mentioned by Daniel) 3840x2160 is an integral multiple of
> both 1280x720 and 1920x1080 (3840/1920 = 2160/1080 = 2 and
> 3840/1280 = 2160/720 = 3, so each source pixel maps to an exact 2x2
> or 3x3 block of display dots), sampling theory (specifically, the
> reconstruction part of it) says that the output resolution must be
> greater than the input resolution to fully represent the detail in
> the image. As a general rule of thumb, the output needs at least 2x
> the number of pixels on each axis (i.e. 4x the pixels of the input
> signal).
>
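> As a toy illustration of reconstruction (a sketch in Python/numpy,
> nothing Myth-specific), you can rebuild the continuous function from
> its samples with sinc kernels (Whittaker-Shannon interpolation) and
> then resample it on a denser grid; the extra output samples genuinely
> reflect what the function does between the original sample points:
>
>     import numpy as np
>
>     def sinc_resample(samples, upscale):
>         """Reconstruct a band-limited signal from its samples
>         (Whittaker-Shannon interpolation), then resample it at
>         'upscale' times the original rate."""
>         n = np.arange(len(samples))
>         # Output positions, in units of the input sample spacing
>         t = np.arange(len(samples) * upscale) / upscale
>         # Each output value is a sinc-weighted sum of all inputs
>         return np.array([np.sum(samples * np.sinc(ti - n)) for ti in t])
>
>     # A low-frequency cosine sampled comfortably above Nyquist
>     x = np.cos(2 * np.pi * 0.2 * np.arange(16))
>     y = sinc_resample(x, 4)  # 4x as many output points
>
> The same idea, applied per axis in two dimensions, is what a good
> video scaler approximates with windowed-sinc (e.g. Lanczos) filters.
>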
> Yes, I know I took a lot of heat last time I said this, but last time
> I was on the road for a few weeks and didn't have the time to "prove
> it," so I eventually just let it drop. In fact, the truth is, I'm not
> smart enough to prove it, but I do ask that before anyone writes back
> with messages saying, "How can you need more than 1920x1080 pixels to
> display a 1920x1080 image?" or "Well, actually, if you upscale the
> image, you're just 'inventing' new information, meaning it [adversely
> affects picture quality|displays a made-up image, not the captured
> image]," they read some of the many, many good books on sampling
> theory and reconstruction theory. Or, if you feel you must disagree
> with me and don't feel like reading up on reconstruction theory, just
> write me off as some crazy guy who shouldn't be given e-mail access.
> (Oh, and make sure you post all sorts of messages about how crazy the
> marketing guys must think we are when they start trying to sell
> 3840x2160 displays even though ATSC defines a maximum resolution of
> 1920x1080. I always enjoy a good rant--especially one with an "inside
> joke.")
>
> Boiled down to basics, there's a difference between image pixels
> (which are truly "picture elements": samples of a picture taken at a
> point of zero size) and display "pixels" (which are really "dots"
> with a physical area). The picture elements are created by sampling
> a continuously-defined image function. Although you can display an
> image by painting dots with the same values and at the same positions
> used to generate the picture elements (i.e. "1:1 pixel mapping"), you
> can create a much better image by recreating the image function,
> sampling it at additional positions, and incorporating that extra
> information about the image function into the final display pixel
> values.
>
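> In practice, that is exactly the job of a resampling filter. A
> minimal sketch using PIL/Pillow (the filenames are made up; assume a
> 1920x1080 source frame):
>
>     from PIL import Image
>
>     # A hypothetical 1920x1080 source frame
>     frame = Image.open("frame_1080p.png")
>
>     # 1:1-style doubling: each picture element just becomes a 2x2
>     # block of identical display dots -- no new information is used
>     blocky = frame.resize((3840, 2160), resample=Image.NEAREST)
>
>     # Reconstruction-based upscale: a windowed-sinc (Lanczos) filter
>     # approximates the underlying image function and samples it at
>     # the new dot positions
>     smooth = frame.resize((3840, 2160), resample=Image.LANCZOS)
>
>     blocky.save("upscaled_nearest.png")
>     smooth.save("upscaled_lanczos.png")
>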
> Oh, and, of course, dots aren't "just" dots. Each display pixel's
> brightness can vary in intensity across its "surface" (as on a CRT),
> and display pixels can vary in intensity across the display (i.e.
> have a directionality, as on an LCD). And... But that's a whole
> different argument.
>
> Unfortunately, there aren't many good sources on the 'net. Why? Who
> knows? Perhaps computer programmers are too smart to be fooled by all
> those complex mathematical formulae. Fortunately, though, the ideas
> have been incorporated into the algorithms we use every day (in
> image-processing libraries, in printers/printer drivers, and even in
> graphics hardware).
>
> Mike
I challenge anyone to tell the difference between 1080p content
displayed at 1920x1080 and the same content upscaled to 3840x2160
without literally putting the two side by side and deliberating.
Further, I challenge anyone to be able to afford a 3840x2160 display
and a video card to drive it. :) I imagine the difference is
comparable to that between an SA-CD and a DVD-Audio disc. I own both
and can't tell which has the "better" resolution without googling for
the answer, but both are much better than a regular CD (which would
be standard-def TV in this analogy).