> You would have scaling issues there too, because scaling a 720p signal
> to 2160p would have to create pixels "out of nowhere", which is only
> just asking for noise. Sadly, there is no perfect solution other than
> a display that can output at whatever framerate it is given and for
> all sources to be of the same pixel resolution. I vote 1080p for
> everything ;-)
You could always just triple or double the size of each pixel, so you
still get a pristine 720p or 1080p image, just with bigger pixels made
up of nine or four smaller display pixels.
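Something like this, roughly (a quick numpy sketch, not taken from any
real scaler, just to show what "bigger pixels" means):

import numpy as np

def integer_upscale(frame, factor):
    # Replicate each source pixel into a factor x factor block.
    # factor=3 takes 720p to 2160p, factor=2 takes 1080p to 2160p.
    # No new pixel values are invented, so the image stays as sharp
    # (and as blocky) as the source.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)
frame_2160p = integer_upscale(frame_720p, 3)   # shape (2160, 3840, 3)
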
Of course, that would be jaggy at some level, so you can use the extra
resolution to smooth out the jaggies. I think that is roughly what
anti-aliasing does on your GPU, except there the geometry is rendered
at the high resolution and downsampled to the display resolution. For
a giant screen running at quad 1080p, this would mean taking a
lower-res signal and upscaling it to a higher-res display.
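If I have the AA part right, that direction looks something like this
(hand-wavy numpy sketch, assuming the factor divides the frame size
evenly):

import numpy as np

def box_downsample(frame, factor):
    # Average each factor x factor block into one output pixel, which is
    # roughly what supersampling AA does: render the geometry at the
    # higher resolution, then filter down so edges get intermediate
    # values instead of hard jaggies.
    h, w, c = frame.shape
    blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

hires = np.random.rand(4320, 7680, 3)      # pretend 2x-supersampled render
frame_2160p = box_downsample(hires, 2)     # shape (2160, 3840, 3)
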
I think the only place you will really see problems upscaling is with
crazy test images, like geometric shapes and line test patterns. With
normal content you should not see too much of a problem when scaling.
Maybe it blurs some things, but I doubt you would notice it much in
motion.

Nobody has even mentioned sub-pixel rendering yet... Technically
your 1920x1080 display is actually (3*1920)x1080 thanks to its RGB
subpixels. You could smooth out the picture there too, given enough
GPU power.
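Very rough idea of where the extra horizontal samples would come from
(a numpy sketch that ignores the real subpixel layout, gamma, and the
cross-subpixel filtering that ClearType-style rendering uses to keep
colour fringing down):

import numpy as np

def pack_subpixels(gray_3x):
    # Take a grayscale image rendered at 3x horizontal resolution and
    # pack each run of three samples into the R, G and B channels of one
    # output pixel -- one sample per subpixel on a plain left-to-right
    # RGB stripe panel.
    h, w3 = gray_3x.shape
    return gray_3x.reshape(h, w3 // 3, 3)

hires_text = np.random.rand(1080, 5760)    # 3 * 1920 samples across
frame = pack_subpixels(hires_text)         # shape (1080, 1920, 3)
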
I read somewhere the resolution discernible to the human eye would be
around 4000x2000, which is roughly quad 1080p (3840x2160)...
One day.