<html>
<head>
<meta content="text/html; charset=ISO-8859-1"
http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
On 05/12/14 14:04, Will Dormann wrote:<br>
<blockquote
cite="mid:CAJ-a_WxwPTu8nF9=GZHCweRW4+4Z2MiyotPwnwMpu3PVmwqfyg@mail.gmail.com"
type="cite">
<p dir="ltr">
On May 12, 2014 1:43 PM, "Jim Oltman" &lt;<a
moz-do-not-send="true" href="mailto:jim.oltman@gmail.com">jim.oltman@gmail.com</a>&gt;
wrote:</p>
<p dir="ltr">> May I ask why nVidia is a requirement? Why not
a lower power Intel NUC?<br>
</p>
<p dir="ltr">It's my understanding that VDPAU still can't be
beat. My poor little Atom has been happily handling HD H.264
for years without breaking a sweat, and that's because the GPU
is doing the heavy lifting. </p>
<p dir="ltr">Based on other threads, alternatives like Intel
graphics can be coerced into working, but you may still end up
with artifacts like tearing or dropped frames that are most
noticeable in scenes with horizontal panning, which is
unacceptable for me. Working with video over the years has
tuned my vision to be very aware of these things. :-/ </p>
<p dir="ltr">Is this still the case, or have other technologies
caught up with VDPAU? </p>
</blockquote>
There have been a lot of discussions on this very topic in other
threads as of late.<br>
<br>
A lot of the talk is about which deinterlacing algorithms you can
use. Someone expressed concern about decoding more complex audio
streams on low-power CPUs.<br>
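(As a concrete example of the deinterlacing point: outside MythTV, VDPAU's temporal-spatial deinterlacer can be selected in mpv roughly like this — the file name is just a placeholder:)

```shell
# Hardware H.264 decode via VDPAU, with VDPAU's temporal-spatial
# deinterlacer applied in the video filter chain.
# "recording.ts" is a placeholder file name.
mpv --hwdec=vdpau --vf=vdpaupp=deint=yes:deint-mode=temporal-spatial recording.ts
```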
<br>
What I haven't heard discussed is interlaced output. I have an
old CRT HDTV that needs a real 1080i input. Can Intel graphics
generate a 1080i signal that is properly synced to the
video?<br>
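(For anyone who wants to experiment: under X, a standard CEA-861 1080i60 mode can be handed to the driver with xrandr along these lines — the output name HDMI-1 is an assumption for illustration, and whether the hardware actually honors the interlace flag is exactly the open question:)

```shell
# CEA-861 1080i60 timings: 74.25 MHz pixel clock, 2200x1125 total
# frame, interlaced. "HDMI-1" is an example output name; check
# `xrandr -q` for the real one on your system.
xrandr --newmode "1920x1080i" 74.25 1920 2008 2052 2200 1080 1082 1087 1125 interlace +hsync +vsync
xrandr --addmode HDMI-1 "1920x1080i"
xrandr --output HDMI-1 --mode "1920x1080i"
```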
</body>
</html>