<div dir="ltr"><br><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div class="im">
1. Intel GPU decoding will not even come close to NVidia in<br>
quality. Fine for casual watching, but I wouldn't want it in<br>
my home theater.<br>
<br>
<br>
Intel GPU decoding will not do anything. Neither your G540, nor<br>
the OP's proposed G1610, have that feature enabled.<br>
<br>
<br></div><div class="im">
My G540 is definitely using its GPU to decode the video via VA-API.<br>
</div></blockquote>
<br>
Your CPU does not support it.<br>
<br>
<a href="http://ark.intel.com/products/53416#CVTHD" target="_blank">http://ark.intel.com/products/53416#CVTHD</a></blockquote><div><br></div><div style>Then how do you explain that, when I use it in MythTV, CPU utilization drops from 110% to 5% on a test video?</div>
<div style><br></div><div style>My CPU has Intel HD Graphics... which are VA-API compatible: <a href="https://01.org/linuxgraphics/community/vaapi">https://01.org/linuxgraphics/community/vaapi</a></div></div></div></div>