<div dir="ltr"><br><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">
<div class="im"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
This device is actually very good.<br>
The API for h264/mpeg decode is the v4l2 API in Linux. It looks like Samsung<br>
supplied patches and drove some of the changes for v4l2.<br>
There's a git repo<br>
<a href="http://git.infradead.org/users/kmpark/public-apps/tree/HEAD:/v4l2-mfc-example" target="_blank">http://git.infradead.org/<u></u>users/kmpark/public-apps/tree/<u></u>HEAD:/v4l2-mfc-example</a><br>
that provides a sample program showing how to use the v4l2 api for decoding!<br>
<br>
If this were built into MythTV then I think it would make a very nice<br>
frontend.<br>
Of course you'd have to consider the deinterlacing, but a quad-core<br>
1.7GHz should be<br>
capable, right?<br>
</blockquote>
<br>
Quad core ARM? Umm... not likely. ARM != i7<br></blockquote><div><br></div></div><div>For deinterlacing? The decode will be HW accelerated.</div></div></div></div></blockquote><div><br></div><div>Depends upon the deinterlacing used. Assuming that you can use the CPU to deinterlace a GPU-decoded video (not sure it will even let you do that), you would have a range of deinterlacers to choose from.</div>
<div><br></div><div>Simple deinterlacers like One Field or Bob use minimal CPU... they do little processing of the frames. Advanced deinterlacers actually detect motion in the frames and manipulate the fields to make each frame appear as close to complete as possible. The motion detection and manipulation of the fields can be VERY CPU intensive when you're deinterlacing HD video in real time.</div>
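To make the cost difference concrete, here's a toy sketch of both kinds: a Bob-style line doubler, and a minimal motion-adaptive pass that weaves the two fields where nothing moved and interpolates where it detects motion. The frame layout (8-bit, single plane, top field on even lines) and the motion threshold are assumptions for illustration, not any deinterlacer MythTV actually ships.

```c
/* Toy deinterlacers. Assumed layout: 8-bit luma, w*h bytes, top field
 * on even lines, bottom field on odd lines. */
#include <stdlib.h>

/* Bob: build a full frame from the top field only, doubling each line.
 * Almost free on the CPU, but halves vertical resolution. */
void bob(const unsigned char *src, unsigned char *dst, int w, int h)
{
    for (int y = 0; y < h; y++) {
        const unsigned char *line = src + (y & ~1) * w; /* nearest top-field line */
        for (int x = 0; x < w; x++)
            dst[y * w + x] = line[x];
    }
}

/* Motion-adaptive: compare each bottom-field pixel against the previous
 * frame; weave the real line where static, interpolate vertically where
 * the pixel moved. This per-pixel compare-and-branch over every frame is
 * where the CPU time goes at HD resolutions. */
void motion_adaptive(const unsigned char *cur, const unsigned char *prev,
                     unsigned char *dst, int w, int h, int threshold)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            if ((y & 1) == 0) {            /* top-field lines: copy as-is */
                dst[y * w + x] = cur[y * w + x];
                continue;
            }
            int diff = abs((int)cur[y * w + x] - (int)prev[y * w + x]);
            if (diff <= threshold) {       /* static: weave the real line */
                dst[y * w + x] = cur[y * w + x];
            } else {                       /* moving: average the lines around it */
                int above = cur[(y - 1) * w + x];
                int below = (y + 1 < h) ? cur[(y + 1) * w + x] : above;
                dst[y * w + x] = (unsigned char)((above + below) / 2);
            }
        }
    }
}
```

Real motion-adaptive filters (Greedy, Yadif and friends) look at more neighbours and more history than this, which is exactly why they chew through CPU.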
</div></div></div>