<html><head></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space; "><div><div>On 2010-08-26, at 8:32 PM, Mark Kendall wrote:</div><br class="Apple-interchange-newline"><blockquote type="cite"><span class="Apple-style-span" style="border-collapse: separate; font-family: Helvetica; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-border-horizontal-spacing: 0px; -webkit-border-vertical-spacing: 0px; -webkit-text-decorations-in-effect: none; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; font-size: medium; ">If you want to post the full output of mythfrontend -v<br>playback and glxinfo, I can try and take a look. My gut feeling is<br>that the hardware just isn't powerful enough</span></blockquote></div><div><br></div><div>I put a zip of some logs <a href="http://feeds.nassas.com/OpenGL.zip">here</a>. DailyShow.txt is from playing a 720p ATSC recording, and TED.txt is from one of the recent TED Talks; I think those are lowish-res. The glxinfo and glxgears output is in there too. I understand that glxgears isn't representative of anything, but it's fun to watch.</div><div><br></div><div>Do I have the scenario right? OpenGL gets a decoded frame from myth, and all it has to do is scale it and draw it to the screen? That doesn't sound very taxing, but what do I know.</div><div><br></div><div>Anyway, thanks for looking.</div><br><div>- George</div></body></html>