[mythtv-users] Optimizing Livetv for AMD?

Nick Rout nick.rout at gmail.com
Fri Jun 18 07:27:26 UTC 2010


On Fri, Jun 18, 2010 at 6:41 PM, Sepp Herrberger <Seppl at dcemail.com> wrote:
> Hi all,
> I have the ASRock 890GM Pro3 mainboard (with the onboard ATI Radeon HD 4290):
> http://www.asrock.com/mb/overview.asp?Model=890GM%20Pro3&cat=Specifications
> As TV card I'm using a TeVii S470.
> I followed this guide:
> http://www.linuxquestions.org/questions/linux-desktop-74/amd-hd-series-graphics-guide-optimizing-video-playback-for-mythtv-mplayer-and-others-786335/
>
> But if I use these settings:
>    * Decoder leave at Standard.
>    * Video Renderer Set to: OpenGL
>    * OSD Renderer Set to: OpenGL2
>    * Next Screen (Deinterlacers)
>          o Primary Deinterlacer Set to: Bob(2x,HW)
>          o Fallback Deinterlacer Set to: Linear Blend(HW)
> CPU usage goes up to 80%.
>
> And if I use these settings:
>    * Decoder change to VIA XvMC
>    * Next Screen (Deinterlacers)
>          o Primary Deinterlacer set to One Field
>          o Fallback Deinterlacer set to One Field
> I always get this message:
> 2010-06-13 12:16:14.071 VideoOutputXv: Desired video renderer 'xvmc-blit' not available.
>                        codec 'H264' makes 'xv-blit,xshm,xlib,' available, using 'xv-blit' instead.
> 2010-06-13 12:16:14.071 VideoOutputXv: Desired video renderer 'xvmc-blit' not available.
>                        codec 'MPEG2' makes 'xv-blit,xshm,xlib,' available, using 'xv-blit' instead.
>
> Can anybody help me with this?
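
Two different things are biting you here. With the OpenGL profile the GPU
only does the rendering and deinterlacing; the H.264 decode itself still
runs in software with the Standard decoder, which is why the CPU sits
around 80%. And XvMC only ever accelerated MPEG-2 on most hardware, and as
far as I know neither the open radeon driver nor fglrx exposes it for the
HD 4290 at all, so MythTV quietly falls back to xv-blit, exactly as the
log says. A quick, untested sketch like this (it assumes xdpyinfo is
installed and uses the usual libXvMCW wrapper config path, which may
differ on your distro) shows whether your X setup offers XvMC at all:

#!/usr/bin/env python3
# Rough check: does this X setup expose XvMC at all?
# Assumes xdpyinfo is installed and an X session is running; the
# XvMCConfig path is the usual default for the libXvMCW wrapper.
import os
import shutil
import subprocess

XVMC_CONF = "/etc/X11/XvMCConfig"

def xvmc_wrapper_target():
    # The wrapper config names the real XvMC driver library, if one is set up.
    if not os.path.exists(XVMC_CONF):
        return None
    with open(XVMC_CONF) as fh:
        return fh.read().strip() or None

def x_extension_present(name="XVideo-MotionCompensation"):
    # Ask the running X server which extensions it advertises.
    if shutil.which("xdpyinfo") is None:
        return None
    out = subprocess.run(["xdpyinfo"], capture_output=True, text=True)
    return name in out.stdout if out.returncode == 0 else None

if __name__ == "__main__":
    print("XvMCConfig points at:", xvmc_wrapper_target() or "nothing")
    print("X server advertises XvMC:", x_extension_present())

If both come back empty/False, no amount of profile tweaking will make the
XvMC decoder work on that board.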

Yep, get an NVIDIA GT 220 or 240 :)
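
The reason for the NVIDIA suggestion is VDPAU: a GT 220 or GT 240 decodes
H.264 and MPEG-2 entirely on the GPU, and MythTV can deinterlace there as
well, so the CPU load and the xvmc-blit fallback both go away. Once the
card and driver are in, a rough sketch like this (it just wraps the stock
vdpauinfo tool, which has to be installed) shows which decoder profiles
the driver actually exposes:

#!/usr/bin/env python3
# Rough sketch: list the decoder profiles VDPAU reports, by wrapping the
# stock vdpauinfo tool. Assumes vdpauinfo is installed and X is running.
import shutil
import subprocess

def vdpau_decoders():
    if shutil.which("vdpauinfo") is None:
        return None
    out = subprocess.run(["vdpauinfo"], capture_output=True, text=True)
    if out.returncode != 0:
        return None
    # vdpauinfo lists one decoder profile per line in its "Decoder
    # capabilities" section, e.g. MPEG2_MAIN or H264_HIGH; grab those tokens.
    return sorted({tok for line in out.stdout.splitlines()
                   for tok in line.split()
                   if tok.startswith(("MPEG", "H264", "VC1"))})

if __name__ == "__main__":
    decoders = vdpau_decoders()
    if not decoders:
        print("No usable VDPAU decoders reported -- check the driver install.")
    else:
        print("VDPAU decoder profiles:", ", ".join(decoders))

If H264_HIGH shows up in that list, the VDPAU playback profile in
mythfrontend should handle your recordings without the CPU spike.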

