[mythtv-users] Watching HD streams on a SD set

R. G. Newbury newbury at mandamus.org
Thu Oct 4 17:46:13 UTC 2007


Enigma wrote:
>>  From the man page for the i810 driver:
>> Option "XvMCSurfaces" "integer"
>> This option enables XvMC. The integer parameter specifies the
>> number of surfaces to use. Valid values are 6 and 7. Default:
>> XvMC is disabled.
>>
>> Now I notice that the Xorg.0.log says something to the effect that the
>> XvMCSurfaces line is ignored, and then the i810 driver seems to use 
>> xvmc...
>>
>>
>> So maybe *neither* of them does xvmc? That is clearly not correct, so
>> something is going on... And the mythbox still records and plays, so
>> I'm not touching it until the weekend!
>>
>> Geoff
> 
> Yeah, I quickly found out that XvMC doesn't seem to work with this 
> chipset.  Upon closer inspection of the manual, the 'XvMCSurfaces' 
> option is listed under the section that says "The following driver 
> Options are supported for the i810 and i815 chipsets:".  Since this 
> frontend has a 915G chipset, it appears that I am SOL for XvMC.
> 
> I am still using the 1.x version (i810) of the driver rather than the 
> brand-new 2.x (renamed intel) version.  The main advantage of the new 
> driver seems to be the ability to use non-BIOS resolutions; since I 
> just use a standard resolution, and from your description it sounds 
> like the new driver doesn't do XvMC either, I will probably just 
> continue with my current driver.  I have played around with the various 
> deinterlacing options and I am getting much better video quality than 
> before.  It appears that I need to deinterlace the HD stream before it 
> is re-interlaced and sent to my TV.  My CPU seems to be able to handle 
> the playback and deinterlacing, so I probably don't need XvMC, although 
> if I had it, it might let me use Bob deinterlacing.
> When I first tested this machine for HD I ran into the 'badalloc' 
> error, googled a solution, and set LinearAlloc to 8192.  When I was 
> reading the manual pages today I saw the line recommending 6144, so I 
> changed my xorg.conf to that value.  That caused my machine to lock up 
> HARD, with nasty things happening to my display, when I attempted to 
> view HD content.  Changing the value back to 8192 fixed the problem.


Interesting. I found a page on an Ubuntu site which recommended 
LinearAlloc 16384. I changed it back to 6144 after reading the intel 
driver man page. However, since I am outputting to an LCD, I don't see 
any difference between the bob, linear, or kernel de-interlacers in my 
playing around. But it also appears that I don't *really* know what the 
system is using. Must play with the settings and parse the output a 
little more closely. Must... resist... urge... to make... more... 
time... to... play... with myth...!
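
For what it's worth, the Device stanza I'm experimenting with looks 
roughly like this. Treat it as a sketch only: the Identifier is whatever 
your config already uses, 8192 is the value that worked for Enigma 
above, and (as discussed) XvMCSurfaces may simply be ignored on these 
chips:

  Section "Device"
      Identifier "Intel GMA950"
      Driver     "i810"
      Option     "LinearAlloc"  "8192"
      Option     "XvMCSurfaces" "7"
  EndSection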

As an aside, mythfrontend's logging output is INCREDIBLY opaque. I can 
see that it tests for various decoders etc., but it is only some time 
later that there is an almost hidden reference to which one is actually 
used... and there is no output about why a particular combo is NOT used. 
(Are you reading, Daniel?)
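
In the meantime, the best workaround I've found is to turn up the 
verbosity and grep, along these lines (assuming your build supports the 
'playback' verbose class; the grep terms are guesswork):

  $ mythfrontend -v playback 2>&1 | grep -i -e decoder -e xvmc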

The chipset on the Asus board is a 945G northbridge with a Graphics 
Media Accelerator 950 video chip.
Both man pages state: "The following driver Options are supported for 
the i810 and i815 chipsets:" and then include the XvMCSurfaces option, 
before going on to describe further options for more recent chipsets. 
Neither states "...are supported ONLY on the i810...", nor the other 
possibility: "The following driver Options are supported (ONLY) for the 
830M and later chipsets:". This leaves it unclear whether the early 
options are also available on the later chips. My guess is that the 
option IS available on later chips... at least one would think that is 
the logical progression in chipsets...
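
Short of reading the driver source, the nearest thing to an answer seems 
to be what the X server itself logs when it parses the option, e.g.:

  $ grep -i xvmc /var/log/Xorg.0.log

If the "XvMCSurfaces ... ignored" message noted earlier shows up there, 
that is presumably the driver's final word for this chip.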

More coffee and play-time required. At least this motherboard and CPU 
combo can handle HD without XvMC.

Geoff

