[mythtv-users] How do you set up a MythTV frontend to output YCbCr via HDMI

Andre Newman mythtv-list at dinkum.org.uk
Thu Mar 7 11:16:23 UTC 2013


On 6 Mar 2013, at 19:19, Paul Gardiner wrote:

> On 06/03/2013 16:58, boehm100 at comcast.net wrote:
>> In addition to setting YCbCr in nvidia-settings I needed to add the following to the "Screen" section of my xorg.conf
>> 
>>     Option "ColorSpace" "YCbCr444"
>>     Option "ColorRange" "Limited"
>> 
>> After the addition, mythtv launches for me and leaves the monitor in YCbCr.  Finally I needed to set the studio color space flag to make the black levels look correct.  The proper name of the flag is escaping me at the moment.
> 
> Magic, thanks. That did it. I too needed to enable studio levels, but I
> did it from the frontend menu. A bit suspicious that I needed to do

Hi, a little late to a thread addressed to me, but here now. :-)
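
For anyone finding this in the archives: the two Option lines boehm100 quoted go in the "Screen" section of xorg.conf, so the whole section ends up looking something like the sketch below. Only the two Option lines come from this thread; the identifiers and depth settings are generic placeholders, not anyone's actual config.

    Section "Screen"
        Identifier   "Screen0"
        Device       "Device0"
        Monitor      "Monitor0"
        DefaultDepth 24
        Option       "ColorSpace" "YCbCr444"
        Option       "ColorRange" "Limited"
        SubSection "Display"
            Depth 24
        EndSubSection
    EndSection

You'll need to restart X for xorg.conf changes to take effect; after that the ColorSpace/ColorRange controls in nvidia-settings should show the new state.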

I'd be suspicious that studio levels were required too; the nvidia-settings GUI quite correctly disables the Full/Limited selection box. Studio/limited levels are a PC RGB thing, and in the colour-difference world there is nothing other than studio levels.
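
To put some numbers on that: in 8-bit terms full-range PC RGB runs 0-255, while studio/limited range puts black at 16 and white at 235 (chroma at 16-240). The "studio levels" business is really just that remapping, roughly the following, written as a Python sketch rather than anything resembling the actual MythTV code:

    # Full-range (0-255) vs studio/limited-range (16-235) 8-bit luma values.
    # Just the arithmetic behind the "studio levels" switch; not MythTV code.

    def full_to_studio(v):
        """Map a full-range 8-bit value (0-255) onto studio range (16-235)."""
        return round(16 + v * (235 - 16) / 255)

    def studio_to_full(v):
        """Map a studio-range 8-bit value (16-235) back to full range (0-255)."""
        return round((v - 16) * 255 / (235 - 16))

    print(full_to_studio(0), full_to_studio(255))    # 16 235
    print(studio_to_full(16), studio_to_full(235))   # 0 255

Apply that remap at the wrong point in the chain, or apply it twice, and blacks end up either crushed or washed out, which is why getting it wrong is so visible.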

This suggests to me that there is something not quite right in the colour space conversion in MythTV. I did receive a response from a developer about this, along with some new code that matched up better with VDPAU's colour space handling. I'm unable to find those emails just now, and I don't think they went to the list; this was somewhere around 0.22-0.23, and I recall that after that revision the difference between the graphics card outputting RGB and outputting YCbCr was much smaller.

I should find the time to look through that part of the MythTV code and understand what's really going on. Most of my work involves dealing with the problems that occur when people from a PC or IT background get involved in Broadcast Television, or, less frequently, when the reverse happens.

The most noticeable differences between the graphics card in RGB mode and in YCbCr were colour casts in shadows that varied in colour over time, and the amount of noise in shadows. Most domestic TVs may hide this effect, as they may have bigger issues of their own in shadows and other low-light areas, so not everyone will be able to see it even if they know what to look for. If you don't see it then I guess you don't need to worry about it...

It's also possible that my display (Optoma HD80 DLP Projector) is especially poor when driven as RGB and therefore gives better results with YCbCr. I don't have ready access to any broadcast monitors just at the moment to check more thoroughly.

I have recently upgraded my VDPAU card to a GT640 (fanless too :-) and noticed a small but welcome improvement in video quality. I need to recalibrate the projector soon, so I'll re-run the tests I did initially to see whether I can find any difference with the current Nvidia & MythTV code.


> that. I was hoping that running this way would reduce the number of
> processing stages between codec and TV screen, but if that setting
> still has effect, I guess the decoded YCbCr is still processed before
> it gets to the screen buffer.
> 
> Do you know how this works? Is the screen buffer YCbCr, or is it
> still RGB with the graphics chip turning it back to YCbCr just
> to output to HDMI?

The internal buffers _should_ be YCbCr, but in the TV industry Nvidia is generally considered to play fast and loose with little details like colour space. Getting this right is supposed to be one of the differences between gaming and professional cards & drivers, but even the pro drivers on pro cards are still considered a little off by some. AMD always used to get this stuff right, but that's no use to MythTV.
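
As a rough illustration of why an extra RGB stage in the middle isn't free, here's a toy round trip through the BT.709 matrices at 8 bits, in Python. It's not the MythTV or VDPAU code path, just the arithmetic: dark samples carrying some chroma are the ones most likely to land outside the RGB cube, get clamped, and come back changed, which lines up with the shadow artefacts I described above.

    # Toy BT.709 round trip: studio-range Y'CbCr -> full-range RGB -> Y'CbCr,
    # all at 8 bits. Illustration only, not the MythTV/VDPAU code path.

    KR, KB = 0.2126, 0.0722      # BT.709 luma coefficients
    KG = 1.0 - KR - KB

    def clamp8(v):
        """Quantise a normalised value to an 8-bit full-range level."""
        return max(0, min(255, round(v * 255)))

    def ycbcr_to_rgb(y, cb, cr):
        yn, cbn, crn = (y - 16) / 219.0, (cb - 128) / 224.0, (cr - 128) / 224.0
        r = yn + 2 * (1 - KR) * crn
        b = yn + 2 * (1 - KB) * cbn
        g = (yn - KR * r - KB * b) / KG
        return clamp8(r), clamp8(g), clamp8(b)

    def rgb_to_ycbcr(r, g, b):
        rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
        y = KR * rn + KG * gn + KB * bn
        cb = (bn - y) / (2 * (1 - KB))
        cr = (rn - y) / (2 * (1 - KR))
        return round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr)

    # Count dark (shadow) samples that do not survive the round trip exactly.
    misses = sum(
        1
        for y in range(16, 64)
        for cb in range(112, 145, 8)
        for cr in range(112, 145, 8)
        if rgb_to_ycbcr(*ycbcr_to_rgb(y, cb, cr)) != (y, cb, cr)
    )
    print(misses, "of", 48 * 5 * 5, "dark samples changed by the round trip")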

Andre

