[mythtv-users] MinnowBoard MAX

David Edwards david at more.fool.me.uk
Sat May 17 17:42:55 UTC 2014


On 17 May 2014 13:53, Jean-Yves Avenard <jyavenard at gmail.com> wrote:
> On 17 May 2014 20:01, David Edwards <david at more.fool.me.uk> wrote:
> Personally, I would prefer to implement the new VAAPI advanced
> deinterlacer and wait for intel to fix it (though it doesn't look
> promising on haswell).
> If no-one implements the new VAAPI deinterlacer because it's currently
> buggy, what incentive is there for Intel to actually fix it? Plus,
> no-one would know if it's still buggy.

I sort-of agree... although I don't know enough about the algorithms
to know whether I would still be pining after something better even if
they did work...

There is also something about scaling algorithms in that huge thread
on the XBMC forums which I didn't fully digest, but it might be
interesting to investigate.

For me it boils down to this: watching TV/videos is the main point of
MythTV. In my opinion, making it do that as well as it can should be a
high priority objective. If XBMC does it better then people will use
that instead.

>> 2. Non-integer refresh rates are not supported.
> What did XBMC do to support those, if Intel drivers don't support
> it... I'm curious

I am not entirely sure, but they have a command called "xbmc-xrandr".
If I run it, it spits out a list of outputs and their modes with the
exact refresh rates correctly listed. My guess is that they use a
more recent version of xrandr if it is available. I would try to
figure out the source code (is this it?
https://github.com/xbmc/xbmc/blob/master/xbmc-xrandr.c), but it would
probably make more sense to you.
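As an aside, those exact rates probably come from the mode timings themselves rather than from xrandr's rounded figures: a mode's true refresh rate is its pixel clock divided by the total pixels per frame (active plus blanking). A minimal sketch of that arithmetic, using the standard 1920x1080 timings purely for illustration:

```python
def refresh_hz(pixel_clock_hz, htotal, vtotal):
    """Exact vertical refresh rate of a video mode.

    The true rate is pixel clock / (htotal * vtotal), i.e. the pixel
    clock divided by the total pixels per frame including blanking.
    This is how a tool can report 59.94 Hz where plain xrandr output
    rounds the figure to 60.
    """
    return pixel_clock_hz / (htotal * vtotal)

# Standard 1080p timings: 2200 x 1125 total pixels per frame.
print(round(refresh_hz(148_500_000, 2200, 1125), 3))          # 60.0
# NTSC-compatible variant: same timings, pixel clock divided by 1.001.
print(round(refresh_hz(148_500_000 / 1.001, 2200, 1125), 3))  # 59.94
```

So even when the driver only advertises "60 Hz", the underlying dot clock and totals are enough to recover the non-integer rate.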

> At the time, I implemented the NV-Control extension

I remember being very excited about that when I was using NVidia.
Unfortunately, that was when I realised that my old TV didn't
support the rates I wanted :-(

David
