[mythtv-users] VGA -> RGB-SCART (PAL, 720x576i50) on Intel HD video possible?

Nick Morrott knowledgejunkie at gmail.com
Wed Jun 1 20:04:59 UTC 2016


On 1 June 2016 at 20:55, Hika van den Hoven <hikavdh at gmail.com> wrote:
> Hi Nick,
>
> Wednesday, June 1, 2016, 9:39:36 PM, you wrote:
>
>> On 1 June 2016 at 13:09, Hika van den Hoven <hikavdh at gmail.com> wrote:
>>> Hi Nick,
>>>
>>> Wednesday, June 1, 2016, 10:04:27 AM, you wrote:
>>>
>>>> I'm curious to know if it's possible to use (and thus if anyone is
>>>> using) a VGA->RGB SCART output path on recent Intel Core-based systems
>>>> with Intel HD (i915) video?
>>>
>>>> Does the Intel HD video platform support outputting PAL-compatible
>>>> 720x576i50 signals at 15kHz?
>
>>> If the card supports interlacing, it's a simple circuit to convert VGA
>>> to SCART. All it has to do is combine hsync and vsync. I use it with
>>> an Nvidia card. If you are interested I can look up the schematic. I have
>>> also created my own modeline to optimise for my TV and to remove
>>> overscan.
>
>> It's not as simple as that. The VGA output needs to be able to display
>> a PAL-compatible modeline directly to be able to use one of these
>> "simple" VGA->RGB SCART cables.
>
>> As I mentioned, I already use a VGA->RGB SCART cable with an ATI
>> Radeon card (the sync circuit for Radeons is even simpler than for
>> NVidia cards, as the ATI Radeon outputs composite sync).
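The sync combination these cables perform can be sketched in a few lines (an illustration only; real adapters do this in hardware, typically with a transistor or an XOR gate such as a 74HC86 on the TTL-level sync lines from the VGA connector):

```python
# Sketch of what a passive VGA->SCART sync combiner does (illustration only;
# actual cables combine the signals in hardware, not software).
def composite_sync(hsync: int, vsync: int) -> int:
    """XOR the two sync signals: hsync serrations survive inside vsync."""
    return hsync ^ vsync

# Truth table: the combined signal is active whenever exactly one input is.
for h in (0, 1):
    for v in (0, 1):
        print(h, v, composite_sync(h, v))
```

A card that outputs composite sync directly (like the Radeon mentioned above) makes even this gate unnecessary.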
>
>> What I want to know is whether current Intel HD video natively
>> supports a low enough dot clock (14MHz) to output the necessary
>> 15.625kHz horizontal frequency PAL signal directly through the VGA
>> output, or whether a "VGA -> PAL" converter would be required.
>
>> Cheers,
>> Nick
>
> As far as I know Intel does not do interlaced, so the only option then
> is Nvidia (Nouveau also doesn't do interlaced). And I don't know about
> those adapters you can buy, but for a few euros in electronics you can
> solder one yourself; all it needs is a card that does interlaced. The
> modelines you can create/supply yourself.
> At the time I was building and testing my solution, there were cards that
> should do it, but the drivers (among others Nouveau and Intel) didn't do
> the interlacing correctly, giving e.g. two half pictures above each other.
> The built-in Intel card I couldn't get working, so I ended up adding an
> Nvidia card with their driver.

I have a fanless Nvidia GeForce 210 from some years ago, so I guess I
might need to use that (plus the binary driver, assuming the card will
generate the right modeline) if the onboard Intel HD can't be persuaded
to play nicely.
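For reference, the numbers in question check out with a classic interlaced PAL modeline (a sketch; the timings below are the standard broadcast-PAL values, not specific to any card or driver):

```python
# Timing check for a classic interlaced PAL modeline:
# ModeLine "720x576i" 13.875  720 744 808 888  576 580 585 625 interlace
dot_clock_mhz = 13.875  # pixel clock in MHz (the ~14 MHz mentioned above)
htotal = 888            # total pixels per scanline, including blanking
vtotal = 625            # total lines per frame (two interlaced fields)

hfreq_khz = dot_clock_mhz * 1000 / htotal      # horizontal frequency
field_rate_hz = hfreq_khz * 1000 / vtotal * 2  # interlaced: 2 fields/frame

print(f"{hfreq_khz:.3f} kHz, {field_rate_hz:.1f} Hz")  # 15.625 kHz, 50.0 Hz
```

So any card/driver combination that accepts this modeline (low dot clock plus interlace) should in principle drive such a cable.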

Bit of a shame though, as I'd like to minimise additional
heat-producing components for a totally silent box.

Thanks,
Nick
