[mythtv-users] compiling performance question
stephen_agent at jsw.gen.nz
Tue Feb 19 06:54:54 UTC 2019
On Mon, 18 Feb 2019 21:31:37 -0500, you wrote:
>On 2019-02-18 8:33 p.m., Stephen Worthington wrote:
>> On Mon, 18 Feb 2019 12:53:27 -0500, you wrote:
>>> Been running a 1030 in one frontend, had to limit to 1080, card just
>>> couldn't keep up with even medium bitrate 4k.
>>> Looking at some of the release notes for the nVidia driver, an update
>>> might fix it, but I just don't have enough 4k material for it to be
>>> worth my while.
>> Right, so what exact 1030 card and motherboard? Which slot were you
>> using? If it is an older motherboard, its PCIe version is probably
>> too old to give full throughput to the 1030 card.
>Gigabyte Z68MA-D2H-B3, Z68 chipset, PCIe Gen 3 16x PCI slot, (which is
>the slot I have the card in).
>The video card is a Gigabyte GT 1030, not sure how to pull the specific
>model number from remote (not at home at the moment). I'm reasonably
>sure it's: GV-N1030D4-2GL.
>The processor in this machine is a lowly i3-2120T, which only supports
>PCIe Gen 2, but considering we're using the card to handle the video
>decode, I wouldn't have expected the bus speed to be that much of a
>bottleneck. Again, I don't have a lot of 4K content, so while I'm a
>little disappointed, it isn't costing me any sleep :-)
Gigabyte do not mention the PCIe specification for this card. I
expect it will be version 3.0, but I cannot find anyone who
actually says so. It is DDR4 only, with only 2 Gibytes of RAM, so
relatively slow for a 1000 series card. Its HDMI port is 2.0b, so it
should be able to do 4K @ 60 Hz with 24 bits per pixel, plus HDR. But
that is just the HDMI port - the rest of the card needs to be able to
do that too. In the past, with non-HDR formats (eg 1080p), the video
card does not have to work hard to do what is needed to just display
the output from a multimedia program. MythTV needed 1 Gibyte of RAM
on the GPU, but half of that requirement was to allow the operating
system to use fancy 2D and 3D embellishments. And serious amounts of
GPU RAM was only needed for gaming and full 3D software. So if you
were running Ubuntu with the Unity desktop, you needed 1 Gibyte of
RAM, but if you used Xubuntu, MythTV would be fine with only 512
Mbytes of GPU RAM. In theory, 4K should be similar, but adding HDR
may require more RAM and more GPU work, so maybe a low end card like
this cannot cope. But there is hope that it is just a driver problem
preventing it from doing full 4K.
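For reference, the arithmetic behind that 4K @ 60 Hz claim, as a
back-of-envelope sketch: the standard CTA-861 timing for 3840x2160 @ 60 Hz
runs a 594 MHz pixel clock, and HDMI 2.0's 18 Gbit/s TMDS rate carries
14.4 Gbit/s of pixel data after 8b/10b coding.

```shell
# Pixel data rate for 4K60 at 24 bits per pixel: 594 MHz pixel clock
# times 24 bits. awk does the floating-point arithmetic.
rate=$(awk 'BEGIN { printf "%.2f", 594e6 * 24 / 1e9 }')
echo "4K @ 60 Hz, 24 bpp: ${rate} Gbit/s"   # 14.26 Gbit/s, just under 14.4
```

So 4K @ 60 Hz at 24 bpp only just fits on an HDMI 2.0 link, which is why
anything deeper than 8 bits per colour at 60 Hz needs 4:2:0 subsampling on
that port.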
In any case, I will not be getting a 4K monitor or TV any time soon,
so my EVGA card should be fine for what I need: