[mythtv-users] Mythtv v35 fails decode with Nvidia NVDEC
Scott Theisen
scott.the.elm at gmail.com
Sat Feb 15 19:23:59 UTC 2025
On 2/14/25 12:10, James Abernathy wrote:
>
> On Fri, Feb 14, 2025 at 8:32 AM Roland Ernst <rcrernst at gmail.com> wrote:
>
>
> On Fri, Feb 14, 2025 at 1:44 PM James Abernathy
> <jfabernathy at gmail.com> wrote:
>
> On Fri, Feb 14, 2025 at 7:08 AM Ryan Patterson
> <ryan.goat at gmail.com> wrote:
>
> On Thu, Feb 13, 2025 at 9:16 PM James Abernathy
> <jfabernathy at gmail.com> wrote:
>
> On Thu, Feb 13, 2025 at 9:04 PM James Abernathy
> <jfabernathy at gmail.com> wrote:
>
> On Thu, Feb 13, 2025 at 8:54 PM James Abernathy
> <jfabernathy at gmail.com> wrote:
>
> On Thu, Feb 13, 2025 at 8:27 PM Ryan Patterson
> <ryan.goat at gmail.com> wrote:
>
> I just upgraded to fixes/35 and it is not
> working correctly. Nothing will play when
> set to use the Nvidia NVDEC video decoder.
>
> The error displayed says, "Video Frame
> Buffering Failed Too Many Times"
>
> I switched to VDPAU decoding and playback
> works. But I need NVDEC for 4K content.
>
> _____________
> Ryan Patterson
> May the wings of liberty never lose a feather.
>
> What are you seeing? Any log errors? I have a laptop
> with hybrid graphics (Nvidia and Intel), so I can launch
> mythfrontend on either pure Nvidia or pure Intel. I've
> been told that using OpenGL only works best with the
> current software and an average CPU; that even applies
> to the RPi 4. I'm on v35 on everything, so I can test it.
>
> Jim A
>
> I did a quick test on my laptop and launched mythfrontend
> with the dedicated-GPU option. NVDEC would not even play
> for me. I switched to OpenGL High Quality and it played
> perfectly. Of course, my laptop is pretty good: an
> Intel® Core™ i7-10750H CPU @ 2.60GHz × 6.
>
> Jim A
>
> I switched my laptop to the Nvidia GPU only and rebooted.
> I got the same error as you. However, OpenGL High Quality
> gave a perfect picture on the Nvidia.
>
> Jim A
>
> Thanks for confirming you see the same issue. Yes, other
> decoders work (VDPAU, FFmpeg, etc.), but my CPU is not
> powerful enough to decode H.265/HEVC 4K. That is why I
> need Nvidia NVDEC. Hopefully the issue is fixed soon.
>
> -Ryan
>
> You might want to open an issue on the GitHub site for
> MythTV.
>
> You can try setting up different playback profiles; see the last
> message in the "mythfrontend: playback settings" thread:
> http://lists.mythtv.org/pipermail/mythtv-users/2025-February/414942.html
>
> Formats MPEG2VIDEO->VDPAU acceleration & OpenGL Hardware
> Formats H264->VDPAU acceleration & OpenGL Hardware
> Formats HEVC->NVIDIA NVDEC acceleration & OpenGL Hardware
>
> Roland
>
> That setup worked for me, but I can only test it with MPEG2 and H.264 m4v.
>
> Jim A
What Nvidia driver version do you have?
When I updated FFmpeg to 7.1, I had to update ffnvcodec (the headers
FFmpeg uses for dynamically loading CUDA/NVDEC/NVENC), which increased
the minimum driver version to 550. There are other versions of
ffnvcodec we could use with lower driver version requirements. See
https://github.com/MythTV/mythtv/pull/1040 if that is the issue.
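If the driver version is the problem, a quick check against that 550 minimum
can be scripted. This is only a sketch, not something from MythTV itself: it
assumes `nvidia-smi` is on the PATH and that only the major version number
matters for the comparison.

```shell
# Sketch: compare the installed NVIDIA driver's major version against a
# required minimum (550 per the ffnvcodec update discussed above).
# Assumes nvidia-smi is available; only meaningful on a box with the driver.
driver_version() {
    # Prints just the driver version string, e.g. "535.183.01"
    nvidia-smi --query-gpu=driver_version --format=csv,noheader | head -n1
}

meets_minimum() {
    # $1 = full driver version string, $2 = required major version
    major="${1%%.*}"   # strip everything after the first dot
    [ "$major" -ge "$2" ]
}

# Example on a real machine:
#   meets_minimum "$(driver_version)" 550 && echo "driver new enough"
```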
The GitHub issue is at https://github.com/MythTV/mythtv/issues/1039
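It can also help to rule MythTV out by smoke-testing NVDEC with a standalone
FFmpeg. Again just a sketch: `sample.mkv` is a placeholder file name, and it
assumes an ffmpeg build with NVDEC/CUDA support.

```shell
# Sketch: build the ffmpeg command line for an NVDEC decode smoke test.
# Running it and getting a clean exit suggests the driver and ffnvcodec
# combination works outside MythTV; decode errors point back at the driver.
nvdec_check_cmd() {
    # $1 = input file (e.g. an HEVC sample); output is discarded via "-f null"
    printf 'ffmpeg -hide_banner -hwaccel nvdec -i %s -f null -' "$1"
}

# On a machine with an NVIDIA GPU and a sample file:
#   eval "$(nvdec_check_cmd sample.mkv)"
```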
Regards,
Scott