Thank you for your quick response!

Yes, that was the direction we planned to investigate next; we already have a 
project using NVENC directly, but the original implementation was for OpenGL.
We also saw that D3D12 Video Encoding 
(https://microsoft.github.io/DirectX-Specs/d3d/D3D12VideoEncoding.html) might 
do something similar. Do you have any experience with the latter? Otherwise 
we'll most likely do as you suggest and use the NVIDIA SDK.

I reached out about low latency because that is the problem I'm currently 
facing; however, the reason we wanted libavcodec initially is that a few of our 
clients cannot agree on a codec. So we wanted the broad range of codecs 
FFmpeg supports, even if that meant higher latency. I thought we could do a 
basic implementation, measure the end-to-end latency, and decide from there 
whether we needed a lower-level implementation. It was only while doing the 
implementation that I found that even though d3d12va is supported, hevc_nvenc 
does not list d3d12 as a supported pixel format.
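As a quick sanity check, the encoder's advertised pixel formats can be dumped programmatically. This is a minimal sketch, assuming an FFmpeg 6.1+ build (where AV_PIX_FMT_D3D12 exists) with development headers installed; `ffmpeg -h encoder=hevc_nvenc` prints the same list from the command line:

```c
/* Sketch: list the pixel formats hevc_nvenc advertises and check for d3d12.
 * Assumes FFmpeg >= 6.1 development headers and an NVENC-enabled build. */
#include <libavcodec/avcodec.h>
#include <libavutil/pixdesc.h>
#include <stdio.h>

int main(void)
{
    const AVCodec *codec = avcodec_find_encoder_by_name("hevc_nvenc");
    if (!codec || !codec->pix_fmts) {
        fprintf(stderr, "hevc_nvenc not available in this build\n");
        return 1;
    }

    int has_d3d12 = 0;
    for (const enum AVPixelFormat *p = codec->pix_fmts; *p != AV_PIX_FMT_NONE; p++) {
        printf("supported: %s\n", av_get_pix_fmt_name(*p));
        if (*p == AV_PIX_FMT_D3D12)
            has_d3d12 = 1;
    }
    printf("d3d12 %s listed\n", has_d3d12 ? "is" : "is NOT");
    return 0;
}
```

On current builds this confirms what you observed: `cuda` and `d3d11` show up, but `d3d12` does not.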
________________________________
From: Libav-user <[email protected]> on behalf of Ben Harper 
<[email protected]>
Sent: July 9, 2025 11:19 AM
To: This list is about using libavcodec, libavformat, libavutil, libavdevice 
and libavfilter. <[email protected]>
Subject: Re: [Libav-user] Libavcodec D3D12 hevc_nvenc

Have you tried just using the NVIDIA SDK directly? Since you need low latency, 
I assume this is for some kind of realtime application, which implies you're 
streaming video packets. The raw video codec data is simpler than many people 
imagine. It's just a stream of packets - also called NALs, or NALUs. The NVIDIA 
encoder will likely spit these out, and I'm guessing this is what your decoding 
side needs. You'd want libavformat for packaging those up into an MP4 file or 
some other container format. But if you're just streaming live video data, then 
I'm guessing you don't actually need a container file format, so the raw 
packets might be good enough.

On Wed, 9 Jul 2025 at 17:12, Alex Cha 
<[email protected]> wrote:
Hi,

I was under the impression that FFmpeg supported hardware acceleration for 
D3D12. Upon implementing the solution, however, we got an unsupported pixel 
format (d3d12) error from hevc_nvenc.

Digging further into the nvenc.c code, it seems the d3d12 format is not 
supported directly. Was there a reason for this? Looking at the NVIDIA 
documentation, the encoder should support D3D12, even if it is unclear to me 
what needs to be done.

For reference, we are trying to integrate hardware-accelerated streaming using 
FFmpeg into an Unreal Engine 5 application, which uses D3D12. We need very low 
latency. We have tried using the D3D11On12 interop without success; it returns 
"OpenEncodeSessionEx failed: no encode device (1)". The adapter appears to 
pick up my RTX 4090 correctly, but still no success.

Anyway, this email is a long shot in case someone has already encountered 
these issues and is able to encode without copying to the CPU.

Thanks,
Alex
_______________________________________________
Libav-user mailing list
[email protected]
https://ffmpeg.org/mailman/listinfo/libav-user

To unsubscribe, visit link above, or email
[email protected] with subject "unsubscribe".