platform: Windows
application: WinForm application
language: C#
decoder: ffmpeg

Hi
We're using ffmpeg to decode frames sent to us over the network. The program is written in C# and uses FFmpegInvoke together with the required DLLs, such as avcodec-56.dll and avutil-54.dll, to decode the frames. We need to use the GPU instead of the CPU for this purpose. Currently the CPU is used and decoding works without any problem.
My question is: how can I tell ffmpeg to use the GPU instead of the CPU for decoding?
Is there any sample code for this purpose?
Thanks

What I have tried:

I googled this problem and found no clue.
Posted
Updated 20-Aug-18 22:49pm

1 solution

Start by reading HWAccelIntro – FFmpeg[^].

For usage with the API, check whether an appropriate decoder is available and select it for decoding (this requires that the library was built with support for that decoder):
// C/C++ example: select a DXVA2 hardware decoder by name
AVCodec *decoder = avcodec_find_decoder_by_name("h264_dxva2");
if (decoder != NULL)
    avcodec_open2(decoder_ctx, decoder, NULL);
 
Comments
ilostmyid2 23-Aug-18 10:46am    
Thanks, I'll check it. Before checking, let me know whether it's required to use ffmpeg, or whether I may access the card directly for this purpose using DirectX or something.
Jochen Arndt 23-Aug-18 11:04am    
ffmpeg is not required. You can use any GPU-specific library or API that is supported on the system in use.
ilostmyid2 25-Aug-18 9:19am    
I tried it. The return value of avcodec_find_decoder_by_name is NULL.
Jochen Arndt 25-Aug-18 10:38am    
Then that decoder is not supported by your ffmpeg version.
ilostmyid2 25-Aug-18 12:02pm    
How do I get the latest version? I use FFmpegInvoke.cs.

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)