AMD HEVC decoder


raudraido

Recommended Posts

raudraido

Hi,

 

I do not understand why my Radeon RX 560 won't hardware decode; instead, Emby switches to software. Why?

 

 

 

>>>>>>  FindVideoDecoder - MediaType: hevc, Mode: Custom
Info    FindVideoDecoder - Checking: 'DX11VA Radeon RX 560 Series - H.265 (HEVC)' (Priority: 100)
NoMatch Codec does not support profile 'Main 10 Profile'
 
 
 

ffmpeg-transcode-05b7eab5-db4c-4b87-b0f2-20b20cf83b71_1.txt


Q-Droid

The card itself is capable of decoding 10-bit HEVC but the software and drivers have to be able to leverage the features.


raudraido

> The card itself is capable of decoding 10-bit HEVC but the software and drivers have to be able to leverage the features.

 

 

Can you explain in more detail? Which card/software/driver combo can hardware decode 10-bit HEVC? A GTX 1660?


Q-Droid

> Can you explain more detail? Which card/soft/drivers combo can hardware decode 10bit HEVC? GTX1660?

What I wrote is specific to the RX 500 series. The decoder on that card, UVD 6.3, supports 10-bit HEVC. For it to work with Emby, the ffmpeg API has to be able to use those features. The drivers also have to be right for all of it to work together.

 

Nvidia cards use different APIs.


softworkz

@@raudraido - AMD has been very late to the game of hardware video acceleration.

 

In earlier days, Nvidia assisted in getting support for their hardware acceleration into ffmpeg.

Intel is still actively contributing to ffmpeg, enabling all their latest encoding features for use with ffmpeg.

Now, guess what AMD is doing? (nothing)

 

AMD offers their AMF SDK, which would allow 10-bit HEVC decoding.

But decoding via AMF is currently not implemented in ffmpeg. Only encoding is implemented, and Emby can make use of that.
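To illustrate the encode-only situation: this is a minimal sketch of what using AMF through ffmpeg looks like today. The input filename, bitrate, and quality preset are placeholders, and the command assumes an ffmpeg build with AMF support and a suitable AMD driver.

```shell
# Encode with AMD's AMF hardware encoder (hevc_amf), which is what ffmpeg
# currently exposes for AMF. Decoding is NOT done via AMF here; ffmpeg
# decodes the input by other means (software or DXVA).
# "input.mkv", the bitrate, and the quality preset are placeholders.
ffmpeg -i input.mkv -c:v hevc_amf -quality balanced -b:v 8M -c:a copy output.mkv
```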

 

For decoding, there's only DXVA, a Microsoft interface that is supported by ffmpeg, and Emby can make use of that.

But which codecs are supported, and for each codec which resolutions, profiles, levels, and color formats: that depends on the Windows graphics driver.

 

You can try the Microsoft driver and the latest AMD driver specific to your Radeon. If you don't get 10-bit HEVC decoding in either case, please contact AMD for further support.
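One way to check what the currently installed driver can do, independent of Emby, is to let ffmpeg itself attempt a hardware decode to a null sink. This is a sketch; the input filename is a placeholder for any 10-bit (Main 10) HEVC sample.

```shell
# Probe whether the driver can hardware-decode a clip via D3D11VA by
# decoding to a null sink (no output file is written). If the driver
# lacks Main 10 support, ffmpeg will report an error for that stream.
ffmpeg -hwaccel d3d11va -i input_hevc10.mkv -f null -
```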


raudraido

> @@raudraido - AMD has been very late to the game of hardware video acceleration. [...] If you don't get 10bit HEVC decoding in both cases, please contact AMD for further support.

 

Thank you for the clarification.

As I understand it, I have DXVA and the latest Radeon drivers enabled, yet I still do not get hardware decoding of 10-bit HEVC.

So I'll need to figure it out with AMD.

 

5d68e1647c423_Capture.jpg


raudraido

> Have you tried the Intel decoder? I'm not sure whether 530 supports 10bit, though.

It does not support 10-bit. The next generation (starting from Kaby Lake) supports HEVC 10-bit decode and encode as well.


raudraido

Which choice is better, purely from an Emby perspective: changing the CPU to a Kaby Lake, which supports HEVC 10-bit, or changing the GPU to an Nvidia 1660 Ti?


One thing to know is that in both cases the video acceleration is done by dedicated units in the hardware, so you cannot compare them based on 3D performance.

Both are good choices (and better than AMD).


BAlGaInTl

> Which choice is better, purely from Emby perspective, change CPU to Kabylake which supports HEVC10bit or change gpu to nvidia 1660ti?

> One thing to know is that in both cases, the video acceleration is done by dedicated units in the hardware [...] Both are good choices (and better than AMD).

 

One thing to keep in mind is that there is a limit of two streams (I believe) when using Nvidia consumer cards to decode. The Intel chip won't have that limitation.

 

ETA: You may be able to get around the Nvidia limitation, but it will involve flashing a different firmware I believe.


Q-Droid

> @@raudraido - AMD has been very late to the game of hardware video acceleration. [...] If you don't get 10bit HEVC decoding in both cases, please contact AMD for further support.

 

Is D3D11VA available and would it make a difference with the AMD decoding?


pwhodges

> ETA: You may be able to get around the Nvidia limitation, but it will involve flashing a different firmware I believe.

 

It's a driver patch, which has to be selected to match the exact version of the driver.

 

https://github.com/keylase/nvidia-patch


softworkz

> Is D3D11VA available and would it make a difference with the AMD decoding?

 

@@Luke - I think we should display the detected hw coder capabilities in the UI.

 

It's a bit awkward that we let users find out by trial and error which capabilities their hardware supports.

Or even worse: users post their hardware logs and we have to look through them to tell them what their hardware can do...


Q-Droid

> @@Luke - I think we should display the detected hw coder capabilities in the UI. [...] we need to look through them to tell the user about his hardware's capabilities...

 

Sorry, I wasn't paying attention. DX11VA and D3D11VA are the same, right? I noticed the details in the posted image above after my post and your response.


softworkz

> Sorry, I wasn't paying attention. DXVA2 and D3D11VA (= DX11VA) are the same, right? I noticed the details in the posted image above after my post and your response.

 

These are two different APIs on the side of the application that wants to use them, but in both cases you're accessing the same set of decoders implemented by the graphics driver.

The most important difference is that DXVA2 is based on DirectX 9 and requires an active desktop session with video memory allocated (= requires a monitor to be connected).

D3D11VA does not have this restriction (= it works on headless machines), but it can be a little bit slower.

That's why we have both in Emby.
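From the command line, the difference between the two paths is just the `-hwaccel` flag; the same decoders in the driver sit behind both. A sketch with a placeholder input file:

```shell
# Same decode job through either API; only the -hwaccel flag differs.

# DXVA2: DirectX 9 based, needs an active desktop session (monitor attached).
ffmpeg -hwaccel dxva2 -i input.mkv -f null -

# D3D11VA: also works on headless machines, possibly a little slower.
ffmpeg -hwaccel d3d11va -i input.mkv -f null -
```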


Q-Droid

> It's two different APIs for the side of the application that wants to use it [...] That's the reason why we're having both in Emby.

 

Thanks for the info. I wrote DX11VA because it is listed in the HEVC decoder list image posted by raudraido. I wasn't sure if that was the Emby UI equivalent of D3D11VA, but this is over my head, so I'll leave it be.


DXVA2 is an official name, but there is no official name for the DirectX 11 variant. So DXVA2 is one side, and everything with "11" in it is the other. Sorry for the confusion.


(1 month later...)
raudraido

With the latest 4.3.0.12 beta, my RX 560 started to decode HEVC 10-bit.

I think it's because of this fix:

* DXVA: Fix 10bit profile detection

