
Hardware Transcoding help for Nvidia RTX A4000 card



milindmody

Hi. I am a new user. I just started using Emby two weeks ago, and I must say I am blown away by the software as well as the active community and the great support from the Emby team. This made me buy the Premiere license. Hopefully it will help contribute to the further development of this great software.

These are my emby server details:

Windows Server 2016 Standard edition with 128 GB RAM, 11th-gen Intel i7-11700K @ 3.60 GHz

12 x 24-bay hard disks (12 TB each) with a 10 Gigabit LAN connection. This serves multiple TVs and devices and a home theatre. At any one time, 4 to 5 devices are generally watching content.

I purchased an Nvidia RTX A4000 card and was checking the Advanced settings. I am a little confused because each setting shows at least 2 to 3 options.

Preferred Hardware Decoders

MPEG-2

NVDEC NVIDIA RTX A4000 - MPEG-2
 
DX11VA NVIDIA RTX A4000 - MPEG-2
 

VC-1

NVDEC NVIDIA RTX A4000 - VC-1
 
DX11VA NVIDIA RTX A4000 - VC-1
 

H.264 (AVC)

NVDEC NVIDIA RTX A4000 - H.264 (AVC)
 
DX11VA NVIDIA RTX A4000 - H.264 (AVC)
 

H.265 (HEVC)

NVDEC NVIDIA RTX A4000 - H.265 (HEVC)
 
DX11VA NVIDIA RTX A4000 - H.265 (HEVC)
 
CUVID NVIDIA RTX A4000 - H.265 (HEVC)
 

VP8

NVDEC NVIDIA RTX A4000 - VP8
 

VP9

NVDEC NVIDIA RTX A4000 - VP9
 
DX11VA NVIDIA RTX A4000 - VP9
 

Preferred Hardware Encoders

H.264 (AVC)

NVENC NVIDIA RTX A4000 - H.264 (AVC)
 

H.265 (HEVC)

NVENC NVIDIA RTX A4000 - H.265 (HEVC)
H264 encoding preset: Auto (options: veryslow, slower, slow, medium, fast, faster, veryfast, superfast)
Choose a faster value to improve performance, or a slower value to improve quality.
 
H264 encoding CRF:
The Constant Rate Factor (CRF) is the default quality setting for the x264 encoder. You can set the values between 0 and 51, where lower values would result in better quality (at the expense of higher file sizes). Sane values are between 18 and 28. The default for x264 is 23, so you can use this as a starting point.
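For reference, this is roughly how the preset and CRF settings map onto x264 flags in a plain ffmpeg command. A minimal sketch for illustration only; the file names are placeholders and this is not the exact command line Emby builds internally:

import subprocess

# Illustrative only: how "preset" and "CRF" translate to x264 options in a
# standalone ffmpeg encode. File names are placeholders, not Emby's own command.
preset = "medium"   # faster preset = quicker encode, slower preset = better compression
crf = 23            # 0-51; lower = better quality and bigger files; 18-28 is the sane range

subprocess.run([
    "ffmpeg",
    "-i", "input.mkv",       # placeholder source file
    "-c:v", "libx264",       # software H.264 (x264) encoder
    "-preset", preset,
    "-crf", str(crf),
    "-c:a", "copy",          # pass audio through untouched for this test
    "output.mkv",            # placeholder output file
], check=True)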

 

Enable HDR tone mapping:

Which option would be best for me for HDR tone mapping?

-----

My primary usage is watching 4K Remux content through Emby on my 160-inch blackroom theatre screen.

But I also have a lot of old TV episodes and old DVDs from my collection which my family watches. I have attached my media stats with this post.

Any help here would be appreciated. If not, that's fine too. Please continue doing the great work here. Cheers!!

Stats.jpg


milindmody

I am primarily confused by the multiple options which show up for hardware transcoding. Should I keep them all on, or select just one? Which one? If all, what should the priority be? I have gone through some other threads about transcoding and checked some links on the Nvidia website, but I do not have a clear answer.


GrimReaper

If you have no issues, just leave it on Auto. Though, with your CPU, you realistically had no need for a discrete GPU to begin with. If at some point you do run into issues, then we can take it from there, but typically NVENC/NVDEC at the top will do the job if you're keen on setting it up under Advanced.


  • 2 weeks later...
pir8radio

I have the same question, but I'm looking for more of an explanation for me and future people doing this same search.

I recall a while back someone told me to ONLY enable CUVID, and I have started to see HDR transcode issues (skipping etc.). So I tested each decoder one at a time and found DX11VA seems to give me the highest FPS conversion, NVDEC the second fastest, and CUVID the slowest transcode speed. So I set mine up like below...

 

My question for the video/transcode experts is: can you please explain the pros and cons of each method? As I understand it, decoding/encoding is (for the most part) done on the hardware, and the options below are the software handoffs: DX11VA being something by Microsoft, NVDEC being Nvidia's software, CUVID (no clue). So which is better for normal 1080p stuff? Which is better for HDR HEVC 10-bit stuff? When would Emby pick a different decoder from the one in the first position (like when would ffmpeg or Emby move to NVDEC, and why)? Should I have different decoder orders for different video types (H.265 vs MPEG-2)? Any other tips for me or others searching?

[attached screenshot: decoder priority settings]
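For anyone repeating that test, a rough way to compare the three decode paths is to run the same clip through ffmpeg with only the decoder changed and watch the fps/speed it reports. A minimal sketch, assuming an ffmpeg build with d3d11va, nvdec, and cuvid support; the sample file name is a placeholder:

import subprocess

# Rough decode-speed comparison: decode the same clip with each hardware path
# and compare the fps/speed ffmpeg prints. Assumes an ffmpeg build that has
# d3d11va, nvdec and cuvid compiled in; "sample_hevc.mkv" is a placeholder.
SAMPLE = "sample_hevc.mkv"

tests = {
    "DX11VA": ["ffmpeg", "-hwaccel", "d3d11va", "-i", SAMPLE, "-f", "null", "-"],
    "NVDEC":  ["ffmpeg", "-hwaccel", "nvdec", "-i", SAMPLE, "-f", "null", "-"],
    "CUVID":  ["ffmpeg", "-c:v", "hevc_cuvid", "-i", SAMPLE, "-f", "null", "-"],
}

for name, cmd in tests.items():
    print("---", name, "---")
    # The null muxer throws the decoded frames away, so the run measures
    # decode speed only; ffmpeg reports progress and speed on stderr.
    subprocess.run(cmd, check=False)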


Luke

23 minutes ago, pir8radio said:

My question for the video/transcode experts is: can you please explain the pros and cons of each method?

The choice of cuvid vs nvdec is something you'll just have to try and see what works best for your hardware. The choice exists because it is difficult to detect automatically which one will work better in all cases, but when in doubt, trust the defaults.


GrimReaper

Edit: Started typing and Luke already answered in the meantime; still, I think I can expand on that a bit.

Far from being an 'expert', but I can throw my 2c in here.

I reckon the best answer you'll ever get is: it depends.

There's no "one-scenario-fits-all" solution; on some systems one setup works better, on others another does, and there's hardly a universal recommendation besides doing a bit of experimenting. For the most part, Auto does a good job, and further fine-tuning is required only if artifacts or issues are occurring.

58 minutes ago, pir8radio said:

I recall a while back someone told me to ONLY enable CUVID

 

58 minutes ago, pir8radio said:

CUVID (no clue)

I don't know why you would do that these days, as CUVID (NVIDIA's old API, by the way) is actually deprecated and superseded by NVDEC/NVENC:

https://docs.nvidia.com/cuda/video-decoder/index.html

My guess is it's here mostly for legacy purposes, though again, on some systems it might also give better performance, as stated previously.

58 minutes ago, pir8radio said:

As I understand it decoding/encoding (for the most part) is done on the hardware

It's actually done in software by default, but if one has a Premiere subscription it'll be hardware accelerated in the order you put there; if one of them fails (for various reasons, the ffmpeg implementation, for example), it'll move down the list, with software decoding/encoding being the last fallback if they all fail.
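Conceptually the priority list behaves like a simple fallback chain, something along these lines (an illustrative sketch only, not Emby's actual code; every name here is made up):

# Illustrative sketch of the fallback behaviour described above, NOT Emby's
# real code: walk the configured decoders in priority order and fall back to
# software if every hardware path fails. All names are made up.

def try_decoder(name: str, stream: str) -> bool:
    """Placeholder: pretend to open the stream with the given decoder."""
    print(f"trying {name} for {stream}")
    return False  # simulate failure so the chain keeps moving down the list

def pick_decoder(stream: str, preferred: list[str]) -> str:
    for name in preferred:        # the order set in the Emby UI
        if try_decoder(name, stream):
            return name
    return "software"             # last-resort fallback

print(pick_decoder("movie.mkv", ["NVDEC", "DX11VA", "CUVID"]))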

Quote

and found DX11VA seems to give me the highest FPS conversion

Conversion speed should not be your only criterion, as it may (and likely will) affect quality; the general consensus is that pure software conversion actually gives better quality (though hardware acceleration has been gaining ground lately, with Intel doing a really good job with QSV), so it's up to each individual to find the balance between speed and quality that best fits their needs/preferences. Again, that's achievable only through a bit of experimenting.
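One way to do that experimenting with actual numbers is to encode the same source with the hardware and software encoders and score each result against the original using ffmpeg's ssim filter. A sketch only, assuming an NVENC-capable ffmpeg build; file names are placeholders:

import subprocess

# Sketch of a speed-vs-quality experiment: encode the same source with NVENC
# and with x264, then score each encode against the original via the ssim
# filter. Assumes an NVENC-capable ffmpeg build; file names are placeholders.
SOURCE = "source.mkv"

encodes = {
    "nvenc_test.mkv": ["-c:v", "h264_nvenc", "-cq", "23"],
    "x264_test.mkv":  ["-c:v", "libx264", "-preset", "medium", "-crf", "23"],
}

for out, opts in encodes.items():
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *opts, "-an", out], check=True)
    # Compare the encode (first input) against the source (second input);
    # the SSIM score prints to stderr, closer to 1.0 = closer to the original.
    subprocess.run(
        ["ffmpeg", "-i", out, "-i", SOURCE, "-lavfi", "ssim", "-f", "null", "-"],
        check=False,
    )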

 


pir8radio
3 hours ago, GrimReaper said:

Conversion speed should not be your only criterion, as it may (and likely will) affect quality; it's up to each individual to find the balance between speed and quality that best fits their needs.


Understood... I'm looking for speed. I have multiple streams going out. My home stuff is direct played/streamed, but if someone's internet or phone requires transcoding, then I assume HD at 5 Mbps is good enough; better than Netflix in most cases. I have never really seen noticeable quality issues, and if I need 4K I'm not transcoding anyway. Good info, thanks for responding.

