
Emby not respecting H264 encoding preset setting when using hardware encoding (Intel Quick Sync)



snake98

I noticed in the logs that Emby is not respecting the H264 encoding preset setting. When using hardware acceleration it always uses h264_qsv -preset 7, and according to Intel, 7 is the fastest setting with the biggest quality drop.

The CPU is an Intel Core i7-8700K.

 

Log

ffmpeg.exe -analyzeduration 3000000 -fflags +igndts -c:v h264_qsv  -i "http://127.0.0.1:8096/LiveTv/LiveStreamFiles/7d781adb17ca4640879c92a5b50bef79/stream.ts" -map_metadata -1 -map_chapters -1 -threads 0 -sn -codec:v:0 h264_qsv -preset 7 -look_ahead 0 -b:v 2616000 -maxrate 2616000 -bufsize 5232000 -profile:v high -level 4.1 -force_key_frames "expr:gte(t,n_forced*3)" -vf "scale=trunc(min(max(iw\,ih*dar)\,1280)/2)*2:trunc(ow/dar/2)*2" -flags -global_header -vsync cfr -codec:a:0 aac -strict experimental -ac 2 -ab 384000  -f segment -max_delay 5000000 -avoid_negative_ts disabled -start_at_zero -segment_time 3  -individual_header_trailer 0 -segment_format mpegts -segment_list_entry_prefix "hls/7aadbd1a838da4d6a936911d25fd8688/" -segment_list_type m3u8 -segment_start_number 0 -segment_list "z:\transcoding-temp\7aadbd1a838da4d6a936911d25fd8688.m3u8" -y "z:\transcoding-temp\7aadbd1a838da4d6a936911d25fd8688%d.ts"
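For comparison, here's what the relevant part of that command would look like with a higher-quality target usage. As far as I can tell, -preset on h264_qsv accepts either the numeric target-usage values (1 = best quality, 7 = best speed) or the named presets, so a hand-edited fragment would be (everything else unchanged):

-codec:v:0 h264_qsv -preset slower -look_ahead 0 -b:v 2616000 -maxrate 2616000 -bufsize 5232000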

Setup

 

[Screenshot: hardware encoding settings]

[Screenshot: Intel documentation of the QSV presets]


snake98

It's currently only used with software encoding.

Is there a way to change the setting used from 7 to 2 behind the scenes?


snake98

It's something to consider for the future. Thanks.

I created a pull request to add this option. I'm not a professional programmer, but I believe it follows your style. Let me know if you need anything changed, or if it can go into the next beta. If someone else wants to test it, let me know; you just need to replace 3 DLLs, and it works on the current branch.

 

https://github.com/MediaBrowser/Emby/pull/3193


  • 2 weeks later...
snake98

Is getting that code put into the beta a possibility? I've been testing it and it works fine. If anyone else wants to test it, let me know; I can upload the 3 DLLs that were changed.


cybergrimes

+1 this all day. I'm a big fan of QSV, so more options for quality control would be great.


snake98

@Luke

If this is a problem let me know and I'll remove it, or should it be its own topic?

This is a test build that maps the QSV quality to the H264 encoding preset setting; Emby currently hard-codes it to 7 (veryfast) and it can't be changed.

To install

Replace the 3 DLLs in the \Emby-Server\system directory with the ones in the link below. You have to reapply them after every update.
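For anyone doing this by hand: with the Emby server stopped, it's just a file copy from a command prompt. The download folder below is made up for illustration; adjust both paths to your install.

copy /Y C:\Downloads\emby-preset-dlls\*.dll "\Emby-Server\system\"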

 

 

Allowed settings are:
veryfast
faster
fast
medium
slow
slower
veryslow

Any other choice will default to Emby's veryfast setting.

These were built against server 3.1.1.0 and tested on Windows.

https://drive.google.com/drive/folders/1n-I1PXXL87rCi4hDpdqTjym2M6Xt90mJ

 

 

The code change is here: https://github.com/MediaBrowser/Emby/pull/3193

Please let me know if you have any problems, or whether it works on Linux.


This is the same for AMF and NVENC, where it's not respecting the defaults set for the encoder.

Actually, what would be best is to have the normal CPU settings PLUS a section specific to the HW encoder chosen.


cybergrimes

It looks like on-the-fly DVR transcoding doesn't use these settings either? I set it to 20 but I see crf=23.0 in the log.
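For reference, that crf value is libx264's constant-quality setting; if the option were being picked up, the software-encode command would carry something like this hand-written fragment instead of the default:

-codec:v:0 libx264 -crf 20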


It looks like on-the-fly DVR transcoding doesn't use these settings either? I set it to 20 but I see crf=23.0 in the log.

 

This option will go away at some point due to its instability, so I don't think there is much need to do new development with it.


The option for Live TV, or do you mean the option to set the defaults we each want to use?

I sure hope you don't mean the latter, as I'm sure many of us don't use the defaults.


snake98

This is the same for AMF and NVENC, where it's not respecting the defaults set for the encoder.

Actually, what would be best is to have the normal CPU settings PLUS a section specific to the HW encoder chosen.

I can write the code, but I can't test it, and the mapping doesn't seem to be as straightforward as Intel's.

-preset            <int>        E..V.... Set the encoding preset (from 0 to 11) (default medium)
     default                      E..V.... 
     slow                         E..V.... hq 2 passes
     medium                       E..V.... hq 1 pass
     fast                         E..V.... hp 1 pass
     hp                           E..V.... high performance
     hq                           E..V.... high quality
     bd                           E..V.... blue ray disk
     ll                           E..V.... low latency
     llhq                         E..V.... low latency hq
     llhp                         E..V.... low latency hp
     lossless                     E..V.... lossless  
     losslesshp                   E..V.... lossless high performance

I believe lossless overrides bitrate control on NVENC, so do we just want to map the bd, slow, medium, and fast options? I can't seem to find good documentation.

Is bd higher quality than the hq preset? Should we add an option to do a high-performance 2-pass?

 

I believe we can ignore the low-latency presets, as they're more for game streaming and teleconferencing and they lower picture quality.
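That preset table looks like ffmpeg's built-in encoder help; anyone who wants to check what their own build exposes can run the line below (the exact list varies by ffmpeg version and driver):

ffmpeg -hide_banner -h encoder=h264_nvenc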


cybergrimes

The option for Live TV, or do you mean the option to set the defaults we each want to use?

I sure hope you don't mean the latter, as I'm sure many of us don't use the defaults.

 

He quoted me, so he should be talking about the DVR auto-transcode feature. I think he has mentioned some kind of post-record feature to replace it eventually (unless I misunderstood).


Got ya. Yes, that makes a lot of sense and is a solid way to go. You would also get much better H.264 compression by post-processing it as well.


I believe lossless overrides bitrate control on NVENC, so do we just want to map the bd, slow, medium, and fast options? I can't seem to find good documentation.

Why would you map it?

 

The idea would be to show what is supported by the hardware, not to try to translate it back to CPU nomenclature.

So for NVENC, just show the options exactly as they are here; these options would translate exactly to what we see in the ffmpeg options.


Even if you have GPU encoding enabled, there are cases where the CPU will be used. That's why the mapping makes sense.


cybergrimes

Hey, I saw the change is in 3.3.1.6 and it's working for me (I can see the medium setting passed in the log instead of ultrafast).

I did notice the deinterlacing options for standard/bob & weave are missing from the bottom of the page; is that an expected change?


Yes, that's by design due to the amount of troubleshooting it causes. You can still edit the setting in the config file.


snake98

@Luke

 

If this is acceptable, I'll write the code and ask for some beta testers in another thread. I'll map it as follows (a quick illustration of the resulting command lines is below):

slow, slower, veryslow = slow
medium = medium
all others = fast
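So with that mapping, the ffmpeg command line would end up carrying one of these (hand-written fragments just to illustrate; the rest of the arguments stay as Emby builds them):

-codec:v:0 h264_nvenc -preset slow     (UI set to slow, slower, or veryslow)
-codec:v:0 h264_nvenc -preset medium   (UI set to medium)
-codec:v:0 h264_nvenc -preset fast     (anything else)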

Even if you have GPU encoding enabled, there are cases where the CPU will be used. That's why the mapping makes sense.

That would be fine.

 

What I'm saying is to have defaults for BOTH the CPU and the HW transcoding. I know I'd prefer to have different settings between the two, so mapping them to appear the same isn't good. For example, I'd want to use a better preset for the hardware but may want fast or even veryfast for the CPU.

 

With NVENC, for example, using one of the better profiles can cut file size nearly in half (less bandwidth as well) and comes much closer to CPU sizes without a huge impact on time (especially compared to CPU overhead). But if it's mapped, then we can't have a slow setting for hardware and a fast or veryfast for CPU.

 

With NVENC we only get 2 streams on consumer GPUs, so it's best to set this up to keep files/bitrates with better compression and picture quality; it has little impact on the system. You can tune this to get much better quality and still easily do better than real-time with two simultaneous encodes. Not fine-tuning the HW encode is just a waste, since it can only do 2 streams and won't even use 50% of the GPU.

 

Hopefully that makes sense. So basically, in a nutshell, many of us would want to use something like the slow or veryslow equivalents via hardware but want to use fast or veryfast with the CPU. We'll try to always use the HW first if possible and fall back to the CPU when we have to, but each would be tuned the way we need it on our system.

 

Make sense?


  • 2 weeks later...
roberto188

Hopefully that makes sense. So basically, in a nutshell, many of us would want to use something like the slow or veryslow equivalents via hardware but want to use fast or veryfast with the CPU.

 

Having the flexibility to set both the CPU and Hardware profiles would be ideal.


roberto188

 

@Luke

If this is acceptable, I'll write the code and ask for some beta testers in another thread.

I'll test it. 


roberto188

Luke, how is the bob & weave option inserted into the encoding config file?

Never mind.

<DeinterlaceMethod>bobandweave</DeinterlaceMethod>

