Tone-mapping in transcoding HDR for playback on SDR screens??



17 minutes ago, raudraido said:

I made a test with my hardware. NVENC is quite near to the original.

Full size pictures - https://raud.cloud/s/SB9YzwXLSMrNkQ8

[screenshot 1: NVENC - 1-nvidia-nvenc.jpg]

Hable?

Any chance you could do the same for the other Tone Mapping Algorithms, if it's not a super amount of trouble? It would be great to see the difference these algorithms make with different GPUs, done on the same system by the same person.


raudraido
27 minutes ago, cayars said:

Hable?

Any chance you could do the same for the other Tone Mapping Algorithms, if it's not a super amount of trouble? It would be great to see the difference these algorithms make with different GPUs, done on the same system by the same person.

Yes, the tests were done with Emby's default settings, which means Hable.
I could do that later, but I should mention that I used two different PCs.
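
For reference, "Hable" refers to the filmic curve John Hable published for Uncharted 2, which the Hable option in FFmpeg-based tone mapping (and hence Emby's default) is named after. Below is a minimal sketch of that curve using the commonly published constants; the actual filter implementations normalize and handle color differently, so treat this as an illustration only.

```python
def hable_partial(x, A=0.15, B=0.50, C=0.10, D=0.20, E=0.02, F=0.30):
    # John Hable's "Uncharted 2" filmic operator with its published constants
    return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

def hable_tonemap(x, white_point=11.2):
    # Normalize so the chosen linear white point lands exactly at SDR peak (1.0)
    return hable_partial(x) / hable_partial(white_point)

if __name__ == "__main__":
    # Bright HDR values are compressed smoothly instead of clipping hard
    for v in (0.25, 1.0, 4.0, 11.2):
        print(f"linear {v:5.2f} -> SDR {hable_tonemap(v):.3f}")
```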


Two PCs isn't that big a deal.  Same media, same person doing the tests is more important IMHO.


raudraido
1 hour ago, cayars said:

Two PCs isn't that big a deal.  Same media, same person doing the tests is more important IMHO.

I did some tests. Results are here: https://raud.cloud/s/6fJ3NDcmoLTQGX4

I noticed that software Hable was almost identical to the original source, but blurry; Emby somehow lowered the bitrate even though all tests were done with the 4K-120Mbps bitrate setting.

[screenshot: 00-native.png]


23 hours ago, softworkz said:

But if there's demand, I can bring back those tweaking parameters that we had during the initial testing.

These will be back in the next beta!


vdatanet
On 5/1/2021 at 12:26 PM, softworkz said:

These will be back in the next beta!

Adjusting the Desaturation Strength value I get a result very close to what I want. But why is this value not persistent after server restart?


33 minutes ago, vdatanet said:

Adjusting the Desaturation Strength value I get a result very close to what I want. But why is this value not persistent after server restart?

The Diagnostics plugin provides settings for Diagnosing, Troubleshooting and Testing!

With all the available settings, you could easily and quickly configure your system to death. That's a bad situation for both users (no joy) and us (support time). Also, Emby is not intended to be one of those applications where you (as a user) need to solve problems by finding some hint deep down in a long forum conversation, where you are told to create some undocumented registry or config file entry. Instead, issues are supposed to be fixed in a way where fixes are applied automatically whenever possible. The Diagnostics plugin allows us to identify issues and verify fixes more quickly and easily when that needs to be done interactively with the user.

The purpose of the Diagnostics Plugin is NOT to provide "additional options" for daily use, and as such it's also not the ultimately intended way of setting tone mapping options. The status quo is different, but you must not forget that this is still in the beta channel at this time. On the backend side, a certain set of tone mapping options has already been deemed useful for regular use, and those are already persisted, even though you can still access them only via the Diagnostics plugin.

For other tone mapping options, things are undecided. I had actually assumed that they wouldn't really be required; now we've brought them back to gather additional feedback.


1 hour ago, vdatanet said:

Adjusting the Desaturation Strength value I get a result very close to what I want.

You need to test this with a wide variety of videos. Trying just one or a few videos to determine "your value" usually won't hold up for long once you watch more and more different videos. That's one of the reasons why I had considered that value not particularly useful.
(Also, the behavior of the desaturation parameter is very different among the various implementations, so you won't be able to get predictable results in cases of multiple hardware accelerations or hardware-to-software fallback.)
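
To illustrate the concept: a highlight-desaturation step blends very bright pixels toward their own luminance, so clipped highlights fade to white instead of shifting hue. The sketch below is purely illustrative; the parameter's name, scaling and even its direction differ between the software, OpenCL, CUDA and VAAPI paths, which is exactly why a single Desaturation Strength value doesn't transfer between them.

```python
def desaturate_highlights(r, g, b, strength):
    # Conceptual sketch only - real implementations differ in thresholds,
    # scaling and color space handling.
    luma = 0.2627 * r + 0.6780 * g + 0.0593 * b   # BT.2020 luma weights
    overbright = max(r, g, b) - 1.0                # excess above SDR peak (1.0)
    if strength <= 0.0 or overbright <= 0.0:
        return r, g, b
    mix = min(overbright * strength, 1.0)          # brighter pixel -> stronger blend
    return tuple(c * (1.0 - mix) + luma * mix for c in (r, g, b))

# A strongly overbright reddish highlight drifts toward neutral white:
print(desaturate_highlights(4.0, 1.2, 0.8, strength=0.5))
```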

Honestly, I'm skeptical, but I'm interested in more feedback.

I would just like to ask all of you who are willing to play with those params: please don't post something after testing one or two videos. Test this with a wider range of HDR videos before coming to a conclusion.

Thanks.


vdatanet
4 minutes ago, softworkz said:

With all the available settings, you could easily and quickly configure your system to death

I agree.

4 minutes ago, softworkz said:

Now we brought them back for gathering additional feedback

My final feedback is (using an i5 8600K on Linux - VAAPI):

  1. Software tone-mapping bt.2390 is the best setting for me, but my CPU works like a dog (see the sketch after this list)
  2. Native VAAPI is too dark
  3. OpenCL is too light; I have to increase Desaturation Strength to get close to software tone-mapping
  4. Using QuickSync, I get green artifacts
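
For anyone wondering what the bt.2390 option refers to: it is the EETF from the ITU-R BT.2390 report, which passes everything below a knee point through unchanged and only rolls off the top of the range with a Hermite spline, which is why it preserves mid-tones so well. A minimal sketch of that roll-off follows, operating on PQ-encoded values normalized to the source peak; black-level handling, the PQ transfer functions and color processing are omitted, and this is not Emby's actual code.

```python
def bt2390_eetf(e1: float, max_lum: float) -> float:
    # e1: PQ-encoded signal, normalized so the source peak is 1.0
    # max_lum: target display peak in the same normalized PQ domain
    ks = 1.5 * max_lum - 0.5                   # knee start
    if e1 < ks:
        return e1                               # shadows and mid-tones pass through unchanged
    t = (e1 - ks) / (1.0 - ks)
    return ((2 * t**3 - 3 * t**2 + 1) * ks      # Hermite spline roll-off of the highlights
            + (t**3 - 2 * t**2 + t) * (1.0 - ks)
            + (-2 * t**3 + 3 * t**2) * max_lum)

# Example: mapping toward a display whose peak sits at 0.75 of the source's PQ range
for e in (0.2, 0.6, 0.9, 1.0):
    print(f"{e:.2f} -> {bt2390_eetf(e, 0.75):.3f}")
```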

 


vdatanet
5 minutes ago, softworkz said:

You need to test this with a wide variety of videos

Yes, I noticed that. Depending on the video there are more suitable values; it is very difficult to choose a single value for all. In general, increasing this value improves the result, but some videos require a greater value, some a lower one, and in some videos the default is good.


It would also be interesting to know whether your conclusions are the same when trying some of those demo videos: https://4kmedia.org/tag/hdr/

Especially the Swordsmith video will teach you that algorithmic tone mapping should rather be a last resort than an essential part of your media library and viewing setup...

 


niallobr
2 hours ago, vdatanet said:

I agree.

My final feedback is (using an i5 8600K on Linux - VAAPI):

  1. Software tone-mapping bt.2390 is the best setting for me, but my CPU works like a dog
  2. Native VAAPI is too dark
  3. OpenCL is too light; I have to increase Desaturation Strength to get close to software tone-mapping
  4. Using QuickSync, I get green artifacts

 

Very similar findings here on 8700K Coffee Lake.

- Software bt.2390 looks great

- Native VAAPI is very dark and not really usable

- OpenCL has higher brightness, which can be adjusted nicely using the Tonemap Parameters, but all colours still look desaturated and there seems to be a green or yellow hue in the videos I've tested. Skin tones aren't looking natural.

- With QuickSync I just get a mix of colours on non-HDR videos (already reported)

Currently using Reinhard with Tonemap Parameter 0.7 and Desaturation Strength 0 (default).
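
For context, Reinhard is the simplest of these operators, and the parameterized ("extended") form below shows the kind of knob a single tonemap parameter typically controls. How Emby's Tonemap Parameter value of 0.7 maps onto the underlying filter isn't documented here, so the white-point interpretation in this sketch is an assumption for illustration only.

```python
def reinhard_extended(lum: float, white: float = 4.0) -> float:
    # lum: linear scene luminance, where 1.0 is SDR peak
    # white: linear luminance that maps exactly to SDR peak (assumed knob);
    # lower values clip highlights sooner, higher values compress more gently
    return lum * (1.0 + lum / (white * white)) / (1.0 + lum)

for lum in (0.5, 1.0, 2.0, 4.0):
    print(f"linear {lum:4.1f} -> SDR {reinhard_extended(lum):.3f}")
```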

Thanks for the link @softworkz, I'll try those out during the week.


cryzis

Has anyone tested on the Tiger Lake / Rocket Lake chips? Trying to determine if I can go with just that and skip the GPU.


3 minutes ago, cryzis said:

Has anyone tested on the Tiger Lake / Rocket Lake chips? Trying to determine if I can go with just that and skip the GPU.

There is (was) a bug in the Intel Media Driver for Tiger Lake CPUs, but I have worked with them to get it fixed. We are just waiting for their next official driver release to include it in our packages. Only VAAPI Native tone mapping is affected; everything else works (I'm sure, as I have a TGL here).

In general I'd say yes: no need for an external GPU when you have TGL (as long as you're not heading for some extreme loads).


cryzis
19 minutes ago, softworkz said:

There is (was) a bug in the Intel Media Driver for Tiger Lake CPUs, but I have worked with them to get it fixed. We are just waiting for their next official driver release to include it in our packages. Only VAAPI Native tone mapping is affected; everything else works (I'm sure, as I have a TGL here).

In general I'd say yes: no need for an external GPU when you have TGL (as long as you're not heading for some extreme loads).

That's awesome. Yeah, I am thinking 1-2 max; super curious how many one of these chips can actually pull off.
Shame the GPU market is broken right now, as I would pick up a Turing+ card just for the improved quality.

Thanks!


3 hours ago, cryzis said:

That's awesome. Yeah, I am thinking 1-2 max; super curious how many one of these chips can actually pull off.
Shame the GPU market is broken right now, as I would pick up a Turing+ card just for the improved quality.

We all know how Nvidia GPUs compare to Intel with regards to 3D and gaming performance. But when it comes to video hardware acceleration, the picture is totally different. With regards to encoding features and quality, Intel is by now clearly leading on the technical side and is pushing updates and innovation much harder than the others.
In terms of performance, it depends and varies. You can only do a device-by-device comparison, and then only with a focus on a certain ability. Encoding and decoding are done by specialized "chips" (rather, IC cores), so that's a whole different game than 3D performance. The latter does play a role, though, when it comes to the processing in between (scaling, deinterlacing, tone mapping, etc.); there the advantage usually goes to Nvidia where Intel doesn't have specific hardware support for it (which changes with every CPU generation).

Note 1: On the Intel side, I'm not speaking of VAAPI, but MSDK/QuickSync. 

Note 2: This is about the current situation in the industry from a technical viewpoint. It's not about what you might see in Emby currently. Our default hardware encoding settings are chosen to fit common use and compatibility, but there's a lot to explore and research. If somebody is interested, open a new topic and drop me a note; then I can provide some hints.

Note 3: I apologize for any confusion - this was not about tone mapping but about the quality of those hardware accelerations in general.


14 minutes ago, softworkz said:

With regards to encoding features and quality, Intel is clearly leading

In what way? Just curious, because it seems everyone using Nvidia is having no issues with quality and can tone map multiple 4K files at once. I've not seen anyone who has both prefer the tone-mapped version done in Intel HW over Nvidia HW.

I could be wrong on this, but I think people running Intel are getting lower fps as well, though maybe that's on older dies.

Seems like they may be a lot more "gung ho" but are playing catch-up.

Of course there are many ways to quantify what you said, so I'm just curious. Is this based on only the latest chipset, or where they are now going forward?
(I could see that.)


Sorry - that's a misunderstanding. @cryzis was asking about whether he should buy an Intel or Nvidia GPU, and my reply was just about the general quality of those hardware accelerations, not about tone mapping at all.

I'll add a Note 3.


For tone mapping results, it's more a competition between algorithms and their implementations than between vendors.
(The only vendor-provided tone mapping is Intel's "VAAPI Native Tone Mapping".)


niallobr
On 03/05/2021 at 09:42, softworkz said:

It would also be interesting to know whether your conclusions are the same when trying some of those demo videos: https://4kmedia.org/tag/hdr/

Especially the Swordsmith video will teach you that algorithmic tone mapping should rather be a last resort than an essential part of your media library and viewing setup...

 

I've been comparing these videos. It's definitely interesting to see tone mapping applied to different subjects in different lighting situations.

 

https://imgur.com/a/2Dr1spO

 

For my setup I think Reinhard with Tonemap Parameter 0.7 worked best. It would be interesting to see what others think and what other hardware will produce. I didn't bother to include software tone mapping since it is quite intensive for most systems; in my opinion software bt.2390 is better than all of these.

 

For reference the demos I used were:

Sony New York Fashion

Sony Swordsmith

LG Cymatic Jazz

Samsung and Red Bull See the Unexpected 

Note: the Samsung demo was insanely bright and had some strange colours going on (almost like 3D?), so maybe not relevant... It was the only one that looked interesting with the Native VAAPI HW option, since the scenes were so bright.


PontusN
7 minutes ago, softworkz said:

@PontusN The log doesn't show any fallback. It seems you didn't have the QuickSync encoder and decoder activated.

That's the weird thing, I do. As soon as I set desaturation to 0 again, it works, without changing anything else.

