
Tone-mapping in transcoding HDR for playback on SDR screens??


griffindodd
Solved by Luke

Recommended Posts

griffindodd

This is needed for transcoding, though. When direct playing to Kodi or ET it isn't really an issue; it matters when playing HDR material in any situation where transcoding is required. This string would only be applied when transcoding HDR material.

 

Yep, or it could also be a trigger to start transcoding for an SDR display. But if the end display is HDR, hopefully Emby can still pass the HDR content through even if it does have to transcode for other reasons (codecs, bandwidth, etc.).


griffindodd

But wouldn't it be better to not lose the HDR color? And negate the need for conversion? That's what I'd prefer.

HDR is strictly dynamic range; don't confuse it with 10- or 12-bit color, which can technically exist in SDR material too.

 


Guest asrequested


We use an app with mpv that uses PQ for HDR. It isn't strict. The color representation is perfect. I actually prefer it to sending the metadata to my display. The picture quality is exactly the same without waiting for the display to switch. Mpv uses ffmpeg, so PQ can definitely be used.

 

https://emby.media/community/index.php?/topic/53211-emby-theatre-support-for-evc/page-7&do=findComment&comment=521624
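
For the curious, mpv exposes this through a handful of options. Below is just a rough sketch of driving it from Python (the option values are illustrative, not necessarily what the app above uses, and the filename is a placeholder), showing the two cases: tone map down to SDR, or target PQ for an HDR-capable chain.

    # Sketch: mpv HDR handling (assumes mpv is on PATH; the filename is a placeholder).
    import subprocess

    # Case 1: SDR display - let mpv tone map the HDR (PQ/BT.2020) source itself.
    subprocess.run([
        "mpv",
        "--vo=gpu",               # GPU video output (needed for tone mapping)
        "--tone-mapping=hable",   # filmic tone-mapping curve for HDR -> SDR
        "--target-trc=bt.1886",   # force a standard SDR target transfer
        "my_hdr_movie.mkv",
    ])

    # Case 2: HDR display - target PQ so the signal stays in SMPTE 2084
    # (actual HDR passthrough still depends on the OS/driver/display chain).
    subprocess.run([
        "mpv",
        "--vo=gpu",
        "--target-trc=pq",
        "my_hdr_movie.mkv",
    ])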


griffindodd


Sounds like the implementation of the conversion is well within the scope of ffmpeg; it seems the rules and flags needed to trigger it are more involved, though.

 


Guest asrequested


Yeah, that's why I mentioned @Waldonnis. I'm hoping he's got a viable solution.


Guest asrequested

I guess the question would be: does the standard transcoding we already employ not satisfy?

 

It doesn't, if we want to allow transcoding of HDR content. 


griffindodd

I guess the question would be: does the standard transcoding we already employ not satisfy?

Not for HDR. Even now, if an HDR file has to be transcoded (even just for bandwidth or an audio codec change), it strips the metadata, so an HDR display will not recognize the content as HDR and switch to HDR mode; this results in the washed-out look again.

 

Add to that the lack of tone mapping for SDR displays, and it makes Emby all but useless for viewing HDR material unless it's simply direct playing to an HDR display.
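
For anyone who wants to see this on their own files, the quickest check is to ask ffprobe which color tags the transcoded file still carries. A rough sketch (assumes ffprobe is on PATH; "output.mkv" is a placeholder for whatever the transcoder produced):

    # Sketch: check whether a transcoded file still carries HDR10 color tags.
    import subprocess

    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            # HDR10 sources are normally tagged bt2020 / smpte2084 / bt2020nc
            "-show_entries", "stream=color_primaries,color_transfer,color_space",
            "-of", "default=noprint_wrappers=1",
            "output.mkv",
        ],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)
    # If these report bt709 (or nothing) after a transcode, the HDR signaling
    # has been lost, which is exactly what produces the washed-out look.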

 


griffindodd

These are technically two separate issues.

 

1. Transcoding needs to retain the HDR metadata so that HDR-compatible displays know the content is HDR and switch to the correct display mode (see the sketch below).

 

2. The Emby client checks whether the display is HDR compatible; if not, it tells the transcoder to engage tone mapping during the encode so it outputs accurate colors in a traditional SDR format.
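
On point 1, ffmpeg/x265 already has the options to write HDR10 signaling into a re-encode; the catch is that the mastering-display and content-light-level values have to be read from the source first. A minimal sketch of what such a transcode could look like (assumes ffmpeg built with libx265; the filenames and the metadata numbers are placeholders, not values from a real file):

    # Sketch: re-encode an HDR10 source while re-tagging the HDR10 signaling.
    import subprocess

    x265_params = ":".join([
        "colorprim=bt2020",        # BT.2020 primaries
        "transfer=smpte2084",      # PQ transfer function
        "colormatrix=bt2020nc",    # BT.2020 non-constant-luminance matrix
        # Placeholder mastering display / MaxCLL values - the real ones must be
        # copied from the source (ffprobe or MediaInfo can report them).
        "master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)",
        "max-cll=1000,400",
    ])

    subprocess.run([
        "ffmpeg", "-i", "hdr_source.mkv",
        "-c:v", "libx265",
        "-pix_fmt", "yuv420p10le",   # stay 10-bit
        "-x265-params", x265_params,
        "-c:a", "copy",
        "hdr_output.mkv",
    ], check=True)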

 


Guest asrequested

Again, I think it's better to use PQ; then those two things would be redundant. That would be my choice.


griffindodd

I have no idea what a perceptual quantizer is; I assume you can time travel with it, though, like a flux capacitor.

 


Guest asrequested


I guess you didn't read the link that I posted.


Jdiesel

OP is having issues with HDR material on Roku and Apple TV clients outside his network on non-HDR TVs. This doesn't apply to direct playing files on ET or Kodi.


griffindodd

All Emby clients attached to any SDR display

 


Guest asrequested

Ok, so all I'm suggesting is that, rather than having to augment every app to monitor every device and display, report back to the server, and have the server either transcode HDR to SDR if the device and/or display doesn't support HDR or pass the HDR metadata through to the display if it does, we employ PQ at the server transcode level. That will provide HDR coloring to everything, without complication. It's a lot less work and maintenance for the devs, and provides a better user experience.

 

What is Perceptual Quantization (EOTF)?

 

More detail on PQ
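
For what it's worth, PQ (the SMPTE ST 2084 EOTF those links describe) is just a fixed curve that maps a 0-1 code value to absolute luminance up to 10,000 nits. A minimal sketch of the math behind it, for anyone who doesn't want to wade through the articles:

    # Sketch: the SMPTE ST 2084 (PQ) transfer function - what "perceptual
    # quantizer" actually means. Constants are straight from the standard.
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    def pq_eotf(signal: float) -> float:
        """Map a normalized PQ code value (0..1) to luminance in cd/m^2 (nits)."""
        e = signal ** (1 / m2)
        return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

    def pq_inverse_eotf(nits: float) -> float:
        """Map luminance in cd/m^2 back to a normalized PQ code value."""
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    # PQ code 0.5 is only about 92 nits; the full 1.0 code value is 10,000 nits.
    print(pq_eotf(0.5), pq_eotf(1.0))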


griffindodd

Sounds like a reasonable approach, as the content is being transcoded anyway.

 

Of course, if the display DOES support HDR but a transcode is needed for bandwidth or codec reasons, then the transcode should try to keep everything intact, yes?

 


Guest asrequested

That would be good if it could be achieved. I'm just trying to limit how much work the devs have to do, while still offering a happy medium that most people can appreciate. How easy it would be for them to implement is not something I can really speculate on.


griffindodd

Yep, I'm in the same place: just trying to bring attention to the problem, but unfortunately I'm not able to create a fix/plugin myself due to lack of brains lol.

 

I'm seeing this issue come up a lot in the different media communities; getting out ahead of it could certainly grab a lot of market share for Emby away from the likes of Plex if the solution is built in.

 

HDR isn't going away, so a core solution is going to be needed by all streaming platforms.

 


  • 4 months later...
vick1982

Maybe I'm confused... but is Doofus saying that having an HDR display is just a bullshit gimmick and any regular display can look exactly the same using PQ?


Guest asrequested

No. There is a color gamut; 10-bit has a larger gamut than 8-bit, and you can't use a 10-bit gamut on an 8-bit display. But as HDR is metadata, and can only be directly applied by displays that can use the metadata, perceptual quantization can render the color gamut into the picture in place of the HDR metadata application. The quandary then becomes: what do you do with the extra colors that an 8-bit display doesn't have the gamut to show? mpv has the ability to render the metadata and tone map to an 8-bit gamut. The thought I had was that maybe a similar process could be used in transcoding. Right now, when an HDR movie is transcoded, the metadata is simply lost, and the color is 'washed out'.


Waldonnis

The thought I had was that maybe a similar process could be used in transcoding.

 

It can be.  You can actually tonemap with ffmpeg, but it's not as easy as one would think (there's a tonemap filter, but it requires some handholding for this type of task).  It's something I've been looking into on and off, but haven't really had time to dig into it.  I did find a blog entry about it a while ago (clicky) that was interesting and outlines some of the steps required...and the basic reasoning is sound.  I haven't bothered with doing any analysis of the output from his command lines yet to see if additional filters may be useful, though.  I also really want to see how some of the players implemented tonemapping and run some histogram comparisons (takes time, so it's a bit low on the list at the moment).
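
For reference, the handholding is roughly this: linearize the PQ signal, convert the BT.2020 primaries to BT.709, apply a tonemapping curve, then convert back to a normal 8-bit BT.709 encode. A rough sketch of that chain (assumes an ffmpeg build with the zscale/libzimg filter; filenames are placeholders, and this isn't necessarily identical to that blog's command lines):

    # Sketch: HDR10 (BT.2020/PQ) -> SDR (BT.709) tonemapping with ffmpeg's
    # zscale + tonemap filters.
    import subprocess

    vf = ",".join([
        "zscale=t=linear:npl=100",        # undo PQ: to linear light, 100-nit nominal peak
        "format=gbrpf32le",               # tonemap wants planar float RGB
        "zscale=p=bt709",                 # convert BT.2020 primaries to BT.709
        "tonemap=tonemap=hable:desat=0",  # filmic (hable) curve to compress highlights
        "zscale=t=bt709:m=bt709:r=tv",    # back to BT.709 transfer/matrix, TV range
        "format=yuv420p",                 # 8-bit 4:2:0 for ordinary SDR clients
    ])

    subprocess.run([
        "ffmpeg", "-i", "hdr_source.mkv",
        "-vf", vf,
        "-c:v", "libx264",   # plain SDR H.264 output
        "-c:a", "copy",
        "sdr_output.mkv",
    ], check=True)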


Guest asrequested

Interesting. I've bookmarked that for later reference. That may be too labor-intensive to add to the server... at least not without a warning. But maybe using h264 it might be ok.


Waldonnis

It may not be that much more intensive, actually, but I've never measured it.  It's getting it worked into the existing filterchain that requires some thought.  I also don't know what impact it would have on hardware encoding or what may need to be changed that way.  Ultimately, it's all math and is probably a lot less intensive than encoding is, so it shouldn't be all that bad.  I just haven't run benchmarks or any other analysis yet to see how much more of a burden it would be quite yet.  I'd also like to check for issues when downscaling, since I'd expect most non-HDR monitors are 1080p (and most HDR sources are 2160p).  I'd hate to introduce aberrations or unnecessary banding because the order of filterchain operations wasn't well thought out.

 

Of course, 10->8 bit tonemapping would be handy for h.264 Hi10p as well (mostly anime), so you'd see benefits with those types of 1080p sources.

