
jsc1205
Posted

I have bounced around between Plex and Emby for quite some time. When it comes to tone mapping for playback of 4K HDR content on a non-4K TV, is the tone mapping process exactly the same between Plex and Emby? Trying to evaluate the best bang for the buck.

ebr
Posted

Hi.  I have no knowledge of their process but I'm sure it is not.  We have expended quite a bit of effort that, I believe, gives us pretty much the best implementation around.  Why don't you try it out and let us know?

Welcome!

  • Like 2
sross44
Posted (edited)
5 hours ago, ebr said:

Hi.  I have no knowledge of their process but I'm sure it is not.  We have expended quite a bit of effort that, I believe, gives us pretty much the best implementation around.  Why don't you try it out and let us know?

Welcome!

This goes live in 4.6, is that correct?

Edited by sross44
Luke
Posted
35 minutes ago, sross44 said:

This goes live in 4.6, is that correct?

Yes.

  • Like 1
sross44
Posted
Just now, Luke said:

Yes.

Thought so! Thanks.... hoping it's released in the next day or so. Been waiting patiently for this one.

Carlo
Posted
59 minutes ago, sooty234 said:

libplacebo is probably the face of the future...

https://github.com/haasn/libplacebo#readme

What does that have to do with Emby Server tone mapping for clients that can't do HDR?

sooty234
Posted
29 minutes ago, cayars said:

What does that have to do with Emby Server tone mapping for clients that can't do HDR?

[screenshot attachment: Screenshot 2021-05-16]

P*** off muppet! If you don't understand, I'm not going to explain it to you.

GWTPqZp6b
Posted

Are you suggesting emby and plex both use this common library? If that's the case I would expect results to be identical, but I don't think they are, which suggests some custom development or tuning in at least one of them.

 

speechles
Posted (edited)

In the future, everybody would share and play nice. In the future we all share a common goal and the nastiness of the world is eliminated. There is only joy. The future is a perfect place. But the reality is anything but. There must be a difference and a race to be better, or you are stuck with the mundane. The same. A race to the bottom.

Instead what you have presently is a race to the top which is a very good thing. It inspires development wanting to be better.

I think what is meant was that the future is heading towards a unified experience where we all submit code to the same branch and play nice and we all evolve hardware tone mapping together. But we are certainly not there yet... It is a time-consuming (time is money) process to learn everything, and just giving it away (sharing code) so others can immediately catch up (close the gap) is not going to happen at this point. In a few years... maybe. Maybe that is the future in a few years.

Edited by speechles
  • Like 2
sooty234
Posted
2 minutes ago, GWTPqZp6b said:

Are you suggesting emby and plex both use this common library? If that's the case I would expect results to be identical, but I don't think they are, which suggests some custom development or tuning in at least one of them.

 

It isn't that simple. Some hardware has native support for tone mapping, and that is also being utilized. I can't speak for Plex, but emby is using ffmpeg's algorithms, which, in my opinion, are a little behind. libplacebo offers much greater potential, but it is a massive amount of code. I was just saying that libplacebo is probably what should be used in the not-too-distant future, if not now.
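[Editor's note: for readers unfamiliar with what "ffmpeg's algorithms" means here — stock ffmpeg's `tonemap` filter offers simple global curves such as `reinhard`, `hable`, and `mobius`. As an illustration of how simple these base curves are, here is a sketch of the (extended) Reinhard operator in Python. This is illustrative math only, not Emby's or ffmpeg's actual code:]

```python
def reinhard(l, peak=None):
    """Global Reinhard tone-mapping curve.

    `l` is scene luminance normalised so that 1.0 is SDR reference white.
    Without a known peak: out = l / (1 + l), which compresses unbounded
    HDR luminance into [0, 1). With a known content peak, the extended
    form is used so that `peak` maps exactly to 1.0.
    """
    if peak is None:
        return l / (1.0 + l)
    return l * (1.0 + l / (peak * peak)) / (1.0 + l)

print(reinhard(1.0))              # 0.5  (reference white lands mid-range)
print(reinhard(4.0, peak=4.0))    # 1.0  (content peak maps to SDR peak)
```

The whole argument about quality comes down to curves like this: a global operator is one line of math applied per pixel, while libplacebo-style approaches add peak detection, per-scene adaptation, and gamut handling on top.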

8 minutes ago, speechles said:

In the future everybody would share and play nice. The future is a perfect place. But the reality is anything but. There must be a difference and a race to be better, or you are stuck with the mundane. The same. A race to the bottom. We race to the top.

I think what is meant was that the future is heading towards a unified experience where we all submit code to the same branch and play nice and we all evolve hardware tone mapping together. But we are certainly not there yet... It is a time-consuming process to learn everything, and just giving it away so others can immediately catch up is not going to happen at this point. In a few years... maybe. Maybe that is the future in a few years.

Ha! Always the pacifist :) Cheers buddy 🍻

But some people just need a poke in the eye (metaphorically, of course) ;)  

Carlo
Posted
1 hour ago, sooty234 said:

[screenshot attachment: Screenshot 2021-05-16]

I understand what it's doing, but I'm asking why you commented about it. This would be fine on the client side but not on the server side.

I have not touched or tried that library, so I don't know what it's capable of, or whether it could be incorporated into the ffmpeg pipeline and support scaling and filtering and burning in subs and all the other crazy things the transcoder needs to do.

Then even if it could, it doesn't appear to support the same GPUs that Emby already supports, nor do we know what the speed would be like.

It would surely be interesting to see what it's capable of.

8 minutes ago, sooty234 said:

It isn't that simple. Some hardware has native support for tone mapping, and that is also being utilized. I can't speak for Plex, but emby is using ffmpeg's algorithms, which, in my opinion, are a little behind. libplacebo offers much greater potential, but it is a massive amount of code. I was just saying that libplacebo is probably what should be used in the not-too-distant future, if not now.

Ha! Always the pacifist :) Cheers buddy 🍻

But some people just need a poke in the eye (metaphorically, of course) ;)  

Emby isn't using ffmpeg tone mapping in the way you might think. Softworkz wrote all our own algorithms, and they are much faster than anything stock or anything the competition is doing. Not only that, but it works with multiple different GPUs and lots of different hardware. I know how much work and customization he put into this because I helped him test much of it before it hit our private testers, then beta. Heck, my Synology can tone map 4K using Emby and doesn't even need to downscale to 1080p. Running the "P" software, it struggles to tone map to 1080 and stutters pretty badly.

Ideally more clients will build in HW tone mapping (like the Shield TV), or software libs like you mentioned will be incorporated into the clients, so some of the grunt work can happen on the client vs the server. But that only works if the client can get the raw 4K HDR stream, which often can't be done remotely due to bitrate limits.

Definitely interesting times for 4K HDR media with 4.6 Server and many of the Emby Clients.

sooty234
Posted (edited)
45 minutes ago, cayars said:

 This would be fine on the client side but not on the server side.

And this is what you don't understand. Let me guess... You probably think it's only for mpv or a similar media player? How can you be this daft and still have a job? I don't have the patience to explain it... And from what I've read from some of the mpv devs, ffmpeg used some of mpv's development. One of them is quite pissed off about it and wishes that had been blocked.

But, on a more positive note, I'm happy to hear that softworkz wrote his own algorithms. He's quite brilliant. 

As for you Carlo, this is how I perceive you

[Beaker GIF attachment]

Edited by sooty234
Carlo
Posted

Did you just stop reading at that line, or did you actually read what it says, which talks about the obstacles to getting it integrated into ffmpeg and whether it would even be useful?

This is what I said:

49 minutes ago, cayars said:

I have not touched or tried that library, so I don't know what it's capable of, or whether it could be incorporated into the ffmpeg pipeline and support scaling and filtering and burning in subs and all the other crazy things the transcoder needs to do.

Then even if it could, it doesn't appear to support the same GPUs that Emby already supports, nor do we know what the speed would be like.

It would surely be interesting to see what it's capable of.

12 minutes ago, sooty234 said:

And this is what you don't understand. Let me guess... You probably think it's only for mpv or a similar media player? How can you be this daft and still have a job? I don't have the patience to explain it... And from what I've read from some of the mpv devs, ffmpeg used some of mpv's development. One of them is quite pissed off about it and wishes that had been blocked.

But, on a more positive note, I'm happy to hear that softworkz wrote his own algorithms. He's quite brilliant. 

No, I obviously understand it is its own library, else I wouldn't have made the comments I made about integrating it into ffmpeg. But just guessing, based on the fact it came from a client implementation, it would likely not have a lot of the functionality needed for a server implementation.

It should be clear from my answer that I know it's a library made available by mpv and constitutes the core rendering algorithms they use on the client when rendering video. Again, as I said, it would be interesting to see what use it could have and how fast it performs, but in general there is quite a difference between client rendering and server/stream rendering and transformation. The big difference is that you aren't displaying the data but instead are working with it in a pipeline to create a stream, and all pieces of the pipeline need to be able to run on the GPU in order to keep speeds high. As soon as any part of that pipeline needs to run in software, it slows things down quite a bit.

That's the problem with most "display libs": they aren't developed to be part of a streaming engine working with multiple streams; instead, they take a stream and modify it for display output. That's a much, much simpler process.
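[Editor's note: the pipeline-locality point above can be sketched as a toy model. Tag each filter with where it runs and count the GPU↔CPU frame copies (hwupload/hwdownload in ffmpeg terms) a chain implies; every software-only stage in the middle forces a round trip. This is purely conceptual, not Emby's transcoder:]

```python
def count_transfers(filters):
    """Count GPU<->CPU frame copies implied by a filter chain.

    `filters` is a list of (name, location) tuples, where location is
    "gpu" or "cpu". The decoder output and encoder input are assumed
    to live on the GPU, so every adjacent mismatch costs one copy.
    """
    locations = ["gpu"] + [loc for _, loc in filters] + ["gpu"]
    return sum(1 for a, b in zip(locations, locations[1:]) if a != b)

# A fully hardware chain needs no copies:
all_hw = [("scale", "gpu"), ("tonemap", "gpu"), ("burn_subs", "gpu")]
print(count_transfers(all_hw))   # 0

# One software-only stage in the middle forces a round trip:
mixed = [("scale", "gpu"), ("tonemap", "cpu"), ("burn_subs", "gpu")]
print(count_transfers(mixed))    # 2
```

This is why a display library that assumes it owns the final render target is hard to drop into a server pipeline: each copy across the bus happens per frame, for every concurrent transcode.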

sooty234
Posted
3 hours ago, cayars said:

Did you just stop reading at that line 

I think it's safe to say that a number of us don't read much of what you have to say.... because...well....you know....you're full of yourself....s**t I mean...

pwhodges
Posted

It seems to me that you are advocating replacing a known good block of software with another known good block of software which may or may not be (a) suitable or (b) as fast.

Do you perceive any specific advantages to doing this which would justify (a) the work involved in making the change and (b) the added reliance on others in an area where Emby is competitive?

Paul

  • Like 1
Posted

If you cannot be civil to the other members and staff here, we don't need your input.  Please follow the rules.

  • Like 1
sross44
Posted
5 hours ago, pwhodges said:

It seems to me that you are advocating replacing a known good block of software with another known good block of software which may or may not be (a) suitable or (b) as fast.

Do you perceive any specific advantages to doing this which would justify (a) the work involved in making the change and (b) the added reliance on others in an area where Emby is competitive?

Paul

I am curious to hear the response to this… what advantages do you see with regard to the work that is needed, etc.?

  • Agree 1
softworkz
Posted (edited)
14 hours ago, cayars said:

This would be fine on the client side but not on the server side.

13 hours ago, sooty234 said:

And this is what you don't understand. Let me guess... You probably think it's only for a mpv or similar media player? How can you be this daft and still have a job? I don't have the patience to explain it....

 

13 hours ago, cayars said:

  As soon as any part of that pipeline needs to run in software it slows things down quite a bit.

That's the problem with most "display libs" is that they aren't developed for being part of a streaming engine working with multiple streams, but instead are taking a stream and modifying it for display output. That's a much, much simpler process.

 

9 hours ago, sooty234 said:

I think it's safe to say that a number of us don't read much of what you have to say.... because...well....you know....you're full of yourself....s**t I mean...

 

Let's get things sorted a bit. First of all, @cayars is right in almost everything he said. For client libraries, it might not be easier with regard to the effort of implementation, but it's definitely easier to achieve good results. And that's simply because on the client side, you have much more information about the presentation display. Software running on the client can read out color profiles and capabilities of the display from the driver or via HDMI communication. Having that information, you can adapt to the setup much better and provide better results.

Another difficulty is the wide variety of file formats and ways to include HDR side data in a video stream. ffmpeg is about to add support for more standards (some more and some less official), and MPV is surely in the same boat, because MPV == ffmpeg + some additions for playback.

No doubt, libplacebo has quite a good reputation, but it's using Vulkan and OpenGL hardware contexts, and we're doing neither of these. (I don't know it in detail; that's why I can't say anything more about it.)

Anyway, that discussion is pointless. We don't need a single tone-mapping implementation; we need four at a minimum: software, Nvidia, VAAPI, and QuickSync. And those should, at least roughly, behave in a similar way, so that you don't get a totally different picture when transcoding switches between software and hardware.

Still, what we (and all of the competition, and ffmpeg itself) are doing here is 'semi-professional' at best, and more an approximation than a precise transform.

The primary difference is that our implementations are the fastest. 🙂 
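[Editor's note: for context on the range these implementations compress — HDR10 content encodes luminance with the SMPTE ST 2084 (PQ) transfer function, whose decoded values span 0 to 10,000 nits, while SDR reference white sits around 100 nits. A minimal sketch of the PQ decode step, using the constants published in the spec (illustrative only, not Emby's code):]

```python
# SMPTE ST 2084 (PQ) EOTF: map a normalised code value E in [0, 1]
# to absolute display luminance in nits (cd/m^2).
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(e):
    """Decode a PQ code value (0..1) to luminance in nits (0..10000)."""
    p = e ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

print(pq_eotf(0.0))   # 0.0     (black)
print(pq_eotf(1.0))   # 10000.0 (PQ signal peak)
# SDR reference white (~100 nits) sits at a code value of roughly 0.508,
# so nearly half of the PQ signal range lies above what SDR can show --
# that is the range every tone-mapping implementation has to compress.
```

The "behave in a similar way" requirement above follows from this: all four implementations decode the same PQ values, so if their compression curves diverge, switching between software and hardware transcoding visibly changes the picture.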

Edited by softworkz
  • Like 1
  • Thanks 1
