
Are there hardcoded limits for the auto quality setting?



Posted
33 minutes ago, rbjtech said:

A simple bandwidth test done just before playback simply works - AndroidTV/FireTV has done it for years - and 100% of my remote clients direct play on those devices. Using any other client - Samsung client, LG client, etc. - they get transcoded when I know it should direct play with ease.

I'm really not sure why Emby can't implement what AndroidTV does .. OR allow the Emby Admin to set a value in 'devices' to assist with the so-called 'auto' calculation .. reducing it to 'something low quality but likely to work' is really not an acceptable technical solution in 2024 imo.

I agree with this.

Seeing LG OLED TVs and Samsung TVs stream at 4-6 Mbps because users just think auto knows best feels like getting hit by a sledgehammer.
I've resorted to making special movie folders with images explaining it all, but users still trust an auto setting more than the person running the server.

At least I fixed my dad's quality problem, thanks to ebr and installing Emby for Android TV on his system. :)
Quote from him afterwards: "Told you auto works fine". 🤪

Posted

It's a no-win situation for any device that can move around, as either the WiFi or cell service can change drastically from the time the test was made. If you're walking around the house on a phone, tablet, Chromebook, or notebook and get too far from the AP, your bandwidth might not be high enough to use even though it was checked just a minute prior.

Even dedicated hardware clients that don't move can be affected by other WiFi traffic. Then there is the server bandwidth cap, if set, as well as the internet upload bandwidth. You could test and find out you're good for 8 Mbps of bandwidth, but if/when another remote user starts a stream, there is contention for the server upload bandwidth that could easily cause issues as well. A 10 Mbps upload is probably fine for most remote users, but if your ISP only gives you that much upload speed to start with, the amount of usable bandwidth affects multiple remote users once there is more than 1 or 2 (low bandwidth) streams.

Don't get me wrong, I'm sure there is room for improvement to handle some of these situations better, but the fact is, there is no guarantee the "tested bandwidth" will still be available once you've started the stream playing.

The only legit solution would be to use adaptive streaming, so it appears to the client that there are multiple resolution/bitrate streams available. The client could then make use of these multiple streams when requesting the next segments, which would allow it to dynamically change the resolution/bitrate based on how full the client-side buffer is. This would be pretty trivial to implement client side, compared to the changes needed on the server, if even possible, considering the different codecs we need to support, like TS, burning in subtitles, tone mapping, etc.
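
For what it's worth, the client-side half really is simple - something like this, where the variant ladder and buffer thresholds are purely illustrative numbers, not anything Emby actually uses:

```python
# A sketch of the buffer-driven decision described above: pick the bitrate
# for the *next* segment request from a fixed ladder of pre-encoded
# variants, based on how full the playback buffer is.

VARIANTS_KBPS = [1_500, 4_000, 8_000, 15_000]  # renditions, low to high

def pick_next_variant(buffer_seconds: float, current_kbps: int) -> int:
    if buffer_seconds < 5:
        # Buffer nearly drained: step down to the next lower rendition.
        lower = [v for v in VARIANTS_KBPS if v < current_kbps]
        return lower[-1] if lower else VARIANTS_KBPS[0]
    if buffer_seconds > 20:
        # Buffer comfortably full: try the next higher rendition.
        higher = [v for v in VARIANTS_KBPS if v > current_kbps]
        return higher[0] if higher else current_kbps
    return current_kbps  # steady state: keep what we have
```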

 

Posted
15 minutes ago, Carlo said:

It's a no-win situation for any device that can move around, as either the WiFi or cell service can change drastically from the time the test was made. If you're walking around the house on a phone, tablet, Chromebook, or notebook and get too far from the AP, your bandwidth might not be high enough to use even though it was checked just a minute prior.

Even dedicated hardware clients that don't move can be affected by other WiFi traffic. Then there is the server bandwidth cap, if set, as well as the internet upload bandwidth. You could test and find out you're good for 8 Mbps of bandwidth, but if/when another remote user starts a stream, there is contention for the server upload bandwidth that could easily cause issues as well. A 10 Mbps upload is probably fine for most remote users, but if your ISP only gives you that much upload speed to start with, the amount of usable bandwidth affects multiple remote users once there is more than 1 or 2 (low bandwidth) streams.

Don't get me wrong, I'm sure there is room for improvement to handle some of these situations better, but the fact is, there is no guarantee the "tested bandwidth" will still be available once you've started the stream playing.

The only legit solution would be to use adaptive streaming, so it appears to the client that there are multiple resolution/bitrate streams available. The client could then make use of these multiple streams when requesting the next segments, which would allow it to dynamically change the resolution/bitrate based on how full the client-side buffer is. This would be pretty trivial to implement client side, compared to the changes needed on the server, if even possible, considering the different codecs we need to support, like TS, burning in subtitles, tone mapping, etc.

 

 

I get dynamic switching is almost impossible and Emby is a personal media server after all.

But can we at least get the Emby for Android TV solution integrated into the Samsung and LG clients? I don't see how they could be used on mobile clients anyway.
Maybe a server side throttle switch to disable that feature in clients just in case something goes wrong.

It would be a nice fix for now.

rbjtech
Posted
13 hours ago, Carlo said:

It's a no-win situation for any device that can move around, as either the WiFi or cell service can change drastically from the time the test was made. If you're walking around the house on a phone, tablet, Chromebook, or notebook and get too far from the AP, your bandwidth might not be high enough to use even though it was checked just a minute prior.

Even dedicated hardware clients that don't move can be affected by other WiFi traffic. Then there is the server bandwidth cap, if set, as well as the internet upload bandwidth. You could test and find out you're good for 8 Mbps of bandwidth, but if/when another remote user starts a stream, there is contention for the server upload bandwidth that could easily cause issues as well. A 10 Mbps upload is probably fine for most remote users, but if your ISP only gives you that much upload speed to start with, the amount of usable bandwidth affects multiple remote users once there is more than 1 or 2 (low bandwidth) streams.

Don't get me wrong, I'm sure there is room for improvement to handle some of these situations better, but the fact is, there is no guarantee the "tested bandwidth" will still be available once you've started the stream playing.

The only legit solution would be to use adaptive streaming, so it appears to the client that there are multiple resolution/bitrate streams available. The client could then make use of these multiple streams when requesting the next segments, which would allow it to dynamically change the resolution/bitrate based on how full the client-side buffer is. This would be pretty trivial to implement client side, compared to the changes needed on the server, if even possible, considering the different codecs we need to support, like TS, burning in subtitles, tone mapping, etc.

 

All valid points - but I think Emby are missing the essential data on what is available in the first place.

If a quick test reveals that 40 Mbit download is available, then even if they halved that to 20 Mbit to be 'really conservative', even 4K would typically direct play.

If only 10 Mbit download was available and, again, it halved that, then transcoding is typically going to happen - but at least that's valid, based on a technical decision, not a guess.
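
In code terms, that rule of thumb is trivial (a sketch with illustrative numbers - not how any Emby client actually decides today):

```python
def choose_playback(measured_mbps: float, media_mbps: float) -> str:
    """Halve the measured bandwidth as a 'really conservative' safety
    margin, then direct play only if the media bitrate still fits."""
    budget_mbps = measured_mbps / 2
    if media_mbps <= budget_mbps:
        return "direct play"
    return f"transcode to <= {budget_mbps:.0f} Mbps"

# 40 Mbit measured -> 20 Mbit budget: a typical 4K stream direct plays.
print(choose_playback(40, 18))  # direct play
# 10 Mbit measured -> 5 Mbit budget: transcoding, but justified by data.
print(choose_playback(10, 18))  # transcode to <= 5 Mbps
```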

The point is - if you have the upload bandwidth (and are willing to use it - the Admin has the choice via the Emby user config) and the client has ample download bandwidth - then it should be used on fixed devices. Even on mobile, a 'reasonable' effort can be made to get in the right ballpark.

Emby need to utilise the newer transport technologies that are available to them - if FTTH is available, then I expect the Emby user wants to use the bandwidth they are paying for - and the remote user will not want to see some transcoded, bit-starved stream on their shiny new 4K TV.

Emby frankly need to up their game in this area - especially as they have a working model of it, with experience of how best to implement it (on AndroidTV/FireTV) - it just sounds like excuses to me as to why this is not implemented on the other 'TV'-based clients .. sorry.

  • Like 1
  • Agree 2
Riddler84
Posted
19 hours ago, ebr said:

Actually, technically, it is very, very different. Kind of as explained by Carlo, but a streaming service is not transcoding content on the fly to different bitrates or formats, and the server the content is coming from is owned and hosted by them, not you...

You could create 10-12 different copies of every item you have at different bitrates, and then our system will pick the one that most closely matches the current environment on each playback. It won't dynamically switch between them during playback, because most people don't do this, and so building that isn't of very high value for us right now.

We would like to eventually get to an adaptive streaming solution but, in reality, it isn't needed all that much with our typical usage.

I wasn't really talking about adaptive streaming. My point is simply that the auto setting isn't working the way users expect. It is counter-intuitive, because it suggests that it automatically chooses the best setting possible, considering the server-side limits that have been set and the user's personal internet connection.

But then there are people using the web client who could easily stream at 20 Mbps or more, because they have 200+ Mbps download, yet they are stuck with a maximum of 7 Mbps. I never see it go beyond that, and a lot of stuff has to be transcoded because of it, which increases the workload and power consumption of my server. I have limited every single stream to 12 Mbps, but web client users never reach it, while Fire TV users always reach it (if the media has a higher bitrate, of course).

And sometimes I see users streaming at 2 Mbps, or it even goes down to 768 kbps. And they just watch it that way, which is ridiculous, but they do. A simple restart of the media would probably fix this, but they won't do it, for whatever reason. Maybe because they think it's all they get. People don't like changing quality manually, because nowadays they are used to not needing to adjust such things by hand, and to always getting the best quality possible in their environment.

So, in my opinion, there has to be something to take these users by the hand. Like a bandwidth check every few minutes or so that checks whether the current quality setting is still the best one for the current situation, and then shows a message like: "We have detected that you could choose a higher quality. Would you like to change now to: 1080p (15 Mbps)?" And if they choose yes, just change the quality like you would manually. And the same for reducing the quality - though maybe you should just do that automatically to prevent buffering.

Or, for a start, just implement the same technique that you use for the Fire TV Android app in the other clients and get rid of these hardcoded values. At least do an initial bandwidth scan and set the quality accordingly. This could still lead to situations where a user is stuck with a low quality setting because the test failed, or the connection was bad exactly when the test happened, but it would still be an improvement.
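
As a sketch of that periodic check (everything here - the callback names, thresholds, and interval - is hypothetical, not an existing Emby API):

```python
import time

def watch_bandwidth(measure_mbps, prompt_user, set_quality_mbps,
                    current_mbps: float, interval_s: int = 300):
    """Hypothetical 'take the user by the hand' loop: re-test bandwidth
    every few minutes, offer an upgrade, and downgrade automatically to
    prevent buffering.  All three callables are placeholders."""
    while True:
        available = measure_mbps() * 0.5  # keep conservative headroom
        if available > current_mbps * 1.5 and prompt_user(
                f"We have detected that you could choose a higher quality. "
                f"Switch to {available:.0f} Mbps?"):
            current_mbps = set_quality_mbps(available)
        elif available < current_mbps:
            # Downgrade without asking, so playback never stalls.
            current_mbps = set_quality_mbps(available)
        time.sleep(interval_s)
```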

  • Like 1
  • Agree 1
Posted
7 hours ago, rbjtech said:

All valid points - but I think Emby are missing the essential data on what is available in the first place.

If a quick test reveals that 40 Mbit download is available, then even if they halved that to 20 Mbit to be 'really conservative', even 4K would typically direct play.

If only 10 Mbit download was available and, again, it halved that, then transcoding is typically going to happen - but at least that's valid, based on a technical decision, not a guess.

The point is - if you have the upload bandwidth (and are willing to use it - the Admin has the choice via the Emby user config) and the client has ample download bandwidth - then it should be used on fixed devices. Even on mobile, a 'reasonable' effort can be made to get in the right ballpark.

Emby need to utilise the newer transport technologies that are available to them - if FTTH is available, then I expect the Emby user wants to use the bandwidth they are paying for - and the remote user will not want to see some transcoded, bit-starved stream on their shiny new 4K TV.

Emby frankly need to up their game in this area - especially as they have a working model of it, with experience of how best to implement it (on AndroidTV/FireTV) - it just sounds like excuses to me as to why this is not implemented on the other 'TV'-based clients .. sorry.

I agree - your points are all valid as well.

Certain devices/platforms have more functionality that allows Emby to pick a resolution/bandwidth to stream with that works better than on other devices. There also seem to be two different schools of thought on bandwidth use:

1. Make the most of the bandwidth available, which would favor direct play when available, or using the highest transcode resolution/bandwidth it can.
2. Conserve bandwidth (especially for the server) to make the best use of the limited bandwidth available.

Many of the "problem" clients seem to fall into #2 when the admin would prefer they defaulted to #1 when using the client auto setting. What might help a lot would be a new config setting on the network settings page, used for remote users: "Remote Client Bandwidth Optimization". It could have a dropdown with the 2 choices above, or maybe a 3rd that's somewhere between the two, using a reasonable upper bitrate limit in the 8 Mbps range. This would allow the admin to set up the server for remote users in a way that makes the most of the resources available to the server.

The Emby client would then use this global setting, if auto were active, to choose the res/bitrate fitting the criteria. Instead of defaulting to a "conservative" amount of bandwidth as it seems to do now, causing extra transcoding that wasn't needed, it could default to using a higher amount of bandwidth, giving the user a better-quality stream as well as often skipping the need to transcode in the first place, saving server resources for better use.

At present, there are multiple settings and environmental conditions that affect the client's choice, but typically the decision seems to be too conservative for many systems. Having this new global setting would allow the server to start the decision process with a low, medium, or high bitrate default that wouldn't be lowered unless the bandwidth check determines there isn't enough bandwidth to support the global setting, at which point the client would pick the best-quality resolution/bitrate for transcoding.
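
Roughly the kind of thing I mean (just a sketch - the setting name, tiers, and default numbers are all made up, not anything in Emby today):

```python
from enum import Enum
from typing import Optional

class RemoteBandwidthOptimization(Enum):
    """The hypothetical 'Remote Client Bandwidth Optimization' dropdown."""
    MAXIMIZE_QUALITY = 1    # favor direct play / highest transcode bitrate
    BALANCED = 2            # middle ground, ~8 Mbps ceiling
    CONSERVE_BANDWIDTH = 3  # today's conservative behavior

# Illustrative starting bitrates (kbps) for each tier.
DEFAULT_KBPS = {
    RemoteBandwidthOptimization.MAXIMIZE_QUALITY: 40_000,
    RemoteBandwidthOptimization.BALANCED: 8_000,
    RemoteBandwidthOptimization.CONSERVE_BANDWIDTH: 4_000,
}

def starting_bitrate_kbps(setting: RemoteBandwidthOptimization,
                          tested_kbps: Optional[int] = None) -> int:
    """Start from the admin's global default; only lower it if a
    bandwidth check says the connection can't support it."""
    start = DEFAULT_KBPS[setting]
    if tested_kbps is not None and tested_kbps < start:
        return tested_kbps
    return start
```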

Just a thought, but I think this could be easily implemented and give the admin far more control and guidance on the remote client's default use of resources available to it.

Carlo

  • Like 1
  • Agree 1
Posted

Whatever is done to fix it - even a band-aid fix while a greater fix is being worked on - would be awesome. ;)
Like mentioned before, having the other clients use the Emby for Android TV method.

Happy2Play
Posted

And in the end, a good definition is needed for "Auto", as you will never please everyone and it will never work for everyone's connection the way they think it should. It does not mean what anyone thinks it should mean, and will not perform as anyone thinks it should, so just document it and be done with it. There are quite a few topics on this, and in the end they all circle the wagons.

Or simply rename it to something other than AUTO.

  • Agree 1
pwhodges
Posted

"Default" should do, as that's what it seems to be in most cases...

Paul

  • Agree 3
Posted
3 hours ago, pwhodges said:

"Default" should do, as that's what it seems to be in most cases...

Paul

I know my dad and other family members would see that as auto. I also know this from, e.g., looking in a BIOS: are there features I don't understand? Yes? Then leave them on default.

If a rename is what is going to happen, then for my server's use it would just be better to scrap the naming altogether and just set the starting quality, when a client is installed, to whatever quality the devs think is best.

That might at least make people curious about the feature instead of thinking whatever is written is best.

Posted
6 hours ago, pwhodges said:

"Default" should do, as that's what it seems to be in most cases...

Paul

"Guestimate" may be more accurate.
Sorry, couldn't help myself. :) 

I think "Default" would be better than "Auto" as well. 
"Auto" to me conveys the notion it will choose the best resolution/bitrate for each stream, which a end user could easily take as being the best or optimal choice.
"Default" conveys a value that will try to work for most people but doesn't convey it's the "best" or "optimal" setting.

Carl

  • Haha 1
  • Agree 1
Posted

I like "Default", then change "Auto" to "Random" and keep both as options.

 

  • Haha 2
Jdiesel
Posted

Conservative, Safe, Low, 

Neminem
Posted

I would call them all "trial and error, till you hit the sweet spot".

Posted

Default: Lowest (Please change me!!)

;)

Jdiesel
Posted

Potato 🥔 

embylad892746
Posted (edited)
On 10/5/2024 at 8:20 PM, Carlo said:

I agree - your points are all valid as well.

Certain devices/platforms have more functionality that allows Emby to pick a resolution/bandwidth to stream with that works better than on other devices. There also seem to be two different schools of thought on bandwidth use:

1. Make the most of the bandwidth available, which would favor direct play when available, or using the highest transcode resolution/bandwidth it can.
2. Conserve bandwidth (especially for the server) to make the best use of the limited bandwidth available.

Many of the "problem" clients seem to fall into #2 when the admin would prefer they defaulted to #1 when using the client auto setting. What might help a lot would be a new config setting on the network settings page, used for remote users: "Remote Client Bandwidth Optimization". It could have a dropdown with the 2 choices above, or maybe a 3rd that's somewhere between the two, using a reasonable upper bitrate limit in the 8 Mbps range. This would allow the admin to set up the server for remote users in a way that makes the most of the resources available to the server.

The Emby client would then use this global setting, if auto were active, to choose the res/bitrate fitting the criteria. Instead of defaulting to a "conservative" amount of bandwidth as it seems to do now, causing extra transcoding that wasn't needed, it could default to using a higher amount of bandwidth, giving the user a better-quality stream as well as often skipping the need to transcode in the first place, saving server resources for better use.

At present, there are multiple settings and environmental conditions that affect the client's choice, but typically the decision seems to be too conservative for many systems. Having this new global setting would allow the server to start the decision process with a low, medium, or high bitrate default that wouldn't be lowered unless the bandwidth check determines there isn't enough bandwidth to support the global setting, at which point the client would pick the best-quality resolution/bitrate for transcoding.

Just a thought, but I think this could be easily implemented and give the admin far more control and guidance on the remote client's default use of resources available to it.

Carlo

So is a bitrate fix for Admins added to the roadmap yet? Is something in Beta by early 2025 reasonable? @Luke?

I would love to get a definitive answer on this. I've been waiting for a bitrate detection fix / admin override for years, and it never seems to come, despite this topic having significant traction and being discussed every year. For me, it's the biggest pain point about Emby in what is otherwise fantastic software.

Edited by embylad892746
zaHrecsohbiw
Posted

 

On 10/5/2024 at 12:20 PM, yocker said:

Whatever is done to fix it - even a band-aid fix while a greater fix is being worked on - would be awesome. ;)
Like mentioned before, having the other clients use the Emby for Android TV method.

 

1 hour ago, embylad892746 said:

So is a bitrate fix for Admins added to the roadmap yet? Is something in Beta by early 2025 reasonable? @Luke?

I would love to get a definitive answer on this. I've been waiting for a bitrate detection fix / admin override for years, and it never seems to come, despite this topic having significant traction and being discussed every year. For me, it's the biggest pain point about Emby in what is otherwise fantastic software.

I think I'm struggling to understand what people actually want to achieve in this thread. It sounds like you all just want auto to behave for remote clients the way it does for local LAN clients? Couldn't you achieve that by adding "0.0.0.0/1, 128.0.0.0/1" to your "LAN networks" list in network settings?
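
(Those two /1 ranges between them cover the entire IPv4 address space, so every client address would count as LAN. A quick sanity check of that claim - the helper name here is just for illustration:)

```python
import ipaddress

# 0.0.0.0/1 and 128.0.0.0/1 split the whole IPv4 space in half, so
# together they match every possible client address.
LAN_NETWORKS = [ipaddress.ip_network("0.0.0.0/1"),
                ipaddress.ip_network("128.0.0.0/1")]

def treated_as_lan(addr: str) -> bool:
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in LAN_NETWORKS)

print(treated_as_lan("203.0.113.7"))   # True - a public address
print(treated_as_lan("192.168.1.10"))  # True - a private address
```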

Happy2Play
Posted (edited)
1 hour ago, zaHrecsohbiw said:

I think I'm struggling to understand what people actually want to achieve in this thread. It sounds like you all just want auto to behave for remote clients the way it does for local LAN clients? Couldn't you achieve that by adding "0.0.0.0/1, 128.0.0.0/1" to your "LAN networks" list in network settings?

One could make everything considered local, but then playback will most likely stutter, as the remote connection will not be able to handle the excessive bitrates the majority of the time on all that high-bitrate content, which will most likely never stream seamlessly.

Or one could say they just expect AUTO to do a better job of querying the connection speed and play things higher than the conservative coded rate does now.

Easy fix: all clients should default to Max, and the server admin will be required to throttle their users to a value that works for them and their remote users.

Edited by Happy2Play
  • Like 1
zaHrecsohbiw
Posted
12 minutes ago, Happy2Play said:

One could make everything considered local, but then playback will most likely stutter, as the remote connection will not be able to handle the excessive bitrates the majority of the time on all that high-bitrate content, which will most likely never stream seamlessly.

Or one could say they just expect AUTO to do a better job of querying the connection speed and play things higher than the conservative coded rate does now.

Easy fix: all clients should default to Max, and the server admin will be required to throttle their users to a value that works for them and their remote users.

I only brought it up because it seems like one of the goals here was to eliminate transcoding, which I took to mean that, whatever media they are providing, they've already determined it isn't enough to saturate their own upload speed or the download speeds of their users. Another user mentioned a band-aid, which is why I suggested it.

FWIW, I agree that this isn't a good idea, and I would not enable it for my own configuration. I don't even think it's a good idea for the people in this thread to use it, considering that it seems like their users don't notice or mind the reduced bitrate in the first place.

I know I would absolutely lose my mind if whatever show I was watching paused every 3 minutes to tell me that I _could_ switch to a higher resolution, only for some uncontrollable external factor to choke the connection afterwards. Give me the conservative 4 Mbps default every single time over _that_ option.

  • Agree 1
Posted
23 minutes ago, Happy2Play said:

One could make everything considered local, but then playback will most likely stutter, as the remote connection will not be able to handle the excessive bitrates the majority of the time on all that high-bitrate content, which will most likely never stream seamlessly.

Or one could say they just expect AUTO to do a better job of querying the connection speed and play things higher than the conservative coded rate does now.

Easy fix: all clients should default to Max, and the server admin will be required to throttle their users to a value that works for them and their remote users.

While that fix would present a whole new bunch of problems, like some people just trying the server once and then giving up when it doesn't work for them, it would be better than what we have now, as it would at least be easier to persuade people to use the quality settings.

Happy2Play
Posted (edited)
11 minutes ago, yocker said:

While that fix would present a whole new bunch of problems, like some people just trying the server once and then giving up when it doesn't work for them, it would be better than what we have now, as it would at least be easier to persuade people to use the quality settings.

That is why it is like it is - it just works. And then some say, "Hey, I don't like you limiting my quality." Well, you can manually adjust it to a higher value of your choosing that your network can handle, as Emby has no real mechanism for this at this time. AUTO does not work on every client remotely in every condition known to the internet, so it falls back to a known hardcoded value. To me, it seems pretty simple to just adjust the client quality to something that works for you, as there will never be a correct answer for everyone.

I don't really see controlling this from the server side ever happening either, but I am just a user like everyone else. This is why I tell all my users to set the quality to MAX, and I throttle them to share my low upload speed here on a normal home connection. But the Auto hardcoded value is technically more than enough here.

Edited by Happy2Play
  • Like 2
Posted
6 minutes ago, zaHrecsohbiw said:

I only brought it up because it seems like one of the goals here was to eliminate transcoding, which I took to mean that, whatever media they are providing, they've already determined it isn't enough to saturate their own upload speed or the download speeds of their users. Another user mentioned a band-aid, which is why I suggested it.

FWIW, I agree that this isn't a good idea, and I would not enable it for my own configuration. I don't even think it's a good idea for the people in this thread to use it, considering that it seems like their users don't notice or mind the reduced bitrate in the first place.

I know I would absolutely lose my mind if whatever show I was watching paused every 3 minutes to tell me that I _could_ switch to a higher resolution, only for some uncontrollable external factor to choke the connection afterwards. Give me the conservative 4 Mbps default every single time over _that_ option.

It's not about eliminating transcoding; if it were that simple, I would just disable it entirely.
I just want people to have the best possible experience, and it pains me to know people are watching extremely bad bitrates on their 4K OLEDs.
And by setting it high enough, they could get crisp, clear video, even with HDR, the majority of the time.

In the end, it's about user-friendliness and ease of use, I guess.

Having the settings "hidden" where they are now is probably not the best idea either; only one of my friends has actually found them on their own, and the others just thought everything worked automatically like, e.g., Netflix.

  • Agree 1
Posted
6 minutes ago, Happy2Play said:

That is why it is like it is - it just works. And then some say, "Hey, I don't like you limiting my quality." Well, you can manually adjust it to a higher value of your choosing that your network can handle, as Emby has no real mechanism for this at this time. AUTO does not work on every client remotely in every condition known to the internet, so it falls back to a known hardcoded value. To me, it seems pretty simple to just adjust the client quality to something that works for you, as there will never be a correct answer for everyone.

I don't really see controlling this from the server side ever happening either, but I am just a user like everyone else. This is why I tell all my users to set the quality to MAX, and I throttle them to share my low upload speed here on a normal home connection. But the Auto hardcoded value is technically more than enough here.

I don't want it controlled by the server admin either; that would be a mess. Just saying setting it to max would at least force people to look for the quality setting, though I actually think all of the few people I have given access to could handle max quality no problem.

I would love for something like what Emby for Android TV does to be universal across all TV clients, but even better, I would love for the UI to be more user-friendly, so less tech-savvy people would know how to use the settings.

Most users sadly think it's just another streaming service, but unlike those, Emby can't skirt around that with auto to ensure the best possible quality.

Posted
3 hours ago, Happy2Play said:

Easy fix: all clients should default to Max, and the server admin will be required to throttle their users to a value that works for them and their remote users.

This would be perfect if we could also throttle "local" users who happen to be connecting remotely through a home VPN, like some of us have set up as an alternative to SSL.
