"easy mode" quality options


MSI2017

Hi all,

So I'm having a hard time with transcoding on Emby. Sometimes it transcodes when it really shouldn't (reporting that the file exceeds the selected quality). But now, for example, a 4K HDR file on my iPad Mini 4 stutters like crazy, so I want to turn it back down to 1080p. However, selecting 1080p 60 Mbps would still allow 4K direct play since the file bitrate is 19 Mbps. This is on the local network.

Therefore I would suggest an 'easy mode' quality panel where it just shows resolution. I know my parents find this stuff really confusing as it shows a million options in their eyes. For some users, just getting an option list like YouTube's, where it shows certain resolutions and that's it, would be much more helpful.

 

Related to this, how many APIs do browsers have to tell Emby the capabilities of the device? Both internet speed and codec/HDR support? I've noticed stuff like transcoding H.265 to H.264 despite Chrome supporting it, playing back HDR on a device which definitely isn't HDR capable instead of tonemapping, etc. I do believe that quality is largely defined on the client settings page, which results in transcoding to a lower resolution. Any way to change this server side? They have the bandwidth, but still using transcoding is a bit wasteful. I'd rather have Emby try max quality and then dynamically shift to a lower resolution if it fails (kind of the opposite of Netflix, which is good since Netflix never ever kicks in 2160p for me).
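For what it's worth, here's a rough sketch of the capability probes browsers do expose (this is just what the web platform offers; I'm not claiming Emby's web client uses exactly these calls, and the HEVC codec string is only an example):

```ts
// Rough sketch of browser capability probes (illustrative; not Emby's actual client code).
async function probeCapabilities() {
  // Can this browser decode 4K HEVC smoothly? (Media Capabilities API)
  const hevc = await navigator.mediaCapabilities.decodingInfo({
    type: 'media-source',
    video: {
      contentType: 'video/mp4; codecs="hvc1.1.6.L153.B0"', // example HEVC codec string
      width: 3840,
      height: 2160,
      bitrate: 19_000_000, // ~19 Mbps, like the file above
      framerate: 24,
    },
  });

  // Is the current display HDR-capable? (CSS media feature)
  const hdrDisplay = window.matchMedia('(dynamic-range: high)').matches;

  // Coarse downlink estimate in Mbps (Network Information API; Chromium-only and capped).
  const downlinkMbps = (navigator as any).connection?.downlink;

  console.log({ hevc, hdrDisplay, downlinkMbps });
}
```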


GrimReaper
14 minutes ago, MSI2017 said:

But now, for example, a 4K HDR file on my iPad Mini 4 stutters like crazy, so I want to turn it back down to 1080p. However, selecting 1080p 60 Mbps would still allow 4K direct play since the file bitrate is 19 Mbps. This is on the local network.

Why would you select 60 Mbps for a 19 Mbps file and expect it to be transcoded? Select a lower quality, like 15 Mbps. On Android TV you can alternatively select Playback correction twice, which will trigger transcoding as well (the first selection triggers a Remux).

 

14 minutes ago, MSI2017 said:

Therefore I would suggest an 'easy mode' quality panel where it just shows resolution.

IMHO that would make no sense, as it's not resolution that's the defining factor in the quality selection menu but bitrate. If anything, all resolutions should be removed and only bitrate indicated.


MSI2017
Just now, GrimReaper said:

Why would you select 60 Mbps for a 19 Mbps file and expect it to be transcoded? Select a lower quality, like 15 Mbps. You can alternatively select Playback correction twice, which will trigger transcoding as well (the first selection triggers a Remux).

Because it is a 4K video, which seems to be choking the iPad. I just want it to be 1080p at the highest quality possible, therefore I am choosing the top 1080p option. I don't understand how resolution is not leading. I need it to be 1080p, but it just ignores my request since the server decided the bitrate should be leading. How is anyone expected to check the bitrate first in order to know which option to select?

 

Imagine me explaining this stuff to my mom: "Yeah, when it stutters a lot and doesn't decide to drop down by itself, make sure to check the file bitrate, go back to playback, and choose the first 1080p option with the number behind it being lower than the file bitrate." That seems like terrible UX. For me it's fine since I'm more tech-savvy (although I'd prefer the server handling transcoding a bit more intelligently).


MSI2017
12 minutes ago, GrimReaper said:

If anything, all resolutions should be removed and only bitrate indicated. 

I missed this edit. If this were to happen, I'd almost switch to Plex. And I hate Plex. There is a reason absolutely no other streaming platform does this; imagine if YouTube started doing it. Granted, I would love this in Netflix since it doesn't always go for the highest quality possible, but even then I'd rather just have the 4K option default to max bitrate and only drop down if the internet can't handle it or if it's beyond the device's capabilities.


Cheesegeezer

I kinda agree with both sides here. 
 

A non-tech person wouldn't know the capabilities of their network or internet connection, so the bitrate only adds to confusion for them.

On the other hand, displaying only resolutions may also confuse a non-tech person: they buy a 4K TV and immediately select 4K because they have a 4K TV.
 

What I think should be displayed is Network Quality:

  1. Excellent
  2. Very Good
  3. Good
  4. Average
  5. Poor
  6. Very Poor
  7. Dire

Then you could relate these to the number of bars on the Wi-Fi symbol, or add some kind of network speed indicator to let the user know the limits (see the rough sketch below).

A simple info line at the top of the selection window on devices could add extra guidance to a user on what to select.
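Purely as an illustration of what I mean (the tier names and bitrate cut-offs here are made up, not anything Emby actually has):

```ts
// Hypothetical mapping from friendly network-quality tiers to a max bitrate in Mbps.
// Tier names and numbers are invented for illustration only.
const networkTiers: Record<string, number> = {
  Excellent: 120,
  'Very Good': 80,
  Good: 40,
  Average: 20,
  Poor: 10,
  'Very Poor': 4,
  Dire: 1.5,
};

// Suggest the highest tier whose ceiling the measured connection can sustain.
function suggestTier(measuredMbps: number): string {
  const match = Object.entries(networkTiers).find(([, mbps]) => measuredMbps >= mbps);
  return match ? match[0] : 'Dire';
}

suggestTier(25); // -> 'Average'
```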


ebr

19 hours ago, MSI2017 said:

There is a reason absolutely no other streaming platform does this

Hi.  Unfortunately, that reason is just a marketing one.  TV manufacturers and content providers only advertise resolution so the general public thinks that is the main driver of quality when, in fact, bitrate really is.  A 4K video at 1Mb/s would be horrible quality and much worse than a 1080 one at 10Mb/s.  The bitrate is also the main factor in bandwidth that's needed to deliver the video.
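To put rough numbers on that (back-of-the-envelope math on my part, assuming 24 fps for both):

```ts
// Back-of-the-envelope bits-per-pixel comparison (assumes 24 fps; purely illustrative).
function bitsPerPixel(bitrateBps: number, width: number, height: number, fps = 24): number {
  return bitrateBps / (width * height * fps);
}

console.log(bitsPerPixel(1_000_000, 3840, 2160).toFixed(3));  // 4K at 1 Mb/s     -> ~0.005
console.log(bitsPerPixel(10_000_000, 1920, 1080).toFixed(3)); // 1080p at 10 Mb/s -> ~0.201
```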

So, the problem is that most users only understand resolution, but the system has to really key on bitrate and we have to come up with a good way of conveying that within our options.


MSI2017
7 hours ago, crusher11 said:

But resolution doesn't mean anything. Bitrate does.

I know that, and so do most members here, but people who are not familiar with this don't. I even know some people who are sort of tech-savvy who don't understand bitrate. I am not asking for a complete switch, just the option (although when playing a lower-bitrate 4K file, selecting the highest-bitrate 1080p should still go to the highest quality possible in 1080p, just for compatibility reasons).


MSI2017
4 minutes ago, ebr said:

So, the problem is that most users only understand resolution, but the system has to really key on bitrate and we have to come up with a good way of conveying that within our options.

Yes, my point exactly, or at least make it optional. In general I'd prefer the server making better choices so that users don't have to. That's really the main issue; otherwise the menu wouldn't even need to be an option (Netflix doesn't even have one, for example).


Mnejing
1 hour ago, ebr said:

Hi.  Unfortunately, that reason is just a marketing one.  TV manufacturers and content providers only advertise resolution so the general public thinks that is the main driver of quality when, in fact, bitrate really is.  A 4K video at 1Mb/s would be horrible quality and much worse than a 1080 one at 10Mb/s.  The bitrate is also the main factor in bandwidth that's needed to deliver the video.

So, the problem is that most users only understand resolution, but the system has to really key on bitrate and we have to come up with a good way of conveying that within our options.

So that's why you kind of lie to them. It's not a bad thing, it's just presenting it in a way that makes sense to them. There's not much point in arguing that bitrate IS the proper measure, because absolutely no one outside of a relatively niche group actually cares enough to understand that. It's hard enough explaining 720p vs 1080p vs 4K; throw bitrate in there and you might as well give them a book to read instead.

I think an option could be to present it in a simplified manner to the end user (720p, 1080p, 4K, etc.) and allow for manual configuration of what that means in the server config instead. You know, basically a definition of what 1080p means, perhaps with a minimum and maximum rate, maybe with bitrate tiers based on stated speed. I dunno, you can get all kinds of granular with it. But the important bit is that you're driving the configuration and presentation of options to a person who (presumably and ideally) understands how to tweak that (read: server admin). It's a pretty easy Premiere feature because it ties in with transcoding, and I'd wager people actually paying for a license are slightly more inclined to understand how to do that, or at least how to find the answer. Keep the whole thing off by default, and allow people who actually want that kind of control to have it.
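Something along these lines is what I'm picturing (field names and numbers are invented for illustration; this isn't an existing Emby config schema):

```ts
// Hypothetical admin-side definition of what each simple label means.
// Field names and values are invented for illustration; not Emby's actual config.
interface QualityPreset {
  label: string;       // what the end user sees in the menu
  maxWidth: number;    // resolution ceiling for transcoding
  maxHeight: number;
  minBitrate: number;  // bits per second
  maxBitrate: number;
}

const presets: QualityPreset[] = [
  { label: '4K',    maxWidth: 3840, maxHeight: 2160, minBitrate: 20_000_000, maxBitrate: 80_000_000 },
  { label: '1080p', maxWidth: 1920, maxHeight: 1080, minBitrate: 8_000_000,  maxBitrate: 20_000_000 },
  { label: '720p',  maxWidth: 1280, maxHeight: 720,  minBitrate: 3_000_000,  maxBitrate: 8_000_000 },
];
```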

It's work, I get it. But I personally feel like it'd go a ways towards solving some UX issues for the less technically inclined.


crusher11
2 hours ago, MSI2017 said:

I know that

If you know that, why are you trying to downrez a video that can direct play just fine?


MSI2017
57 minutes ago, crusher11 said:

If you know that, why are you trying to downrez a video that can direct play just fine?

How do you know it can direct play just fine? I mentioned in my OP that it cannot handle 4K playback


Cheesegeezer
13 minutes ago, MSI2017 said:

How do you know it can direct play just fine? I mentioned in my OP that it cannot handle 4K playback

That should boil down to selecting auto and Emby having a mechanism to determine the network speed against bitrate and making a decision based on that.

@ebr  is this in the ecosystem currently 


ebr

30 minutes ago, Cheesegeezer said:

is this in the ecosystem currently

Yes but it isn't foolproof.  For one thing, the test is at a single moment in time which may or may not give a good sustainable estimate.  Another situation is where people use a VPN or other methods to make it look like the connection isn't remote and we don't do the test on local connections.  That's why we provide the manual options.  But this all comes down to bitrate, not resolution and there isn't a perfect relationship between the two.

For instance, if we were to do what the OP actually suggested and boil this down, for 1080p I'd be inclined to use 20 Mbps as the bitrate equivalent, and that would have direct played his item in the example too.


Cheesegeezer
4 minutes ago, ebr said:

Yes but it isn't foolproof.  For one thing, the test is at a single moment in time which may or may not give a good sustainable estimate.
 

I completely get that and it makes sense.

4 minutes ago, ebr said:

Another situation is where people use a VPN or other methods to make it look like the connection isn't remote and we don't do the test on local connections.  That's why we provide the manual options.  But this all comes down to bitrate, not resolution and there isn't a perfect relationship between the two.

For instance, if we were to do what the OP actually suggested and boil this down, for 1080p I'd be inclined to use 20 Mbps as the bitrate equivalent, and that would have direct played his item in the example too.

And yes, like I said earlier, resolution is a confusing factor in selection. Bitrate against network speed is accurate, but it requires some user knowledge of what it means.
 

You can't stream a 17 Mb/s video if the network at the client can only see 8 Mb/s... you'll have issues for sure.


MSI2017
2 hours ago, Cheesegeezer said:

That should boil down to selecting auto and Emby having a mechanism to determine the network speed against bitrate and making a decision based on that.

@ebr  is this in the ecosystem currently 

Gigabit connection, and the device speedtests at around 60 MB/s (megabytes, not bits). The file is streaming from a 980 Pro NVMe SSD. It's just the iPad seemingly not being able to handle 4K (higher-bitrate 1080p files work fine).


MSI2017
1 hour ago, ebr said:

Yes but it isn't foolproof.  For one thing, the test is at a single moment in time which may or may not give a good sustainable estimate.  Another situation is where people use a VPN or other methods to make it look like the connection isn't remote and we don't do the test on local connections.  That's why we provide the manual options.  But this all comes down to bitrate, not resolution and there isn't a perfect relationship between the two.

VPN makes it even less user-friendly. But don't browsers have a lot of APIs to call on? Even going as far as HDR capabilities. For the app it might make more sense to just use that single-moment test (knowing it might not be sustainable) and drop the quality down a bit if it does buffer. Or the other way around, like Netflix does: start out a bit lower, do some tests, evaluate current playback whilst the content is playing at that lower res, and then kick in the higher quality. The problem here being that this would not work with Dolby Vision and HDR files. But I do think that Emby's playback and transcoding options could use an update, though I'll admit it's really tough.
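Browsers at least expose dropped-frame counts, so something like this could drive an automatic step-down (a rough sketch; the 25% threshold and the requestLowerQuality() callback are made up, not an existing Emby API):

```ts
// Illustrative watchdog: if too many frames were dropped recently, ask for a lower quality.
// The threshold and requestLowerQuality() are hypothetical.
function watchPlayback(video: HTMLVideoElement, requestLowerQuality: () => void): number {
  let lastDropped = 0;
  let lastTotal = 0;
  return window.setInterval(() => {
    const q = video.getVideoPlaybackQuality();
    const dropped = q.droppedVideoFrames - lastDropped;
    const total = q.totalVideoFrames - lastTotal;
    lastDropped = q.droppedVideoFrames;
    lastTotal = q.totalVideoFrames;
    // More than ~25% of frames dropped in the last window -> step down.
    if (total > 0 && dropped / total > 0.25) {
      requestLowerQuality();
    }
  }, 5000);
}
```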

 

1 hour ago, ebr said:

For instance, if we were to do what the OP actually suggested and boil this down, for 1080 I'd be inclined to use 20Mbs as the bitrate equivalent and that would have direct played his item in the example too.

I just find it weird that when my only goal is to kick the resolution down to 1080p (since my device just cannot handle that resolution, bitrate is fine), it just keeps on direct playing "because the content bitrate is in fact lower than the selected bitrate", despite it saying 1080p in front. I'm not an idiot, I pressed that button for a reason, but I get ignored. This should all be a non-issue if, as said above, the presentation can be looked at. The app didn't even seem to recognise that easily more than half of the frames were being dropped.


ebr

35 minutes ago, MSI2017 said:

since my device just cannot handle that resolution, bitrate is fine

The device must be reporting that it is 4K capable so I don't think the resolution should really be the issue.  The app would not allow it to direct play if the device said it couldn't handle 4k resolution.  That is to say that it is more likely the bitrate or something else that is causing the problem.

We always have a bias towards direct playing whenever possible because that's what most people want and it allows the server to deliver the most media at once.  So, if the device reports it can handle a certain bitrate/resolution we're always going to direct play as long as the media fits within those parameters.  So, in that case, the option labeling is deceptive, I'll give you that.
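Roughly, the decision works out to something like this (a heavily simplified sketch, not the actual server code):

```ts
// Simplified sketch of the direct-play bias described above; not Emby's actual server logic.
interface StreamRequest {
  fileBitrate: number;     // bits per second, from the media info
  fileWidth: number;
  fileHeight: number;
  maxBitrate: number;      // the "quality" the user selected, e.g. 60 Mbps
  deviceMaxWidth: number;  // what the client reports it can decode
  deviceMaxHeight: number;
}

function shouldDirectPlay(req: StreamRequest): boolean {
  const fitsBitrate = req.fileBitrate <= req.maxBitrate;
  const fitsResolution =
    req.fileWidth <= req.deviceMaxWidth && req.fileHeight <= req.deviceMaxHeight;
  return fitsBitrate && fitsResolution; // prefer direct play whenever both fit
}

// The OP's case: a 19 Mbps 4K file, "1080p 60 Mbps" selected, iPad reporting 4K decode support.
shouldDirectPlay({
  fileBitrate: 19_000_000, fileWidth: 3840, fileHeight: 2160,
  maxBitrate: 60_000_000, deviceMaxWidth: 3840, deviceMaxHeight: 2160,
}); // -> true, so it direct plays despite the "1080p" label
```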

Improvements can definitely be made in how this is presented.


Cheesegeezer
58 minutes ago, MSI2017 said:

Gigabit connection, and the device speedtests at around 60 MB/s (megabytes, not bits). The file is streaming from a 980 Pro NVMe SSD. It's just the iPad seemingly not being able to handle 4K (higher-bitrate 1080p files work fine).

Your gripe is not with Emby but with Apple. I would go and knock on their door.
 

I don’t deal with Bits/sec either… that’s for marketing idiots to make their products sound better than they are lol 😂 


MSI2017
30 minutes ago, ebr said:

The device must be reporting that it is 4K capable so I don't think the resolution should really be the issue.  The app would not allow it to direct play if the device said it couldn't handle 4k resolution.  That is to say that it is more likely the bitrate or something else that is causing the problem.

We always have a bias towards direct playing whenever possible because that's what most people want and it allows the server to deliver the most media at once.  So, if the device reports it can handle a certain bitrate/resolution we're always going to direct play as long as the media fits within those parameters.  So, in that case, the option labeling is deceptive, I'll give you that.

Improvements can definitely be made in how this is presented.

I think it is, and like @Cheesegeezer said, it's probably an iPad issue. But the dropped frames should be noticed by the app, right? Or is Apple limiting that info? Would not be surprised, lol.

 

I fully support the bias toward direct playing; in fact, I think it does not happen often enough for remote users (and weirdly, Chrome still gets H.264 despite there now being H.265 support).
