ebr (May 4, 2020):

> If they have their Shield set at 1080p because they want their 4K TV to upscale 1080p to 4K, then show them 1080p first, because that's what they want to see anyway.

No, that isn't what they want in that scenario. They want 1080p content to be upscaled by their TV, but they want 4K content sent unaltered, and we handle all of that with automatic switching. I imagine there is a way we can improve the logic to some extent, but it isn't always as cut and dried as it seems, and no other app does this at this time.
CBers (May 4, 2020):

You only have your Shield set to 1080 if you are connected to a 4K TV and set it manually, or if the Shield detects that the TV is only capable of 1080 and only offers those resolutions. For example, on my Shield connected to my 1080 TV, I only have the 1080 resolutions available to choose from, whereas on the Shield connected to my 4K TV, I have all resolution options available. The Shield detects what type of TV it is connected to and only offers those resolutions as options, so I think @FrostByte may be right in suggesting that the app should use that setting.
FrostByte (May 4, 2020):

Can the app then see whether 4K is even an option for the resolution setting? If 4K isn't listed as an option on the Shield, then you know the TV isn't 4K and you can list 1080p first.

Or you could use the detected resolution, rather than the resolution the user has their Shield set to, to decide what is shown first, if that's made available to you.
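FrostByte's second suggestion, ordering the version list by the detected display resolution rather than the user's chosen output resolution, could be sketched as pure ordering logic. This is only an illustration; the `Version` record, its fields, and `orderForDisplay` are hypothetical names, not Emby's actual code:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class VersionOrdering {
    // Hypothetical media-version descriptor: display name and height in pixels.
    record Version(String name, int height) {}

    // Sketch: versions the display can show natively come first,
    // highest quality within each group on top.
    static List<Version> orderForDisplay(List<Version> versions, int displayHeight) {
        List<Version> ordered = new ArrayList<>(versions);
        ordered.sort(Comparator
                // false (fits the display) sorts before true (exceeds it)
                .comparing((Version v) -> v.height() > displayHeight)
                // within each group, highest resolution first
                .thenComparing(Comparator.comparingInt(Version::height).reversed()));
        return ordered;
    }
}
```

With this ordering, a 1080p display would see the 1080p version listed first while a 4K display would see the 4K version first, without touching the user's manual output-resolution setting.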
ebr (May 5, 2020):

I'm pretty sure it can be improved, but there are some complex logic cases that need to be covered.
vdatanet (May 5, 2020):

> That's because the device you are running the app on can support that 4K content. The determination of the default will be the highest quality that can direct play. If you can find a situation where the default causes a transcode when a non-default wouldn't, let me know.

On my Apple TV 4K, I used to have two versions (1080p and 4K HEVC), but the 4K HEVC version was always played, with transcoding, instead of direct streaming the 1080p version. So we need to improve the default selection logic; I prefer 1080p over washed-out 4K.
vdatanet (May 5, 2020):

The same applies to the Nvidia Shield. I have a Shield in my bedroom connected to a 1080p display. It's true it can direct play 4K content, but it looks washed out. I think HDR should be taken into account when selecting the default version: if the display is not HDR capable, a non-HDR version should be set as the default. I don't know if you can query whether a display is HDR capable.
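vdatanet's proposed rule, pick the highest-quality version that can direct play and that actually matches the display (no HDR version on an SDR panel, no 4K version on a 1080p panel), could be sketched as a pure function. All names here (`Version`, `pickDefault`, the capability flags) are hypothetical illustrations, not the app's real API:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

public class DefaultVersionPicker {
    // Hypothetical version descriptor: height, HDR flag, and whether the
    // client can direct play it (container/codec supported, bitrate OK, etc.).
    record Version(String name, int height, boolean hdr, boolean canDirectPlay) {}

    // Among direct-playable versions, keep only those the display can
    // render faithfully, then take the highest resolution.
    static Optional<Version> pickDefault(List<Version> versions,
                                         int displayHeight, boolean displayHdr) {
        return versions.stream()
                .filter(Version::canDirectPlay)
                .filter(v -> displayHdr || !v.hdr())       // drop HDR on SDR displays
                .filter(v -> v.height() <= displayHeight)  // drop 4K on 1080p displays
                .max(Comparator.comparingInt(Version::height));
    }
}
```

Under this rule, vdatanet's Apple TV case (4K HEVC that can't direct play, 1080p that can) would default to the 1080p version instead of transcoding the 4K one.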
ebr (May 5, 2020):

> The same applies to the Nvidia Shield. I have a Shield in my bedroom connected to a 1080p display. It's true it can direct play 4K content, but it looks washed out. I think HDR should be taken into account when selecting the default version: if the display is not HDR capable, a non-HDR version should be set as the default. I don't know if you can query whether a display is HDR capable.

I don't believe there is any way to know if the display supports HDR.
FrostByte (May 5, 2020):

On my 2019 Shield, in the same place, it shows that it detected my TV's resolution as 4K and also says that my display is HDR10 ready. So the Shield knows it is connected to an HDR display. Not sure if you have access to everything the Shield detected, but one would hope so.
ebr (May 6, 2020):

> On my 2019 Shield, in the same place, it shows that it detected my TV's resolution as 4K and also says that my display is HDR10 ready. So the Shield knows it is connected to an HDR display. Not sure if you have access to everything the Shield detected, but one would hope so.

Yes, I think I can in 7.0+. Older Fire devices are always a problem with this, though (they run on 5.1.1).
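The "7.0+" remark lines up with the Android framework API: `Display.getHdrCapabilities()` exists from Android 7.0 (API 24), so an app can read the supported HDR types there but has nothing to query on older devices like Fire OS 5 (Android 5.1.1). A minimal sketch of the check, with the Android constant mirrored locally so the helper runs off-device (the `supportsHdr10` helper name is my own):

```java
public class HdrCheck {
    // Mirrors android.view.Display.HdrCapabilities.HDR_TYPE_HDR10 (value 2).
    static final int HDR_TYPE_HDR10 = 2;

    // On Android 7.0+ (API 24) the array would come from:
    //   int[] types = display.getHdrCapabilities().getSupportedTypes();
    // On older devices that API doesn't exist, so a caller would pass
    // null and the app would have to assume an SDR display.
    static boolean supportsHdr10(int[] supportedTypes) {
        if (supportedTypes == null) return false;  // pre-API-24 fallback: assume SDR
        for (int t : supportedTypes) {
            if (t == HDR_TYPE_HDR10) return true;
        }
        return false;
    }
}
```

In a real app this would sit behind a `Build.VERSION.SDK_INT >= 24` guard; the fallback path is exactly ebr's problem case on older Fire devices.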