bozrdnag 75 Posted May 27, 2021 (edited) I've just started playing around with transcoding with dynamic tone mapping and 4K videos. I usually have my friends and family set the bitrate to 5 Mbps, because my internet upload speed maxes out at 20 Mbps and I want to be able to support multiple users simultaneously. With a regular Blu-ray that looks good, with a pretty sharp image. But I've noticed that the same setting with 4K HDR looks noticeably blurrier, and I have to bump it to 8 Mbps to get approximately the same detail. Is this just a fact of life or a known issue, or do others not experience this? I'm using CPU transcoding and it keeps up fine, running around 25% while transcoding. I've tried the different tone mapping algorithms and that doesn't seem to change it. I've attached a few photos to demonstrate. UHD 5 Mbps (blurry) UHD 8 Mbps BD 5 Mbps Edited May 27, 2021 by bozrdnag
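(For readers unfamiliar with what the tone mapping step actually does: it compresses the HDR brightness range into the SDR range, and is separate from any resolution scaling the transcoder applies. Below is a rough Python sketch of the widely used Hable "filmic" curve, with the constants from John Hable's published operator. This is only an illustration of the general technique, not Emby's exact implementation.)

```python
# Hable "filmic" tone-mapping curve: maps HDR linear luminance into [0, 1].
# Constants are from John Hable's published Uncharted 2 operator; this is a
# generic illustration, not Emby's actual tone-mapping code.
A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
W = 11.2  # linear white point

def hable(x):
    return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

def tonemap(x):
    # Normalize so the white point maps to exactly 1.0.
    return hable(x) / hable(W)
```

Swapping between curves like this (Hable, Reinhard, etc.) changes how highlights roll off, but none of them touch the output resolution, which is consistent with what you're seeing.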
Luke 42078 Posted May 29, 2021 Hi there @bozrdnag, can we please look at a specific example? Please attach the information requested in how to report a media playback issue. Thanks!
bozrdnag 75 Posted May 31, 2021 Author (edited) I didn't do that because I don't know that it's a bug; I'm asking whether others have experienced this or whether it's expected. Anyway, here is a transcode log from playing back Aquaman 4K HDR with the quality set to 5 Mbps (1080p). A little experimenting while watching Stats for Nerds shows that when transcoding the regular Blu-ray at 5 Mbps, the playback resolution is reported as 1920x1080. Transcoding the UHD Blu-ray at the same setting, the playback resolution is reported as 720x405, and I have to increase the setting to 10 Mbps to get it to show 1920x1080. This is the same for every movie I try. ffmpeg-transcode-e05deb28-c06d-4240-bbec-de4d97d6424d_1.txt Edited May 31, 2021 by bozrdnag
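(To put some numbers on those Stats for Nerds readings: a common way to reason about why a transcoder scales down is bits per pixel, i.e. bitrate divided by width × height × frame rate. The sketch below just computes that ratio for the three combinations reported above, assuming a 24 fps source; it's an illustration of the trade-off, not Emby's actual scaling logic.)

```python
# Bits-per-pixel arithmetic behind the resolutions reported by Stats for Nerds.
# This is the standard bitrate / (width * height * fps) ratio, shown here as
# an illustration; it is NOT Emby's actual scaling heuristic.

def bits_per_pixel(bitrate_bps, width, height, fps=24.0):
    """Average bits available per pixel per frame at a given bitrate."""
    return bitrate_bps / (width * height * fps)

# The three combinations observed in the thread (assuming a 24 fps source):
bd_5mbps = bits_per_pixel(5_000_000, 1920, 1080)     # ~0.10 bpp at 1080p
uhd_5mbps = bits_per_pixel(5_000_000, 720, 405)      # ~0.71 bpp at 720x405
uhd_10mbps = bits_per_pixel(10_000_000, 1920, 1080)  # ~0.20 bpp at 1080p
```

In other words, by dropping the tone-mapped output to 720x405 the server is giving each pixel roughly seven times the bits it would get at 1080p, trading resolution for per-pixel quality at the same 5 Mbps budget.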
bozrdnag 75 Posted June 8, 2021 Author Nobody has any input or feedback on this issue?
niallobr 9 Posted December 26, 2021 (edited) I know this is a slightly old topic and I'm sorry to see you didn't get another reply, but it would probably be worth dropping your feedback into this thread as well. I don't think there's any control over the scaling of streams when using hardware tone mapping. You could be playing a 4K HDR video, but when it transcodes with tone mapping remotely, I think the resolution will drop from 3840x2160 to 1920x1080 or lower. I noticed that if a remote client uses the Auto quality setting, the resolution on my server drops as low as 640x360 at 1.5 Mbps, even though we both have plenty of bandwidth and my bitrate limit is 8 Mbps. You can temporarily improve things by installing the Diagnostics plugin and going to Diagnostic Options > Disable Scaling, which seems to give you full 3840x2160 tone mapping, but it resets on reboot, and if you have the same Auto bitrate issue for remote users it will probably stay at 1.5 Mbps unless the user manually selects a higher quality setting, which sucks. Edited December 26, 2021 by niallobr
Luke 42078 Posted December 27, 2021 Quote You could be playing a 4K HDR video but when it transcodes with tone mapping remotely I think the resolution will drop from 3840x2160 to 1920x1080 or lower? It could be, depending on the quality setting in the app, yes.
niallobr 9 Posted December 28, 2021 (edited) Good to know for sure, thanks Luke. A bit off topic, but the main issue I have with remote connections is that the Auto quality setting seems to always default to 720p 1.5 Mbps on clients like the Xbox and Samsung TVs, even when there seems to be plenty of bandwidth available on both sides. I can manually choose the highest quality setting and playback is perfect (it follows my 8 Mbps remote bitrate limit), but users don't seem to notice this option. It would be nice if the Auto setting were more effective. I noticed a few threads about it, so I'll try to follow up there. Edited December 30, 2021 by niallobr
Luke 42078 Posted December 29, 2021 19 hours ago, niallobr said: Good to know for sure, thanks Luke. A bit off topic but the main issue I have with the remote connections is the Auto quality setting seems to always default to 720p 1.5 Mbps on clients like Xbox and LG TV, even when there seems to be plenty of bandwidth available on both sides. I can manually choose the highest quality setting and playback is perfect (follows my 8 Mbps remote bitrate limit) but users don’t seem to notice this option. It would be nice if the Auto setting was more effective. I noticed a few threads about it so I’ll try follow up there. We're constantly working on improving it. Thanks for the feedback.