Transcode bitrate limit



roberto188
Posted

Emby Server limits the Live TV transcode bitrate to the source's actual bitrate. In other words, if the show is 8 Mbps and my Roku requests it at 20, the transcode is capped at 8, which limits quality. Can you remove this limit? Thanks.

Jdiesel
Posted

Why would you want to transcode to a higher bitrate than the original source? Increasing the bitrate beyond the source won't result in better quality, just a larger file size.

Posted

This is by design. The 20 that you set in the Roku app is intended to be a maximum, not a set value.

roberto188
Posted

Because for Live TV, the stream might be a well-compressed 8 Mbps stream, but a fast re-encode at 8 Mbps looks bad; that's why you'd want to re-encode at 20. It really doesn't make sense to limit the transcode bitrate to the native bitrate.

Happy2Play
Posted (edited)

I don't see how transcoding a native 8 Mbps file to a higher bitrate will make it look better.

Edited by Happy2Play
roberto188
Posted

Because the original stream is encoded by specific hardware designed for this purpose and for broadcast; the encoding done by Emby isn't. A well-compressed and processed stream at 8 Mbps will look far better than an x264 veryfast 8 Mbps stream. Pretty basic.

roberto188
Posted (edited)

It won't look BETTER THAN the original, but it will look close to the original. An 8 Mbps stream re-encoded poorly at 8 Mbps will be significantly degraded.

Edited by roberto188
Happy2Play
Posted

I guess I don't see this on my Roku when something is transcoded vs. Direct Play at 6 Mbps, but I don't do Live TV.

Jdiesel
Posted (edited)

I think it would be best if you posted a transcode log as an example. What are the details of the source video? Ideally it would have the video stream copied so there are no changes to the source video.

There will always be some quality loss when encoding an already lossy compressed video. Every time the video is converted, information is lost.
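That generational-loss point can be shown with a toy sketch (my example, not anything from the thread): treating each "encode" as crude quantization of a signal, re-encoding an already lossy copy ends up further from the original than encoding the original directly at the same settings. Real codecs are far more sophisticated, but the principle is the same.

```python
# Toy generational-loss demo: each "encode" quantizes samples to a
# coarse grid, discarding information that can never be recovered.
def encode(samples, step=8):
    return [round(s / step) * step for s in samples]

def total_error(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

original = [3, 10, 17, 22, 35, 41, 56, 63]

gen1 = encode(original, step=8)   # first lossy encode
gen2 = encode(gen1, step=5)       # re-encode of the already lossy copy
direct = encode(original, step=5) # encoding the original directly instead

print(total_error(original, gen1))    # error after one generation
print(total_error(original, gen2))    # error after two generations
print(total_error(original, direct))  # smaller: only one generation of loss
```

The second-generation copy carries the errors of both encodes, which is why transcoding a broadcast stream always costs some quality no matter the target bitrate.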

Edited by Jdiesel
Waldonnis
Posted

Probably the only way to settle whether or not this would be beneficial is to run some PSNR/SSIM tests to see if the OP's assertion is true.  I'm guessing that his assumption is that you're reducing additional re-encoding quality loss by increasing the target bitrate, since it would have more "bitrate room" to preserve more of the original detail (basically, preserving the warts rather than compounding them).  Honestly, I've always encoded with a specific set of target requirements in mind and with a higher-bitrate source, so I never thought about this scenario before.  I can't imagine that increasing the target bitrate will help reduce further quality loss appreciably, but I could be wrong and there could be more significant differences at given bitrate thresholds.
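For what a PSNR test actually measures: serious comparisons would use ffmpeg's psnr/ssim filters over whole clips, but the metric itself is simple. A minimal sketch, with made-up pixel values standing in for a source frame and a hypothetical decoded frame:

```python
import math

def psnr(reference, test, max_val=255):
    """Peak signal-to-noise ratio between two equal-length pixel arrays."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(max_val ** 2 / mse)

source   = [52, 55, 61, 66, 70, 61, 64, 73]  # invented sample values
reencode = [53, 55, 60, 67, 69, 62, 64, 72]  # hypothetical decoded frame

print(round(psnr(source, reencode), 1))
```

Running the proposed comparison would mean computing this (and SSIM) per frame for, say, an 8 Mbps re-encode vs. a 20 Mbps re-encode of the same source, and seeing how much the extra bitrate actually buys.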

 

That being said, I don't see much of a technical reason to limit the target bitrate to that of the source as long as it doesn't exceed the client's max bitrate setting.  At worst, it's only wasted bandwidth and maybe a few CPU/GPU cycles.

roberto188
Posted (edited)

x264 at veryfast or superfast yields bad video quality unless you jack up the bitrate. So if you are trying to re-encode MPEG-2 to MPEG-4 for playback on a Roku with a veryfast or superfast preset, capping the bitrate at 8 Mbps will make your video look terrible. I'll post an example.

Edited by roberto188
roberto188
Posted

Also, Live TV is transcoded at 60fps, so it requires more than twice the bitrate to maintain the same quality as a 24fps movie.

Waldonnis
Posted

Also, Live TV is transcoded at 60fps, so it requires more than twice the bitrate to maintain the same quality as a 24fps movie.

 

Except live television is interlaced while a 24fps movie wouldn't be, so you can't really compare the bitrates of the two.  If anything the interlaced signal should only require slightly more bandwidth/bits than a 24fps progressive signal, and virtually the same bitrate as a 30fps progressive signal.  In fact, a 720p60 progressive video requires about the same bandwidth as a 1080i/30 video, which is why some non-US sports broadcasting is done at lower resolutions (so they can broadcast 50 or 60fps progressive for better motion representation).
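The bandwidth comparison above follows from raw pixel throughput. Assuming simple width × height × frames-per-second arithmetic (my illustration, ignoring codec efficiency differences):

```python
# Raw pixel throughput (pixels per second) for the formats discussed.
def pixel_rate(width, height, fps):
    return width * height * fps

# 720p at 60 progressive frames per second
p720_60 = pixel_rate(1280, 720, 60)

# 1080i/30: 60 interlaced fields per second = 30 full frames' worth
i1080_30 = pixel_rate(1920, 1080, 30)

print(p720_60, i1080_30)        # 55,296,000 vs 62,208,000 pixels/s
print(p720_60 / i1080_30)       # roughly 0.89, i.e. about the same
```

So 720p60 pushes slightly fewer raw pixels per second than 1080i/30, which is why the two land in roughly the same bandwidth class.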

 

I guess you could argue that the veryfast preset is the biggest factor here and that bitrate increases could offset that and I believe your perceptual results.  What I'm saying, though, is that I'd be curious to see via metrics what the gains really are, and if there are source/target bitrate thresholds where any gains become significant (or really, become insignificant).  If you have to double the bitrate to get a 10% SSIM/PSNR gain, that would be good to know.  Likewise, if going from a veryfast preset to a faster preset means only a 3% metric difference, then that's also good to know.  I don't object at all to allowing higher-than-source bitrate targets, but I'm wondering where you'd start encountering diminishing returns so that there can be an "upper cap" rather than just allowing someone to specify ridiculously large bitrates (e.g. 500Mbit for a 2Mbit source).

 

I could probably script such a test as it's not much different from a proof of concept quality testing scriptset I wrote a few months back, but don't have spare CPU cycles (or "me cycles") to dedicate to it at the moment.

Jdiesel
Posted (edited)

A few months ago I did some testing myself using a Blu-ray remux of the Disney short film Piper as my source material. I pulled snapshots of the same frames at 5 different timestamps from encodes at bitrates of 0.5, 1, 2, 4, 8, 16, and 32 Mbps, as well as from the original unaltered source. I then went even further and encoded with h264 presets from veryslow to ultrafast for each of those bitrates. In the end I had 64 sets of 5 screen caps to compare to the original source.
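The test matrix described above could be scripted roughly like this (a sketch: the source filename and output naming are my assumptions, and the commands are only built, not run):

```python
# Build the bitrate x preset encode matrix as ffmpeg argument lists.
# "piper_remux.mkv" is an assumed source filename for illustration.
bitrates = ["0.5M", "1M", "2M", "4M", "8M", "16M", "32M"]
presets = ["veryslow", "slower", "slow", "medium", "fast",
           "faster", "veryfast", "superfast", "ultrafast"]

jobs = []
for b in bitrates:
    for p in presets:
        jobs.append([
            "ffmpeg", "-i", "piper_remux.mkv",
            "-c:v", "libx264", "-b:v", b, "-preset", p,
            f"piper_{b}_{p}.mkv",
        ])

print(len(jobs))  # 7 bitrates x 9 presets = 63 encodes
```

Each output would then be snapshotted at the same five timestamps for the side-by-side comparison.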

 

My results are subjective of course, but I found there was minimal improvement above 8 Mbps or with a preset slower than fast. The biggest surprise to me was how little changing the preset affected the final output. Maybe this was due to the source material I used as an example (an animated movie with consistent visuals). Up until that point I had been using a preset value of slow, but I ended up backing it off to fast because I saw little improvement in visual quality and a much greater hit on my server resources.

 

I just wanted to share my testing, not really trying to make a point other than what seems good for one person might not be adequate for the next.

 

Edit: I would really be interested in the results of a controlled double-blind test for video encoding. I have seen studies done with MP3 compression, and the results tended to show that people stop being able to tell the difference at a much lower bitrate than they think they can.

Edited by Jdiesel
roberto188
Posted

Thank you everyone for the input. Yes, all video encoding varies widely depending on source, resolution, framerate, etc., and the quality varies with it, which is precisely why I have requested the removal of the transcode bitrate cap. Please just look at the above post with the two different files. This is the quality and format in which Emby is transcoding Live TV. It's a slow-action commercial, and it is VERY easily seen that the 8 Mbps file has significant macroblocking while the 20 Mbps one does not. I have been watching sports on Roku for months now. The channels that stream at 20 Mbps are nearly indistinguishable from the source, while the stations limited to 8 Mbps suffer from significant macroblocking whenever there is significant motion or a crowd shot. Higher bitrate = higher quality. Yes, the returns are diminishing, but it should be up to the user to determine that for themselves, not to be capped by programmers who think they know better than the person actually watching the content. Simple request: please remove the limitation, or provide an option to remove it. Thanks!

roberto188
Posted

People's bandwidth and CPU capacities vary widely as well. My Quadro K2000 can do 6 720p 60fps streams at the same time, but it requires significant bitrate to get any decent quality. I have a gigabit local network and gigabit internet service as well. I have more bandwidth than I know what to do with, but limited processing power. I want to be able to use the fastest encoding setting with a high bitrate so I can serve up as many streams as possible at the highest quality, without caring about the bandwidth.

Jdiesel
Posted

@roberto188

 

Can you provide an Emby transcoding log from when Live TV is being transcoded from 8 Mbps to 8 Mbps?

Jdiesel
Posted

So we can see the ffmpeg string being used.
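For anyone reading along, the rate-control portion of such a log is what matters. A hedged illustration of the kind of flags to look for (these are standard ffmpeg/libx264 options, not the exact command Emby emits):

```python
# Illustrative only: typical libx264 rate-control flags that would
# appear in a transcode log. Filenames and values are invented.
args = [
    "ffmpeg", "-i", "channel.ts",
    "-c:v", "libx264", "-preset", "veryfast",
    "-b:v", "8000k",       # target bitrate (the cap in question)
    "-maxrate", "8000k",   # hard ceiling enforced by rate control
    "-bufsize", "16000k",  # VBV buffer size
    "out.ts",
]
print(" ".join(args))
```

If the `-b:v`/`-maxrate` values in the real log match the source bitrate rather than the client's requested 20 Mbps, that would confirm the capping behavior being discussed.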

Waldonnis
Posted

A few months ago I did some testing myself using a Blu-ray remux of the Disney short film Piper as my source material. I pulled snapshots of the same frames at 5 different timestamps from encodes at bitrates of 0.5, 1, 2, 4, 8, 16, and 32 Mbps, as well as from the original unaltered source. I then went even further and encoded with h264 presets from veryslow to ultrafast for each of those bitrates. In the end I had 64 sets of 5 screen caps to compare to the original source.

 

My results are subjective of course, but I found there was minimal improvement above 8 Mbps or with a preset slower than fast. The biggest surprise to me was how little changing the preset affected the final output. Maybe this was due to the source material I used as an example (an animated movie with consistent visuals). Up until that point I had been using a preset value of slow, but I ended up backing it off to fast because I saw little improvement in visual quality and a much greater hit on my server resources.

 

I just wanted to share my testing, not really trying to make a point other than what seems good for one person might not be adequate for the next.

 

Edit: I would really be interested in the results of a controlled double-blind test for video encoding. I have seen studies done with MP3 compression, and the results tended to show that people stop being able to tell the difference at a much lower bitrate than they think they can.

 

Yeah, presets for some encoders aren't as impactful as others.  It really depends on the source, as always, but going from veryfast to faster with x264 isn't usually a large jump in quality at all despite the increased computation being done.  Even with x265, using a medium preset isn't much different than slow or slower despite medium taking significantly less time.  Of course, where you see the difference is in efficiency and in more difficult encoding scenarios.  Take any scene that's very low lit and you'll start seeing the faster presets swallowing details entirely (movies like Godzilla from 2014 are filled with such scenes; you can lose entire monster limbs in the output if you're not careful with encoder settings).  Basically, there isn't so much "I think I need more analysis" when it comes to preset and encoder option selection...if your output sucks compared to the original or parts of a scene get lost because of dithering/blocking, you know it needs more analysis or possibly more bitrate (or both)  :P

 

Personally, I think a metric test would be easier than a double-blind test simply because you can usually defeat any visual test by just upping the brightness for some reason (people seem to like torch mode *shrug*; HDR television "tests" that don't use instrumentation during evaluation are hilarious to read because of this).  The same thing happened when a few people tried a double-blind DTS vs. AC3 test in the audio world some years back: DTS would always win because it was inherently louder (again, shrug, welcome to humanity).  Also, still frame comparisons aren't that great for evaluating video quality on their own...they're useful for looking at specific problems, though.  There's really no substitute for watching the video and/or running metrics on segments (preferably both).

roberto188
Posted

It's completely irrelevant to the request, but fine, I'll send it.

roberto188
Posted (edited)

See the attached log. 

log.txt

Edited by roberto188
  • 1 month later...
roberto188
Posted

Any movement on this?

Posted

There are no plans to change this right now. We don't increase bitrates when transcoding, except in the cases of hevc to h264, as well as with very low bitrate content.

 

This is the type of thing that is advanced enough that, if you must have it, you'd have to get it with speechles' blue neon app... but actually, that isn't even possible, as it would require API changes. What you're asking for here is a paradigm shift from the way we currently do things.
