I have an issue with Emby. When I play a movie remotely it has to be transcoded most of the time, because I've set a remote bandwidth limit of 4.5 Mbps. The problem is that the client doesn't buffer correctly/enough when watching a transcoded video. For example, if I start a movie and pause it, buffering stops after about five seconds. I can see this in iftop when monitoring the network traffic, and in the client as well, since the Kodi and Emby clients show a grey bar indicating the buffering progress.

With DirectPlay it's all fine: it buffers minutes or even the whole video, depending on the client, and it buffers at a higher speed than the bitrate. Kodi, for example, completely ignores the buffersize and speed (readfactor) settings when transcoding, as opposed to DirectPlay, where it does what it's supposed to do.

The whole problem with this is that when a short interruption or bandwidth shortage occurs, playback stops, when instead the client could have used the extra bandwidth to fill its buffer. This happens with all clients I've tested so far, and my upload should be plenty at 40 Mbps. Is this behaviour intended? In my opinion transcoding is mostly used on remote connections, where a buffer is always needed, as opposed to direct play, which is mostly used at home. The transcoding itself is more than fast enough, by the way; it usually transcodes at 3x speed or faster.
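For reference, the Kodi cache settings I'm talking about live in `advancedsettings.xml` in the Kodi userdata folder. A sketch of roughly what I set (the values here are examples, not a recommendation):

```xml
<advancedsettings>
  <cache>
    <!-- 1 = buffer all internet and local filesystems -->
    <buffermode>1</buffermode>
    <!-- 0 = cache to disk instead of RAM, i.e. effectively unlimited -->
    <memorysize>0</memorysize>
    <!-- fill the cache at up to 20x the stream's average bitrate -->
    <readfactor>20</readfactor>
  </cache>
</advancedsettings>
```

These settings are honoured for DirectPlay, but as described above they appear to have no effect on a transcoded stream.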
Version 184.108.40.206 Docker on Ubuntu 18.04
Kodi with the Emby and EmbyCon add-ons (increased the buffer / set it to unlimited)
Emby Theatre UWP
Emby Theatre Desktop
Web (Edge, Firefox & Chrome)
Just let me know if you need any client or server logs.
Edited by Gerrit507, 07 September 2018 - 07:49 PM.