Playback streaming bitrate doesn't seem to be completely honored


Recommended Posts

skurvy_pirate
Posted

I have set my playback streaming bitrate to 7 Mbps, as I have a 10 Mbps upload connection, but it doesn't seem to be completely honored. The average might be around 7 Mbps when streaming externally, but what actually happens is that it spikes to at or above my maximum upload bandwidth for a few seconds, pushes no data, then spikes again. Most spikes are close to 7 Mbps, but at least one in five hits 8-10 Mbps. Here is a network graph from my router during an external stream:

[Attached image: Spike.png]

I could try lowering it even further (I had it set to 8) and see if that helps, but I would like to stream externally at around 7 Mbps, and lowering it will probably force the clients to a lower bitrate; with multiple clients it would be lower still. The main reason this is an issue is that it really impacts some multiplayer games: sometimes ping is bad, and other times ping seems fine but the game rubber-bands and freaks out.

JeremyFr79
Posted

This is what happens when throttling is enabled, and it is in fact how most streaming works, even from sources like Netflix. You can disable throttling and should see a steadier result.

skurvy_pirate
Posted

So if I completely disabled throttling, the same bitrate stream (let's say 7 Mbps) would actually peak at a lower rate? Or I guess there wouldn't be "peaks"; it would just be a solid stream at a lower rate?

JeremyFr79
Posted

In theory, yes. You would probably still see valleys and hills to some extent unless CBR is used in the ffmpeg settings; if VBR is used instead, you would see peaks at your set rate, but the bitrate could dip below it during certain scenes.
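To illustrate the difference (a hypothetical sketch with made-up filenames and values, not Emby's actual ffmpeg invocation):

    # CRF (quality-targeted VBR): bitrate swings with scene complexity,
    # so the network profile shows hills and valleys around any "target".
    ffmpeg -i input.mkv -c:v libx264 -crf 23 -c:a aac vbr.mp4

    # Rate-capped encode: -maxrate and -bufsize bound how far peaks can
    # rise above the 7 Mbps target, flattening the delivery profile.
    ffmpeg -i input.mkv -c:v libx264 -b:v 7M -maxrate 7M -bufsize 3.5M -c:a aac capped.mp4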

Guest asrequested
Posted

Also, with throttling turned off, the server will transcode the whole file as fast as it can, in one burst, rather than in real time.

pir8radio
Posted (edited)

I think you will still see a peak at the start of the stream; Emby tries to push a big chunk to get the stream started, then idles down. Here is someone direct streaming a movie on my system with no ffmpeg running (starting at 5:46 PM). You can also see where someone started a stream at 4:23 PM and then stopped watching; there is still a peak. I don't know of a way to stop that in Emby.

[Attached image: chart.png]

Edited by pir8radio
skurvy_pirate
Posted

With throttling both on and off, the encoding gets done quickly up front and finishes within a couple of minutes; after that it is just sending data to the remote client. I tried turning off throttling and got basically the same results. It actually looked like the spikes were higher somehow. The client was only streaming at 5 Mbps and my upload connection is 10 Mbps, but it was ruining my internet whenever it spiked at or above 10 Mbps. I need to figure out something else to resolve this. Last I checked, 10 up was as high as I could get from Comcast without paying about 3x more for business class.

pir8radio
Posted

Google NetBalancer; they have a freeware program that can limit the bandwidth a program is allowed to use. Maybe limit ffmpeg or Emby (not sure which) to 7 Mbps?

skurvy_pirate
Posted (edited)

So I noticed some interesting behavior when digging into this further. When streaming from an Android client, the bandwidth used is much more consistent and lower overall, while streaming to a Chrome browser had higher spikes and seemed to use more bandwidth. The client I have been posting about is actually an Apple TV box; I don't have access to it, but it looks very similar to the way Chrome behaves. Here is a graph of starting a stream of the same video at 1080p 5 Mbps in Chrome, followed by an Android client:

[Attached image: Streaming_Graph.jpg]

You can see it is much smoother and more consistent on the Android client, which actually uses around 5 Mbps, whereas Chrome (and the Apple TV) use close to 7 Mbps and spike even higher.

Another thing to mention: the server is actually running on Linux. I originally posted under Windows/General because I thought this was a general setting (Playback Streaming Bitrate) not being honored. If this topic should be moved, or closed and recreated in the Linux sub-forum, just let me know; I'm not sure exactly where it belongs.
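As a possible Linux-side workaround, outbound bandwidth can be shaped at the OS level with tc. This is a sketch only, assuming the uplink interface is eth0, and note it caps all egress on that interface, not just Emby:

    # Cap all outbound traffic on eth0 to 7 Mbps with a token bucket filter.
    sudo tc qdisc add dev eth0 root tbf rate 7mbit burst 32kbit latency 400ms

    # Remove the cap again when done testing.
    sudo tc qdisc del dev eth0 root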

Edited by skurvy_pirate
pir8radio
Posted

You are seeing the different file chunks; the movies are not streamed as one big file. Android probably handles that differently. I have always seen the same thing and think it is normal.
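If the clients are pulling an HLS-style segmented stream, which would match that chunked pattern (an assumption on my part, not confirmed for every client here), the serving side looks roughly like this:

    # Split the output into ~6-second .ts segments plus a playlist; the
    # client then downloads each segment as its own burst, which shows
    # up as separate spikes on a bandwidth graph.
    ffmpeg -i input.mkv -c copy -f hls -hls_time 6 -hls_list_size 0 stream.m3u8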

skurvy_pirate
Posted

I guess there is nothing I can do to "fix" it other than using only Android clients. The problem isn't the spike at the beginning, or that it spikes at all, but that it spikes over my bandwidth limit and causes lag issues when streaming to the Apple TV or a browser. It seems odd that they behave so differently with the file chunks.

skurvy_pirate
Posted

Another question: would the format make a difference? Is it possible Android handles the video format differently than the Apple TV? I figure the amount of data would be similar, but maybe there is less material (audio tracks, etc.) getting transcoded and streamed to the Android client.

skurvy_pirate
Posted

I spent some time logging network traffic a few months ago. It seems target bitrates are just that, targets, due to how the transcodes are encoded. To obtain more precise targets, throttling would need to be disabled and CBR used instead of CRF.

https://emby.media/community/index.php?/topic/42876-transcoding-questions/

That is an interesting thread and a useful set of tests. I have throttling enabled in the encoder, but honestly nothing else runs much on that box, so it usually just maxes the cores, finishes in 2-3 minutes, and then streams from cache. I can fiddle with that to see if it makes any difference. I am also curious whether you have found an existing setting/configuration that worked for you? It sounds like truly getting what we want would require some new features/options.

Jdiesel
Posted

Not really. I ended up disabling throttling, as my server is only used to host my media and I am fine with it running full out when transcoding. I found that I needed to set the client's quality setting a fair amount below the internet connection bandwidth to get acceptable results. For example, two of my clients have a stable 5 Mbps download connection, but I need to set the quality at 3.5 Mbps to avoid buffering. Some videos play fine at a quality setting of 5 Mbps where others choke at 4 Mbps, which I believe has to do with those extended periods of high peak bitrate.

I would be interested in seeing the difference between CBR and CRF when transcoding, and how it affects streaming performance.
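One rough way to eyeball that difference (a generic ffprobe sketch, not anything Emby provides; transcode.mp4 is a placeholder name) is to dump per-packet sizes and bucket them by second:

    # Print timestamp and size for every video packet; summing size per
    # one-second bucket of pts_time yields a bitrate-over-time curve,
    # which makes CBR vs. CRF peaks easy to compare.
    ffprobe -v error -select_streams v:0 -show_entries packet=pts_time,size -of csv=p=0 transcode.mp4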

skurvy_pirate
Posted

I usually have only one remote client streaming, but they are streaming HD content, so they want the best quality they can get. I thought 5 Mbps would be reasonable with my 10 Mbps upload connection, but it causes issues, which is why I am a little frustrated. I will experiment with lowering them to 4 Mbps, but I really don't want their quality to suffer noticeably. Also, since the Android client handles this much better, I might talk with them about switching boxes, though it might be hard to get them off the Apple TV.

Jdiesel
Posted

It is possible that the Apple TV has a smaller video cache than the Android client. How far away is the client? It might be worth doing a traceroute to see if it is a latency issue.

skurvy_pirate
Posted

It is a state away (600-700 miles). I haven't noticed slow response times in the logs for that client, but I will do some more poking around with traceroute when I get a chance. I will also be visiting there soon, so I can look at the Apple TV options in person and see if there is anything to tweak on that side.

  • 1 month later...
skurvy_pirate
Posted (edited)

Sorry to bring back this thread, but I am still having problems setting the transcode bitrate for live TV. When set to 3 Mbps on a 720p stream, the limit seems to be honored; when upped to 4 Mbps, playback switches to direct streaming and shoots up to about 12 Mbps. Here are two screenshots of streaming live TV at 3 Mbps and 4 Mbps (with the cursor on the 3 Mbps portion in the first and the 4 Mbps portion in the second, showing actual network usage):

[Attached image: 3Mbps.png]

[Attached image: 4Mbps.png]

Edited by skurvy_pirate
skurvy_pirate
Posted

I have attached the server logs. There is only a transcode log for the 3 Mbps stream; when I change to 4 Mbps it does not generate one (obviously, because it isn't encoding). It looks like Emby thinks the bitrate of the TV stream is only about 3.5 Mbps, so when set to 3 Mbps, isEligibleForDirectPlay and isEligibleForDirectStream are false, but when set to 4 Mbps they are true. Looking in VLC, though, the bitrate is closer to 13 Mbps. I have attached a screenshot from VLC's media info as well.

[Attached image: vlc_bitrate.png]

[Attachment: Server_Log.txt]
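For reference, one generic way to check what a probe reports for a live source (hypothetical tuner URL; not necessarily the exact command Emby runs):

    # Ask ffprobe for the overall bitrate it detects on the stream. Live
    # sources often report bit_rate=N/A, in which case a server can only
    # estimate.
    ffprobe -v error -show_entries format=bit_rate -of default=noprint_wrappers=1 "http://tuner.local/channel"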

Posted

Occasionally the probe of the live stream doesn't produce a bitrate, so we end up estimating it. We'll increase the estimated value. 

skurvy_pirate
Posted

I can try other channels, but that one consistently behaves this way. Does it depend on the channel, or on the program airing on the channel? Is there a reason the probe can't detect the bitrate when VLC can?

skurvy_pirate
Posted

It looks like the latest Emby Server build (3.2.13.0) detects the live TV stream bitrate properly. It now detects it at around 8-9 Mbps, so the stream gets transcoded when a lower bitrate is requested.

Posted

Thanks for the feedback.
