
Original media bitrate is wrong - can create higher bandwidth stream than necessary



spuleckip

Hi,

 

I’ve noticed this for the past month or so when watching live TV from my HDHomeRun.

 

Emby is recognizing the original media bitrate as 20.4 Mbps, which is never the case. 20.4 Mbps is probably the full stream including sub-channels, but never one individual channel. Today it showed that on my iPad, and when I dialed into the HDHomeRun directly, it showed the stream was actually 10.5 Mbps.
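
For reference, here’s a rough way to double-check what the tuner is actually sending, assuming an HDHomeRun that serves channels over HTTP on port 5004 and ffprobe installed; the IP address and channel number below are placeholders, not my real setup:

```python
# Rough sanity check of the per-channel bitrate coming off the tuner.
# Assumptions: HDHomeRun HTTP streaming on port 5004, ffprobe on PATH,
# and a placeholder IP / virtual-channel number -- adjust for your setup.
import json
import subprocess

url = "http://192.168.1.50:5004/auto/v5.1"  # placeholder tuner address/channel

probe = subprocess.run(
    ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_format", url],
    capture_output=True, text=True, check=True,
)
fmt = json.loads(probe.stdout)["format"]

# ffprobe's bitrate estimate can be missing on some live streams, so guard.
if "bit_rate" in fmt:
    print(f"{int(fmt['bit_rate']) / 1e6:.1f} Mbps")
else:
    print("ffprobe did not report a bitrate for this stream")
```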

 

I’ve noticed, especially when traveling, that a 20 Mbps stream is being used (or at least reported that way in stats for nerds) to transmit to devices both local and remote.

 

I’ve never captured logs before but can read up on how to do this if necessary.


Hi, the bitrate might have fluctuated since the program started; that would explain the discrepancy between what the stats say and what the HDHomeRun reports.

 

> I’ve noticed, especially when traveling, that a 20 Mbps stream is being used (or at least reported that way in stats for nerds) to transmit to devices both local and remote.

 

This isn't true. As you can see in the stats screenshot above, it is transcoding at 1.5 Mbps, so it doesn't matter whether the source is 20 Mbps or 10; it's still getting converted down to 1.5.
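
To put it as a sketch (a simplified model, not the actual server code): once a transcode is in play, the client receives the transcode target no matter what the source measures:

```python
# Simplified model (not Emby's actual logic): when the server transcodes,
# the delivered bitrate is capped at the transcode target regardless of
# how big the source stream is.
def delivered_mbps(source_mbps: float, transcode_target_mbps: float) -> float:
    """Bitrate the client actually receives."""
    return min(source_mbps, transcode_target_mbps)

print(delivered_mbps(20.4, 1.5))  # 1.5 -- 20 Mbps source, still 1.5 out
print(delivered_mbps(10.5, 1.5))  # 1.5 -- 10 Mbps source, same result
```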


spuleckip

The screenshots weren’t meant to show the 20 Mbps stream, just the discrepancy in the bitrate. When I’m not on a cellular connection, it happens almost every time. I’ll attach two screenshots to this post showing this.

 

All the stations in this area broadcast multiple sub-channels, so none of them have 20 Mbps to dedicate to the primary channel. In my earlier example, that channel packs two HD feeds (CBS and CW) as well as two SD feeds into a single channel space, so a single program wouldn’t be 20 Mbps. In the example I’m providing here, the station has an HD main program (NBC) and three standard-definition subs.
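
Back-of-the-envelope math on why no single program can hit 20 Mbps (the per-feed allocations below are my assumptions, not measured values):

```python
# Rough ATSC 1.0 budget: a full over-the-air mux tops out around
# 19.39 Mbps, shared by every sub-channel plus PSIP/overhead, so no
# single program inside it can be 20.4 Mbps. Per-feed numbers are guesses.
MUX_TOTAL_MBPS = 19.39   # ATSC 1.0 payload ceiling
sd_each_mbps = 2.5       # typical SD sub-channel allocation (assumption)
overhead_mbps = 0.5      # PSIP/tables, rough guess

# Earlier example: 2 HD feeds (CBS, CW) + 2 SD feeds in one channel space.
hd_budget = MUX_TOTAL_MBPS - 2 * sd_each_mbps - overhead_mbps
print(f"~{hd_budget / 2:.1f} Mbps per HD feed")  # ~6.9 Mbps each
```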


spuleckip

I can get around the issue by setting a max bitrate in the client app, but that defeats the purpose of the auto setting.
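
As a sketch of what I think is going wrong (my assumption about how the auto decision works, not Emby’s actual code), an inflated source-bitrate reading forces a transcode that the real stream wouldn’t need:

```python
# Hypothetical "auto" quality decision: direct-play only when the
# detected source bitrate fits the connection estimate.
def should_transcode(detected_source_mbps: float, link_mbps: float) -> bool:
    return detected_source_mbps > link_mbps

link_mbps = 15.0  # example connection estimate (assumption)
print(should_transcode(20.4, link_mbps))  # True  -> needless transcode
print(should_transcode(10.5, link_mbps))  # False -> would direct-play
```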

 

Attaching another screenshot where it can’t identify the original media bitrate of a standard-definition sub-channel and is reporting an 80 Mbps stream for 640x480 video.

post-385539-0-09236900-1555525301_thumb.jpeg


OK, this isn't actually causing any problem. Those values are the values requested by the app; they are max values. You're not actually getting 640x480 @ 30 Mbps. We will need to improve the stats display there. Do the server dashboard stats show the same value?
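
In other words (a simplified illustration, not the actual stats code), the display is echoing the cap the client asked for, which says nothing about what is being received:

```python
# Simplified illustration (not the actual stats code): the app reports
# the ceiling it *requested*, which is unrelated to what it *receives*.
requested_max_mbps = 80.0  # "up to this much, please"
measured_mbps = 1.8        # what the 640x480 stream actually uses (example)

# A clearer stats line would show both numbers instead of just the cap.
print(f"requested <= {requested_max_mbps} Mbps, measured {measured_mbps} Mbps")
```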

