Posted

I have a mate who has moved to Seoul, South Korea, and I have given him access to our server to watch Australian TV (we are in Perth). The problem we are having is that only the very lowest streaming quality will play without continual buffering, and even that sometimes buffers.

 

From doing speed tests to Seoul servers from my end, and him doing likewise to Perth servers, bandwidth is definitely not the issue. However, the latency is consistently >300ms. I presume this is where the issue lies? Is there any way to overcome streaming issues due to latency, or is this a lost cause?

 

Cheers for any suggestions!

JeremyFr79
Posted


Lost cause.  Even at the speed of light (fiber optics) there is still going to be a huge amount of latency over that distance, and there is no way of overcoming that.  That's why you'll always find that large content providers have data centers all over the world to maintain low latency to "local" clients of their sites.
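A quick back-of-the-envelope check makes the point (the Perth-Seoul distance and fiber propagation speed below are rough assumptions, not measured figures):

```python
# Back-of-envelope minimum RTT between Perth and Seoul.
# Assumptions: ~7,900 km great-circle distance, and light in fiber
# travelling at roughly 2/3 c (~200,000 km/s). Real cable paths are
# longer than the great circle, so actual RTT is always worse.
DISTANCE_KM = 7_900          # approximate Perth-Seoul great circle
FIBER_SPEED_KM_S = 200_000   # ~2/3 the speed of light in a vacuum

one_way_ms = DISTANCE_KM / FIBER_SPEED_KM_S * 1000
rtt_ms = 2 * one_way_ms
print(f"theoretical minimum RTT: {rtt_ms:.0f} ms")
```

So even a perfect, dead-straight fiber run would cost about 80ms of round-trip latency before a single router, cable detour, or congested link adds anything on top.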

  • Like 1
JeremyFr79
Posted

Take a look here and you'll see that there are a TON of hops between Perth and Seoul as well, which will always add to latency.

 

http://www.submarinecablemap.com/

Guest asrequested
Posted

Could using a VPN help? Trying different server locations?

Posted

Lost cause.  Even at the speed of light (fiber optics) there is still going to be a huge amount of latency over that distance, and there is no way of overcoming that.  That's why you'll always find that large content providers have data centers all over the world to maintain low latency to "local" clients of their sites.

 

Ahh buggar, thought that might be the answer  :(

 

I just did a traceroute to his internet provider and there are 22 hops from here. What is interesting is that it goes out of Australia via Sydney, so I followed up with a traceroute to Singtel in Singapore, and even this goes out via Sydney and has a ping of ~300ms. Only last year this was 40-60ms and went direct from Perth to Singapore. There are direct Perth-to-Singapore cables.

 

I might enquire on Whirlpool to see if something has recently changed.

  • Like 1
Posted

Think I found the answer on Whirlpool.

 

 

There are multiple issues causing the high latency. The shortest path from Perth to Europe is Perth -> Singapore -> Europe. The Perth -> Singapore cable is out of action right now, and due to Cyclone Vardah in India, multiple Singapore -> Europe cables near Chennai have been extensively damaged, resulting in capacity constraints on this route. As a result, a lot of providers are routing traffic via the United States, which has increased the latency.

Unfortunately it will take several weeks to resolve the issues.

 

Guess we will just have to wait it out and when the cable issue is resolved, will try the streaming again.

  • Like 1
JeremyFr79
Posted

Could using a VPN help? Trying different server locations?

No, you can't make data travel faster than the speed of light. In his scenario, as he stated, there are 22 hops; that alone adds a lot of latency, and VPN or not, it would still route identically.  There is no straight-path connection from Perth to Seoul (the best he could ever hope for with a direct patched fiber connection would be around 90ms RTT).  You just can't overcome physics and the latency added by routing/encryption/congestion, etc.

PrincessClevage
Posted

Confirmation of the outage: http://status.vocus.com.au/view-incident.aspx?IncidentID=217

 

Hopefully this will reduce the hops and latency enough to get it working! I will report back in the coming weeks.

 

Cheers for the help!

I don't see what latency has to do with streaming? Maybe the interface might be sluggish from latency, but once a stream has started it's all about the provider's upload bandwidth and the consumer's download bandwidth. If you have enough of both for the desired video quality, then streaming a movie should work fine (although slow to start while it builds the buffer). More than likely this is a case of not enough upload bandwidth at the provider end (upload speeds with public ISPs in Australia are usually poor).
rbjtech
Posted

Is there a way to use some of Emby's cloud-based storage offerings to store the data 'closer' to the client and thus avoid the distance/latency issues?

 

I guess we have no way of knowing where for example the 'google drive' storage actually is, but I'm assuming their storage infrastructure and backbone will likely have a significantly better international internet connection than yours ;) ?

 

 

Posted

This is for live TV. The stream starts, but quickly eats into the initial 6-or-so-second buffer. I think that's where the latency becomes the issue, as ffmpeg can't transcode further ahead like it can with movies/TV shows?

Posted

This is for live TV. The stream starts, but quickly eats into the initial 6-or-so-second buffer. I think that's where the latency becomes the issue, as ffmpeg can't transcode further ahead like it can with movies/TV shows?

 

Hi there @misssii, in order for us to best help you, please provide the information requested in how to report a media playback issue. Thanks!

JeremyFr79
Posted

I don't see what latency has to do with streaming? Maybe the interface might be sluggish from latency, but once a stream has started it's all about the provider's upload bandwidth and the consumer's download bandwidth. If you have enough of both for the desired video quality, then streaming a movie should work fine (although slow to start while it builds the buffer). More than likely this is a case of not enough upload bandwidth at the provider end (upload speeds with public ISPs in Australia are usually poor).

Latency is quite important for streaming, as high latency usually goes hand in hand with network congestion.  Transfer speed, while important, is not the only indicator of successful streaming. With high latency, the packets used to request additional chunks of the stream take longer to travel the link, and the data, sent back over that same high-latency link, may arrive long after it is needed, causing disruptions in playback.  As I explained before, this is why content providers use distributed systems to keep latency to customers extremely low.
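One concrete way to see why RTT caps throughput even on a fat pipe: a TCP sender can have at most one receive window of data in flight per round trip. A rough sketch (the window sizes here are illustrative assumptions):

```python
# Rough TCP throughput ceiling: at most one receive window of data
# can be in flight per round trip, so throughput <= window / RTT.
def max_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on a single TCP flow, in megabits per second."""
    return window_bytes * 8 / (rtt_ms / 1000) / 1_000_000

# A 64 KiB window (no window scaling) over a 300 ms path:
print(max_throughput_mbps(64 * 1024, 300))   # ~1.75 Mbit/s
# The same window at 40 ms (the old Perth -> Singapore latency):
print(max_throughput_mbps(64 * 1024, 40))    # ~13.1 Mbit/s
```

So a link that speed-tests fine on parallel connections can still choke a single 300ms stream: the sender spends most of its time waiting for acknowledgements, not sending.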

Posted

Hi @millsii, the logs look fine. You're getting good transcoding performance, so I would suggest lowering the in-app bitrate if you customized it.

Posted

Thanks for looking, @Luke! Hopefully when the Perth-Singapore cable is repaired in the next week or so, it will improve the link sufficiently for streaming to work.

  • 1 month later...
Posted

Sorry for the long delay in reporting back on this. The SEA-ME-WE 3 cable repair in mid-January did not resolve the issue. However, today I was playing around with GPU transcoding for the first time (NVENC) and noticed on the router traffic graph that it had a much different transfer profile than normal.

 

With CPU (i5 4690) transcoding it is an almost consistent transfer rate, whereas with GPU (Nvidia GT 730) it has a large transfer spike, then nothing, then a large spike, then nothing, and so on.

 

My mate has now been streaming at 480p without issue for the past hour. Does this transfer profile on the router graph suggest larger, less frequent chunks are being sent, and hence the stream is less susceptible to the high latency?
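That intuition can be sketched with a toy model: if each chunk of the stream costs one round trip to request, then fewer, larger transfers spend less total time waiting on latency. The numbers below are illustrative assumptions, not measurements:

```python
# Toy model: time to deliver a stream segment when every chunk
# request costs one RTT before data starts flowing.
def delivery_time_s(total_mb: float, chunk_mb: float,
                    bandwidth_mbps: float, rtt_s: float) -> float:
    """Seconds to fetch total_mb in chunk_mb pieces over the given link."""
    chunks = total_mb / chunk_mb
    transfer_s = total_mb * 8 / bandwidth_mbps   # pure transfer time
    return chunks * rtt_s + transfer_s

# 60 MB of video on a 20 Mbit/s link with a 0.3 s RTT:
print(delivery_time_s(60, 1, 20, 0.3))    # many small chunks: 42.0 s
print(delivery_time_s(60, 10, 20, 0.3))   # few large chunks: 25.8 s
```

Same bandwidth, same RTT, same data: the bursty, big-chunk profile simply pays the 300ms request penalty far less often.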

Guest asrequested
Posted

Do you have throttling enabled in transcoding?

Posted

It's possible the different encoders have different characteristics in that regard, yes.

Posted

Do you have throttling enabled in transcoding?

Yes, but the issue occurs with Live TV.

 

 

It's possible the different encoders have different characteristics in that regard, yes.

Is there any way to get CPU transcoding to behave in a similar fashion? Is there a manual variable I can change somewhere? The reason I haven't tried NVENC previously was the limit of only 2 streams, whereas the i5 4690 can handle 4 HD TV streams pretty comfortably.

 

The transfer profile on the router graph reminds me of when streaming MLB.TV.

gstuartj
Posted

I've found that streaming from the US to Tokyo works well for me, but I'm using Cloudflare in front of my Emby server. You might try something like that. CDNs like CF tend to have more peering agreements and better routes for intercontinental traffic, which can make a difference when hosting on a residential connection.

Posted

Thanks for the suggestion. Cloudflare seemed to ring a bell, so I did some quick Googling regarding my provider, Telstra, and came across this blog post on the Cloudflare website.

 

https://blog.cloudflare.com/bandwidth-costs-around-the-world/

 

 

 

For instance, if Telstra were to peer with CloudFlare then they would only have to move traffic over about 30 meters of fiber optic cable between our adjoining cages in the same data center. Now Telstra will need to backhaul traffic to Free customers to Los Angeles or Singapore over expensive undersea cables.

 

I will look more into it, though, as even if it only gives a more direct route through Singapore, that would reduce the latency a fair bit. At the moment, to get to Korea the connection goes out via the east coast of Australia. Not sure how this would affect my local streaming though?

gstuartj
Posted (edited)

Not sure how this would affect my local streaming though?

 

That's where it gets slightly trickier. There are a couple of ways you can do local streaming on your LAN without having to go through Cloudflare:

  1. You can connect directly to the local LAN IP for your server, or
  2. If you are using a domain to connect, depending on your router, you can set that domain to point to the local address when you're connected to your LAN.

I use option two on my network so I'm always connecting to the same URL, but if I'm on the LAN it connects directly. When I'm away from home it's routed through Cloudflare. If your router doesn't offer an option to modify DNS settings, you can see if it supports an alternative firmware like Tomato/DD-WRT/OpenWRT, or you can set up a DNS forwarding server on a Raspberry Pi or your Emby server and configure your router to use that for DNS. You'd want to use an A or CNAME record.
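For the DNS-forwarder route, a single dnsmasq line is enough; the domain and LAN IP below are hypothetical placeholders you would swap for your own:

```
# /etc/dnsmasq.conf on a Raspberry Pi or the Emby server:
# answer queries for the Emby hostname with the LAN address
# instead of the public (Cloudflare-proxied) one.
address=/emby.example.com/192.168.1.50
```

Point the router's DHCP-advertised DNS server at that box and LAN clients resolve the name locally, while anyone outside the network still goes through Cloudflare.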

Edited by gstuartj
Posted

Thanks for the additional info. Within the home network it's all good, as we use the local IP address, i.e. 192.168.xxx.xxx. I will do some more looking into this, as it gives me a good excuse to get a .com for the home server instead of just using a DDNS service :P

pir8radio
Posted (edited)

I'm just chiming in because I have some data to show the difference. I couldn't find much of this info/graphs when I was first looking into these things; hope it helps.

I would agree with the others: use Cloudflare. They have a good system that may take different, shorter, or quicker paths than your direct internet connection would normally take.

 

Here is some data...  These pings originate from different countries to my server. The first graph shows ping times from the remote locations to my server in the central US; the second graph is from the remote locations to my site running through Cloudflare.  Asia is the worst over a direct connection, averaging 153-180ms; through Cloudflare, ping times average 60-80ms.  Invest in a .com, but note that if your IP address changes you may still require a DDNS. Actually, if you use Cloudflare you can run an updater that auto-updates your IP on Cloudflare, so Cloudflare acts as your DDNS host as well, but you still need a .something.
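That updater just pushes your current public IP into a Cloudflare DNS record. A minimal sketch against Cloudflare's v4 API is below; the zone ID, record ID, hostname, and token are placeholders you would fill in from your own account:

```python
import json
import urllib.request

API = "https://api.cloudflare.com/client/v4"

def build_update(zone_id: str, record_id: str, name: str, ip: str):
    """Build the URL and JSON body for updating an A record
    via Cloudflare's v4 DNS records endpoint."""
    url = f"{API}/zones/{zone_id}/dns_records/{record_id}"
    payload = {"type": "A", "name": name, "content": ip, "ttl": 300}
    return url, json.dumps(payload)

# Usage sketch (requires a real zone ID, record ID, and API token):
# url, body = build_update("ZONE_ID", "RECORD_ID",
#                          "emby.example.com", current_public_ip)
# req = urllib.request.Request(url, body.encode(),
#         headers={"Authorization": "Bearer <token>",
#                  "Content-Type": "application/json"},
#         method="PUT")
# urllib.request.urlopen(req)
```

Run something like this from cron whenever the WAN IP changes and the record stays current, which is all a DDNS service does anyway.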

 

 

NO CLOUDFLARE:

[attached graph: ping times from remote locations over the direct connection]

 

 

 

CLOUDFLARE:

[attached graph: ping times from remote locations via Cloudflare]

Edited by pir8radio
  • Like 1
