
Latency Speed Test

Updated: Nov 2, 2020

As network engineers, when we say latency we generally mean the time it takes to send a packet of data from one location to another. Users aren't affected by the latency itself so much as by its effect on applications. If a webpage arrived just 100 ms after you requested it, that would be wonderful; in practice, 100 ms of latency causes the underlying protocols to throttle transmission speed, so what the end user sees is seconds of lag, not 100 ms of physical-layer delay. Whether you're playing a game online, sending an email, or browsing the web, how applications respond to latency is critical to successfully accomplishing any task.
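To make that "latency limits speed" point concrete, here is a rough sketch of the classic single-connection TCP bound (the 64 KiB window and 100 ms round-trip time below are illustrative assumptions, not measurements from this post): a connection can move at most one window of data per round trip, so latency alone caps throughput no matter how fast the link is.

```python
def max_tcp_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on single-connection TCP throughput: one window per round trip."""
    bits_per_round_trip = window_bytes * 8
    round_trips_per_second = 1000.0 / rtt_ms
    return bits_per_round_trip * round_trips_per_second / 1e6

# With a common 64 KiB receive window and 100 ms of latency, the connection
# tops out around 5 Mbps even on a gigabit link:
print(max_tcp_throughput_mbps(64 * 1024, 100))  # ~5.24 Mbps
```

Halving the latency doubles that ceiling, which is why the same page can feel fast on a nearby server and sluggish on a distant one.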



Latency is the time it takes for data to travel between a source and its destination: essentially, a delay in communication. For instance, if you're in the U.S. playing a game hosted on a server in China, you will see higher latency than someone playing the same game from within China.


This tool offers two test options:

  • Start Latency Test. This option runs 10 HTTP latency tests and reports their average. Run it several times throughout the day to collect multiple samples and see how network latency varies.

  • Bandwidth Test (Download Test). As mentioned before, the bandwidth you need depends on the types of applications, the number of users, and the activities sharing the connection. In general, 1 Mbps is a reasonable minimum for a small trial implementation. In some cases, if no other activities are running, 0.5 Mbps may be enough, though it will not deliver the best performance. As an example, the screenshot below shows a download bandwidth of less than 0.5 Mbps, which could start causing degraded performance.
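A minimal sketch of what the first option's measurement might look like (the URL and the injectable `probe` hook are illustrative assumptions for testing, not this tool's actual implementation): time each HTTP request with a monotonic clock and average the samples.

```python
import time
import urllib.request
from statistics import mean

def http_latency_ms(url: str, probe=None) -> float:
    """Time one HTTP GET in milliseconds; `probe` lets tests inject a stand-in."""
    start = time.perf_counter()
    if probe is not None:
        probe()  # test hook: simulate the request instead of hitting the network
    else:
        urllib.request.urlopen(url, timeout=5).read(0)
    return (time.perf_counter() - start) * 1000.0

def average_latency_ms(url: str, samples: int = 10, probe=None) -> float:
    """Average several back-to-back latency probes, like the 10-test average above."""
    return mean(http_latency_ms(url, probe) for _ in range(samples))
```

Note that this measures application-level request time (DNS, TCP handshake, server processing), which is why repeated samples spread across the day give a more honest picture than a single reading.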
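For reference, the arithmetic behind a download-bandwidth reading like the one in the screenshot is simple (a sketch, not the tool's code): bytes transferred, times eight bits per byte, divided by elapsed seconds, scaled to megabits.

```python
def download_mbps(bytes_transferred: int, elapsed_seconds: float) -> float:
    """Convert a measured download (bytes over seconds) to megabits per second."""
    return bytes_transferred * 8 / elapsed_seconds / 1e6

# e.g. 500,000 bytes fetched in 8 seconds is 0.5 Mbps, the level below
# which the post warns performance may start to degrade:
print(download_mbps(500_000, 8))  # 0.5
```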


For live video streaming, users generally want latency as low as possible. Whether you're streaming live sporting events, esports, or two-way interviews, nothing kills the viewing experience like high latency. We've all watched a live broadcast on location with long, awkward pauses, or interviewees talking over each other because of latency issues. Perhaps you've watched a hockey game online while your neighbor watches it live over the air, and you hear them celebrate the winning shot 10 seconds before you see it. Worse still, imagine election results appearing in your Twitter feed before they reach your TV screen. In these cases, low latency is critical to ensure an optimal viewing experience with strong viewer interactivity and engagement.


Low latency might not always be of great importance, but in many IoT use cases it is central to the user experience and to the device's ultimate potential. Suppose you want to unlock a door via your smartphone to let someone in remotely, perhaps a technician or a child who forgot their key. You want the lock to open as soon as you tap the screen, not seconds or minutes later, so you can tell the person when to enter. The same applies when the setup includes a doorbell with two-way audio: conversations with high latency are very hard for most people to manage. Perform a latency speed test with testmyinternetspeed.org.
