How do you measure latency?

Latency: a concept strongly linked to a network's connection speed and bandwidth. Learn more below.

Let's start by defining latency: it is a network term that describes the total time it takes for a data packet to travel from a source node to a destination node.

Every physical system with separation (distance) between source and destination will experience some kind of latency. In the field of human interaction with computer systems, the perceptible connection latency has a strong effect on user satisfaction and usability.

In summary:

Latency = delay

Latency is generally measured in milliseconds (ms).

In other words, the lower the number of milliseconds, the lower the latency, the more efficiently the network behaves, and the better the user's experience.


The importance of connection speed and bandwidth

Latency is strongly linked to the connection speed and bandwidth of a network. These two terms are sometimes mistakenly equated with 'speed,' that is, the rate at which we can upload and download files; service providers advertise that their internet connections have speeds of 50 Mbps (megabits per second).

The truth is that a 50 Mbps connection has little to do with speed and more to do with the amount of data it can receive per second, that is, the bandwidth. In this case, 50 Mbps is the maximum throughput we could get.
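A small calculation makes the point concrete (a sketch using the advertised 50 Mbps figure from above; real transfers also pay for latency and protocol overhead):

```python
# Rough transfer-time estimate: bandwidth tells you how much data fits
# through per second, not how quickly any single bit arrives.
MEGABIT = 1_000_000  # bits

def transfer_time_seconds(file_size_megabytes, bandwidth_mbps):
    """Time to move a file at a given bandwidth (ignores latency and overhead)."""
    file_bits = file_size_megabytes * 8 * MEGABIT
    return file_bits / (bandwidth_mbps * MEGABIT)

# A 100 MB file over the advertised 50 Mbps connection:
print(transfer_time_seconds(100, 50))  # 16.0 seconds of pure throughput
```

Note that the answer says nothing about how long the *first* byte takes to arrive; that delay is the latency.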

We know these are confusing terms, but we will try to explain them simply and clearly: latency is a way of measuring connection speed. Ironically, bandwidth is not a measure of speed, although everyone refers to it as if it were.

The best way to explain the difference is with a highway. Bandwidth is how wide the highway is, that is, how many lanes it has. Latency has to do with the vehicles on the highway: how fast a car moves from one end to the other.

An example: how long would it take five cars to get from point A to point B on a 5-lane highway (high bandwidth, so little queuing and low effective latency) compared with the same number of cars making the same trip on only two lanes (low bandwidth, so the cars queue up and each trip takes longer)?


How do you measure the performance of a network?

As a rule, the lower the latency, the higher the speed. An acceptable average latency is 100 milliseconds. Some online video games require less than 50 ms.

There really is no exact equivalence between the contracted bandwidth of a connection and the actual connection speed, since it depends on many factors, including:

  • Status of the network
  • Connection method: whether the computer is wired directly or uses a Wi-Fi network, the number of nearby wireless networks (regardless of the provider), the “channels” they are using, and the technology of the computer's network card.

As an approximation, the download speed will be around 80% of the contracted bandwidth. The upload speed is usually much lower, hovering around 20%. Some providers offer symmetric speeds, that is, the same upload speed as download speed.


Some factors that can affect latency:

Type of connection: satellite internet, Wi-Fi, wired (fiber optic cable or coaxial cable), mobile phone internet, among others. All methods have different characteristics, protocols and technologies.

Distances: the further we are from the access points (satellites, ISP center, Wi-Fi modem, router, etc.), the longer it will take to send the information from there to the computer.

Congestion: the smaller the bandwidth, or if we are sharing the bandwidth with many computers, the more likely we are to experience congestion, that is, slower internet.



A simple test to measure latency is to run a ping. This is a network diagnostic tool primarily used to test connectivity between two servers or devices.

To ping a destination server, an Internet Control Message Protocol (ICMP) echo request packet is sent to that server. If a connection is available, the destination node responds with an echo reply. Ping calculates the round-trip time of the data packet's route from its origin to its destination, and vice versa, thus determining whether any packets were lost during the journey.
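The round-trip measurement that ping performs can be sketched in a few lines. Real ping uses ICMP, which requires raw sockets (and usually root privileges), so this self-contained sketch measures RTT over a loopback TCP echo instead; the timing principle is the same:

```python
import socket
import threading
import time

def echo_server(sock):
    """Accept one connection and echo back whatever it receives."""
    conn, _ = sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# A loopback TCP "ping": measure round-trip time the way ping does,
# but over TCP instead of ICMP (raw ICMP sockets require privileges).
server = socket.socket()
server.bind(("127.0.0.1", 0))  # let the OS pick a free port
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

with socket.create_connection(server.getsockname()) as client:
    start = time.perf_counter()
    client.sendall(b"ping")        # our stand-in for the echo request
    reply = client.recv(1024)      # our stand-in for the echo reply
    rtt_ms = (time.perf_counter() - start) * 1000
    print(f"reply={reply!r} rtt={rtt_ms:.3f} ms")
```

Against a remote host you would simply use the system `ping` command, which reports the same round-trip time per packet along with any packet loss.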

Remember that having a technology partner with the necessary experience and knowledge will help you achieve your business goals.





Topics: latency