Have you ever been frustrated because a webpage took too long to load? That delay is often due to latency. Latency is a measure of the time taken for a data packet to traverse a network; in loose terms, it is simply delay. Lower latency means a faster network. Network latency is usually measured in milliseconds.
This latency is introduced by the signal traversing geographical distance and by passing through various pieces of communications equipment along the way: network hops, proxy servers, routers, VPNs (virtual private networks), and so on.
Since telecommunications companies mostly promote bandwidth, latency is a concept not many people know about. However, latency and bandwidth together determine the efficiency of an internet connection.
Network latency creates bottlenecks for an internet connection. A 5 Mbps connection can perform better than a 15 Mbps one if the former's latency is lower than the latter's.
How are network latency and bandwidth different?
As mentioned above, network latency is the delay incurred while transferring data. Bandwidth, on the other hand, is the capacity of the data transfer: the amount of data that can be moved per second. When you shop for a new connection from an internet service provider, you see plans advertised in terms of bandwidth, e.g., 40 Mbps or 1 Gbps.
Let’s take an analogy to make this clear. Think of a highway with five lanes and a speed limit of 70 mph. The number of lanes determines how many cars can be on the road at a time; to increase the number of cars traveling simultaneously, you add more lanes.
But whatever the case may be, because of the speed limit, the cars cannot go faster than 70 mph. Moreover, their speed is also limited by the number of signals, crossings, bumps, and so on.
The speed limit is analogous to a network's latency, whereas the number of cars that can travel simultaneously is analogous to its bandwidth.
In reality, several factors affect bandwidth. Excessive latency significantly decreases the capacity of the network pipe: less data flows through the network, reducing the effective bandwidth.
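One way to see why latency reduces effective bandwidth: a TCP sender can keep at most one receive window of unacknowledged data in flight per round trip, so round-trip time caps throughput regardless of the link's raw capacity. The sketch below is illustrative, not from the article; the function name and the 64 KiB window are assumptions.

```python
# Illustrative sketch: round-trip time caps effective TCP throughput,
# because at most one receive window of data can be in flight per RTT.
def max_throughput_mbps(window_bytes, rtt_ms):
    """Approximate throughput ceiling in Mbps: window / RTT."""
    rtt_s = rtt_ms / 1000.0
    return (window_bytes * 8) / rtt_s / 1_000_000

# A classic 64 KiB TCP window over a 100 ms path tops out near 5 Mbps,
# no matter how much raw bandwidth the link advertises.
print(round(max_throughput_mbps(65535, 100), 2))
```

This is the mechanism behind the earlier example: on a high-latency path, a 15 Mbps plan may never be able to fill its pipe.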
Why does network latency occur?
There is latency in an internet network regardless of its connection type. Whether the connection is dial-up, fiber optic, or satellite, latency is an inherent component of the network.
For a dial-up or cable connection, latency is generally around 100 milliseconds (ms), and in some cases even less than 25 ms. For a satellite-based connection, however, it can be 500 ms or higher.
Due to propagation
A satellite internet connection has both high bandwidth and high latency. So you will generally face a noticeable delay when loading a webpage over satellite, because the request message travels up to the satellite station before returning to your network.
The delay arises because the request message must travel a considerable distance. This kind of delay is called propagation delay.
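For a geostationary satellite, this propagation delay can be estimated from first principles. A back-of-the-envelope sketch, using standard physics figures that are not from the article:

```python
# Back-of-the-envelope propagation delay for a geostationary satellite link.
# Assumed figures: orbit altitude ~35,786 km, signal travelling at light speed.
ALTITUDE_KM = 35_786
LIGHT_SPEED_KM_S = 299_792

# A request and its response each cross the up-and-down path once,
# so the signal covers four legs in total.
rtt_ms = 4 * ALTITUDE_KM / LIGHT_SPEED_KM_S * 1000
print(f"{rtt_ms:.0f} ms")  # about 477 ms before any processing delay
```

Adding switching and processing overhead on top of this physical minimum is why satellite latency commonly lands at 500 ms or more.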
Due to WAN
Here, the delay is caused mainly by congestion. The network is so busy handling requests that it simply cannot serve them all at maximum speed, hence the delay. This in turn affects the wired network too.
Problems with the Hardware
There can be a problem with a device in the network. A fault within the device can increase the time it needs to read data, thereby causing delay. One example is a hard drive that takes too long to store or retrieve data.
Due to Software
Antivirus software is present on almost every computer, and it generally examines every bit of data that goes in and out of the machine.
This causes a delay when the software cannot handle too much data at once, which is one reason over-protected computers are slower than their peers.
How to measure network latency?
Since latency is such a critical network property, it is essential to measure it. An accurate measurement tells you the strength and speed of your network. The following methods are commonly used to measure latency:
Ping test
A ping test measures latency by monitoring the time a data packet takes to reach its destination and come back, known as the round-trip time.
A more effective approach is to ping multiple devices on different networks; comparing the results gives you a better idea of your latency.
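The round-trip times a ping run collects can be summarized the same way the ping utility does at the end of a run. A small hypothetical helper (the function name and sample values are invented for illustration):

```python
# Hypothetical helper: summarize round-trip-time samples (in ms) the way
# the ping utility reports min/avg/max latency at the end of a run.
def rtt_summary(samples_ms):
    return {
        "min": min(samples_ms),
        "avg": sum(samples_ms) / len(samples_ms),
        "max": max(samples_ms),
    }

# Samples you might collect by pinging the same host from two networks;
# the network with the lower summary has the lower latency.
print(rtt_summary([24.1, 25.3, 23.8, 26.0]))
```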
Traceroute test
A traceroute test is similar to a ping test in that it measures the time taken by a data packet to complete a round trip. However, traceroute reports the results on a hop-by-hop basis.
In internet networks, a data packet usually passes through many hops before reaching its destination, and the traceroute test uses those hops to locate latency effectively.
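Because traceroute reports a cumulative round-trip time at each hop, differencing adjacent hops gives a rough picture of where on the path latency is being added. A hypothetical sketch, with hop timings invented for illustration:

```python
# Hypothetical illustration: traceroute gives a cumulative RTT per hop;
# subtracting adjacent hops estimates the delay each hop contributes.
def per_hop_delay(cumulative_rtts_ms):
    deltas, prev = [], 0.0
    for rtt in cumulative_rtts_ms:
        deltas.append(rtt - prev)
        prev = rtt
    return deltas

# Hops reporting 2 ms, 10 ms, and 95 ms: the third hop adds most of the delay.
print(per_hop_delay([2.0, 10.0, 95.0]))  # [2.0, 8.0, 85.0]
```

This is only an estimate, since forward and return paths can differ per hop, but a large jump between two hops is a good place to start investigating.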
SNMP tools
Simple Network Management Protocol (SNMP) tools can track down the network bottlenecks that cause latency issues. Suppose a server backup is clogging a critical uplink that carries voice traffic:
an effective SNMP tool can identify both the interface and the bottleneck. SNMP tools can also flag CPU problems that may lead to higher latency.
Latency is one of the most important network properties to understand. It has a significant impact on data transfers over a network, so you should know the latency of your own internet connection.
If it is on the higher side, work with your Internet Service Provider (ISP) to resolve the issue. A low-latency connection gives you a better internet experience and lessens your frustrations.