The word latency refers to the time it takes for a packet of data to travel from one point to another. In other words, latency is the time information takes to go from source to destination. When a computer accesses a site whose server is 1,000 km away, for example, latency determines the response time between the request for the information and its delivery. Latency is often measured by sending a packet that is returned to the sender; this round-trip time is what gets reported as latency, or "ping," in most web measurement tools. Latency is determined by factors such as:
- Propagation delay: the time the packet takes to cover the physical distance between the two points, bounded by the speed of light in the medium
- Transmission delay: the medium (optical fiber, wireless, etc.) and the packet size; the larger the packet, the longer it takes to put on the wire
- Processing and queuing along the path: network congestion and the routers and firewalls the packet passes through
When you access any website or send an e-mail, the request leaves your computer over Wi-Fi or cable and crosses the network infrastructure (usually poles and cables and, in the case of other countries, an international cable) until it reaches the carriers connected to the data center hosting the site in question. The greater the physical distance between you and the server, the higher the latency, or ping.
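The relationship between distance and latency can be sketched with a back-of-the-envelope calculation. The figures below are illustrative assumptions, not measurements: light travels at roughly two-thirds of its vacuum speed in optical fiber, which is why the 1,000 km example above carries a physical lower bound of a few milliseconds before any equipment delay is added.

```python
# Rough propagation-delay estimate (illustrative values, not measurements).
C_VACUUM_KM_S = 300_000   # speed of light in vacuum, km/s (rounded)
FIBER_FACTOR = 2 / 3      # light in fiber travels at roughly 2/3 of c

def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay over optical fiber, in milliseconds."""
    speed_km_s = C_VACUUM_KM_S * FIBER_FACTOR
    return distance_km / speed_km_s * 1000

# The 1,000 km example from the text: ~5 ms one way, ~10 ms round trip,
# before counting routers, switches, congestion, or firewalls.
one_way = propagation_delay_ms(1000)
print(f"one-way: {one_way:.1f} ms, round trip: {2 * one_way:.1f} ms")
```

Real-world pings are always higher than this floor, because each hop in the path adds its own processing and queuing time.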
Celérix has the shortest route between Brasília and São Paulo and runs the newest, most modern equipment, which makes the latency on its network the lowest among all operators on this route. Every piece of equipment in the path of the information, whether router or switch, adds to the latency, and Celérix's route, built on DWDM technology, is arguably the fastest.
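Round-trip time of the kind discussed above can be approximated without any special tooling by timing a TCP handshake. This is a minimal sketch, not a substitute for a proper `ping`: it measures one connection setup, so results vary run to run, and the host and port are placeholders you would point at a real server.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int, timeout: float = 2.0) -> float:
    """Approximate round-trip time by timing a TCP handshake, in ms."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the handshake alone is enough; close immediately
    return (time.perf_counter() - start) * 1000

# Demo against a local listener so the sketch is self-contained;
# replace host/port with a real server to measure network latency.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
print(f"RTT to localhost: {tcp_rtt_ms('127.0.0.1', port):.3f} ms")
server.close()
```

Averaging several handshakes, as ping tools do with ICMP echoes, gives a steadier estimate than a single sample.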