Understanding latency

This article explains what latency is, how it impacts performance, how to measure latency, and how to reduce it.

What is latency?

Latency describes the amount of delay on a network or Internet connection.

Latency is measured by timing how long it takes for a resource to travel across a network or the Internet. Low latency means there is little or no delay; high latency means there are noticeable delays.
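For example, in a browser you can inspect the delay for each resource the current page has already loaded using the Resource Timing API. The sketch below is only illustrative: it logs how long each resource took from the start of its request to the arrival of its last byte.

```ts
// A minimal sketch, assuming a browser environment that supports the Resource Timing API.
// It logs how long each resource on the current page took to fetch, in milliseconds.
const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

for (const entry of resources) {
  // entry.duration covers the time from the start of the request
  // until the last byte of the response is received.
  console.log(`${entry.name}: ${entry.duration.toFixed(1)} ms`);
}
```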

On a connection with low latency, requested resources appear almost immediately. On a connection with high latency, there is a discernible delay between the time a request is sent and the time the resources are returned. We can determine the amount of latency by measuring how long it takes data to travel from one network location to another.

Latency can be measured one way (for example, the time it takes to send a request for a resource) or as a round trip (from the browser's request for a resource to the moment the requested resource arrives back at the browser).
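As a rough illustration of round-trip measurement, the sketch below times a single request using fetch() and performance.now(). The URL is only a placeholder, and caching is disabled so that a cached response does not hide the network delay.

```ts
// A minimal sketch of measuring round-trip latency from a browser (or any runtime
// with fetch and the Performance API). The URL is a placeholder, not a real endpoint.
async function measureRoundTrip(url: string): Promise<number> {
  const start = performance.now();                          // before the request is sent
  await fetch(url, { method: "HEAD", cache: "no-store" });  // bypass caches so the network is actually used
  return performance.now() - start;                         // elapsed milliseconds for the full round trip
}

measureRoundTrip("https://example.com/")
  .then((ms) => console.log(`Round-trip latency: ${ms.toFixed(1)} ms`));
```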
