Updated Oct 5, 2020 · 2 min read

What is latency?

Latency is the time between when a user makes a request and when the website or application responds to that request. Low latency is associated with a positive user experience, while high latency is associated with a negative one. Latency is typically measured in milliseconds (ms). The major factors affecting latency include the type of network connection, network hardware, physical distance between client and server, and bandwidth congestion.

How you can measure latency

Many websites and online tools offer ways to measure latency. These tests send a ping to a web server and measure the time it takes to receive a response; this round-trip measurement is commonly called the ping rate.
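As a rough sketch, the timing logic behind such a test can be reproduced in a few lines of Python. The `fake_request` stand-in below only simulates a network round trip with a delay; a real test would send an ICMP ping or an HTTP request instead:

```python
import time

def measure_latency_ms(request_fn):
    """Time a single request and return the round-trip latency in milliseconds."""
    start = time.perf_counter()
    request_fn()  # send the request and wait for the response
    return (time.perf_counter() - start) * 1000

# Stand-in for a real network call: pretend the round trip takes about 50 ms.
def fake_request():
    time.sleep(0.05)

latency = measure_latency_ms(fake_request)
print(f"ping: {latency:.1f} ms")
```

Real ping tools usually repeat this measurement several times and report the average, since any single round trip can be skewed by momentary congestion.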

Ways to reduce latency

One way web developers can reduce latency is by reducing the physical distance between the client and the server. Content delivery networks (CDNs), which distribute web content across many geographically dispersed servers, reduce latency by serving each request from a server close to the user. Another way is to make programs more efficient, for example by optimizing images and files so they load faster and by reducing the number of resources that block content rendering. Web developers can also reduce perceived latency by rendering critical content first, such as loading only the areas a user can immediately see.
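The CDN idea of serving each user from the nearest server can be sketched as picking the candidate with the lowest measured latency. The server names and latency figures below are hypothetical, standing in for real probe results:

```python
def pick_fastest(servers, probe):
    """Return the server with the lowest probed latency (in milliseconds)."""
    return min(servers, key=probe)

# Hypothetical probe results, e.g. from pinging each edge server once.
measured_ms = {
    "edge-eu.example.com": 120.0,
    "edge-us.example.com": 35.0,
    "edge-ap.example.com": 210.0,
}

best = pick_fastest(measured_ms, measured_ms.get)
print(best)  # edge-us.example.com
```

In practice a CDN makes this choice for you, typically through DNS resolution or anycast routing rather than client-side probing.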

Related Terms

Content Delivery Networks

A content delivery network (CDN) (sometimes referred to as a content distribution network) is a network of servers that’s distributed...


Servers

A server is a computer that provides a service (such as supplying data) to other computers and their users, known as clients.

Web Browsers

Web browsers (or “browsers”) are software applications used for accessing and viewing information on the internet.