In computing, latency refers to a delay, typically one that occurs while data is being transmitted or processed. It can arise for many reasons.

Two examples of latency are network latency and disk latency, which are explained below.

1. Network Latency

Network latency is a type of delay that occurs during communication over a network (including the Internet).

For example, a slow router may add a delay of a few milliseconds when one system on a LAN connects to another through it.

A more noticeable delay occurs when two computers located on different continents communicate with each other over the Internet. In that case, there may be some delay while the connection is being established, since it depends on the distance and the number of "hops" involved in making the connection.
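Part of that intercontinental delay is physical: signals in fiber cannot travel faster than roughly two-thirds the speed of light. The sketch below estimates that lower bound; the fiber speed and route distance are illustrative assumptions, and real routes add router hops on top of this minimum.

```python
# Rough one-way propagation delay over long-distance fiber.
# Assumption: light travels through fiber at roughly 2/3 the speed of
# light in a vacuum (~200,000 km/s).

FIBER_SPEED_KM_PER_S = 200_000  # approximate signal speed in fiber

def propagation_delay_ms(distance_km: float) -> float:
    """Minimum one-way delay in milliseconds for a given fiber distance."""
    return distance_km / FIBER_SPEED_KM_PER_S * 1000

# A transatlantic route of ~6,000 km has at least ~30 ms one-way delay,
# before counting any routing overhead.
print(round(propagation_delay_ms(6000), 1))  # 30.0
```

This is why even a perfect network cannot eliminate latency between distant continents.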

In such cases, "ping" response time is a good indicator of latency.
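What ping reports is round-trip time: how long it takes a small message to reach the other machine and come back. A minimal self-contained sketch of the same measurement, using a loopback TCP echo instead of a real remote host (the host, port, and payload here are illustrative, not a real network test):

```python
# Measure round-trip latency, in the spirit of ping, against a tiny
# echo server running on loopback so the example is self-contained.
import socket
import threading
import time

def echo_once(server: socket.socket) -> None:
    conn, _ = server.accept()
    conn.sendall(conn.recv(64))  # echo the payload straight back
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
start = time.perf_counter()
client.sendall(b"ping")
client.recv(64)
rtt_ms = (time.perf_counter() - start) * 1000
client.close()
print(f"round-trip time: {rtt_ms:.3f} ms")  # loopback, so typically well under 1 ms
```

Against a host on another continent, the same measurement would show tens to hundreds of milliseconds.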

2. Disk Latency

Disk latency is the delay between the time a request for data is made to a storage device and the time the data starts to be returned. Factors that affect disk latency include the rotational latency (of a hard drive) and seek time.

For example, a hard drive with a rotational speed of 5400 RPM has nearly double the rotational latency of a drive that rotates at 10,000 RPM. Similarly, seek time, the time it takes the drive head to physically move to the location where the data will be read or written, also adds latency.
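The "nearly double" figure follows directly from the rotation speed: on average the platter must spin half a revolution before the requested sector passes under the head. A quick calculation:

```python
# Average rotational latency: on average, half a revolution must pass
# before the requested sector reaches the drive head.

def avg_rotational_latency_ms(rpm: int) -> float:
    revolutions_per_ms = rpm / 60 / 1000
    return 0.5 / revolutions_per_ms  # half a revolution, in milliseconds

print(round(avg_rotational_latency_ms(5400), 2))   # 5.56 ms
print(round(avg_rotational_latency_ms(10000), 2))  # 3.0 ms
```

5.56 ms versus 3.0 ms is a ratio of about 1.85, hence "nearly double."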

Because of disk latency, reading or writing a large number of small files is much slower than reading or writing a single contiguous file of the same total size. Since SSDs do not rotate like traditional HDDs, they have much lower latency.
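The small-files penalty can be seen with a toy cost model in which every separate access pays the per-request latency once. The latency and throughput figures below are illustrative assumptions, not measurements of any particular drive:

```python
# Toy model: total transfer time = per-access latency paid once per file,
# plus the data itself at a fixed throughput. Figures are illustrative.

def transfer_time_ms(n_files: int, file_kb: float,
                     latency_ms: float = 10.0,
                     throughput_kb_per_ms: float = 100.0) -> float:
    data_time = n_files * file_kb / throughput_kb_per_ms
    return n_files * latency_ms + data_time

# Same 100,000 KB of data: 1000 small files vs one contiguous file.
many_small = transfer_time_ms(1000, 100)      # pays latency 1000 times
one_large = transfer_time_ms(1, 100_000)      # pays latency once
print(many_small, one_large)
```

In this model the thousand small files take roughly ten times as long, even though the amount of data moved is identical.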

3. Other Types of Latency

There are also many other types of latency, such as RAM latency (also called "CAS latency"), CPU latency, audio latency, and video latency. The common thread in all of them is that the delay is produced by some kind of bottleneck.
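As one concrete case, a RAM module's CAS latency is quoted in clock cycles, which can be converted to nanoseconds from the module's data rate. A hedged sketch (the DDR4-3200 CL16 figures below are an illustrative example, and the formula assumes DDR memory, which transfers data twice per clock):

```python
# Convert RAM CAS latency from clock cycles to nanoseconds.
# For DDR memory: true latency (ns) = CAS cycles * 2000 / data rate (MT/s),
# because the clock runs at half the transfer rate.

def cas_latency_ns(cas_cycles: int, data_rate_mt_s: int) -> float:
    return cas_cycles * 2000 / data_rate_mt_s

print(cas_latency_ns(16, 3200))  # e.g. DDR4-3200 CL16 -> 10.0 ns
```

Two modules with different cycle counts can therefore have the same real-world latency if their data rates differ proportionally.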

In computing, these delays are usually only a few milliseconds each, but they can add up to noticeable slowdowns in performance.

It is important not to confuse latency with other measurements such as data transfer rate or bandwidth. Latency refers to the delay before data transfer begins, not the speed of the transfer itself.
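The distinction shows up clearly in a simple timing model: the total time for a request is the latency plus the transfer time, and for small requests the latency term dominates regardless of bandwidth. The link figures below are illustrative assumptions:

```python
# Latency and bandwidth are independent: a high-bandwidth link can still
# feel slow for small requests if its latency is high. Figures illustrative.

def request_time_ms(size_kb: float, latency_ms: float,
                    bandwidth_kb_per_ms: float) -> float:
    return latency_ms + size_kb / bandwidth_kb_per_ms

# A 1 KB request over a fast but distant link vs a slow but nearby one:
distant_fast = request_time_ms(1, latency_ms=100, bandwidth_kb_per_ms=1000)
nearby_slow = request_time_ms(1, latency_ms=5, bandwidth_kb_per_ms=10)
print(distant_fast, nearby_slow)
```

For this small request the low-bandwidth nearby link wins easily; only for large transfers does bandwidth become the deciding factor.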
