Jitter

In networking, jitter refers to small, intermittent delays that occur during data transfers. It can arise for many reasons, including network congestion, collisions, and signal interference.

Technically, jitter is the variation in latency, the delay between when a signal is transmitted and when it is received.
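
To make that concrete, jitter is often quantified as the variation in per-packet transit delay. The short Python sketch below uses hypothetical delay values and computes both a simple max-minus-min spread and a smoothed estimate in the style of RTP's interarrival jitter (RFC 3550).

# Minimal sketch: estimating jitter from per-packet transit delays.
# The delay values are hypothetical, in milliseconds.
transit_delays_ms = [42.0, 45.5, 41.8, 60.2, 43.1, 44.0, 58.7, 42.5]

# Simple spread: difference between the largest and smallest delay.
spread_ms = max(transit_delays_ms) - min(transit_delays_ms)

# Smoothed estimator in the style of RTP (RFC 3550): for each pair of
# consecutive packets, fold 1/16 of the change in transit delay into a
# running estimate.
jitter_ms = 0.0
for prev, curr in zip(transit_delays_ms, transit_delays_ms[1:]):
    jitter_ms += (abs(curr - prev) - jitter_ms) / 16

print(f"Delay spread: {spread_ms:.1f} ms")
print(f"Smoothed jitter estimate: {jitter_ms:.2f} ms")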

All networks experience some degree of latency, especially wide area networks that span the Internet. These delays, typically measured in milliseconds, can be problematic for real-time applications such as online gaming, streaming, and digital voice communication.

Jitter makes this worse by adding further, unpredictable delays on top of that baseline latency.

Network jitter is particularly troublesome because it causes packets to arrive at irregular intervals.

For example, there may be a delay after some packets are sent, and then several packets may arrive all at once. This can cause packet loss if the receiving system cannot process the entire burst.

If this happens during a file download, the lost packets are simply resent, which slows the transfer. In a real-time service, such as audio streaming, the data is simply lost, which can cause the audio to drop out or degrade in quality.
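
To illustrate the effect described above, the hypothetical sketch below feeds a bursty arrival pattern into a small fixed-capacity receive queue and counts how many packets are dropped; all of the numbers are made up.

from collections import deque

# Minimal sketch with made-up numbers: packets arriving in a burst can
# overflow a small receive queue, causing packet loss.
QUEUE_CAPACITY = 4      # packets the receiver can hold at once (hypothetical)
PROCESSED_PER_TICK = 1  # packets the receiver handles each time step

# Packets arriving at each time step: steady at first, then a
# jitter-induced gap followed by a burst of delayed packets.
arrivals = [1, 1, 1, 0, 0, 6, 1, 1]

queue = deque()
dropped = 0

for arriving in arrivals:
    for _ in range(arriving):
        if len(queue) < QUEUE_CAPACITY:
            queue.append("packet")
        else:
            dropped += 1          # queue full: the packet is lost
    for _ in range(min(PROCESSED_PER_TICK, len(queue))):
        queue.popleft()           # receiver processes what it can

print(f"Packets dropped due to the burst: {dropped}")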

The standard way to compensate for network jitter is to use a buffer that stores incoming data before it is used, such as a few seconds of audio or video.

This smooths out media playback, since it gives the receiving computer a few seconds to receive any packets that were delayed by jitter.

While buffers are an effective solution, they must be kept very small in real-time applications such as online gaming and video conferencing. If the buffer grows too large (more than about 10 ms), it can introduce a noticeable delay of its own.
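
As a rough illustration of how such a buffer works, the sketch below (with hypothetical timings) delays playback by a fixed number of time units, so packets that arrive late or out of order within that window can still be played in sequence.

import heapq

# Minimal sketch of a fixed-delay jitter (playout) buffer.
# All timings and packet data are hypothetical.
BUFFER_DELAY = 3  # time units of buffering before playback starts

# (arrival_time, sequence_number) pairs; jitter reorders and bunches them.
arrivals = [(0, 0), (1, 1), (2, 2), (5, 4), (5, 3), (6, 5), (7, 6)]

buffer = []            # min-heap ordered by sequence number
playout_log = []
next_seq = 0

for now in range(10):  # advance time one unit per step
    # Add every packet that arrives at this time step to the buffer.
    for arrival_time, seq in arrivals:
        if arrival_time == now:
            heapq.heappush(buffer, seq)
    # Playback starts only after the initial buffering delay.
    if now >= BUFFER_DELAY:
        if buffer and buffer[0] == next_seq:
            playout_log.append((now, heapq.heappop(buffer)))
            next_seq += 1
        else:
            playout_log.append((now, None))  # nothing playable: a gap

print(playout_log)

With these values the buffered delay absorbs the jitter and every packet plays in order; with a smaller BUFFER_DELAY, the packets delayed until time 5 would miss their playout slots, producing the gaps that cause dropouts in audio or video.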