TELECOMMUNICATION ~ Difference between Jitter and Latency


    What is the difference between jitter and latency? Jitter and latency are both used to measure network reliability. The main difference lies in their definitions: latency is the delay itself on the network, while jitter is the variation in that delay.

   
     Increasing latency and jitter have a detrimental effect on network performance, so it is essential to monitor them regularly. Latency and jitter increase when the speeds of two devices do not match; congestion then causes buffer overflows and bursts in the traffic.

Jitter Definition:

    Packets continuously transmitted over the network will have different delays, even if they choose the same path. This is inherent in a packet-switched network for two main reasons. First, packets are routed individually. Second, network devices receive packets in a queue, so consistent pacing cannot be guaranteed.

    This inconsistency in the delay between packets is known as jitter. It can be a significant problem for real-time communications, including IP telephony, video conferencing and virtual desktop infrastructure. Jitter can be caused by many factors on the network, and every network has some amount of delay variation.
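    To make this concrete, here is a minimal Python sketch (the timestamps are made-up values, not captured traffic) that computes each packet's one-way delay and the jitter between consecutive packets, along with the smoothed running estimate that RTP (RFC 3550) uses:

    # (send_time, receive_time) pairs for consecutive packets, in seconds
    packets = [(0.000, 0.050), (0.020, 0.072), (0.040, 0.095), (0.060, 0.112)]

    # One-way delay (latency) of each packet
    delays = [recv - sent for sent, recv in packets]

    # Jitter samples: the difference in delay between consecutive packets
    samples = [abs(delays[i] - delays[i - 1]) for i in range(1, len(delays))]

    # RFC 3550 keeps a smoothed running estimate with a gain of 1/16
    jitter = 0.0
    for d in samples:
        jitter += (d - jitter) / 16

    print(f"delays: {delays}")
    print(f"jitter samples: {samples}")
    print(f"smoothed jitter: {jitter:.6f} s")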

What are the effects of jitter?


    Packet loss - When packets do not arrive at consistent intervals, the receiving endpoint must compensate and attempt to correct for the variation. In some cases it cannot make the appropriate corrections, and packets are lost. For the end user this can take many forms: for example, if a video becomes pixelated while playing, that is a sign of potential jitter.

    Network Congestion - Congestion occurs when network devices cannot forward as much traffic as they receive, so their packet buffers fill up and they start dropping packets. If there is no disruption on the network, every packet arrives at the endpoint; but as buffers fill, packets arrive later and later, which appears as jitter. By monitoring jitter it is therefore possible to observe the onset of congestion: as soon as congestion begins, jitter changes rapidly.

    Once devices are dropping packets, the end node never receives them and may request that the missing packets be retransmitted, which adds still more traffic and compounds the congestion.

    With congestion, it is important to note that the receiving endpoint does not cause it directly and does not drop packets itself. Consider a highway with a sending station A and a receiving station B: the congestion is not caused by B having too little parking, but by A continuing to send cars down the highway toward B.
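    The dynamic is easy to see in a toy simulation (purely illustrative, with arbitrary numbers): a device that receives two packets per tick but can forward only one eventually fills its buffer and starts dropping.

    from collections import deque

    BUFFER_SIZE = 4          # arbitrary small buffer for illustration
    buffer = deque()
    dropped = 0

    for tick in range(10):
        for pkt in (2 * tick, 2 * tick + 1):   # two packets arrive per tick
            if len(buffer) < BUFFER_SIZE:
                buffer.append(pkt)
            else:
                dropped += 1                   # buffer full: packet is lost
        if buffer:
            buffer.popleft()                   # only one packet forwarded per tick

    print(f"packets dropped due to congestion: {dropped}")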

How do I compensate for jitter?


    To compensate for jitter, a jitter buffer is used at the receiving end of a connection. The jitter buffer collects and stores incoming packets so that it can send them on at regular intervals.
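    As a rough illustration, here is a toy jitter buffer in Python (the class and parameter names are hypothetical, not a production design): it reorders packets by sequence number and plays them out at a fixed interval.

    import heapq
    import time

    class JitterBuffer:
        def __init__(self, playout_interval=0.020):  # 20 ms, a common voice frame spacing
            self.playout_interval = playout_interval
            self.heap = []

        def push(self, seq, payload):
            # Packets may arrive out of order; the heap reorders them by seq
            heapq.heappush(self.heap, (seq, payload))

        def play(self):
            # Release buffered packets at regular intervals, restoring pacing
            while self.heap:
                seq, payload = heapq.heappop(self.heap)
                print(f"playing packet {seq}: {payload}")
                time.sleep(self.playout_interval)

    buf = JitterBuffer()
    for seq, payload in [(2, "b"), (1, "a"), (3, "c")]:  # arrived out of order
        buf.push(seq, payload)
    buf.play()  # plays 1, 2, 3 at a steady 20 ms spacing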

Definition of Latency:


    Latency is the time required for a data packet to reach the destination from the source. In networking terms, it is the time between a user issuing a network request and that user receiving the response. Roughly speaking, latency is the time elapsed between those two events.

    Latency includes the time needed to process the message at the source and destination ends, plus the delays generated on the network itself. Network latency can be measured in two ways. The first, one-way latency, measures only the time elapsed between the source sending a packet and the destination receiving it. The second, round-trip latency, is the sum of the one-way latency from node A to node B and the one-way latency from node B back to node A.
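    Round-trip latency is the easier of the two to measure, because it needs a clock on only one side; one-way latency requires synchronized clocks at both ends. As a minimal sketch (example.com and port 80 are placeholder targets, not part of any standard tool), the time to complete a TCP handshake approximates one round trip:

    import socket
    import time

    def round_trip_latency(host="example.com", port=80, timeout=2.0):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=timeout):
            pass  # handshake done: one SYN / SYN-ACK round trip has completed
        return time.perf_counter() - start

    rtt = round_trip_latency()
    print(f"approximate round-trip latency: {rtt * 1000:.1f} ms")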

Key differences between jitter and latency:


  •     The delay produced between the departure and the arrival of an IP packet is called latency. Conversely, jitter is the variation in that delay between successive packets.
  •     Network congestion can cause jitter, while propagation delay, switching, routing and buffering can generate latency.
  •     Jitter can be compensated for by using timestamps. On the other hand, latency can be reduced by using multiple connections to the Internet.

In short:


    Latency and jitter are related, but they are not the same. Latency is the time required for a data packet to reach the destination from the source; it is a composite measure affected by multiple factors. Jitter, on the other hand, is the difference in delay between two packets, and it too can be caused by several factors on the network. So although jitter and latency share some commonalities, jitter is derived from delay but is not equivalent to it.


THANK YOU for reading

You Tech 56

Follow us on social media: linktr.ee/youtech56
