Monday, October 14, 2019

What is Jitter

Jitter is the amount of variation in latency (response time), measured in milliseconds. A reliable connection reports roughly the same latency measurement after measurement; a lot of variation (or 'jitter') is an indication of problems.

Jitter is a symptom rather than a root cause: it indicates that something else is wrong. Often that 'something else' is bandwidth saturation (sometimes called congestion) - not enough bandwidth to handle the traffic load.


To measure jitter, we sum the differences between consecutive latency samples, then divide by the number of samples minus 1.

Here's an example. We have collected 5 samples with the following latencies: 136, 184, 115, 148, 125 (in that order). The average latency is roughly 142 ms (add them and divide by 5: 708 / 5 = 141.6). The jitter is calculated from the differences between consecutive samples:

136 to 184, diff = 48
184 to 115, diff = 69
115 to 148, diff = 33
148 to 125, diff = 23

(Notice that we get only 4 differences from 5 samples.) The total of the differences is 173, so the jitter is 173 / 4, or 43.25 ms.
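
Here is a minimal Python sketch of the same calculation (the function name and the sample list are only illustrative, not taken from any particular tool):

def jitter(samples):
    # Mean of the absolute differences between consecutive latency samples (ms).
    if len(samples) < 2:
        return 0.0
    diffs = [abs(samples[i] - samples[i - 1]) for i in range(1, len(samples))]
    return sum(diffs) / len(diffs)   # N samples give N - 1 differences

latencies = [136, 184, 115, 148, 125]
print(jitter(latencies))   # prints 43.25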

The same mechanism applies no matter how many samples you have; it works on 5, 50, or 5000.

Some tools consider the jitter value problematic when it exceeds 15% of the average latency. If the average latency on a hop is 150 ms, then any jitter above 22.5 ms will be treated as bad.
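
As a rough sketch of that rule of thumb (the 15% threshold and the variable names here are only illustrative):

avg_latency = 150.0              # average latency on the hop, in ms
measured_jitter = 43.25          # jitter computed as above, in ms

threshold = 0.15 * avg_latency   # 15% of 150 ms = 22.5 ms
if measured_jitter > threshold:
    print("Jitter looks bad on this hop")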



References:
https://www.pingman.com/kb/article/what-is-jitter-57.html

