Javk
Explain why jitter is undesirable for real-time communications such as voice and video.
Computer Science
Javk
Is this correct? The question says explain, and I don't think I've explained so much as stated: jitter causes the received signal to break up because of the variability of delay. This can lead to the voice or video stream breaking up and becoming unintelligible, which would in turn defeat the purpose of the communication.
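For what it's worth, the "variability of delay" part can be made concrete. RTP receivers keep a running interarrival-jitter estimate (RFC 3550 describes the usual smoothed form); the bigger it gets, the more the playout schedule is disturbed. A minimal Python sketch, where the function name and the sample timestamps are just illustrative:

```python
def interarrival_jitter(send_times, recv_times):
    """Smoothed interarrival-jitter estimate in the style of RFC 3550.

    send_times and recv_times are parallel lists of timestamps in the
    same units (e.g. milliseconds).
    """
    jitter = 0.0
    for i in range(1, len(send_times)):
        # Difference in transit time between consecutive packets:
        # positive or negative depending on whether packet i was
        # delayed more or less than packet i-1.
        d = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
        # Exponential smoothing with gain 1/16, as in RFC 3550.
        jitter += (abs(d) - jitter) / 16.0
    return jitter

# Packets sent evenly every 20 ms, but arriving with wobble:
print(interarrival_jitter([0, 20, 40, 60], [10, 35, 48, 90]))  # ~2.06
```

With zero jitter the arrival spacing matches the send spacing and the estimate stays at 0; the wobble is what the playout side has to absorb.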
Curry
Jitter can refer to a number of things, and there are many different sources of it: thermal noise in the hardware, interference from other signals, a receiver that can't decode at a given speed, or even a mismatch between the transmitter's and receiver's baud rates. There are plenty more, but those are the common ones. Anywho, this is bad for any type of communication because jitter ultimately shows up as errors in the data bits. In real-time communication it's especially bad, since your image or voice might be distorted or become unintelligible. I'd also assume that real-time communication doesn't leave enough time to run a full error-correction algorithm, so it has to accept a higher error threshold. In that case, like you stated, the signal could break up or become fully corrupted.
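On the "doesn't have enough time" point: one common way to picture it is a playout (jitter) buffer. The receiver delays playback by a fixed amount, and any packet that arrives after its playout deadline is effectively lost, which is what you hear as breakup. A toy sketch, with the 20 ms packet period and 40 ms buffer chosen purely for illustration:

```python
def play_out(arrivals, period_ms=20.0, buffer_ms=40.0):
    """Toy playout model: packet i is scheduled at i * period_ms + buffer_ms.

    arrivals: arrival times in ms for packets 0..n-1.
    Returns the indices of packets that miss their playout deadline.
    """
    late = []
    for i, t in enumerate(arrivals):
        deadline = i * period_ms + buffer_ms
        if t > deadline:
            # Arrived too late to play; the decoder has to conceal
            # or drop it, which the listener hears as a gap.
            late.append(i)
    return late

# A delay spike on packet 3 makes it miss its 100 ms slot:
print(play_out([5, 25, 45, 130, 85]))  # -> [3]
```

A bigger buffer tolerates more jitter but adds end-to-end latency, which is exactly the trade-off that makes jitter so painful for real-time voice and video.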
Javk
Thanks. So I'm not really missing anything in the answer... the answer's just rather simplistic, right?
