Jitter

Overview

The term "jitter" is used to describe variations in a periodic signal, which can be in the frequency, amplitude, or "phase" of the signal in relationship to the idea form or the form at the point of generation. In digital audio; one of the most problematic issues involving jitter is in "clock recovery" from signals transmitted between equipment. If jitter in the recovered clock signal affects the clocking of the conversion, even small amounts of jitter can effectively reduce the resolution of the conversion to far below the theoretical limits of 16-24 bit conversion.

Basics

Because even "digital" signals are actually very high frequency analog signals, the receiving device must reconstruct the signal by means of some form of amplitude "threshold."