Overview
The term "analog to digital converter" is used to describe a device that accepts analog audio inputs and outputs a digital code that represents the original analog input. This code is typically linear PCM format; but may also be other formats such as DSD or I2S (typically used internally in analog to digital converter units). Once encoded, the information can be stored, transmitted, or copied in a lossless manner. In most instances; further processing is used to generate other formats that employ data compression of both the lossless and lossy variety.
The term can be used to describe the actual analog to digital converter IC or circuit, or an entire unit that incorporates all of the necessary support circuitry to accept line level analog input signals and output the encoded digital audio signal in one or more formats.
For brevity, the terms "AD converter" and "AD" will be used interchangeably with "analog to digital converter" in the following discussion.
History
Prior to the development of practical digital audio recording systems, AD converters were used in applications such as medical testing and monitoring equipment, and industrial instrumentation (measurement and monitoring). These early converters were limited, either by the converter technology of the time or by the amount of data the associated system could handle, to much lower resolution than is typically used to encode audio. Resolution in both the amplitude domain (typically the voltage of the input waveform) and the time domain was often quite limited by digital audio standards.
Before storage of the huge amount of information generated by CD quality AD converters became practical, the earliest application in music recording was in "outboard" equipment such as digital delay or effects processors. Largely because the output of these early units was mixed in with the original (unprocessed) source at a low level as an ambient effect, the less-than-high-fidelity quality of the converters was acceptable. Even with the noise and distortion present in analog recordings, the perceived quality of analog tape recordings was far better than that of the signal processed through these early converters. One of the more popular early digital delay units employed a novel form of digital encoding, "sigma-delta": in contrast to the "linear PCM" format, where each "sample" of the analog input waveform is represented by a digital word made up of a number of bits, sigma-delta encoded only one bit at a relatively high sample frequency. Compared to the relatively inaccurate PCM-based units, most recording engineers felt that the sigma-delta digital delay unit sounded closer to the source.
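To make the contrast concrete, here is a minimal sketch of a first-order sigma-delta modulator, which emits a single bit per (highly oversampled) input sample. It illustrates the general principle only, not the design of any particular delay unit:

    # Minimal sketch of a first-order sigma-delta modulator: one output bit
    # per oversampled input; the density of 1s in the stream tracks the input.
    import math

    def sigma_delta(samples):
        integrator, feedback, bits = 0.0, 0.0, []
        for x in samples:
            integrator += x - feedback           # accumulate quantization error
            bit = 1 if integrator >= 0.0 else 0  # 1-bit quantizer
            feedback = 1.0 if bit else -1.0      # 1-bit DAC in the feedback path
            bits.append(bit)
        return bits

    # Oversample a slow sine wave; the bit density follows the waveform.
    stream = sigma_delta([0.5 * math.sin(2 * math.pi * i / 64) for i in range(64)])
    print("".join(str(b) for b in stream))

Averaging (low-pass filtering) the one-bit stream recovers the original waveform, which is why a fast single-bit quantizer can substitute for a slower multi-bit one.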
With the introduction of Compact Disc technology by Sony/Philips in the early 1980s came the standard of recording audio in 16 bit linear PCM format. AD converter technology was still evolving at the time, and even though many AD converters were nominally "16 bit" they were not truly accurate to 16 bit resolution. Contemporary AD converters are typically "24 bit" and are accurate to approximately 22-23 bits. The sample frequency capability of contemporary converters has likewise advanced, with rates such as 96 kHz and 192 kHz now common in addition to the 44.1 kHz CD standard.
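The "accurate to approximately 22-23 bits" claim can be put in perspective with the textbook ideal-quantizer dynamic-range approximation, DR ≈ 6.02·N + 1.76 dB for an N-bit converter; the following is a rough sketch of that arithmetic, not a measurement of any real device:

    # Ideal-quantizer dynamic range: DR ~= 6.02 * N + 1.76 dB for N bits.
    def dynamic_range_db(bits):
        return 6.02 * bits + 1.76

    for n in (16, 22, 24):
        print(f"{n} bits -> {dynamic_range_db(n):.1f} dB")
    # 16 bits -> 98.1 dB; 22 bits -> 134.2 dB; 24 bits -> 146.2 dB.
    # A nominal "24 bit" converter whose analog noise floor sits near
    # 134 dB therefore delivers roughly 22 bits of usable resolution.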