Wordlength

Overview

The term wordlength describes the format of a digital audio signal and relates directly to the resolution, or quality, of the encoding.

History

In binary computer technology, information is contained in a digital "word," which is simply a pre-defined collection of individual "bits." A single bit can only define two "states," represented by a "1" or a "0." Although this is useful in limited situations where all-or-nothing, yes-or-no information is required, broader applications need more "states," or increments. By grouping a number of bits in a pre-defined manner, a digital word can represent a useful amount of information: a word of n bits can distinguish 2^n states.
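
The following short C program (a sketch for illustration only; it is not part of the original article) prints the number of increments available at common wordlengths, along with the approximate dynamic range of an ideal converter, which gains roughly 6 dB per bit:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int wordlengths[] = {8, 16, 24};   /* common digital audio wordlengths */

        for (int i = 0; i < 3; i++) {
            int n = wordlengths[i];
            /* An n-bit word can represent 2^n distinct increments */
            uint64_t increments = (uint64_t)1 << n;
            /* Ideal quantization yields roughly 6.02 dB of dynamic range per bit */
            double range_db = 6.02 * n;
            printf("%2d bits: %10llu increments, ~%.0f dB\n",
                   n, (unsigned long long)increments, range_db);
        }
        return 0;
    }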

Early binary computers used 8-bit words to represent up to 256 increments, which made them useful for basic calculations and industrial control, but not for audio. As technology advanced, 16-bit computers became the norm, and digital audio became a possibility. At that time, however, converter technology limited the quality of early 16-bit digital audio systems to much less than their potential.

Computer technology continued to advance, and 32-bit computers became available. Due in part to this evolution, it was not unusual for 32-bit computers to move information between the CPU and hard drive in pairs of 16-bit words. Practically speaking, digital audio converters with an accuracy of 24 bits are capable of encoding analog audio to the level of accuracy obtainable with very high-quality analog circuitry; so even though a 24-bit wordlength is not typical of computer technology, it has become a standard in digital audio. In most cases, computer software simply uses two 16-bit words (or one "32-bit word") to represent a 24-bit digital audio word, which simplifies operations. The first 16-bit word contains the most-significant audio bits, and the 16-bit word containing the 8 least-significant audio bits simply contains zeros in the other 8 bits.
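
The padding scheme described above can be sketched in a few lines of C (the function names are illustrative, not taken from any particular audio API): the signed 24-bit sample occupies the most-significant bits of the 32-bit word, and the 8 least-significant bits are left as zeros.

    #include <stdio.h>
    #include <stdint.h>
    #include <inttypes.h>

    /* Illustrative helper: pack a signed 24-bit sample (-8388608..8388607)
       into a 32-bit container. The 24 audio bits occupy the most-significant
       positions and the 8 least-significant bits are zeros. */
    int32_t pack_24_in_32(int32_t sample24) {
        /* Shift through unsigned to avoid undefined behavior on negative input */
        return (int32_t)((uint32_t)sample24 << 8);
    }

    /* Illustrative helper: recover the 24-bit value. The arithmetic right
       shift discards the zero padding and restores the sign. */
    int32_t unpack_24_from_32(int32_t word32) {
        return word32 >> 8;
    }

    int main(void) {
        int32_t sample = -123456;               /* a 24-bit audio sample */
        int32_t packed = pack_24_in_32(sample);
        printf("packed:   0x%08" PRIX32 "\n", (uint32_t)packed);  /* low byte is 0x00 */
        printf("unpacked: %" PRId32 "\n", unpack_24_from_32(packed));
        return 0;
    }

Because the audio bits sit at the top of the word, a 32-bit system can treat the padded value as an ordinary signed integer with no special handling.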

Although excellent-quality audio can be encoded to 24-bit digital audio, virtually any time a process is applied to the original 24-bit signal, more than 24 bits are needed to accurately represent the processed signal. For this reason, most processing is performed with 32-bit or 64-bit precision, and the output is only reduced to a 24-bit wordlength after dither and noise shaping are applied to retain quality.
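
As a rough sketch of that final wordlength-reduction step, the C function below adds triangular (TPDF) dither, one common choice, before rounding a high-precision sample down to 24 bits. The noise-shaping stage is omitted for brevity, and the function name is illustrative rather than taken from any real library.

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    /* Illustrative helper: reduce a high-precision sample (normalized to the
       range -1.0..1.0) to a signed 24-bit integer, adding TPDF dither before
       rounding. The noise-shaping stage mentioned above is omitted. */
    int32_t dither_to_24bit(double sample) {
        /* TPDF dither: the sum of two independent uniform random values,
           spanning +/-1 least-significant bit of the 24-bit target */
        double dither = ((double)rand() / RAND_MAX) - ((double)rand() / RAND_MAX);

        double scaled = sample * 8388607.0 + dither;   /* 2^23 - 1 = full scale */

        /* Clamp to the legal 24-bit range, then round to the nearest integer */
        if (scaled >  8388607.0) scaled =  8388607.0;
        if (scaled < -8388608.0) scaled = -8388608.0;
        return (int32_t)(scaled >= 0.0 ? scaled + 0.5 : scaled - 0.5);
    }

    int main(void) {
        double processed = 0.123456789;    /* e.g. a 64-bit processing result */
        printf("24-bit sample: %d\n", (int)dither_to_24bit(processed));
        return 0;
    }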