Sample frequency


Overview

In order to digitize analog audio, most contemporary systems use a technique referred to as "sampling," in which the voltage of the analog audio waveform is repeatedly measured at a regular time interval. Each voltage measurement results in a binary number of a given wordlength. The series of binary "words" is typically stored consecutively in a file for later reconstruction of the analog voltage waveform by a digital to analog converter. The sample frequency is the rate at which the samples are generated and is measured in Hertz (cycles per second). The term sample rate is used interchangeably with sample frequency.
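
As a rough illustration of this process, the short Python sketch below samples a hypothetical 1 kHz sine wave at 48 kHz and rounds each measurement to a 16-bit word. The signal, sample frequency, and wordlength are arbitrary values chosen for the example and are not specified by this article.

 import math
 
 def sample_and_quantize(signal, sample_frequency, duration_s, wordlength_bits):
     # Sample an analog waveform (modeled here as a Python function of time in
     # seconds) at a regular interval and round each measurement to a signed
     # integer "word" of the given wordlength.
     num_samples = int(duration_s * sample_frequency)
     full_scale = 2 ** (wordlength_bits - 1) - 1  # e.g. 32767 for 16-bit words
     words = []
     for n in range(num_samples):
         t = n / sample_frequency   # the sample instants follow from the rate alone
         voltage = signal(t)        # "measure" the waveform, assumed to lie in -1.0..+1.0
         words.append(round(voltage * full_scale))
     return words
 
 # Example: one millisecond of a 1 kHz sine wave sampled at 48 kHz into 16-bit words.
 sine_1khz = lambda t: math.sin(2 * math.pi * 1000 * t)
 samples = sample_and_quantize(sine_1khz, 48000, 0.001, 16)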

Basics

Virtually all contemporary analog audio equipment operates on the principle that an analog voltage waveform is analogous to the original sound's air pressure "waveform." Typically, the original sound is translated from pressure variations to electrical variations by a microphone (a type of transducer). The resulting voltage waveform can be transmitted on wires to an amplifier and a power amplifier, and then translated back into sound pressure variations by a speaker.

One important consideration is how this analog waveform can be stored for later reproduction or transmission. All analog storage and transmission schemes are prone to loss of signal quality, with storage being particularly problematic. As the technology became available, digital audio systems were developed to address these issues. In order to generate digital information that can be used for these purposes, the analog voltage waveform is sampled repeatedly at a fixed time interval; the rate at which these samples are taken is referred to as the sample frequency. Please refer to analog to digital conversion for more details.
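
For a concrete sense of the interval involved, the sample period is simply the reciprocal of the sample frequency. The sketch below prints the period for a few common audio sample frequencies; the specific rates are standard values, not figures taken from this article.

 # The time between samples is the reciprocal of the sample frequency.
 for sample_frequency in (44100, 48000, 96000):
     period_us = 1e6 / sample_frequency   # seconds converted to microseconds
     print(f"{sample_frequency} Hz -> one sample every {period_us:.2f} microseconds")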

By sampling the analog audio signal at a fixed time interval, the need to record when each sample was taken along with the digitized voltage information is eliminated. As long as the sample frequency (SF) of a recording is known and the playback system operates at virtually the same SF, it can be assumed that the timing will be accurate when the analog waveform is reconstructed during digital to analog conversion. This also makes it critical that the "clock" signal used as the timing reference during both analog to digital conversion and digital to analog conversion be extremely accurate, in order to keep distortion to an acceptable level.
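
A minimal sketch of why this matters, under the assumption that each reconstructed sample is simply placed at its index divided by the sample frequency: if the playback clock runs even slightly faster or slower than the recording clock, every reconstructed instant shifts, changing the duration (and pitch) of the audio. The 0.1% error below is a hypothetical figure for illustration only, and it models a constant rate error rather than the short-term timing variation (jitter) that produces distortion.

 record_sf = 48000.0              # sample frequency assumed at recording time
 playback_sf = 48000.0 * 1.001    # a playback clock running 0.1% fast (hypothetical)
 
 sample_index = 480000            # ten seconds' worth of samples at the recording rate
 intended_time = sample_index / record_sf    # 10.000 s, the timing implied by the recording SF
 actual_time = sample_index / playback_sf    # about 9.990 s on the fast playback clock
 
 print(f"intended {intended_time:.3f} s, reconstructed at {actual_time:.3f} s")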