Digitizing
Digitizing, or digitization, is the process of turning an analog signal into a digital representation of that signal. An analog signal is continuously variable in two respects: it can take any of an infinite range of values at a given instant, and it has a value at every instant within a given period of time. A digital signal is discrete in both respects, so a digitization can only ever be an approximation of the signal it represents. The digital representation does not necessarily lose information, however: the analog signal usually contains noise as well as information, and if the error introduced by the approximation is smaller than that noise, no usable information is lost.
A digital signal consists of a sequence of values, each value often represented by a sequence of bits, and each bit often represented by a voltage level. Digitization is performed by reading an analog signal A at regular time intervals (the sampling frequency) and representing the value of A at each of those instants by a digital value. Each such reading is called a sample.
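As a rough sketch only (the signal, the sampling rate, and the bit depth here are illustrative assumptions, not taken from the text), the following Python fragment samples a continuous function A(t) at regular time intervals and quantizes each reading into an 8-bit value:

    import math

    def analog_signal(t):
        # A hypothetical analog signal: a 5 Hz sine wave ranging over [-1.0, 1.0].
        return math.sin(2 * math.pi * 5 * t)

    def digitize(signal, duration, sample_rate, bits):
        # Read the signal every 1/sample_rate seconds and quantize each
        # reading to an unsigned integer of `bits` bits.
        levels = 2 ** bits                    # number of representable values
        samples = []
        for n in range(int(duration * sample_rate)):
            t = n / sample_rate               # regular time intervals
            value = signal(t)                 # read the value of A at this instant
            # Map the range [-1.0, 1.0] onto the discrete codes 0 .. levels - 1.
            samples.append(round((value + 1.0) / 2.0 * (levels - 1)))
        return samples

    # One second of the signal, sampled 100 times per second, 8 bits per sample.
    digital = digitize(analog_signal, duration=1.0, sample_rate=100, bits=8)

Each entry of the resulting list is one sample, and the list as a whole is the digital representation D of the analog signal A.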
Two main factors determine how close an approximation to an analog signal A a digitization D can be: the sampling frequency, which is how often A is sampled, and the resolution, which is the number of bits used to represent each sample. Together they determine the bit rate of D, formally the number of bits in D per unit of time.
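For example, compact-disc audio samples each channel 44,100 times per second at 16 bits per sample, giving a bit rate of 44,100 × 16 = 705,600 bits per second per channel.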
The images we see on a TV screen, on the raster display of a computer, or in a newspaper are in fact "digitized images".
Digitizing is the primary way of storing images in a form suitable for transmission and computer processing.