All digital imaging systems are affected by image noise, which forms for a number of reasons. Some of these relate to the random nature of light and of electronic systems in general, and so can't be avoided. Others, such as those to do with temperature or physical issues with the sensor, can be measured and are thus easier to control.
Manufacturers typically use in-camera processing to minimise the effects of image noise, but the effectiveness of such processing largely depends on how the image noise formed in the first place.
As an example, the type of noise that forms during a long exposure is different from the type of noise that forms in images taken at high sensitivities. Long exposures require a sensor to be active for a longer period than usual; this creates heat, which in turn exacerbates noise.
Many cameras feature long exposure noise reduction, in which a second exposure of equal length is taken with the shutter closed (known as a dark frame) to record how much noise the sensor produces without exposure to any illumination, such as that created by defective pixels or those with a higher-than-usual leakage current. As this kind of image noise repeats itself from frame to frame, the camera can use this reference to effectively remove noise from the initial exposure, and even build a map of known defective pixels for future images.
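The dark-frame process described above can be sketched in a few lines. This is a hedged illustration only, not any manufacturer's actual pipeline: the function name, the 8-bit pixel values, and the hot-pixel threshold are all assumptions made for the example.

```python
import numpy as np

def subtract_dark_frame(exposure, dark_frame, hot_pixel_threshold=50):
    """Remove fixed-pattern noise and flag likely-defective pixels.

    `exposure` and `dark_frame` are same-shaped uint8 arrays; the dark
    frame was captured with the shutter closed for the same duration.
    """
    # Fixed-pattern noise repeats from frame to frame, so subtracting
    # the dark frame removes it from the real exposure per pixel.
    corrected = exposure.astype(np.int32) - dark_frame.astype(np.int32)
    corrected = np.clip(corrected, 0, 255).astype(np.uint8)
    # Pixels that read high even with no light are likely defective
    # (e.g. high leakage current); record them for future frames.
    hot_pixel_map = dark_frame > hot_pixel_threshold
    return corrected, hot_pixel_map

# Example: a 4x4 sensor with one leaky pixel at (1, 2).
exposure = np.full((4, 4), 120, dtype=np.uint8)
dark = np.full((4, 4), 8, dtype=np.uint8)
dark[1, 2] = 200       # defective pixel reads high even in darkness
exposure[1, 2] = 255
corrected, hot = subtract_dark_frame(exposure, dark)
```

Because the subtraction is per pixel rather than a single global value, it removes exactly the pattern the sensor itself produced during the dark frame.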
More random types of noise, which differ from image to image, are tackled differently. The shielded pixels that lie around the periphery of a sensor, which receive no light, are typically used to monitor noise levels within a frame. An average reading taken from these shielded pixels can then be subtracted from the pixels exposed to light, though this isn't as effective as the dark-frame method used for long exposures, as only a single average value is subtracted from the image as a whole.
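The shielded-pixel approach can be illustrated in the same way. This is a minimal sketch under stated assumptions: the two-pixel optical-black border, the function name, and the floating-point pixel values are all hypothetical, chosen only to show why a single average is cruder than per-pixel dark-frame subtraction.

```python
import numpy as np

def black_level_correct(raw, border=2):
    """Subtract the mean of the shielded border from the active area.

    `raw` is a 2-D array whose outer `border` rows and columns are
    assumed to be optically shielded (they receive no light).
    """
    # Build a mask selecting only the shielded border pixels.
    mask = np.zeros(raw.shape, dtype=bool)
    mask[:border, :] = mask[-border:, :] = True
    mask[:, :border] = mask[:, -border:] = True
    black_level = raw[mask].mean()
    # A single average is subtracted from the whole active area, which
    # is why this is less precise than per-pixel dark-frame subtraction.
    active = raw[border:-border, border:-border].astype(np.float64)
    return np.clip(active - black_level, 0, None)

# Example: an 8x8 sensor whose shielded border reads ~10 counts.
raw = np.full((8, 8), 100.0)
raw[:2, :] = raw[-2:, :] = 10.0
raw[:, :2] = raw[:, -2:] = 10.0
corrected = black_level_correct(raw)
```

In a real sensor the border readings would vary randomly around the black level, and the mean simply estimates that level for the current frame.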
- Tue, 12 Jun 2012