An image sensor or imaging sensor is a sensor that detects and conveys the information that constitutes an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, digital imaging tends to replace analog imaging.
Early analog sensors for visible light were video camera tubes. The types in current use are semiconductor charge-coupled devices (CCD) and active-pixel sensors fabricated in complementary metal-oxide-semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies. Analog sensors for invisible radiation tend to involve vacuum tubes of various kinds. Digital sensors include flat-panel detectors.
CCD vs CMOS technology
Most digital cameras use a CMOS sensor, because CMOS sensors perform better than CCDs, offering faster speeds with lower power consumption. Because CMOS sensors integrate their readout and processing circuitry on the same chip, they are also cheaper to manufacture. CCD sensors are still in use in some cheaper cameras, but can show comparatively weaker performance (for example, in burst mode). Both types of sensor accomplish the same task of capturing light and converting it into electrical signals.
Each cell of a CCD image sensor is an analog device. When light strikes the chip, it is held as a small electrical charge in each photo sensor. The charges in the line of pixels nearest to the (one or more) output amplifiers are amplified and output, then each line of pixels shifts its charges one line closer to the amplifier(s), filling the empty line closest to the amplifier(s). This process is repeated until all the lines of pixels have had their charge amplified and output.
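A toy simulation can make this shift-and-read sequence concrete. The array size, charge values, and gain below are made up for illustration; this is only a sketch of the bucket-brigade readout, not a model of a real device.

```python
import numpy as np

def ccd_readout(charge, gain=2.0):
    """Toy model of CCD readout: the row nearest the output amplifier is
    amplified and read out, then every remaining row shifts one line
    closer to the amplifier, until the whole frame has been transferred."""
    frame = charge.astype(float).copy()
    rows, _ = frame.shape
    output = np.zeros_like(frame)
    for i in range(rows):
        output[i] = frame[-1] * gain   # amplify and read the row nearest the amplifier
        frame[1:] = frame[:-1]         # shift all other rows one line closer
        frame[0] = 0.0                 # the farthest row is now empty
    return output

# Hypothetical 4x4 "exposure", values in collected electrons per pixel.
charge = np.random.default_rng(0).integers(0, 1000, size=(4, 4))
print(ccd_readout(charge))
```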
A CMOS image sensor has an amplifier for each pixel, compared to the few amplifiers of a CCD. This leaves less area for the capture of photons than a CCD, but the problem has been overcome by placing microlenses in front of each photodiode, which focus light into the photodiode that would otherwise have hit the amplifier and not been detected. Some CMOS imaging sensors also use back-side illumination to increase the number of photons that hit the photodiode. CMOS sensors can potentially be implemented with fewer components, use less power, and/or provide faster readout than CCD sensors. They are also less vulnerable to static electricity discharges.
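Because each CMOS pixel is amplified in place and can be addressed by row and column, any rectangular window of the array can be read directly, without shifting charge through neighbouring pixels. The sketch below illustrates that idea only; the per-pixel gains and the window coordinates are hypothetical.

```python
import numpy as np

def cmos_readout(charge, gains, row_slice=slice(None), col_slice=slice(None)):
    """Toy model of CMOS (active-pixel) readout: each pixel is amplified
    by its own gain, and any rectangular window can be addressed and
    read directly."""
    return charge[row_slice, col_slice] * gains[row_slice, col_slice]

rng = np.random.default_rng(1)
charge = rng.integers(0, 1000, size=(4, 4)).astype(float)
gains = rng.normal(2.0, 0.05, size=(4, 4))   # slight per-pixel gain variation
print(cmos_readout(charge, gains, slice(1, 3), slice(0, 2)))  # read only a 2x2 window
```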
Another design, a hybrid CCD/CMOS architecture (sold under the name "sCMOS"), consists of CMOS readout integrated circuits (ROICs) that are bump bonded to a CCD imaging substrate, a technology that was developed for infrared staring arrays and has been adapted to silicon-based detector technology. Another approach is to utilize the very fine dimensions available in modern CMOS technology to implement a CCD-like structure entirely in CMOS technology: such structures can be achieved by separating individual poly-silicon gates by a very small gap. Though still a product of research, hybrid sensors can potentially harness the benefits of both CCD and CMOS imagers.
Performance
There are many parameters that can be used to evaluate the performance of an image sensor, including dynamic range, signal-to-noise ratio, and low-light sensitivity. For sensors of comparable types, the signal-to-noise ratio and dynamic range improve as the size increases.
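As a rough illustration, in the photon-shot-noise-limited case the signal-to-noise ratio of a pixel grows with the square root of the collected signal, and dynamic range is commonly quoted as the ratio of full-well capacity to read noise. The pixel parameters below are hypothetical; the sketch only shows how the two figures are computed and why a larger pixel, which can collect more charge, tends to score better on both.

```python
import math

# Hypothetical pixel parameters (electrons); real values vary widely by sensor.
full_well = 30000      # full-well capacity
read_noise = 3.0       # read noise (RMS electrons)
signal = 10000         # electrons collected in one exposure

shot_noise = math.sqrt(signal)                                    # photon shot noise
snr_db = 20 * math.log10(signal / math.hypot(shot_noise, read_noise))
dynamic_range_db = 20 * math.log10(full_well / read_noise)

print(f"SNR ~ {snr_db:.1f} dB, dynamic range ~ {dynamic_range_db:.1f} dB")
```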
Color separation
There are several main types of color image sensors, differing by the type of color-separation mechanism:
- Bayer filter sensor, low-cost and most common, using a color filter array that passes red, green, or blue light to selected pixel sensors, forming interlaced grids sensitive to red, green, and blue; the missing color samples are interpolated using a demosaicing algorithm (a minimal sketch of this step follows the list). To avoid interpolated color information, techniques like color co-site sampling use a piezo mechanism to shift the color sensor in pixel steps. Bayer filter sensors also include back-illuminated sensors, where the light enters the sensitive silicon from the side opposite the transistors and metal wiring, so that the metal connections on the device side do not obstruct the light and the efficiency is higher.
- Foveon X3 sensor, using an array of layered pixel sensors, separating light via the inherent wavelength-dependent absorption property of silicon, such that every location senses all three color channels.
- 3CCD, using three discrete image sensors, with the color separation done by a dichroic prism.
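As a concrete illustration of the demosaicing step mentioned for the Bayer filter sensor above, the sketch below fills in each pixel's two missing color channels by bilinear averaging of the neighbouring samples of that color. The RGGB layout and the tiny test frame are assumptions, and production demosaicing algorithms are considerably more sophisticated.

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear(raw):
    """Bilinear demosaicing of a raw frame with an assumed RGGB Bayer layout:
    for each color channel, missing pixels are filled with the average of the
    known samples of that color in the surrounding 3x3 neighbourhood."""
    h, w = raw.shape
    red = np.zeros((h, w), bool);  red[0::2, 0::2] = True
    blue = np.zeros((h, w), bool); blue[1::2, 1::2] = True
    green = ~(red | blue)
    kernel = np.ones((3, 3))
    out = np.zeros((h, w, 3))
    for channel, mask in enumerate((red, green, blue)):
        known = np.where(mask, raw, 0.0)
        sums = convolve2d(known, kernel, mode="same", boundary="symm")
        counts = convolve2d(mask.astype(float), kernel, mode="same", boundary="symm")
        out[..., channel] = np.where(mask, raw, sums / np.maximum(counts, 1.0))
    return out

# Hypothetical 4x4 raw exposure, values in arbitrary units.
raw = np.arange(16, dtype=float).reshape(4, 4)
print(demosaic_bilinear(raw).shape)   # (4, 4, 3)
```

In this scheme a green value at a red or blue site is the average of its four green neighbours, while a red or blue value at another site is averaged from its two or four nearest samples of that color, which is exactly what the neighbourhood sums and counts compute.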
Specialty sensors
Special sensors are used in various applications such as thermography, creation of multi-spectral images, video laryngoscopes, gamma cameras, sensor arrays for x-rays, and other highly sensitive arrays for astronomy.
While digital cameras generally use a flat sensor, Sony prototyped a curved sensor in 2014 to reduce or eliminate the Petzval field curvature that occurs with a flat sensor. A curved sensor allows for a shorter lens with a smaller diameter and fewer elements and components, a larger aperture, and reduced light fall-off at the edge of the photo.
See also
- Contact image sensor (CIS)
- Video camera tube
- Semiconductor detector
- Full-frame digital SLR
- Image sensor format, the sizes and shapes of common image sensors
- Color filter array, mosaic of tiny color filters over color image sensors
- Sensitometry, the scientific study of light-sensitive materials
- History of television, the development of electronic imaging technology since the 1880s
- List of large sensor interchangeable-lens video cameras
- Oversampled binary image sensor
- Computer vision
External links
- Digital Camera Sensor Performance Summary by Roger Clark