
What is image quality? What are IQ parameters, and how is it validated?

14th April 2025
Sheryl Miles

Image quality (IQ) determines how accurately a camera captures the real world. In embedded vision applications, from industrial automation to medical diagnostics, the clarity and accuracy of an image directly impact the performance of the full system.

As a result, image quality becomes a key consideration when selecting a camera for any application. Different components of the camera, such as the lens, sensor, and firmware, contribute to image quality. Many parameters influence image quality, making it a complex and challenging task to measure accurately. Properly assessing a camera’s image quality requires a deep understanding of these contributing factors.

Understanding these parameters is crucial for producing high-quality images that truly represent reality. In this blog, you’ll learn about the key parameters that define image quality, including signal-to-noise ratio, dynamic range, and quantum efficiency. You’ll also learn how these parameters are objectively evaluated for superior imaging performance across applications.

Understanding image quality

Image quality is an assessment of an image’s fidelity to the original scene. Multiple factors, including colour reproduction, distortion, sharpness, and noise levels, influence it. Different lighting conditions affect how images appear, making validation under various conditions a critical need.

Let us see how various parameters contribute to the overall quality of an image.

What are the key image quality parameters?

Colour accuracy

Colour accuracy describes how the camera reproduces colours. When light falls on an object, for instance, a red apple, it reflects only the red wavelengths and absorbs the rest of the light spectrum. The reflected red light is then processed by the human brain, allowing us to recognise the object as an apple. Similarly, all colours are perceived through the reflection of specific light wavelengths. White reflects all visible light, while black absorbs it entirely, reflecting none; hence, black is often considered the absence of colour.

Colour accuracy is validated using a colour checker chart.

Figure 1: colour checker chart

The colour checker chart is a standard test chart consisting of 24 colour patches, including six neutral colours (greyscale), as well as primary and secondary colours for a comprehensive evaluation.

The grey colour patches in the last row are particularly useful for gamma and white balance evaluations.

Colour accuracy can also be validated using the ColorChecker Digital SG Chart, which provides 140 colours, including:

  • 24 patches from the original ColorChecker
  • 17-step greyscale
  • 14 unique skin tone colours

Figure 2: ColourChecker Digital SG Chart
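Colour accuracy from either chart is typically scored as a colour difference (ΔE) between the reference Lab value of each patch and the value the camera actually recorded. A minimal sketch of the CIE76 formulation is below; the patch values are hypothetical, not taken from any real chart.

```python
import math

def delta_e_76(lab_ref, lab_meas):
    """CIE76 colour difference: Euclidean distance between two
    (L*, a*, b*) triples in CIELAB space."""
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(lab_ref, lab_meas)))

# Hypothetical reference vs. measured Lab values for three chart patches
reference = [(37.5, 13.6, 14.1), (65.2, 18.1, 17.8), (49.3, -4.9, -21.9)]
measured  = [(38.1, 14.0, 13.5), (64.0, 19.2, 18.3), (50.1, -4.1, -22.6)]

per_patch = [delta_e_76(r, m) for r, m in zip(reference, measured)]
mean_de = sum(per_patch) / len(per_patch)
print(f"mean ΔE76: {mean_de:.2f}")  # differences below ~2 are barely perceptible
```

Evaluation tools report the per-patch and mean ΔE over all 24 (or 140) patches; later formulations such as ΔE2000 weight lightness and chroma differently but follow the same patch-by-patch comparison.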

White balance

White balance (WB) determines how accurately white is retained across different lighting conditions. Depending on the time of day, a light source such as the sun casts an orange tint in the morning and evening and a bluish tint at midday. Effective white balance eliminates these tints, ensuring that white remains consistently white.

Auto white balance (AWB) functionality enables cameras to automatically adjust to different lighting conditions through tuning.


Figure 3: comparison of image output – before (right) and after (left) auto white balance tuning
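Production AWB pipelines are tuned per sensor, but the underlying idea can be illustrated with the classic grey-world assumption: on average, a scene is neutral, so each channel is scaled until its mean matches the others. This is a minimal sketch, not any vendor's actual algorithm.

```python
def gray_world_awb(pixels):
    """Grey-world AWB over a flat list of (R, G, B) pixels: scale each
    channel so its mean matches the global mean, removing a uniform cast."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    global_mean = sum(means) / 3
    gains = [global_mean / m for m in means]
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]

# Simulated warm cast: a neutral grey patch shifted toward orange
cast = [(180, 128, 90)] * 16
balanced = gray_world_awb(cast)
print(balanced[0])  # → (133, 133, 133): all channels pulled to a common grey
```

Grey-world fails on scenes dominated by one colour, which is why tuned cameras combine it with illuminant estimation and per-light-source calibration data.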

Lens distortion

Distortion is the bending of straight lines in an image, appearing as follows:

  • Barrel distortion (negative value) – lines bend outward
  • Pincushion distortion (positive value) – lines bend inward
  • Mustache distortion (waveform distortion) – a combination of barrel and pincushion distortion, with lines bending outward in one region of the image and inward in another
  • Keystone distortion – this distortion occurs when the camera’s sensor plane is not parallel to the plane of the object being captured, causing a trapezoidal effect in the image

Figure 4: lens distortion

This parameter is evaluated using dot pattern charts and can be corrected through calibration.

Figure 5: distortion OFF (left) and distortion ON (right)
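The sign convention in the bullets above comes from the standard radial (Brown–Conrady) distortion model, where a point's distance from the optical centre is rescaled by a polynomial in its radius. A first-order sketch, using only the k1 coefficient:

```python
def apply_radial_distortion(x, y, k1):
    """Brown–Conrady radial model, first term only: a point at normalised
    image coordinates (x, y) maps to (x, y) * (1 + k1 * r^2).
    k1 < 0 produces barrel distortion; k1 > 0 produces pincushion."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2
    return x * scale, y * scale

# A point near the edge of the frame (hypothetical coordinates)
x, y = 0.8, 0.5
print(apply_radial_distortion(x, y, -0.1))  # barrel: point moves toward the centre
print(apply_radial_distortion(x, y, +0.1))  # pincushion: point moves away
```

Calibration fits k1 (and higher-order terms) from a dot pattern or checkerboard chart; correction then applies the inverse mapping to straighten the lines.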

Chromatic aberration

Chromatic aberration happens due to different colours of light bending at slightly different angles when passing through a lens. This causes colours to focus at different points on the sensor, resulting in coloured fringes.

The two types of chromatic aberration are:

  • Lateral chromatic aberration: different wavelengths fall on different points on the image plane
  • Longitudinal chromatic aberration: different wavelengths fall on different image planes

Figure 6: chromatic aberration

Lateral aberration is more easily visible in images, while longitudinal aberration requires analysing image sequences captured at varying distances. This parameter is evaluated using dot pattern charts.
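On a dot pattern chart, lateral chromatic aberration can be quantified as the displacement between the centroids of the same dot in different colour channels. A minimal sketch, using a synthetic two-row dot rather than real chart data:

```python
def centroid(channel):
    """Intensity-weighted centroid of a 2-D channel (list of rows)."""
    total = sx = sy = 0.0
    for y, row in enumerate(channel):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return sx / total, sy / total

def lateral_ca(red, blue):
    """Lateral chromatic aberration of one dot: the red/blue centroid
    offset in pixels. Larger values mean more visible colour fringing."""
    rx, ry = centroid(red)
    bx, by = centroid(blue)
    return ((rx - bx) ** 2 + (ry - by) ** 2) ** 0.5

# Synthetic dot whose blue image is shifted one pixel right of the red
red  = [[0, 9, 0],
        [0, 9, 0]]
blue = [[0, 0, 9],
        [0, 0, 9]]
print(lateral_ca(red, blue))  # → 1.0 pixel of fringing
```

Repeating this for dots across the frame shows how the fringing grows toward the corners, where lateral aberration is worst.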

Lens shading/vignetting

Lens shading refers to the decrease in the image brightness from the centre to the edge of the image. The brightness variation affects the overall image quality. This can be corrected through lens calibration and tuning.

Figure 7: sample image outputs with lens shading (left) and corrected image (right)
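Shading correction is usually implemented as a gain map that rises with distance from the image centre, cancelling the radial falloff measured during calibration. A simplified sketch on a small greyscale image (the falloff coefficient is hypothetical, chosen for round numbers):

```python
def shading_gain(x, y, cx, cy, falloff):
    """Radial compensation gain: brightness falloff is modelled as
    1 / (1 + falloff * r^2), so the correction gain is its inverse."""
    r2 = (x - cx) ** 2 + (y - cy) ** 2
    return 1 + falloff * r2

def correct_shading(img, falloff):
    """Apply the radial gain map to a greyscale image (list of rows)."""
    h, w = len(img), len(img[0])
    cx, cy = (w - 1) / 2, (h - 1) / 2
    return [[min(255, round(v * shading_gain(x, y, cx, cy, falloff)))
             for x, v in enumerate(row)] for y, row in enumerate(img)]

# A flat grey scene recorded with vignetting: edges and corners darkened
vignetted = [[64, 80, 64],
             [80, 100, 80],
             [64, 80, 64]]
print(correct_shading(vignetted, 0.25))  # corners lifted from 64 to 96
```

Real pipelines store the calibrated gain map per colour channel, since shading often differs between channels and also causes colour nonuniformity toward the edges.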

Dynamic range (DR)

The dynamic range represents the camera’s ability to capture both bright highlights and dark shadows in the same scene. It’s measured in decibels (dB).

DR is evaluated using specialized charts:

  • ITU-HDR transmissive test chart (with 36 density steps from 0.10 to 8.22)
  • Contrast resolution chart (with 20 density steps from 0.15 to 4.9)
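The dB figure itself is the log ratio of the largest signal the sensor can record to the noise floor that swamps the darkest detail. A worked example with hypothetical sensor numbers:

```python
import math

def dynamic_range_db(max_signal, noise_floor):
    """Dynamic range in dB: 20 * log10 of the ratio between the largest
    usable signal and the darkest distinguishable one (the noise floor)."""
    return 20 * math.log10(max_signal / noise_floor)

# Hypothetical sensor: 10000 e- full-well capacity, 2 e- read noise
print(f"{dynamic_range_db(10000, 2):.1f} dB")  # → 74.0 dB
```

Each additional stop of range adds about 6 dB, so a 74 dB sensor spans roughly 12 stops between its noise floor and saturation.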

Low light performance

This parameter assesses how well a camera performs in limited lighting conditions. It’s evaluated using the eSFR ISO Chart, which contains:

  • Wedges and slanted edges for MTF50 calculation
  • 20-patch OECF for measuring noise, SNR, and DR
  • Colour patches for checking colour reproduction

Signal-to-noise ratio (SNR)

SNR refers to the ratio between the maximum signal and overall noise. Higher SNR values indicate better image quality with less visible noise. It is evaluated using the eSFR ISO Chart.
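In practice, SNR is measured from the uniform grey patches of a chart such as the eSFR ISO: the patch mean is the signal and its standard deviation is the noise. A minimal sketch with hypothetical pixel values:

```python
import math

def snr_db(patch):
    """SNR of a uniform grey patch in dB: mean pixel value over the
    standard deviation of the pixel values."""
    n = len(patch)
    mean = sum(patch) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in patch) / n)
    return 20 * math.log10(mean / std)

# Hypothetical pixel values sampled from one OECF grey patch
patch = [118, 120, 122, 120, 119, 121, 120, 120]
print(f"{snr_db(patch):.1f} dB")  # → 40.6 dB
```

Evaluations repeat this across patches of different brightness and light levels, since SNR drops sharply as the signal approaches the noise floor.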

Sharpness

Sharpness determines how clearly edges are defined in an image. It measures how many pixels are required to transition from a dark area to a bright area. Metrics such as MTF (modulation transfer function) at different contrast levels (MTF10, MTF20, MTF50) are used to validate sharpness.
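The pixel-transition idea can be made concrete with a 10–90% rise-distance measurement across a dark-to-bright edge; full MTF analysis (as in slanted-edge tools) builds on the same edge profile. The pixel rows below are hypothetical:

```python
def edge_transition_width(profile, lo=0.1, hi=0.9):
    """10-90% rise distance of an edge profile (one row of pixels crossing
    a dark-to-bright edge): fewer pixels means a sharper edge. Linear
    interpolation between samples gives sub-pixel precision."""
    vmin, vmax = min(profile), max(profile)
    lo_v = vmin + lo * (vmax - vmin)
    hi_v = vmin + hi * (vmax - vmin)

    def crossing(level):
        for i in range(len(profile) - 1):
            a, b = profile[i], profile[i + 1]
            if a <= level <= b:
                return i + (level - a) / (b - a)
        raise ValueError("level not crossed")

    return crossing(hi_v) - crossing(lo_v)

# A soft edge vs. a sharp one (hypothetical pixel rows across an edge)
soft  = [10, 20, 60, 130, 190, 230, 240]
sharp = [10, 10, 10, 240, 240, 240, 240]
print(edge_transition_width(soft))   # wide transition → less sharp
print(edge_transition_width(sharp))  # narrow transition → sharper
```

MTF50 extends this by Fourier-transforming the edge's derivative to find the spatial frequency at which contrast falls to 50%, which is why the eSFR ISO chart's slanted edges feed both measurements.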

Flare

Flare occurs due to light scattering inside the camera, reducing overall contrast and affecting dynamic range. It’s evaluated using ISO 18844 and P2020 flare charts.

Figure 8: lens flare reducing contrast and causing bright spots (highlighted in red)

© Copyright 2025 Electronic Specifier