The histogram of an image is a histogram of its pixel intensity values. It sounds complicated, but in reality the concept is very simple.
For simplicity, let’s assume our image is a grayscale image, so that each pixel has some brightness (or intensity) represented by an integer from 0 to 2^n − 1, where n is the bit depth (so 0 to 255 for a typical 8-bit image).
The histogram of that image is just a bar graph showing how many pixels of each intensity there are in that particular image.
A histogram might be normalised, meaning that for each pixel intensity i, the number of pixels with that intensity is divided by the total number of pixels in the image. This way, the height of each bar in the chart represents the probability of a pixel having that brightness.
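The two steps above can be sketched in a few lines of NumPy. This is a minimal illustration on a randomly generated “image”, not a full implementation:

```python
import numpy as np

# A tiny synthetic 8-bit grayscale "image" (intensities 0..255).
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Raw histogram: count of pixels at each of the 256 intensity levels.
hist = np.bincount(image.ravel(), minlength=256)

# Normalised histogram: divide each count by the total number of pixels,
# so each bar becomes the probability of a pixel having that brightness.
normalised = hist / image.size

print(hist.sum())        # total count equals the number of pixels (64 * 64)
print(normalised.sum())  # probabilities sum to (approximately) 1.0
```

Note that the raw counts always sum to the number of pixels, while the normalised bars sum to 1, which is exactly what makes them a probability distribution.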
But what about colours? One of the most common solutions is to use the RGB model: represent the red, green and blue components of the image separately. These three basic colours are added together to reproduce a broad array of colours. Now we can either create a separate histogram for each channel (red, green, blue), or convert the image to grayscale and compute the histogram as before.
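Both options look like this in NumPy. The grayscale conversion below uses the common Rec. 601 luminance weights (0.299 R + 0.587 G + 0.114 B); other weightings exist, this is just one conventional choice:

```python
import numpy as np

# A tiny synthetic RGB image: height x width x 3 channels, 8-bit.
rng = np.random.default_rng(1)
rgb = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)

# Option 1: one histogram per channel (red, green, blue).
channel_hists = [
    np.bincount(rgb[..., c].ravel(), minlength=256) for c in range(3)
]

# Option 2: convert to grayscale first (Rec. 601 luminance weights),
# then compute a single histogram as in the grayscale case.
gray = (0.299 * rgb[..., 0]
        + 0.587 * rgb[..., 1]
        + 0.114 * rgb[..., 2]).astype(np.uint8)
gray_hist = np.bincount(gray.ravel(), minlength=256)
```

Per-channel histograms preserve colour information; the grayscale histogram is simpler but only describes overall brightness.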
But why should we care about histograms?
Histograms are used in many image enhancement algorithms. For example, histogram equalisation automatically improves an image’s contrast by “stretching” the histogram.
A histogram can also be used to choose an appropriate threshold if we want to convert our image to a binary image (consisting of black and white pixels only).
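A quick sketch of that thresholding step. The fixed threshold of 128 here is an arbitrary example value; in practice, methods such as Otsu’s algorithm pick the threshold automatically from the histogram:

```python
import numpy as np

rng = np.random.default_rng(2)
image = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)

# Hypothetical fixed threshold; histogram-based methods (e.g. Otsu's)
# would choose this value automatically from the intensity distribution.
threshold = 128

# Every pixel becomes either True (white) or False (black).
binary = image >= threshold
```

The result is a binary image: each pixel is one of exactly two values, determined by which side of the threshold its intensity falls on.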
To summarise: an image histogram is a graphical representation of the pixel brightness distribution. It’s very important in image processing, as many image enhancement algorithms, such as histogram equalisation, rely on it.
Next time: Histogram equalisation