1080p, 720p, Megapixel, 2 Megapixels, 4, 8, 24 Megapixels… 4K, 8K, or even Gigapixels, yes, that’s a thing! But what does all of this mean in terms of a practical application? Is more always better? And how do you decide what is the best choice for your application?
When talking about resolution within the industrial imaging community, there are multiple types of resolution to be aware of. The first, and probably the most common, is camera resolution. However, there is more to consider than just the megapixels of the camera. For instance, frequency resolution and lens resolution also play a role in the overall image quality of industrial imaging solutions.
Why talk about these other types of resolution when the question of resolution is almost always asked with respect to megapixels? Because industrial imaging, and all imaging for that matter, is a series of compromises based on the specific application and how the resulting images will be used. In this article, we will discuss these different types of resolution and the compromises associated with each.
Camera resolution is the measure most commonly referred to, expressed in pixels or megapixels. It describes the spatial resolution of the camera sensor and is only one of the factors that determine the overall quality of the camera image.
So, let’s start this conversation with examples of popular resolutions and the perception of each. Take, for example, standard-definition TV resolution in the US: 640 x 480, that is, 640 pixels in the horizontal (X) direction and 480 pixels in the vertical (Y) direction, for a total of 307,200 pixels, or 0.3 megapixels. Not even a megapixel, so let’s look at a resolution of approximately one megapixel: 1280 x 960, or 1.2 megapixels. That is a big jump, from 0.3 megapixels to 1.2 megapixels, four times the resolution. Or is it?
It is now time to look at the resolving power in a single direction. For this example, consider the horizontal direction: 640 pixels vs. 1280 pixels. In the horizontal direction, the resolution has only improved by two times, not four, and the same is true in the vertical direction. This is because the term megapixel resolution describes an area, or 2D space, with a single number. With this in mind, let’s take a look at HD 1080p and 4K.
The HD 1080p standard has a resolution of 1920 x 1080 pixels, or 2 megapixels, and the 4K standard has a resolution of 3840 x 2160 pixels, or 8.3 megapixels. Going from 2 megapixels to 8.3 megapixels looks like a four-times increase in resolution; however, the resolving power along any single direction is again only two times greater.
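To make the area-versus-axis distinction concrete, here is a minimal Python sketch that reproduces the arithmetic above for the resolutions we have been discussing. It is purely illustrative, and the helper names are our own.

```python
# Compare total pixel count (an area measure) with resolving power per axis (a linear measure).
RESOLUTIONS = {
    "SD (640 x 480)": (640, 480),
    "1.2 MP (1280 x 960)": (1280, 960),
    "HD 1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
}

def compare(name_a, name_b):
    """Report the area gain and the per-axis gain going from resolution A to B."""
    wa, ha = RESOLUTIONS[name_a]
    wb, hb = RESOLUTIONS[name_b]
    area_gain = (wb * hb) / (wa * ha)
    linear_gain = wb / wa  # the same ratio applies vertically when the aspect ratio matches
    print(f"{name_a} -> {name_b}: {area_gain:.0f}x the pixels, "
          f"but only {linear_gain:.0f}x the pixels along each axis")

compare("SD (640 x 480)", "1.2 MP (1280 x 960)")  # 4x the pixels, 2x per axis
compare("HD 1080p", "4K UHD")                     # 4x the pixels, 2x per axis
```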
Given this understanding of how higher-resolution sensors compare to their lower-resolution counterparts, it is time to start talking about the possible compromises that can be made and how to select the appropriate resolution for your project. First, what are the compromises? Data size, sampling frequency, sensitivity, shot noise, and of course, cost are the main factors to consider.
As a starting comparison, let’s look at one minute of uncompressed, 24-bit color video at 30 frames per second. A 1080p video would require 10.4GB of storage, whereas a 4K (3840 x 2160) video would require 41.7GB. Not only is the storage requirement greater for the 4K video, but the transmission bandwidth is greater as well. Yes, compression can be used, and there are many compression types and strategies that come into play, to the point that a whole blog post could be dedicated to compression.
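For readers who want to check these figures, here is a small Python sketch of the underlying arithmetic. It assumes 30 frames per second and reports sizes in binary gigabytes (GiB), which is how the 10.4GB and 41.7GB numbers above work out; adjust the assumptions to match your own frame rate or pixel format.

```python
def uncompressed_size_gib(width, height, fps=30, seconds=60, bytes_per_pixel=3):
    """Storage for uncompressed video, in binary gigabytes (GiB).

    bytes_per_pixel=3 corresponds to 24-bit color (8 bits each for R, G, B);
    fps=30 is an assumed frame rate.
    """
    total_bytes = width * height * bytes_per_pixel * fps * seconds
    return total_bytes / (1024 ** 3)

print(f"1080p, one minute: {uncompressed_size_gib(1920, 1080):.1f} GB")  # ~10.4
print(f"4K,    one minute: {uncompressed_size_gib(3840, 2160):.1f} GB")  # ~41.7
```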
However, with the same compression applied to both the 1080p video and the 4K video, the size ratio would remain the same: a 1:4 difference in data size. The compromise between data size and resolution comes down to whether the increased resolution justifies the increased bandwidth and storage requirements for any particular project.
Frequency resolution refers to the frame rate of the video. As more resolution is required, more data is generated, which must be read off the sensor before it is reset for the next frame. In this regard, the more resolution required, the less frequency resolution, or frame rate, is possible.
The reverse is also true: the lower the resolution required, the higher the potential frame rate, with actual limits depending on the specific hardware being used. When making a compromise regarding frequency, start with the end goals of the project; if increased frequency is more important, there may need to be a concession in resolution.
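As a rough illustration of this trade-off, the sketch below estimates the ceiling a fixed-bandwidth link places on frame rate at a few resolutions. The 10 Gbps figure and the 8-bit monochrome pixel format are assumptions chosen for illustration, and the calculation ignores protocol overhead, so treat the results as upper bounds only.

```python
def max_frame_rate(width, height, link_gbps, bits_per_pixel=8):
    """Rough upper bound on frame rate for a given link bandwidth.

    bits_per_pixel=8 assumes an 8-bit monochrome pixel format; protocol
    overhead is ignored, so real-world frame rates will be lower.
    """
    bits_per_frame = width * height * bits_per_pixel
    return (link_gbps * 1e9) / bits_per_frame

# Hypothetical 10 Gbps link: higher resolution directly lowers the frame-rate ceiling.
for name, (w, h) in [("1.2 MP", (1280, 960)),
                     ("1080p", (1920, 1080)),
                     ("4K", (3840, 2160))]:
    print(f"{name}: up to ~{max_frame_rate(w, h, 10):.0f} fps on a 10 Gbps link")
```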
What does sensitivity have to do with resolution? Sensitivity is directly related to the size of the pixels on the sensor: the larger the pixel, the more photons it can register. As more and more resolution is required, either the sensor must grow larger or the pixels must shrink to meet the higher resolution requirement.
Today’s imaging sensors are getting better and better in this area; however, when push comes to shove, sensitivity may win out over higher resolution. If the object of interest cannot be seen clearly, it does not matter how much resolution you throw at it; the image will still lack the quality needed.
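To make the pixel-size relationship concrete, the sketch below computes the approximate pixel pitch when different horizontal resolutions are packed onto the same sensor. The 7.2 mm sensor width is a hypothetical value chosen for illustration, not a figure from any particular sensor.

```python
def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
    """Approximate pixel pitch in micrometers for a given sensor width."""
    return sensor_width_mm * 1000 / horizontal_pixels

# Hypothetical sensor about 7.2 mm wide: packing in more pixels shrinks each one,
# leaving less light-collecting area per pixel.
for pixels in (1280, 1920, 3840):
    print(f"{pixels} px across: ~{pixel_pitch_um(7.2, pixels):.1f} um pixel pitch")
```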
Shot noise and cost, as well as frequency and lens resolution, are also important factors in determining how much resolution your project needs, but we will leave those for Part 2 of this article.