Top 4 Elements That Decide the Image Quality of a Video Camera

Video conferencing is no longer a buzzword: it has become a conventional way of communicating today and will remain so in the future, and it is an important token of informatization in our era.

Video collaboration makes communication easier, and as users we expect the video to be crystal clear, high definition, realistic, and full of detail. A video camera works on the same principle as a digital camera: the optical signal passes through the lens to the sensor, which transforms it into an electrical signal; the signal is then processed by the ISP, and after complex algorithms the final image is produced. Therefore, before deciding to purchase a video camera, we ought to know about at least four elements that affect image quality.

1, ISP is the “Brain”: 

The image signal processor (ISP) acts as the "brain" of the video camera. External light passes through the lens and reaches the sensor, which captures and records the light, converts it into a current signal, and transfers that signal to the ISP. The ISP chip then processes the input signal, performing linear correction, noise removal, dead pixel repair, color interpolation, white balance correction, exposure correction, and other processing. To a large extent, the ISP chip determines the final image quality, and a good ISP can typically improve image quality by 10% to 15%.
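
To make that processing chain a little more concrete, here is a minimal, purely illustrative sketch of the order in which those stages run. Every step is a drastically simplified stand-in (with made-up constants), not any vendor's actual ISP firmware.

```python
import numpy as np

def isp_pipeline(raw: np.ndarray) -> np.ndarray:
    """Toy ISP: each stage is a simplified stand-in for the corresponding
    step in a real ISP chip (input assumed to be 8-bit grayscale raw data)."""
    img = raw.astype(np.float32)

    # Linear correction: subtract an assumed black level and rescale to full range.
    img = np.clip((img - 16.0) * (255.0 / 239.0), 0.0, 255.0)

    # Noise removal: a plain 3x3 mean filter as a placeholder for real denoising.
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    img = sum(padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)) / 9.0

    # White balance / exposure correction: scale so the mean lands near mid-gray.
    img *= 128.0 / max(float(img.mean()), 1e-6)

    return np.clip(img, 0.0, 255.0).astype(np.uint8)

raw_frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # fake raw data
print(isp_pipeline(raw_frame).shape)  # (480, 640)
```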

2, The video camera's ISP algorithm capability is very important:

In addition to the ISP chip itself, the software algorithms play a crucial role in the imaging result. Once the ISP chip has processed the input current signal, it first generates an unprocessed original image; the software algorithms then "Photoshop" this original image internally, optimizing its color, tone, contrast, noise, etc., and finally produce the image we actually see. Of course, because everyone's Photoshop skill and style are different, the same photo will end up edited into a different style by each person. Similarly, every company's software algorithm capability is different, so the final imaging result and style differ as well. Only with excellent software algorithm capability can the image optimization be effective and adapt to different application environments.
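
As a rough analogy for that internal "Photoshop" step, the toy sketch below applies a few common adjustments (contrast, brightness, light smoothing) to an already-decoded image. The operations and constants are assumptions chosen only to illustrate the idea, not any company's real tuning algorithm.

```python
import numpy as np

def enhance(image: np.ndarray, contrast: float = 1.15,
            brightness: float = 8.0, denoise: bool = True) -> np.ndarray:
    """Toy tuning pass: adjust contrast and brightness, then lightly smooth.
    Real camera software tunes far more (tone curves, color matrices, etc.)."""
    img = image.astype(np.float32)
    img = (img - 128.0) * contrast + 128.0 + brightness  # stretch contrast around mid-gray
    if denoise:
        # Average each pixel with its horizontal neighbors: crude noise reduction.
        img[:, 1:-1] = (img[:, :-2] + img[:, 1:-1] + img[:, 2:]) / 3.0
    return np.clip(img, 0, 255).astype(np.uint8)

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
print(enhance(frame).dtype)  # uint8
```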

3, The image sensor is the most critical:

The image sensor is the key factor determining picture quality. It is a device that converts an optical image into an electronic signal, and it is widely used in digital cameras and other electronic optical equipment. Image sensors include the CCD, CMOS, and CIS families. CMOS sensors, with their lower power consumption, competitive price, and excellent image quality, are the type most commonly adopted in video cameras. The main parameters of a CMOS sensor are the following:

1.) Cell size
It refers to the actual physical size of each pixel on the chip array; common dimensions include 14 µm, 10 µm, 9 µm, 7 µm, 6.45 µm, 3.75 µm, and so on. To a certain extent, the pixel size reflects the pixel's ability to respond to light: the larger the pixel, the more photons it can receive, and the more charge it generates under the same lighting conditions and exposure time. For low-light imaging, pixel size is therefore an indication of the chip's sensitivity.
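
As a back-of-the-envelope illustration of why cell size matters: photon capture scales roughly with pixel area under the same illumination and exposure. The sketch below compares two of the pixel sizes listed above under that simplifying assumption.

```python
# Rough comparison: photons gathered scale with pixel area (side length squared),
# assuming identical illumination, exposure time, and quantum efficiency.
large_um, small_um = 6.45, 3.75          # two cell sizes mentioned above, in micrometers
ratio = (large_um / small_um) ** 2
print(f"A {large_um} um pixel collects ~{ratio:.1f}x the light of a {small_um} um pixel")
# -> roughly 3.0x, which is why larger pixels help in low light
```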

2.) Sensitivity

Sensitivity is one of the most important parameters of the chip, and it has two physical meanings. One is the photoelectric conversion capability of the device, i.e. its responsivity: how much output signal it produces for a given amount of incident light. The other is the minimum illumination (or radiant power) the device can still sense, which corresponds to its detectivity.
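
For CMOS sensors the first meaning is often quoted as output signal per unit of illumination and exposure time (for example, volts per lux-second). The snippet below is a hedged illustration of that calculation with invented numbers.

```python
# Illustration of sensitivity in the responsivity sense:
# output signal produced per unit of illumination and exposure time.
# All numbers below are invented, for demonstration only.
signal_volts = 0.9        # measured output signal (V)
illuminance_lux = 100.0   # illuminance at the sensor (lux)
exposure_s = 0.002        # exposure time (s)

responsivity = signal_volts / (illuminance_lux * exposure_s)
print(f"Responsivity ~ {responsivity:.1f} V / (lux*s)")  # higher means more sensitive
```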

3.) Bad points
Due to the limitations of the manufacturing process, it is almost impossible for every pixel to be good on a sensor with several million pixels. A bad point (dead pixel) is a pixel that cannot image effectively, or whose response deviates from the allowed range compared with its neighbors, and the number of dead pixels is an important measure of chip quality.
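
A typical way to hide a dead pixel is to replace it with a statistic of its neighbors; the minimal sketch below uses the median of the 8 surrounding pixels, assuming the bad-pixel coordinates are already known (for example, from factory calibration).

```python
import numpy as np

def repair_bad_pixels(img: np.ndarray, bad_coords) -> np.ndarray:
    """Replace each known dead pixel with the median of its 8 neighbors.
    Assumes bad_coords is a list of (row, col) pairs away from the border."""
    out = img.copy()
    for r, c in bad_coords:
        neighborhood = img[r - 1:r + 2, c - 1:c + 2].astype(np.float32)
        neighbors = np.delete(neighborhood.ravel(), 4)    # drop the bad pixel itself
        out[r, c] = int(np.median(neighbors))
    return out

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
frame[100, 200] = 0                                       # simulate a stuck-dark pixel
print(repair_bad_pixels(frame, [(100, 200)])[100, 200])   # now close to its neighbors
```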

4.) Spectral response

Spectral response refers to the chip's ability to respond to light of different wavelengths, and it is usually given as a spectral response curve.
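
In practice such a curve is often handled as a table of relative response versus wavelength, from which intermediate wavelengths are interpolated. The values below are invented purely to show the idea.

```python
import numpy as np

# Invented spectral-response curve: wavelength (nm) vs. relative quantum efficiency.
wavelengths_nm = np.array([400, 500, 550, 600, 700, 800, 900])
relative_qe    = np.array([0.35, 0.55, 0.60, 0.55, 0.40, 0.20, 0.05])

# Look up the chip's response at an arbitrary wavelength by linear interpolation.
print(np.interp(650, wavelengths_nm, relative_qe))  # ~0.475, between the 600 and 700 nm points
```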

Relatively speaking, at the same resolution, the larger the sensor size, the better the photographic performance: more of the image signal is captured, the signal-to-noise ratio is higher, and the imaging result is better.
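
To make the signal-to-noise claim a little more concrete: in the simplified shot-noise-limited case, noise grows only with the square root of the collected signal, so a sensor that gathers more light (larger area at the same resolution) comes out ahead. The sketch below assumes that simplified model and illustrative photon counts.

```python
import math

def shot_noise_snr_db(photons_collected: float) -> float:
    """Simplified shot-noise-limited SNR: signal = N photons, noise = sqrt(N)."""
    return 20 * math.log10(photons_collected / math.sqrt(photons_collected))

# A sensor with 4x the area (same resolution) collects roughly 4x the photons.
small, large = 1_000.0, 4_000.0   # photons per pixel, illustrative numbers only
print(f"small sensor: {shot_noise_snr_db(small):.1f} dB")   # ~30.0 dB
print(f"large sensor: {shot_noise_snr_db(large):.1f} dB")   # ~36.0 dB
```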

4, The lens: the more elements, the better

The lens is the device that projects the image onto the sensor, the equivalent of the camera's "eyes", and it is usually composed of several lens elements. The light signal passing through the lens is filtered layer by layer (infrared and so on), so the more lens elements there are, the more realistic the image. From the material point of view, lenses can be divided into plastic (Plastic) and glass (Glass): glass elements transmit light better, but they also cost more.

Besides filtering the light, there is another very important parameter: the aperture. The aperture is a diaphragm formed by several very thin metal blades inside the lens, and it controls the amount of light reaching the sensor. Aperture is usually expressed as a value such as F/2.2 or F/2.4; the smaller the number, the larger the aperture. The larger the aperture, the more light the lens delivers to the sensor and the brighter the image, so at night or in dim environments the advantage of a large aperture is even more obvious. The aperture also controls the depth of field. In daily life we often see photos with a strongly blurred background, which not only highlights the subject but also has a strong artistic feel; this is the so-called depth-of-field effect. The wider the aperture opens, the shallower the depth of field and the more obvious the background blur.
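
The f-numbers quoted above can be compared directly: the light reaching the sensor scales roughly with 1/N² for f-number N, so the gap between F/2.2 and F/2.4 is small but real. A minimal sketch of that comparison:

```python
def relative_light(f_number: float) -> float:
    """Light gathered scales roughly with the inverse square of the f-number."""
    return 1.0 / (f_number ** 2)

f22, f24 = 2.2, 2.4
gain = relative_light(f22) / relative_light(f24)
print(f"F/2.2 lets in about {gain:.2f}x the light of F/2.4")  # ~1.19x, about 19% more
```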