How do smartphone cameras work?

For many people, a phone's value depends on the capabilities of its cameras. While I don't fully agree, since smartphones can do far more than take photos, I can't deny how important the cameras are.

"Which processor does your phone have? How much space does it have? How much RAM?" That's how we used to compare phones, but today the more common questions are: "How good are your phone's cameras? How well does it take photos and videos at night?” When testing phones, everyone, including testers, including us at the editorial office, pay the most attention to the cameras and only then to the overall performance of the phone. Since there are quite a few unknowns around cameras and a lot of technical terminology (screen, sensors, lenses, focus...), it might be time to clear up some of the fog around how they work.

Everything revolves around light

It's surprising (or not) how many parallels we can draw between a camera and the human eye. In a dark room, both are blind: to record an image, they need light. But light is often scattered in all directions. Our eyes have lenses that direct light onto the retina, and the cameras on our phones likewise have lenses that gather light and direct it onto a sensor.

Light can also ruin a photograph, which is most obvious with analog cameras that use photographic film: (too) long exposure to light can destroy the image on the film. The invention of the shutter solved this conundrum. In analog cameras, the shutter is a physical mechanism that opens and closes rapidly to control how long light reaches the film. Our eyelids serve a similar function.

Phones have no physical shutter, although you do hear that distinctive "click" when you take a photo. It's just a sound effect: put the phone in silent mode and it disappears. Instead, phone cameras use an electronic shutter, which serves the same purpose but starts and stops the sensor's light capture electronically rather than with moving parts. Some smartphones, such as the Huawei Mate 50 Pro, do have a physical aperture that can be stepped between preset positions.

Film has not been forgotten, though. It is still used by hobbyists, professional photographers and even the film industry. Everywhere else, it has been replaced by sensors.

Why do mobile phones have different lenses?

You've probably watched a professional photographer swap lenses on their camera depending on the scene in front of them. Phones are technically capable of this, as Xiaomi demonstrated with the Xiaomi 12S Ultra Concept, but it is extremely impractical and raises new obstacles: durability, water resistance, high cost, and so on. Manufacturers' solution is to fit several different cameras, each with its own fixed lens, which you can easily switch between in the camera app as needed. Most phone cameras work this way today.

If you look at the back of your phone, you'll notice two, three, or even four cameras, plus one on the front above the display. Each offers a different perspective, depth and its own unique features. The main camera is, understandably, always present, and an ultra-wide camera is more or less a constant as well. In the lower class we often find a macro camera, while premium phones such as the Samsung Galaxy S23 Ultra add a telephoto lens and a periscope telephoto lens.

What is the function of the lens?

Aperture, lens and image sensor are closely related. The aperture is the opening you can physically see on the camera lens. As mentioned, the aperture controls how much light reaches the sensor. As a general rule, a larger aperture is better because the camera can work with more light information, but it is not necessarily the best indicator of photo quality.

If you look at your phone's specs, you'll notice an "f" rating for each camera. This number is the ratio of the focal length to the physical diameter of the aperture: the smaller it is, the wider the aperture. For example, the vivo X90 Pro has an f/1.8 main lens with a 23 mm focal length, an f/1.6 (50 mm) telephoto lens, and so on.
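To make the relationship concrete, here is a minimal Python sketch using the vivo numbers above. Keep in mind that spec-sheet focal lengths are full-frame equivalents, so the computed diameters are notional rather than the physical size of the opening inside the phone.

def aperture_diameter_mm(focal_length_mm, f_number):
    # f-number = focal length / aperture diameter,
    # so aperture diameter = focal length / f-number.
    return focal_length_mm / f_number

print(aperture_diameter_mm(23, 1.8))  # main lens: ~12.8 mm (equivalent)
print(aperture_diameter_mm(50, 1.6))  # telephoto lens: ~31.3 mm (equivalent)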

Focal length itself is a poor way to compare the performance of phone cameras. It matters a great deal, but mainly for creating different aesthetics and visual effects. A shorter focal length gives a wide-angle perspective in which nearby objects appear larger, while a longer focal length produces a more proportional, neutral-looking photo.

As light enters the camera module, the lens collects the incoming light from the scene and directs it onto the sensor. A smartphone camera lens is actually a stack of several plastic lenses, called elements. Due to the nature of light, different wavelengths (colors) are refracted (bent) at different angles when passing through a lens, so the colors from your scene would be projected onto the sensor out of alignment, an effect known as chromatic aberration. Multiple elements are needed to deliver a clear image to the sensor, free of misalignment and other distortions.

Photo: OnePlus

How does focus work on smartphone cameras?

Focusing, ironically, is not something the user usually needs to focus on, because the camera handles it itself. To a certain extent, focus can be adjusted manually (depending on the phone), but in most cases the software does the job so well that manual intervention is unnecessary. Phone cameras use a dedicated sensor and/or additional hardware such as a laser rangefinder to focus.

Software autofocus uses data from the image sensor to determine whether the image is in focus and adjusts the lens to compensate. The classic passive technique, contrast-detection autofocus, adjusts the focus until the image contrast reaches its maximum. This method is entirely software-based, making it the cheapest option, but it is slower and does not work as well in low light.
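The idea is simple hill climbing. Here is a minimal Python sketch of the principle; capture_at is a hypothetical callback that returns a grayscale frame for a given lens position, standing in for the real camera hardware:

import numpy as np

def contrast_score(image):
    # A common sharpness proxy: variance of a simple Laplacian (edge strength).
    lap = (np.roll(image, 1, 0) + np.roll(image, -1, 0) +
           np.roll(image, 1, 1) + np.roll(image, -1, 1) - 4 * image)
    return float(lap.var())

def contrast_autofocus(capture_at, lens_positions):
    # Step the lens through candidate positions and keep the sharpest result.
    best_pos, best_score = None, -1.0
    for pos in lens_positions:
        score = contrast_score(capture_at(pos))
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos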

Newer phones use phase-detection autofocus (PDAF), which is faster and more accurate. Open the specifications of the latest iPhone 15 Pro Max and you'll find the PDAF label next to its cameras. Classic PDAF systems rely on dedicated photosites on the image sensor that measure light arriving from the left or the right side of the lens. If the right-looking photosites register the same light intensity as the left-looking ones, the image is in focus. If the intensities differ, the system can calculate exactly how much to move the lens for a sharp image, which is much faster than contrast detection.

Older PDAF systems use only a few percent of the sensor's photosites for focusing, while newer ones, such as the system in the Galaxy S23 Ultra, use all 100 percent. In addition to left- and right-looking photosites, top- and bottom-looking ones are used for focusing as well.
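A rough sketch of the phase comparison, assuming left and right are one-dimensional intensity signals read from the paired photosites:

import numpy as np

def phase_difference(left, right, max_shift=16):
    # Find the shift that best aligns the signals from the left- and
    # right-looking photosites; zero means the subject is already in focus,
    # and the sign says which way the lens should move.
    best_shift, best_err = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        err = float(np.mean((left - np.roll(right, shift)) ** 2))
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift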

Pro-model iPhones have for several generations included a dedicated LiDAR sensor that improves focus, depth perception and night shots, and comes in handy for augmented reality (AR) applications.

Photo: Huawei

What is an image sensor?

The sensor is essentially just a piece of silicon, but a great deal depends on it. It receives light and converts it into electrical signals. A sensor contains millions of light-sensitive photosites, better known as pixels; a camera advertised as 100 or 200 MP has a sensor with 100 or 200 million of them. If no light from the lens reaches a photosite, the sensor registers that pixel as black; if it receives a lot of light, the sensor records it as white. The number of shades of gray the sensor can register in between is called bit depth.

Most phones have 8-bit depth, but some offer 10-bit. In practice, 8-bit depth means the camera can capture 256 shades for each of the primary color channels used to mix the color spectrum (red, green and blue). That's 256 shades each of red, green and blue, or 16.7 million possible colors in total. 10-bit cameras can capture more than a billion shades.
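The arithmetic behind those figures, as a quick Python check:

# Shades per channel double with every extra bit; total colors are the cube.
for bits in (8, 10):
    per_channel = 2 ** bits
    total = per_channel ** 3
    print(f"{bits}-bit: {per_channel} shades per channel, {total:,} colors")
# 8-bit:  256 shades per channel, 16,777,216 colors (~16.7 million)
# 10-bit: 1024 shades per channel, 1,073,741,824 colors (~1.07 billion)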

How does a camera capture a color photo? Each photosite sits under a color filter that lets only certain wavelengths through. With rare exceptions, such as Huawei phones that use an RYYB array (yellow filters instead of green), the most common arrangement is the Bayer array, which divides each 2×2 square of photosites into one red, one blue and two green filters (RGGB).
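A quick sketch of what that filter layout looks like tiled across a sensor:

import numpy as np

def bayer_mosaic(height, width):
    # Each 2x2 square of photosites holds one red, two green and one blue
    # filter (RGGB); the pattern simply repeats across the whole sensor.
    pattern = np.array([["R", "G"],
                        ["G", "B"]])
    return np.tile(pattern, (height // 2, width // 2))

print(bayer_mosaic(4, 4))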

Photo: Sony

Normally, combining neighboring pixels on a standard Bayer array would sum data from differently colored filters into a single value, destroying the color information. Manufacturers therefore needed a way to collect each color separately before binning.

For this purpose they designed the so-called quad-Bayer array, in which each 2×2 group of pixels sits under a single color filter. Four such groups are then arranged together much like the original Bayer pattern: two green, one blue, one red.
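The same sketch idea, adapted to the quad-Bayer layout described above:

import numpy as np

def quad_bayer_mosaic(height, width):
    # In a quad-Bayer array each color filter covers a whole 2x2 pixel group,
    # and the groups repeat in the familiar RGGB order: 2x green, 1x red, 1x blue.
    tile = np.array([["R", "R", "G", "G"],
                     ["R", "R", "G", "G"],
                     ["G", "G", "B", "B"],
                     ["G", "G", "B", "B"]])
    return np.tile(tile, (height // 4, width // 4))

print(quad_bayer_mosaic(8, 8))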

The new arrangement not only lets smartphone manufacturers preserve color data while binning pixels, it has also enabled other innovative features, such as HDR modes.

Back to the sensors. With sensors, you need to pay attention to both their size and the size of the individual pixels. Larger sensors can capture better photos because they carry more photosites, which are themselves larger. Recently, smartphone cameras have entered the 1-inch world: the Xiaomi 13 Pro and the vivo X90 Pro are among the first to incorporate 1-inch sensors.

Pixels are measured in micrometers (µm), and larger pixels can absorb more light, which helps night photography. Don't worry if your phone has smaller pixels than others; outside of night photography it can still deliver good results. Even Samsung's best phones have to contend with small pixels: the Galaxy S23 Ultra's 200 MP sensor results in 0.6 µm pixels, while the iPhone 15 Pro Max's 48 MP sensor has 1.22 µm pixels. Manufacturers have therefore started using pixel-binning technology; the Galaxy S23 Ultra combines 16 pixels into one to produce photos with a final resolution of 12 MP.
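The binning arithmetic from those examples, as a quick check (the slight gap between 12.5 and the marketed 12 MP presumably comes from cropping and rounding in the real pipeline):

sensor_mp, bin_group = 200, 16   # Galaxy S23 Ultra: 4x4 binning
pixel_um = 0.6                   # native pixel size

print(sensor_mp / bin_group)     # 12.5 -> marketed as 12 MP output
print(pixel_um * 4)              # 2.4 um effective binned pixel pitch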

Optical and electronic stabilization

Stabilization, whether optical or electronic, is also important for capturing good photos and videos.

Optical image stabilization (OIS) is a hardware solution that uses a microelectromechanical system (MEMS) gyroscope to detect motion and adjust the camera accordingly. For example, if the hand holding the phone drifts slightly to the left, the OIS system detects this and shifts the lens slightly to the right. This is especially important in night photography, when the camera needs a long time to gather light and vibrations have plenty of time to blur the photo.
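In spirit, the correction looks like this (a deliberately simplified sketch; real OIS runs a closed control loop at very high frequency):

def ois_correction(gyro_rate_dps, dt_s, gain=1.0):
    # Integrate the gyroscope's angular rate over one sample interval and
    # move the lens by the opposite amount so the projected image stays put:
    # a drift to the left produces a corrective shift to the right.
    return -gain * gyro_rate_dps * dt_s

print(ois_correction(-2.0, 0.01))  # leftward drift -> positive (rightward) correction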

Electronic image stabilization (EIS) relies on the phone's accelerometer to detect motion. Instead of moving parts of the camera, it shifts each captured frame digitally to compensate. Because the exposures are aligned to the image content rather than to the full sensor frame, the borders are cropped away and the final image or video has a reduced resolution.
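A minimal sketch of that crop-and-shift step, assuming dx and dy are the measured frame offsets in pixels:

import numpy as np

def eis_crop(frame, dx, dy, margin):
    # Cut a window out of an oversized frame, shifted opposite to the measured
    # motion. The margin consumed by the crop is exactly why EIS output has a
    # lower resolution than the sensor itself.
    h, w = frame.shape[:2]
    y0, x0 = margin - dy, margin - dx
    return frame[y0:y0 + h - 2 * margin, x0:x0 + w - 2 * margin]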

What does the software do?

After the image sensor converts light into electrical signals, it is the image signal processor's (ISP) job to turn those numbers into a picture. The raw data is essentially a black-and-white image, so the ISP must first assign color using its knowledge of the color filter array (Bayer or otherwise). That produces an image, but each pixel is still just an intensity of red, green or blue. Color reconstruction (demosaicing) follows, in which the ISP computes each pixel's full color from the values of its neighbors. For example, if an area contains many strong green and red pixels and very few blue ones, the algorithms render it as yellow.
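A toy version of that neighbor-based reconstruction, far simpler than a real ISP's demosaicing but illustrating the principle:

import numpy as np

def interpolate_channel(raw, mask):
    # Estimate one color channel at every pixel by averaging the neighboring
    # photosites that actually measured that color (mask marks those sites).
    # np.roll wraps around at the borders; a real ISP treats edges separately.
    shifts = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)]
    vals = np.where(mask, raw, 0.0)
    total = sum(np.roll(np.roll(vals, dy, 0), dx, 1) for dy, dx in shifts)
    count = sum(np.roll(np.roll(mask.astype(float), dy, 0), dx, 1) for dy, dx in shifts)
    return np.where(mask, raw, total / np.maximum(count, 1e-9))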

After color reconstruction, the ISP also runs denoising and sharpening algorithms, and each phone adds its own specific processing to produce the final photo.
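Sharpening is often some variant of the classic unsharp mask; a minimal sketch, assuming img holds values in the 0 to 1 range:

import numpy as np

def unsharp_mask(img, amount=0.5):
    # Blur the image, subtract the blur to isolate fine detail,
    # then add a fraction of that detail back to the original.
    blur = (img + np.roll(img, 1, 0) + np.roll(img, -1, 0) +
            np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)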

The next time you pick up your phone, open the camera and take a photo, you'll know what was going on behind the scenes. Would you also like to know how smartwatches and their sensors work?



