Here Are All the New Camera Features of the Google Pixel 4
Despite the numerous issues that have plagued the Pixel series of smartphones, they still reign supreme when it comes to camera prowess. That is due, in part, to Google's impeccable software, which enables the Google Camera app on Pixels to capture high-quality images. The Pixel 4 is the first in the series to come with two rear cameras. Its main camera has a 12-megapixel sensor with an f/1.7 aperture lens, while the telephoto camera has a 16-megapixel sensor with an f/2.4 aperture lens. It also has a host of new features, which we'll take a look at in this post.
Continuous zoom
Continuous zoom is Google's way of making you look at the two rear cameras as a single unit. Although the telephoto lens's focal length is only 1.85X longer than the main camera's, the Pixel 4 can digitally zoom up to 3X without any noticeable loss in quality. This is possible thanks to Google's Super Res Zoom technology, which uses the user's shaky hand to let the camera collect more detailed scene data, so the phone can magnify the photo better. Google claims that it can be used for 4X, 5X, and even 6X zoom without any measurable loss in quality.
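The core idea can be shown with a toy sketch (heavily simplified and hypothetical; Google's real pipeline is far more sophisticated): small hand-shake offsets make successive frames sample the scene at slightly different positions, so merging the frames recovers detail that no single frame contains.

```python
# Toy 1D sketch of the idea behind Super Res Zoom (illustrative only, not
# Google's actual algorithm): hand shake shifts each frame's sampling grid,
# and merging the shifted frames recovers the fine-grained scene.

scene = [10, 14, 22, 30, 26, 18, 12, 8]   # "true" fine-grained scene

def capture(offset):
    """Sample every other scene value, shifted by a hand-shake offset."""
    return scene[offset::2]

frame_a = capture(0)   # [10, 22, 26, 12] -- one low-resolution frame
frame_b = capture(1)   # [14, 30, 18, 8]  -- shake shifted the sampling grid

# Merge: interleave the two offset frames back onto the fine grid.
merged = [value for pair in zip(frame_a, frame_b) for value in pair]
print(merged == scene)   # → True: together, the shifted frames recover full detail
```

In reality the offsets are fractional and random, so the merge involves aligning and resampling frames rather than simple interleaving, but the principle is the same: more distinct sample positions mean more recoverable detail.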
HDR+
Strictly speaking, HDR+ isn't 'new' or exclusive to the Pixel 4. It has been around since the original Pixel dropped and has only gotten better since. It works by blending up to nine heavily underexposed shots, taken in rapid succession, into a single photo. Google claims that the camera begins capturing the images even before you hit the shutter button. To make things livelier, the Pixel 4 offers separate sliders for the bright and dark regions of an image. That lets you brighten a shadowed object in the foreground without worrying about washing out a bright object in the background.
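The burst-merging principle can be sketched with simple numbers (illustrative values only, not Google's actual HDR+ merge): averaging several noisy, underexposed frames of the same scene cancels much of the random sensor noise while keeping highlights from clipping.

```python
# Illustrative sketch of burst averaging (not Google's actual HDR+ pipeline).
true_signal = 40.0   # scene brightness at one pixel, kept low to protect highlights
noise = [6.0, -5.0, 3.0, -7.0, 4.0, -2.0, 5.0, -6.0, 2.0]   # per-frame sensor noise
frames = [true_signal + n for n in noise]   # nine underexposed captures

merged = sum(frames) / len(frames)   # averaging: random noise largely cancels
print(merged)                        # → 40.0, far closer than any single frame
```

After the merge, tone mapping brightens the shadows back up, which is exactly where the separate bright/dark sliders come in.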
AI color correction and Bokeh
Objects look different under different lighting conditions. For example, a person's skin tone will look different under candlelight compared to blue light. A camera needs to detect the ambient lighting conditions and process images accordingly. The Pixel 4 uses Google's AI software to adjust the white balance to match those conditions. It also uses the same software to process Bokeh images better. Google says the quality of its images is on par with an SLR lens, because the portrait mode calculations now happen on raw image data.
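For context, here is the classic non-AI baseline that learned approaches improve on: gray-world white balance, which simply scales each color channel so the scene averages out to neutral gray. This is a minimal sketch of that textbook method, not the learned model the Pixel 4 uses.

```python
def gray_world_balance(pixels):
    """Classic gray-world white balance: scale each RGB channel so the
    average color of the image is neutral gray. (A simple baseline --
    the Pixel 4 uses a learned model instead.)"""
    n = len(pixels)
    channel_avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(channel_avg) / 3
    gains = [gray / a for a in channel_avg]
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3)) for p in pixels]

# Candle-lit scene: every pixel is pushed toward warm orange.
warm = [(200, 150, 100), (180, 140, 90), (220, 160, 110)]
balanced = gray_world_balance(warm)   # channel averages become equal (neutral)
```

Gray-world fails precisely in the tricky cases the article describes, such as a scene that genuinely is mostly one color, which is why a model that recognizes what it is looking at does better.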
Stereoscopic 3D vision
Much like human eyes, the Pixel 4's Portrait Mode distinguishes between the subject and the background by comparing views from two slightly different vantage points. The first is the roughly 1mm gap from one side of the main camera's lens to the other; the second is the gap between the two cameras, which is about ten times longer. These two baselines of different lengths let the camera judge depth for both close and distant subjects, allowing it to better separate the subject from the surroundings.
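A back-of-the-envelope pinhole-camera sketch shows why the longer baseline matters for distant subjects (the focal length and exact distances below are assumed illustrative values, not Pixel 4 specifications):

```python
def disparity_pixels(baseline_mm, depth_mm, focal_px=3000.0):
    """Pinhole stereo model: disparity = focal_length * baseline / depth."""
    return focal_px * baseline_mm / depth_mm

short_base = 1.0    # ~1mm across the main camera's lens
long_base = 10.0    # camera-to-camera gap, about ten times longer

near, far = 500.0, 5000.0   # subjects at 0.5m and 5m

# Close subject: even the tiny 1mm baseline gives a usable shift.
print(disparity_pixels(short_base, near))   # → 6.0 pixels
# Distant subject: the 1mm baseline's shift nearly vanishes...
print(disparity_pixels(short_base, far))    # → 0.6 pixels
# ...but the longer baseline restores a measurable disparity.
print(disparity_pixels(long_base, far))     # → 6.0 pixels
```

Smaller disparities are harder to measure reliably, so having both a short and a long baseline covers both ends of the depth range.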
Night Sight and astrophotography
Night Sight made its debut alongside the Pixel 3 and is now available on most Android devices thanks to Google Camera ports. Night Sight uses Google Camera's software to 'insert' light into a low-light scene, making it seem as though the shot was captured under proper illumination. It is by no means perfect, but it is miles ahead of anything the competition has to offer. For all the accolades Samsung cameras get, they still fall short when it comes to low-light photography. With the Pixel 4, Google took Night Sight to another level. It now has a dedicated astrophotography mode, which can help users capture celestial bodies. It works by capturing 16 quarter-minute shots for a 4-minute total exposure time, reducing sensor noise, and then merging the images into one shot.
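The exposure arithmetic is easy to verify, and frame averaging also explains the noise reduction (the square-root signal-to-noise improvement below is the standard result for averaging frames with independent noise, not a figure Google quotes):

```python
import math

frames = 16
per_frame_s = 15                  # "quarter-minute" exposures
total_s = frames * per_frame_s
print(total_s / 60)               # → 4.0 minutes of total exposure

# Averaging N frames with independent sensor noise improves SNR by sqrt(N).
snr_gain = math.sqrt(frames)
print(snr_gain)                   # → 4.0
```

Splitting the exposure into many short frames rather than one long one is what keeps each individual frame manageable while still accumulating four minutes of light.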