How iPhone X’s Portrait Lighting Feature Came to Be – Behind the Scenes Look [Video]
Ever wondered how that wonderful Portrait Lighting feature works on the iPhone X? Apple’s behind the scenes video gives us a glimpse.
Apple’s Portrait Lighting Took a Lot of Research and Time Before it Came to Be
Portrait mode was introduced with the iPhone 7 Plus, made possible by its dual-lens camera system. It works by shooting a portrait image with the telephoto lens while gathering depth information from the other lens. The result is an image with a beautifully blurred background, much like what you would achieve with a high-end camera.
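To make the idea concrete, here is a minimal sketch of depth-masked blurring, the principle behind a Portrait-style shot. This is purely illustrative and not Apple's implementation: the images are plain 2D lists of grayscale values, and the threshold and blur radius are made-up values. Pixels the depth map marks as near (the subject) stay sharp, while far pixels get blurred.

```python
# Hypothetical sketch of depth-based background blur (not Apple's code).
# Grayscale images and depth maps are 2D lists of ints in [0, 255];
# lower depth values mean "closer to the camera" in this toy model.

def box_blur(image, radius=1):
    """Return a box-blurred copy of a 2D grayscale image."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total // count  # average of in-bounds neighbors
    return out

def portrait_blur(image, depth, near_threshold=128):
    """Keep near (subject) pixels sharp; use blurred pixels for the background."""
    blurred = box_blur(image)
    return [
        [image[y][x] if depth[y][x] < near_threshold else blurred[y][x]
         for x in range(len(image[0]))]
        for y in range(len(image))
    ]
```

A real pipeline would of course operate on full-resolution color images with a much more sophisticated, depth-weighted blur, but the separation of "subject stays sharp, background gets softened" is the same.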
The iPhone X and iPhone 8 Plus take Portrait mode to a whole new level with the introduction of Portrait Lighting. It is a feature that allows you to ‘control’ the lighting of the Portrait shot you are taking. All of this happens thanks to machine learning and the countless hours of research Apple has put in.
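To illustrate the idea, here is a toy sketch of a "Stage Light"-style effect, one of the Portrait Lighting looks. In the real feature, machine learning segments the subject from the scene; in this sketch a precomputed boolean mask stands in for that step, and the dimming factor is an arbitrary illustrative value.

```python
# Illustrative "Stage Light"-style effect (not Apple's algorithm).
# face_mask is a 2D list of booleans standing in for the ML-derived
# subject segmentation; pixels outside the mask are dimmed toward black.

def stage_light(image, face_mask, dim_factor=0.1):
    """Keep masked (subject) pixels; dim everything else toward black."""
    return [
        [px if face_mask[y][x] else int(px * dim_factor)
         for x, px in enumerate(row)]
        for y, row in enumerate(image)
    ]
```

The other Portrait Lighting styles work on the same principle of relighting the frame based on where the subject is, just with more nuanced adjustments than a flat dim.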
Check out Apple’s brief video on how it all came to be:
Take a look behind the iPhone X and discover the process we went through to create Portrait Lighting. Combining timeless lighting principles with advanced machine learning, we created an iPhone that takes studio-quality portraits without the studio.
While the feature does have its imperfections, it will be interesting to see how it evolves across the iPhone lineup, and potentially the iPad too. After all, Apple did manage to bring the feature to the front-facing camera of the iPhone X, using its TrueDepth camera to map out depth data and then apply the necessary effects, which gives us hope that the same technique will reach the rear camera too in the future.
It’s remarkable how the camera system on a smartphone is an ever-evolving affair, with brand new innovations coming out on an almost-regular basis. Take a look at the Pixel 2 and Pixel 2 XL as an example – portrait mode using nothing more than machine learning. Who knew!