Unlike their predecessors, the 2019 iPhone 11 lineup has made several big appearances on the rumor mill. It looks as if Apple's attempts to stop information flowing out of its facilities to the general public have failed. A good portion of this information suggests that the upcoming smartphones will each gain an extra camera sensor this year. Yesterday we got a handful of new details on the software-based photography capabilities of the rumored three lenses on the iPhone 11 and two lenses on the iPhone 11R.
Leaks for the Apple A13 are scarce these days, but yesterday's information allows us to speculate openly about the upcoming chip, which will be the jewel in Apple's crown.
Apple Should Increase ISP Area On The A13 To Account For Rumored New iPhone 11 Features
We've had a couple of big leaks for the iPhone 11 since the start of this year. These details, some of which are mentioned above, have firmed up key facts about the upcoming smartphones. They also suggest that the Apple A13's internal codename is T8030, and that the iPhone 11 and 11 Max are identified as D42 and iPhone 12,3. This is strange, as the iPhone XS and XS Max are identified as iPhone 11,2 and 11,6 respectively.
This information is supplemented by alleged camera features that Apple will introduce on the iPhone 11 and iPhone 11 Max. A new feature will let users manipulate the area of an image that falls outside the frame. Dubbed Smart Frame, it allows the software to, "capture the area around the framed area in pictures and videos so that the user can adjust the framing or perform automatic perspective and crop corrections in post," believes the source of these details.
Google's camera app performs similar operations on data gathered through a single lens, but it looks as if Apple's solution will give users more control. Should Apple implement Smart Frame in software, it will need to dedicate more area on the A13 SoC to photographic computations.
Apple's A12 has a relatively small area dedicated to its ISP, which is codenamed Petra. On the SoC, Petra sits right next to the GPU, and the area it occupies is roughly equal to one core of Apple's quad-core custom graphics processing engine. Taking this into account, along with the fact that the upcoming iPhones will each add a camera sensor, we're forced to conclude that Apple will either integrate the NPU (Neural Processing Unit) and the ISP, or increase the die area allocated to the image signal processor.
Of the two, the company is more likely to follow the first approach, since the density gains achieved through TSMC's EUV are unlikely to give Apple much maneuvering room. The N7+ (7FF+) process will provide a 20% density gain according to TSMC's data, and if we're lucky, this should translate into 7.5 billion transistors on the A13 at a die area of 83.27 mm² (the same as the A12's). TSMC's 5nm, on the other hand, provides a 1.8x density gain over the 7FF process. Therefore, it's on the A14 that we should see Apple flex its performance muscles.
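A quick back-of-envelope calculation puts these numbers in perspective. Assuming the widely reported figure of roughly 6.9 billion transistors for the A12 (this figure is our assumption, not from the leak), a 7.5 billion transistor A13 on the same 83.27 mm² die implies an effective whole-die density gain well below TSMC's headline 20%, which is plausible since the quoted gain applies mainly to logic rather than the entire die:

```python
# Back-of-envelope sanity check of the A13 transistor estimate.
# Assumption (not from the article): the A12 packs ~6.9 billion
# transistors on its 83.27 mm^2 die.

A12_TRANSISTORS = 6.9e9   # widely reported A12 figure (assumed here)
DIE_AREA_MM2 = 83.27      # A12 die area; same area assumed for the A13
A13_ESTIMATE = 7.5e9      # the article's A13 transistor estimate

# Density of the A12 in transistors per mm^2
a12_density = A12_TRANSISTORS / DIE_AREA_MM2

# Effective whole-die density gain implied by the 7.5B estimate
effective_gain = A13_ESTIMATE / A12_TRANSISTORS

print(f"A12 density: {a12_density / 1e6:.1f} M transistors/mm^2")
print(f"Implied whole-die density gain: {(effective_gain - 1):.1%}")
```

The implied gain works out to under 10%, far short of the headline 20%, illustrating why the extra transistor budget alone leaves little room for a substantially larger ISP block.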
A move to on-screen fingerprint recognition on the 2020 iPhone should also allow Apple to divert neural processing resources to image processing. However, we're uncertain whether the Cupertino tech giant's security standards will allow for a smooth implementation of the feature on an iPhone, especially as the company also wants to move into the financial space.
Thoughts? Let us know what you think in the comments section below and stay tuned. We'll keep you updated on the latest.