
iPhone 14 Pro: finally a real change to the photo sensor since the iPhone 6s!

New rumors reaffirm that Apple could increase the resolution of the main sensor on the iPhone 14 Pro and Pro Max to 48 megapixels. The resolution of iPhone sensors has been stuck at 12 megapixels since the iPhone 6s, while the competition opts for 50, 64 or even 108-megapixel sensors. But will this new sensor really improve photo quality on these iPhones? That is far from certain.

Apple iPhone 13 Pro

In recent years, the resolution of smartphone photo sensors has increased sharply, and it keeps climbing, now even exceeding 100 megapixels. We have had the opportunity to test several such phones, including the Samsung Galaxy S21 Ultra 5G, the Xiaomi Mi 11i and the Honor 50. Starting this year, the first smartphones with 200-megapixel sensors could arrive. The Xiaomi Note 11 could even be the first smartphone to take advantage of such a component.


While sensor resolution is quite high at most phone brands, it remains quite low at one holdout… an American one: Apple. Since the iPhone 6s, the Cupertino company has remained faithful to a 12-megapixel resolution. Admittedly, the sensors themselves have changed since then, as have the quality of the lenses, the stabilization and the autofocus. But Apple refuses to increase the resolution of the main sensor so as not to create an imbalance with the secondary sensors.

Apple would choose a 48-megapixel sensor to equip the iPhone 14 Pro and Pro Max

But that could change. Some rumors say that certain models expected in September 2022 could have 48-megapixel sensors. The information is confirmed today by TrendForce, which specifies that this change would only concern the Pro range, i.e. the iPhone 14 Pro and iPhone 14 Pro Max. Once again, Apple would give its much more expensive Pro line a photographic advantage. It was the first to benefit from a secondary sensor, an optical zoom, an optical stabilizer and even a LiDAR autofocus.

If this leak is confirmed, will it improve the photos? Not necessarily. The leak also states that the sensor would be “Quad Pixel” compatible, meaning that four adjacent pixels are combined to form one larger pixel. The default resolution of the photos would therefore remain at 12 megapixels, that is, the current resolution, preserving the balance mentioned above.
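
As a rough sketch of what “Quad Pixel” binning means in practice (the array size and pixel values below are purely illustrative, not Apple specifications), combining each 2×2 group of photosites into one output pixel divides the resolution by four, which is how a 48-megapixel sensor would still deliver 12-megapixel photos by default:

import numpy as np

# Hypothetical 48 MP read-out: 8000 x 6000 photosites (illustrative figures only).
raw = np.random.randint(0, 1024, size=(6000, 8000), dtype=np.uint16)

def bin_2x2(frame):
    # Group the frame into 2x2 blocks and sum each block: four photosites
    # act as one larger pixel, so the output has a quarter of the pixels.
    h, w = frame.shape
    blocks = frame.reshape(h // 2, 2, w // 2, 2)
    return blocks.sum(axis=(1, 3))

binned = bin_2x2(raw)
print(raw.size / 1e6)     # ~48 megapixels captured
print(binned.size / 1e6)  # ~12 megapixels delivered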

So why change? There are two hypotheses. Either Apple chooses a physically larger component, which would increase the pixel size: a “quad pixel” on the new sensor would be larger than a native pixel on the current one, and capturing more light always means better quality. Or Apple keeps the sensor the same size, in which case the decision would be motivated only by the marketing appeal of a bigger number. Which would be rather sad.
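
A back-of-the-envelope calculation makes the difference between the two hypotheses clearer (the sensor dimensions below are invented for illustration, not leaked figures): on a larger sensor, the binned “quad pixel” ends up bigger than today's native pixel, while on a same-size sensor it merely matches it.

import math

def pixel_pitch_um(width_mm, height_mm, megapixels):
    # Approximate photosite pitch (in micrometres) for a sensor of the given size and resolution.
    area_um2 = (width_mm * 1000) * (height_mm * 1000)
    return math.sqrt(area_um2 / (megapixels * 1e6))

# Hypothesis 1: a physically larger 48 MP sensor (dimensions are hypothetical).
native = pixel_pitch_um(9.8, 7.3, 48)
print(round(native, 2), round(2 * native, 2))   # ~1.22 um native, ~2.44 um once binned 2x2

# Hypothesis 2: a 48 MP sensor the same size as a hypothetical current 12 MP one.
small = pixel_pitch_um(7.0, 5.2, 48)
current = pixel_pitch_um(7.0, 5.2, 12)
print(round(2 * small, 2), round(current, 2))   # binned pitch ~= today's native pitch: no light gain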

Source: TrendForce
