The entire Apple iPhone 15 lineup uses a 48-megapixel main wide-angle camera, with photo output set to 24 megapixels by default. This specification has drawn attention from foreign media and commentators. Well-known tipster ICE UNIVERSE argues that defaulting to 24-megapixel photos puts Apple a big step ahead of Android and showcases the computing power of Apple's chips.
In his analysis, ICE UNIVERSE says most users don't grasp the significance of the iPhone 15's 24-megapixel output, and states bluntly that it is a big step beyond Android phones. He explains that although Android camera sensors have reached 100 or even 200 megapixels, the default photo output of typical flagship phones still sits at 12 or 12.5 megapixels.
When full-resolution mode is enabled on an Android phone, it cannot shoot rapid bursts, and HDR quality also declines. What allows Apple to default to 24-megapixel output, with better photo quality than 12 megapixels, is the processing power of its chips.
ICE UNIVERSE ran his own tests, using the Galaxy S23 Ultra to shoot photos at 12, 50, and 200 megapixels and comparing them against 24-megapixel photos from the iPhone 15 Pro Max. The 12-megapixel and 24-megapixel shots performed most alike, both looking normal. As for HDR, even though the 50- and 200-megapixel modes offer higher resolution, both suffered from overexposure and loss of detail.
According to Apple’s official explanation, the iPhone 15 uses a new Photonic Engine, which first combines a high-resolution photo that preserves the finest detail with another image optimized for light capture, then synthesizes a 24-megapixel photo that delivers more detail than the previous 12-megapixel output.
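The idea of merging a detail-oriented frame with a light-optimized frame onto an intermediate 24-megapixel grid can be illustrated with a toy sketch. This is not Apple's actual pipeline; the array sizes (48, 12, and 24 "pixels") and the naive nearest-neighbour resize and 50/50 blend are purely illustrative assumptions standing in for a far more sophisticated, content-aware merge.

```python
import numpy as np

def resize_nn(img, out_h, out_w):
    """Crude nearest-neighbour resize: map each output pixel
    back to the closest source pixel by index scaling."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows[:, None], cols]

def fuse_to_24(detail_48, light_12):
    """Toy two-frame fusion (hypothetical, not Apple's method):
    bring a high-resolution 'detail' frame and a low-resolution
    'light capture' frame onto a shared intermediate grid, then blend."""
    # 8x6 = 48 px stands in for the 48 MP detail frame;
    # 4x3 = 12 px stands in for the 12 MP light-optimized frame;
    # 6x4 = 24 px is the shared 24 MP output grid.
    detail_24 = resize_nn(detail_48, 6, 4)   # downsample detail frame
    light_24 = resize_nn(light_12, 6, 4)     # upsample light frame
    # Equal-weight blend stands in for the real content-aware merge.
    return 0.5 * detail_24 + 0.5 * light_24

fused = fuse_to_24(np.ones((8, 6)), np.zeros((4, 3)))
print(fused.shape)  # (6, 4) — i.e. 24 "pixels"
```

The point of the sketch is only the resolution arithmetic: the output sits between the two inputs, keeping more detail than the small frame while inheriting the light-gathering advantage of the binned one.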