Cameras are one of the biggest differences in smartphones, and Apple’s iPhone lineup is no different. Apple says the iPhone 13 and 13 Mini feature the company’s “most advanced dual-camera system to date,” while the 13 Pro and Pro Max have “three of the most powerful cameras we’ve ever seen.”
That's the kind of claim you'd expect, of course. But this year, Apple really is making a big push with its cameras, especially on the Pro models. As always, the question will be what Apple's image processing and software can do with that hardware.
The iPhone 13 lineup represents the first time Apple has increased the sensor size of the primary camera across the board since the iPhone XS and XR in 2018, although last year's 12 Pro Max featured a sensor 47 percent larger than the one in the 12 and 12 Pro. Sensor size is an important factor in image quality because, along with lens aperture, it determines how much light the camera can capture. More light means less noise and blur.
The main cameras on the iPhone 13 and 13 Mini have larger sensors, which is part of the reason the main camera and the ultrawide are now arranged diagonally in the camera bump. Apple also added sensor-shift optical image stabilization, first seen in the 12 Pro Max. It's unclear exactly how big the 13's sensor is, but Apple says it captures 47 percent more light than the 12's.
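As a rough sketch of why more light helps: shot noise grows with the square root of the signal, so a 47 percent light increase translates into a more modest signal-to-noise gain. The square-root relationship below is standard photon shot-noise physics, not a figure from Apple:

```python
import math

# Shot noise scales with the square root of the captured signal,
# so SNR improves with sqrt(light), not linearly with light.
def snr_gain(light_ratio: float) -> float:
    """SNR improvement from capturing `light_ratio` times more light."""
    return math.sqrt(light_ratio)

print(f"{snr_gain(1.47):.2f}x")  # 47% more light -> about 1.21x better SNR
```

That is still a visible improvement in the shadows, but it is why "47 percent more light" should not be read as "47 percent less noise."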
According to Apple, the 13 Pro and Pro Max have an even larger primary sensor and a slightly faster f/1.5 lens that captures 2.2 times more light than before. Again, the exact sensor size isn't advertised, but Apple has given the pixel size: 1.9μm, which is bigger than on any modern smartphone I'm aware of. Apple can get away with this because the sensor is a relatively low-resolution 12 megapixels, but it's still an impressive stat that should translate into better low-light performance. For comparison, the 12 Pro Max had 1.7μm pixels, while every other iPhone since the XS has had 1.4μm pixels.
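Those pixel and aperture numbers roughly square with Apple's claim. Here's a back-of-the-envelope sketch, taking the 12 Pro's 1.4μm pixels (mentioned above) and its published f/1.6 lens as the baseline; light per pixel scales with pixel area divided by the square of the f-number:

```python
# Relative light gathered per pixel: area / f-number^2 (arbitrary units).
def light_per_pixel(pixel_um: float, f_number: float) -> float:
    return (pixel_um ** 2) / (f_number ** 2)

old = light_per_pixel(1.4, 1.6)  # iPhone 12 Pro main camera
new = light_per_pixel(1.9, 1.5)  # iPhone 13 Pro main camera

print(f"{new / old:.2f}x")  # ~2.10x, in the ballpark of Apple's 2.2x claim
```

The small remaining gap could come from lens or sensor efficiency differences Apple hasn't detailed, so treat this as an illustration rather than a verification.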
It's not clear what hardware changes Apple made to the iPhone 13's ultrawide camera; the company simply says it has a "faster sensor" that "reveals more detail in the dark areas of your photos." The Pro gets significant hardware tweaks, however, with Apple widening the aperture to f/1.8 for a claimed 92-percent improvement in light-gathering capability. The sensor now also has focus pixels on board. Things are rarely out of focus in ultrawide shots because the depth of field is so great, but adding autofocus means the camera can be used for macro photography, with a minimum focus distance of 2cm.
The telephoto camera remains exclusive to the 13 Pro phones, and Apple has increased its equivalent focal length to 77mm, or three times that of the primary camera. Previously, the iPhone 12 Pro's telephoto offered 2x zoom, while the 12 Pro Max bumped that to 2.5x. There's a tradeoff here: if you want to frame something at 2x, the 13 Pro will have to crop in on the main camera, reducing image quality. But photos at 3x and beyond will be much sharper than before, and the longer lens should make for a better portrait camera. Apple has also added Night Mode to the telephoto for the first time.
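The resolution cost of that 2x gap is simple arithmetic: a digital crop keeps only 1/zoom² of the pixels, and the 3x figure itself follows from the focal lengths (the 26mm primary equivalent is an assumption based on recent iPhones, not a number from this announcement):

```python
# A digital "zoom" is just a crop: at 2x you keep the central half of the
# frame in each dimension, i.e. one quarter of the pixels.
def cropped_megapixels(sensor_mp: float, zoom: float) -> float:
    return sensor_mp / zoom**2

# Zoom factor is the ratio of equivalent focal lengths.
zoom_factor = 77 / 26  # ~2.96, which Apple rounds to 3x

print(cropped_megapixels(12, 2))  # 2x crop of the 12MP main camera -> 3.0 MP
```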
Compared to the Android competition, Apple isn't doing much to outdo rivals on the hardware front. The large 1.9μm pixels are noteworthy, but most Android phone makers now prefer larger, higher-resolution sensors over sheer pixel size. Xiaomi's Mi 11 Ultra, for example, has a massive 50-megapixel sensor with 1.4μm pixels, which means it can gather plenty of light even when shooting at native resolution, without binning pixels together. And while a 3x telephoto lens will certainly be useful, it's now common to see a 5x periscope telephoto (or sometimes even 10x) in the Android world.
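Binning is the technique being contrasted here: in low light, a high-resolution sensor combines neighboring pixels into one larger effective pixel. A minimal sketch of the common 2x2 case, using the Mi 11 Ultra numbers from above:

```python
# 2x2 pixel binning: four neighboring pixels are combined into one output
# pixel, cutting resolution by 4x while quadrupling the light per pixel.
def binned(megapixels: float, pixel_um: float, factor: int = 2):
    """Resolution and effective pixel pitch after factor x factor binning."""
    return megapixels / factor**2, pixel_um * factor

mp, pitch = binned(50, 1.4)  # a 50MP sensor with 1.4 micron pixels
print(mp, pitch)             # -> 12.5 MP at an effective 2.8 micron pitch
```

In other words, a binned 50MP sensor behaves like a roughly 12MP sensor with pixels even larger than the 13 Pro's 1.9μm, while still offering full resolution in good light.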
So even though Apple has made meaningful hardware improvements to the iPhone 13 lineup, its performance relative to competitors will come down to how well its software and image processing pipeline are optimized. The iPhone 11 was a vastly better camera than the XS a year earlier, after all, even though the hardware barely changed. This year Apple is using Smart HDR 4, which can adjust exposure individually for multiple people in the frame, but we'll have to use the phones to see what kind of difference it makes. The same goes for Photographic Styles, a new filter-like feature that Apple says is smart about adjusting elements like skin tones and skies in each photo.
For video, Apple is making a big deal of its Cinematic Mode, which lets you selectively adjust focus and depth of field in post-processing, like Portrait Mode for photos. That's something we'll definitely have to test extensively. Meanwhile, the 13 Pro lets you record and edit video in Apple's ProRes codec on the phone itself, or export the ProRes file to Final Cut Pro on a Mac.
All the usual caveats about waiting for full reviews certainly still apply, but it's shaping up to be a pretty good year for the iPhone camera. Apple may not have the flashiest hardware around, but it has made welcome improvements in areas that make sense, and thankfully it hasn't locked any features to the Max-sized iPhone this time. We look forward to seeing the results, as well as those of emerging competitors like the Pixel 6.