Apple surprised everyone with the release of the iPhone 14 Pro. There had been numerous credible rumors about the company switching to a pill-shaped cutout for the front-facing camera and Face ID system in place of the recognizable notch, but the new “Dynamic Island” alert system was a complete surprise. And even though it was becoming more obvious that Apple would eventually have to use larger camera sensors in line with the rest of the market, the company went one step further and completely overhauled its computational photography engine under the name Photonic Engine.
The iPhone 14 Pro, which still starts at $999 in the United States, is full of that kind of thing. Apple arrived at always-on displays later than other companies, but its version is considerably more colorful. Apple is pushing hard on eSIM in the US, which no one else is really doing. And when a basic satellite networking feature unlike anything we have seen before launches later this year, Apple will have already shipped millions of these phones with the hardware to support it. Overall, the new iPhone 14 Pro contains more beginnings of big ideas than we have seen in an iPhone in a long time.
That is the simplest way to think about the iPhone 14 Pro: it feels like the first step toward a lot of new things for Apple and the iPhone, and maybe even the first hint of an entirely new category of iPhone. That does not mean everything works flawlessly yet.
I’ll be the first to confess that I’ve grown to like the moniker “Dynamic Island”; after all, it’s had everyone talking, which is unusual for a smartphone status indicator system. I’m on board if Apple wants to force everyone to seriously investigate alternative smartphone UI concepts.
The island takes the place of Apple’s well-known and frequently derided notch; it houses the front camera and Face ID system, which have to occupy some area on the front of the display. The notch had one virtue, though: after using it for a while, it practically vanished from your awareness.
The island, you will notice, is different. It sits lower on the screen than the notch did, and if you use your phone in light mode, it is a black pill shape in the center of a white screen. And because it is always animating and moving, you see it. In fact, I’d go so far as to say this is the first iPhone that feels better in dark mode, since the island blends in better.
Why, then, did Apple turn the subtle notch into a slightly more noticeable island? Numerous status indicator systems have been added to iOS over time. An overlay appears when a charger is plugged in or the mute switch is flipped. A green pill appears in the corner when a call is running in the background, and a blue pill appears when an app is using your location. There are other pill indicators for screen recording and Personal Hotspot. Connecting AirPods triggers yet another overlay. And some things, such as timers and background music, have never had any useful status indicators at all.
With the island, Apple is replacing and unifying all of those older status systems: it creates a single new home for system alerts and integrates with things like music playback and the new Live Activities API coming in iOS 16 later this year, which will let apps surface even more background information, like your flight status or a sports score. It is not a replacement for notifications, which all still show up in the same place and look basically the same.
The island can be understood most simply as a new widget system built on the Live Activities API. The widgets have three possible views: the primary view inside the island, an expanded view, and an ultra-minimal icon for when two things are happening at once. If you have more than two things running, Apple uses an internal priority list to put the two most important ones in the island.
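The selection behavior described above, at most two visible activities chosen by priority, can be sketched as a toy model. To be clear, this is purely illustrative: the names and rankings here are invented, and Apple’s actual priority logic is internal and undocumented.

```python
from dataclasses import dataclass

@dataclass
class LiveActivity:
    name: str
    priority: int  # higher = more important (hypothetical ranking)

def island_display(activities):
    """Pick what the island shows: a primary view alone, or a primary
    view plus a minimal icon when two or more activities are running."""
    ranked = sorted(activities, key=lambda a: a.priority, reverse=True)
    if not ranked:
        return None
    if len(ranked) == 1:
        return {"primary": ranked[0].name}
    # Two or more running: the top two win; everything else is hidden.
    return {"primary": ranked[0].name, "minimal": ranked[1].name}

active = [
    LiveActivity("Timer", 2),
    LiveActivity("Now Playing", 1),
    LiveActivity("Maps directions", 3),
]
print(island_display(active))
# -> {'primary': 'Maps directions', 'minimal': 'Timer'}
```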
It’s a cool idea, but as with any first iteration, Apple made some decisions that really work and some that, well, it’s a first iteration.
But because the animations make the island stand out so much more, you find yourself looking at it constantly. And because it sits lower on the screen, it can cover content in apps that haven’t been updated. Right now, the island doesn’t do quite enough to justify how often it is in the way, so the tradeoff between how noticeable it is and how useful it is feels a little unbalanced.
That tradeoff might shift completely when the Live Activities API launches later this year. The other big thing Apple got right was building the hooks that open the entire system to third-party developers. Some of the concepts we’ve heard about from Lyft, Flighty, and others are very intriguing. For now, though, the Dynamic Island feels like one of those things that needs a year of development and developer focus before we really understand its significance.
Typically, the 48-megapixel sensor on the Pro produces 12-megapixel photos
The other significant change to the camera system is that, depending on which camera you’re using, Apple now applies Deep Fusion processing to mid- and low-light photos earlier in the pipeline, on uncompressed image data. The goal is a two- to three-fold improvement in low-light performance. This change is why the entire image processing pipeline now goes by the name “Photonic Engine”: Apple is still doing Smart HDR and all of its other familiar processing, it just has a fancier name now.
Deep Fusion’s effects have always been pretty subtle; we’ve long called it “sweater mode” because Apple likes to show it off with moody low-light photos of people in sweaters. That’s still true on the iPhone 14 Pro: sweater mode appears to remain in effect, now on uncompressed data.
In general, the 14 Pro and 13 Pro produce remarkably similar images. The 14 Pro is a little cooler and captures slightly more detail at 100 percent in low light, but you really have to look for it. That holds for both the main camera and the ultrawide, which this year has a larger sensor and benefits from the Photonic Engine. Ultrawide shots look slightly better than the 13 Pro’s at 100 percent in extremely low light, but only if you look very closely.
Photos of Verge senior video producer Mariya Abdulkaf taken outside in bright light look identical, but zoom in to 100 percent and you can see that the iPhone 14 Pro’s much larger sensor captures slightly more detail with a smoother background blur. It’s very nice, but barely perceptible at Instagram sizes. The Pixel 6 Pro, with its own pixel-binned 50-megapixel sensor, captures even more detail and a wider range of colors.
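Pixel binning, which both of these phones use, combines each 2x2 group of sensor pixels into one output pixel, which is how a 48-megapixel sensor normally produces 12-megapixel photos. A minimal sketch of the idea (a simplified model that ignores the color filter array and everything else the camera’s actual pipeline does):

```python
import numpy as np

def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of sensor pixels into one output pixel."""
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A full 48MP frame is roughly 8000 x 6000 pixels; binning it 2x2 yields
# roughly 4000 x 3000, i.e. 12MP. Demonstrated here at a tiny scale:
frame = np.random.rand(80, 60)
print(bin_2x2(frame).shape)  # -> (40, 30)
```

Averaging four photosites into one output pixel is what buys the extra low-light detail: each binned pixel effectively collects four times the light.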
The 14 Pro is unquestionably the start of a lot of new ideas.
I’m not sure yet whether all of these new ideas will pay off. If you’re the kind of person who enjoys the rough edges that come with being on the cutting edge, you’ll be figuring a lot of this out right alongside Apple with the iPhone 14 Pro. But if you’re happy with your current phone, it might be worth waiting another year to see how some of these ideas pan out.