One of the more intriguing additions to the iPhone 13 is Cinematic mode. The feature, in effect, lets users shoot video in Portrait mode: iPhone 13 owners can record a scene, focus on a particular subject, and have the rest of the frame blurred. Smoothly shifting focus between foreground and background subjects is a classic filmmaking technique, and now it's simply part of the iPhone 13 feature set. Even cooler, users can adjust the depth of field after the fact in iMovie or Adobe Premiere.
Even more impressive, Apple's implementation can "anticipate when a prominent new subject is about to enter the frame and bring them into focus when they do." Apple is clearly championing Cinematic mode as a huge iPhone 13 selling point. The company's iPhone 13 page boasts that it's the "only smartphone that lets you edit the depth effect after you shoot."
How Apple developed Cinematic mode
Apple explains that it studied master filmmakers during the creation process of Cinematic mode.
On Hollywood shoots, pulling focus requires a talented team of experts. Like a cinematographer, who makes the overall call about what’s in focus and when that changes. And a focus puller, who makes sure the transition is smooth, the timing is spot on, and the subjects are perfectly crisp.
The entire process, Apple notes, required a tremendous amount of engineering. One of the key ingredients in bringing Cinematic mode to life is the Neural Engine. Apple writes that it trained the software to make decisions in real-time about what should be in focus. The software also applies smooth transitions when alternating between subjects. And again, if you’re not happy with the result, you can manually adjust the focus in post-production.
The software that powers Cinematic mode is “computationally intense.” In fact, Apple says it wouldn’t even be possible without the A15 Bionic.
Apple VP sheds light on the creation process
TechCrunch recently interviewed Apple executives Kaiann Drance and Johnnie Manzari about Cinematic mode. Drance is Apple's top marketing exec, while Manzari is an Apple designer.
“We knew that bringing a high quality depth of field to video would be magnitudes more challenging [than Portrait Mode],” Drance told TechCrunch. “Unlike photos, video is designed to move as the person filming, including hand shake. And that meant we would need even higher quality depth data so Cinematic Mode could work across subjects, people, pets, and objects, and we needed that depth data continuously to keep up with every frame. Rendering these autofocus changes in real time is a heavy computational workload.”
What's particularly interesting is that Apple didn't set out to create Cinematic mode out of the blue. On the contrary, its designers studied timeless filmmaking techniques, and the idea for Cinematic mode was born out of that research.
Is Cinematic mode as amazing as it sounds?
So far, there's been some debate about the utility of Cinematic mode. Some reviewers were quick to call it a gimmick, while others found it compelling. Of course, we'll have to wait and see what the masses do with it before reaching a conclusion.
In the interim, Matthew Panzarino took his review unit to