The iPhone 13 series offers new camera systems with innovative features like the new Cinematic mode, which creates a shallow depth of field in video. Relying heavily on the power of the A15 Bionic chip and the new Neural Engine, Cinematic mode not only brings a filmmaking technique used by professionals to the iPhone, it also introduces a new creative tool users did not know they needed.
To discuss the design and development of the new Cinematic mode, Kaiann Drance, Apple’s VP of Worldwide iPhone Product Marketing, and Johnnie Manzari, a designer on Apple’s Human Interface team, sat down with Matthew Panzarino of TechCrunch. Here are some highlights from the interview.
Cinematic mode represents Apple’s philosophy: “take something difficult and conventionally hard to learn, and then turn it into something automatic and simple.”
Apple executives said that the concept of Cinematic mode originated from a desire to bring timeless filmmaking techniques to the iPhone. Apple’s design team then extensively researched cinematography techniques for realistic focus transitions and optical characteristics before proceeding with development of the feature.
“When you look at the design process,” says Manzari, “we begin with a deep reverence and respect for image and filmmaking through history. We’re fascinated with questions like what principles of image and filmmaking are timeless? What craft has endured culturally and why?”
Talking about the challenges faced during development, Manzari explained that pulling focus is traditionally the job of professional filmmakers trained to handle it, not of ordinary users. Apple therefore saw those hurdles as opportunities to simplify a complex process into something everyone could enjoy.
“We feel like this is the kind of thing that Apple tackles the best. To take something difficult and conventionally hard to learn, and then turn it into something automatic and simple.
So the team started working through the technical problems in finding focus, locking focus and racking focus. And these explorations led them to gaze. In cinema, the role of gaze and body movement to direct that story is so fundamental. And as humans we naturally do this, if you look at something, I look at it too.
So they knew they would need to build in gaze detection to help lead their focusing target around the frame, which in turn leads the viewer through the story. Being on set, Manzari says, allowed Apple to observe these highly skilled technicians and then build in that feel.”
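Apple has not published how Cinematic mode implements this, but the general idea of letting a detected subject steer the camera’s focus can be roughly illustrated with the public Vision and AVFoundation frameworks. The sketch below is a hypothetical, simplified example (the class name SubjectFocusDriver and the per-frame flow are assumptions, and real gaze detection and smooth rack-focus transitions would be far more involved): it finds the most prominent face in a video frame and points the camera’s focus at it.

```swift
import AVFoundation
import Vision

// Illustrative sketch only: not Apple's Cinematic mode implementation.
// It shows the general pattern of "detected subject drives the focus point"
// using public APIs (Vision face detection + AVCaptureDevice focus).
final class SubjectFocusDriver {
    private let faceRequest = VNDetectFaceRectanglesRequest()

    // Call for each incoming video frame, e.g. from an
    // AVCaptureVideoDataOutputSampleBufferDelegate callback.
    func updateFocus(for pixelBuffer: CVPixelBuffer, on device: AVCaptureDevice) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                            orientation: .up,
                                            options: [:])
        try? handler.perform([faceRequest])

        // Pick the most prominent (largest) detected face as the subject.
        guard let faces = faceRequest.results,
              let subject = faces.max(by: { $0.boundingBox.width < $1.boundingBox.width })
        else { return }

        // Vision bounding boxes are normalized with a bottom-left origin;
        // the capture device's point of interest uses a top-left origin.
        // (Orientation handling is simplified here.)
        let box = subject.boundingBox
        let point = CGPoint(x: box.midX, y: 1.0 - box.midY)

        guard device.isFocusPointOfInterestSupported else { return }
        do {
            try device.lockForConfiguration()
            device.focusPointOfInterest = point
            device.focusMode = .continuousAutoFocus
            device.unlockForConfiguration()
        } catch {
            // Could not lock the device for configuration; skip this frame.
        }
    }
}
```

In practice, Cinematic mode goes well beyond this: it anticipates where the subject is looking, predicts when a new subject is about to enter the frame, and renders the depth-of-field blur after capture so the focus point can even be changed in editing.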
The executives explained that Apple’s advanced chip and Neural Engine solved several of the feature’s unique challenges through machine learning, giving iPhone 13 users a new way to tell more expressive stories.
Read More:
- iPhone 13 is the most important iPhone ever, setting “new dimensions of performance”
- Apple promotes iPhone 13 cinematic mode with #HollywoodInYourPocket campaign on Twitter
- Apple updates iMovie and Clips apps to support new iPhone 13 cinematic mode and other features
- Every new camera feature in iPhone 13: ProRes support, Cinematic Mode, Macro, more