It’s an old story, but a good one. In 1975, Kodak created one of the very first digital cameras and then promptly sat on the idea for fear it would cannibalise its long-standing and immensely profitable film camera business. By 1996, it was the fifth most valuable brand in the world with over two-thirds of global market share. And in 2012? The more-than-century-old institution filed for bankruptcy following its slow embrace of the same new technologies it had once invented.
Apple might make the most widely used cameras on the planet, but it is not Kodak and likely never will be. For starters, it also sells more watches than the entire Swiss industry combined, has a $2.4 trillion market cap, which is larger than Italy’s GDP, and just happens to run a TV streaming service as a side project. Still, there is one major distinguishing factor between Apple and Kodak that towers above the rest: photographic trends now come to life in the course of months, not decades. Case in point: TikTok as we know it launched in August 2018 and was the most downloaded app in the US by October that same year. Arguably the defining factor for the iPhone’s ongoing success is its ability to keep pace with innovations its engineers might not have imagined while designing the thing.
Such is the importance and secrecy attached to these smartphone cameras that it’s not often anyone is granted a glimpse behind the curtain at their making. To coincide with the release of Apple’s new iPhone 13 and iPhone 13 Pro models, we were granted an audience with Jon McCormack (VP of camera software engineering) and Graham Townsend (VP of camera hardware engineering): the two people leading a reported team of more than 800 engineers and other specialists in an ongoing search for pictorial perfection. From macro photography to Hollywood-esque Cinematic video, there is a lot that’s new and important with these devices and it’s all been years in the making.
“The planning has to start about three years ahead, because that’s when we actually fix the specification of the silicon,” says Townsend. “So, for instance, the sensor gets defined at that point and the A15 Bionic processor is also frozen. That’s when we have to begin to talk with Jon and predict the experiences that we want. Obviously when we designed the new ultra-wide lens, we were going to deliver macro photos. But how is that going to work both in stills and video?”
A photo shot via the iPhone 13 Pro’s macro mode
If Townsend is the person in charge of sourcing custom lens components, assessing the first hardware samples to arrive back from manufacturers and ultimately figuring out just how big the bump on the back of the iPhone is going to be, then McCormack does the stuff you probably don’t notice as much. A state of affairs that’s entirely by design. With a career that spans from Amazon to Google and HP, he’s as close to a Silicon Valley veteran as you could imagine, right down to his tendency to explain technology in near-philosophical terms. “Our desire for recording the world around us goes back to the cave paintings in France from prehistory,” he notes at one point.
In practice, little about the iPhone’s camera is supposed to be esoteric, with almost all the behind-the-scenes trickery being done for you with a press of its shutter button. “Collect your iPhone, take the photo and don’t get lost in questions like ‘Is there enough light?’ or ‘Do I need to go into settings?’” says McCormack.
This has been the defining ethos for iPhone photography since day one and it has stood the test of time. From 2010, when the New York Times’ front page featured a photo from the Afghanistan War shot with vintage filter app Hipstamatic, through to a pandemic when bored teens mobilised on TikTok to tank a Donald Trump rally, these cameras have documented modern life in all its weirdness and complexity. So although the iPhone 13’s cameras do benefit from larger pixels and a wider aperture to capture more light, improved optical image stabilisation to counteract picture blur and all manner of highly technical improvements, that’s only half the story here. The part that actually matters? What people do with it.
“There’s always this obsession with ‘How do we make this really easy to use if you’re not a trained photographer, but you just love taking photos?’” says Kaiann Drance, Apple’s VP of worldwide iPhone product marketing.
Of course, a major part of the iPhone camera’s evolution has been that it’s not just used for baby photos, aerial shots of your brunch and workout tutorials any more. Ever since Sean Baker’s critically acclaimed Tangerine, shot entirely on iPhone 5s, premiered back in 2015, these smartphones have increasingly become just another tool in a filmmaker’s toolkit. Given the starting cost of an industry-standard Arri Alexa camera is just under £100,000, having a piece of kit you can chuck about and not worry about roughing up is increasingly valuable. “What I want is to be able to use a camera like a pen or a brush, and it’s getting closer to be able to do that without it looking really crappy,” says director Ben Wheatley, who used multiple iPhone 12s in the production of his most recent movie, In The Earth. “The fact that these phones can spit out 4K – everything is so much better.”
As a company that’s never knowingly turned down the opportunity to align itself with the people who make your favourite stuff, a large part of the 13 and 13 Pro’s grand unveiling was draped in the clothes of a Hollywood epic. From the Knives Out-inspired trailer for their Cinematic camera mode to actually giving the handset itself to Kathryn Bigelow for a promo video, you could be forgiven for thinking that a major demographic for these handsets is anyone with a direct line to Netflix CEO Reed Hastings. That’s not really how the people making them actually see their roles, though, to the extent that McCormack quickly skips past a question about the pressures of his creation being put in the hands of a two-time Oscar winner.
“It’s a fantastic opportunity, but I think the even more important thing is getting to see what a 15-year-old girl on a high street does with the exact same technology,” he says. “For me, the Holy Grail is packaging that thing that Kathryn gets really excited about in a way that’s simple enough for just anybody to use to tell their own unique stories.”
If there’s one prime example of this ethos at play, it is the Cinematic mode that’s available across every iPhone 13. You know that bit in a movie or TV show when one person drops out of focus and another subject comes into view all within the same frame? Usually it means the dramatic tension is about to boil over. That technique is what’s known as “rack focus” and you can achieve it on all these new iPhones with a few taps of their touchscreens. More impressively, you can even go back to this footage after you’ve shot it to change its focus and depth of field. Whereas the iPhone’s Portrait camera mode does some similar stuff with one photo at a time, the A15 Bionic chip allows Apple to pull off the technique in real time on video.
“It was a long process with lots of windy roads, but like most profound things it takes a while,” says McCormack. “We’re not just looking at the depth of every single frame, but there’s also this thing called temporal stability: as we move between frames with people moving, how do we make sure you don’t end up with weird edges and stuff like that?”
While this level of computational photography is no child’s play, an arguably greater challenge for the future of the iPhone’s camera is making it in a more sustainable manner. Apple has pledged to go carbon neutral as a company by 2030, and a major factor in meeting that target will be conserving energy and lessening CO2 emissions by changing the materials it uses to make the iPhone. This process has been going on for some years now, with previous models employing a mix of recycled tin, tungsten and rare earth elements, and it continues with the iPhone 13. “We’re using upcycled water bottles for its antenna lines and we had to develop a new process to make them stronger,” says Drance.
The iPhone 13 Pro’s rear camera system
If creating a recycled aluminium enclosure for the recent iPad mini is reasonably feasible in the grand scheme of things, then the intricacies involved in camera design make for a significantly more complicated process. Just think about it: you’ve got lens glass, sensors, wiring and even moving parts such as sensor-shift stabilisation all crammed together into an island that’s about an inch and a half in size. Replacing all those components with ones that are better for the planet means doing so in a way that is imperceptible to the Average Joe in the short term and doesn’t sacrifice the resilience of those parts in the long run. It’s a big ask, but a major priority for Townsend.
“Most of the leaders for my team actually went to visit Apple’s recycling site in Austin and that really helped them understand how parts were recovered from other parts of the iPhone,” he says. “There are some rare earth magnets in the camera, so that was one area that we were looking at. When you have a large magnet, you can easily make it out of recycled material, but for the small ones it’s very difficult to get the magnetic flux density.”
Even if these changes take a while to figure out, Apple has both the time and the inclination to get them done. Most of the product shots used to sell any new iPhone inevitably spotlight its updated camera module, while its “Shot on iPhone” ad campaign is so ubiquitous it became a meme on YouTube in 2015 and then again on TikTok four years later. At a time when people are holding on to their old phones for longer than ever, it’s clear what Apple sees as the key to convincing its users to make an upgrade. Especially when it comes to a device like the iPhone 13, whose major changes are internal and hardware-related as opposed to aesthetic.
So how do you handle that kind of pressure and expectation? “The way I explain it to the team to keep them motivated every year is that we have a remarkable responsibility. There are many other reasons that people have iPhones in their pocket, but we have a privilege to help capture precious moments that people aren’t even expecting to happen,” says Townsend.
A portrait shot on the iPhone 13 Pro’s wide camera
Go back and watch the very first iPhone presentation and you suspect that even Steve Jobs didn’t quite comprehend the scale to which this responsibility would evolve. Admittedly, there was a lot to get through during that 80-minute unveiling, but the handset’s two-megapixel camera essentially ranks as a footnote amidst a constant flurry of innovation and superlatives – perhaps because its actual feature set wasn’t much to shout about. Imagine a phone camera launching these days without flash, video recording or any kind of zoom: it’s unthinkable. At least Apple’s late cofounder was impressed. “I can just scroll through photos here with my finger. Pretty cool,” he said.
By early 2020, what was once a fun little curio had contributed to an 87 per cent decline in digital camera sales over the course of a decade, according to the Camera & Imaging Products Association. Of course, the rise of Instagram, Snapchat and a whole raft of apps that changed the way we share imagery had a big part to play in this decline, but the iPhone has evolved hand-in-hand with them. From panoramic photos to a Night mode for low-light photos and a proper optical zoom, this camera now ranks as a super-capable bit of kit for amateurs and pros alike. As Apple continues to dabble with new technologies such as augmented reality and high-fidelity ProRes video recording, its work is by no means done. For the iPhone to succeed, its camera will continue to change for the better.
“We’re not asking for the impossible, but we are asking that the camera achieves the best it can every year,” says Townsend. “Over the past ten years we’ve seen a dramatic improvement, but there is no rest.”