
iPhone 12: What Apple’s 3D-sensing camera means for the future of photography

Apple is expected to add a new dimension to photos with its next smartphone.


Apple is set to add a 3D depth-sensing system to its next iPhone, a Wednesday report claimed. The rumored time-of-flight setup would send out pulses of light and measure how long they take to bounce off objects and return, building a far more detailed map of the distances in a scene than the current cameras can. The process is similar to how a lidar sensor in an autonomous car senses the distance to objects around it.
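The underlying arithmetic is simple: light travels at a known speed, so halving the measured round-trip time gives the distance to whatever the pulse bounced off. As a rough sketch (the function and numbers below are illustrative, not Apple's implementation):

```swift
/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Estimate the distance to an object from the round-trip time of an emitted light pulse.
/// The total flight time is halved because the light travels out and back.
func distance(fromRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A pulse that returns after roughly 20 nanoseconds implies an object about 3 meters away.
print(distance(fromRoundTripTime: 20e-9))   // ~3.0 meters
```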

The Fast Company report claims that at least one phone this year will feature the sensor, which will be supplied by Lumentum. The San Jose-based firm already helps power the TrueDepth sensor on the front of the iPhone, used for face recognition to unlock the phone. Beyond security, TrueDepth is also used for features like "Animoji," which lets users map their facial movements onto animated emoji.

The rumored rear-facing sensor could dramatically expand on these uses. The publication explains that the Lumentum-supplied sensor could boost augmented reality apps, like the rumored all-in-one "Gobi" project that would tie experiences such as in-store navigation into a single app. It could improve existing features, like the iOS "Measure" app that uses the camera to measure real-world objects. And it could pave the way for more impressive photography, producing images closer to those from professional cameras.

This could be a welcome expansion of Portrait Mode, first found on the iPhone 7 Plus introduced in 2016. That phone uses two cameras to capture the subject from slightly different positions, calculates depth from the difference between the two views, and uses that information to blur the background. The end result looks closer to a photo from a professional DSLR camera.
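As a rough sketch of that two-camera approach: depth can be estimated from how far the subject appears to shift between the two lenses (the disparity), scaled by the focal length and the physical gap between the cameras. The numbers below are made up for illustration, not the iPhone's actual calibration:

```swift
/// Estimate depth (in meters) from stereo disparity: depth = focal length × baseline / disparity.
func stereoDepth(focalLengthPixels: Double,
                 baselineMeters: Double,
                 disparityPixels: Double) -> Double {
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Illustrative numbers: a 1.2 cm gap between lenses, a 2,800-pixel focal length,
// and a 22-pixel shift put the subject a little over 1.5 meters away.
print(stereoDepth(focalLengthPixels: 2800, baselineMeters: 0.012, disparityPixels: 22))
```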

A time-of-flight system could improve this further. It could capture more precise depth information, using it to more effectively adjust bokeh depending on the object's distance. It could also be used to re-adjust focus after taking the photo, similar to the Focos app.
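In practice that means using per-pixel depth to decide how much blur to apply: pixels near the chosen focal plane stay sharp, and blur grows the further a pixel sits from it. A hypothetical sketch of that mapping, with names and constants that are assumptions rather than anything from Apple's pipeline:

```swift
/// Map a pixel's depth to a blur radius for a synthetic-bokeh effect.
/// Pixels near the chosen focus distance stay sharp; blur grows with distance
/// from it, capped so far backgrounds don't blur without limit.
func blurRadius(depthMeters: Double,
                focusDepthMeters: Double,
                strength: Double = 8.0,
                maxRadius: Double = 25.0) -> Double {
    let offset = abs(depthMeters - focusDepthMeters)
    return min(maxRadius, strength * offset)
}

// Refocusing after the shot is just re-running the same mapping with a new focus depth.
let depths = [0.8, 1.5, 3.0, 6.0]   // per-pixel depth samples, in meters
let focusedOnSubject = depths.map { blurRadius(depthMeters: $0, focusDepthMeters: 1.5) }
let refocusedOnBackground = depths.map { blurRadius(depthMeters: $0, focusDepthMeters: 6.0) }
print(focusedOnSubject)          // the 1.5 m subject gets zero blur
print(refocusedOnBackground)     // now the 6 m background is the sharp plane
```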

The camera in action.


It's perhaps in photography where the laser sensor could really shine. Apple is the number-one camera brand on Flickr, the photo-sharing website popular with professional photographers, a position it has held since 2015. The iPhone's camera has improved over its 13 years from the original's tiny two-megapixel sensor to the triple-lens, 12-megapixel extravaganza found on the iPhone 11 Pro.

Over that time, the camera has become an increasingly important area of focus for Apple. The company revealed in 2015 that it had 800 people working solely on the camera. As smartphone cameras have improved, the old adage that "the best camera is the one that's with you" has meant more people reaching for the iPhone because it's good enough.

This taps into a wider Apple philosophy about iPhone design. When introducing the iPhone 5S in 2013, Apple senior vice president of worldwide marketing Phil Schiller explained that the goal was to give people better photos without much effort, adding that "for most of us, we just want to take a picture and have the iPhone take a better picture for us." He underlined the point by comparing a full camera bag with a simple smartphone. In a 2018 interview he returned to the idea, saying that most people "want their picture to be a beautiful picture without thinking very much about it."

It's unclear how much the time-of-flight sensor would improve on the existing features, but the evidence suggests it'll offer welcome benefits. The phone is expected to use a vertical-cavity surface-emitting laser, or VCSEL, to measure distance, paired with a sensor and supporting software. While the TrueDepth system can only measure objects a few feet away, the rear sensor is expected to work at a much greater range.

Apple wouldn't be the first to include a time-of-flight sensor on its phones. The Huawei P30 Pro, Samsung Galaxy S10 5G and LG G8 ThinQ all feature one. These earlier implementations, which have enabled features like improved focus in video, suggest Apple will be able to add real value to its phones with the sensor.

The Inverse analysis

Apple's strength has been in making a simplified camera that works for as many people as possible, and a time-of-flight sensor would likely continue that approach.

If previous versions of iOS are anything to go by, the sensor would likely be straightforward to operate. There may be very few options to tweak, and it should work simply out of the box. In fact, it might not be obvious to the user that anything special is going on, similar to how Apple started quietly adjusting white balance once its software could better detect a scene.

The company's most notable camera improvement in years may end up an obscure footnote, but the results could be dramatic.
