
This app hacked the iPhone’s dual camera system, and you’ve never seen anything like it

Published May 21st, 2018 8:47AM EDT
Image: Apple Inc.


Apple wasn’t the first smartphone maker to switch to a dual-lens camera setup on the back of its handsets. Of course, once Apple did it, plenty of other smartphone makers followed the company’s lead and added dual-lens cameras to their own phones. In fact, one smartphone even has a triple-lens rear camera now, and it’s quite possibly the best camera phone on the planet.

The dual-lens camera on Apple’s iPhone X, iPhone 8 Plus, and iPhone 7 Plus is still among the best, but it only provides two functions: 2x optical zoom and Portrait Mode blur and lighting effects. Other companies have found some nifty additional ways to use dual cameras, looking to add extra value for users. Don’t worry, though, Apple fans, because we’ve come across a new app that brings incredible new functionality to the iPhone’s dual-lens camera, and you’ve never seen anything else like it.

The app is called Apollo, and it’s available right now from the iOS App Store for $1.99. Trust us, it’ll be the best $1.99 you spend all week. Before we tell you about it, we’re going to show it to you in two preview videos made by the app’s developer, Indice Limited.

Seriously… how awesome is that?

The Apollo app “hacks” Apple’s dual camera system and lets you do amazing things to a Portrait Mode photo after you’ve captured it. You can change the direction of light sources, add and remove light sources, adjust brightness and even color, and plenty more. We’ve been using it since it was first released last week, and we’re blown away.

Here’s the story behind the development of the app, as told by Apollo creator Indice Limited on Reddit:

Apollo is the first application to use the depth data of portrait mode photographs to realistically add light sources to a scene. Development of the app began as an experiment back in November 2017, when we first got our hands on a brand new iPhone 8+. We wanted to see what could be achieved by taking advantage of the depth information of portrait photos. Our hypothesis looked simple: if depth information can be superimposed on a 2D photo, it should be possible to re-illuminate objects with custom light sources.
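For a sense of what that starting point looks like in practice, here’s a minimal Swift sketch that pulls the embedded disparity map out of a Portrait Mode photo using Apple’s public ImageIO and AVFoundation APIs. This is our own illustration of the idea, not Indice’s code.

```swift
import AVFoundation
import ImageIO

/// Load the disparity (depth) map that the dual camera embeds as auxiliary
/// data in a Portrait Mode photo. Returns nil if the photo has no depth data.
func loadDisparityMap(from url: URL) -> CVPixelBuffer? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let depthData = try? AVDepthData(fromDictionaryRepresentation: auxInfo)
    else { return nil }

    // Convert to 32-bit float disparity so later math works on a single format.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    return converted.depthDataMap
}
```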

Of course, the first results were horrible. Our team stuck to the cause and tried to squeeze every last bit of information from the depth buffer. First we needed a method for deriving more depth points from the depth (disparity) map provided by the dual camera API. We algorithmically produced a new, denser map of depth points on the existing photo. Things were looking brighter, but still the visual effect of the computed lighting using the enriched depth map looked disappointing.
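The developer hasn’t published the densification algorithm, but the basic idea of deriving more depth points can be illustrated with plain bilinear upsampling of the low-resolution disparity grid to the photo’s pixel grid. Consider the following a textbook sketch, not Apollo’s actual method.

```swift
/// Bilinearly upsample a row-major depth grid of (width x height) samples
/// to a denser (outW x outH) grid.
func upsample(_ src: [Float], width: Int, height: Int,
              toWidth outW: Int, toHeight outH: Int) -> [Float] {
    var dst = [Float](repeating: 0, count: outW * outH)
    for y in 0..<outH {
        for x in 0..<outW {
            // Map the output pixel back into source coordinates.
            let fx = Float(x) * Float(width - 1) / Float(outW - 1)
            let fy = Float(y) * Float(height - 1) / Float(outH - 1)
            let x0 = Int(fx), y0 = Int(fy)
            let x1 = min(x0 + 1, width - 1), y1 = min(y0 + 1, height - 1)
            let tx = fx - Float(x0), ty = fy - Float(y0)
            // Blend the four neighbouring depth samples.
            let top = src[y0 * width + x0] * (1 - tx) + src[y0 * width + x1] * tx
            let bottom = src[y1 * width + x0] * (1 - tx) + src[y1 * width + x1] * tx
            dst[y * outW + x] = top * (1 - ty) + bottom * ty
        }
    }
    return dst
}
```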

It was time for smoothing. We implemented different filters with various results. We needed a map of smooth contour lines that realistically follow the curves of the foreground objects. A special sauce of interpolation for enriching our map, along with some bilateral filtering for avoiding edge artefacts [sic] was the recipe that saved the day.
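Again, Apollo’s exact “special sauce” isn’t public; a plain bilateral filter like the one sketched below captures the general idea of smoothing the map while preserving depth edges. The sigma values are placeholders, not the app’s tuning.

```swift
import Foundation

/// Edge-preserving smoothing of a row-major depth map. sigmaS controls the
/// spatial falloff, sigmaR how strongly depth discontinuities are preserved.
func bilateralFilter(_ depth: [Float], width: Int, height: Int,
                     radius: Int = 3, sigmaS: Float = 2.0, sigmaR: Float = 0.05) -> [Float] {
    var out = depth
    for y in 0..<height {
        for x in 0..<width {
            let center = depth[y * width + x]
            var sum: Float = 0, weight: Float = 0
            for dy in -radius...radius {
                for dx in -radius...radius {
                    let nx = min(max(x + dx, 0), width - 1)
                    let ny = min(max(y + dy, 0), height - 1)
                    let sample = depth[ny * width + nx]
                    // Spatial weight falls off with distance, range weight with
                    // depth difference, so edges stay sharp while flat areas smooth out.
                    let ws = expf(-Float(dx * dx + dy * dy) / (2 * sigmaS * sigmaS))
                    let d = sample - center
                    let wr = expf(-(d * d) / (2 * sigmaR * sigmaR))
                    sum += sample * ws * wr
                    weight += ws * wr
                }
            }
            out[y * width + x] = sum / weight
        }
    }
    return out
}
```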

Armed with a high quality depth map, we were able to deduce the normal map which is fundamental for applying the lighting model of a 3D scene. Using a Phong-style lighting model, we had our first success!
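To make those two steps concrete, here’s a rough Swift sketch that estimates a normal from the depth gradient at each pixel and applies a basic Phong diffuse-plus-specular term for a single point light. The light position, view direction, and constants are illustrative defaults, not Apollo’s values.

```swift
import Foundation
import simd

/// Shade a row-major depth map with one point light: derive a per-pixel
/// normal from the depth gradient, then apply a simple Phong model.
func shade(depth: [Float], width: Int, height: Int,
           lightPos: SIMD3<Float>,
           viewDir: SIMD3<Float> = SIMD3<Float>(0, 0, 1)) -> [Float] {
    var intensity = [Float](repeating: 0, count: width * height)
    for y in 1..<(height - 1) {
        for x in 1..<(width - 1) {
            // Finite-difference gradient of the depth map gives the surface slope.
            let dzdx = depth[y * width + x + 1] - depth[y * width + x - 1]
            let dzdy = depth[(y + 1) * width + x] - depth[(y - 1) * width + x]
            let normal = simd_normalize(SIMD3<Float>(-dzdx, -dzdy, 1))

            // Treat the pixel as a point at (x, y, depth) and light it.
            let p = SIMD3<Float>(Float(x), Float(y), depth[y * width + x])
            let l = simd_normalize(lightPos - p)
            let diffuse = max(simd_dot(normal, l), 0)

            // Phong specular: reflect the light direction about the normal.
            let r = simd_reflect(-l, normal)
            let specular = powf(max(simd_dot(r, viewDir), 0), 32)

            intensity[y * width + x] = diffuse + 0.3 * specular
        }
    }
    return intensity
}
```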

At this stage, computation of depth information for a portrait photo took roughly 45 seconds, leading to very bad UX. It was time to move closer to the GPU! Our algorithm was first broken down to take advantage of multiple threads. Then all computations were rewritten for the Metal 2 SDK. Loading time dropped to around 3 seconds, a staggering improvement!
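On the Swift side, offloading that per-pixel work to the GPU with Metal looks roughly like the outline below: build a compute pipeline once, then dispatch one thread per pixel. The kernel name is a made-up stand-in; the shader Apollo actually ships isn’t public.

```swift
import Metal

/// Build the compute pipeline once at startup. "relightKernel" is a
/// hypothetical shader name used here purely for illustration.
func makeRelightPipeline() throws -> (MTLDevice, MTLCommandQueue, MTLComputePipelineState)? {
    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue(),
          let library = device.makeDefaultLibrary(),
          let kernel = library.makeFunction(name: "relightKernel")
    else { return nil }
    let pipeline = try device.makeComputePipelineState(function: kernel)
    return (device, queue, pipeline)
}

/// Dispatch the relighting pass: one GPU thread per pixel, in 8x8 threadgroups.
func dispatchRelight(queue: MTLCommandQueue, pipeline: MTLComputePipelineState,
                     input: MTLTexture, output: MTLTexture) {
    guard let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeComputeCommandEncoder() else { return }
    encoder.setComputePipelineState(pipeline)
    encoder.setTexture(input, index: 0)
    encoder.setTexture(output, index: 1)

    let groupSize = MTLSize(width: 8, height: 8, depth: 1)
    let groups = MTLSize(width: (input.width + 7) / 8,
                         height: (input.height + 7) / 8, depth: 1)
    encoder.dispatchThreadgroups(groups, threadsPerThreadgroup: groupSize)
    encoder.endEncoding()
    commandBuffer.commit()
}
```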

The next step was to expose all configurable parameters to the user. When our UX team started work on the project, there were dozens of parameters to tweak. That was no good; we needed a minimal set of parameters that gives the user full control without being overwhelming. After lots of iterations, we narrowed down our list to six parameters: two global settings and four light-source-specific parameters.
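The quote only gives the counts, not the names, but one plausible way to model that pared-down parameter set in Swift might look like this. The property names are our guesses, purely for illustration.

```swift
/// Two global settings that apply to the whole photo (hypothetical names).
struct GlobalSettings {
    var ambientLevel: Float   // overall base illumination
    var effectStrength: Float // how strongly the relighting mixes into the photo
}

/// Four parameters per light source (hypothetical names).
struct LightSource {
    var position: SIMD3<Float> // where the light sits relative to the subject
    var color: SIMD3<Float>    // RGB tint
    var intensity: Float       // brightness
    var spread: Float          // how focused or diffuse the falloff is
}
```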

Apollo truly is a unique and fantastic app, and it’s available right now on the App Store.

Zach Epstein, Executive Editor

Zach Epstein has been the Executive Editor at BGR for more than 15 years. He manages BGR’s editorial team and ensures that best practices are adhered to. He also oversees the Ecommerce team and directs the daily flow of all content. Zach first joined BGR in 2007 as a Staff Writer covering business, technology, and entertainment.

His work has been quoted by countless top news organizations, and he was recently named one of the world's top 10 “power mobile influencers” by Forbes. Prior to BGR, Zach worked as an executive in marketing and business development with two private telcos.