This app hacked the iPhone’s dual camera system, and you’ve never seen anything like it

Zach Epstein
May 21st, 2018 at 8:47 AM

Apple wasn’t the first smartphone maker to switch to a dual-lens camera setup on the back of its handsets. Of course, once Apple did it, plenty of other smartphone makers followed the company’s lead and added dual-lens cameras to their own phones. In fact, one smartphone even has a triple-lens rear camera now, and it’s quite possibly the best camera phone on the planet.

The dual-lens camera on Apple’s iPhone X, iPhone 8 Plus, and iPhone 7 Plus is still among the best, but it only provides two functions: 2x optical zoom and Portrait Mode blur and lighting effects. Other companies have found nifty ways to use dual cameras that add extra value for users. Don’t worry though, Apple fans, because we’ve come across a new app that brings incredible new functionality to the iPhone’s dual-lens camera, and you’ve never seen anything else like it.

The app is called Apollo, and it’s available right now from the iOS App Store for $1.99. Trust us, it’ll be the best $1.99 you spend all week. Before we tell you about it, we’re going to show it to you in two preview videos made by the app’s developer, Indice Limited.

Seriously… how awesome is that?

The Apollo app “hacks” Apple’s dual camera system and lets you do amazing things to a Portrait Mode photo after you’ve captured it. You can change the direction of light sources, add and remove light sources, adjust brightness and even color, and plenty more. We’ve been using it since it was first released last week, and we’re blown away.

Here’s the story behind the development of the app, as told by Apollo creator Indice Limited on Reddit:

Apollo is the first application to use the depth data of portrait mode photographs, to realistically add light sources to a scene. Development of the app began as an experiment back in November 2017, when we first got our hands on a brand new iPhone 8+. We wanted to see what could be achieved by taking advantage of the depth information of portrait photos. Our hypothesis looked simple: if depth information can be superimposed on a 2D photo, it should be possible to re-illuminate objects with custom light sources.
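A quick aside from us, not from Indice: Apple exposes that depth information as auxiliary data attached to a Portrait Mode photo, and reading it with Image I/O and AVFoundation looks roughly like this minimal Swift sketch (file handling and error handling simplified for illustration):

```swift
import AVFoundation
import ImageIO

// Minimal sketch: pull the disparity map Apple embeds in a Portrait Mode
// photo out of its auxiliary image data. Error handling is simplified.
func loadDepthData(from url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let depthData = try? AVDepthData(fromDictionaryRepresentation: auxInfo)
    else { return nil }

    // Convert to 32-bit float disparity so the values are easy to work with.
    return depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
}
```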

Of course, the first results were horrible. Our team stuck to the cause and tried to squeeze every last bit of information from the depth buffer. First we needed a method for deriving more depth points from the depth (disparity) map provided by the dual camera API. We algorithmically produced a new, denser map of depth points on the existing photo. Things were looking brighter, but still the visual effect of the computed lighting using the enriched depth map looked disappointing.
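Indice hasn’t published that densification algorithm, but to give a sense of the idea, here’s our own simplified sketch of one common approach, plain bilinear upsampling of the depth buffer (names and structure are illustrative, not Apollo’s code):

```swift
// Illustrative only: densify a low-resolution depth buffer by bilinear
// interpolation. Apollo's actual enrichment algorithm is not public.
func upsampleDepth(_ depth: [Float], width: Int, height: Int, scale: Int) -> [Float] {
    let outW = width * scale, outH = height * scale
    var out = [Float](repeating: 0, count: outW * outH)
    for y in 0..<outH {
        for x in 0..<outW {
            // Map the output pixel back into the source grid.
            let sx = Float(x) / Float(scale), sy = Float(y) / Float(scale)
            let x0 = min(Int(sx), width - 2), y0 = min(Int(sy), height - 2)
            let fx = sx - Float(x0), fy = sy - Float(y0)
            // Blend the four surrounding source samples.
            let top = depth[y0 * width + x0] * (1 - fx) +
                      depth[y0 * width + x0 + 1] * fx
            let bottom = depth[(y0 + 1) * width + x0] * (1 - fx) +
                         depth[(y0 + 1) * width + x0 + 1] * fx
            out[y * outW + x] = top * (1 - fy) + bottom * fy
        }
    }
    return out
}
```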

It was time for smoothing. We implemented different filters with various results. We needed a map of smooth contour lines that realistically follow the curves of the foreground objects. A special sauce of interpolation for enriching our map, along with some bilateral filtering for avoiding edge artefacts [sic] was the recipe that saved the day.
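Again, the exact recipe is Indice’s secret sauce, but a bare-bones bilateral filter, the edge-preserving smoothing technique they mention, looks something like this sketch of ours:

```swift
import Foundation

// Sketch of an edge-preserving (bilateral) smoothing pass: each depth value is
// averaged with its neighbours, but neighbours whose depth differs sharply get
// very little weight, so edges between foreground and background stay crisp.
func bilateralSmooth(_ depth: [Float], width: Int, height: Int,
                     radius: Int = 3, sigmaSpace: Float = 2.0,
                     sigmaDepth: Float = 0.05) -> [Float] {
    var out = depth
    for y in 0..<height {
        for x in 0..<width {
            let center = depth[y * width + x]
            var sum: Float = 0, weightSum: Float = 0
            for dy in -radius...radius {
                for dx in -radius...radius {
                    let nx = x + dx, ny = y + dy
                    guard nx >= 0, nx < width, ny >= 0, ny < height else { continue }
                    let sample = depth[ny * width + nx]
                    // Weight by spatial distance and by similarity in depth.
                    let spatial = exp(-Float(dx * dx + dy * dy) / (2 * sigmaSpace * sigmaSpace))
                    let range = exp(-(sample - center) * (sample - center) / (2 * sigmaDepth * sigmaDepth))
                    sum += sample * spatial * range
                    weightSum += spatial * range
                }
            }
            out[y * width + x] = sum / weightSum
        }
    }
    return out
}
```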

Armed with a high-quality depth map, we were able to deduce the normal map, which is fundamental for applying the lighting model of a 3D scene. Using a Phong-style lighting model, we had our first success!
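To unpack that a bit (our explanation, not the developer’s code): a normal map stores, for each pixel, the direction the surface is facing, which can be estimated from how the depth changes between neighbouring pixels; a Phong-style model then combines a diffuse term and a specular highlight for each light. A toy version in Swift using Apple’s simd types:

```swift
import Foundation
import simd

// Toy versions of the two steps: estimate a surface normal from the depth
// gradient at a pixel, then shade it with a Phong-style lighting model.
// All names and constants are illustrative, not Apollo's actual code.
func normalFromDepth(_ depth: [Float], width: Int, height: Int,
                     x: Int, y: Int) -> SIMD3<Float> {
    // Central differences of depth give the surface slope in x and y.
    let dzdx = depth[y * width + min(x + 1, width - 1)] -
               depth[y * width + max(x - 1, 0)]
    let dzdy = depth[min(y + 1, height - 1) * width + x] -
               depth[max(y - 1, 0) * width + x]
    return simd_normalize(SIMD3<Float>(-dzdx, -dzdy, 1))
}

func phongShade(normal: SIMD3<Float>, lightDir: SIMD3<Float>, viewDir: SIMD3<Float>,
                albedo: SIMD3<Float>, lightColor: SIMD3<Float>,
                shininess: Float = 32) -> SIMD3<Float> {
    let n = simd_normalize(normal)
    let l = simd_normalize(lightDir)              // surface-to-light direction
    let diffuse = max(simd_dot(n, l), 0)
    let r = simd_reflect(-l, n)                   // light direction mirrored about the normal
    let specular = pow(max(simd_dot(r, simd_normalize(viewDir)), 0), shininess)
    return albedo * lightColor * diffuse + lightColor * specular
}
```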

At this stage, computation of depth information for a portrait photo took roughly 45 seconds, leading to very bad UX. It was time to move closer to the GPU! Our algorithm was first broken down to take advantage of multiple threads. Then all computations were rewritten for the Metal 2 SDK. Loading time dropped to around 3 seconds, a staggering improvement!
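For readers who haven’t touched Metal, the host-side plumbing for a GPU compute pass looks roughly like the sketch below. The kernel name "relightKernel" is made up for illustration; Apollo’s actual shaders aren’t public.

```swift
import Metal

enum RelightError: Error { case setupFailed }

// Sketch of dispatching a per-pixel compute kernel with Metal.
// One thread is launched per pixel, grouped into 8x8 threadgroups.
func makeRelightPipeline() throws -> (MTLCommandQueue, MTLComputePipelineState) {
    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue(),
          let library = device.makeDefaultLibrary(),
          let kernel = library.makeFunction(name: "relightKernel")  // hypothetical kernel name
    else { throw RelightError.setupFailed }
    return (queue, try device.makeComputePipelineState(function: kernel))
}

func encodeRelight(queue: MTLCommandQueue, pipeline: MTLComputePipelineState,
                   input: MTLTexture, output: MTLTexture) {
    guard let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeComputeCommandEncoder() else { return }
    encoder.setComputePipelineState(pipeline)
    encoder.setTexture(input, index: 0)
    encoder.setTexture(output, index: 1)
    let group = MTLSize(width: 8, height: 8, depth: 1)
    let groups = MTLSize(width: (input.width + 7) / 8,
                         height: (input.height + 7) / 8, depth: 1)
    encoder.dispatchThreadgroups(groups, threadsPerThreadgroup: group)
    encoder.endEncoding()
    commandBuffer.commit()
}
```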

The next step was to expose all configurable parameters to the user. When our UX team started work on the project, there were dozens of parameters to tweak. That was no good; we needed a minimal set of parameters that give the user full control without being overwhelming. After lots of iterations, we narrowed down our list to six parameters: 2 global settings and 4 light-source-specific parameters.
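Those final parameter names aren’t documented publicly, but a “two global plus four per-light” split could look something like this hypothetical structure:

```swift
import simd

// Hypothetical illustration of a "2 global + 4 per-light" parameter split.
// Apollo's real parameter names and ranges are not public.
struct GlobalLightingSettings {
    var ambientIntensity: Float    // overall base brightness of the scene
    var depthContrast: Float       // how strongly depth shapes the relighting
}

struct LightSourceSettings {
    var position: SIMD3<Float>     // where the light sits relative to the subject
    var intensity: Float
    var color: SIMD3<Float>
    var falloff: Float             // how quickly the light fades with distance
}
```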

Apollo truly is a unique and fantastic app, and it’s available right now on the App Store.

Zach Epstein

Zach Epstein has worked in and around ICT for more than 15 years, first in marketing and business development with two private telcos, then as a writer and editor covering business news, consumer electronics and telecommunications. Zach’s work has been quoted by countless top news publications in the US and around the world. He was also recently named one of the world's top-10 “power mobile influencers” by Forbes, as well as one of Inc. Magazine's top-30 Internet of Things experts.



