When the Google Pixel 2 and Pixel 2 XL shipped last year, they weren’t actually operating to their full potential. Both phones contain a chip called the Pixel Visual Core, which is custom-designed to handle photography tasks like stitching together HDR images and applying machine learning algorithms to improve your photos.

Google enabled the Pixel Visual Core within the default camera app with the Android 8.1 beta back in November of last year, but third-party photo apps have been kept out of the fun until now.

Thanks to an update rolling out to the Pixel 2 family today, third-party apps like Instagram, Snapchat and WhatsApp can now use the Pixel Visual Core to enhance photography. The rollout begins today, and everyone should receive the update within the next couple of days. Google is also planning to push out augmented reality camera stickers to celebrate the Winter Olympics later this year.

Google says that its custom chip is not only more powerful for processing photography tasks, but also more efficient in terms of battery usage:

With eight Google-designed custom cores, each with 512 arithmetic logic units (ALUs), the IPU delivers raw performance of more than 3 trillion operations per second on a mobile power budget. Using Pixel Visual Core, HDR+ can run 5x faster and at less than one-tenth the energy of running on the application processor (AP).

Pixel Visual Core is built to do heavy-lifting image processing while using less power, which saves battery. That means we’re able to use that additional computing power to improve the quality of your pictures by running the HDR+ algorithm.

The Pixel 2 XL has been lauded as the only phone camera that challenges the iPhone X’s camera setup, and Google’s photography algorithms are a big part of that. HDR+ is a machine-learning-enhanced take on a standard HDR photograph: it stitches together several shots of the same scene, taken at different exposures, to recover detail in shadows and prevent blown-out highlights.
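To get an intuition for the exposure-merging step described above, here is a deliberately simplified sketch in Python. It is not Google's HDR+ pipeline (which aligns a burst of frames and applies learned tone mapping on the Pixel Visual Core); it just shows the core idea of weighting each pixel by how well exposed it is, so shadow detail comes from the brighter frame and highlight detail from the darker one. The function name, the Gaussian weighting, and the `sigma` parameter are all illustrative assumptions.

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Naively fuse same-scene frames taken at different exposures.

    Each frame is a float array with values in [0, 1]. Pixels near
    mid-gray (0.5) are treated as well exposed and receive a higher
    blending weight, so the fused image keeps shadow detail from the
    long exposure and highlight detail from the short one.
    (Illustrative only -- not the actual HDR+ algorithm.)
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    # Gaussian "well-exposedness" weight: peaks at 0.5, near zero at 0 and 1.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Two toy 2x2 "exposures": one underexposed, one mostly blown out.
dark = np.array([[0.05, 0.10], [0.45, 0.50]])
bright = np.array([[0.55, 0.60], [0.95, 1.00]])
fused = fuse_exposures([dark, bright])
```

In the toy example, the fused top-left pixel lands near the bright frame's 0.55 (the dark frame's 0.05 is nearly black, so it is down-weighted), while the bottom-right pixel stays near the dark frame's 0.50 rather than the blown-out 1.00.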
