Google will launch the Pixel 4 phones in less than two weeks, but the cat is out of the bag. With each passing day, we learn more details about the handsets, and the leaks are more thorough than last year's. That's because several people had access to the phone and were able to extract and inspect some of the Pixel 4's new apps, including the Camera, Recorder, Motion Sense, and Assistant. Of those, there's really just one feature I'd like Apple to copy for the iPhone, especially now that the app's functionality has leaked in full.
I've often told you how much the Pixel phones have copied the iPhone, and the Pixel 4 is no exception. It took Google two years to do it, but it finally figured out 3D face recognition for Android, which will be part of the Pixel 4. It doesn't really matter that Google had 2D face recognition before Apple did. Without the iPhone X, there's no telling whether Google would ever have attempted 3D face recognition, or whether it would have completely overhauled navigation on Android.
Google has hardly taken any significant risks with the Pixel line so far, aside from making it expensive, but the Pixel 4 is different. The phone will introduce a feature no other phone has — a built-in radar chip that detects gestures. However, that’s not something we need on smartphones, at least according to what we’ve seen in leaks so far, and that’s definitely not something I’d want on the iPhone.
The feature I was referring to is the "new Google Assistant," as Google is marketing it, and specifically the new continued conversation feature. We saw it in action a few months ago at Google I/O, where we were told the upgraded app is coming with new Pixel hardware this fall. At the time, we speculated that Google used a Pixel 4 prototype to demo the feature, and, fast-forward a few months, we got to see the new Assistant running on a leaked Pixel 4.
This brings us to 9to5Google’s in-depth look at the new app.
What's remarkable about the Assistant is that Google shrank the software from 100GB to 0.5GB, small enough to run directly on the device. As a result, you won't have to wait for answers, and you can control the device without an internet connection. In addition to handling commands that require searching online for answers, the Assistant can also perform tasks inside apps, like moving photos around, or on the phone itself, like toggling flight mode or the flashlight. You can ask Siri to do the same things on the iPhone 11 or any other iOS 13 device, but Siri will warn you that it can't operate if you turn on airplane mode.
The standout feature of the new Assistant is the “Continued Conversation” mode that allows you to interact with the voice assistant multiple times after you’ve invoked it. The Assistant is aware of the context and adapts accordingly, allowing you to multitask efficiently without ditching the voice assistant.
In one example, a person gets a text from a contact asking when their flight lands. Without leaving the chat app, the person asks the Assistant, receives the answer, and then instructs the Assistant to send a text message to the contact with that answer. That's just one example, but the conversations with the Assistant are far more advanced than what you can do with Siri on the iPhone. Google's mesmerizing demo at I/O proved how much the app has evolved.
Of course, with Google being Google, a company that started caring about user privacy only recently, you also have to entrust the Assistant with all of your data to allow for such deep interaction. You can't have a continued conversation with it if it can't read your email for details about your next flight.
It's unclear when the new Assistant will be available on other Android handsets, but it wouldn't be surprising for Google to make it a Pixel 4 exclusive for a few months. Some of the clips that Google will use to demo the Pixel 4's new Assistant, as well as the new interface that Google has conceived, are available over on 9to5Google.