The Pixel 4 will have a few distinctive signature features that make it sound like a better proposition, at least on paper, than other Android devices, and Google already confirmed them a few months ago. First, Google teased the dual-lens camera on the back, which makes the Pixel 4 the first Google phone with two rear-facing cameras. Then the company announced its own 3D face recognition system as well as the radar-based Motion Sense functionality. The latter uses a Soli chip and sensors to pick up hand gestures and let users interact with the phone without touching it. That sounds great in theory, and might one day make possible gesture-only user interfaces like the ones seen in Minority Report. But a series of leaks suggests that, in practice, the Pixel 4's gesture recognition system is pretty boring, and you might end up not using it at all.
Motion Sense will let you silence alarms and calls, and skip songs, Google said when it unveiled the functionality. Meanwhile, 9to5Google obtained videos that show how the functionality works on the Pixel 4.
All you have to do to silence an alarm is wave your hand over the phone. To silence a phone call you… wave your hand over the phone. You want to know how you skip songs? You. Wave. Your. Hand.
Motion Sense doesn’t seem to offer any sort of magical gesture that would make you want to use it, or even remember that it’s enabled. Sure, waving your hand to get all of that done might be easy and convenient, but we’re already so well trained to silence alarms and calls, and skip songs, that we might just stick to briefly touching the phone to do it. There’s no new gesture for turning the volume up or down, or for any other activity for that matter.
Motion Sense might be an excellent innovation for computers of all sizes, but it sure seems like an unnecessary gimmick added to the Pixel 4, one that forced Google to go for that huge top bezel on the handset.
The clips are all available at this link. If you want more proof of how boring these hand waves will be, also check out 9to5Google’s second Pixel 4 discovery. Google has partnered with Pokemon for a brief demo that teaches users what Motion Sense is all about. We’re still looking at lots of hand waving — represented in the clip by screen touches, because the blog doesn’t have a Pixel 4 yet.
Hopefully, Project Soli will deliver novel ways to interact with regular computers in the near future, because it sure looks like it’s a feature we don’t necessarily need on phones.
Finally, I’ll add that Google might be the first to use radar on a phone, but it’s not the first to add gestures to a flagship device. LG did the same thing with the G8 ThinQ earlier this year. The Z camera, or the time-of-flight (ToF) front-facing sensor, is supposed to pick up the gestures, including hand waves, pinch-to-capture, and palm motions. The phone will let you switch between apps and raise the volume, provided you get the hang of it, but the whole thing is far from intuitive. LG’s Air Motion also seems to be more sophisticated than Google’s Motion Sense when it comes to the types of gestures it can recognize — here’s how it all works: