We’re all expecting the computing future envisioned by Minority Report to actually happen, and I’m mostly talking about the fantastic user interface from the movie, although a system that can predict crime might also come in handy. Tom Cruise’s character uses a sophisticated computer that can interpret spatial gestures and provide instant responses. The entire room becomes a huge 3D screen where any type of gesture can be recognized.
Well, we’re not quite there yet. But a company called Qeexo is reinventing multitouch so that it feels more natural to the user, and I got to see it in action at Mobile World Congress.
Touching a display or touchpad is now the norm when it comes to interacting with devices, whether they’re smartphones, tablets, or PCs. And these devices can recognize a variety of touch patterns as long as you remember what those patterns are. But what if you could touch the screen in a more intuitive way, one that would come naturally without having to remember what a three-finger swipe to the left does?
That’s how Qeexo CEO Sang Won Lee explained the company’s TouchTools software to me.
Pretend you’re picking up an eraser while touching the screen, and an eraser will magically appear. Move your hand around while maintaining contact with the screen and you’ll erase everything the eraser touches. Want to write? Use your finger, and a pen will appear on the screen. Want to change colors or settings for your pen? Just mimic rotating a dial, and one will appear on screen with the extra menus.
Swipe up with one hand, just as you would position a real ruler on a piece of paper, and a ruler will appear and stay on the screen so you can draw a straight line with your other hand.
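Stripped of the theatrics, the interaction model is easy to describe: the software figures out what kind of grip your hand is making on the glass, summons the matching tool, and dismisses it the moment contact ends. Here’s a minimal, purely hypothetical sketch of that dispatch loop in Python; the grip names, tool objects, and event handlers below are mine, not Qeexo’s:

```python
# Purely hypothetical sketch (not Qeexo's API): dispatch a classified grip
# to a virtual tool that exists only while your hand stays on the screen.
from dataclasses import dataclass
from enum import Enum, auto


class Grip(Enum):
    """Grip classes a touch classifier might emit (names invented here)."""
    ERASER_GRASP = auto()  # hand shaped as if holding an eraser
    PEN_TIP = auto()       # single fingertip
    DIAL_TWIST = auto()    # fingers rotating around a center point
    RULER_EDGE = auto()    # hand swiped upward as if placing a ruler


@dataclass
class Tool:
    name: str
    active: bool = False

    def show(self) -> None:
        self.active = True
        print(f"{self.name} appears under your hand")

    def hide(self) -> None:
        self.active = False
        print(f"{self.name} disappears, freeing the screen for content")


# Each recognized grip summons its matching tool.
TOOLS = {
    Grip.ERASER_GRASP: Tool("eraser"),
    Grip.PEN_TIP: Tool("pen"),
    Grip.DIAL_TWIST: Tool("settings dial"),
    Grip.RULER_EDGE: Tool("ruler"),
}


def on_touch_down(grip: Grip) -> Tool:
    """Called when contact starts: show the tool that matches the grip."""
    tool = TOOLS[grip]
    tool.show()
    return tool


def on_touch_up(tool: Tool) -> None:
    """Called when contact ends: the tool vanishes immediately."""
    tool.hide()


if __name__ == "__main__":
    tool = on_touch_down(Grip.ERASER_GRASP)  # pretend to pick up an eraser
    on_touch_up(tool)                        # lift your hand and it's gone
```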
The software can simulate a camera as well, or a tape measure (one that actually lets you measure on-screen elements), and they’re all summoned by touch alone. These tools disappear from the screen the moment you lift your finger, freeing up as much space as possible for content. Here’s a video demo of Qeexo’s tech:
The best part is that the technology can be used on any device that accepts touch input. That could be a big-screen TV (the kind of device I tried it on) or something smaller, like a car’s infotainment system. The host device can run Android, like the huge screen above, but the software also works on an iPad or just about anything else.
Do we really need it? Imagine controlling the volume in your car by reaching over to the touchscreen and making a motion similar to what you’d do with an actual physical volume knob. All of that would happen without you having to take your eyes off the road, and without having to remember a special gesture.
Imagine using the tech in a classroom instead of regular chalk and a blackboard. It would save time: no physically cleaning the blackboard, and no digging through menus to find a certain feature the way you would on a conventional digital device.
I was even shown an iOS version of the app that simulates mouse interaction, complete with buttons and a scroll wheel that actually work.
All of that might be available in the future on a wide range of devices. The only thing Qeexo could tell me at the show was that it’s aiming for a 2017 launch of a version of what I saw on the show floor. Did I also mention that there’s on-device machine learning involved in all of this? Yes, the future does look interesting, even if we’re not quite at Minority Report levels yet.
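For a rough sense of what that on-device classification step might involve, here’s a toy sketch in Python. The features, prototypes, and numbers are invented purely for illustration; Qeexo hasn’t published how its models work, and the real thing is certainly far more sophisticated:

```python
# Toy illustration only: an on-device touch classifier in spirit. Every
# feature, prototype, and number below is invented for this sketch and is
# not Qeexo's actual model.
from dataclasses import dataclass


@dataclass
class TouchFrame:
    num_contacts: int      # how many contact points the panel reports
    total_area_mm2: float  # combined contact area of all touches


# Hand-tuned prototypes standing in for a trained model.
PROTOTYPES = {
    "pen tip": TouchFrame(num_contacts=1, total_area_mm2=50.0),
    "dial twist": TouchFrame(num_contacts=2, total_area_mm2=110.0),
    "eraser grasp": TouchFrame(num_contacts=5, total_area_mm2=600.0),
}


def classify(frame: TouchFrame) -> str:
    """Nearest-centroid rule: pick the prototype closest to the observed frame."""
    def distance(a: TouchFrame, b: TouchFrame) -> float:
        # Scale the area so both features contribute comparably.
        return (a.num_contacts - b.num_contacts) ** 2 + \
               ((a.total_area_mm2 - b.total_area_mm2) / 100.0) ** 2
    return min(PROTOTYPES, key=lambda label: distance(frame, PROTOTYPES[label]))


if __name__ == "__main__":
    print(classify(TouchFrame(num_contacts=1, total_area_mm2=45.0)))   # -> pen tip
    print(classify(TouchFrame(num_contacts=4, total_area_mm2=520.0)))  # -> eraser grasp
```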
Until we get to see TouchTools in commercial products, you can try a version of it on Huawei phones like the P10, or on some of last year’s handsets. The tech is called FingerSense, and it builds on much of the same underlying technology. The phone knows how you touch the screen, whether with a finger or a knuckle, and whether you’re drawing a line or a letter, and it responds accordingly. It also works as soon as the display is turned on. Here’s a demo of FingerSense in action: