Over the past few years, virtual assistants have become a standard feature on mobile and desktop devices alike. Siri is still a fixture in Apple’s advertising campaigns for the iPhone and iPad, and Cortana has gone from a neat addition to Windows Phone to a key feature of Windows 10 (and coming soon to iOS devices as well).
But although Microsoft’s and Apple’s assistants make most of the headlines, Google’s own search functionality has quietly been making huge improvements as well.
On Monday, Google took to its Inside Search blog to explain that the Google app is now smarter than ever: it can understand and respond to complex questions that it couldn't handle before.
As Google puts it, the app is beginning to understand both the meaning behind a question and the intent of the person asking it. Here are a few types of questions the app can now comprehend:
- Superlatives (tallest, largest, etc.)
- Ordered items (“Who are the tallest Mavericks players?”)
- Points in time (“What songs did Taylor Swift record in 2014?”)
- Complex combinations (“What was the U.S. population when Bernie Sanders was born?”)
If you have the app on your phone or tablet, try testing the limits of what it can accomplish. Searching from an app isn't typically my first choice, but if I can start talking to the Google app like it's a human being, that might change.