
A tiny detail in Google’s new Assistant will make apps a lot smarter

Published May 19th, 2016 5:05PM EDT

Google is slowly building Her. In case you haven’t seen the 2013 sci-fi movie, it’s a disturbing love story about a regular guy falling for a voice-based operating system. And while Google is far from announcing an operating system that can respond by voice to all your queries and help out with day-to-day chores in an intuitive, human-like way, it’s certainly laying the foundation for such a future.

One of the most important announcements coming out of Google I/O 2016 is the Google Assistant, an evolution of Google Now that will offer even more advanced voice- and chat-based features. But Google is going a step beyond simply making Google Now an even more useful smartphone tool by bringing some of the Assistant’s smart features to other apps.

In its developer sessions, the company introduced new tools that will not only change the way we interact with our apps – by giving them Assistant-like powers – but also help save battery life and improve the phone’s overall performance.

The Assistant APIs will let Android app developers take advantage of the various other APIs that tell a phone where you are, what you’re doing, and what’s around you, so they can build automatic actions that happen without any input from the user. This isn’t Google Now gleaning information from the various apps you use. This is any app becoming a smaller assistant in its own right, able to operate independently of Google Now, using information about you, your likes, and your habits.

Developers will be able to read data from the various sensors in modern phones, assemble it into useful information, and generate responses. For example, the phone might know you went to bed later than usual and that your first meeting today starts later than usual, and it could automatically adjust the default alarm for the day.
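
The concrete piece developers get for this kind of context reading is Google’s Awareness API, which comes up again below in the Trulia example. What follows is only a minimal Kotlin sketch of its Snapshot side (pulling the user’s current activity and the local weather), assuming the play-services-awareness dependency, an Awareness API key in the app manifest, and a granted location permission; it uses the newer SnapshotClient entry point rather than the GoogleApiClient-based calls Google showed in 2016.

import android.content.Context
import com.google.android.gms.awareness.Awareness

// Sketch only: read the user's current context through the Awareness Snapshot client.
fun logCurrentContext(context: Context) {
    val snapshotClient = Awareness.getSnapshotClient(context)

    // What is the user most likely doing right now (walking, driving, still...)?
    snapshotClient.detectedActivity.addOnSuccessListener { response ->
        val activity = response.activityRecognitionResult.mostProbableActivity
        println("Most probable activity: $activity")
    }

    // What is the weather like at the device's current location?
    snapshotClient.weather.addOnSuccessListener { response ->
        println("Current weather: ${response.weather}")
    }
}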

Similarly, the phone would be able to turn on the TV via Chromecast in the morning and project the weather forecast on it, so you don’t have to search for it while choosing an outfit. After analyzing traffic conditions, the phone would then tell you that you need to hurry out the door so you’re not late for your meeting.

This would all happen in the background, without the user’s direct input. The apps would only wake up and link to each other when needed, and Google’s APIs would make sure the phone keeps running smoothly and that computing power and battery are conserved. Not to mention that you won’t be turning the screen on as frequently, which means you’d save additional battery life. Developers, meanwhile, won’t have to manually link up APIs to offer smart assistant-like features, and thus won’t have to deal with the resulting performance and/or battery-drain issues.

Any third-party app will be able to take advantage of the new API, not just system or Google apps. For example, real estate app Trulia has been working with the Awareness API to tailor its notifications so that they only show up on the screen under certain conditions. A user might receive a reminder about an open house only when he or she is in the neighborhood, only if he or she is walking rather than driving or running, and only if it’s sunny outside. The app would read all this data from the phone’s sensors.
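
For a sense of what such a rule might look like in code, here is a hedged Kotlin sketch using the Awareness Fence API, combining a walking condition with a radius around an open-house location; the coordinates, fence key and broadcast action are made up for illustration, and a weather check like Trulia’s would be read separately (for instance via the Snapshot client sketched above).

import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import com.google.android.gms.awareness.Awareness
import com.google.android.gms.awareness.fence.AwarenessFence
import com.google.android.gms.awareness.fence.DetectedActivityFence
import com.google.android.gms.awareness.fence.FenceUpdateRequest
import com.google.android.gms.awareness.fence.LocationFence

// Sketch only: register a fence that fires a broadcast when the user is
// walking within roughly 500 meters of a (hypothetical) open-house location.
fun registerOpenHouseFence(context: Context) {
    val walkingNearby: AwarenessFence = AwarenessFence.and(
        DetectedActivityFence.during(DetectedActivityFence.WALKING),
        LocationFence.`in`(37.4220, -122.0841, 500.0, 0L) // placeholder coordinates
    )

    val pendingIntent = PendingIntent.getBroadcast(
        context,
        0,
        Intent("com.example.action.OPEN_HOUSE_FENCE"), // hypothetical action string
        PendingIntent.FLAG_UPDATE_CURRENT
    )

    Awareness.getFenceClient(context).updateFences(
        FenceUpdateRequest.Builder()
            .addFence("openHouseFence", walkingNearby, pendingIntent)
            .build()
    )
}

A BroadcastReceiver listening for that intent would then decide whether to post the open-house notification, so the app itself can stay asleep until the system reports that the fence has triggered.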

Yes, Google and other apps will know more about you if you want these features to work, but they’ll be opt-in, meaning developers will have to ask your permission before offering Assistant-like features in their apps. Or, at least, Google did say during the presentation that it values user privacy, so that’s something.

Google has yet to come up with a human-like OS like the one in Her, but the smartphone of the immediate future looks better and better. Not only will the phone be able to hold better, more human-like conversations with us, but it’ll also automate many of the things we do on a phone, possibly helping us stare at the screen less.

A video explaining Google’s vision follows below, and it’s worth watching.

Chris Smith, Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2008. When he’s not writing about the most recent tech news for BGR, he brings his entertainment expertise to Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming almost every new movie and TV show release as soon as it's available.