Google Can Teach Your Robot to Recognize You (and Your Mood)


Imagine a Roomba robot that not only recognizes you as you but can tell from the scowl on your face that now may not be the best time to vacuum around your feet.

That’s the scenario enabled by a new Google tool that will let developers train robots, toys, and appliances to know who you are and even figure out from your expression what mood you’re in—and react accordingly.

Google (GOOG) detailed the technology in a blog post on Wednesday. A test version of the Cloud Vision application programming interface, or API, is now available to developers.

At its most basic, the API lets developers build software that taps Google's cloud-based smarts in near-real time. In theory, that Roomba-of-the-future could see you in the room and send your image to Google's computers for analysis. Or it could be trained over time, by reviewing many images, to know that you are you, that a smile means you're in a good mood, and that a grimace means trouble. It would also know that the big fur ball over there is a cat and not a fuzzy slipper.
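As a rough illustration of that round-trip, here is a minimal sketch of building a face-detection request for the Cloud Vision REST endpoint and reading a mood out of the reply. The field names (`FACE_DETECTION`, `joyLikelihood`, and so on) follow the public v1 API; the helper functions and the mood thresholds are our own assumptions, not anything Google ships.

```python
import base64

# Public v1 endpoint for batch image annotation (API key or OAuth required).
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_face_request(image_bytes):
    """Build the JSON body for a single FACE_DETECTION annotate call."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "FACE_DETECTION", "maxResults": 1}],
        }]
    }

def read_mood(annotate_response):
    """Map the joy/anger likelihoods in an annotate response to a mood label."""
    faces = annotate_response["responses"][0].get("faceAnnotations", [])
    if not faces:
        return "no face found"
    face = faces[0]
    if face.get("joyLikelihood") in ("LIKELY", "VERY_LIKELY"):
        return "good mood"
    if face.get("angerLikelihood") in ("LIKELY", "VERY_LIKELY"):
        return "trouble"
    return "neutral"

# Abridged example of the response shape the v1 API returns:
sample = {"responses": [{"faceAnnotations": [
    {"joyLikelihood": "VERY_LIKELY", "angerLikelihood": "VERY_UNLIKELY"}]}]}
print(read_mood(sample))  # good mood
```

A real client would POST `build_face_request(...)` to the endpoint with credentials attached; the sketch skips the network call to focus on the request and response shapes.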

With the API, developers can apply tons of machine learning smarts to home or work applications. Machine learning, a branch of artificial intelligence, is how computers learn over time, from the data they process, to recognize people, things, words, just about anything.

All of that grunt work is enabled by the massive compute power of Google or Microsoft (MSFT) or Amazon (AMZN) public clouds, which harness a huge number of shared computers around the world. With all that firepower, the data can be crunched, parsed and served up as needed. To tell your vacuum cleaner who you are and that you may be in a crappy mood, for example.

Last month Google released TensorFlow, an artificial intelligence engine that can process images and sounds so it can recognize them later. Days later, Microsoft talked up new Project Oxford computer vision capabilities that will let computers gauge a person's emotional state—anger, contempt, fear, disgust, happiness, sadness, surprise—from her facial expressions. (An earlier iteration could estimate a person's age from her image.)


This technology could also be used by parents or businesses to quickly flag objectionable or inappropriate images before kids see them or to identify corporate logos to better hone searches for shoppers.
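Those uses map to feature types the API exposes alongside face detection. A sketch of asking for safe-search flags and logo matches in one call, again using the public v1 field names (the helper name and `maxResults` value are illustrative assumptions):

```python
import base64

def build_multi_feature_request(image_bytes):
    """Request safe-search screening and logo detection in a single annotate call."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [
                {"type": "SAFE_SEARCH_DETECTION"},         # flags adult/violent content
                {"type": "LOGO_DETECTION", "maxResults": 5},  # identifies corporate logos
            ],
        }]
    }
```

Because features are listed per request, a developer can screen an image and look for brand logos with one network trip instead of two.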

The good news is this could mean a new generation of smarter, more capable products and services. The bad news is that some of us may be creeped out by all of that stuff.

For more on what the tech can do, check out the Google video below:

[youtube https://www.youtube.com/watch?v=eve8DkkVdhI&w=560&h=315]

And for more on machine learning and artificial intelligence, check out Fortune's video.

For more from Barb, follow her on Twitter at @gigabarb, read her coverage at fortune.com/barb-darrow or subscribe via this RSS feed.

And please subscribe to Data Sheet, Fortune’s daily newsletter on the business of technology.
