During Arc Touch’s recent third-annual Hackathon, we learned a lot about some emerging Internet of Things platforms. From chatbots and wearables to home assistants and augmented reality, we dove into the Internet of Things world around us in an effort to understand how businesses, developers and users can benefit from these new platforms.
Google Home was the focus of a few of our projects, for good reason. It’s among the newest Internet of Things platforms, having been released late last year. And when you start to think about what it will take to make Google Home, or any home assistant, a mass commercial success (advanced A.I., machine learning and contextual awareness), Google is well positioned to make it happen.
If you’re heading up a business or product line and thinking about how to engage with consumers in the home, the time is now to start learning about Google Home (along with its rival home assistant, Amazon Echo). And that’s exactly why we got our hands dirty and hacked up some proof-of-concept apps (or “actions,” as Google refers to them). Here are nine things we learned in the process:
1. The future is smart and you may not need to set your alarm
As we mentioned before, Google Home today is an interface that responds to the user and to other devices, but it doesn’t suggest or proactively engage a user. In the near future, we envision Google Home being able to start and hold complex conversations. Google Home will achieve its purpose once it becomes a “home concierge,” knowing your likes and dislikes and customizing itself based on your behavior and context.
How cool would it be if Google Home, powered by the Google Assistant, could make smart decisions on your behalf? Instead of you setting the alarm for a specific time, for example, Google Home could understand your morning routine, and calculate how much time you can sleep based on the traffic conditions on that day. It’ll wake you just in time so you won’t be late for work. That type of smart assistant might be coming sooner than we think.
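To make that idea concrete, here’s a minimal sketch of the arithmetic such an assistant would have to do. The calendar event, traffic estimate and routine length are hypothetical hard-coded values standing in for data a real assistant would pull from your calendar and a live maps service; none of this is an actual Google Home API.

```python
from datetime import datetime, timedelta

# Hypothetical inputs -- in a real assistant these would come from your
# calendar and a live traffic service, not hard-coded values.
first_meeting = datetime(2017, 6, 12, 9, 0)   # first event on your calendar
commute_minutes = 42                          # current drive time with traffic
morning_routine_minutes = 35                  # learned from your past behavior

def compute_wake_time(meeting, commute_min, routine_min, buffer_min=10):
    """Work backwards from the meeting: routine + commute + a small buffer."""
    total = timedelta(minutes=commute_min + routine_min + buffer_min)
    return meeting - total

alarm = compute_wake_time(first_meeting, commute_minutes, morning_routine_minutes)
print(f"Setting alarm for {alarm:%H:%M} based on today's traffic")
```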
2. But you might need to know more than just coding
A big part of building any voice app is writing the algorithms used to train an agent to understand natural language as it applies to specific subjects. For example, how do you get Google Home to understand key terms about topics ranging from food to football? Google has some domains mapped and available for anyone to use. But it’s fairly limited right now, as you might expect. Eventually, AI-driven bots might be able to go off and educate themselves on different topics. But until then, it won’t be enough to know how to develop the bot; you’ll need to be an expert on the subject your action is addressing, too.
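Here’s a rough sketch of what that subject-matter work looks like for a food-ordering agent. The dishes and synonyms below are invented for illustration, and in API.AI you would normally define them as a custom entity in the console rather than in code, but the point stands: someone has to know the domain well enough to enumerate it.

```python
from typing import Optional

# Before the agent can "understand" a topic, someone who knows the domain has
# to enumerate its entities and synonyms. These values are illustrative only.
DISH_ENTITY = {
    "carbonara": ["spaghetti carbonara", "carbonara pasta"],
    "feijoada": ["black bean stew", "brazilian feijoada"],
    "moqueca": ["fish stew", "bahian fish stew"],
}

def resolve_dish(utterance: str) -> Optional[str]:
    """Map whatever the user actually said back to a canonical dish name."""
    text = utterance.lower()
    for canonical, synonyms in DISH_ENTITY.items():
        if canonical in text or any(s in text for s in synonyms):
            return canonical
    return None  # unknown dish -- a good agent would ask a follow-up question

print(resolve_dish("I'd like the black bean stew, please"))  # -> feijoada
```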
3. Is Google Home better than Amazon’s Alexa?
Maybe. Google Home has the advantage when it comes to user context. The integration with API.AI makes some things easier, such as leveraging synonyms for keywords, which ends up making a huge difference in the final user experience. Both Alexa and Google adopt the W3C’s SSML (Speech Synthesis Markup Language), which is to voice apps what HTML is to the web: its tags make conversations sound more natural. Right now, Alexa seems to work better with different accents, but Google Home’s voice sounds more human and less robotic. Google Home also seems more sensitive at picking up voices, hearing from greater distances than the Amazon Echo. And as a developer, what if you want to support both platforms? Unfortunately, it’s not that simple. You have to create a thin layer for each and keep all the business logic on a server.
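To illustrate what that thin layer might look like, here’s a small Python sketch with the business logic in one shared function and per-platform adapters around it. The response shapes are simplified placeholders rather than the exact Alexa or API.AI schemas, but they show how SSML lets the same answer sound natural on both devices.

```python
# Shared business logic lives in one place; each platform gets a thin adapter
# that wraps the result in its own response format. Field names are simplified
# for illustration, not the exact Alexa or Actions on Google payloads.
def answer_question(question: str) -> str:
    """Shared business logic; returns SSML so pauses and emphasis carry over."""
    return (
        "<speak>"
        "The kitchen has <emphasis>three</emphasis> open orders."
        '<break time="300ms"/> Would you like the details?'
        "</speak>"
    )

def google_adapter(question: str) -> dict:
    ssml = answer_question(question)
    return {"speech": ssml, "displayText": "Three open orders."}

def alexa_adapter(question: str) -> dict:
    ssml = answer_question(question)
    return {
        "version": "1.0",
        "response": {"outputSpeech": {"type": "SSML", "ssml": ssml}},
    }

if __name__ == "__main__":
    print(google_adapter("how many open orders?"))
    print(alexa_adapter("how many open orders?"))
```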
4. Google Home has issues with accents
Arc Touch has a large team in Florianopolis, Brazil. During the hackathon, Google Home had a hard time understanding the accent of our Brazilian team members, despite the fact that our employees are fluent English speakers. This isn’t a new issue: voice apps on phones and tablets have a history of poor performance when it comes to understanding different accents. It’s just that this known issue is going to become even more challenging with Google Home, Amazon Alexa and other voice-only interfaces, where there is no touchscreen available as a backup. One of our hackathon projects designed for Google Home could help staff organize and manage dish orders in a professional restaurant. Can you think of a place with more ethnic diversity than a restaurant kitchen? We think Google is poised to solve this problem, given its history and expertise with data and machine learning.
5. Google Home is always listening, sort of
Google Home is always listening for the “OK, Google” command, ready to be activated. During the recent Super Bowl, an ad for Google Home triggered devices for owners who had them in close proximity to the TV. This always-on state has generated buzz in industry circles about privacy and whether Google is storing every nearby conversation that takes place. And if it is, how might Google use that information? According to the company’s privacy policy, Google Home “listens in short (a few seconds) snippets for the hot word. Those snippets are deleted if the hot word is not detected.” After you engage Google Home with “OK, Google,” your privacy settings dictate what information Google collects. That information may be used to build your online profile, and of course, that online profile becomes highly valuable for services trying to reach you via advertising with targeted offers.