Subhendu Ghosh

Originally published at codeflu.blog

Glow LEDs with Google Home

Recently I experimented with Google Home, trying to voice-control LEDs. Broadly, the whole thing can be split into two parts:

  1. A custom command that makes a web POST request to fetch the result.
  2. A simple Flask app that receives the POST request with its parameters and glows some LEDs based on the request data.

For part one, the custom commands are possible thanks to the Google Actions APIs. I used API.AI for my purpose since they have good documentation. I won't go into detail explaining the form fields in API.AI; their docs do a good job of that, so I will just share screenshots of my configuration for quick reference and understanding. In API.AI, conversations are broken into intents. I used one intent (Default Welcome Intent) and a follow-up intent (Default Welcome Intent – custom) for my application.

(Screenshot: top-level intents)

Here's my first intent, which greets the user and asks for an LED colour when the custom command “glow LEDs” is activated.

(Screenshot: first intent)

As you can see, the User says field is what defines my command; you can add multiple statements with which you want to activate the command. The Action and Contexts fields are set when you create a follow-up intent. Text response is the part your Google Home will use as the response.

Next is the follow-up intent, which takes the user response as an input context (handled automatically when you create the follow-up intent), looks for the required parameters, and tries to process the request.

(Screenshot: user interaction with the follow-up intent)

Here the expected User says is a colour; red, blue, and green are what I allowed. In API.AI you can use their ML to process the speech and find your needed parameters and values. I needed colours, hence I used @sys.color. There are other entities like @sys.address, @sys.flight, etc. If these entities don't serve your purpose, you might want to go vanilla and process the speech on your web API end. The latter part of the follow-up intent is a bit different: here we are fulfilling the user request via a webhook. The Response is the fallback response in case the web request fails; the success response is received from the webhook response body.
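For reference, that success response is just a small JSON body returned by the webhook. As a rough sketch (the field names here follow the API.AI v1 webhook docs linked at the end), it looks something like this:

```python
# Rough sketch of the webhook's success response body (API.AI v1-style fields).
webhook_success_response = {
    "speech": "Glowing the red LED now.",       # what Google Home says aloud
    "displayText": "Glowing the red LED now.",  # what shows up in the simulator
    "source": "led-webhook",                    # identifies your fulfilment service
}
```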

(Screenshot: Google Home response)

The fulfilment option won't be activated until you add your webhook in the Fulfillment section. That's all for part one. You can also use the Google Web Simulator to test your application on the go.

(Screenshot: webhook configuration in the Fulfillment section)

In part two, I used a Raspberry Pi, three LEDs (red, blue, green), a 1K ohm resistor, some wires, a breadboard (optional) and a T-cobbler board (optional). Now we will write a Flask application that accepts a POST request and sets the required GPIO pin output high or low.
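Below is a minimal sketch of such a Flask app. The BCM pin numbers (17, 27, 22), the route names and the colour-to-pin mapping are just placeholders of mine, so adapt them to your wiring; the webhook route pulls the colour out of the API.AI request body and replies with a speech response.

```python
# Minimal sketch: Flask webhook that drives three LEDs on a Raspberry Pi.
# Assumed wiring (BCM numbering): red -> GPIO17, blue -> GPIO27, green -> GPIO22.
from flask import Flask, jsonify, request
import RPi.GPIO as GPIO

app = Flask(__name__)

LED_PINS = {"red": 17, "blue": 27, "green": 22}  # colour -> BCM pin (assumed)

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
for pin in LED_PINS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)


def glow(colour):
    """Turn on the requested LED and switch the others off."""
    if colour not in LED_PINS:
        return False
    for name, pin in LED_PINS.items():
        GPIO.output(pin, GPIO.HIGH if name == colour else GPIO.LOW)
    return True


@app.route("/led/<colour>", methods=["GET"])
def led_get(colour):
    # Simple GET route so the circuit can be tested locally from a browser or curl.
    if glow(colour.lower()):
        return "OK: %s LED on" % colour
    return "Unknown colour: %s" % colour, 400


@app.route("/webhook", methods=["POST"])
def webhook():
    # API.AI sends the parsed speech in the JSON body; resolvedQuery holds the
    # raw user utterance and parameters holds the @sys.color value.
    req = request.get_json(silent=True) or {}
    result = req.get("result", {})
    colour = (result.get("parameters", {}).get("color")
              or result.get("resolvedQuery", "")).lower()
    if glow(colour):
        speech = "Glowing the %s LED." % colour
    else:
        speech = "Sorry, I only have red, blue and green LEDs."
    return jsonify({"speech": speech, "displayText": speech})


if __name__ == "__main__":
    # Listen on all interfaces so the pagekite tunnel can reach it.
    app.run(host="0.0.0.0", port=5000)
```

Anything that answers on the webhook route with that JSON shape will do; the GET route is only there so the wiring can be checked from the local network.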

You can check the exact request and response structure you need in the API.AI docs. This application receives the calls from the API.AI webhook and triggers the targeted LED depending on the resolvedQuery. The code above was also written so that I can test locally with GET requests. I used pagekite.net to tunnel and expose my Flask application to the external world. Following is the circuit diagram for the connections.

(Image: circuit diagram)
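To test locally, as mentioned above, before the pagekite tunnel is in place, something along these lines works; the hostname, port and payload values are illustrative, and the payload shape mirrors what the API.AI webhook sends.

```python
# Quick local tests against the sketch above (hostname, port and values are illustrative).
import requests

# 1. Plain GET, handy while checking the wiring from the Pi itself.
print(requests.get("http://localhost:5000/led/red").text)

# 2. Simulated API.AI webhook call, roughly the shape the v1 API sends.
fake_apiai_request = {
    "result": {
        "resolvedQuery": "blue",
        "parameters": {"color": "blue"},
    }
}
resp = requests.post("http://localhost:5000/webhook", json=fake_apiai_request)
print(resp.json()["speech"])
```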

Following is the result.

Some more Reads:

  1. https://arstechnica.com/gadgets/2016/12/google-assistant-api-launches-today-so-we-tested-some-custom-voice-commands/
  2. https://docs.api.ai/docs/actions-on-google-integration
  3. https://developers.google.com/actions/develop/conversation
  4. https://developers.google.com/actions/develop/apiai/tutorials/getting-started
  5. https://developers.google.com/actions/samples/
  6. https://docs.api.ai/docs/webhook
  7. https://docs.api.ai/docs/concept-intents#user-says

 
