Friday, May 25, 2018

Fun with Home Automation Part 1: The most convoluted doorbell

This is the first of a two-part series on using a music keyboard for home automation. Here we describe using a motion-detection system to trigger two audio events:

1.) MIDI notes on a keyboard (because why not)
2.) An announcement on your Google Home device (this might actually be useful to some people)

The result is that when someone approaches our front door, two notes are sounded on a music keyboard, and the Google Home speaker talks to us as well. The future is now!

In this example, I receive motion events from my security camera software (iSpy) via an HTTP GET request. The actual camera feed comes from a streaming Raspberry Pi server (running RPi Cam Web Interface). iSpy lets me view multiple streams and manage motion detection for each one. As a lazy option, I also store stills captured during motion events in a Dropbox folder, so I can view them easily on other devices without any custom application. This particular setup probably warrants a separate post at some future date...

Back to the main feature in question: the doorbell itself!

Here I'm using node-red again, running on a Raspberry Pi. Here's the overall system block diagram:


And here's what the flow looks like: [node-red flow code here]

On top of a stock node-red install, you will need the following two add-ons:

node-red-contrib-midi: for talking to the MIDI port
node-red-contrib-google-home-notify: for sending text-to-speech snippets to your Google Home device.

Relatively simple: at the top left you see the incoming motion GET request. For testing purposes, I actually assemble a simple HTML reply and shoot it back, so you can trigger this event from your browser and get back an HTML page with "OK" on it.
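If you're curious what that reply-building step looks like, here's a sketch of a node-red function node wired between the http in and http response nodes. The function name and page contents are my own illustration, not copied from the flow above:

```javascript
// Illustrative body of a node-red function node that sits between the
// "http in" node (which receives the motion GET request) and the
// "http response" node. It attaches a tiny HTML page so that hitting
// the endpoint from a browser shows "OK".
function buildMotionReply(msg) {
    msg.payload = "<html><body><h1>OK</h1></body></html>";
    msg.statusCode = 200;                          // plain success reply
    msg.headers = { "Content-Type": "text/html" }; // render as a page
    return msg;
}
```

In an actual function node, only the statements inside the function go into the node's code field (node-red supplies msg and expects you to return it).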

1. Triggering MIDI notes

The Raspberry Pi running the node-red server also has a MIDI keyboard (in this case, a cheap Casio CTK-2300) connected via USB. The midi out object will automatically find any class-compliant ports and list them in a drop-down menu.

To emit the MIDI events, we have two triggers in the middle of the flow that turn a particular MIDI note on and then off (you need to emit both ON and OFF messages, otherwise the key stays stuck forever, even after the sound fades out and becomes inaudible). I put a delay on the second note so they are played one after the other. The message format is simply an array containing the raw MIDI bytes. You can take a look at the trigger objects (or the midi out object's info) to see the exact messages (notes) I'm sending.
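For concreteness, here is what those raw byte arrays look like. The status bytes (0x90 for note-on, 0x80 for note-off, both on channel 1) are standard MIDI; the note numbers and velocity below are placeholders, not the exact notes from my flow:

```javascript
// Raw MIDI messages are plain byte arrays: [status, note, velocity].
// 0x90 = note-on on channel 1, 0x80 = note-off on channel 1.
const NOTE_ON = 0x90;
const NOTE_OFF = 0x80;

function noteOn(note, velocity) {
    return [NOTE_ON, note, velocity];
}

function noteOff(note) {
    return [NOTE_OFF, note, 0]; // release velocity 0 is fine here
}

// Two example chimes (middle C, then the E above it); in the flow the
// second pair sits behind a delay so they sound one after the other.
const doorbell = [noteOn(60, 100), noteOff(60), noteOn(64, 100), noteOff(64)];
```

Each of these arrays goes into msg.payload and straight into the midi out object.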

2. Triggering Google Home

When we first got our Google Home device, one thing I really wanted was the ability to emit custom events in the form of audio notifications on the speaker. It turns out one of the easiest ways is to use google-home-notifier. The gist of how it works: on a local network, you simply need to know the IP address of your Google Home speaker, and then audio can be directed straight to it! The notifier application does a bit of text-to-speech on your input, transmits the result to the Google Home, and that's it! Much simpler than I imagined. Obviously, if you want more complex two-way interactions, you'll probably have to dig into the actual Google Assistant API...
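On the flow side this step is tiny: as far as I can tell, the google home notify node simply speaks whatever string arrives in msg.payload, so the function node feeding it only needs to set that string. A sketch (the announcement wording is my own placeholder):

```javascript
// Illustrative body of a function node feeding the google home notify
// node. The notify node reads the text to speak from msg.payload;
// the announcement text here is a placeholder.
function buildAnnouncement(msg) {
    msg.payload = "Someone is at the front door.";
    return msg;
}
```

The speaker's name and IP address are configured once in the notify node itself, so the rest of the flow never has to think about them.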