Are there any success stories of integrating with Alexa? I would like to add this feature to my system, but I've run into some issues.
The trick is writing a "bridge" program for communication between AWS Lambda and Netlinx. The way I did this was to create a NodeJS service that can run on just about anything (a Raspberry Pi, etc.). It is simply a RESTful API service (using Express) plus a standard TCP connection to the Netlinx controller.
Lambda makes RESTful calls in, which get turned into SNAPI-style commands to the controller via TCP.
As with anything else it just depends on how far you want to take it. I've written a couple different versions of the bridge program and Netlinx module.
It's all very doable with a little bit of reading the documentation and playing with a project or two.
The one issue I've had with this (and other similar technologies) is that Alexa, in and of itself, does a fairly good job of communicating back to the user, which in a way trains the user to issue commands properly. But when porting commands out to third-party controllers, it's essentially an IR remote. While the API is indeed two-way, it does leave you managing human speech responses on the AMX side. In other words: your users must learn to speak the specific commands verbatim. Porting out the various ways you can tell Alexa to turn on the lights can get a bit cumbersome in AMX land.
But as long as your users know there may be only one or a few acceptable commands to do something, and that the only real feedback that those commands worked is that whatever was supposed to happen did happen, it works quite well.
I will also say that I have not yet found a client who actually likes the feature after a time. I mainly mess with it myself. I'm in the camp that good voice control will probably happen around the same date as the first viable fusion reactor powered electric plant.
You might find the easiest way to implement it is to leverage an existing 3rd party connection. I use Lutron. I know it's a bit of a cheat, but I've been using it since it rolled out and it works without fuss. I use the Lutron RadioRA2-Connect bridge combo and have several software-only keypads on the Lutron side that are simply there to receive button presses from the Lutron-Alexa integration. The NetLinx end can then be set up to do whatever you want when those keys are pressed.
Like Eric said, the commands must be issued verbatim, so anyone besides the person who came up with the scene name will invariably find themselves shouting commands at Alexa hoping they'll finally get it right. We originally started with two dozen commands and now find there are about eight that we use reliably. The wife still demands a printed list of commands sit near the kitchen device so she can remember what to say...
We implemented voice control years ago with TPControl. To get around the "verbatim" issue to a degree, we built a large database in the NetLinx that has 1500 entries... we clustered 10 entries per command, each with different wording. Since the data is file based, more and different commands can be entered by downloading the Excel file from the NetLinx, and the entries are plain text. We're still working on a reliable Alexa/Home link.
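The phrase-clustering idea above can be sketched like this: several wordings map to one canonical command, so the lookup itself absorbs the verbatim problem. The phrasings, command names, and cluster sizes below are invented examples, not the actual 1500-entry database:

```javascript
// Cluster several phrasings per command, mirroring the file-based approach.
// All entries here are made-up examples for illustration.
const phraseClusters = {
  'LIGHTS-ON': [
    'turn on the lights',
    'lights on',
    'switch the lights on',
    'make it bright in here',
  ],
  'LIGHTS-OFF': [
    'turn off the lights',
    'lights off',
    'kill the lights',
  ],
};

// Build a flat phrase -> command index, normalizing case and whitespace,
// so each incoming utterance is a single map lookup.
const phraseIndex = new Map();
for (const [command, phrases] of Object.entries(phraseClusters)) {
  for (const phrase of phrases) {
    phraseIndex.set(phrase.trim().toLowerCase(), command);
  }
}

// Resolve an utterance to its canonical command; null if unrecognized.
function resolveUtterance(utterance) {
  return phraseIndex.get(utterance.trim().toLowerCase()) ?? null;
}
```

Because the table is plain data, editing the equivalent of the Excel file (adding a new phrasing under an existing command) extends coverage without touching any control logic.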