For those who have been following my blog or tweets, you already know that I like to measure stuff and analyze the results.  So naturally I have been very interested in, and following, this new trend called the “Internet of Things”.  Simply put, the Internet of Things is internet-enabling all sorts of everyday items, from thermostats to toasters.

There are lots of reasons why internet-enabling everyday items is appealing, but the most appealing use case to me is the ability to monitor my environment with sensors and capture the data for analysis.  I have been watching this space and seeing many technologies enter the market; the problem I have with most of them is that they are closed ecosystems, meaning you have to use their proprietary hub, sensors, portal, etc.  That would be OK if a single vendor did everything I want, but they don’t.

So, I was really excited when I ran across the Spark Core and Spark Cloud.  In essence, the Spark Core is a fully integrated low-voltage microcontroller with a Wi-Fi module, flash storage, and programmable digital/analog input and output pins.  If that weren’t impressive enough, it is married with the Spark Cloud, a REST API and development environment providing full access to the Core over the internet.



This environment is exciting because it allows me to connect whatever sensor I would like to the Core and have it be internet-enabled without having to mess with a bunch of underlying transport issues.  This lets me focus on the sensors and the data rather than spending time figuring out how to internet-enable them.

I just missed the Kickstarter campaign, but got on the wait list for the next shipment after they fulfilled the Kickstarter backers.  I had to wait a few months, but when my Cores arrived I wasn’t disappointed.


I cut my teeth with the obligatory LED project: turning an LED on and off over the internet.  It has no real practical use, but it is a good way to learn the environment, which, admittedly, took me a little while to wrap my head around.
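To give a flavor of what that looks like from the outside (this is a sketch, not my actual code): the Core firmware registers a function with the Spark Cloud, and any HTTP client can then invoke it through the REST API.  Assuming a hypothetical firmware function named "led" and placeholder device ID and access token, the Python side might look like:

```python
# Sketch of calling a Core function through the Spark Cloud REST API.
# "led", the device ID, and the token below are placeholders, not real values.
import urllib.parse

SPARK_API = "https://api.spark.io/v1/devices"

def build_request(device_id, function, arg, token):
    """Build the URL and form-encoded body for a Spark Cloud function call."""
    url = "{}/{}/{}".format(SPARK_API, device_id, function)
    body = urllib.parse.urlencode({"access_token": token, "args": arg})
    return url, body

url, body = build_request("0123456789abcdef", "led", "on", "my-access-token")
print(url)

# To actually fire it at the cloud:
#   import urllib.request
#   urllib.request.urlopen(url, body.encode()).read()
```

The nice part is that nothing here knows or cares that there is a microcontroller on the other end; it is just an HTTP POST.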


The Core dynamically connects via Wi-Fi to the Spark Cloud, and all development occurs in the Spark Cloud IDE, with the Core updated over Wi-Fi.  This takes a little getting used to because you aren’t directly connected to the device for development…all development happens over the internet, and the only way to see what is going on with the Core during development is to push output to a serial connection on the Core and watch a terminal session.  But once I got used to this paradigm shift I could see the real power and value of this “Internet of Things” trend.  The Spark Core and Spark Cloud are setting the stage for how devices will be truly internet-enabled.


So, I thought I would tackle a more real-world problem with the Core: monitoring the temperature and humidity of an outdoor equipment room.  A little background might be useful: I’ve got a pool with an enclosed outdoor pool equipment room, which is where the pool pump, filtering, and chemical monitoring live.  In the winter I get concerned about the room getting too cold and having pipes freeze, and year-round I get concerned about pipe leaks that, if they go unnoticed, could cause serious problems.  So I would like to monitor temperature and humidity in real time and be alerted when particular thresholds are breached.

This won’t be much of a tutorial, but rather an overview of what was done, to illustrate the capabilities and possibilities of the Spark Core.


Hardware:

  • Spark Core
  • SHT-15 temperature/humidity digital sensor
  • Breadboard and wiring


Software:

  • Spark Cloud code (Arduino-compatible)
  • Amazon EC2 instance as the operating system platform
  • Python for calling the Spark Cloud REST API
  • RabbitMQ for real-time messaging
  • Twilio for sending SMS texts via an internet API

I did a bit of searching for a quality temperature and humidity sensor that wasn’t too pricey; the SHT-15 seemed to fit the bill at about $10.  Good specs and documentation, including some sample code to gain an understanding of how to get data off the sensor.


It took a little messing around to understand how the pinout worked and how to communicate with the Core, but I got it working without too much effort.  It probably took me more time to figure out how to output debug information to the serial connection than to understand the pinouts.  You can see in the picture below that the Spark Core is connected to my computer for power and access to serial output.  The serial output in the background is displaying the temperature in °C and the humidity at roughly one-second intervals.
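For the curious: the SHT-15 hands back raw digital counts, and the datasheet gives conversion coefficients — temperature is linear in the raw count, and humidity is a second-order polynomial.  A sketch of the conversion in Python (these are the commonly published coefficients for a 14-bit temperature / 12-bit humidity reading at roughly 3.3 V; double-check your datasheet revision and supply voltage):

```python
# Approximate SHT-15 raw-to-physical conversions (14-bit temp, 12-bit RH).
# Coefficients are from the SHT1x datasheet for ~3.3 V operation; verify
# against the datasheet for your setup before trusting the numbers.

def raw_to_celsius(raw_temp):
    # T = d1 + d2 * SO_T, with d1 ≈ -39.65 at 3.3 V and d2 = 0.01 (14-bit)
    return -39.65 + 0.01 * raw_temp

def raw_to_humidity(raw_rh):
    # RH_linear = c1 + c2 * SO_RH + c3 * SO_RH^2 (12-bit coefficients)
    c1, c2, c3 = -2.0468, 0.0367, -1.5955e-6
    return c1 + c2 * raw_rh + c3 * raw_rh ** 2

print(raw_to_celsius(6400))   # a mid-scale raw reading, roughly room temp
print(raw_to_humidity(1500))  # roughly mid-range humidity
```

The Core-side firmware does the same math in its Arduino-compatible code before the values ever hit the cloud.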


Here is the sensor assembly located out in the equipment room.  Unfortunately, I am just barely on the edge of Wi-Fi connectivity, so it doesn’t have the best signal.  I’ve placed another order for a few Spark Cores with a u.FL antenna connection that will allow me to boost the Wi-Fi signal.


Next up was exposing temperature and humidity through the Spark Cloud’s REST API.  The picture below condenses a lot of technical detail and code for simplicity: the data is exposed via REST -> the API is called from the Amazon EC2 instance -> if the data meets certain criteria, a message is placed on a queue -> the RabbitMQ message server processes the message -> a consumer script picks the message up off the bus and sends out an SMS text message.
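The glue on the EC2 side is pretty simple.  Here’s a hedged sketch of the producer half — the thresholds, queue name, and message shape are illustrative, not my actual values, and the RabbitMQ publish (shown commented out) uses the pika client:

```python
# Sketch of the alerting pipeline: check readings against thresholds and
# queue any alerts for the SMS consumer. Thresholds and queue name are
# made-up examples, not the values I actually run with.
import json

FREEZE_TEMP_C = 2.0       # alert if the room nears freezing
LEAK_HUMIDITY_PCT = 90.0  # alert if humidity suggests a leak

def check_thresholds(temp_c, humidity_pct):
    """Return a list of alert messages for out-of-range readings."""
    alerts = []
    if temp_c <= FREEZE_TEMP_C:
        alerts.append("Temperature low: {:.1f} C".format(temp_c))
    if humidity_pct >= LEAK_HUMIDITY_PCT:
        alerts.append("Humidity high: {:.1f}%".format(humidity_pct))
    return alerts

def publish_alerts(alerts):
    # Using the pika RabbitMQ client (pip install pika):
    #   import pika
    #   conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    #   channel = conn.channel()
    #   channel.queue_declare(queue="pool_alerts")
    #   for a in alerts:
    #       channel.basic_publish(exchange="", routing_key="pool_alerts",
    #                             body=json.dumps({"alert": a}))
    #   conn.close()
    pass

print(check_thresholds(temp_c=1.2, humidity_pct=55.0))  # only the temp trips
```

On the other side of the queue, a consumer script pulls each message and hands it to Twilio’s API to send the text.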


Here are a few shots of it all in action:


Right now most of the code is hacked-up prototype code to prove out the use case.  Next steps are to clean it up, make it a bit more operational, and add some logic to define rules for alerting (and I still need the Core with the u.FL antenna anyway).  In the meantime, let me know if it would be of value to post the code, even in its ugly state 🙂
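For those alerting rules, I’m leaning toward something declarative rather than more hard-coded if-statements.  Purely a sketch of the idea — every field name here is invented, nothing is built yet:

```python
# Sketch of declarative alert rules: each rule names a metric, a comparison,
# and a threshold. The rule structure is entirely hypothetical.
import operator

RULES = [
    {"metric": "temp_c", "op": "lt", "threshold": 2.0, "text": "freeze risk"},
    {"metric": "humidity", "op": "gt", "threshold": 90.0, "text": "possible leak"},
]

OPS = {"lt": operator.lt, "gt": operator.gt}

def evaluate(reading, rules=RULES):
    """Return the messages of all rules a reading dict triggers."""
    return [r["text"] for r in rules
            if OPS[r["op"]](reading[r["metric"]], r["threshold"])]

print(evaluate({"temp_c": 1.0, "humidity": 50.0}))
```

The appeal is that adding a new alert would just mean adding a row, not touching code.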

All in all, I’m pretty excited about the possibilities around the Internet of Things.  The Spark ecosystem is just the beginning of letting people tinker in this space and develop some pretty powerful prototypes.  And who knows, maybe even the next billion-dollar Nest.