I’m thinking through the client side of my neighborhood energy monitoring project.  I’m starting to call this project HappyDay Neighbors.  Don’t know if that will be the final name…Naming is hard.  

The energy monitoring experience includes collecting energy readings from our breaker boxes and sending the readings using mqtt to a data store in DAH CLOUD.  The homeowner then views the readings to understand how much energy is being consumed and to think of ways to decrease energy use.  I also want to explore social engagement using gaming/facebook/twitter…the social part is TBD.  My current focus is on reading and visualizing the data.

The Goal

The goal of this post is to explore a path to visualizing the energy readings.  I’ll do this by building a prototype that:

  • grabs 8,000 energy readings from Tisham’s data on ThingSpeak and sends them to a cloud data store using mqtt.
  • shows a histogram of the energy readings in a web browser.

Thanks To Those Who Went Before

  • The folks behind the abundant resources and excellent community of the OpenEnergyMonitor project.  Both the community and the resources are very accessible, and many of us benefit greatly from their ongoing efforts.  I’m excited!  Just today I got an emonPi in the mail.  Hopefully the CTs I have on order will arrive soon.  Then I’ll be able to give the OEM experience a twirl.
  • A HUGE shout out to Tisham Dhar.  Tisham is the creator of the ATM90E26 Single-Phase Energy Monitor Dev Kits.   I have been impressed with the intelligence and care Tisham has taken to Open Source his work.  Tisham also kindly answered all the questions I peppered him with.  His work looks super sharp to me.  

Background Info

My exploration took me on a path that relied on AWS services.  I went down this path because:

  • eventually I want to scale collecting and analyzing energy readings throughout our neighborhood and beyond.
  • I wanted to build as quick and dirty as possible.  This meant relying on a system of loosely coupled services – mqtt broker, database, graphing – that a lot of folks use, so there are plenty of videos and docs covering the use cases.
  • I am using the prototype to better define the final experience.
When I started this post, I had no idea what AWS services to use.  My learning process was to jump right on in.  I’ll write a bit about what I’ve learned at the end of this post.

There was a lot of goo…um…technology…that I was unfamiliar with. I bumbled my way through, relying mostly on these resources:

Open Source

These files are used in the post:

Part 1: Sending mqtt Messages

I wanted to divide and conquer – separating taking energy readings with sensors/hardware from visualizing the data.  This way, I can start collecting energy readings prior to having the hardware ready.  It will also allow me to focus on the why’s and how’s of the data visualization step.  I’ll read power readings from Tisham’s ThingSpeak feed. I started discussing Tisham’s excellent work in this post.

Step 1: Build a File of Power Readings

I did this in my first Jupyter notebook.  As I started putting together the workflow of this prototype, it became apparent that date conversions are hard to get right (…yah, no kidding 🙂 ).  The updated notebook – ReadTishamDataIntoCSV.ipynb.

The format Tisham uses to store the date/time:

%Y-%m-%dT%H:%M:%SZ

Sending date/time strings is tricky. Computing loves numbers, humans love strings.  I convert the date/time string to its epoch value (a long datatype).  Here’s the Python snippet I ended up using to take the dates in the ThingSpeak feed and convert them to an epoch format:

import datetime

dt = datetime.datetime.strptime(entry['created_at'], '%Y-%m-%dT%H:%M:%SZ')
# Seconds since the Unix epoch (1970-01-01 00:00:00 UTC).
epoch = datetime.datetime(1970, 1, 1)
epochTime = int((dt - epoch).total_seconds())

Entries in the CSV file look like this:

176,1506532776
177,1506532795

i.e.: Energy reading in watts, epoch time when measurement was taken.
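For context, here’s a minimal sketch of that notebook’s workflow – fetching the feed from ThingSpeak and writing the CSV rows.  The channel ID and the field holding the power readings are placeholders/assumptions on my part; the actual values are in ReadTishamDataIntoCSV.ipynb:

import csv
import datetime

import requests

# Fetch the most recent 8,000 entries from the (hypothetical) ThingSpeak channel.
URL = 'https://api.thingspeak.com/channels/CHANNEL_ID/feeds.json?results=8000'
feeds = requests.get(URL).json()['feeds']

epoch = datetime.datetime(1970, 1, 1)
with open('power_readings.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    for entry in feeds:
        # ThingSpeak timestamps use the %Y-%m-%dT%H:%M:%SZ format discussed above.
        dt = datetime.datetime.strptime(entry['created_at'], '%Y-%m-%dT%H:%M:%SZ')
        # Row format: power reading in watts, epoch seconds (field1 is an assumption).
        writer.writerow([entry['field1'], int((dt - epoch).total_seconds())])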

E.g.: converting an epoch time to a human readable format using Python:

>>> import time
>>> time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(1506532776))
'2017-09-27 10:19:36'

Step 2: Send the Power Readings as mqtt Messages

In order to send mqtt messages, I needed to figure out which mqtt broker to use.  My requirements included robust availability of the broker, plus storage and visualization of the energy messages.  I ended up choosing AWS services.

I put the code I used into the Jupyter notebook SendMqttToTishamDataTopic.ipynb (located at this GitHub location).  It sends the energy readings as mqtt messages to the topic /clientPrototype/data/Tisham.

To do this I needed boto3 and the AWS CLI.

  • Install the AWS CLI.  From the web page: The AWS CLI is an open source tool built on top of the AWS SDK for Python (Boto) that provides commands for interacting with AWS services. Hmmm…I have the Anaconda variant of Python 3.6 installed, so it seems I’ll need to use the conda package manager.  Instead of following Amazon’s instructions, I followed instructions from Stack Overflow:

$ conda install -c conda-forge awscli

$ conda install -c anaconda boto3

 
I needed an access key and secret access key.  I followed the directions in this Quick Configuration Link to get them.  Next I ran aws configure and copy/pasted the keys.
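The aws configure prompts look something like this (values here are placeholders, not my actual keys):

$ aws configure
AWS Access Key ID [None]: <access key>
AWS Secret Access Key [None]: <secret access key>
Default region name [None]: us-west-2
Default output format [None]: json

Here’s what the mqtt messages look like: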
{'power': '176', 'time': '1506532776'}
{'power': '177', 'time': '1506532795'}
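For the curious, here’s a minimal sketch of the publishing loop, assuming boto3’s iot-data client (the notebook has the real version; the QoS level is my choice for illustration):

import json

import boto3

# Publish to the AWS IoT message broker; the region matches the rest of this post.
client = boto3.client('iot-data', region_name='us-west-2')

with open('power_readings.csv') as f:
    for line in f:
        power, epochTime = line.strip().split(',')
        # The topic matches the AWS IoT rule filter /clientPrototype/data/#
        client.publish(
            topic='/clientPrototype/data/Tisham',
            qos=1,
            payload=json.dumps({'power': power, 'time': epochTime})
        )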

Part 2: Data Store and Visualization

For data store and visualization I used:

  • the AWS Elasticsearch Service as the message store.
  • an AWS IoT rule that triggers when an mqtt message with the /clientPrototype/data/# topic comes into the broker.
  • AWS Kibana for visualization.
It is not my goal to give tutorials on the AWS services I used.  While overwhelming at times, once I focused on each one I was able to bumble through using the available videos and tutorials.  I’ll just go through the steps I used after many starts and stops.

AWS Elasticsearch Service

  • I created a test domain.
  • If the clientprototype index (database) exists (from prior testing), delete it:
curl -XDELETE search-test-XXXXXXXXXXXXXXXXXXXXXXX.us-west-2.es.amazonaws.com/clientprototype
  • Create the clientprototype index and powerReading type:

curl -i -X PUT -d '
{
  "mappings": {
    "powerReading": {
      "properties": {
        "time": {
          "type": "date",
          "format": "epoch_second"
        },
        "power": {
          "type": "long"
        }
      }
    }
  }
}' 'https://search-test-XXXXXXXXXXXXXXXXXXXXXXX.us-west-2.es.amazonaws.com/clientprototype'
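To sanity check that the index took the mapping (a quick check I find handy; it isn’t part of the original steps):

curl -XGET 'https://search-test-XXXXXXXXXXXXXXXXXXXXXXX.us-west-2.es.amazonaws.com/clientprototype/_mapping?pretty'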


I followed the example in the tutorial.

  • As I mentioned above, I’m using the epoch date format.   Stuff that took me time to figure out:
    • Removing line breaks from the curl command:
      • Copy/pasted the curl command to create the powerReading type within the clientprototype index of the test domain into TextWrangler to edit.
      • Remove line breaks using the Remove Line Break option on TextWrangler under the Text menu.  
    • Figuring out the epoch date formatting – both on the Python side as well as the format to use in Kibana. 

AWS IoT Rule

Here’s the rule I created in the AWS IoT Rule UI:

Filter


When a message comes into the /clientPrototype/data/<anything> topic, send it on to Elasticsearch.  The topic I set up in the SendMqttToTishamDataTopic.ipynb notebook, /clientPrototype/data/Tisham, will trigger this rule when the mqtt message comes into the AWS broker.
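AWS IoT rules are written as SQL statements.  I don’t have the exact text from the rule UI handy, but the filter above boils down to something like:

SELECT * FROM '/clientPrototype/data/#'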

Action

The action taken: publish the message payload into the clientprototype Elasticsearch index.

Role

Rules run in the system using a role set up within the IAM UI. 


I gave the role two permissions: one for writing to the CloudWatch logs, and one for publishing into Elasticsearch.

Access to the CloudWatch logs is important for debugging.  I’ll get to that.  The other policy allows the rule to publish into Elasticsearch:

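As a rough sketch of that policy – the action and resource ARN are my guesses at what it contained; es:ESHttpPut is the usual action for writing into an Amazon Elasticsearch domain:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "es:ESHttpPut",
      "Resource": "arn:aws:es:us-west-2:XXXXXXXXXXXX:domain/test/*"
    }
  ]
}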

 

Sending mqtt messages and publishing into Elasticsearch can now be tested.

Test

The test will:

  • use the AWS IoT test UI to see if the AWS broker is receiving the mqtt messages.
  • evaluate logs in the log file to verify power readings are published into Elasticsearch.

AWS IoT Test

In the AWS IoT test UI, I subscribed to the /clientPrototype/data/# topic and watched the mqtt messages arrive at the broker.

Evaluate Logs

The AWS services – IoT (message broker) and Elasticsearch – can be configured to log debug information.  Monitoring is done through the AWS CloudWatch service.

I couldn’t get logging to work until I set the logging options with this AWS CLI command:

aws iot set-logging-options --logging-options-payload roleArn="arn:aws:iam::XXXXXXXXXXXX:role/service-role/pub_to_es",logLevel="DEBUG"

Here are the log entries that are created when the mqtt message is sent:

2017-09-30 17:17:13.659 TRACEID:f872acf1-ccaf-b6f0-6df4-f94a7e91c3c5 PRINCIPALID:XXXXXXXXXX/XXXXXXXXX/AIDAIHBTGHRY27IMGPW2M [INFO] EVENT:PublishEvent TOPICNAME:/clientPrototype/data/Tisham MESSAGE:PublishIn Status: SUCCESS


2017-09-30 17:17:13.659 TRACEID:f872acf1-ccaf-b6f0-6df4-f94a7e91c3c5 PRINCIPALID:XXXXXXXXXX/XXXXXXXXX/AIDAIHBTGHRY27IMGPW2M [INFO] EVENT:PublishEvent MESSAGE: IpAddress: 50.46.122.171 SourcePort: 51507


2017-09-30 17:17:13.712 TRACEID:f872acf1-ccaf-b6f0-6df4-f94a7e91c3c5 PRINCIPALID:393111621300/AIDAIHBTGHRY27IMGPW2M/AIDAIHBTGHRY27IMGPW2M [INFO] EVENT:MatchingRuleFound TOPICNAME:/clientPrototype/data/Tisham CLIENTID:N/A MESSAGE:Matching rule found: ClientPrototypeRule

2017-09-30 17:17:13.712 TRACEID:f872acf1-ccaf-b6f0-6df4-f94a7e91c3c5 PRINCIPALID:393111621300/AIDAIHBTGHRY27IMGPW2M/AIDAIHBTGHRY27IMGPW2M [DEBUG] EVENT:ElasticsearchActionStart TOPICNAME:/clientPrototype/data/Tisham CLIENTID:N/A MESSAGE:Starting execution of ElasticsearchAction on topic /clientPrototype/data/Tisham

2017-09-30 17:17:13.843 TRACEID:f872acf1-ccaf-b6f0-6df4-f94a7e91c3c5 PRINCIPALID:393111621300/AIDAIHBTGHRY27IMGPW2M/AIDAIHBTGHRY27IMGPW2M [INFO] EVENT:ElasticsearchActionSuccess TOPICNAME:/clientPrototype/data/Tisham CLIENTID:N/A MESSAGE:Successfully indexed document in ES. Message arrived on: /clientPrototype/data/Tisham, Action: elasticsearch, Endpoint: https://search-test-tkpccxbid4y7vlelgo6jq55kmi.us-west-2.es.amazonaws.com, Index: clientprototype, type: powerReading, id: 008edff0-11f1-46c3-9e84-76500a5a4771

The log entries are a HUGE help during debugging.  It took me many (many) attempts to get through the workflow.  The log files made it obvious what the problem was.  The problems typically were 1) ill-formed JSON 2) lack of security clearance.

Kibana Visualization

A link to Kibana became available in Elasticsearch once I created the test domain.

Once I wrapped my head around Kibana’s main use case – visualizing a gazillion data points, like millions upon millions (e.g.: all the flight data from the past few years) – I was able to make a bar chart of the power readings.

For the Y-Axis, I ended up using the Max aggregation, figuring that as the data is zoomed in and out this is the most interesting choice for a bar chart given the constraints on setting the Y-Axis.  That is, unlike Excel – and other tools that, um, excel with data sets in the 10,000’s and plot individual points – Kibana works with aggregates over buckets of data points (hence one of the suggested videos is about Kibana aggregation).
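Under the hood, a Kibana bar chart like this boils down to an Elasticsearch aggregation query.  Here’s a sketch of the equivalent query body (the hourly interval is my choice for illustration):

{
  "size": 0,
  "aggs": {
    "per_hour": {
      "date_histogram": { "field": "time", "interval": "1h" },
      "aggs": {
        "max_power": { "max": { "field": "power" } }
      }
    }
  }
}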

Lessons Learned 

Whew.  Well, it works.  I learned:

  • AWS services are both powerful and overwhelming.  It gets confusing what is free and what you end up getting charged for. For example, one path I took ended up costing about $14.  This alarmed me because I had no idea I was getting charged…things “just worked.”  I need to get a better feel for free versus paid.
  • Using AWS services is an easy way to scale to many, many energy monitors.
  • Elasticsearch and Kibana are optimized for finding a needle in a haystack. In this case, the haystack is gazillions of data points; the needle is a behavior that is in the data but needs to be searched out.  Their scenario is one of BIG DATA.  While that is a scenario I am interested in eventually, my primary interest is evaluating energy readings within a fairly small time frame – a day, a week, a month – and/or in real time.  Other AWS services for the data store (e.g.: S3) might be a better solution, depending on the eventual scenarios.  Kibana is not the best visualization for this because it assumes a gazillion data points and puts them into buckets to display.
  • AWS IoT rules are awesome.  It is so easy to trigger an AWS service (like storage of the payload) and set up roles.
  • mqtt works well for sending energy readings from the energy monitor that will be located in the breaker box.
 
For the prototype – where my goal is to tweak out all aspects: hardware, mqtt message passing, storage, visualization, scaling to our neighborhood (and beyond) – I feel the path I took works out.  There are many, many choices from different vendors…and many choices within the AWS family to get to the finish line.  For now, I’m happy with this workflow.

Thanks for reading this far…..  Please find many things to smile about.