Azure Services: Presenting IoT Data

By Joe Bell

This blog post expands on our previous IoT and Azure Services blogs, Extending the Silversands IoT System and A bespoke IoT system for Silversands.
In this article, we will walk through our approach to setting up the telemetry charts found here. These charts display the light and temperature levels in the Silversands office to show you, quite simply, "if it is sunny at Silversands".

Earlier this year we developed a simple Azure IoT proof of concept (PoC), with the data being displayed via Power BI. We wanted to go a step further. The brief was simply to extend the PoC so we could present the telemetry data recorded by our Azure IoT PoC within our WordPress website.

We started by looking at a number of approaches, including some rather unorthodox ways of using embedded Power BI. In the end, we decided to expose the data through an API which could be consumed by a WordPress plugin and rendered using a JavaScript library called Chart.js.

Silversands IoT Architecture

Solution Approach

Within this blog, we will take you through the key components of the PoC.

  1. Getting the data – Azure Event Hub
  2. Deciding where to store the data – Azure SQL Database
  3. Processing the data – Azure Stream Analytics
  4. Exposing the data – Azure Web App
  5. Presenting the data – WordPress Plugin

Step 1 – Getting the Data

We had already developed a mechanism to capture the desired light and temperature data. We used a Micro:bit as our sensor and a .NET console application to relay the data to an Azure Event Hub.

Azure Event Hubs is a hyper-scale telemetry ingestion service that collects, transforms, and stores millions of events.
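As a sketch of what the relayed telemetry might look like, the snippet below builds a single reading in JSON. The field names (UTCTime, Light, Temp) match those returned by the API later in this post; the helper function itself is our own illustration, not the original console application's code.

```javascript
// Hedged sketch: build one telemetry event in the shape the rest of the
// pipeline expects. buildReading is an illustrative name, not from the PoC.
function buildReading(light, temp) {
    return {
        UTCTime: new Date().toISOString(), // timestamp of the reading
        Light: light,                      // light level from the Micro:bit
        Temp: temp                         // temperature from the Micro:bit
    };
}

// The console application would serialise this and send it to the Event Hub,
// e.g. JSON.stringify(buildReading(512, 21))
```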

This element is covered more in our initial IoT Blog – A bespoke IoT system for Silversands.

Step 2 – Where to store the data

In order to surface the data, we first needed to store it in a structured way. To do this, we created an Azure SQL database. Azure SQL was chosen for its high transaction performance, scalability and ease of set-up. In addition, Azure SQL can automatically monitor your common usage scenarios and suggest performance optimisations.

In this instance, you can see that the performance recommendation was to add an index to the database.

Step 3 – Processing the Data

We then needed to populate the database with the telemetry data from the Azure Event Hub. As we already had a Stream Analytics job in place, this was a simple case of creating an additional output and modifying the query to send data to the new output. Stream Analytics makes it very easy to connect to an existing SQL database within your Azure Subscription as it can automatically populate the required parameters.
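Stream Analytics supports multiple `SELECT ... INTO` statements within a single job, which is what makes adding a second output so straightforward. A hedged sketch of what the updated query might look like is below; the input and output aliases are our own placeholders, and the original query is only shown as an image in the post.

```sql
-- Existing output, feeding Power BI (alias names are illustrative)
SELECT UTCTime, Light, Temp
INTO [powerbi-output]
FROM [iot-input]

-- New output, feeding the Azure SQL database
SELECT UTCTime, Light, Temp
INTO [sql-output]
FROM [iot-input]
```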

Stream Analytics – Multiple Outputs
Stream Analytics – Updated Query

Step 4 – Exposing the data

To expose the data from the database, we decided to implement an API using NodeJS and a framework called Restify. Restify allows you to very quickly set up a RESTful web service and is great for small and large projects alike.

We also used two additional libraries, Tedious and Tedious Connection Pool, to manage connections and queries to instances of Microsoft SQL Server.

With these libraries, it proved very simple to set up a pool of connections to SQL Azure.

We have listed some connection information below which may be of interest. This shows how to connect to an Azure SQL Database using the Tedious Connection Pool library within NodeJS.

const ConnectionPool = require("tedious-connection-pool");

// Config for the connection pool
const poolConfig = {
    min: 2,
    max: 4,
    log: true
};

// Config for the connection to the SQL Server
const connectionConfig = {
    userName: "<db-user>",
    password: "<db-password>",
    server: "<db-server>",
    options: {
        database: "<db-name>",
        encrypt: true, // Required for connections to Azure SQL
        requestTimeout: 0
    }
};

// Set up a connection pool so we have db connections ready to go when a request comes in
let pool = new ConnectionPool(poolConfig, connectionConfig);

A connection pool was used to manage the potential of multiple simultaneous requests to the API. With the pool set up, queries can be sent by acquiring free connections from the pool and releasing them when they have executed.

const { Request } = require("tedious");

function queryDatabase(query, dataPoints) {

    // the query will be asynchronous so we implement a promise
    let queryPromise = new Promise((resolve, reject) => {

        // fetch a free connection from the pool
        pool.acquire((e, connection) => {
            if (e) {
                reject(e);
            } else {
                let results = [];

                let req = new Request(query, (e, rowCount) => {
                    // finished with the connection so release it back to the pool
                    connection.release();

                    if (e) {
                        reject(e);
                    } else {
                        resolve(results);
                    }
                });

                // when we receive a row, push the columns to the results array
                req.on("row", (columns) => {
                    let result = {};
                    columns.forEach((col) => result[col.metadata.colName] = col.value);
                    results.push(result);
                });

                // finally execute the request
                connection.execSql(req);
            }
        });
    });

    return queryPromise;
}

Using these components, we set up an API and hosted it in an Azure App Service. The API requires two parameters:

  1. A start time.
  2. The number of data points to return since the start time.
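As a hedged sketch of how those two parameters might be validated before hitting the database, the helper below parses and checks them. The function name and the exact validation rules are our own assumptions; the original API code is not shown in full in the post.

```javascript
// Illustrative validation of the API's two route parameters: a start time
// and a number of data points. Returns null for an invalid request.
function parseChartParams(since, points) {
    const start = new Date(since);          // parse the start time
    const count = parseInt(points, 10);     // parse the data point count

    if (isNaN(start.getTime()) || isNaN(count) || count < 1) {
        return null; // reject unparseable dates or non-positive counts
    }
    return { start: start, count: count };
}
```

A Restify route handler could call this before passing a query to queryDatabase, returning a 400 response when it yields null.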

An example of this would be a request which returns data since 09:02 on 3 August 2017, spread into 60 data points.

API Response

Step 5 – Presenting the data

The final piece of the puzzle was to render the data in WordPress and implement live updates to the charts.

To do this, we created a WordPress plugin which provides a widget we can place in our articles. Chart.js was used to render the data into charts.

Widget Configuration

To calculate the refresh rate for each chart, we take the number of seconds since the start time, divide it by the number of data points returned, and multiply by 1000 to get the refresh rate in milliseconds.

chartWidget.interval = Math.ceil(chartWidget.seconds / chartWidget.points) * 1000;
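To make the arithmetic concrete, here is a worked example of the line above; the chartWidget values are illustrative, not from the live widget.

```javascript
// Illustrative values: a chart covering the last hour across 60 points
var chartWidget = { seconds: 3600, points: 60 };

// 3600 / 60 = 60 seconds per point, i.e. a 60000 ms refresh interval
chartWidget.interval = Math.ceil(chartWidget.seconds / chartWidget.points) * 1000;
```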

Each interval, we then update the chart with an additional data point and remove the oldest one.

chartWidget.UpdateChart = function (newTime) {
    // chartWidget.apiUrl stands in for the API endpoint, which was stripped
    // from the original post; requesting "/1" asks for a single data point
    $.get(chartWidget.apiUrl + newTime + "/1", function (data) {
        if (data.length > 0) {
            var lights = (item) {
                return {
                    x: item.UTCTime,
                    y: item.Light
                };
            });

            var temps = (item) {
                return {
                    x: item.UTCTime,
                    y: item.Temp
                };
            });

            // chartWidget.chart is assumed to hold the Chart.js instance; the
            // datasets are ordered newest first, so drop the oldest points
            // from the end and prepend the new ones
            var datasets =;
            datasets[0].data.splice(datasets[0].data.length - lights.length);
            datasets[0].data = lights.concat(datasets[0].data);

            datasets[1].data.splice(datasets[1].data.length - temps.length);
            datasets[1].data = temps.concat(datasets[1].data);

            chartWidget.chart.update();
        }

        // schedule the next update, requesting data since the newest point
        window.setTimeout(function () {
            chartWidget.UpdateChart(data.length > 0 ? data[data.length - 1].UTCTime : newTime);
        }, chartWidget.interval);
    });
};

The Result

We now have a live-updating chart within our WordPress website, so you can see "if it is sunny at Silversands". We have also included some example charts below where you can clearly see night-time and when clouds pass overhead.

Useful Links

To view the live feed from the device, please visit the associated blog.

Contact Silversands

We hope you have found this blog informative. If you want to find out more about Silversands' application development skills, Azure, IoT or anything else related to this blog, please contact us using the form below. We would love to discuss how we can work with you to realise your ambitions with Microsoft Azure and Silversands development skills.
