Written by pekka on 8/25/2015

Azure IoT services are a collection of services targeting the Internet of Things. They provide event collection, processing, analytics and machine learning capabilities.


In this series of articles I'm going to explain how to set up a simple event collection and processing pipeline. The goal is to enhance it with analytics and dashboard views as we go forward.

Planned articles in this series

  1. Event collection

  2. Event dispatching

  3. Simple analytics

  4. Live dashboards

  5. Reports

We're going to build a sample application showing all the capabilities we're going through in this series. The sources will be available on GitHub.

Event collection

The first part of collecting events from a device or device gateway is to have an endpoint in Azure which will receive the events. For this purpose Azure provides a service called EventHub. This service allows us to receive events at big-data scale from hundreds or thousands of event sources.

Diagram 1. Overview

Diagram 1. shows an architecture overview of how event collection could be set up in most cases not involving tens of thousands of event sources with a high frequency of generated events. In the diagram you can find the event source "Device", which sends events to an EventHub named "Event Aggregator".

Long-term event storage

In our case we want to set up long-term storage of the events for future processing needs, for example using HDInsight (Hadoop as a service). Long-term storage should be cheap and its size should scale to fit future events. In Azure we can leverage Azure Blob Storage to store the raw events as blobs of data.

To process the incoming events we could use Azure WebJobs, Azure Worker Roles or Azure Stream Analytics to store the events in blob storage. In our case we will use Stream Analytics, as it keeps the implementation scalable and simple.

Event processor

Diagram 1. contains a Stream Analytics block named "Event Dispatcher" connected to both "Event Aggregator" and "Long-term storage". As events are processed from the EventHub we will write them to blob storage using a Stream Analytics query. The Stream Analytics query language closely follows SQL syntax and provides extensions for querying incoming events based on time windows. In our query we will just read each event and output it to blob storage. Our input will be the "Event Aggregator".


Diagram 2. Stream Analytics query
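A pass-through query like the one in Diagram 2. can be as simple as the following sketch. Note that the input and output alias names here are assumptions; use whatever aliases you configured for your Stream Analytics job.

```
SELECT
    *
INTO
    [LongTermStorage]
FROM
    [EventAggregator]
```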

Event source

An event source can be anything that can make an HTTPS request or an AMQP 1.0 connection to our EventHub. EventHub uses policies to control whose incoming requests it will accept. For scalability reasons, and to gain control over individual event sources, we will use publisher tokens derived from the policy. This also allows us to set the lifetime of the token, increasing the security of our system.

Stream Analytics supports JSON, CSV and Avro as the format of the messages. We will use JSON as the format of our messages.

    {
        "EventType": "temperature",
        "Timestamp": "..",
        "Value": 24.5,
        "SensorId": "room1-temperature"
    }

Diagram 3. Sample event message

To send the event message to the EventHub we need to create a publisher token. First we need a Shared Access Policy with the Send permission. You can manage policies in the Azure portal under ServiceBus / YourHub / Configure. That might change when the new Azure portal gets support for EventHubs. More information can be found in the Event Hubs Overview.

Currently there's no support for creating publisher tokens in the UI but we can easily do that in code.

var timeToLive = TimeSpan.FromDays(365*2);

// the argument values below are placeholders; use your own namespace,
// hub path, publisher name and policy details
var token = SharedAccessSignatureTokenProvider.GetPublisherSharedAccessSignature(
    new Uri("https://{namespace}.servicebus.windows.net"),
    "{eventHubPath}",
    "{publisher}",
    "{policyName}",
    "{policyKey}",
    timeToLive);

Diagram 4. Publisher token generation

Note that the EventHub does not store the publishers anywhere. EventHub validates the token in the message against the policy key and checks its lifetime from the token itself. Set the token as the "Authorization" header of the request.
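The token format itself is the standard Service Bus shared access signature: an HMAC-SHA256 over the URL-encoded resource URI and an expiry timestamp, signed with the policy key. Here is a sketch in Python for illustration only; the policy name and key are placeholders.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def create_sas_token(resource_uri, policy_name, policy_key, ttl_seconds):
    """Build a Service Bus style shared access signature."""
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    # sign "<encoded-uri>\n<expiry>" with the policy key
    string_to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    signature = base64.b64encode(
        hmac.new(policy_key.encode("utf-8"), string_to_sign, hashlib.sha256).digest()
    )
    return (
        f"SharedAccessSignature sr={encoded_uri}"
        f"&sig={urllib.parse.quote_plus(signature.decode('utf-8'))}"
        f"&se={expiry}&skn={policy_name}"
    )

# placeholder namespace, hub, publisher, policy name and key
token = create_sas_token(
    "https://mynamespace.servicebus.windows.net/myhub/publishers/device1",
    "SendPolicy", "c2VjcmV0LWtleQ==", 60 * 60)
print(token[:40])
```

The real .NET SDK call shown above produces a token of this same shape.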

When using publishers, the address to send the request to looks like the following: https://{namespace}.servicebus.windows.net/{eventHubPath}/publishers/{publisher}/messages. You will receive a 201 Created status code as the response if the request is successful.
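Putting the endpoint and token together, the send request can be composed like this. This is a sketch: the namespace, hub, publisher names and event values are placeholders, and the request is only built here, not actually sent.

```python
import json
import urllib.request

namespace, event_hub, publisher = "mynamespace", "myhub", "device1"
url = (f"https://{namespace}.servicebus.windows.net/"
       f"{event_hub}/publishers/{publisher}/messages")

# sample event in the JSON format described earlier (placeholder values)
event = {
    "EventType": "temperature",
    "Timestamp": "2015-08-25T12:00:00Z",
    "Value": 24.5,
    "SensorId": "room1-temperature",
}

request = urllib.request.Request(
    url,
    data=json.dumps(event).encode("utf-8"),
    headers={
        "Authorization": "SharedAccessSignature sr=...",  # publisher token
        "Content-Type": "application/json",
    },
    method="POST")
# urllib.request.urlopen(request) would send it; expect 201 Created back
```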


This should give you the first steps of setting up an event pipeline using Azure services, with long-term storage of your events. In the next article we will add event dispatching based on the EventType.

Written by pekka on 8/13/2015

ASP.NET 5 (DNX) is the upcoming release of the ASP.NET framework from Microsoft. You can read more about it from Scott Guthrie's blog.

Azure Web Jobs provide a method of running background jobs in the context of an App Service web app (previously known as Azure Web Site). These jobs are deployed with your web app and run in the same context, sharing the configuration etc. More info at web jobs.

An ASP.NET 5 project can be executed as a console application from the command prompt, behaving much like a standard .NET console application would. This makes it a good candidate for running as a web job in continuous mode.

Here's a simple example. This application will keep running in a loop until it detects the special shutdown file and at that point it will exit itself.

public class Program
{
    private string _shutdownFile;
    private bool _running = true;
    public static IConfiguration Configuration;
    private readonly IApplicationEnvironment _env;

    public Program(IApplicationEnvironment env)
    {
        _env = env;
    }

    public void Main(string[] args)
    {
        // load configuration
        var builder = new ConfigurationBuilder(_env.ApplicationBasePath);
        Configuration = builder.Build();

        // start watching for the shutdown signal
        InitializeWebJobShutdown();

        while (_running)
        {
            Console.WriteLine("Running and waiting for shutdown signal " + DateTime.UtcNow);
            // todo: do something
            // wait for one minute
            Thread.Sleep(TimeSpan.FromMinutes(1));
        }

        Console.WriteLine("Stopped " + DateTime.UtcNow);
    }

    private void InitializeWebJobShutdown()
    {
        // shutdown file path from the environment
        _shutdownFile = Environment.GetEnvironmentVariable("WEBJOBS_SHUTDOWN_FILE");

        if (!string.IsNullOrWhiteSpace(_shutdownFile))
        {
            // monitor folder for file creation
            var fileSystemWatcher = new FileSystemWatcher(Path.GetDirectoryName(_shutdownFile));
            fileSystemWatcher.Created += OnChanged;
            fileSystemWatcher.Changed += OnChanged;
            fileSystemWatcher.NotifyFilter = NotifyFilters.CreationTime | NotifyFilters.FileName |
                                             NotifyFilters.LastWrite;
            fileSystemWatcher.IncludeSubdirectories = false;
            fileSystemWatcher.EnableRaisingEvents = true;
        }
    }

    private void OnChanged(object sender, FileSystemEventArgs e)
    {
        if (e.FullPath.IndexOf(Path.GetFileName(_shutdownFile), StringComparison.OrdinalIgnoreCase) >= 0)
        {
            // hit the mark
            _running = false;
        }
    }
}
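The shutdown-file convention itself is not .NET specific. For illustration, here is the same idea sketched in Python, polling for the file instead of using a file watcher; the function and variable names are my own.

```python
import os
import tempfile

def run_until_shutdown(shutdown_file, do_work, max_iterations=1000):
    """Loop calling do_work() until the WebJobs shutdown file appears."""
    iterations = 0
    while not os.path.exists(shutdown_file) and iterations < max_iterations:
        do_work()
        iterations += 1
    return iterations

# simulate: create the shutdown file during the second iteration of work
with tempfile.TemporaryDirectory() as folder:
    shutdown = os.path.join(folder, "shutdown")
    counter = []
    def work():
        counter.append(1)
        if len(counter) == 2:
            open(shutdown, "w").close()  # Azure creates this file on shutdown
    run_until_shutdown(shutdown, work)
    print(len(counter))  # work ran twice before the signal was seen
```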

ASP.NET 5 provides a way of packaging the application, including all dependencies and the runtime itself, so it can be executed from the command line.

Here's a PowerShell script which upgrades the runtime to the current version and uses the package manager dnu to publish the application with the currently active runtime. The output folder will then contain a script which launches the runtime and executes the application with it.

The PowerShell script build-webjob.ps1 takes two parameters; the first is the name of the webjob folder and the second the output folder.

param($webjob, $out)
# example build-webjob.ps1 -webjob MyWebJobsProjectFolder -out output

# activate runtime
& $env:USERPROFILE\.dnx\bin\dnvm upgrade -r clr -arch x86

# web jobs
& dnu publish $webjob --runtime active --out $out

Azure Web Jobs uses a convention for continuous jobs where the job executable is deployed into the {WebSiteRoot}\App_Data\jobs\continuous\{job-name} folder. When the app is deployed the Azure Web Job infrastructure will automatically start it.

So to publish the ASP.NET 5 console application you just need to use the above build-webjob script and set the out parameter to point to the special directory under your web site's root folder. This works best when you're packaging your site for deployment, for example on your build server.

If you also want to install the dnvm SDK manager on the build machine, you can enhance the above script by adding the following above the # activate runtime comment. This will install dnvm before the runtime.

# bootstrap DNVM into this session.
&{$Branch='dev';iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/aspnet/Home/dev/dnvminstall.ps1'))}

Written by pekka on 2/2/2015

I've been using the Microsoft Windows 7 USB/DVD Download Tool for ages to create bootable USB sticks for installing Windows. Since UEFI has become mainstream, the tool is no longer adequate if you want the full benefits of UEFI.

I'm not going to explain here what UEFI is and why you should enable it. It has been explained in detail elsewhere. Just do a search on UEFI vs. legacy :)

Problems with booting from USB created by the Windows 7 USB/DVD download tool:

  • You have to enable CSM mode from the BIOS, which basically turns UEFI off and enables legacy BIOS functionality, making booting slower as the BIOS handles more of the hardware initialization.

  • Enabling UEFI boot after installation can be painful -> impossible

  • If CSM is disabled for the normal UEFI boot sequence, the firmware does not support the USB format created by the tool and ignores your USB stick

To create UEFI compatible boot media there's one really fast and easy method -> Rufus. It can create bootable USB for multiple purposes including Windows installation.

Steps using Rufus:

  1. Download and launch the application

  2. Select the Windows 8 - 10 ISO file

  3. Change format to GPT! (This is important to enable full UEFI mode)

  4. Make sure CSM is disabled in your BIOS

  5. Boot from the USB and continue with installation as normal

Possible issues:

  • Secure boot or similar feature can block the installation -> Disable it from the BIOS

Written by pekka on 9/27/2014

Modern file uploading for Angular.JS applications with an ASP.NET WebApi backend.


Flow.JS is a JavaScript library created to handle multiple simultaneous uploads in a stable, fault-tolerant and resumable fashion by using the HTML5 File API.

ng-flow provides Angular.JS directives to allow you to set up uploading in HTML.

Sample ng-flow HTML

<div class="row" data-flow-init="{target: '/folders/uploads/folder'}"
     data-flow-files-submitted="$flow.upload()"
     data-flow-file-success="$file.msg = $message">
    <div class="col-sm-12">
        <div class="row">
            <div class="col-sm-6 last">
                <div class="progress" ng-show="$flow.isUploading()">
                    <div class="progress-bar progress-bar-striped active" role="progressbar" aria-valuenow="45" aria-valuemin="0" aria-valuemax="100" style="width: 45%">
                        <span class="sr-only">Uploading..</span>
                    </div>
                </div>
            </div>
        </div>
        <div class="row">
            <div class="col-sm-12">
                <table class="table table-hover">
                    <thead>
                        <tr>
                            <th style="width: 25px"><span class="glyphicon glyphicon-file"></span></th>
                            <th style="width: 150px">Size</th>
                            <th style="min-width: 200px">Name</th>
                        </tr>
                    </thead>
                    <tbody>
                        <tr data-ng-repeat="file in $flow.files">
                            <td><span class="glyphicon glyphicon-file"></span></td>
                            <td>{{file.size / 1024 | number:0}} KB</td>
                            <td><a ng-href="/folders/folder/{{file.name}}">{{file.name}}</a></td>
                        </tr>
                    </tbody>
                </table>
            </div>
        </div>
        <div class="row">
            <div class="col-sm-12">
                <div class="panel panel-success" data-flow-drop data-flow-drag-enter="class='panel panel-primary'" data-flow-drag-leave="class='panel panel-success'">
                    <div class="panel-body">
                        Drag And Drop your file here
                    </div>
                </div>
            </div>
        </div>
        <div class="row">
            <div class="col-sm-12">
                <span class="btn btn-default" data-flow-btn><span class="glyphicon glyphicon-upload"></span> Upload File</span>
                <span class="btn btn-default" data-flow-btn data-flow-directory ng-show="$flow.supportDirectory">
                    <span class="glyphicon glyphicon-folder-open"></span> Upload Folder
                </span>
            </div>
        </div>
    </div>
</div>

Directives starting with flow-* are provided by ng-flow:

  • Initialize flow scope: data-flow-init="{target: '/folders/uploads/folder'}"

  • Setup action for upload: data-flow-files-submitted="$flow.upload()"

  • Success event handler: data-flow-file-success="$file.msg = $message"

  • Setup a file drop area: data-flow-drop

  • File drag enter event handler: data-flow-drag-enter=""

  • File drag leave event handler: data-flow-drag-leave=""

  • Open file picker button: data-flow-btn

  • Open folder picker button: data-flow-directory

More can be found on the ng-flow web site.


Flow.JS uses two types of requests to the server when uploading files: a GET to test whether a chunk has already been uploaded, and a POST to upload the actual chunk data. The GET should return 200 OK if the chunk is found; any other status code is interpreted to mean that the chunk does not exist.

Both the GET and POST methods include the same request parameters:

namespace Tanka.FileSystem.WebApi.FlowJS
{
    using System;

    public class FlowRequest
    {
        /// <summary>
        /// The index of the chunk in the current upload.
        /// First chunk is 1 (no base-0 counting here).
        /// </summary>
        public ulong? FlowChunkNumber { get; set; }

        /// <summary>
        /// The total number of chunks.
        /// </summary>
        public ulong? FlowTotalChunks { get; set; }

        /// <summary>
        /// The general chunk size. Using this value and flowTotalSize
        /// you can calculate the total number of chunks. Please note
        /// that the size of the data received in the HTTP request might
        /// be lower than flowChunkSize for the last chunk of a file.
        /// </summary>
        public ulong? FlowChunkSize { get; set; }

        /// <summary>
        /// The total file size.
        /// </summary>
        public ulong? FlowTotalSize { get; set; }

        /// <summary>
        /// A unique identifier for the file contained in the request.
        /// </summary>
        public string FlowIdentifier { get; set; }

        /// <summary>
        /// The original file name (since a bug in Firefox results in
        /// the file name not being transmitted in chunk multipart posts).
        /// </summary>
        public string FlowFilename { get; set; }

        /// <summary>
        /// The file's relative path when selecting a directory
        /// (defaults to file name in all browsers except Chrome).
        /// </summary>
        public string FlowRelativePath { get; set; }

        /* ... */
    }
}
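As the flowChunkSize comment notes, the chunk count can be derived from the totals. One common scheme is sketched below; this is illustrative only, and the exact rule (for example whether the last chunk is smaller or absorbs the remainder) depends on how the client is configured.

```python
import math

def total_chunks(flow_total_size, flow_chunk_size):
    """Number of chunks for a file; the last chunk may differ in size."""
    if flow_total_size == 0:
        return 1  # empty files still upload one (empty) chunk
    return max(1, math.ceil(flow_total_size / flow_chunk_size))

print(total_chunks(10 * 1024 * 1024, 1024 * 1024))  # 10
print(total_chunks(1500, 1024))                     # 2
```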

I've created a simple library which handles both requests. It's available on GitHub: Tanka.FileSystem.WebApi. It provides a class Flow which handles the request based on the method. It uses a simple abstraction over the file system and includes a local disk implementation.

See the included sample application for example usage.

Written by pekka on 8/16/2014

Tanka.Markdown supports markdown text paragraphs, where paragraphs are separated by a blank line.

Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor
incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis 
nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. 
Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu 
fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in 
culpa qui officia deserunt mollit anim id est laborum.

Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor
incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis 
nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. 
Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu 
fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in 
culpa qui officia deserunt mollit anim id est laborum.

Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.

Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
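The paragraph rule shown above — blocks separated by a blank line — can be sketched as a simple splitter, here in Python for illustration (this is not how Tanka.Markdown is necessarily implemented):

```python
import re

def split_paragraphs(text):
    """Split markdown-ish text into paragraphs on blank lines."""
    blocks = re.split(r"\n\s*\n", text.strip())
    # join hard-wrapped lines inside a paragraph with single spaces
    return [" ".join(block.split()) for block in blocks if block.strip()]

sample = "First paragraph\nstill first.\n\nSecond paragraph."
paragraphs = split_paragraphs(sample)
print(len(paragraphs))  # 2
```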


[heikura.me](http://www.heikura.me) or [heikura.me][me]

Link definitions at the bottom of the page
[me]: http://www.heikura.me

heikura.me or heikura.me


![Tanka blogposts](https://dl.dropboxusercontent.com/u/5784123/blog/tanka-blog-post-list.PNG)

Tanka blogposts

Currently Tanka does not support giving size options for the images.


emphasis of *text* or strong emphasis of **text**

emphasis of text or strong emphasis of text

Inline code blocks

inline code block `var test = 123;` should be supported

inline code block var test = 123; should be supported.
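Inline code span recognition can be sketched with a simple regex. This is illustrative only — not necessarily how Tanka.Markdown implements it:

```python
import re

def render_inline_code(text):
    """Replace `code` spans with <code> elements."""
    return re.sub(r"`([^`]+)`", r"<code>\1</code>", text)

html = render_inline_code("inline code block `var test = 123;` should be supported")
print(html)
```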

Written by pekka on 2/25/2014

Tanka.Markdown supports codeblocks and lists.


Codeblocks are pre-formatted blocks of text.

Codeblocks must be surrounded by three ` characters in the start 
and end lines. Content of the block must go between those lines. 

Code block 
will render as it is written 
with new lines
exactly the same way as you type it.

Typically a codeblock is used to render a piece of source code. This blogging software supports syntax highlighting of codeblocks.

public class Codeblock
{
    public string Content { get; set; }
}


Lists are paragraphs of text which are rendered as either ol or ul html tags. Multiline items are parsed and rendered as paragraphs. An item cannot yet contain multiple paragraphs of content.


1. item
2. item
   items can continue for multiple lines and contain links [Heikura.Me](http://www.heikura.me)
3. item
  1. item

  2. item, items can continue for multiple lines and contain links Heikura.Me

  3. item


* item
* item
* item
  • item

  • item

  • item

Written by pekka on 2/24/2014

Markdown is a very popular markup format for writing readable documents which can be parsed and rendered as html. Most technology-oriented writing platforms support it. As there's no strict specification anywhere, there are multiple flavors of the syntax out there. The most popular flavor seems to be the GitHub flavor.

Tanka.Markdown was born out of frustration when trying to add some of my own syntax sugar to the mix. As most current C# implementations seem to be ports of parsers made for other languages, the codebases are a mess.



Install-Package Tanka.Markdown -Pre

The code is hosted on GitHub.

Supported syntax


# Heading 1
## Heading 2
### Heading 3
#### Heading 4
##### Heading 5
###### Heading 6

Heading 1

Heading 2

Heading 3

Heading 4

Heading 5

Heading 6
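Parsing the # heading syntax above can be sketched like this; it's an illustration of the rule, not Tanka.Markdown's actual implementation:

```python
def parse_heading(line):
    """Return (level, text) for an ATX-style heading, or None."""
    stripped = line.lstrip()
    level = len(stripped) - len(stripped.lstrip("#"))
    if not 1 <= level <= 6:
        return None  # not a heading (or too many #'s)
    return level, stripped[level:].strip()

print(parse_heading("## Heading 2"))  # (2, 'Heading 2')
print(parse_heading("plain text"))    # None
```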

This is the first in a series of posts about Tanka.Markdown and its capabilities.

Written by pekka on 1/20/2014

ScriptCs is a scripting framework for C#. It uses Roslyn to dynamically compile and run scripts written in C#. In common terms, it's node.js for .NET. For me it seems to be the perfect choice for automating build steps, as you can write the script in a familiar language, C#, and use the power of the full .NET framework to get things done.

Script structure

So here are the steps my build needs to execute in order:

  1. Build the solution and output the results to build/

  2. Execute Xunit tests

First you need to install ScriptCs, which takes a couple of minutes. You can find the instructions on the ScriptCs web site. Chocolatey is fast and updating is easy, so I recommend you use it.


Here's a draft of the script. Notice that I'm using a bit more script-style formatting. It just feels more "scripty".

// configuration
var config = new {
    // folder paths etc. added in the next section
};

// clean the build directory
Clean();

// if build succeeds run tests
if (Build()) {
    Test();
} else {
    Console.WriteLine("Build failed");
}

public void Clean() {
}

public bool Build() {
    return false;
}

public bool Test() {
    return false;
}
Notice the lack of a class around the methods. ScriptCs uses Roslyn under the covers and wraps the script into a class for you.


The build needs some folder paths and other variables to execute properly, so let's add an object to hold this information. I don't want to declare a full class, so I use an anonymous type.

This should go at the beginning of your build script.

var config = new {
    buildOut = @"build",
    solutionFile = @"src\Solution.sln"
};

That should be enough for the Clean and Build steps, so let's see how to implement them. I'm building a Visual Studio solution in this sample, and as I have VS 2013 installed, msbuild.exe should already be on my PATH.

Build tasks

So we have a basic script but it's not doing anything yet. Let's implement those methods to get the build going.


Clean will check if the build output folder exists and recursively delete it if it does.

public void Clean() {
    Console.WriteLine("Clean directory {0}", config.buildOut);

    // let's get the full path to the folder
    var fullPath = Path.GetFullPath(config.buildOut);

    // check if directory exists and delete if it does
    if (Directory.Exists(fullPath))
        Directory.Delete(fullPath, recursive: true);
}


Build the solution and write the output to the buildOut folder.

public bool Build() {
    var fullSolutionPath = Path.GetFullPath(config.solutionFile);
    var fullOut = Path.GetFullPath(config.buildOut);

    var info = new ProcessStartInfo("msbuild.exe");
    info.Arguments = string.Format(
        "{0} /p:OutputPath={1} /t:rebuild",
        fullSolutionPath,
        fullOut);

    var process = Process.Start(info);
    process.WaitForExit();

    // zero should mean all ok
    return process.ExitCode == 0;
}


Run tests from the test assemblies in the build output folder. I'm using my own ScriptCs.Xunit2 script pack to give me easy access to the Xunit test runner.

You can install it with the following command.

scriptcs -install ScriptCs.Xunit2 -pre

Here's the actual test step

public bool Test() {
    var runner = Require<XunitRunner>();

    // define pattern for test assembly matching
    var pattern = "*Tests.dll";

    // directory to look for the test assemblies
    var testsPath = Path.GetFullPath(config.buildOut);

    // find test assemblies
    var files = Directory.GetFiles(testsPath, pattern);

    // execute tests in found assemblies
    bool testsPassed = true;
    foreach (var assemblyPath in files) {
        Console.WriteLine("Executing tests from {0}", assemblyPath);
        var result = runner.Execute(assemblyPath);

        // note: apart from TestsFailed (used below), the result property
        // names here are assumptions - check the script pack's result type
        Console.WriteLine("Executed {0} tests - {1} failed - in {2} seconds",
            result.TestsExecuted, result.TestsFailed, result.ExecutionTime);

        if (result.TestsFailed > 0)
            return false;
    }

    return testsPassed;
}


When you combine the steps above you should have a basic build script, which you can execute with the following command.

scriptcs build.csx

In part 2 of this blog series I'm going to add a packaging step for creating NuGet packages.

Written by pekka on 1/17/2014

Read a NancyFx request body stream as a string.

Extension method

public static class RequestBodyExtensions
{
    public static string ReadAsString(this RequestStream requestStream)
    {
        using (var reader = new StreamReader(requestStream))
        {
            return reader.ReadToEnd();
        }
    }
}

Example usage

Post["/body-as-string"] = parameters =>
{
    string content = Request.Body.ReadAsString();
    return content;
};

Written by pekka on 1/12/2014


All the pros will kill me for saying this, but merging using normal text editors is painful, unless you're one of those Vim gurus. So to make those merges a bit easier, I'm going to show you how to configure P4Merge to be used as the Git mergetool and difftool.

First download P4Merge from the Perforce web site. You only need to install the Visual Merge Tool.


Then open your .gitconfig file and add/replace with the following:

[diff]
    tool = p4merge
[merge]
    tool = p4merge
[mergetool "p4merge"]
    trustExitCode = true
[mergetool]
    keepBackup = false
    prompt = false

The latest Git version has native support for p4merge, so you don't need to provide anything else.

On Windows, the .gitconfig file is located at the root of your user profile folder. You do not need to provide the full path to p4merge.exe, as it should be on your PATH after the installation.

The next time you get a merge conflict, remember to execute git mergetool and you're good to go. You can also get a diff of files by running git difftool.
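If you prefer not to edit .gitconfig by hand, the same settings can be applied from the command line (these commands write the equivalent entries into your global config):

```shell
git config --global diff.tool p4merge
git config --global merge.tool p4merge
git config --global mergetool.p4merge.trustExitCode true
git config --global mergetool.keepBackup false
git config --global mergetool.prompt false
```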


You're performing a three-way merge where the left side shows theirs, the middle is the common base and the right side is your copy of the file. The bottom pane shows the result of the merge. On the right side of the bottom pane there are icons which you can use to control what ends up in the resulting merged file. You can also use the top "tree" view to select one of the three copies as the resulting merged file.
