Overview

The purpose of this tutorial is to shovel the data arriving at the MultiTech Conduit LoRa gateway to an AWS IoT topic, for scriptr.io users who want to use AWS as a data source and for device management.

Assumptions

The following assumes that you already know how to configure the Conduit as a LoRa network server and how to configure the LoRa Mote to join the Conduit’s network and issue periodic messages containing the sensors’ readings (temperature and ambient light).

For more information about these steps, refer to our previous blog entry, which walks you through all the steps and artifacts needed to get the gateway and the LoRa device talking to each other.

Requirements

Hardware

  1. LoRa device – the following has been tested with the RN2903 LoRa® Mote from Microchip. Although the device generating the data doesn’t matter (as long as it can talk LoRa to the MultiTech Conduit), we will use the LoRa Mote since it is a full demo device with a screen and an interface for configuration. You can easily use another device, like the mDot, if you’re comfortable configuring it.
  2. MultiTech Conduit – the following has been tested with the MTCDT-LEU1 running both mLinux and AEP firmware.

Software

  1. Scriptr.io account – you can create one by going to www.scriptr.io/register.
  2. AWS account.
  3. The provided sample code – available on scriptrdotio’s GitHub account.

Steps

Once logged in to your AWS account, you will need to create two “devices”: one to represent the Conduit and the other to represent scriptr.io. The first will publish to AWS all the messages received by the Conduit from all the LoRa devices on its network. The second will be used to subscribe scriptr.io to all messages received by AWS.

Creating Devices on AWS

The easiest way to do this is through the AWS IoT wizard: from the Onboard menu, select Configure a device. This will provide you with a “package” (a zip file) containing the needed credentials (a set of keys and certificates) and an installer for the needed SDK.

The provided shovel is written in Python and will run on the Conduit, so we will select Linux/OSX and Python:
[Screenshot: aws-conduit-sdk-selection]

Next, you will need to create a “thing”. A thing is a device, a gateway or “anything” that will connect to the AWS service:
[Screenshot: aws-conduit-create-thing]

Next, you will be provided with a screen summarizing your options, the content of the generated package and a button to download it:

[Screenshot: aws-conduit-download-sdk]

Note: You will need to repeat the previous steps to create another thing representing the scriptr.io side of the connection (in practice, to get the files that will authenticate and authorize scriptr.io to subscribe to the AWS topics).

Installing the Package on the Gateway

Once downloaded, you will need to copy the provided SDK to your gateway:

#unzip the package
unzip connect_device_package.zip
#repackage it as a gzip tarball: the mLinux unzip doesn't understand the zip format provided by aws
tar -zcvf conduit.tar.gz conduit.public.key conduit.private.key conduit.cert.pem start.sh
#move the tarball to the gateway (replace ROUTER_IP with the IP address of your Conduit;
#the ~/aws directory must already exist on the gateway)
scp conduit.tar.gz admin@ROUTER_IP:~/aws
#connect through ssh to the gateway
ssh admin@ROUTER_IP
#unpack the tarball
cd aws
tar -xzvf conduit.tar.gz
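
A quick sanity check that the credentials landed where the shovel expects them (the file names below correspond to a thing named conduit, as created above):

#the certificates, keys and start script should now sit in ~/aws
ls ~/aws
#expected: conduit.cert.pem  conduit.private.key  conduit.public.key  conduit.tar.gz  start.sh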

Installing the Shovel Code on the Gateway

#download the python code
wget https://raw.githubusercontent.com/scriptrdotio/multitech-shovels/master/shovel-aws.py -O shovel.py
#the shovel requires mqtt and aws sdks
opkg update
opkg install python-pip
wget https://bootstrap.pypa.io/ez_setup.py
python ez_setup.py
pip install paho-mqtt
#download aws root-ca
wget https://www.symantec.com/content/en/us/enterprise/verisign/roots/VeriSign-Class%203-Public-Primary-Certification-Authority-G5.pem -O ~/aws/root-CA.crt
#install python sdk for aws
pip install AWSIoTPythonSDK
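
To make sure both SDKs installed correctly before moving on, a quick check from the gateway's shell:

#verify that the MQTT and AWS IoT modules can be imported
python -c "import paho.mqtt.client, AWSIoTPythonSDK; print('dependencies OK')"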

Note: the opkg and pip steps above might fail, depending on your gateway configuration. In our case, /etc/opkg/mlinux-feed.conf was referencing version 3.4 while the installed OS was version 3.1. If you face this problem, you will have to update the feed URLs to point to your version.

You can figure out what version to use by running the command “cat /etc/mlinux-version” on your gateway, which will output something similar to the following (we’re interested in the version number on the first line):

mLinux 3.3.6
Built from branch: (detached from 4dc1cd8)
Revision: 4dc1cd8d41adde7b96656f848196d55284c1fbc8

You can find the list of available feeds here: http://www.multitech.net/mlinux/feeds/
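
As a sketch, assuming the feed entries embed the mLinux version in their URLs (as they do on a default install), you can rewrite them in place and refresh the package index. OLD_VERSION and NEW_VERSION are placeholders: check the actual URLs in the file and the directories available under the feeds link above before substituting.

#inspect the current feed URLs
cat /etc/opkg/mlinux-feed.conf
#replace the version referenced in the feeds with the one matching your gateway
sed -i 's|/feeds/OLD_VERSION|/feeds/NEW_VERSION|g' /etc/opkg/mlinux-feed.conf
#refresh the package index and retry the failed installs
opkg update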

Running the Code

First, you will need to update the AWS endpoint used by the shovel to match yours. You can get your endpoint from the AWS console or by reading the content of start.sh:

grep amazonaws.com start.sh | awk '{print $4}'

If you have named your thing “conduit” as defined in the previous steps, you will just need to start the script:

python shovel.py

If, on the other hand, you have named it differently, you will need to modify the script and update the paths so they point to the correct certificates.
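
Once the shovel is forwarding messages as expected, you will probably want it to keep running after you close the SSH session. A minimal sketch, assuming shovel.py writes its output to stdout (the log file name is just an illustration):

#run the shovel in the background and keep it alive after logout
nohup python shovel.py > shovel.log 2>&1 &
#follow the log to confirm messages are being forwarded
tail -f shovel.log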

Subscribing to Data Events from Scriptr.io

Now that the data is flowing from the gateway to AWS IoT, the next step is to get the data into scriptr.io. For this, you will need to:

  1. Create a thing on AWS that will represent the scriptr.io connection:
    Follow the same steps as above, but call your thing scriptrdot.io
  2. Create a bridge in scriptr.io between AWS and scriptr.io:
    1. Define the external endpoint as follows:
      The URL is provided by AWS; the easiest way to get it is to extract it from start.sh by running the following command:

      grep amazonaws.com start.sh | awk '{print $4}'

      [Screenshot: aws-conduit-create-aws-bridge]

    2. Create a channel in scriptr.io to which the messages will be published:
      [Screenshot: lora-blog-create-channel]
    3. Connect the channel to the external endpoint by creating a bridge. Use a valid device token instead of XXXxxxxxxxxx: this token is the device credential used to publish the messages to the channel.
      [Screenshot: aws-conduit-create-bridge]
  3. Create a test script:
    if(request.body.data)
         return atob(request.body.data);
    else
         return "unexpected message format"
    
  4. Subscribe the test script to the channel:
    [Screenshot: aws-conduit-subscribe-script]
  5. The test script will now be called for each message arriving at AWS. You can validate this by checking the logs. If you want to trigger a test message without waiting for the next uplink from the Mote, see the sketch below.
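
As a way to exercise the whole path end to end, you can publish a fake uplink on the Conduit's local MQTT broker. This is only a sketch: it assumes the mosquitto command line clients are available for your firmware, that shovel.py relays the Conduit's local lora/<deveui>/up topics, and that the payload carries the base64-encoded reading in a data field; check shovel.py and your bridge configuration before relying on the exact topic and format. The device EUI below is made up.

#on the Conduit: install the mosquitto command line clients if they are not already present
opkg install mosquitto-clients
#publish a fake uplink; "aGVsbG8=" is "hello" base64-encoded
mosquitto_pub -h 127.0.0.1 -t "lora/00-80-00-00-00-00-99-99/up" -m '{"data":"aGVsbG8="}'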