Developer Guide

Introduction

MQTTRoute is the perfect middleware for collecting data from IoT edge devices. It is a highly extensible, customizable and scalable MQTT broker. This documentation provides a complete guide for developers to make use of the new IoT application framework.

MQTTRoute Extensions

MQTTRoute now provides powerful extensions that help you manage and build your own IoT / IIoT applications. All custom implementations are done through additional hooks, which are currently Python based.

Custom Storage Configuration

MQTTRoute has an option to store data to Elasticsearch by default. It also provides an extension hook that lets you intercept the received payload and store the data in any analytics / big data engine of your choice. If you plan to store the data in your own engine, use data_store.conf to configure your storage.


data_store.conf

[CUSTOM STORAGE]
CUSTOMSTORAGE = DISABLED
# ENABLED || DISABLED

DATASTORE = CUSTOM
# ELASTIC || CUSTOM

[ELASTIC]
HOSTNAME = 127.0.0.1
PORT = 9200
INDEX_NAME = mqtt
BULK_INSERT_TIMING = 2

[CUSTOM]
INTERCEPT_FILEPATH = ./../extensions/custom_store.py

Enable the CUSTOMSTORAGE option in the data_store.conf file to send data to the document data store in addition to the MySQL / SQLite / MS SQL storage. To store data in Elasticsearch, set DATASTORE to ELASTIC; for a custom implementation, set DATASTORE to CUSTOM. HOSTNAME is the host name of the data store you are using (if you use Elasticsearch as the custom data store, mention the Elasticsearch host name here). PORT is the port of the custom data store. INDEX_NAME is the index in which you want to store data; it is similar to a database name in MySQL. Implement the storage method in your own file and specify its path in INTERCEPT_FILEPATH.

Custom Data Store

MQTTRoute provides an option called custom store to receive data at the back end and store it as needed. The received data can be modelled and stored in any big data engine for further analysis and decision-making. The custom store implementation hooks the payload received by the MQTT broker and stores it in your analytics / big data engine. To store the payload this way, use the custom_store.py file and enable CUSTOMSTORAGE in the data_store.conf file.

The custom data store hook is intended for big data storage, and is enabled in data_store.conf inside the conf/ folder. The data parameter is a dict with the keys 'sender', 'topic', 'message', 'unixtime' and 'timestamp'.
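Only the keys of the payload dict come from this guide; the values below are illustrative, and normalize_payload() is a hypothetical helper showing how the fields might be reshaped before storage.

```python
import json
import time

# Illustrative sample of the dict handed to handle_Received_Payload;
# the keys come from the guide, the values are made up.
sample = {
    "sender": "client-42",
    "topic": "sensors/temp",
    "message": '{"value": 21.5}',
    "unixtime": int(time.time()),
    "timestamp": "2024-01-01 12:00:00",
}

def normalize_payload(data):
    # Reshape the broker payload before forwarding it to a data store.
    return {
        "device": data["sender"],
        "topic": data["topic"],
        "payload": json.loads(data["message"]),
        "received_at": data["unixtime"],
    }

print(normalize_payload(sample)["device"])  # client-42
```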


custom_store.py

import os, sys

global db_cursor

#
# elastic_search cursor
#
global elastic_search

global datasend

The SQL connector will be a sqlite / mssql / mysql cursor based on your configuration in db.conf. You have to construct your queries accordingly.
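As a sketch of constructing backend-specific queries, the following creates its own SQLite cursor in place of the one MQTTRoute injects through setsqlconnector(); the table name mqtt_data is an assumption for illustration.

```python
import sqlite3

# A self-created SQLite cursor stands in for the cursor MQTTRoute
# injects via setsqlconnector(); the table name is an assumption.
conn = sqlite3.connect(":memory:")
db_cursor = conn.cursor()
db_cursor.execute(
    "CREATE TABLE IF NOT EXISTS mqtt_data (topic TEXT, message TEXT, unixtime INTEGER)"
)

def store_message(data):
    # SQLite uses '?' placeholders; MySQL / MS SQL drivers typically
    # use '%s', so adjust the query to match the backend in db.conf.
    db_cursor.execute(
        "INSERT INTO mqtt_data (topic, message, unixtime) VALUES (?, ?, ?)",
        (data["topic"], data["message"], data["unixtime"]),
    )

store_message({"topic": "sensors/temp", "message": "21.5", "unixtime": 1700000000})
db_cursor.execute("SELECT COUNT(*) FROM mqtt_data")
print(db_cursor.fetchone()[0])  # 1
```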


custom_store.py

global Client_obj

sys.path.append(os.getcwd() + '/../extensions')

# Called on the initial call to set the SQL Connector
def setsqlconnector(conf):
    global db_cursor
    db_cursor = conf["sql"]

# Called on the initial call to set the Elastic Search Connector
def setelasticconnector(conf):
    global elastic_search
    elastic_search = conf["elastic"]

def setwebsocketport(conf):
    global web_socket
    web_socket = conf["websocket"]

def setclientobj(obj):
    global Client_obj
    Client_obj = obj['Client_obj']

The client object is used to send / publish a message to any active client. Simply call the function with parameters such as User_name, Client_id, Topic_name, Message and QOS.
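The guide names the parameters but not the exact publish method, so the sketch below uses a hypothetical publish_to() method on a stand-in client object to show the calling pattern.

```python
# publish_to() is a hypothetical stand-in for the real publish call;
# only the parameter names come from the guide.
class FakeClientObj:
    def __init__(self):
        self.published = []

    def publish_to(self, user_name, client_id, topic_name, message, qos):
        # Record the publish so the sketch is verifiable offline.
        self.published.append((user_name, client_id, topic_name, message, qos))

Client_obj = FakeClientObj()

def notify_device(client_id, message):
    # Push a command back to an active client at QoS 1.
    Client_obj.publish_to("admin", client_id, "commands/" + client_id, message, 1)

notify_device("client-42", "reboot")
print(Client_obj.published[0][2])  # commands/client-42
```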


custom_store.py

# Importing the custom class into the handler
from customimpl import DataReceiver

datasend = DataReceiver()

def handle_Received_Payload(data):
    #
    # Write your code here. Use your connection object to
    # send data to your data store
    #
    print("print in the handle_Received_Payload", data)
    result = datasend.receive_data(data)
    # if result is None, the write failed

def handle_Sent_Payload(data):
    #
    # Write your code here. Use your connection object to
    # send data to your data store
    #
    print("print in the handle_Sent_Payload", data)
    result = datasend.sent_data(data)

You need to enable CUSTOMSTORAGE to receive the message data from the devices in the Python callback handle_Received_Payload(data). The method must be implemented in the Python file specified in INTERCEPT_FILEPATH.

Your implementation should receive the data, store it and return. We advise you simply to store the data, or hand it over to a stream-analysis pipeline, and return from the handler immediately.
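One way to follow the advice of returning immediately is to enqueue each payload and let a worker thread do the slow storage work. This is a minimal sketch, with the list stored standing in for your data-store insert.

```python
import queue
import threading

# The broker callback only enqueues; a worker thread does the slow
# storage work, so the broker is never blocked.
work_queue = queue.Queue()
stored = []

def handle_Received_Payload(data):
    work_queue.put(data)  # return immediately

def storage_worker():
    while True:
        data = work_queue.get()
        if data is None:  # sentinel to stop the worker
            break
        stored.append(data)  # replace with your data-store insert
        work_queue.task_done()

worker = threading.Thread(target=storage_worker, daemon=True)
worker.start()

handle_Received_Payload({"topic": "sensors/temp", "message": "21.5"})
work_queue.put(None)
worker.join()
print(len(stored))  # 1
```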

 

Custom Scheduler

ML and AI work best when the collected data is well processed. The scheduling module helps process data on a predefined time interval. The custom scheduler lets you create your own schedules in MQTTRoute by adding your own code on the server side.


custom_scheduler.py

def schedule_conf():
    schedules = {
        'STATUS': 'DISABLE',
        'SCHEDULES': [
            {'OnceIn': 1, 'methodtocall': oneminschedule},
            {'OnceIn': 5, 'methodtocall': fiveminschedule}]}
    return schedules

Enable or disable your schedule by setting 'STATUS' to ENABLE or DISABLE. Set the schedule interval in minutes in 'OnceIn', and add the method to call on schedule in 'methodtocall'.


custom_scheduler.py

global elastic_search
global web_socket

# Called on the initial call to set the SQL Connector
def setsqlconnector(conf):
    global db_cursor
    db_cursor = conf["sql"]

def setelasticconnector(conf):
    global elastic_search
    elastic_search = conf["elastic"]

def setwebsocketport(conf):
    global web_socket
    web_socket = conf["websocket"]

def setclientobj(obj):
    global Client_obj
    Client_obj = obj['Client_obj']

def oneminschedule():
    # Write your code here
    pass

def fiveminschedule():
    # Write your code here
    pass
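A filled-in sketch of the scheduler skeleton, with STATUS enabled and a hypothetical aggregation counter in the one-minute job; the real job would query db_cursor or push aggregates to a dashboard.

```python
# Hypothetical aggregation state; stands in for real roll-up work.
summary = {"runs": 0}

def oneminschedule():
    summary["runs"] += 1  # e.g. roll up the last minute of data

def fiveminschedule():
    pass  # e.g. push five-minute aggregates downstream

def schedule_conf():
    return {
        'STATUS': 'ENABLE',
        'SCHEDULES': [
            {'OnceIn': 1, 'methodtocall': oneminschedule},
            {'OnceIn': 5, 'methodtocall': fiveminschedule}]}

# Simulate one tick of the one-minute schedule.
schedule_conf()['SCHEDULES'][0]['methodtocall']()
print(summary["runs"])  # 1
```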

Custom UI Server

The custom UI server provides an option to customize the user interface. It helps you customize the UI of MQTTRoute by adding your own code on the server side. You can alter the code in custom_ui_server.py as needed.


custom_ui_server.py

global Client_obj

# Called on the initial call to set the SQL Connector
def setsqlconnector(conf):
    global db_cursor
    db_cursor = conf["sql"]

# Called on the initial call to set the Elastic Search Connector
def setelasticconnector(conf):
    global elastic_search
    elastic_search = conf["elastic"]

def setclientobj(obj):
    global Client_obj
    Client_obj = obj['Client_obj']

The data connectors are provided as global variables: the SQL connector as a cursor for querying the database, and the Elasticsearch connector for querying Elastic if you have enabled the custom storage option.


custom_ui_server.py

#
# Configure your additional URLs here.
# The default URLs are currently used for the UI.
# Please don't remove them if you are building over the same UI.
#

def custom_urls():
    urllist = {
        "AUTHENTICATION": 'DISABLE',
        "urls": [{"/extend/url1": method},
                 {"/extend/url2": method1},
                 {"/extend/url3": method2}]
    }
    return urllist

# Write your URL handler code in the following methods
def method():
    return ("BEVYWISE NETWORKS")

def method1():
    return ("BEVYWISE NETWORKS")

def method2():
    return ("BEVYWISE NETWORKS")

Add your new functionality using a URL and a corresponding method. These URLs can be invoked from your user interface to manipulate data. Only the GET HTTP method is supported in this version.
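A sketch of a custom URL returning JSON: the path /extend/devicecount and the handler device_count() are hypothetical, and the hard-coded count stands in for a real db_cursor query.

```python
import json

# device_count() is a hypothetical handler; in the real hook it
# would query db_cursor and return the result as JSON.
def device_count():
    return json.dumps({"devices": 12})

def custom_urls():
    return {
        "AUTHENTICATION": 'DISABLE',
        "urls": [{"/extend/devicecount": device_count}]
    }

# Look up the handler the way the UI server would for a GET request.
handler = custom_urls()["urls"][0]["/extend/devicecount"]
print(handler())  # {"devices": 12}
```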

MQTTRoute comes with a complete IoT application stack, including user interface customization, data aggregation and analysis, and comparison of event data with processed data. The new IoT application framework helps you build and manage industrial IoT applications faster and more easily within a single process.

MQTT Broker plugin

The ready-to-use MQTT broker plugins help you connect the MQTT broker to Elasticsearch, MongoDB and Redis.

MongoDB Connector

MongoDB is one of the most widely used document storage engines for IoT data analysis. This plugin connects the Bevywise MQTT Broker with MongoDB to store the received payload data. It helps you handle complex data easily and supports powerful analysis.


Configure and Set up MQTTRoute-MongoDB-connector

1. Open plugin.conf and configure the following:
  • Update the hostname and port number of the MongoDB server in the MONGO section.

  • If AUTHENTICATION is enabled in MQTTRoute, update the MongoDB credentials; otherwise set AUTHENTICATION_ENABLED = FALSE.

  • Update the log file path to your own folder location [default = Bevywise/MQTTRoute/extensions].

plugin.conf

[MONGO]
HOSTNAME = 127.0.0.1
PORT = 27017
DB_NAME = bevywise
COLLECTION = mqttroute

[AUTHENTICATION]
AUTHENTICATION_ENABLED = FALSE
# TRUE || FALSE
USERNAME = root
PASSWORD = root

[LOG]
LOG_FILE_PATH = ../extensions

2. Copy the mongo folder into Bevywise/MQTTRoute/extensions.

3. Copy the plugin.conf file into Bevywise/MQTTRoute/extensions.

4. Replace Bevywise/MQTTRoute/extensions/custom_store.py with the plugin's custom_store.py.

5. Open Bevywise/MQTTRoute/conf/data_store.conf.

  • Set CUSTOMSTORAGE = ENABLED

  • Set DATASTORE = CUSTOM

6. Start MQTTRoute; it will start storing all payloads into the MongoDB server.
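A sketch of the storage step the MongoDB plugin performs for each payload. FakeCollection stands in for a pymongo collection so the example runs without a MongoDB server; with pymongo the equivalent call would be insert_one() on the bevywise / mqttroute database and collection from plugin.conf.

```python
# FakeCollection mimics pymongo's insert_one() so the sketch is
# runnable offline; swap in a real pymongo collection in production.
class FakeCollection:
    def __init__(self):
        self.docs = []

    def insert_one(self, doc):
        self.docs.append(doc)
        return doc

collection = FakeCollection()

def handle_Received_Payload(data):
    # Store the whole payload dict as one MongoDB document.
    collection.insert_one(dict(data))

handle_Received_Payload({"topic": "sensors/temp", "message": "21.5"})
print(len(collection.docs))  # 1
```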

Redis Connector

This plugin connects MQTTRoute with the Redis server to store all payloads in Redis for further processing.


Configure and Set up MQTTRoute-Redis-connector

1. Replace Bevywise/MQTTRoute/lib/custom_store.py with the plugin's custom_store.py.

2. In custom_store.py, change the Redis server name and port if you are running Redis on a different server or port.


custom_store.py

redishost = 'localhost'
redisport = 6379

3. Open Bevywise/MQTTRoute/conf/data_store.conf.

  • Set CUSTOMSTORAGE = ENABLED

  • Set DATASTORE = CUSTOM

4. Start MQTTRoute; it will start storing all payloads into the Redis server with clientId_unixtime as the key.
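A sketch of the Redis storage step, using a stand-in for the redis-py client so it runs without a Redis server; the clientId_unixtime key format comes from the step above.

```python
# FakeRedis mimics redis-py's set() so the sketch runs offline;
# in production use redis.Redis(host=redishost, port=redisport).
class FakeRedis:
    def __init__(self):
        self.store = {}

    def set(self, key, value):
        self.store[key] = value

r = FakeRedis()

def handle_Received_Payload(data):
    # The connector keys each payload as clientId_unixtime.
    key = f"{data['sender']}_{data['unixtime']}"
    r.set(key, data["message"])

handle_Received_Payload({"sender": "client-42", "unixtime": 1700000000,
                         "message": "21.5"})
print("client-42_1700000000" in r.store)  # True
```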

 

Elastic Connector

The MQTT broker stores data to Elasticsearch via a custom implementation for better data visualization. Only the published payload is pushed to Elasticsearch, which lets you hook it and send it to your data visualization tool.


Configure and Set up MQTTRoute-Elasticsearch-connector

1. Open plugin.conf and configure the following:

  • Update the hostname and port number of the Elasticsearch server in the ELASTIC section.

  • Update the log file path to your own folder location [default = Bevywise/MQTTRoute/extensions].

plugin.conf

[ELASTIC]
HOSTNAME = 127.0.0.1
PORT = 9200
INDEX_NAME = mqttroute

[LOG]
LOG_FILE_PATH = ../extensions

2. Copy the plugin.conf file into Bevywise/MQTTRoute/extensions.

3. Copy the Elastic folder into Bevywise/MQTTRoute/extensions.

4. Replace Bevywise/MQTTRoute/extensions/custom_store.py with the plugin's custom_store.py.

5. Open Bevywise/MQTTRoute/conf/data_store.conf.

  • Set CUSTOMSTORAGE = ENABLED

  • Set DATASTORE = ELASTIC

6. Start MQTTRoute; it will start storing all payloads into the Elasticsearch server.
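A sketch of the indexing step the Elastic connector performs. FakeElastic mimics an Elasticsearch client's index() call so the example runs offline; the index name mqttroute matches plugin.conf, and the exact client keyword arguments vary by elasticsearch-py version.

```python
# FakeElastic mimics an Elasticsearch client's index() call so the
# sketch runs without an Elasticsearch server.
class FakeElastic:
    def __init__(self):
        self.docs = []

    def index(self, index, body):
        self.docs.append((index, body))

es = FakeElastic()

def handle_Received_Payload(data):
    # Index each payload dict into the configured index.
    es.index(index="mqttroute", body=dict(data))

handle_Received_Payload({"topic": "sensors/temp", "message": "21.5"})
print(len(es.docs))  # 1
```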

 

Have more Questions?

We are all ears and waiting to hear from you. Send us your questions and feedback.