
UNIT V

IoT Platforms Design Methodology: Introduction, IoT Platform Design Methodology, IoT
Physical Devices & Endpoints, Raspberry Pi interfaces, Programming Raspberry Pi with
Python, Other IoT Devices. IoT Physical Servers & Cloud Offerings: Introduction to
Cloud Storage Models & Communication APIs, WAMP - AutoBahn for IoT, Xively Cloud
for IoT, Python Web Application Framework-Django, Amazon Web Services for IoT.

INTRODUCTION

Designing IoT systems can be a complex and challenging task as these systems involve
interactions between various components such as IoT devices and network resources, web
services, analytics components, application and database servers. Due to a wide range of
choices available for each of these components, IoT system designers may find it difficult to
evaluate the available alternatives. IoT system designers often tend to design IoT systems
keeping specific products/services in mind. Therefore, these designs are tied to specific
product/service choices made. This leads to product, service or vendor lock-in, which while
satisfactory to the dominant vendor, is unacceptable to the customer. For such systems,
updating the system design to add new features or replacing a particular product/service choice
for a component becomes very complex, and in many cases may require complete re-design of
the system.

IOT DESIGN METHODOLOGY

Figure shows the steps involved in the IoT system design methodology. To explain these steps,
we use the example of a smart IoT-based home automation system.

Step 1: Purpose & Requirements Specification

The first step in IoT system design methodology is to define the purpose and requirements of
the system. In this step, the system purpose, behavior and requirements (such as data collection
requirements, data analysis requirements, system management requirements, data privacy and
security requirements, user interface requirements) are captured.

Applying this to our example of a smart home automation system, the purpose and
requirements for the system may be described as follows:
Figure: Steps involved in IoT system design methodology

Purpose: A home automation system that allows the lights in a home to be controlled remotely using a web application.

Behavior: The home automation system should have auto and manual modes. In auto mode,
the system measures the light level in the room and switches on the light when it gets dark. In
manual mode, the system provides the option of manually and remotely switching on/off the
light.

System Management Requirement: The system should provide remote monitoring and
control functions.

Data Analysis Requirement: The system should perform local analysis of the data.
Application Deployment Requirement: The application should be deployed locally on the
device, but should be accessible remotely.

Security Requirement: The system should have basic user authentication capability.

Step 2: Process Specification

The second step in the IoT design methodology is to define the process specification. In this
step, the use cases of the IoT system are formally described based on and derived from the
purpose and requirement specifications. Figure shows the process diagram for the home
automation system. The process diagram shows the two modes of the system - auto and manual.
In a process diagram, the circle denotes the start of a process, diamond denotes a decision box
and rectangle denotes a state or attribute. When the auto mode is chosen, the system monitors
the light level. If the light level is low, the system changes the state of the light to "on". Whereas,
if the light level is high, the system changes the state of the light to "off". When the manual
mode is chosen, the system checks the light state set by the user. If the light state set by the user
is "on", the system changes the state of light to "on". Whereas, if the light state set by the user
is "off", the system changes the state of light to "off".

Figure: Process specification for home automation IoT system
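The process specification above maps naturally to a simple control loop. The following is a minimal Python sketch of the auto/manual decision logic; the helper functions and the light-level threshold are illustrative placeholders, not part of the original design.

import random

LIGHT_LEVEL_THRESHOLD = 300  # assumed threshold separating "low" from "high" light levels

def read_light_level():
    # Placeholder for the LDR reading described in the later Raspberry Pi examples
    return random.randint(0, 1000)

def get_user_light_state():
    # Placeholder for the state stored in the status database by the State service
    return "on"

def set_light(state):
    # Placeholder for driving the relay switch actuator
    print("Light switched " + state)

def run_once(mode):
    if mode == "auto":
        # Auto mode: switch the light on when it gets dark, off otherwise
        if read_light_level() < LIGHT_LEVEL_THRESHOLD:
            set_light("on")
        else:
            set_light("off")
    else:
        # Manual mode: apply the light state set by the user
        set_light(get_user_light_state())

run_once("auto")
run_once("manual")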


Step 3: Domain Model Specifications

The third step in the IoT design methodology is to define the Domain Model. The domain
model describes the main concepts, entities and objects in the domain of IoT system to be
designed. Domain model defines the attributes of the objects and relationships between objects.
Domain model provides an abstract representation of the concepts, objects and entities in the
IoT domain, independent of any specific technology or platform. With the domain model, the
IoT system designers can get an understanding of the IoT domain for which the system is to be
designed. Figure shows the domain model for the home automation system example. The
entities, objects and concepts defined in the domain model include:

Physical Entity: Physical Entity is a discrete and identifiable entity in the physical
environment (e.g. a room, a light, an appliance, a car, etc.). The IoT system provides
information about the Physical Entity (using sensors) or performs actuation upon the Physical
Entity (e.g., switching on a light). In the home automation example, there are two Physical
Entities involved one is the room in the home (of which the lighting conditions are to be
monitored) and the other is the light appliance to be controlled.

Virtual Entity: Virtual Entity is a representation of the Physical Entity in the digital world.
For each Physical Entity, there is a Virtual Entity in the domain model. In the home automation
example, there is one Virtual Entity for the room to be monitored, another for the appliance to
be controlled.

Device: Device provides a medium for interactions between Physical Entities and Virtual
Entities. Devices are either attached to Physical Entities or placed near Physical Entities.
Devices are used to gather information about Physical Entities (e.g., from sensors), perform
actuation upon Physical Entities (e.g. using actuators) or used to identify Physical Entities (e.g.,
using tags). In the home automation example, the device is a single-board mini computer which
has light sensor and actuator (relay switch) attached to it.

Resource: Resources are software components which can be either "on-device" or "network-
resources". On-device resources are hosted on the device and include software components that
either provide information on or enable actuation upon the Physical Entity to which the device
is attached. Network resources include the software components that are available in network
(such as a database). In the home automation example, the on-device resource is the operating
system that runs on the single-board mini computer.
Service: Services provide an interface for interacting with the Physical Entity. Services access
the resources hosted on the device or the network resources to obtain information about the
Physical Entity or perform actuation upon the Physical Entity.

In the home automation example, there are three services: (1) a service that sets mode to auto
or manual, or retrieves the current mode; (2) a service that sets the light appliance state to
on/off, or retrieves the current light state; and (3) a controller service that runs as a native
service on the device. When in auto mode, the controller service monitors the light level and
switches the light on/off and updates the status in the status database. When in manual mode,
the controller service retrieves the current state from the database and switches the light on/off.

Figure: Domain model of the home automation IoT System

Step 4: Information Model Specifications

The fourth step in the IoT design methodology is to define the Information Model. Information
Model defines the structure of all the information in the IoT system, for example, attributes of
Virtual Entities, relations, etc. Information model does not describe the specifics of how the
information is represented or stored. To define the information model, we first list the Virtual
Entities defined in the Domain Model. Information model adds more details to the Virtual
Entities by defining their attributes and relations. In the home automation example, there are
two Virtual Entities - a Virtual Entity for the light appliance (with attribute - light state) and a
Virtual Entity for the room (with attribute - light level). Figure shows the Information Model
for the home automation system example.

Figure: Information Model for the home automation system.
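As an illustration, the two Virtual Entities and their attributes could be captured as simple Python classes. This is only a sketch; the attribute names follow the information model above, while everything else is illustrative.

# Sketch of the information model as plain Python classes
class Room(object):
    """Virtual Entity for the room being monitored."""
    def __init__(self, light_level="low"):
        self.light_level = light_level   # attribute: light level ("low"/"high")

class LightAppliance(object):
    """Virtual Entity for the light appliance being controlled."""
    def __init__(self, light_state="off"):
        self.light_state = light_state   # attribute: light state ("on"/"off")

room = Room(light_level="low")
light = LightAppliance(light_state="on")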

Step 5: Service Specifications

The fifth step in the IoT design methodology is to define the service specifications. Service
specifications define the services in the IoT system, service types, service inputs/output, service
endpoints, service schedules, service preconditions and service effects.
Figures show specifications of the controller, mode and state services of the home automation
system. The Mode service is a RESTful web service that sets mode to auto or manual (PUT
request), or retrieves the current mode (GET request). The mode is updated to/retrieved from
the database. The State service is a RESTful web service that sets the light appliance state to
on/off (PUT request), or retrieves the current light state (GET request). The state is updated
to/retrieved from the status database. The Controller service runs as a native service on the
device. When in auto mode, the controller service monitors the light level and switches the
light on/off and updates the status in the status database. When in manual mode, the controller
service, retrieves the current state from the database and switches the light on/off.
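As a hedged illustration of what the State service could look like, the sketch below uses the Django REST Framework (which the operational view later selects). The LightState model and the view name are assumptions for illustration, not the actual service code.

# Sketch of the State RESTful web service using Django REST Framework
from rest_framework.views import APIView
from rest_framework.response import Response

from myapp.models import LightState   # assumed Django model storing the current state

class StateView(APIView):
    def get(self, request):
        # Retrieve the current light state from the status database
        current = LightState.objects.order_by('-id')[0]
        return Response({'state': current.state})

    def put(self, request):
        # Set the light appliance state to on/off
        LightState.objects.create(state=request.data['state'])
        return Response({'state': request.data['state']})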

Step 6: IoT Level Specification

The sixth step in the IoT design methodology is to define the IoT level for the system. Figure
shows the deployment level of the home automation IoT system, which is level-1.

Figure Deployment level of the home automation IoT system

Step 7: Functional View Specification

The seventh step in the IoT design methodology is to define the Functional View. The
Functional View (FV) defines the functions of the IoT systems grouped into various Functional
Groups (FGs). Each Functional Group either provides functionalities for interacting with
instances of concepts defined in the Domain Model or provides information related to these
concepts.

The Functional Groups (FG) included in a Functional View include:

Device: The device FG contains devices for monitoring and control. In the home automation
example, the device FG includes a single board mini-computer, a light sensor and a relay switch
(actuator).
Communication: The communication FG handles the communication for the IoT system. The
communication FG includes the communication protocols that form the backbone of IoT
systems and enable network connectivity. The communication FG also includes the
communication APIs (such as REST and WebSocket) that are used by the services and
applications to exchange data over the network. In the home automation example the
communication protocols include - 802.11 (link layer), IPv4/IPv6 (network layer), TCP
(transport layer), and HTTP (application layer). The communication API used in the home
automation examples is a REST-based API.

Services: The service FG includes various services involved in the IoT system such as services
for device monitoring, device control services, data publishing services and services for device
discovery. In the home automation example, there are two REST services (mode and state
service) and one native service (controller service).

Management: The management FG includes all functionalities that are needed to configure
and manage the IoT system.

Security: The security FG includes security mechanisms for the IoT system such as
authentication, authorization, data security, etc.

Application: The application FG includes applications that provide an interface to the users to
control and monitor various aspects of the IoT system. Applications also allow users to view
the system status and the processed data.

Figure Mapping deployment level to functional groups for home automation IoT system.
Step 8: Operational View Specification

The eighth step in the IoT design methodology is to define the Operational View Specifications.
In this step, various options pertaining to the IoT system deployment and operation are defined,
such as, service hosting options, storage options, device options, application hosting options,
etc.

Figure Mapping functional groups to operational view specifications for home


automation IoT system.

Operational View specifications for the home automation example are as follows:

Devices: Computing device (Raspberry Pi), light dependent resistor (sensor), relay switch
(actuator).

Communication APIs: REST APIs

Communication Protocols: Link Layer - 802.11, Network Layer - IPv4/IPv6, Transport - TCP,
Application - HTTP.

Services:

1. Controller Service - Hosted on device, implemented in Python and run as a native service.

2. Mode service - RESTful web service, hosted on device, implemented with the Django REST Framework.

3. State service - RESTful web service, hosted on device, implemented with the Django REST Framework.

Application:

Web Application - Django Web Application, Application Server - Django App Server,
Database Server - MySQL.

Security:

Authentication: Web App, Database

Authorization: Web App, Database

Management:

Application Management - Django App Management, Database Management - MySQL DB Management, Device Management - Raspberry Pi Device Management.

Step 9: Device & Component Integration

The ninth step in the IoT design methodology is the integration of the devices and components.
The devices and components used in this example are Raspberry Pi mini computer, LDR sensor
and relay switch actuator.

Step 10: Application Development


The final step in the IoT design methodology is to develop the IoT application. The application
has controls for the mode (auto on or auto off) and the light (on or off). In the auto mode, the
IoT system controls the light appliance automatically based on the lighting conditions in the
room. When auto mode is enabled the light control in the application is disabled and it reflects
the current state of the light. When the auto mode is disabled, the light control is enabled and
it is used for manually controlling the light.

IOT PHYSICAL DEVICES & ENDPOINTS

What is an IoT Device

A "Thing" in Internet of Things (IoT) can be any object that has a unique identifier and which
can send/receive data (including user data) over a network (e.g., smart phone, smart TV,
computer, refrigerator, car, etc.). IoT devices are connected to the Internet and send information
about themselves or about their surroundings (e.g. information sensed by the connected
sensors) over a network (to other devices or servers/storage) or allow actuation upon the
physical entities/environment around them remotely. Some examples of IoT devices are listed
below:

• A home automation device that allows remotely monitoring the status of appliances and
controlling the appliances.
• An industrial machine which sends information about its operation and health monitoring data
to a server.
• A car which sends information about its location to a cloud-based service.
• A wireless-enabled wearable device that measures data about a person such as the number of
steps walked and sends the data to a cloud-based service.

Basic building blocks of an IoT Device

An IoT device can consist of a number of modules based on functional attributes, such as:

• Sensing: Sensors can be either on-board the IoT device or attached to the device. IoT device
can collect various types of information from the on-board or attached sensors such as
temperature, humidity, light intensity, etc. The sensed information can be communicated
either to other devices or to cloud-based servers/storage.
• Actuation: IoT devices can have various types of actuators attached that allow taking actions
upon the physical entities in the vicinity of the device. For example, a relay switch connected
to an IoT device can turn an appliance on/off based on the commands sent to the device.
• Communication: Communication modules are responsible for sending collected data to other
devices or cloud-based servers/storage and receiving data from other devices and commands
from remote applications.
• Analysis & Processing: Analysis and processing modules are responsible for making sense of
the collected data.

Figure 7.1 shows a generic block diagram of a single-board computer (SBC) based IoT device
that includes CPU, GPU, RAM, storage and various types of interfaces and peripherals.
Exemplary Device: Raspberry Pi

Raspberry Pi is a low-cost mini-computer with the physical size of a credit card. Raspberry Pi
runs various flavors of Linux and can perform almost all tasks that a normal desktop computer
can do. In addition to this, Raspberry Pi also allows interfacing sensors and actuators through
the general purpose I/O pins. Since Raspberry Pi runs Linux operating system, it supports
Python "out of the box".

About the Board

Figure shows the Raspberry Pi board with the various components/peripherals labeled.

Processor & RAM: Raspberry Pi is based on an ARM processor. The latest version of
Raspberry Pi (Model B, Revision 2) comes with 700 MHz Low Power ARM1176JZ-F
processor and 512 MB SDRAM.

USB Ports: Raspberry Pi comes with two USB 2.0 ports. The USB ports on Raspberry Pi can provide a current of up to 100 mA. For connecting devices that draw more than 100 mA, an externally powered USB hub is required.

Ethernet Port: Raspberry Pi comes with a standard RJ45 Ethernet port. You can connect an Ethernet cable or a USB WiFi adapter to provide Internet connectivity.

HDMI Output: The HDMI port on Raspberry Pi provides both video and audio output. You
can connect the Raspberry Pi to a monitor using an HDMI cable. For monitors that have a DVI
port but no HDMI port, you can use an HDMI to DVI adapter/cable.

Composite Video Output: Raspberry Pi comes with a composite video output with an RCA
jack that supports both PAL and NTSC video output. The RCA jack can be used to connect old
televisions that have an RCA input only.

Audio Output: Raspberry Pi has a 3.5mm audio output jack. This audio jack is used for
providing audio output to old televisions along with the RCA jack for video. The audio quality
from this jack is inferior to the HDMI output.

• GPIO Pins: Raspberry Pi comes with a number of general purpose input/output pins. Figure shows the Raspberry Pi GPIO headers. There are four types of pins on Raspberry Pi - true GPIO pins, I2C interface pins, SPI interface pins and serial Rx and Tx pins.
Display Serial Interface (DSI): The DSI interface can be used to connect an LCD panel to
Raspberry Pi.

Camera Serial Interface (CSI): The CSI interface can be used to connect a camera module
to Raspberry Pi.

Status LEDs: Raspberry Pi has five status LEDs. Table 7.1 lists Raspberry Pi status LEDs
and their functions.

• SD Card Slot: Raspberry Pi does not have built-in storage or a built-in operating system. You can plug an SD card loaded with a Linux image into the SD card slot. Appendix-A provides instructions on setting up New Out-of-the-Box Software (NOOBS) on Raspberry Pi. You will require at least an 8GB SD card for setting up NOOBS.

• Power Input: Raspberry Pi has a micro-USB connector for power input.


Linux on Raspberry Pi

Raspberry Pi supports various flavors of Linux including:

• Raspbian: Raspbian Linux is a Debian Wheezy port optimized for Raspberry Pi. This is the recommended Linux for Raspberry Pi. Appendix-A provides instructions on setting up Raspbian on Raspberry Pi.

• Arch: Arch is an Arch Linux port for ARM devices.


• Pidora: Pidora Linux is a Fedora Linux optimized for Raspberry Pi.
• RaspBMC: RaspBMC is an XBMC media-center distribution for Raspberry Pi.
• OpenELEC : OpenELEC is a fast and user-friendly XBMC media-center distribution.
• RISC OS: RISC OS is a very fast and compact operating system.
Figure 7.4 shows the Raspbian Linux desktop on Raspberry Pi. Figure 7.5 shows the default file explorer on Raspbian. Figure 7.6 shows the default console on Raspbian. Figure 7.7 shows the default browser on Raspbian. To configure Raspberry Pi, the raspi-config tool is used, which can be launched from the command line as raspi-config, as shown in Figure 7.8. Using the configuration tool you can expand the root partition to fill the SD card, set the keyboard layout, change the password, set the locale and timezone, change the memory split, enable or disable the SSH server and change the boot behavior. It is recommended to expand the root file-system so that you can use the entire space on the SD card.
Though Raspberry Pi comes with an HDMI output, it is more convenient to access the device
with a VNC connection or SSH. This does away with the need for a separate display for
Raspberry Pi and you can use Raspberry Pi from your desktop or laptop computer. Appendix-
A provides instructions on setting up VNC server on Raspberry Pi and the instructions to
connect to Raspberry Pi with SSH. Table 7.2 lists the frequently used commands on Raspberry
Pi.
RASPBERRY PI INTERFACES

Raspberry Pi has serial, SPI and I2C interfaces for data transfer as shown in Figure 7.3.
7.5.1 Serial
The serial interface on Raspberry Pi has receive (Rx) and transmit (Tx) pins for communication
with serial peripherals.
7.5.2 SPI
Serial Peripheral Interface (SPI) is a synchronous serial data protocol used for communicating
with one or more peripheral devices. In an SPI connection, there is one master device and one
or more peripheral devices. There are five pins on Raspberry Pi for SPI interface:
• MISO (Master In Slave Out): Line used by the peripherals (slaves) to send data to the master.
• MOSI (Master Out Slave In): Line used by the master to send data to the peripherals.
• SCK (Serial Clock): Clock generated by the master to synchronize data transmission.
• CE0 (Chip Enable 0): To enable or disable devices.
• CE1 (Chip Enable 1): To enable or disable devices.
7.5.3 I2C
The I2C interface pins on Raspberry Pi allow you to connect hardware modules. I2C interface
allows synchronous data transfer with just two pins - SDA (data line) and SCL (clock line).
PROGRAMMING RASPBERRY PI WITH PYTHON
Raspberry Pi runs Linux and supports Python out of the box. Therefore, you can run any Python program that runs on a normal computer. However, it is the general purpose input/output capability provided by the GPIO pins on Raspberry Pi that makes it a useful device for the Internet of Things. You can interface a wide variety of sensors and actuators with Raspberry Pi using the GPIO pins and the SPI, I2C and serial interfaces. Input from the sensors connected to Raspberry Pi can be processed and various actions can be taken, for instance, sending data to a server, sending an email, or triggering a relay switch.
7.6.1 Controlling LED with Raspberry Pi
Let us start with a basic example of controlling an LED from Raspberry Pi. Figure 7.9 shows
the schematic diagram of connecting an LED to Raspberry Pi. Box 7.1 shows how to turn the
LED on/off from command line. In this example the LED is connected to GPIO pin 18. You
can connect the LED to any other GPIO pin as well.
Box 7.2 shows a Python program for blinking an LED connected to Raspberry Pi every second. The program uses the RPi.GPIO module to control the GPIO pins on Raspberry Pi. In this program we set the direction of pin 18 to output and then write True/False alternately after a delay of one second.
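Since the listing in Box 7.2 is not included above, the following is a minimal sketch of such a blink program using the RPi.GPIO module (BCM pin numbering, LED on pin 18 as in the example):

# Sketch of an LED blink program using RPi.GPIO (LED on GPIO pin 18, BCM numbering)
import time
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)        # use BCM GPIO numbering
GPIO.setup(18, GPIO.OUT)      # set pin 18 direction to output

try:
    while True:
        GPIO.output(18, True)     # LED on
        time.sleep(1)
        GPIO.output(18, False)    # LED off
        time.sleep(1)
finally:
    GPIO.cleanup()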
7.6.2 Interfacing an LED and Switch with Raspberry Pi
Now let us look at a more detailed example involving an LED and a switch that is used to
control the LED.
Figure 7.10 shows the schematic diagram of connecting an LED and switch to Raspberry Pi.
Box 7.3 shows a Python program for controlling an LED with a switch. In this example the
LED is connected to GPIO pin 18 and switch is connected to pin 25. In the infinite while loop
the value of pin 25 is checked and the state of LED is toggled if the switch is pressed. This
example shows how to get input from GPIO pins and process the input and take some action.
The action in this example is toggling the state of an LED. Let us look at another example, in
which the action is an email alert. Box 7.4 shows a Python program for sending an email on
switch press. Note that the structure of this program is similar to the program in Box 7.3. This
program uses the Python SMTP library for sending an email when the switch connected to
Raspberry Pi is pressed.
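A minimal sketch of the switch-controlled LED program described above (the listing in Box 7.3 is not included here; the pull-down wiring and debounce delay are assumptions). The email variant of Box 7.4 would replace the toggle action with a call to Python's smtplib.

# Sketch of controlling an LED (pin 18) with a switch (pin 25) using RPi.GPIO
import time
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)
GPIO.setup(18, GPIO.OUT)                               # LED pin as output
GPIO.setup(25, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)    # switch pin as input

state = False
try:
    while True:
        if GPIO.input(25):            # switch pressed
            state = not state         # toggle the LED state
            GPIO.output(18, state)
            time.sleep(0.3)           # crude debounce delay
        time.sleep(0.05)
finally:
    GPIO.cleanup()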
7.6.3 Interfacing a Light Sensor (LDR) with Raspberry Pi
So far you have learned how to interface LED and switch with Raspberry Pi. Now let us look
at an example of interfacing a Light Dependent Resistor (LDR) with Raspberry Pi and turning
an LED on/off based on the light-level sensed.
Figure 7.11 shows the schematic diagram of connecting an LDR to Raspberry Pi. Connect one side of the LDR to 3.3V and the other side to a 1uF capacitor and also to a GPIO pin (pin 18 in this example). An LED connected to a separate GPIO pin is switched on/off based on the light level sensed. Box 7.5 shows the Python program for the LDR example. The readLDR() function returns a count that reflects the light level. In this function the LDR pin is first set to output and driven low (to discharge the capacitor) and then set to input. At this point the capacitor starts charging through the LDR (and a counter is started) until the input pin reads high, which happens when the capacitor voltage rises above about 1.4V. The counter is stopped when the input reads high. The final count is a measure of the light level: the greater the amount of light, the smaller the LDR resistance, the faster the capacitor charges, and the smaller the count.
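A minimal sketch of the readLDR() approach described above (the LED pin and the darkness threshold are assumptions for illustration):

# Sketch of reading an LDR by timing the RC charge on a GPIO pin (RPi.GPIO)
import time
import RPi.GPIO as GPIO

LDR_PIN = 18        # LDR/capacitor junction
LED_PIN = 23        # assumed LED pin for illustration
THRESHOLD = 1000    # assumed count above which the room is considered dark

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

def readLDR(pin):
    count = 0
    # Discharge the capacitor by driving the pin low
    GPIO.setup(pin, GPIO.OUT)
    GPIO.output(pin, False)
    time.sleep(0.1)
    # Switch to input and count until the capacitor charges past the logic-high level
    GPIO.setup(pin, GPIO.IN)
    while GPIO.input(pin) == False:
        count += 1
    return count

try:
    while True:
        level = readLDR(LDR_PIN)
        GPIO.output(LED_PIN, level > THRESHOLD)   # switch the LED on when it is dark
        time.sleep(1)
finally:
    GPIO.cleanup()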
OTHER IOT DEVICES
Let us look at single-board mini-computers which are alternatives to Raspberry Pi. Table 7.3
provides a comparison of some single-board mini-computers that can be used for IoT.

1. pcDuino
pcDuino is an Arduino-pin compatible single board mini-computer that comes with a 1 GHz
ARM Cortex-A8 processor. pcDuino is a high-performance and cost-effective device that runs PC-like operating systems such as Ubuntu and Android ICS. Like Raspberry Pi, it has an HDMI video/audio
interface. pcDuino supports various programming languages including C, C++ (with GNU tool
chain), Java (with standard Android SDK) and Python.

2. BeagleBone Black
BeagleBone Black is similar to Raspberry Pi, but a more powerful device. It comes with a 1
GHz ARM Cortex-A8 processor and supports both Linux and Android operating systems. Like
Raspberry Pi, it has HDMI video/audio interface, USB and Ethernet ports.

3. Cubieboard
Cubieboard is powered by a dual-core ARM Cortex-A7 processor and has a range of input/output interfaces including USB, HDMI, IR, serial, Ethernet, SATA, and a 96-pin extended interface. The board can run both Linux and Android operating systems.
IOT PHYSICAL SERVERS & CLOUD OFFERINGS
Introduction to Cloud Storage Models & Communication APIs
Cloud computing is a transformative computing paradigm that involves delivering applications and services over the Internet. NIST defines cloud computing as: "Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction."
WAMP - AUTOBAHN FOR IOT
Web Application Messaging Protocol (WAMP) is a sub-protocol of Websocket which provides
publish-subscribe and remote procedure call (RPC) messaging patterns. WAMP enables
distributed application architectures where the application components are distributed on
multiple nodes and communicate with messaging patterns provided by WAMP.
Let us look at the key concepts of WAMP:

• Transport: Transport is a channel that connects two peers. The default transport for WAMP is WebSocket. WAMP can run over other transports as well, provided they support message-based, reliable, bi-directional communication.

• Session: Session is a conversation between two peers that runs over a transport.

• Client: Clients are peers that can have one or more roles. In the publish-subscribe model a client can have the following roles:
- Publisher: Publisher publishes events (including payload) to the topic maintained by the Broker.
- Subscriber: Subscriber subscribes to the topics and receives the events including the payload.
In the RPC model a client can have the following roles:
- Caller: Caller issues calls to the remote procedures along with call arguments.
- Callee: Callee executes the procedures to which the calls are issued by the caller and returns the results back to the caller.

• Router: Routers are peers that perform generic call and event routing. In the publish-subscribe model the Router has the role of a Broker:
- Broker: Broker acts as a router and routes messages published to a topic to all subscribers subscribed to the topic.
In the RPC model the Router has the role of a Dealer:
- Dealer: Dealer acts as a router and routes RPC calls from the Caller to the Callee and routes results from the Callee back to the Caller.

• Application Code: Application code runs on the Clients (Publisher, Subscriber, Callee or Caller).

Figure 8.1 shows a WAMP Session between Client and Router, established over a Transport.
Figure 8.2 shows the WAMP protocol interactions between peers. In this figure the WAMP
transport used is WebSocket. WAMP sessions are established over WebSocket transport within
the lifetime of WebSocket transport.

Figure 8.1 WAMP session between Client and Router

Figure 8.3 shows the communication between various components of a typical WAMP-
AutoBahn deployment. The Client (in Publisher role) runs a WAMP application component
that publishes messages to the Router. The Router (in Broker role) runs on the Server and routes
the messages to the Subscribers. The Router (in Broker role) decouples the Publisher from the
Subscribers. The communication between Publisher - Broker and Broker - Subscribers happens
over a WAMP-WebSocket session.

Let us look at an example of a WAMP publisher and subscriber implemented using AutoBahn.
Box 8.1 shows the commands for installing AutoBahn-Python.

#Set up AutoBahn
sudo apt-get install python-twisted python-dev
sudo apt-get install python-pip
sudo pip install --upgrade twisted
sudo pip install --upgrade autobahn
After installing AutoBahn, clone AutobahnPython from GitHub as follows:
git clone https://github.com/tavendo/AutobahnPython.git
Create a WAMP publisher component as shown in Box 8.2. The publisher component publishes a message containing the current time-stamp to a topic named 'test-topic'. Next, create a WAMP subscriber component as shown in Box 8.3. The subscriber component then subscribes to 'test-topic'. Run the application router on a WebSocket transport server as follows:

python AutobahnPython/examples/twisted/wamp/basic/server.py

Run the publisher component over a WebSocket transport client as follows:

python AutobahnPython/examples/twisted/wamp/basic/client.py --component "publisherApp.Component"

Run the subscriber component over a WebSocket transport client as follows:

python AutobahnPython/examples/twisted/wamp/basic/client.py --component "subscriberApp.Component"
Example of a WAMP Publisher implemented using the AutoBahn framework - publisherApp.py

from twisted.internet import reactor
from twisted.internet.defer import inlineCallbacks
from autobahn.twisted.util import sleep
from autobahn.twisted.wamp import ApplicationSession
import time, datetime

def getData():
    #Generate a message containing the current time-stamp
    timestamp = datetime.datetime.fromtimestamp(
        time.time()).strftime('%Y-%m-%d %H:%M:%S')
    data = "Message at time-stamp: " + str(timestamp)
    return data

#An application component that publishes an event every second
class Component(ApplicationSession):

    @inlineCallbacks
    def onJoin(self, details):
        while True:
            data = getData()
            self.publish('test-topic', data)
            yield sleep(1)
Example of a WAMP Subscriber implemented using the AutoBahn framework - subscriberApp.py

from twisted.internet import reactor
from twisted.internet.defer import inlineCallbacks
from autobahn.twisted.wamp import ApplicationSession

#An application component that subscribes to a topic and receives events
class Component(ApplicationSession):

    @inlineCallbacks
    def onJoin(self, details):
        self.received = 0

        def on_event(data):
            print "Received message: " + data

        yield self.subscribe(on_event, 'test-topic')

    def onDisconnect(self):
        reactor.stop()
While you can set up the server and client processes on a local machine for trying out the publish-subscribe example, in a production environment these components run on separate machines. The server process (the brains or the "Thing Tank"!) is set up on a cloud-based instance, while the client processes can run either on local hosts/devices or in the cloud.

XIVELY CLOUD FOR IOT


Xively is a commercial Platform-as-a-Service that can be used for creating solutions for
Internet of Things. With Xively cloud, IoT developers can focus on the front-end infrastructure
and devices for IoT (that generate the data), while the backend data collection infrastructure is
managed by Xively.
Figure 8.4: Screenshot of Xively dashboard - creating a new device
The Xively platform comprises a message bus for real-time message management and routing,
data services for time series archiving, directory services that provides a search-able directory
of objects and business services for device provisioning and management. Xively provides an
extensive support for various languages and platforms. The Xively libraries leverage standards-
based API over HTTP, Sockets and MQTT for connecting IoT devices to the Xively cloud. In
this chapter we will describe how to use the Xively Python library.
To start using Xively, you have to register for a developer account. You can then create
development devices on Xively. Figure 8.4 shows a screenshot of how to create a new device from the Xively dashboard. When you create a device, Xively automatically creates a Feed-ID and an API key to connect to the device, as shown in Figure 8.5. Each device has a
unique Feed-ID. Feed-ID is a collection of channels or datastreams defined for a device and
the associated meta-data. API keys are used to provide different levels of permissions. The
default API key has read, update, create and delete permissions.
Xively devices have one or more channels. Each channel enables bi-directional
communication between the IoT devices and the Xively cloud. IoT devices can send data to a
channel using the Xively APIs. For each channel, you can create one or more triggers. A trigger
specification includes a channel to which the trigger corresponds, trigger condition
(e.g. channel value less than or greater than a certain value) and an HTTP POST URL to which
the request is sent when the trigger fires. Triggers are used for integration with third-party
applications.
Let us look at an example of using Xively cloud for an IoT system that monitors temperature
and sends the measurements to a Xively channel. The temperature monitoring device can be
built with the Raspberry Pi board and a temperature sensor connected to the board. The
Raspberry Pi runs a controller program that reads the sensor values every few seconds and
sends the measurements to a Xively channel. Box 8.4 shows the Python program for sending temperature data to the Xively Cloud. This example uses the Xively Python library. To
keep the program simple and without going into the details of the temperature sensor we use
synthetic data (generated randomly in readTempSensor() function). In this controller program,
a feed object is created by providing the API key and Feed-ID. Then a channel named
temperature is created (if not existing) or retrieved. The temperature data is sent to this channel
in the runController() function every 10 seconds. Figures 8.6 shows the temperature channel in
the Xively dashboard. In this example we created a single Xively device with one channel. In
real-world scenario each Xively device can have multiple channels and you can have multiple
devices in a production batch.

Python program for sending data to Xively Cloud

import time
import datetime
import requests
import xively
from random import randint

#Initialize Xively feed
FEED_ID = "<enter feed-id>"
API_KEY = "<enter api-key>"
api = xively.XivelyAPIClient(API_KEY)

#Function to read the temperature sensor (returns synthetic data)
def readTempSensor():
    #Return a random value between 20 and 30
    return randint(20, 30)

#Function to get an existing, or create a new, Xively datastream for temperature
def get_tempdatastream(feed):
    try:
        datastream = feed.datastreams.get("temperature")
        return datastream
    except:
        datastream = feed.datastreams.create("temperature", tags="temperature")
        return datastream

#Controller setup function
def setupController():
    global temp_datastream
    feed = api.feeds.get(FEED_ID)
    feed.location.lat = "30.733315"
    feed.location.lon = "76.779418"
    feed.tags = "Weather"
    feed.update()
    temp_datastream = get_tempdatastream(feed)
    temp_datastream.max_value = None
    temp_datastream.min_value = None

#Controller main function
def runController():
    global temp_datastream
    temperature = readTempSensor()
    temp_datastream.current_value = temperature
    temp_datastream.at = datetime.datetime.utcnow()
    print "Updating Xively feed with Temperature: %s" % temperature
    try:
        temp_datastream.update()
    except requests.HTTPError as e:
        print "HTTPError({0}): {1}".format(e.errno, e.strerror)

setupController()
while True:
    runController()
    time.sleep(10)
PYTHON WEB APPLICATION FRAMEWORK - DJANGO
To build IoT applications that are backed by the Xively cloud or any other data collection system, you would require some type of web application framework.
Django is an open source web application framework for developing web applications in
Python. A "web application framework" in general is a collection of solutions, packages and
best practices that allows development of web applications and dynamic websites. Django is
based on the well-known Model-Template-View architecture and provides a separation of the
data model from the business rules and the user interface. Django provides a unified API to a
database backend. Therefore, web applications built with Django can work with different
databases without requiring any code changes. With this flexibility in web application design
combined with the powerful capabilities of the Python language and the Python ecosystem,
Django is best suited for IoT applications. Django, concisely stated, consists of an object-
relational mapper, a web templating system and a regular-expression-based URL dispatcher.
Django Architecture
Django is a Model-Template-View (MTV) framework wherein the roles of model, template
and view, respectively, are:
Model
The model acts as a definition of some stored data and handles the interactions with the
database. In a web application, the data can be stored in a relational database, non-relational
database, an XML file, etc. A Django model is a Python class that outlines the variables and
methods for a particular type of data.
Template
In a typical Django web application, the template is simply an HTML page with a few extra
placeholders. Django's template language can be used to create various forms of text files
(XML, email, CSS, Javascript, CSV, etc.)
View
The view ties the model to the template. The view is where you write the code that actually
generates the web pages. View determines what data is to be displayed, retrieves the data from
the database and passes the data to the template.
Starting Development with Django
Creating a Django Project and App
The box below provides the commands for creating a Django project and an application within a project. When you create a new Django project, a number of files are created, as described below:

__init__.py: This file tells Python that this folder is a Python package.
manage.py: This file contains an array of functions for managing the site.
settings.py: This file contains the website's settings.
urls.py: This file contains the URL patterns that map URLs to pages.
A Django project can have multiple applications ("apps"). Apps are where you write the code
that makes your website function. Each project can have multiple apps and each app can be
part of multiple projects.
When a new application is created, a new directory for the application is also created, which has a number of files including:

models.py: This file contains the description of the models for the application.
views.py: This file contains the application views.
Creating a new Django project and an app in the project

#Create a new project
django-admin.py startproject blogproject

#Create an application within the project
python manage.py startapp myapp

#Start the development server
python manage.py runserver

#Django uses port 8000 by default. The project can be viewed at the URL:
#http://localhost:8000
Django comes with a built-in, lightweight Web server that can be used for development
purposes. When the Django development server is started the default project can be viewed at
the URL: http://localhost:8000. Figure 8.7 shows a screenshot of the default project.

Configuring a Database
Till now you have learned how to create a new Django project and an app within the project.
Most web applications have a database backend. Developers have a wide choice of databases
that can be used for web applications including both relational and non-relational databases.
Django provides a unified API for database backends thus giving the freedom to choose the
database. Django supports various relational database engines including MySQL, PostgreSQL,
Oracle and SQLite3. Support for non-relational databases such as MongoDB can be added by
installing additional engines (e.g. Django-MongoDB engine for MongoDB).
Let us look at examples of setting up a relational and a non-relational database with a Django
project. The first step in setting up a database is to install and configure a database server. After installing the database, the next step is to specify the database settings in the settings.py file in the Django project.
Box 8.6 shows the commands to setup MySQL. Box 8.7 shows the database setting to use
MySQL with a Django project.
Box 8.6: Setting up MySQL database

#Install MySQL
sudo apt-get install mysql-server mysql-client
sudo mysqladmin -u root -h localhost password 'mypassword'

Box 8.7: Configuring MySQL with Django - settings.py

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': '<database-name>',
        'USER': 'root',
        'PASSWORD': 'mypassword',
        'HOST': '<hostname>',   # set to empty string for localhost
        'PORT': '<port>',       # set to empty string for default port
    }
}
Box 8.8 shows the commands to set up MongoDB and the associated Django-MongoDB engine. Box 8.9 shows the database settings to use MongoDB within a Django project.

Box 8.8: Setting up MongoDB and Django-MongoDB engine

#Install MongoDB
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv 7F0CEB10
echo 'deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen' | sudo tee /etc/apt/sources.list.d/10gen.list
sudo apt-get update
sudo apt-get install mongodb-10gen

#Setup Django MongoDB Engine
sudo pip install https://bitbucket.org/wkornewald/django-nonrel/get/tip.tar.gz
sudo pip install https://bitbucket.org/wkornewald/djangotoolbox/get/tip.tar.gz
sudo pip install https://github.com/django-nonrel/mongodb-engine/tarball/master

Box 8.9: Configuring MongoDB with Django - settings.py

DATABASES = {
    'default': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': '<database-name>',
        'HOST': '<mongodb-hostname>',  # set to empty string for localhost
        'PORT': '<mongodb-port>',      # set to empty string for default port
    }
}
Defining a Model
A Model acts as a definition of the data in the database. In this section we will explain Django
with the help of a weather station application that displays the temperature data collected by an
IoT device. Box 8.10 shows an example of a Django model for TemperatureData. The
TemperatureData table in the database is defined as a Class in the Django model.
Each class that represents a database table is a subclass of django.db.models.Model, which contains all the functionality that allows the models to interact with the database. The TemperatureData class has the fields timestamp, temperature, lat and lon, all of which are CharFields. To sync the models with the database, simply run the following command:

python manage.py syncdb

When the syncdb command is run for the first time, it creates all the tables defined in the Django model in the configured database.
Box 8.10: Example of a Django model

from django.db import models

class TemperatureData(models.Model):
    timestamp = models.CharField(max_length=10)
    temperature = models.CharField(max_length=5)
    lat = models.CharField(max_length=10)
    lon = models.CharField(max_length=10)

    def __unicode__(self):
        return self.timestamp
Django Admin Site
Django provides an administration system that allows you to manage the website without
writing additional code. This "admin" system reads the Django model and provides an interface
that can be used to add content to the site. The Django admin site is enabled by adding
django.contrib.admin and django.contrib.admindocs to the INSTALLED_APPS section in the
settings.py file.
To define which of your application models should be editable in the admin interface, a new
file named admin.py is created in the application folder as shown in Box 8.11.
Box 8.11: Enabling admin for Django models

from django.contrib import admin
from myapp.models import TemperatureData

admin.site.register(TemperatureData)
Figure 8.8 shows a screenshot of the default admin interface. You can see all the tables
corresponding to the Django models in this screenshot. Figure 8.9 shows how to add new items
in the Temperature Data table using the admin site.
Defining a View
The View contains the logic that glues the model to the template. The view determines the data
to be displayed in the template, retrieves the data from the database and passes it to the
template. Conversely, the view also extracts the data posted in a form in the template and
inserts it in the database. Typically, each page in the website has a separate view, which is
basically a Python function in the views.py file. Views can also perform additional tasks such as authentication, sending emails, etc. Django's unified database API removes the need to use the specific database backend modules (e.g. MySQLdb for MySQL, PyMongo for MongoDB, etc.) to write database-backend-specific code. For more information about the Django views refer to the Django documentation [118].
In the view shown in Box 8.12, the TemperatureData.objects.order_by('-id')[0] query returns the latest entry in the table. To retrieve all entries, you can use table.objects.all(). To retrieve
specific entries, you can use table.objects.filter(**kwargs) to filter out queries that match the
specified condition. To render the retrieved entries in the template, the render_to_response
function is used. This function renders a given template with a given context dictionary and
returns an HttpResponse object with that rendered text. Box 8.13 shows an alternative view
that retrieves data from the Xively cloud.
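Since the listing in Box 8.12 is not included above, the following is a hedged sketch of what such a view might look like, assuming the TemperatureData model defined earlier and a template named current.html:

# Sketch of a Django view that passes the latest temperature entry to a template
from django.shortcuts import render_to_response

from myapp.models import TemperatureData

def current_temperature(request):
    # Retrieve the latest entry in the TemperatureData table
    latest = TemperatureData.objects.order_by('-id')[0]
    # Render the template with the retrieved entry in the context dictionary
    return render_to_response('current.html', {'data': latest})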
AMAZON WEB SERVICES FOR IOT
1 Amazon EC2
Amazon EC2 is an Infrastructure-as-a-Service (IaaS) provided by Amazon. EC2 delivers
scalable, pay-as-you-go compute capacity in the cloud. EC2 is a web service that provides
computing capacity in the form of virtual machines that are launched in Amazon's cloud
computing environment. EC2 can be used for several purposes for IoT systems. For example,
IoT developers can deploy IoT applications (developed in frameworks such as Django) on EC2,
setup IoT platforms with REST web services, etc.
Let us look at some examples of using EC2. Box 8.23 shows the Python code for launching an
EC2 instance. In this example, a connection to EC2 service is first established by calling
boto.ec2.connect_to_region. The EC2 region, AWS access key and AWS secret key are passed
to this function. After connecting to EC2, a new instance is launched using the
conn.run_instances function. The AMI-ID, instance type, EC2 key handle and security group
are passed to this function. This function returns a reservation. The instances associated with
the reservation are obtained using reservation.instances. Finally the status of an instance
associated with a reservation is obtained using the instance.update function. In the example
shown in Box 8.23, the program waits till the status of the newly launched instance becomes
running and then prints the instance details such as public DNS, instance IP, and launch time.
Box 8.23: Python program for launching an EC2 instance

import boto.ec2
from time import sleep

ACCESS_KEY = "<enter access key>"
SECRET_KEY = "<enter secret key>"
REGION = "us-east-1"
AMI_ID = "ami-d0f89fb9"
EC2_KEY_HANDLE = "<enter key handle>"
INSTANCE_TYPE = "t1.micro"
SECGROUP_HANDLE = "default"

print "Connecting to EC2"
conn = boto.ec2.connect_to_region(REGION,
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY)

print "Launching instance with AMI-ID %s, keypair %s, instance type %s, security group %s" % (
    AMI_ID, EC2_KEY_HANDLE, INSTANCE_TYPE, SECGROUP_HANDLE)

reservation = conn.run_instances(image_id=AMI_ID,
    key_name=EC2_KEY_HANDLE,
    instance_type=INSTANCE_TYPE,
    security_groups=[SECGROUP_HANDLE])
instance = reservation.instances[0]

print "Waiting for instance to be up and running"
status = instance.update()
while status == 'pending':
    sleep(10)
    status = instance.update()

if status == 'running':
    print "Instance is now running. Instance details are:"
    print "Instance Size: " + str(instance.instance_type)
    print "Instance State: " + str(instance.state)
    print "Instance Launch Time: " + str(instance.launch_time)
    print "Instance Public DNS: " + str(instance.public_dns_name)
    print "Instance Private DNS: " + str(instance.private_dns_name)
    print "Instance IP: " + str(instance.ip_address)
    print "Instance Private IP: " + str(instance.private_ip_address)
Box 8.24 shows the Python code for stopping an EC2 instance. In this example the
conn.get_all_instances function is called to get information on all running instances. This
function returns reservations. Next, the IDs of instances associated with each reservation are
obtained. The instances are stopped by calling conn.stop_instances function to which the IDs
of the instances to stop are passed.
Box 8.24: Python program for stopping an EC2 instance

import boto.ec2
from time import sleep

ACCESS_KEY = "<enter access key>"
SECRET_KEY = "<enter secret key>"
REGION = "us-east-1"

print "Connecting to EC2"
conn = boto.ec2.connect_to_region(REGION,
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY)

print "Getting all running instances"
reservations = conn.get_all_instances()
print reservations

instance_rs = reservations[0].instances
instance = instance_rs[0]
instanceid = instance_rs[0].id

print "Stopping instance with ID: " + str(instanceid)
conn.stop_instances(instance_ids=[instanceid])

status = instance.update()
while not status == 'stopped':
    sleep(10)
    status = instance.update()

print "Stopped instance with ID: " + str(instanceid)

2 Amazon AutoScaling
Amazon AutoScaling allows automatically scaling Amazon EC2 capacity up or down
according to user defined conditions. Therefore, with AutoScaling users can increase the
number of EC2 instances running their applications seamlessly during spikes in the application
workloads to meet the application performance requirements and scale down capacity when
the workload is low to save costs. AutoScaling can be used for auto scaling IoT applications
and IoT platforms deployed on Amazon EC2.
Let us now look at some examples of using AutoScaling. Box 8.25 shows the Python code for
creating an AutoScaling group. In this example, a connection to AutoScaling service is first
established by calling boto.ec2.autoscale.connect_to_region function.
The EC2 region, AWS access key and AWS secret key are passed to this function.

After connecting to AutoScaling service, a new launch configuration is created by calling


conn.create_launch_configuration. Launch configuration contains instructions on how to
launch new instances including the AMI-ID, instance type, security groups, etc. After creating
a launch configuration, it is then associated with a new AutoScaling group. AutoScaling group
is created by calling conn.create_auto_scaling_group. The settings for AutoScaling group
include maximum and minimum number of instances in the group, launch configuration,
availability zones, optional load balancer to use with the group, etc. After creating an
AutoScaling group, the policies for scaling up and scaling down are defined. In this example,
a scale up policy with adjustment type ChangeInCapacity and scaling_adjustment = 1 is
defined. Similarly, a scale down policy with adjustment type ChangeInCapacity and scaling_adjustment = -1 is defined. With the scaling policies defined, the next step is to create Amazon
CloudWatch alarms that trigger these policies. In this example, alarms for scaling up and
scaling down are created. The scale up alarm is defined using the CPUUtilization metric with
the Average statistic and a threshold greater than 70% for a period of 60 seconds. The scale up policy
created previously is associated with this alarm. This alarm is triggered when the average CPU
utilization of the instances in the group becomes greater than 70% for more than 60 seconds.
The scale down alarm is defined in a similar manner with a threshold less than 50%.
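Box 8.25 is not reproduced above; the sketch below illustrates the flow described in this section using the boto (boto2-era) AutoScaling and CloudWatch APIs. The AMI-ID, group and policy names, availability zone and thresholds are illustrative assumptions.

# Sketch of creating an AutoScaling group with scale-up/scale-down policies (boto2 API)
import boto.ec2.autoscale
import boto.ec2.cloudwatch
from boto.ec2.autoscale import LaunchConfiguration, AutoScalingGroup, ScalingPolicy
from boto.ec2.cloudwatch import MetricAlarm

ACCESS_KEY = "<enter access key>"
SECRET_KEY = "<enter secret key>"
REGION = "us-east-1"

conn = boto.ec2.autoscale.connect_to_region(REGION,
    aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)

# Launch configuration: how new instances are launched (AMI-ID, instance type, etc.)
lc = LaunchConfiguration(name="my-launch-config", image_id="ami-d0f89fb9",
    instance_type="t1.micro", key_name="<enter key handle>",
    security_groups=["default"])
conn.create_launch_configuration(lc)

# AutoScaling group: min/max instances, availability zones, launch configuration
ag = AutoScalingGroup(group_name="my-as-group", launch_config=lc,
    availability_zones=["us-east-1a"], min_size=1, max_size=4, connection=conn)
conn.create_auto_scaling_group(ag)

# Scaling policies: add one instance on scale-up, remove one on scale-down
scale_up = ScalingPolicy(name="scale-up", adjustment_type="ChangeInCapacity",
    as_name="my-as-group", scaling_adjustment=1, cooldown=180)
scale_down = ScalingPolicy(name="scale-down", adjustment_type="ChangeInCapacity",
    as_name="my-as-group", scaling_adjustment=-1, cooldown=180)
conn.create_scaling_policy(scale_up)
conn.create_scaling_policy(scale_down)

# Retrieve the policies to obtain their ARNs for the CloudWatch alarms
scale_up = conn.get_all_policies(as_group="my-as-group", policy_names=["scale-up"])[0]
scale_down = conn.get_all_policies(as_group="my-as-group", policy_names=["scale-down"])[0]

cw = boto.ec2.cloudwatch.connect_to_region(REGION,
    aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)
dimensions = {"AutoScalingGroupName": "my-as-group"}

# Alarm: average CPU above 70% for 60 seconds triggers the scale-up policy
scale_up_alarm = MetricAlarm(name="scale-up-on-cpu", namespace="AWS/EC2",
    metric="CPUUtilization", statistic="Average", comparison=">", threshold="70",
    period="60", evaluation_periods=1, alarm_actions=[scale_up.policy_arn],
    dimensions=dimensions)
# Alarm: average CPU below 50% triggers the scale-down policy
scale_down_alarm = MetricAlarm(name="scale-down-on-cpu", namespace="AWS/EC2",
    metric="CPUUtilization", statistic="Average", comparison="<", threshold="50",
    period="60", evaluation_periods=1, alarm_actions=[scale_down.policy_arn],
    dimensions=dimensions)
cw.create_alarm(scale_up_alarm)
cw.create_alarm(scale_down_alarm)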

3. Amazon S3

Amazon S3 is an online cloud-based data storage infrastructure for storing and retrieving very large amounts of data. S3 provides a highly reliable, scalable, fast, fully redundant and affordable storage infrastructure. S3 can serve as a raw datastore (or "Thing Tank") for IoT systems for storing raw data, such as sensor data, log data, image, audio and video data. Let us look at some examples of using S3. Box 8.26 shows the Python code for uploading a file to Amazon S3 cloud storage. In this example, a connection to the S3 service is first established by calling the boto.connect_s3 function. The AWS access key and AWS secret key are passed to this function. This example defines two functions, upload_to_s3_bucket_path and upload_to_s3_bucket_root. The upload_to_s3_bucket_path function uploads the file to the specified path in the specified S3 bucket. The upload_to_s3_bucket_root function uploads the file to the S3 bucket root.
Box 8.26: Python program for uploading a file to an S3 bucket

import os
import boto.s3

ACCESS_KEY = "<enter access key>"
SECRET_KEY = "<enter secret key>"

conn = boto.connect_s3(aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY)

def percent_cb(complete, total):
    print "."

def upload_to_s3_bucket_path(bucketname, path, filename):
    mybucket = conn.get_bucket(bucketname)
    fullkeyname = os.path.join(path, filename)
    key = mybucket.new_key(fullkeyname)
    key.set_contents_from_filename(filename, cb=percent_cb, num_cb=10)

def upload_to_s3_bucket_root(bucketname, filename):
    mybucket = conn.get_bucket(bucketname)
    key = mybucket.new_key(filename)
    key.set_contents_from_filename(filename, cb=percent_cb, num_cb=10)

upload_to_s3_bucket_path('mybucket2013', 'data', 'file.txt')

4 Amazon RDS
Amazon RDS is a web service that allows you to create instances of MySQL, Oracle or
Microsoft SQL Server in the cloud. With RDS, developers can easily set up, operate, and scale
a relational database in the cloud.
RDS can serve as a scalable datastore for IoT systems. With RDS, IoT system developers can
store any amount of data in scalable relational databases. Let us look at some examples of using
RDS. Box 8.27 shows the Python code for launching an Amazon RDS instance. In this
example, a connection to RDS service is first established by calling boto.rds.connect_to_region
function. The RDS region, AWS access key and AWS secret key are passed to this function.
After connecting to RDS service, the conn.create_dbinstance function is called to launch a new
RDS instance. The input parameters to this function include the instance ID, database size,
instance type, database username, database password, database port, database engine (e.g.
MySQL5.1), database name, security groups, etc. The program shown in Box 8.27 waits till
the status of the RDS instance becomes available and then prints the instance details such as the instance ID, create time and instance endpoint.
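Box 8.27 is not reproduced above; a minimal sketch of launching an RDS instance along the lines described, using the boto (boto2-era) API, could look as follows. The instance ID, storage size, instance class and credentials are illustrative assumptions.

# Sketch of launching an Amazon RDS (MySQL) instance using boto2
import boto.rds
from time import sleep

ACCESS_KEY = "<enter access key>"
SECRET_KEY = "<enter secret key>"
REGION = "us-east-1"

conn = boto.rds.connect_to_region(REGION,
    aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)

# Launch a new RDS instance: ID, storage size (GB), instance class,
# master username/password, port and database engine
db = conn.create_dbinstance(id="mydbinstance", allocated_storage=5,
    instance_class="db.t1.micro", master_username="root",
    master_password="<enter password>", port=3306, engine="MySQL5.1",
    db_name="mydb", security_groups=["default"])

# Wait until the instance becomes available, then print its details
status = db.status
while status != "available":
    sleep(30)
    db = conn.get_all_dbinstances(instance_id="mydbinstance")[0]
    status = db.status

print "Instance ID: " + str(db.id)
print "Create Time: " + str(db.create_time)
print "Endpoint: " + str(db.endpoint)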

5 Amazon DynamoDB
Amazon DynamoDB is a fully-managed, scalable, high performance No-SQL database service.
DynamoDB can serve as a scalable datastore for IoT systems. With DynamoDB, IoT system
developers can store any amount of data and serve any level of requests for the data.
Let us look at some examples of using DynamoDB. Box 8.29 shows the Python code for
creating a DynamoDB table. In this example, a connection to DynamoDB service is first
established by calling boto.dynamodb.connect_to_region.
The DynamoDB region, AWS access key and AWS secret key are passed to this function. After
connecting to DynamoDB service, a schema for the new table is created by calling
conn.create_schema. The schema includes the hash key and range key names and types. A
DynamoDB table is then created by calling conn.create_table function with the table schema,
read units and write units as input parameters.
Box 8.29: Python program for creating a DynamoDB table
import boto.dynamodb

import time
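A hedged sketch of the table-creation flow described above, using the boto2 DynamoDB API (the table name, key names and capacity units are illustrative assumptions):

# Sketch of creating a DynamoDB table with a hash key and range key (boto2 API)
import boto.dynamodb

ACCESS_KEY = "<enter access key>"
SECRET_KEY = "<enter secret key>"
REGION = "us-east-1"

# Connect to the DynamoDB service
conn = boto.dynamodb.connect_to_region(REGION,
    aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)

# Define the table schema: hash key and range key names and types
table_schema = conn.create_schema(hash_key_name='device_id',
    hash_key_proto_value=str,
    range_key_name='timestamp',
    range_key_proto_value=str)

# Create the table with the schema and provisioned read/write capacity units
table = conn.create_table(name='sensor_data', schema=table_schema,
    read_units=10, write_units=10)
print table.status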

6 Amazon Kinesis

Amazon Kinesis is a fully managed commercial service that allows real-time processing of
streaming data. Kinesis scales automatically to handle high volume streaming data coming
from large number of sources. The streaming data collected by Kinesis can be processed by
applications running on Amazon EC2 instances or any other compute instance that can connect
to Kinesis. Kinesis is well suited for IoT systems that generate massive scale data and have
strict real-time requirements for processing the data. Kinesis allows rapid and continuous data
intake and supports data blobs of size up to 50 KB. The data producers (e.g. IoT devices) write data records to Kinesis streams. A data record comprises a sequence number, a partition key and the data blob. Data records in a Kinesis stream are distributed in shards. Each shard
provides a fixed unit of capacity and a stream can have multiple shards. A single shard of
throughput allows capturing 1MB per second of data, at up to 1,000 PUT transactions per
second and allows applications to read data at up to 2 MB per second.
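As a hedged illustration of how an IoT data producer might write records to a Kinesis stream with the boto2 API (the stream name and payload are assumptions):

# Sketch of writing sensor readings to a Kinesis stream using boto2
import json
import time
import boto.kinesis
from random import randint

REGION = "us-east-1"
STREAM_NAME = "sensor-stream"   # assumed stream with at least one shard

conn = boto.kinesis.connect_to_region(REGION)

while True:
    # Build a data record; the partition key determines the shard it is written to
    record = json.dumps({"device_id": "device-1",
                         "temperature": randint(20, 30),
                         "timestamp": time.time()})
    conn.put_record(STREAM_NAME, record, partition_key="device-1")
    time.sleep(1)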

7 Amazon SQS

Amazon SQS offers a highly scalable and reliable hosted queue for storing messages as they
travel between distinct components of applications. SQS guarantees only that messages arrive,
not that they arrive in the same order in which they were put in the queue. Though at first look Amazon SQS may seem similar to Amazon Kinesis, the two are intended for very different types of applications. While Kinesis is meant for real-time applications that involve
high data ingress and egress rates, SQS is simply a queue system that stores and releases
messages in a scalable manner. SQS can be used in distributed IoT applications in which
various application components need to exchange messages.
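A hedged sketch of exchanging messages between application components through an SQS queue with the boto2 API (the queue name and message body are assumptions):

# Sketch of sending and receiving messages with Amazon SQS using boto2
import boto.sqs
from boto.sqs.message import Message

ACCESS_KEY = "<enter access key>"
SECRET_KEY = "<enter secret key>"
REGION = "us-east-1"

conn = boto.sqs.connect_to_region(REGION,
    aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)

# Create (or get) a queue and write a message to it
queue = conn.create_queue("iot-messages")
msg = Message()
msg.set_body("Temperature: 25")
queue.write(msg)

# Another application component reads and then deletes the message
received = queue.get_messages()
for m in received:
    print m.get_body()
    queue.delete_message(m)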

8 Amazon EMR

Amazon EMR is a web service that utilizes the Hadoop framework running on Amazon EC2 and Amazon S3. EMR allows processing of massive-scale data and is hence suitable for IoT applications that generate large volumes of data that need to be analyzed. Data processing jobs are formulated with the MapReduce parallel data processing model.

MapReduce is a parallel data processing model for processing and analysis of massive scale
data. MapReduce model has two phases: Map and Reduce. MapReduce programs are written
in a functional programming style to create Map and Reduce functions. The input data to the
map and reduce phases is in the form of key-value pairs.

Consider an IoT system that collects data from a machine (or sensor data) which is logged in a
cloud storage (such as Amazon S3) and analyzed on hourly basis to generate alerts if a certain
sequence occurred more than a predefined number of times. Since the scale of data involved in
such applications can be massive, MapReduce is an ideal choice for processing such data.
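As a hedged illustration of the Map and Reduce phases for the machine-log scenario above, the following Hadoop-streaming-style mapper and reducer count occurrences of a particular event sequence per hour. The log line format, the event-sequence string and the alert threshold are assumptions for illustration.

# mapper.py - emits (hour, 1) for every log line that contains the event sequence
# Assumed log line format: "<timestamp> <machine-id> <event>"
import sys

EVENT_SEQUENCE = "ERROR_SEQ"   # assumed event sequence of interest

for line in sys.stdin:
    fields = line.strip().split()
    if len(fields) >= 3 and EVENT_SEQUENCE in fields[2]:
        hour = fields[0][:13]          # e.g. "2014-05-01T10"
        print "%s\t%d" % (hour, 1)

# reducer.py - sums the counts per hour and flags hours above the threshold
import sys

THRESHOLD = 100   # assumed alert threshold
counts = {}

for line in sys.stdin:
    hour, count = line.strip().split("\t")
    counts[hour] = counts.get(hour, 0) + int(count)

for hour, count in counts.items():
    alert = "ALERT" if count > THRESHOLD else "OK"
    print "%s\t%d\t%s" % (hour, count, alert)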
