Tag Archives: connected devices

Building The Boat of Things


With the variety of different IoT-related work we do (including the oft-blogged-about CWRU course), it only made sense for us to have an “IoT sandbox” to experiment and play with. It was one of our hack days that provided us with an opportunity to bring such an idea to life.

Building a sandbox gives us an opportunity to experiment with new IoT devices and software, while giving LeanDoggers a breakable toy to work with during hack days …and for some nerdy fun. It also allows us to start collecting sensor data for research and exploration.

First Iteration

The first iteration of this idea consisted of two parts: one team would build an Alexa Skill for the Amazon Echo to act as an interface to other devices on the Boat-of-Things network, while the second team would build the infrastructure, set up the MQTT broker, and start connecting other devices.

By the end of the first iteration, we had established our user interfaces to the Boat of Things: a Slackbot called Otis, which acts as a sort of command-line interface, and an Alexa Skill that lets us say "Alexa! Ask Otis to <verb>".

We also built our first actual integration — a long-standing issue in the LeanDog Studio is music. Since the inception of LeanDog Studio, we’ve used a Mac Mini attached to speakers running a browser with Pandora. We would individually VNC into the server to change stations, with a mutual understanding that any station played should be kept on for at least three songs to prevent music anarchy.

Okay, so we didn't solve music anarchy. What we did create is a Google Chrome plugin that scrapes the Pandora page and publishes the station list and the currently playing song. It also subscribes to a control topic that allows playback control and station changes.

By the end of all this, we could say “Alexa! Ask Otis what’s playing” or use Slack:

[Screenshot: asking Otis what's playing in Slack]

Other Integrations

The number of integrations we’ve built since then has exploded. Here are some of them:

A couple of years back, GE created a maker module called Green Bean that can connect to the diagnostics port of some of their appliances. We just happened to have compatible appliances on the boat, so we ordered one and hooked it up to our fridge and a Raspberry Pi.

The status of the fridge is now published over MQTT, which allows us to create some alarms:

[Image: fridge door-open alarm]
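As a rough sketch of how such an alarm could be wired up with the paho-mqtt client, the snippet below watches a door topic and pings Slack when the door stays open too long. The topic name, payload values, and Slack webhook URL are hypothetical stand-ins, not our actual setup:

# A minimal sketch, assuming paho-mqtt and requests are installed.
import time

import paho.mqtt.client as mqtt
import requests

SLACK_WEBHOOK = "https://hooks.slack.com/services/EXAMPLE"  # hypothetical
DOOR_TOPIC = "boat/fridge/door"                             # hypothetical topic
DOOR_OPEN_LIMIT_SECONDS = 60

door_opened_at = None

def on_message(client, userdata, message):
    # Assumes the Green Bean bridge publishes "open" or "closed" as plain text.
    global door_opened_at
    state = message.payload.decode("utf-8")
    if state == "open":
        door_opened_at = door_opened_at or time.time()
    else:
        door_opened_at = None

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)
client.subscribe(DOOR_TOPIC)
client.loop_start()

while True:
    if door_opened_at and time.time() - door_opened_at > DOOR_OPEN_LIMIT_SECONDS:
        requests.post(SLACK_WEBHOOK, json={"text": "The fridge door has been open for over a minute!"})
        door_opened_at = None   # only nag once per opening
    time.sleep(1)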

And do some fun useless things:

[Animated GIFs: fridge notifications]

Weather Station

In the quest to attach all the things, we found that our weather station upstairs had a USB port! We attached yet another Raspberry Pi (we've got a lot of Raspberry Pis) and now publish its data every few seconds. We also used the opportunity to script an integration with Wunderground; our station handle is KOHCLEVE65. Now we can ask Otis for the weather:

[Screenshot: asking Otis for the weather in Slack]

CI Screen + Radiator

We have a couple of information radiators running on mounted TVs around LeanDog Studio, including a CI board and individual project radiators. All of them subscribe to a marquee topic, which lets us display images and animated GIFs for a set amount of time. For instance, when someone finishes off the coffee and doesn't brew a new pot:

[Image: coffee alert on the marquee]
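As an illustration of the idea, publishing to such a marquee topic might look like the sketch below. The topic name and payload schema are guesses for illustration, not our actual conventions:

# A minimal sketch using paho-mqtt's one-shot publish helper.
import json

import paho.mqtt.publish as publish

message = {
    "image_url": "https://example.com/brew-more-coffee.gif",  # hypothetical
    "duration_seconds": 30,
}

publish.single(
    topic="boat/radiators/marquee",   # hypothetical topic
    payload=json.dumps(message),
    hostname="localhost",
)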

Works In Progress

Motion Sensor

In the hope that we'll eventually be able to play some music when the boat starts rocking, we started logging motion events on the boat. To do this, we enlisted the help of an ESP8266 module and a 9DOF sensor.

This is a really cool module that combines an accelerometer, gyroscope, and magnetometer and outputs simple Euler angles, so we know our position and orientation in 3D space. Right now we're just collecting data in Amazon DynamoDB; soon, we hope to trigger some interactions when the boat starts moving. Maybe a Dramamine dispenser?
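For a rough idea of the data-collection side, a sketch of writing orientation readings to DynamoDB with boto3 might look like this; the table name, key names, and fields are assumptions for illustration:

# A minimal sketch; DynamoDB requires Decimal for numeric attributes.
import time
from decimal import Decimal

import boto3

table = boto3.resource("dynamodb", region_name="us-east-1").Table("boat_motion")  # hypothetical table

def log_reading(roll, pitch, yaw):
    table.put_item(Item={
        "device_id": "boat-9dof-1",            # hypothetical partition key
        "timestamp": int(time.time() * 1000),  # hypothetical sort key
        "roll": Decimal(str(roll)),
        "pitch": Decimal(str(pitch)),
        "yaw": Decimal(str(yaw)),
    })

log_reading(1.2, -0.4, 87.5)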

Coffee Pot

Our own Steve Jackson is working on a connected coffee scale to let us know how much coffee is left in our carafes and when it’s time to brew a new pot. The proof-of-concept has been completed and soon we’ll be building two of them and installing them in the kitchen.


That’s it. Let’s polka!

Finally, every Friday morning after standup, we allocate a little time to cleaning up the boat. For historical reasons that no one quite remembers, we do this to polka music. Thanks to an integration with the Amazon Dash button, announcing cleanup is simpler than ever:

 

Developing an amazing technology product of your own? Take our 1-Minute self-assessment to make sure your project is on track for a successful launch! Or, reach out to us at LeanDog.com! We'd love to hear all about it!

IoT Course Week 14 – Final Projects


It’s been an intense 13 weeks for both us and the students. Now it’s finally time to dig into final projects. Students were invited to come up with an idea that added some sort of value to the LAMPI product, and we provided some possible ideas. Here are some of the highlights from those projects.

Build An Alarm Clock

For this project, the students created an alarm clock system using LAMPI. Through the web interface they already built, the user can create one or several alarms.


Because LAMPI doesn't have a speaker, the students had to improvise, blinking the light on and off several times instead.


From LAMPI, the user can see the current time as well as snooze the alarm.

Challenges: multiple time zones, transmitting the time, conflicting alarms, and conflicts between light settings and the alarm.

Natural Light Mode

This was an original idea from the students and not one of our suggested projects. This project used LAMPI to reflect the state of the light outdoors for a handful of benefits, as outlined in their presentation:

[Slide: benefits of Natural Light Mode]

Using the API from OpenWeatherMap, they were able to get sunrise, sunset, and daylight conditions and map them to a color spectrum based on LAMPI's current time. Because we didn't have all day to watch the light color slowly change, they also built a demo mode that progressed through a 24-hour cycle in the span of a couple of minutes.
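To make the mapping idea concrete, here is a small sketch of one way a time of day could be mapped to a hue and brightness, with a compressed demo loop. This is illustrative only, not the students' implementation; the sunrise and sunset values are placeholders for what the OpenWeatherMap API would return:

from datetime import datetime
import time

def natural_light(now, sunrise, sunset):
    """Map a moment in the day to a (hue, brightness) pair."""
    if now < sunrise or now > sunset:
        return 0.08, 0.1                              # warm, dim night light
    day_fraction = (now - sunrise).total_seconds() / (sunset - sunrise).total_seconds()
    warmth = abs(day_fraction - 0.5) * 2              # 1 near sunrise/sunset, 0 at midday
    hue = 0.08 + (0.55 - 0.08) * (1 - warmth)         # orange at the edges, cool white at noon
    brightness = 0.3 + 0.7 * (1 - warmth)
    return hue, brightness

# Demo mode: step through a full day in about two minutes.
sunrise = datetime(2016, 7, 1, 6, 0)                  # placeholder values
sunset = datetime(2016, 7, 1, 21, 0)

for hour in range(24):
    simulated_now = datetime(2016, 7, 1, hour, 0)
    hue, brightness = natural_light(simulated_now, sunrise, sunset)
    print('{:02d}:00 -> hue={:.2f}, brightness={:.2f}'.format(hour, hue, brightness))
    time.sleep(5)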

User / Device Association

By the end of the course, students had a functional system that connected a single LAMPI to the cloud. This project focused on expanding the system to accept multiple users, each with a unique LAMPI device. Because LAMPI doesn't have a keyboard, and an on-screen keyboard would probably result in a poor experience, these students built a key-generation system similar to the ones Netflix and other services use on set-top boxes and smart TVs.

When the user presses a button to connect to the cloud, LAMPI displays a randomly generated code like this:

[Screenshot: association code displayed on LAMPI]

The user can then log into their LAMPI web portal and enter the code, and the device is connected to their account. The codes are only good for one minute; after that, a new code must be generated.
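A minimal sketch of the code-generation idea (not the students' implementation) might look like this:

import random
import string
import time

CODE_TTL_SECONDS = 60
# code -> (device_id, created_at); a real system would persist this server-side
pending_codes = {}

def generate_code(device_id):
    # A real implementation would use a cryptographically secure random source.
    alphabet = string.ascii_uppercase + string.digits
    code = "".join(random.choice(alphabet) for _ in range(6))
    pending_codes[code] = (device_id, time.time())
    return code

def claim_code(code, user_id):
    """Called when a logged-in user submits the code in the web portal."""
    entry = pending_codes.pop(code, None)
    if entry is None:
        return None
    device_id, created_at = entry
    if time.time() - created_at > CODE_TTL_SECONDS:
        return None          # expired; the device must generate a new code
    return {"user": user_id, "device": device_id}

print("Show on LAMPI: {}".format(generate_code("lampi-1234")))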

Distributed Load Testing

While we had covered some basic load testing scenarios in a previous week, there was still work to be done. This team of students took charge and started investigating how to load test a protocol such as MQTT using something like Locust, a LeanDog favorite for load testing web sites. Locust supports HTTP out of the box but has a plugin system for testing other protocols. These students created their own MQTT plugin for Locust and open-sourced it on GitHub. From there, they ran a "locust swarm" of distributed clients from Digital Ocean to attack their Mosquitto broker in Amazon EC2.

Their results were very promising. They were able to max out CPU and flood the network, but they were unable to cause catastrophic failure in the Mosquitto broker. Messages with QoS 1 and 2 eventually got where they were intended to go after the congestion resolved, demonstrating why Mosquitto continues to be our go-to MQTT broker:

[Chart: broker performance under load]

Wrapping Up

With final projects completed, we also ran a brief retrospective. We asked the students to post what worked, what didn't work, and what surprised them. Lots of good feedback came out of this. We were able to home in on content that was too technical or not technical enough. We learned that having homework submissions due on Monday caused issues, as students would often wait until the weekend, when we could only provide limited assistance over Slack. It was also validation that we had done something right: we received an overwhelming amount of positive feedback, with several students saying how much they had gotten out of the class thanks to the breadth of topics covered.

Looking Forward

With our first class finally wrapped up, it’s time to look ahead. Preparations are already being made for a second run. We’re taking the feedback given, making some needed tweaks, and we’ll be ready for a new round of students in the fall. See you then!

Developing an amazing technology product of your own? Take our 1-Minute self-assessment to make sure your project is on track for a successful launch! Or, reach out to us at LeanDog.com! We'd love to hear all about it!

IoT Course Week 13 – IoT Platforms


Last week, we explored remote firmware updates for IoT Devices, using the Debian Package system. This week, we’ll be discussing various IoT platforms.

When we started the course, we had an explicit goal to avoid "black box" solutions, platforms, and vendor lock-in as much as possible. We wanted students to understand how these systems are built, as well as the architectural and security considerations. The course is, in some ways, "Learn IoT the Hard Way": learning by building the various components of an IoT system, stitching those components into a holistic system, and touching on a number of important non-functional requirements like security, load testing, analytics, and firmware updates. Through that experience (and occasional struggle), we hoped to arm students with enough knowledge and experience to understand both the individual components and the overall system.

You can, of course, purchase a complete IoT system; these are generally referred to as IoT platforms, and there are many, many choices.


Platform Tradeoffs

When building a product or a business around any technical platform, one must consider the long-term implications of that platform. There are the basic questions of functionality and of offloading work and operations, plus the added complexities of hardware: what does this platform scale to, how quickly can I go from prototype to market, where can I source large quantities of an item, and so on. Software as a service also has a few horror stories of companies discontinuing a product line that other companies heavily relied on. Controlling your own destiny is very important, and that can be difficult when your business is built on a platform that is someone else's responsibility to keep running. One platform we feel is here to stay for some time, however, is Amazon Web Services.

AWS IoT

From the beginning of this course, the intention was never to take the easy path in building the LAMPi system. Amazon offers a service encompassing much of the functionality we have spent the past several weeks piecing together: AWS IoT, which provides secure, bidirectional communication between internet-connected things and the AWS cloud. This includes a robust security model, a device registry, an MQTT message broker, and easy integration with the rest of AWS' cloud offerings. Let's dive in.


The Message Broker offered through AWS IoT mirrors much of the functionality of Mosquitto, the MQTT broker we used for LAMPi. AWS takes it a step further by providing an HTTP REST interface to get and change the current state of your devices. The broker does not retain any messages; it simply provides a central point for the pub-sub model.

Aptly named, the Thing Registry acts as the central location for managing and identifying the things (devices) hooked into the AWS IoT system. The Thing Registry keeps track of any resources or attributes associated with a particular thing. It also provides a place to track MQTT client IDs and associated certificates, which improves one's ability to manage and troubleshoot individual things.

Coupled with the Thing Registry is AWS' concept of Thing Shadows. A Thing Shadow is a persistent digital representation of the state of a device. In addition to the current reported state of the device, it also holds the desired state, a clientToken (used to correlate MQTT requests and responses), and metadata.
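As a rough illustration of how a shadow is used from code, here is a minimal sketch using boto3 (the AWS SDK for Python); the thing name and state fields are hypothetical LAMPI-style values, not part of the course material:

import json

import boto3

iot = boto3.client("iot-data", region_name="us-east-1")

# Ask the shadow to record a new desired state for the lamp.
iot.update_thing_shadow(
    thingName="lampi-1234",   # hypothetical thing name
    payload=json.dumps({"state": {"desired": {"hue": 0.5, "brightness": 0.8}}}),
)

# Read back the full shadow document: reported state, desired state,
# metadata, and version are all included.
response = iot.get_thing_shadow(thingName="lampi-1234")
shadow = json.loads(response["payload"].read().decode("utf-8"))
print("desired: {} (version {})".format(shadow["state"].get("desired"), shadow.get("version")))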

AWS IoT comes with the robust security and identity service that our team has come to know and love throughout this course. Things retain their own credentials, and access is granted to the system through the assignment of rules and permissions. Three identity principals are supported: X.509 certificates, IAM, and Amazon Cognito.

All of these services have the added benefit of being fairly cheap. The current rate is at $5 per million messages.

Next week, join us for the final installment of the IoT Course Blog Series: Week 14 Final Projects and Wrap Up.

Can’t get enough insights? Discover why A Locust Swarm is a Good Thing or how Selecting the Right User Research Method can make all the difference to your product’s success.

Developing an amazing technology product of your own? Take our 1-Minute self-assessment to make sure your project is on track for a successful launch! Or, reach out to us at LeanDog.com! We'd love to hear all about it!

IoT Course Week 12 – Remote Firmware Updates


Last week, we explored Load Testing of HTTP and MQTT and how to measure the scalability of your system.

This Week

This week, we’ll continue our focus on non-functional requirements with Remote Firmware Update.  A typical desk lamp, or other non-IoT device, will have the same functionality 10 years after it leaves the factory.  The functionality and value of a “smart” device, however, can increase over time, as new software functionality is deployed.  

As students have experienced, updating the functionality of a web application is relatively straightforward: deploying new code to a server updates the application for everyone. Similarly, as new iOS and Android mobile capabilities are developed, new app versions are published on the iTunes and Google Play stores. But how do you update the software/firmware on your smart device? There could be hundreds of thousands, or even millions, of devices distributed across the country or world, and each embedded system is slightly different. For Week 12, we show students how to remotely update LAMPi.


Debian Packages

Since we are using Raspbian, a Debian-based Linux distribution, for LAMPi, we settled on the Debian package system. This addresses the actual packaging and installation of software, as well as the distribution and security (authentication and integrity) of those packages.

Create Folder Structure

First, we need an executable to package. We’re going to make a package called “hi” that contains an executable also called “hi”. Let’s make a directory to build our deb package in:

cloud$ mkdir -p ~/pkg/hi/{DEBIAN,opt/hi}
cloud$ cd ~/pkg/hi/

Viewed in tree (you can install tree through apt-get), this folder structure should look like so:

pkg
├── hi
│   ├── DEBIAN
│   └── opt
│       └── hi

So ~/pkg/hi is the directory that holds everything we want to package.

  • DEBIAN is a special folder that contains all the configuration and metadata for the Debian package
  • Everything else in ~/pkg/hi will be installed relative to the root of the filesystem, so ~/pkg/hi/opt/hi will install into /opt/hi on the target system. If we wanted to install some supervisor scripts with our package, for example, we could make a ~/pkg/hi/etc/supervisor/conf.d/ directory, and files in it would install into /etc/supervisor/conf.d.

Create Executable

Now let's build an executable. When the package is installed, we'll want the executable to end up in /opt/hi/, so create it as ~/pkg/hi/opt/hi/hi:

#!/usr/bin/env python

import os

version = 'Unknown'
version_path = os.path.join(os.path.dirname(__file__), '__VERSION__')
with open(version_path, 'r') as version_file:
    version = version_file.read()

print('Hello Deb! Version {}'.format(version))

Let’s create a file to hold the version of our program. Create ~/pkg/hi/opt/hi/__VERSION__ with the following contents (no whitespace, no newline):

0.1

Save and close both files, mark “hi” as executable, then run it:

cloud$ cd ~/pkg/hi/opt/hi/
cloud$ sudo chmod a+x hi
cloud$ ./hi

Hello Deb! Version 0.1

Create Package Metadata

Now let’s build a control file to describe our package.

Create a file at ~/pkg/hi/DEBIAN/control, replacing {{YOUR_NAME}} with your name:

Package: hi
Architecture: all
Maintainer: {{YOUR_NAME}}
Depends: python, python-dev, python-pip
Priority: optional
Version: 0.1
Description: Hello, Deb!
Section: misc

Note that these metadata files are whitespace-sensitive and do not allow additional empty lines, so be careful while editing.

Finally, we need to fix file permissions and make root the owner of the entire directory structure. These permissions will travel with the package, so if we don’t do this, the files will be installed with bad permissions.

cloud$ sudo chown -R root:root ~/pkg/hi/

Note that after you do this, further edits to files in this directory will require sudo.

This should be all we need to build our deb package, so let’s go:

cloud$ cd ~/pkg/
cloud$ dpkg-deb --build hi

You should now have a hi.deb in ~/pkg/.
You’ve just created a Debian Package!

Setting up a Debian Repository

We use reprepro, an easy-to-set-up Debian package repository manager, and show students how to publish their packages to that repository, add the repository to LAMPi, and then install the package on LAMPi from it.

Automating Deployment

Every time we change our hi package, there are several things we need to do: increment the version number, build the package, and upload it to our package repo. We teach the students how to build an automated script for these steps so we don't have to run the commands manually each time. The packaging and deployment script also acts as living documentation of the process, so future maintainers of the project don't need to start from scratch. We use a Python module called bumpversion to automate updating the version information.
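As an illustration only, a small script along these lines could tie the steps together. It assumes a .bumpversion.cfg is already configured, and the repository host, paths, and distribution name are placeholders, not the course's actual setup:

#!/usr/bin/env python
# A sketch of a deploy script: bump the version, rebuild the .deb, ship it to the repo.
import subprocess

REPO_HOST = "repo.example.com"          # hypothetical
REPO_PATH = "/var/lib/reprepro"         # hypothetical
DISTRIBUTION = "jessie"                 # hypothetical

def run(*cmd):
    print("+ " + " ".join(cmd))
    subprocess.check_call(cmd)

# 1. Bump the version recorded in __VERSION__ and DEBIAN/control.
run("bumpversion", "patch")

# 2. Rebuild the package.
run("dpkg-deb", "--build", "hi")

# 3. Copy it to the repository host and register it with reprepro.
run("scp", "hi.deb", "{}:/tmp/hi.deb".format(REPO_HOST))
run("ssh", REPO_HOST,
    "reprepro -b {} includedeb {} /tmp/hi.deb".format(REPO_PATH, DISTRIBUTION))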

Finally

After walking through the creation and deployment of a Debian package, setting up the reprepro repository, and installing the hi package on LAMPi, the students' assignment for Week 12 was to demonstrate their understanding by applying the same tools to the LAMPi code. The assignment required them to package the LAMPi UI application, the Bluetooth service, and the lamp hardware service, including maintainer scripts to run before the package is installed (preinst), after installation (postinst), when removing the package, and so on, and to demonstrate versioning of the package in class.

Next Week –  IoT platforms

IoT Course Week 11: Load Testing


Last week, we dove into the importance of incorporating and collecting analytics through your connected device, how that information helps provide business value, and some of the ways it can be displayed using pretty graphs.

This Week

This week, we’ll continue our focus on non-functional requirements and start load testing. With connected devices, if the device can’t call home to its shared services, it loses a lot of its value as a smart device. These services need to be highly reliable, but things get interesting when thousands or millions of devices decide to call home at the same time.

To load test, we'll generate concurrent usage on the system until a limit, bottleneck, unexpected behavior, or issue is discovered. This usage should model real-life usage as closely as possible, so the analytics we put in place last week will be a valuable resource. Where we don't have data to work with, we can build out user funnels and extrapolate from anticipated usage. Bad things will happen if we ship thousands of products without any idea how our system will react under load. This data will also be a useful baseline for capacity planning and system optimization experiments.

The Lampi system has two shared services that we need to put under load. One is the Django web server that handles login, and the other is the MQTT broker that handles sending messages to the lamp.

Load Testing with Locust

To test the web server we use Locust. Locust has become a LeanDog favorite due to its simple design, scalability, extensibility, and scriptability. We’ve used it to generate loads of 200,000 simultaneous users distributed across the US, Singapore, Ireland, and Brazil. These simulated users (locusts) walked through multi-page workflows at varying probabilities, modeling the end-to-end user interaction, complete with locusts dropping out of the user funnel at known decision points.

Locusts are controlled via a locustfile.py. The one below shows a user logging in and going to the home page:

from locust import HttpLocust, TaskSet, task

class UserBehavior(TaskSet):

    def on_start(self):
        self.login()

    def login(self):
        response = self.client.get("/accounts/login/?next=/")
        csrftoken = response.cookies.get('csrftoken', '')
        self.client.post("/accounts/login/?next=/",
                         {"csrfmiddlewaretoken": csrftoken,
                          "username": {{USERNAME}},
                          "password": {{PASSWORD}}})

    @task(1)
    def load_page(self):
        self.client.get("/")

class WebsiteUser(HttpLocust):
    task_set = UserBehavior
    min_wait = 5000
    max_wait = 9000

In order to run Locust, we'll need a machine outside of the system to simulate a number of devices. Locust is a Python package, so it can run on most OSes. It uses a master/slave architecture, so you can distribute the simulated users across machines and generate more and more load.

Once you install locust and start the process, you control the test via a web interface.


Locust will aggregate the requests to a particular endpoint and provide statistics and errors for those requests.  


Load Testing with Malaria

To test MQTT we used a fork of Malaria. Malaria was designed to exercise MQTT brokers. Like locust, Malaria spawns a number of processes to publish MQTT messages. Unlike locust, it’s not easy to script; you have to fork it to do parametric testing or randomize data.

usage: malaria publish [-D DEVICE_ID] [-H HOST] [-p PORT] [-n MSG_COUNT] [-P PROCESSES]

Publish a stream of messages and capture statistics on their timing

optional arguments:
-D DEVICE_ID (Set the device id of the publisher)
-H HOST (MQTT host to connect to (default: localhost))
-p PORT, (Port for remote MQTT host (default: 1883))
-n MSG_COUNT (How many messages to send (default: 10))
-P PROCESSES (How many separate processes to spin up (default: 1))

By modulating MSG_COUNT and PROCESSES you can control the load being sent to the broker.

Running some Example loads
Small load: Using 1 process, send 10 messages, from device id [device_id]

loadtest$ ./malaria publish -H [broker_ip] -n 10 -P 1 -D [device_id]

Produces results similar to this:

Clientid: Aggregate stats (simple avg) for 1 processes
Message success rate: 100.00% (10/10 messages)
Message timing mean 344.51 ms
Message timing stddev 2.18 ms
Message timing min 340.89 ms
Message timing max 347.84 ms
Messages per second 4.99
Total time 14.04 secs

Large load: Using 8 processes, send 10,000 messages each, from device id [device_id]

loadtest$ ./malaria publish -H 192.168.0.42 -n 10000 -P 8 -D [device_id]

Monitoring The Broker

MQTT provides a set of topics that allow you to monitor the broker.

This command will show all the monitoring topics (note that the $ is escaped with a backslash):

cloud$ mosquitto_sub -v -t \$SYS/#

The sub-topics under $SYS/broker/load/ are of particular interest.
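If you prefer to capture those metrics programmatically, a small paho-mqtt subscriber along these lines (a sketch, assuming the broker runs locally on the default port) will print each $SYS metric as it updates:

# Minimal sketch: watch the broker's $SYS topics from Python with paho-mqtt.
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    client.subscribe("$SYS/#")

def on_message(client, userdata, message):
    print("{} {}".format(message.topic, message.payload.decode("utf-8")))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)
client.loop_forever()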

Gather data

Before we start testing, we should figure out what metrics we want to measure. Resources on the shared system (CPU, memory, bandwidth, file handles) are good candidates for detecting capacity issues. Focusing on the user experience (failure rate, response time, latency) will help you home in on the issues that will incur support costs or retention problems. Building the infrastructure to gather, analyze, and visualize those metrics can be a significant part of the load testing process, but those tools are also necessary for useful operational support in production. For the class, students used sysstat, Locust, the MQTT $SYS topics, and Malaria to gather metrics. A production-like system might use AWS CloudWatch, New Relic, Nagios, Cacti, Munin, or a combination of other excellent tools.

The point of load testing is to find the limits and then decide what to do about them. There will be a point where the cost to rectify an issue is greater than any immediate benefit; load testing will help you find that bar. During the class, limits of around 1,000 simultaneous web users and 5,000-10,000 MQTT messages per process were common.

Final project

For their final project, two students from the class, Matthew Bentley and Andrew Mason, decided to take on some of the problems with mqtt-malaria and extend Locust to publish MQTT messages. Using Locust, they were able to scale their load test infrastructure across many machines and put a broker under more stress. In their previous testing with Malaria, they had found the point where a single device could send no more messages (at a reasonable rate), but they could not scale Malaria to determine at what point the broker would stop processing additional connected devices' messages. Through their efforts, they reached 100% CPU on the broker, pushing 1 million messages a minute to 4,000 users. As a result of their work, they also open-sourced their contribution to Locust.

IoT Course Week 10: Analytics


Last week we got our feet wet with an introduction to Bluetooth Low-Energy on iOS. This week, we’ll dive into analytics, provide business value, and make some pretty graphs.

Why Analytics?

When building a new product, there are always a variety of options on the table with which to improve that product. At LeanDog, we practice a software development cycle that includes short sprints coupled with an open and honest feedback loop that provides us with the information we need to make informed decisions about where to focus our efforts and resources. This allows us to make sure that we are building the right thing the first time and minimize the amount of risk inherent in the process.

Until relatively recently, collecting feedback about a product in-use was a long process that required either direct observation or careful reading of written user reviews and complaints. Due to the complex and inconsistent nature of users, collecting strong quantitative data about a product experience can be difficult. In a now infamous incident from 2013, a New York Times journalist wrote a negative review of the Tesla Model S, only to have the car’s onboard analytics refute many of his claims. It is not uncommon for a customer to report one thing, but end up doing something entirely different, and your user experience process will need to account for these inconsistencies. One of the many ways we solve that problem is through the use of analytics platforms and reporting tools.

In addition to uncovering potential pitfalls, analytics are a powerful way for product owners, designers, and developers to understand how a product is actually used. For companies that make physical devices, this provides insights that are difficult to collect otherwise. Imagine receiving a coupon in the mail for a smart GE light bulb you love that's nearing the end of its lifetime. The only way GE could possibly anticipate that your current bulb is about to go out (without calling you every day to ask how often you turned it on in the last 24 hours) is through analytics. With analytics, you get an avenue outside of sales to start to figure out which features and products your users actually love, which have problems or aren't worth further development, and even identify disengaged users for retention campaigns.

Enter Keen IO

For this class, we will use a popular analytics platform called Keen.io. Keen is a general-purpose tool, not locked into web, mobile, or embedded specifics. It has a large number of supported software development kits (SDKs), including Ruby, iOS, Python, .NET, and more. It also offers a generous free tier, which is perfect for the amount of traffic currently being driven by students' LAMPi systems. Registering a client and sending an event in Python is as simple as this:

from keen.client import KeenClient

client = KeenClient(
    project_id="xxxx",
    write_key="yyyy",
)

client.add_event("sign_ups", {
    "username": "lloyd",
    "referred_by": "harry"
})

This will send an event containing the signup data to Keen’s database. Now back at LAMPi headquarters we can track those signups on a giant web dashboard:

var series = new Keen.Query("count", {
    eventCollection: "sign_ups",
    timeframe: "previous_7_days",
    interval: "daily"
});

client.draw(series, document.getElementById("signups"), {
    chartType: "linechart",
    label: "Sign Ups",
    title: "Sign Ups By Day"
});

[Chart: sign-ups by day]

Keen also provides a number of ways to pull the analytics data back out and do additional processing to get exactly the view we want. For example, if we wanted to build a tree of who our top referrers are and what their "network" looks like:

[Visualization: top referrers and their networks]
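For instance, a sketch of pulling grouped counts back out with the Keen Python client might look like this; the project ID and read key are placeholders:

# Count last week's sign-ups, grouped by who referred them.
import keen

keen.project_id = "xxxx"
keen.read_key = "zzzz"

referrer_counts = keen.count(
    "sign_ups",
    timeframe="previous_7_days",
    group_by="referred_by",
)
print(referrer_counts)   # e.g. [{"referred_by": "harry", "result": 12}, ...]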

What’s next?
Analytics can also provide a leading indicator to help model the number of users that will be pounding on your infrastructure. To learn more about how to address that issue, join us next week when we talk about load testing!

IoT Course Week 9: Introduction to Bluetooth Low Energy

 


To continue our goal of providing industry experience to the students of EECS397 Connected Devices, this week we will be diving deep into Bluetooth LE on iOS.

Recap

Last week students completed setting up a UI on iOS and Android that mirrored the interactions present on the LAMPi display and the web. The goal for this week is to connect those pieces.

CoreBluetooth and Bluetooth 4.0

With the release of iOS 5 and the iPhone 4S, Bluetooth LE was positioned as, and continues to be, one of the most common methods of short-range data communication. CoreBluetooth is the framework that Apple provides for developers to interact with Bluetooth LE hardware and peripherals. This is useful, as the current Bluetooth LE spec weighs in at over 2,000 pages in PDF form.

Communication with LAMPi through CoreBluetooth can be broken into a four step process:

  1. Scan for the LAMPi device (from a provided array of service IDs)
  2. Connect to the discovered service (lamp-service)
  3. Probe its characteristics (hsv, brightness, on/off)
  4. Subscribe to the characteristics, so we are notified when a property changes

Scanning for LAMPi

Students began the class by making an update to their LAMPis. Each team was given a BlueGiga BLED112 to plug into their Raspberry Pis, as well as updated Python services which allow the LAMPi to act as a Generic Attribute Profile (GATT) server. The GATT server broadcasts a number of available services to any nearby BLE devices that care to listen. In the case of the LAMPi, there is only one service being exposed, aptly called the Lampi Service.


The service being advertised from the LAMPi includes a device id, which students use to identify their unique LAMPi in a classroom containing many more. Once discovered, it is time to connect.


- (void)startScanningIfEnabled {
    if (self.shouldConnect) {
        [self.delegate onLoading:@"Searching for sensor..."];
        NSArray *services = @[[CBUUID UUIDWithString:LAMPI_SERVICE_UUID]];
        [self.bluetoothManager scanForPeripheralsWithServices:services options:nil];
    }
}

Connecting to a Peripheral


CoreBluetooth abstracts away much of the detail required in making a connection to a BLE peripheral. When a peripheral is discovered, our code will immediately attempt to connect. If that connection is successful, we search through the set of services that exist on the peripheral, looking for one that is recognized.

- (void)centralManager:(CBCentralManager *)central
 didDiscoverPeripheral:(CBPeripheral *)lampPeripheral
     advertisementData:(NSDictionary *)advertisementData
                  RSSI:(NSNumber *)RSSI {
    ...
    [self.bluetoothManager connectPeripheral:self.lampPeripheral options:nil];
}

- (void)centralManager:(CBCentralManager *)central
  didConnectPeripheral:(CBPeripheral *)lampPeripheral {
    NSLog(@"Peripheral connected");
    [self.delegate onLoading:@"Found lamp! Reading..."];
    lampPeripheral.delegate = self;
    ...
    // Search for a known service
    for (CBService *service in lampPeripheral.services) {
        if ([service.UUID isEqual:[CBUUID UUIDWithString:LAMPI_SERVICE_UUID]]) {
            self.lampService = service;
        }
    }
}

Services and Characteristics

At this point, we are connected to a Lamp Service, which is now providing us with a collection of characteristics. Characteristics are how communication in BLE works. To make a comparison to software, Services can be thought of as Classes while Characteristics are more like the properties on an Object. Characteristics support four different actions: read, write, notify and indicate. While read and write are arguably fairly straightforward, notify and indicate both have to do with a subscription flow that we will use heavily in the iOS application.


Subscribing to Characteristics

Because LAMPi has both on device and cloud controls, we want to be able to track the state of the LAMPi in real time while the iOS app is running. If a user were to change the color of LAMPi by using the Raspberry Pi UI, the Bluetooth service would send a notification to iOS that the HSV characteristic had been changed.  The following block of code is an example of a discovered Characteristic being initialized. It reads the current hue and saturation of the HSV Characteristic, and then tells the app to subscribe to the Notify value (the notification) of the lamp peripheral.

[self.lampPeripheral readValueForCharacteristic:self.hsvCharacteristic];
if (self.hsvCharacteristic.value != nil) {
    float fHue = [self parseHue:self.hsvCharacteristic.value];
    float fSat = [self parseSaturation:self.hsvCharacteristic.value];
    [self.delegate onUpdatedHue:fHue andSaturation:fSat];
}

[self.lampPeripheral setNotifyValue:YES forCharacteristic:self.hsvCharacteristic];

At this point, when the LAMPi HSV Characteristic changes, CoreBluetooth will call a delegate method that is triggered from the setNotifyValue line.

- (void)peripheral:(CBPeripheral *)peripheral
didUpdateValueForCharacteristic:(CBCharacteristic *)characteristic
             error:(NSError *)error;

It is in this block of code that the HSV value is updated in the app, and logic to refresh the UI is executed.

Fun Fact: Origins of Bluetooth Name

As a bonus for making it this far, did you know that the name "Bluetooth" comes from a c. 970 King of Denmark called Harald Bluetooth? In fact, the Bluetooth logo is composed of the Nordic runes for H (ᚼ) and B (ᛒ), Harald's initials.

IoT Course Week 8: Intro to Mobile Development


For a change of pace, we are taking a step back from the cloud -> server model we have been working on so diligently and turning our eyes toward mobile. As more and more people around the world enter the global smartphone market, the Internet of Things space is becoming increasingly reliant on smartphone interfaces to control connected devices. The components native to the smartphones that many of us carry make them simple and effective media for interacting with the connected world around us.

The Mission

The goal of Week 8 is to provide an introduction to modern mobile development, in both native iOS with Objective-C and native Android in Java. This week we set up a user interface to control the LAMPi, which next week will be extended to operate over Bluetooth.

iOS

While the students were almost equally divided on Android/iOS, most of them used Mac laptops, so we made the pairs heterogeneous, as each pair had to build the app for both platforms. This is necessary due to restrictions Apple places on its developers: software meant to run on the iOS platform must be built on a machine running a recent version of OS X. Students were led through creating a project in Xcode and the initial configuration Apple requires in order to sign and run their code. Students used Xcode to create a single-view project, and they got to work.

Working in Interface Builder and a UIViewController, students were guided through adding and connecting a UISlider and UILabel to an IBOutlet.

#import <UIKit/UIKit.h>

@interface LampiViewController : UIViewController

@property (nonatomic, strong) IBOutlet UISlider *slider;
@property (nonatomic, strong) IBOutlet UILabel *label;

- (void)onSliderChanged:(UISlider *)sender;

@end

#import "LampiViewController.h"

@interface LampiViewController ()
@end

@implementation LampiViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    NSLog(@"slider: %@ \n label: %@", self.slider, self.label);
}

- (void)onSliderChanged:(UISlider *)sender {
    NSLog(@"slider changed");
}

@end

At this point, a slider appeared on the screen and could be interacted with to update the label. However, a problem appeared when the screen was rotated onto its side.

[Screenshot: landscape layout before constraints]

As can be seen, the slider and logo fail to expand when the screen changes. This can quickly be remedied by using one of the much-loved nuances of Xcode's Interface Builder: defining constraints on the views.

And once applied, everything came together. As can be seen here, the iOS Simulator rotates fluidly from portrait to landscape without upsetting the slider position.

[Screenshot: landscape layout with constraints applied]

Android

Android development, unlike iOS development, can be done on a much larger swath of machines. Since its introduction, Google's Android has been open source, and its Java-based tooling runs anywhere the JVM does.

Students began by downloading Android Studio and the latest Android SDK tools. Using the Android Studio interface, students created a new Android Studio project, letting the template generate a blank activity. Android Studio wants to get developers started quickly, and it offers some shims that help this process. Because our project will be full of custom layouts and widgets, a blank activity is the ideal starting point. After all was said and done, students were left with an application containing one activity.

Students were given a custom UI Widget in order to define a consistent slider for the class to use. The slider was added to the created activity layout in the XML here:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingTop="@dimen/activity_vertical_margin"
    tools:context=".Lampi">

    <View
        android:id="@+id/bar"
        android:layout_width="match_parent"
        android:layout_height="10dp"
        android:layout_alignParentLeft="true"
        android:layout_alignParentTop="true" />

    <com.leandog.widgets.hsv.sliders.HueSliderView
        android:id="@+id/hue_slider"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_alignParentLeft="true"
        android:layout_alignParentTop="true" />

</RelativeLayout>

and when rendered by the running application it looked like this:

[Screenshot: the hue slider rendered in the app]

An event listener is added to the HueSlider, which causes changeColor to run whenever the slider is interacted with. It also allows the slider to be initialized to a certain position or color, which will be useful when starting and restarting the LAMPi application.

package com.leandog.lampi;

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.View;

import com.leandog.widgets.hsv.sliders.HueSliderView;
import com.leandog.widgets.hsv.sliders.OnSliderEventListener;

public class Lampi extends AppCompatActivity {

    HueSliderView hueSliderView;
    View bar;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_lampi);
        bar = findViewById(R.id.bar);
        setupHueSlider();
    }

    private void changeColor(int color) {
        bar.setBackgroundColor(color);
    }

    private void setupHueSlider() {
        hueSliderView = (HueSliderView) findViewById(R.id.hue_slider);

        hueSliderView.setOnSliderEventListener(new OnSliderEventListener() {
            @Override
            public void onChange(int color) {
                changeColor(color);
            }

            @Override
            public void onSliderInitialized() {
                changeColor(hueSliderView.getHue());
            }
        });
    }
}

Where we end up

The homework for this week was to follow the patterns of user interface and code interaction just demonstrated, and build a fully operational user interface for both iOS and Android. These user interfaces will essentially mirror that which we have already built on both the LAMPi screen, and the web interface.

iOS Simulator screen capture:


Android Emulator screen capture:


What's next?

Next week we will add the Bluetooth functionality that connects our mobile apps directly to the Bluetooth hardware running on the Raspberry Pi-controlled LAMPi. For the sake of brevity and focus, from this point onward mobile development will be done primarily on the iOS platform.

IoT Course Week 6: Setting Up User Accounts


Welcome to Week 6 of the Case Western Reserve University IoT course. Over the next two weeks, we will be tackling a new kind of challenge: security. This week kicks off by setting up the web framework Django.

Last Week

Last week, students spent time setting up a static webpage on EC2 that communicates via MQTT with their hardware LAMPis. We kicked this week off with a demo and discussion about what everyone was able to accomplish. See the previous week's post here: WEEK 5 POST

Why Django?

Despite the multitude of web frameworks available, Django was an easy choice for this project. It is a mature, modern web framework with a highly active community. It is easy to configure to work with many different databases and ships with a robust ORM at its core. It supports user accounts out of the box, as well as a powerful admin interface. The fact that Django is Python-based is just an added bonus, allowing the students' Python experience on LAMPi to transfer to the web.

Linking user accounts

Step one was to get Django set up and configured to provide user accounts through its default interface. Details of how to do that can be found in the Django documentation. Students were provided with a sample login template which, when loaded through Django, looks like this:

[Screenshot: the rendered login template]

Students changed their static hosted pages to be hosted through Django, and added the provided login template to web/lamp/templates/login.html. With a configuration of the routes, navigating unauthenticated to the root page will now redirect the user to this login screen.

urls.py
from django.conf.urls import patterns, url
from django.contrib.auth.decorators import login_required

from lamp import views

urlpatterns = patterns('lamp.url',
    url(r'^$', login_required(views.index), name='index'),
    url(r'^logout/$', views.logout_user, name='logout_user')
)

From here, the static LAMPi control page is moved to the index template at  web/lamp/templates/index.html and the request is handled in the Django view logic through the addition of an index function in web/lamp/views.py.

from django.shortcuts import render

def index(request):
    context = {}
    user = request.user
    lamps = user.lamp_set.all()
    context['lamps'] = lamps
    return render(request, 'index.html', context)
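The user.lamp_set.all() call above assumes a Lamp model with a foreign key to the user. A minimal sketch of such a model (field names are illustrative, not the course's actual schema) could look like this:

# lamp/models.py -- illustrative sketch only
from django.conf import settings
from django.db import models

class Lamp(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL)   # gives user.lamp_set
    device_id = models.CharField(max_length=64, unique=True)
    name = models.CharField(max_length=100, blank=True)

    def __str__(self):
        return self.device_id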

What's next

At this point, students have taken the first step toward providing user access control for the LAMPi system. Our more security-conscious readers will note, however, that virtually every point of integration within the LAMPi system still remains completely insecure. Moving into Week 7, students will receive a crash course in common attack vectors and the practical implementation of modern communications cryptography.

IoT Course Week 5 : Lampi Control From The Web


Welcome to week 5 of the Case Western Reserve University IoT course. Last week’s lesson, though backed by a massive amount of behind the scenes work, left something to be desired from a usability standpoint. By the end of this week’s lesson, the pieces will all begin to fit into place.

Demoing Previous Week’s Work

Class kicked off with a discussion around the previous week's work. By following the MQTT protocol, students had bridged their Python-controlled LAMPis to an Ubuntu server running on Amazon EC2. See the previous week's post here: Week 4.

Building a Web Interface

To continue our effort of doing the simplest thing possible, students began building their web interfaces directly on NGINX. Though other popular web frameworks (e.g., Rails, Django) have web servers built in, those servers are rarely designed for high transaction throughput. In the future, it will also be important for students to be able to configure the web server to handle static and dynamic content, HTTP redirects, URL rewriting, HTTP/HTTPS, and so on. Keeping things simple with NGINX ensures that the projects begin on a scalable, configurable system.

Students were provided with a directory of sample HTML, CSS, and JavaScript files on which to base their static websites. The page closely resembled the LAMPi's control screen, just in web form. It is up to the students to build the functionality for the page.

[Screenshot: the static LAMPi control page]

MQTT and Websockets

Last week, we were left with an EC2 instance configured to listen to MQTT topics via bridging. We need a way for Mosquitto, our MQTT broker, to communicate with the controls on our static web page. WebSockets, running over TCP, will provide the channel we need for this communication to take place. WebSockets provide fast, scalable two-way communication that our web client can harness via JavaScript. As an added bonus, Mosquitto supports WebSockets out of the box, making setup as simple as specifying a port on which to accept WebSocket connections.
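As an example of how small that configuration is, a mosquitto.conf along these lines adds a WebSocket listener alongside the normal MQTT listener; the port numbers are arbitrary choices, and this assumes Mosquitto 1.4 or later built with WebSocket support:

# mosquitto.conf -- standard MQTT listener plus a WebSocket listener
listener 1883

listener 9001
protocol websockets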

Paho and Javascript

Paho was briefly covered in the Week 3 post. Following that pattern of use, we utilized the Paho JavaScript client to communicate with Mosquitto. Paho supports WebSockets, so connecting to Mosquitto is as simple as pointing the client at the WebSocket port Mosquitto is listening on. Below is an example that demonstrates connecting with paho.js:


var client = new Paho.MQTT.Client(hostAddress, Number(hostPort), clientId);
...
onConnect : function(response) {
    client.subscribe("devices/+/label/changed");
},
onMessageArrived : function(message) {
    console.log(message);
    console.log(message.payloadString);
    configurationState = JSON.parse(message.payloadString);
    console.log(configurationState);
    obj.updateUI();
},

 

Once this connection is made, we add the following code to sync the LAMPi state from MQTT to a javascript object that looks like this:

var lampState = {
    color : {
        h : 0.5,
        s : 1,
    },
    brightness : 1
};

And finally, a little bit of JavaScript which uses jQuery to manipulate the sliders to match this lamp state:

function setSliderValues(hue, saturation, brightness) {
    $(".hue").each(function(index, hueSlider) {
        hueSlider.value = hue;
    });
    $(".saturation").each(function(index, saturationSlider) {
        saturationSlider.value = saturation;
    });
    $(".brightness").each(function(index, brightnessSlider) {
        brightnessSlider.value = brightness;
    });

    updateSliderStyles(hue, saturation, brightness);
}

Witness the Big Picture

We now have a full-path messaging system, from the LAMPi to the cloud and back again! When the LAMPi interface is changed on the device or on the static webpage we just built, the change is instantly mirrored on the other interface.

Next Week!

Next week, we'll tackle a glaring problem in this system: security. Security in the IoT space has been a huge point of conversation as devices continue down the path of hyperconnectivity and intelligence. At this point, there is no access control or security in the LAMPi system. We will take a big step forward and add a web framework to give us greater control over users and access.