The IoT World Conference, which ran from May 10-12 in Santa Clara, was a full-on scriptr-fest! At our own booth we were demoing an integration with the Lorauino from our partner Gimasi. At other booths on the floor, our partners Multitech and Stream Technologies were showing another demo we developed, demonstrating how scriptr facilitates the monitoring and management of live LoRa sensors. Tom Gilley sat on a panel talking IoT security, while I (@edborden) participated in the IoT for Cities Hackathon.

The IoT for Cities Hackathon had some great anchor sponsorships from GE and Pitney Bowes, with the mandate to build applications that a city mayor or manager would want to use. I was particularly interested in hacking on the two big sponsors’ offerings and hooked up with Walé Ogundipé and Pinaki Sinha to form a team. We ended up building GoGoKart, an application that monitors real-time foot traffic and parking availability in parks, then routes food trucks/vendors toward under-served areas. It also incorporates demographic data and historical performance for future recommendations and back-office reporting. We were excited to receive a runner-up award for our efforts! Read on for details of the integration and code.


Part 1: GE’s Current Lamps

Our biggest challenge was integrating with GE’s Intelligent Environments APIs, which are powered by streaming data from GE’s Current Lamps. The Current Lamps utilize up to three cameras, which can be configured to generate events correlated to pedestrians and vehicles entering and exiting a view. These events are delivered as websocket streams accessed via GE’s Predix platform.


Getting the service set up and obtaining an initial round of credentials for testing was a challenge. The GE team provided a sample app that was a big help (highly recommended!).

We were then able to subscribe to 3 sample datastreams provided by GE, which we augmented with our own sample data to get a good spread of active locations around SF. Our node websocket client script is available here; it POSTs to scriptr with every event.
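The core of that bridge can be sketched as follows. The event field names (`assetUid`, `eventType`, `direction`) are illustrative assumptions rather than GE's documented schema, and the socket is injected so the forwarding logic stands on its own:

```javascript
// Sketch of the node bridge between GE's websocket streams and scriptr.
// NOTE: the raw event field names here are illustrative assumptions,
// not GE's actual Intelligent Environments schema.

// Normalize a raw lamp event into the payload we POST onward.
function toScriptrPayload(rawEvent) {
  return {
    locationId: rawEvent.assetUid,
    kind: rawEvent.eventType === 'PEDEVT' ? 'pedestrian' : 'vehicle',
    direction: rawEvent.direction, // "in" or "out"
    timestamp: rawEvent.timestamp
  };
}

// Wire a websocket-like object (anything with .on) to a POST function,
// so every incoming message is forwarded as a normalized payload.
function forwardEvents(socket, postFn) {
  socket.on('message', msg => {
    const rawEvent = JSON.parse(msg);
    postFn(toScriptrPayload(rawEvent));
  });
}
```

Keeping the normalization in a pure function made it easy to feed the same pipeline with our own sample data alongside the live GE streams.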


Part 2: Orchestration inside Scriptr

POSTing to scriptr early on allowed us a lot of flexibility in building out the application. Because scripts become live microservices instantly on creation, our team could collaborate really easily: someone managing the inbound datastreams on the GE side, multiple people writing microservices as necessary, and someone building the frontend that accessed the data. Our scripts, which essentially became our project’s API, are all here.

As data streamed into scriptr, we converted raw pedestrian & vehicle in/out events into a running total of actual humans and vehicles “currently” inside our monitored locations. We then produced a location “score”, which was a function of total pedestrians and available parking for food vendors. All of this metadata was then pushed into a Firebase instance, which provided the database used by the frontend directly. The ability to produce and make available the real information that the application required out of a flow of raw data was the clincher for what we were trying to do.
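A minimal sketch of that aggregation logic, where the state shape and the score formula are our own illustrative assumptions (the real weighting was hand-tuned during the hackathon):

```javascript
// Sketch of the scriptr-side aggregation: normalized in/out events become a
// running occupancy per location, which feeds a vendor-facing "score".
// State shape and score formula are illustrative, not the exact hackathon code.

// Apply one normalized event to a location's running state.
function updateOccupancy(state, event) {
  const delta = event.direction === 'in' ? 1 : -1;
  if (event.kind === 'pedestrian') {
    state.pedestrians = Math.max(0, state.pedestrians + delta);
  } else {
    state.parkedVehicles = Math.max(0, state.parkedVehicles + delta);
  }
  return state;
}

// Score rises with foot traffic and with free parking for food vendors.
function locationScore(state) {
  const free = Math.max(0, state.parkingCapacity - state.parkedVehicles);
  return state.pedestrians * (free / state.parkingCapacity);
}
```

After each update, the location's new occupancy and score would be written to Firebase so the frontend saw the change immediately.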


Part 3: EmberJS frontend

A very simple view showing a map of SF with monitored locations and their realtime status was built on top of EmberJS. The key thing to note here is that using a full javascript frontend gives us access to a framework with a real data model. This makes it much easier to build more advanced features that correlate disparate data sources, as well as making the realtime responsiveness of the view a cinch. Further, even though we only built a view intended for the mayor/manager, with Ember it would be easy to make a slightly different version of this view, wrap it up into a mobile app via Phonegap, and have a completely separate application for the vendors, all built on the same codebase.


The only real action that we had time to prototype here was the “Blast”, which is accessed by tapping on a location marked in red, signifying action is needed. The Blast is a simple POST to a scriptr endpoint that integrates Twilio for text messaging. The text goes out to all subscribed food trucks and can be customized for each vendor.
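The Blast handler boils down to fanning a message out to subscribed vendors. A sketch of that fan-out, where the vendor fields and template syntax are our illustrative assumptions (the actual send then went through Twilio's REST API from the scriptr endpoint):

```javascript
// Sketch of the Blast fan-out: each subscribed vendor gets a message built
// from a per-vendor template, producing {to, body} pairs that would then be
// handed to Twilio's messages API. Field names here are illustrative.

function buildBlastMessages(location, vendors) {
  return vendors
    .filter(v => v.subscribed)
    .map(v => ({
      to: v.phone,
      body: (v.template || 'Heads up: {loc} needs vendors!')
        .replace('{loc}', location.name)
    }));
}
```

Separating message construction from the actual Twilio call kept the endpoint easy to test without sending real texts.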


Part 4: Pitney Bowes demographic data

Pitney Bowes has an IoT-related set of APIs they are calling their Location Intelligence platform. The whole offering is really nice: easy-to-use REST services, well documented with sample applications and SDKs, and clear pricing. Nicely done.


We gravitated toward their “GeoLife” service, wanting to incorporate demographic information into the lifecycle of vendor events. The output of a lat/lng call to the GeoLife endpoint generates quite a slew of information, including age group, ethnicity, income, and purchasing behavior. When this is mashed up with an action in our application (either a vendor requesting a recommendation for a location to serve, or a vendor responding to a Blast that they are going to serve), the derived information becomes much richer.

For example, we can make better recommendations to vendors for the types of food they should offer, or the pricing they should offer it at. We can also mash this data up against receipts of actual revenue derived from the location, so that future recommendations incorporate actual performance against specific demographics. This is the piece of the service that really pushes home the opportunity to maximize revenue for the vendors and provide better service for the citizens, both of which the city-manager persona cares deeply about.

Complete code can be found here!

Oh, look who is the first one there in the morning, before the venue even opened. BOOM