Flogo + IOTA—The little IoT app engines that could!

We were thrilled to announce Flogo Edge Applications and TIBCO® IoT App Engine (IOTA™) last week at TIBCO NOW in Berlin. On this occasion, we are happy to feature a three-part blog series explaining our strategy and vision behind Flogo Edge and IOTA:

The three articles in this series will be:

  • Strategizing for IoT edge – Rajeev Kozhikkatuthodi
  • Introducing Flogo edge apps – Matt Ellis
  • Industrializing the IoT edge with TIBCO IOTA – Rahul Kamdar

There are few things in today’s technology landscape that promise to be more transformational than the emergence of an estimated 20 billion IoT edge devices by 2020. Even for an industry that, as the joke goes, has predicted 10 of the past 3 “game-changers” over the last couple of decades, the implications of this brave new era of edge compute are difficult to ignore.

Last week at TIBCO NOW in Berlin, we unveiled TIBCO IoT App Engine (IOTA), a commercial, industrial IoT offering for your IoT edge applications. Powered by the open source Project Flogo, IOTA sits at the intersection of the Internet of Things (IoT) and edge application development. It is an important milestone not just in TIBCO’s mission to Interconnect Everything, but also in our overarching Connected Intelligence vision and strategy. In this post, I will provide some context around the emergence of edge computing and the importance of edge applications in particular, and outline TIBCO’s vision for IoT edge applications.

Emergence of edge computing and edge apps

A lot of insightful analyses have been published on this topic, including by Peter Levine, General Partner at a16z, on how cloud computing is coming to an end. He argues that today’s cloud computing era, powered by largely centralized compute infrastructures, is going to be (yet again!) replaced by a distributed computing era. A key area of interest for my team is the applications that physically run on these edge devices and interact with cloud apps and services. At the end of the day, despite appearances, those 20 billion things are nothing but programmable, resource-constrained computers that also happen to be things. This opens up unprecedented opportunities and challenges around architecting, developing, and operating these billions of edge apps. These edge apps are not going to look like traditional cloud or on-premise apps. Analysts like Janakiram MSV have written about how the edge is not just ideal for IoT solutions, but also extends to an entirely new class of business applications, and we wholeheartedly agree.

Today’s cloud-centric IoT models won’t hold

One of the well-understood impacts of the emerging edge is that it will unleash a data tsunami. The evidence is already out there—a connected car can produce up to 5 terabytes of data hourly, and an oil and gas drilling rig can produce 7 to 8 terabytes of data daily. Waiting for all of this data to be sent to the cloud and acted upon won’t cut it for many applications in industrial or consumer domains. Even for edge analytical applications, conventional big data architectures that involve forwarding everything to the cloud to be stored and analyzed don’t make sense—what you likely need are streaming analytics capabilities, offered by solutions like TIBCO StreamBase, in the edge. Simply put, cloud-centric IoT approaches are too expensive to operate, insecure, and unreliable.
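To make the economics concrete, here is a minimal sketch of edge-side aggregation. The class and window size are hypothetical, not a TIBCO product API; the point is that raw readings are summarized on the device so only a compact record, rather than the full stream, travels to the cloud.

```python
from collections import deque

class EdgeAggregator:
    """Illustrative sketch: summarize high-rate sensor readings on the
    device so only small aggregates (not raw samples) go upstream."""

    def __init__(self, window_size=100):
        # Keep only the most recent window of readings on-device.
        self.window = deque(maxlen=window_size)

    def ingest(self, reading):
        self.window.append(reading)

    def summary(self):
        # One compact record replaces an entire window of raw samples.
        if not self.window:
            return None
        return {
            "count": len(self.window),
            "min": min(self.window),
            "max": max(self.window),
            "mean": sum(self.window) / len(self.window),
        }

agg = EdgeAggregator(window_size=5)
for reading in [10.0, 12.0, 11.0, 13.0, 14.0]:
    agg.ingest(reading)
print(agg.summary())  # one small dict instead of five raw readings
```

At a connected car’s terabytes-per-hour rates, this kind of local reduction is the difference between a workable uplink bill and an impossible one.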

TIBCO’s IoT edge vision

Around late 2015, it became clear to us that we needed some fresh thinking to address the formidable challenges posed by the IoT edge. Virtually everything we had learned about building distributed systems over the past 30 years had to be questioned and analyzed in the context of this newly emerging edge landscape. Out of this realization was born Project Flogo, our moonshot project to build integration technology that could run on the next 20 billion edge devices. What Flogo started in 2016, we are taking to the next level in 2017 with Flogo edge applications that run on the tiniest of microcontrollers, with a footprint as low as 50KB.

Figure 1: Flogo Edge Applications run with footprints as low as 50KB

At a high level, our vision for IoT edge applications is driven by three core beliefs:

  • Edge-native by design
  • Engineered for Connected Intelligence
  • Open as a matter of strategic choice

I will attempt to provide a little more color around these three beliefs and connect them to a couple of takeaways for technologists and decision makers.

Edge-native design

Edge-native design simply means we are not retrofitting cloud-native technologies and architectures for the edge unless absolutely necessary. This sort of “edge-washing” is not entirely new; in many ways, it is reminiscent of the “cloud-washing” we saw in the early days of cloud computing. When a paradigm shift moves from innovators to the early majority, there is often a gold rush by vendors, suppliers, analysts, and users alike to reflexively retrofit what worked in the past onto the emerging paradigm. This often translates into admirable but somewhat wonky efforts, such as a technology like Node.js, a good fit for server-side JavaScript apps, being pressed into service for edge applications and infrastructure. Ultimately, design is as design does, and there are good examples of edge-native design throughout Flogo, such as externalizing all of the application state to a pluggable state management service, step-back remote debugging, and externalizing flow configuration to a remote flow service.
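The pluggable-state idea can be sketched in a few lines. This is an illustrative interface of my own, not Flogo’s actual API; the point is that the app logic stays identical whether state lives in-process on a disconnected device or in a remote state management service.

```python
from abc import ABC, abstractmethod

class StateStore(ABC):
    """Hypothetical interface: the edge app holds no state of its own and
    delegates to whichever store is plugged in at deploy time."""

    @abstractmethod
    def get(self, key): ...

    @abstractmethod
    def set(self, key, value): ...

class InMemoryStore(StateStore):
    """Local store for a self-reliant, possibly disconnected edge device.
    A remote implementation would satisfy the same interface."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value

def record_last_reading(store: StateStore, value):
    # App logic is written against the interface, never a concrete store.
    store.set("last_reading", value)
    return store.get("last_reading")

store = InMemoryStore()
print(record_last_reading(store, 42))  # → 42
```

Swapping `InMemoryStore` for a network-backed implementation changes deployment, not application code, which is exactly the externalization the design calls for.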

Engineered for Connected Intelligence

We are also approaching this design from the unique perspective of Connected Intelligence. In simple terms, we no longer look at IoT edge apps merely as an application development, integration, or analytics problem. We are not just building edge apps; we are seeking to provide connected intelligence capabilities that are physically resident in the edge as edge “applications”. What are some of these connected intelligence capabilities?

  • Sense activity happening all around and turn it into events of significance
  • Connect in the edge and back into cloud and on-premise apps
  • Learn from these event streams in supervised, semi-supervised, and unsupervised ways
  • Act in real time in response to these events based on pre-trained models and/or declarative logic

Another way to conceptualize this would be to think of these edge applications as facilitating connected but potentially self-reliant swarms of Observe-Orient-Decide-Act (OODA) loops with support from peers in the edge as well as the cloud. We believe this is a powerful shift in perspective and stands in sharp contrast to conventional server-side or cloud-native application development approaches.
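A single turn of such an OODA loop can be sketched as follows. The sensor, actuator, and threshold here are hypothetical stand-ins, not Flogo code; what matters is that the whole loop closes on the device, with no cloud round trip.

```python
def observe(sensor):
    """Observe: read a raw signal from the environment."""
    return sensor()

def orient(reading, threshold):
    """Orient: turn the raw reading into an event of significance."""
    return {"reading": reading, "alert": reading > threshold}

def decide(event):
    """Decide: choose an action from the oriented event (here, simple
    declarative logic; a pre-trained model could decide instead)."""
    return "throttle" if event["alert"] else "continue"

def act(decision, actuator):
    """Act: apply the decision locally, without a cloud hop."""
    actuator(decision)
    return decision

def ooda_step(sensor, actuator, threshold=75.0):
    # One full Observe-Orient-Decide-Act turn, entirely on-device.
    return act(decide(orient(observe(sensor), threshold)), actuator)

# Hypothetical temperature sensor and no-op actuator for illustration.
decision = ooda_step(sensor=lambda: 82.5, actuator=lambda d: None)
print(decision)  # → throttle
```

In a swarm, peers (edge or cloud) would feed the orient step with context, but each device can still complete the loop on its own when disconnected.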

Figure 2: Contrasting edge-native and cloud-native thinking

Open as a matter of strategic choice

When there are 20 billion things that are nothing short of full-fledged computers, one needs to approach this as a general computing problem, not a highly specialized sensor-actuator problem. Openness becomes a strategic choice to reduce specific types of technology lock-in risk, but also to promote experimentation and agility.

Many of today’s IoT platforms embed a proprietary SDK or API in the edge layer. In my opinion, they do not represent an edge application strategy, any more than a SQL client library represents a data management strategy. Those remote platform interfaces may well need to be embedded in edge apps, but the right architecture with the right set of abstractions will pay off in the medium to long run. We also believe that edge applications will need to embrace a multi-paradigmatic approach. For instance, some edge apps may be better served by event-driven abstractions than by a traditional API-led approach. Another example of this openness in action is leveraging cheap cloud infrastructure for model training with deep-learning frameworks like Google TensorFlow, Amazon MXNet, and Microsoft CNTK, in conjunction with technologies like TIBCO Enterprise Runtime for R and TIBCO Spotfire for data science tooling and visual analytics. These models can then be embedded in edge apps, enabling real-time inferencing without costly cloud hops. Assemble a toolchain of best-in-class capabilities, but do so with a radical commitment to exercising degrees of freedom where it matters, to promote experimentation and freedom of choice.
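As a toy illustration of the cloud-trained, edge-embedded pattern: the weights below are hypothetical stand-ins for a model exported from cloud-side training (with a framework like TensorFlow), and the scorer is a hand-rolled logistic regression small enough to live on a constrained device.

```python
import math

# Hypothetical weights exported from cloud-side training; on a real
# device these would be shipped with the app, not trained locally.
WEIGHTS = [0.8, -0.5]
BIAS = 0.1

def predict(features):
    """Run inference entirely on-device: no network hop required."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))  # logistic score in (0, 1)

score = predict([2.0, 1.0])
print(score > 0.5)  # → True, classified without leaving the device
```

Training stays in the cloud, where compute is cheap and elastic; only the small, frozen model crosses to the edge, where latency and connectivity matter.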

Takeaways

If you are a technologist or business strategist looking at edge computing, here are the key takeaways:

  • Start experimenting today: The lifeblood of digital transformation and innovation is data-driven experimentation. We are incredibly lucky to live in an age where the cost of experimentation has hit unprecedented lows with the emergence of open source frameworks and cloud offerings—get started today.
  • Translate your technology strategy into digital objectives: Focus on driving net digital value, not just point technology outcomes, with your edge app strategy. Work with lines of business and executive stakeholders to build a closed-loop translation between your edge innovation efforts and business outcomes.
  • Embrace edge-native design and openness: There is no longer a need to edge-wash technologies to meet your edge application needs. Embrace edge-native design choices that do justice to the challenges and opportunities that edge computing brings.

These are indeed exciting times and we are in many ways just getting started. We would love to hear your thoughts and experiences!
