
AI-Infused Software Is Eating the IoT Edge

Premise. The Internet of Things (IoT) is becoming intelligent all the way to the edge through the power of artificial intelligence (AI). As AI software takes residence on more “things,” it will transform the IoT into a pervasively intelligent system that permeates everyone’s lives. At the same time, the AI-driven IoT is evolving into a single low-latency application target for sensor-driven apps’ entire lifecycle. Continually upgrading and tuning the IoT’s smarts will require federated distribution of AI algorithms, training data, and other key resources all the way to the edge. Like almost everything to do with the IoT edge, federating the development, deployment, and administration of these edge-oriented capabilities requires careful planning and execution.

The Internet of Things (IoT) is growing ever “softer,” thanks to the power of artificial intelligence (AI) software that runs locally on sensors, smartphones, and other IoT edge devices.

In this way, AI-infused software is eating the IoT. AI gives IoT edge devices (smart sensors, smartphones, wearables) the contextualized, predictive intelligence to operate in a wide range of real-world scenarios. Chief among these AI software capabilities are machine learning (ML), deep learning (DL), predictive analytics, and natural language processing (NLP). Through these and other AI technologies, the next generation of sensor-driven tools is gaining data-driven intelligence far more sophisticated and flexible than that of such IoT precursors as supervisory control and data acquisition (SCADA) systems.

As AI software deepens its footprint on IoT edge devices, it will need to work within more distributed, multi-domain, multiplatform, cloud-first application environments. This trend will place an increasing emphasis on management of AI-driven IoT apps in federated microservices environments in which different functions are managed by various organizations. For example, many industrial IoT applications already span federated value chains in which trading partners’ various process control, logistics, inventory, and procurement systems must interoperate in loosely coupled fashion.

To optimize AI-driven IoT applications out to federated edge devices, developers and other IT professionals should follow these guidelines:

  • Identify the specific AI workloads to be federated on the IoT. Federation supports flexible allocation of selected IoT AI workloads to edge nodes and/or gateways, thereby reducing or eliminating the need to round-trip many capabilities back to federated computing clusters in the cloud.
  • Containerize your AI microservices for federated IoT edge deployment. Increasingly, AI and other microservices run inside the containerized environments, such as Docker and Kubernetes, that serve as an IoT deployment's software backplane, as evidenced by growing support for this approach from leading solution providers such as IBM, Google, Intel, and Microsoft.

Identify the specific AI workloads to be federated on the IoT

The AI pipeline—from algorithmic development through training, deployment, feedback, and iteration—includes diverse workloads.

Within an IoT architecture, these workloads may be distributed for optimized execution in various ways: in the cloud, at gateways, and on the edge devices themselves. Federated workload management is an essential piece of the distributed IoT application puzzle, because IoT edge devices often lack the local compute power, memory, and storage to hold all the relevant data and execute every algorithm they need to do their jobs effectively. Federation lets IoT applications tap resources in the cloud, in gateways, and at edge devices, all of which may be managed as separate resource domains by different organizations and individuals.
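The placement decision described above can be sketched as a simple policy: route each AI workload to the most local tier that can host its model within the workload's latency budget. The following is a minimal, hypothetical sketch; the tier names, capacity figures, and latency figures are illustrative assumptions, not values from any specific platform.

```python
# Hypothetical sketch of federated workload placement: choose the most
# local tier (edge device, gateway, or cloud) that can host a workload's
# model and still meet its latency budget. All numbers are illustrative.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    model_size_mb: int   # footprint of the trained model artifact
    max_latency_ms: int  # hard latency budget for inference

# Each tier: (name, max model size it can host in MB, typical latency in ms).
# Ordered from most local to least local.
TIERS = [
    ("edge-device", 50, 5),
    ("gateway", 500, 20),
    ("cloud", 100_000, 150),
]

def place(workload: Workload) -> str:
    """Return the most local tier that satisfies both constraints."""
    for tier, capacity_mb, latency_ms in TIERS:
        if (workload.model_size_mb <= capacity_mb
                and latency_ms <= workload.max_latency_ms):
            return tier
    raise ValueError(f"No tier satisfies workload {workload.name!r}")

# A compact anomaly detector fits on the device itself...
print(place(Workload("vibration-anomaly", 20, 10)))   # edge-device
# ...while a large model with a loose latency budget runs in the cloud.
print(place(Workload("fleet-forecast", 2_000, 500)))  # cloud
```

In practice the policy would also weigh bandwidth, connectivity reliability, and data-sovereignty constraints, but the shape of the decision (most local tier that fits) stays the same.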

Table 1 presents the principal AI workloads that developers should consider federating across the IoT.


Containerize your AI microservices for federated IoT edge deployment

In cloud-native IoT environments, AI and other functions are increasingly being developed, deployed, and orchestrated as containerized microservices. To do so, however, edge-application developers will need to downsize, optimize, and tune their AI assets to conform to the resource, bandwidth, and connectivity constraints often found at the IoT edge. Table 2 provides guidelines for deploying AI algorithms closer to the edge.
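As a concrete illustration of what "conforming to edge constraints" looks like in a Kubernetes-orchestrated deployment, the manifest below pins an AI inference microservice to edge-class nodes and caps its resource footprint. This is a hypothetical sketch: the image name, label keys, and resource figures are illustrative assumptions, not from any specific product.

```yaml
# Illustrative sketch only: image, labels, and node-selector key are
# hypothetical. Shows resource caps and edge-node targeting for an
# AI inference microservice.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: anomaly-scorer
spec:
  replicas: 1
  selector:
    matchLabels:
      app: anomaly-scorer
  template:
    metadata:
      labels:
        app: anomaly-scorer
    spec:
      nodeSelector:
        node-role/edge: "true"    # schedule only onto edge-class nodes
      containers:
        - name: scorer
          image: registry.example.com/anomaly-scorer:1.4
          resources:
            requests:
              cpu: "250m"
              memory: "128Mi"
            limits:               # cap the footprint to edge constraints
              cpu: "500m"
              memory: "256Mi"
```

The requests/limits pairing is the key design choice here: it lets the scheduler pack the microservice onto constrained edge hardware while preventing it from starving co-resident workloads.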


Action Item. 
Developers should be prepared to embed AI software into IoT endpoints. Doing so will enable these edge nodes to make decisions and take actions autonomously based on algorithmic detection of patterns in locally acquired sensor data. For developers working on projects that converge AI and the IoT, the key Wikibon guidance is to factor AI functions as modular, containerized microservices that fit the resource constraints of edge devices, can be deployed to those devices over federated cloud-computing environments, and can be retrained, managed, monitored, and updated automatically and remotely over their lifecycles.

Table 2: Guidance for Deploying AI Algorithms Closer to the IoT Edge
