
Recipe for Building Digital Twins

Premise: The concept of the Digital Twin has gained currency with people who architect, build, and manage IoT applications. Digital Twins will not just define how IoT applications get built, however; they will become a key construct for mainstream enterprise applications.

Digital business captures data and puts it to work. As part of this process, digital businesses must construct analytic models capable of “learning” from new data and “activating” systems that perform work on behalf of the brand. These analytic models, the fundamental, value-producing big data artifacts, are termed “Digital Twins.” A Digital Twin (DT) is a data representation or model of a product, process, service, customer, supplier, or any other entity involved in a business (see Table 1).

Originally conceived in the industrial control world, DTs represent a new and better way to build enterprise applications. Unlike applications built solely with traditional logic, properly designed DTs get “smarter” over time: they use machine learning to improve their fidelity as the system “learns” more about its use cases and the contexts in which it operates. For example, a factory robot can inhabit multiple contexts: it can exist by itself, it can be part of an assembly line with other instances of the same robot, or it can be part of several assembly lines within a factory.

The most critical success factor, however, is properly integrating Digital Twins with operational applications that can take their predictions and prescriptions and apply them to change the business processes those applications manage. Ultimately, the DT concept transforms machine learning from a means of creating models into a means of programming DTs and, through them, the real world.
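
To make the learning loop concrete, here is a minimal sketch in Python. The FactoryRobotTwin class, its method names, and the telemetry values are hypothetical illustrations, not any vendor’s API; the twin simply refines a learned estimate of the robot’s cycle time as observations arrive, and the same object could be instantiated in any of the contexts named above by changing its context attribute.

    class FactoryRobotTwin:
        """A toy Digital Twin: a small data model plus an online-learning
        estimate that gets "smarter" with each telemetry observation."""

        def __init__(self, robot_id: str, context: str = "standalone"):
            self.robot_id = robot_id
            self.context = context        # e.g. "standalone" or "assembly-line-3"
            self._n = 0                   # observations seen so far
            self._mean_cycle_s = 0.0      # learned estimate of cycle time

        def record_telemetry(self, cycle_time_s: float) -> None:
            # Incremental (online) mean: fidelity improves as more
            # real-world telemetry arrives from the physical robot.
            self._n += 1
            self._mean_cycle_s += (cycle_time_s - self._mean_cycle_s) / self._n

        def predict_cycle_time(self) -> float:
            return self._mean_cycle_s

    twin = FactoryRobotTwin("robot-17", context="assembly-line-3")
    for observed in (12.1, 11.8, 12.4, 12.0):   # telemetry from the physical robot
        twin.record_telemetry(observed)
    print(twin.predict_cycle_time())            # learned average cycle time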

The most critical milestones in building DTs that transform business outcomes are:

    • DTs “program” the real world. Digital Twins enable enterprises to simulate, test, and enact changes in order to optimize services, products, and processes.
    • DTs model products and processes better than traditional apps. Traditional applications represent business processes. DTs not only model physical devices and products but also integrate them with the operation of business processes.
    • “Knowledge Graphs” are the secret sauce for holistic and coherent DTs. A Knowledge Graph integrates the many applications behind a DT.

 

Digital Twin: Representation or model of a product, process, service, customer, supplier, or any other entity involved in a business. Ultimately, a Digital Twin is a simulacrum whose fidelity developers continue to improve over time.

Data model: Represents the structure, but not the behavior, of the DT.

Security: A typical rule of thumb for enhancing security is to minimize the surface area of an object such as a DT. This can be challenging with DTs because many industrial devices in operations achieve security through physical isolation from traditional IT networks.

API: Represents the behavior of the DT. It should conform to the data model for maximum developer usability.

Object model: The combination of the data model and the API; defines both the structure and the behavior of the DT.

Canonical model: Represents the generic version of a DT, data model, or Knowledge Graph that has no customer-specific extensions.

Knowledge Graph: Holds together the representation of the DT; the Knowledge Graph (KG) knows how the pieces fit together semantically. Customer-specific extensions can be layered on top, but all customers should run the same canonical core.

Backward compatibility: Enhancements to a DT co-developed with one customer must allow prior customers to upgrade to the canonical portion of that most recent DT.

Level of detail: The hierarchical structure that organizes multiple DTs and their APIs and data models; for example, four anti-lock brakes fit within the drivetrain of a car. The closer the data maps to the DT’s structure, the richer the context.

Table 1: Concepts and definitions related to Digital Twins.
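
The object model and level-of-detail rows lend themselves to a short illustration. In this minimal Python sketch (all class and field names are assumptions, not any product’s schema), each twin pairs a data model (its fields) with an API (its methods), and the twins nest to mirror the table’s example of four anti-lock brakes within a car’s drivetrain:

    # Hypothetical twin classes illustrating "object model" and "level of detail".
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class BrakeTwin:
        serial: str                   # data model: structure
        pad_wear_pct: float = 0.0

        def apply_wear(self, pct: float) -> None:   # API: behavior
            self.pad_wear_pct = min(100.0, self.pad_wear_pct + pct)

    @dataclass
    class DrivetrainTwin:
        brakes: List[BrakeTwin] = field(default_factory=list)

        def worst_brake(self) -> BrakeTwin:
            return max(self.brakes, key=lambda b: b.pad_wear_pct)

    @dataclass
    class CarTwin:
        vin: str
        drivetrain: DrivetrainTwin = field(default_factory=DrivetrainTwin)

    # Level of detail: four anti-lock brakes nest within the drivetrain of a car.
    car = CarTwin("VIN-001",
                  DrivetrainTwin([BrakeTwin(f"ABS-{i}") for i in range(4)]))
    car.drivetrain.brakes[2].apply_wear(37.5)
    print(car.drivetrain.worst_brake().serial)    # -> ABS-2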

 

DTs “Program” the Real World.

DTs build on a broader scope of application integration than was customary with prior application generations. As a result, Digital Twins can work in a broad range of operational contexts. For example, a manufacturer capable of mass customization would need a range of deeply integrated applications that could interact with the real world through DTs. The ability to manufacture lot sizes of one would require integration all the way from ERP-level sales order management, to plant-level production plans, to CAD/PLM-level management of bill-of-material variants, to coordination of machine tools on an assembly line, down to individual machine tool setup and configuration.
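
A hedged sketch of that chain, in which every function, identifier, and data shape is a hypothetical placeholder, might look like the following: a sales order accepted at the ERP level flows down through a plant-level plan and a PLM-resolved bill-of-material variant to an individual machine-tool configuration pushed through the tool’s DT.

    # Illustrative lot-size-of-one flow; all names and shapes are assumptions.
    def erp_accept_order(customer: str, variant: str) -> dict:
        return {"order_id": "SO-1001", "customer": customer, "variant": variant}

    def plant_plan(order: dict) -> dict:
        # Plant-level production plan for a single unit (lot size of one).
        return {"order_id": order["order_id"], "line": "line-3", "lot_size": 1}

    def plm_bom_variant(order: dict) -> list:
        # CAD/PLM resolves the bill-of-material variant for this one order.
        return [("frame", order["variant"]), ("motor", "std")]

    def tool_config(plan: dict, bom: list) -> dict:
        # Machine-tool setup derived from plan and BOM, pushed via the tool's DT.
        return {"line": plan["line"], "program": f"mill-{bom[0][1]}"}

    order = erp_accept_order("ACME", "frame-X")
    print(tool_config(plant_plan(order), plm_bom_variant(order)))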

Getting all the necessary DTs to perform this delicate dance in coordination is not easy. A robotic machine tool’s Digital Twin might have to track machining tolerances for each item after the tool is reprogrammed to produce a new variant of an item. A DT for the whole assembly line might have to measure changes in quality for all the downstream tools impacted by the output of that first machine tool. Product development and manufacturing engineers might have to design and manage different, custom product variants from the CAD system, which could reprogram the machine tools via their DTs in order to build a previously unmanageable range of products. The manufacturing execution system might have to simulate and test plant-wide reconfiguration changes in all relevant processes and devices in order to maximize yields for production runs of small lot sizes. Over time, as DTs come to represent everything from machine tools to multi-site supply chains, the Digital Twins would have to be configured and optimized to control how enterprises source, make, sell, deliver, and maintain products and services.
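
The “simulate and test before enacting” step can be suggested with a deliberately simplified model. Here the tolerance value, sensitivity factors, and function names are all assumptions: an assembly-line DT accepts a proposed machine-tool offset only if every downstream tool is predicted to stay within tolerance.

    # Toy simulation gate; numbers and names are illustrative assumptions.
    TOLERANCE_MM = 0.05   # allowed deviation per downstream tool

    def simulate_reconfiguration(new_offset_mm: float,
                                 downstream_sensitivity: list) -> bool:
        """Return True if every downstream tool stays within tolerance."""
        deviations = [new_offset_mm * s for s in downstream_sensitivity]
        return all(abs(d) <= TOLERANCE_MM for d in deviations)

    # Each factor models how strongly a downstream tool reacts to the change.
    line_sensitivity = [0.4, 0.9, 1.3]

    for offset in (0.02, 0.05):
        ok = simulate_reconfiguration(offset, line_sensitivity)
        print(f"offset {offset} mm -> {'enact' if ok else 'reject'}")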

 

DTs Model Products and Processes Better Than Traditional Apps.

Modeling the structure and operation of real-world products and processes as DTs, from an individual machine tool to an entire supply chain, creates richer representations than traditional applications and conventional methods of integrating them. Traditional data modeling has typically focused on supporting human-designed business processes such as accounting. DTs are richer: they not only model real-world objects but also integrate them with traditional business processes.

Choosing the right attributes to model, ensuring the “thing” is properly instrumented with sensors (or other data-capturing resources), and testing the model, among other tasks, all require significant skill. Moreover, the tools employed to work with “things” are not the ones IT professionals typically use. What’s more, traditional enterprise applications have to be integrated with Digital Twins in order for the twins to “program” the real world in the form of their physical counterparts. CAD and product lifecycle management (PLM) software has to be able to drive tooling setup via the twin. Manufacturing execution systems (MES) have to be able to drive an assembly line or an entire plant represented by twins. ERP applications have to be able to convert customer orders into work-in-process and finished-goods production plans for twins. And maintenance, repair, and overhaul (MRO) applications need to track when the twins are serviced and prescribe maintenance before failures occur.
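
The MRO integration, for instance, can be reduced to a toy prescription rule. The wear model, thresholds, and names below are illustrative assumptions only: the twin’s telemetry feeds a projected failure date, and service is prescribed a safety margin ahead of it.

    # Hypothetical predictive-maintenance rule driven by twin telemetry.
    FAIL_WEAR = 1.0        # wear level at which failure is expected
    SAFETY_MARGIN = 0.15   # prescribe service this far before predicted failure

    def prescribe_service(wear_now: float, wear_per_day: float) -> int:
        """Days until service should be scheduled for this twin."""
        days_to_failure = (FAIL_WEAR - wear_now) / wear_per_day
        return max(0, int(days_to_failure * (1 - SAFETY_MARGIN)))

    print(prescribe_service(wear_now=0.62, wear_per_day=0.01))   # -> 32 days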

 

Knowledge Graphs Are the Secret Sauce for Holistic and Coherent DTs

The connective tissue that gives complex DTs their ultimate integrity and coherence is often software that IBM calls a Knowledge Graph. The graph creates a common, holistic structure that holds together the representation of the DT and integrates the contexts of the many applications behind it. Once the graph integrates these disparate structures and behaviors, it becomes much simpler to map real-time and historical telemetry data about how the physical version of the twin is operating.
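
A Knowledge Graph can be suggested with a few subject-predicate-object triples. The identifiers below are hypothetical: the graph records how plant, line, robot, CAD model, and ERP work order fit together semantically, so an incoming telemetry reading can be mapped onto the right component and its surrounding application contexts.

    # Minimal triple-store sketch; every identifier is a made-up placeholder.
    graph = {
        ("plant-A", "contains", "line-3"),
        ("line-3", "contains", "robot-17"),
        ("robot-17", "describedBy", "CAD:model-9"),
        ("robot-17", "plannedBy", "ERP:workorder-554"),
        ("sensor-88", "monitors", "robot-17"),
    }

    def neighbors(node: str, predicate: str) -> set:
        """Follow one edge type out of a node."""
        return {o for s, p, o in graph if s == node and p == predicate}

    # Map an incoming telemetry reading to its semantic context.
    reading = {"sensor": "sensor-88", "temp_c": 71.4}
    twin = neighbors(reading["sensor"], "monitors").pop()   # -> robot-17
    print(twin, neighbors(twin, "describedBy"))             # links to CAD context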

Once there is a coherent structure to consume all the operational telemetry, machine learning models that are part of the DT analyze that data along with other information about the twin’s structure and behavior. That analysis is what enables the models to predict and optimize both the twin’s behavior and the business processes with which it interacts. Much of this analysis happens at the edge of the network, near the twin, so that it can inform the twin’s operation with very low latency.
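
A minimal sketch of that edge-side analysis, assuming illustrative thresholds and readings (the class and its methods are hypothetical), is a lightweight running-statistics model that flags anomalous telemetry locally, without a round trip to the data center:

    # Hypothetical edge-resident anomaly check using Welford's running stats.
    import math

    class EdgeAnomalyModel:
        """Running mean/variance for low-latency anomaly checks at the edge."""

        def __init__(self, z_threshold: float = 3.0):
            self.n, self.mean, self.m2 = 0, 0.0, 0.0
            self.z_threshold = z_threshold

        def update(self, x: float) -> bool:
            """Ingest one reading; return True if it looks anomalous."""
            anomalous = False
            if self.n >= 2:
                std = math.sqrt(self.m2 / (self.n - 1))
                anomalous = std > 0 and abs(x - self.mean) / std > self.z_threshold
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)
            return anomalous

    model = EdgeAnomalyModel()
    for reading in (70.1, 70.3, 69.9, 70.2, 94.7):   # last reading is a spike
        if model.update(reading):
            print("anomaly:", reading)               # -> anomaly: 94.7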

Action Item: Tools to design DTs and set up the individual data feedback loops are still somewhat immature. At this point, firms that can work with CAD-centric tools can try products such as ThingWorx from PTC. Extremely complex products would likely benefit from a more semi-custom approach from a vendor such as IBM. But whatever the product choice, all but the most sophisticated firms should work with a partner that has a strong professional services arm in order to get the benefit of skills transfer.

 
