Story, Features, and Benefits

The division of labor—the allocation of tasks to highly specialized individuals—was a game-changing idea of the 18th century that created the foundation for industrialization and the modern economic system. Today, the division of labor remains a key part of organizing an effective workflow. The latest approach to that challenge is the process digital twin, which may seem revolutionary and even borderline science fiction!

Digital twin and digital twin systems

Originating in the aerospace industry, the digital twin is a virtual representation of a physical object or process that is sufficiently accurate to support decision-making and is synchronized with the real world in near real time. The digital twin applies to a variety of problem domains. For example, the U.S. Air Force uses the digital twin concept to keep its fleet of aircraft operational, while Singapore is developing a digital twin of the nation's power grid to enhance the resilience and reliability of its power supply.

The value of digital twin systems is realized in:

• State monitoring and visualization
• Decision-making support and analysis
• Optimization of a physical asset or a process

There are various types of digital twin systems, categorized by the scale of the object being modeled. In this whitepaper, we cover a specific type of digital twin—the process digital twin—and explore its main building blocks.

What's the process digital twin?

The process digital twin focuses on sequencing and orchestrating all of the activities executed by people or equipment that produce products or deliver services for the customer. It is suitable for modeling both automated and manual operations.
As with any innovative idea, the process digital twin concept is grounded in well-established technologies, such as:

• Process Modeling
• Process Data
• Process Simulations
• Virtual Representations (2D, 3D visualizations, AR, VR)

Let's review the underlying concepts of the process digital twin, particularly how they were previously applied in the business world, how they evolved, and how they currently enable a new level of process understanding and control.

Origins of process modeling

A century ago, the Western world saw rapid industrial and economic growth. High consumer demand forced engineers to seek improvements to production processes that relied on massive amounts of human labor. As a result, multiple production management theories were created, which laid a foundation for modern industrial engineering.

In 1921, the American engineer Frank Bunker Gilbreth, a pioneer in managerial science and an efficiency enthusiast, set his mind to finding "the one best way to do work." Eventually, he proposed a method of process improvement based on visual "process charts." This method called for a holistic analysis of all activities before introducing any changes. The paper process charts were graphical representations of the actual process, serving as communication vehicles during iterative improvements.

Since the 1920s, Gilbreth's idea of creating process charts has evolved and can now be seen in modern Process Mapping and Value Stream Mapping techniques. While process mapping has its benefits, it also has some inherent flaws that limit its applications:

• Subjectivity. Process maps are often created using observations and interviews, and individual interpretations can differ depending on the role and perspective of the interviewee. The resulting diagrams can reflect an "ideal" process rather than the actual process.

• Actualization. Process maps are point-in-time representations of the process, and time and effort are required to keep them current.
Below, we describe a modern method for process modeling that doesn't have these flaws.

Process simulation

During the 1960s, process engineers were occupied with more complex process-related problems, such as:

• Determining the optimal layout for airport runways and passenger terminals that maximized airport efficiency.
• Timing urban traffic signals to reduce traffic congestion.

Pen-and-paper process modeling techniques were not enough to solve such problems, so engineers began using mathematical modeling and early computers to simulate processes and find optimal configurations.

One method that became popular for process simulation was Discrete Event Modeling. Geoffrey Gordon, an IBM engineer, started experimenting with this approach, and in 1961, IBM released the first version of GPSS (General Purpose Simulation System), the first commercial software for discrete event modeling.

In discrete event modeling, the process is displayed as a flowchart where blocks represent operations. The flowchart usually begins with "source" blocks generating entities and injecting them into the process, and ends with "sink" blocks that remove entities from the model. This type of diagram is known as a Process Model Notation and is widely used for describing process steps. With a di
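To make the source → operation → sink idea concrete, here is a minimal, illustrative discrete-event simulation written with only the Python standard library. It is not GPSS and not any vendor's product; all names (`simulate`, `start_service`, the parameters) are hypothetical. A source block schedules entity arrivals at fixed intervals, an operation block processes entities with a configurable number of servers, and a sink block records each entity's departure.

```python
import heapq
import itertools

def simulate(n_entities, interarrival, service_time, servers=1):
    """Toy discrete-event simulation of a source -> operation -> sink line.

    Entities arrive every `interarrival` time units, wait for one of
    `servers` identical resources, are processed for `service_time`,
    and then leave through the sink. Returns (makespan, departures).
    """
    counter = itertools.count()   # tie-breaker so the event queue stays stable
    events = []                   # heap of (time, seq, kind, entity_id)

    # Source block: schedule all arrival events up front.
    for i in range(n_entities):
        heapq.heappush(events, (i * interarrival, next(counter), "arrive", i))

    waiting = []        # entities queued for a free server (FIFO)
    free = servers      # idle servers at the operation block
    departures = []     # (entity_id, departure_time) collected at the sink
    now = 0.0

    def start_service(entity, time):
        nonlocal free
        free -= 1
        heapq.heappush(events, (time + service_time, next(counter), "depart", entity))

    # Event loop: repeatedly pop the next event in time order.
    while events:
        now, _, kind, entity = heapq.heappop(events)
        if kind == "arrive":
            if free > 0:
                start_service(entity, now)
            else:
                waiting.append(entity)
        else:  # "depart": the sink removes the entity from the model
            free += 1
            departures.append((entity, now))
            if waiting:
                start_service(waiting.pop(0), now)

    return now, departures

# With 4 entities arriving every 1.0 units and a single server taking
# 2.0 units per entity, the last entity leaves at time 8.0.
makespan, deps = simulate(4, 1.0, 2.0, servers=1)
print(makespan, deps)
```

Even a toy like this shows why simulation beat pen-and-paper charts: changing `servers=1` to `servers=2` immediately reveals how adding capacity shortens the makespan, a question that is tedious to answer by hand for realistic process sizes.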