Process Analytics enables World Class Operations Management
If you think about fact-based scenarios of your processes, then “process analytics”, “process simulation”, “process mining” and “digital twin” are buzzwords that come to mind. But what are they, and what added value do they bring?
Imagine having a simulation tool of your end-to-end manufacturing or supply chain process to play with different variables, learn about your process based on real data and adapt to reach your objectives. Or having a tool that tells you how to optimize capacity for future levels of demand. Or being able to run scenario analyses of your operations (such as closing a factory, running overtime on weekends, shutting down a SKU line, etc.).
This insight builds a common understanding of the added value of process analytics, shows how to apply it in a standardized way and highlights the importance of linking digital with traditional change and operational excellence concepts.
The benefits of process analytics
Process analytics can bring multiple advantages to your company, for instance:
- Scoping investment and improvement projects via (large) objective data analyses to prove assumptions
- Quickly and precisely simulate via what-if scenarios how productive outputs change according to the variation of the parameters
- Avoid expensive trials by building a digital twin environment (as long as you use realistic input data)
- Find optimal configurations (shipment plans, volume mix, shifts, safety stocks, etc.) according to a given objective (minimize inventory costs, lead times, shipment costs, optimize service levels, prices, etc.) based on real data
- Process transparency / efficiency tracking in Operational Excellence (e.g. for your Shared-Service Organizations)
- Dynamically detect productivity losses, bottlenecks, and machine and operator saturation
- Identify manual steps in the process that could be addressed with RPA (Robotic Process Automation)
The advantages of process analytics boost results achieved through traditional operational excellence.
Process analytics allows you to support decision making with fact-driven, statistical data. A complex reality can be analyzed and/or simulated through mathematical models to eliminate losses. Simulations and digital twins can be used to analyze, classify and/or forecast a flow of different process steps, while more specific data analytics techniques can be used to stabilize your processes or assess specific improvement opportunities based on detailed parameter analysis and classification. An integrated PHDS approach is key: targeting process, human, digital and sustainable factors. Supporting your operational excellence approach with process analytics is simply one example of combining process and digital (see the EFESO Insight on predictive maintenance).
Processes are often documented as a linear sequence of activities. In reality, these processes tend to play out significantly differently from the way they were designed. First of all, process inefficiencies, rework or missing information can alter the flow, even though it was designed to prevent these issues. Secondly, not every exception is included in the process design cycle, leading to circumstances where an unpredictable series of actions is taken to achieve the desired outcome. In most cases it is not feasible to document a process flow for each exception, but a well-designed exception management procedure can be implemented. Finally, people’s creativity and their experience with the process execution tend to lead to variations, which can potentially result in a more efficient process.
Within process analytics there are multiple ways to analyze and optimize processes. Analytics is complementary to traditional Swimlane / Makigami process analysis or indirect activity analysis. Within process analytics you will analyze and model the digital traces of real processes (e.g., preparation of offers, accounting, production process, P2P…). We use the log data from ERP, CRM, MES… to model and mine the process. It is quick, objective and it depicts the real process.
| | Process analytics (mining) | Traditional process analysis |
|---|---|---|
| Process variants | All actually occurring process variants | 1 selected reference variant (most frequent / longest) |
| Effort | Effort at the analyst | Effort at the process expert(s) (workshop with many participants) |
| KPI | Real values with real distributions | Estimation |
| Comparability / Benchmarking | Very well possible | Partly possible |
| Reuse of the analyses | Can be automated as a control process | Always a comparable effort |
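To make the mining side concrete, here is a minimal sketch of the core idea behind process mining: reconstructing process variants from the digital traces in an event log. The log rows, case IDs and activity names below are hypothetical stand-ins for what a real ERP or CRM extract would contain; dedicated process mining tools do far more than this.

```python
from collections import Counter, defaultdict

# Hypothetical event-log rows: (case_id, timestamp, activity),
# as they might be extracted from an ERP or CRM system.
event_log = [
    ("PO-1", 1, "Create order"), ("PO-1", 2, "Approve"), ("PO-1", 3, "Ship"),
    ("PO-2", 1, "Create order"), ("PO-2", 2, "Approve"), ("PO-2", 3, "Rework"),
    ("PO-2", 4, "Approve"), ("PO-2", 5, "Ship"),
    ("PO-3", 1, "Create order"), ("PO-3", 2, "Approve"), ("PO-3", 3, "Ship"),
]

# Group events per case and sort by timestamp to obtain each case's trace.
traces = defaultdict(list)
for case_id, ts, activity in sorted(event_log, key=lambda e: (e[0], e[1])):
    traces[case_id].append(activity)

# Count process variants: every distinct sequence of activities is a variant.
variants = Counter(tuple(t) for t in traces.values())
for variant, count in variants.most_common():
    print(count, " -> ".join(variant))
```

Even this toy version shows all actually occurring variants with their real frequencies, which is exactly what the workshop-based approach in the table above cannot deliver.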
Different types of process analytics
Process analytics uses process-related data with the objective of analyzing and optimizing its performance. There are 3 types of process analytics: descriptive, predictive and prescriptive, all with potential use in the decision-making process.
In descriptive analytics, we use historical data to explain what happened in the past. It helps to understand the process with objective data. Process mining can be categorized here, bridging the gap between traditional process analysis (e.g. business process management techniques such as lean management and six sigma) and data-centric analysis techniques such as machine learning and data mining. With predictive analytics, we discover scenarios of what could happen. Simulation sits between descriptive and predictive, but is most closely related to predictive analytics because of its use of mathematical models. Predictive analytics goes one step further than simulation because it gives insights into realistic future scenarios. Finally, prescriptive analytics actually shows us what will or should happen in the future, often through an advanced decision-making process that suggests or executes decisions and actions with or without human interference.
Process analytics can be used to analyze flows and full processes (simulation, digital twin, mining), or to focus on the analysis of parameters and their influence on product quality, delivery time, costs, etc. It can be predictive, but also used for classification.
Both simulation and digital twins use digital models to simulate reality. While simulation is more commonly used in product design, the digital twin becomes most relevant once a process is operational and fed with real-time data. There is a bidirectional flow of information / real-time data between the sensors that collect the data (IoT) and the twin, which means it can efficiently feed predictive analytical models. A digital twin is a software-simulated duplicate of the process, as accurate as it can be. The power of having a simulation of reality is that we can calculate exact information in a fraction of the time, also for scenarios that have never happened for real. Once you take the step to a digital twin, it can easily be tailored to your specific requirements and is fully scalable to highly complex scenarios.
To drive return on investment, process simulations allow you to rapidly assess different solutions and define priorities. Creating multiple realities from a single source of code allows you to dynamically follow the system’s behavior over time, assess performance and identify criticalities (bottlenecks, fluctuations, etc.) of the physical system.
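As a minimal illustration of such a what-if assessment, the sketch below models a serial production line whose throughput is limited by its bottleneck station, then compares a base scenario against an invested-capacity scenario. The station names, machine counts and hourly rates are invented for illustration; a real simulation would of course model far more dynamics.

```python
def line_throughput(stations):
    """Throughput of a serial line is limited by its slowest station.

    `stations` maps station name -> (number of machines, units/hour per machine).
    Returns (throughput, bottleneck station name).
    """
    capacity = {name: machines * rate for name, (machines, rate) in stations.items()}
    bottleneck = min(capacity, key=capacity.get)
    return capacity[bottleneck], bottleneck

# Base scenario vs. a what-if scenario that adds one packing machine.
base = {"mixing": (2, 60), "packing": (3, 35)}
what_if = {"mixing": (2, 60), "packing": (4, 35)}

print(line_throughput(base))     # (105, 'packing'): packing limits the line
print(line_throughput(what_if))  # (120, 'mixing'): the bottleneck has shifted
```

Note how the what-if run immediately reveals not just the new throughput, but also that the bottleneck moves to another station, which is exactly the kind of criticality a simulation surfaces before any money is spent.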
Within Industry 4.0, process analytics will help you to see and understand what is happening and to anticipate what will happen. Applying Lean principles and tools is important to achieve performance improvement (cost, quality, time) and forms the basis for solid process analytics. Industry 4.0, however, additionally requires the systematic use of data, connected processes and digital technologies.
To see what is happening, information is captured via, for instance, connected assets, e.g. through IoT platforms. If we want to understand why certain events are happening, we can make use of process mining techniques or BI dashboards in combination with digital shopfloor management. One step further, we can make use of digital process twins and advanced analytics to anticipate and predict what will happen. The ultimate self-adjusting autonomy is reached through AI-based decision making, which is beyond the scope of this article.
How to approach a successful process analytics journey
So far, we have established how analytics can contribute to the classical WCOM (World Class Operations Management) approach. Its goal is to provide insight into your process, or even to predict or prescribe what actions should be taken. Before discussing an actual client case study, it is worth briefly explaining the typical steps of a process analytics journey. A typical journey includes 3 steps: ‘Explore’, ‘Prototype’ and ‘Industrialize’.
1. Exploration: Start with understanding your process
A typical challenge when starting with process analytics is to clarify the scope, the objective, and the starting point of your analytics journey. Many companies have a ton of data but don’t have a clear vision on how to transform it into results and reap the benefits.
A clear understanding of the process is required. Typically, a series of interviews and workshops with experts and key-users is the first step to understand the process and map its symptoms, the influence of different process variables and the available sensors or control data.
An important step in process analytics is to classify the different input sources on your process map. Some of your process steps may lack reliable data, and you must evaluate how to obtain the missing data points. In the end, your goal should be to understand what is happening and how it can be monitored.
2. Prototype: How can we accurately model our process?
Building further on your previous insights, the next step in process analytics is to create a working prototype. This model will need to be validated against reality, which in turn will allow you to truly test the business case as well. Note that when you get to this stage, there will be a vast array of possible modelling approaches to consider.
Note that this is often an iterative process in which the model outputs are discussed with the process experts. During this phase, you will also sharpen your knowledge of the functionalities you need and the data required to run the models.
To build a prototype and perform process analytics, we must:
- Gather relevant data and make it accessible
- Transform the data to be used
- Build a model able to evolve with the passing of time similarly to the system under study
- Run experiments on the model, generating different statistical reports to deduce the real system’s behavior under the set conditions
- Analyze the results and evaluate decision alternatives, drawing on the links between the decisions taken and the system’s performance
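The steps above can be sketched as a small iterative pipeline. Everything in this sketch is a toy stand-in under stated assumptions: the data, the “model” (a single threshold separating good from bad runs) and the evaluation are illustrative, not a real implementation.

```python
def gather():
    # In practice: extract event and sensor data from ERP, MES, historians, ...
    # Here: invented runs as (run_id, temperature, quality label).
    return [("run-1", 70.2, "good"), ("run-2", 95.8, "bad"), ("run-3", 68.9, "good")]

def transform(raw):
    # In practice: clean, align time windows, engineer features.
    return [(temp, label) for _run, temp, label in raw]

def build_model(data):
    # Toy "model": a single temperature threshold halfway between the
    # warmest good run and the coolest bad run.
    good = [t for t, lbl in data if lbl == "good"]
    bad = [t for t, lbl in data if lbl == "bad"]
    return (max(good) + min(bad)) / 2

def experiment(threshold, data):
    # Evaluate how often the threshold reproduces the known labels.
    hits = sum((t <= threshold) == (lbl == "good") for t, lbl in data)
    return hits / len(data)

data = transform(gather())
threshold = build_model(data)
accuracy = experiment(threshold, data)
print(f"threshold={threshold:.1f}, accuracy={accuracy:.0%}")
```

In a real project, each of these functions grows through the iterations with the process experts: more data types in `gather`, richer features in `transform`, and proper statistical models in place of the threshold.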
3. Industrialize: How do you scale-up your model?
The most important success factor determining the further scaling of process analytics is an integrated approach to your roll-out. Since people are involved in all processes, we recommend leveraging strong industrial process expertise and integrating the human factor to arrive at an optimal symbiosis. We recommend starting your journey very focused on a single site and even a single process (step), namely the one with the most savings potential. Ideally, the savings you generate can serve as a proof of concept (show potential) for the next process step(s) and the next site(s).
Two crucial closing points about process analytics need to be made in this regard:
- First, it does not make sense to keep optimizing your model indefinitely. At a certain point, the marginal returns of a more optimized model simply do not outweigh the extra development costs.
- Secondly, activities such as process control, process confirmation and standard operating procedures, which are typically part of your daily control loop, should be updated with the newly gained insights and way of working. Your process will still need monitoring day after day.
Process analytics applied at a chemical site
The Client ran a highly complex process with fluctuating final product characteristics in an increasingly competitive market. The objective of the process analytics project was to prove the potential of applying advanced analytics to product quality through classification of ‘good’ versus ‘bad’ production runs, by developing a system, together with the Client, that predicts product characteristics for a certain product type.
Our process analytics approach started by developing a clear definition of a good quality product. It is critical to align with the stakeholders about the evaluation criteria for quality. Given the Client’s situation, the decision was to focus more on stabilizing quality (viscosity) of the final product than on early anomaly detection of bad quality. This was crucial in managing expectations.
To start off, the symptoms and failure modes of the process were identified, together with the Client. It was possible to rely on historical events, laboratory product samples, equipment drawings and the actual experience of operators and experts.
Then, through several workshops, the ‘variables to be measured’ were identified and the availability of the necessary data on these variables was assessed. A gap was identified that needed to be filled, mostly by extrapolating sensor data. It is worth mentioning that it is common in process analytics for data cleaning and the determination of time-series windows to be required.
Once these steps had been taken, historical (event) data was assessed and used to build classification models. This work is iterative in nature, as it requires adding data types and validating how the models perform.
When the process analytics model proved successful at predicting, the last steps were to automate the processing of sensor data, develop the right visualizations and take it into production.
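To illustrate the kind of classification step described above, here is a toy sketch that classifies production runs as good or bad with a nearest-neighbour vote over two invented process variables. The data, features and model are hypothetical stand-ins; the actual project used far richer sensor data and models.

```python
import math

# Hypothetical training runs: (temperature, stirring speed) -> quality label.
runs = [
    ((180.0, 120.0), "good"), ((182.5, 118.0), "good"),
    ((195.0, 140.0), "bad"),  ((197.5, 143.0), "bad"),
]

def classify(features, training, k=3):
    """k-nearest-neighbour majority vote over Euclidean distance:
    a toy stand-in for the classification models used in the project."""
    dists = sorted((math.dist(features, x), label) for x, label in training)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

print(classify((181.0, 121.0), runs))  # near the "good" cluster
print(classify((196.0, 141.0), runs))  # near the "bad" cluster
```

The point of the exercise in the project was the same as here: once historical runs are labelled and features are cleaned, a new run can be scored against them before its quality is confirmed in the lab.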
Together with our Client, with the process analytics project, we delivered the following concrete benefits:
- Increased annual output by reducing the required changeovers (up to one week’s worth of continuous production)
- Reduced cleaning and logistics costs
- Reduced degrading of products by stabilizing production characteristics
Key learnings: how to achieve success with process analytics
Now that we have explored an overall view on process analytics, together with a general approach and an actual Client case, let’s dive into the critical factors that determine success.
First success factor: Define a clear objective
Process analytics is worth using when the reality to analyze is complex, when phenomena of statistical interference or variation are relevant, when an existing system shows problems whose causes cannot be understood, when the investments at stake are significant, etc. Multiple objectives of process analytics were listed earlier in this article; it is crucial that you start from a clear, tangible objective. Think about the following examples:
- What happens if my demand increases in the next quarter by X %? Will I be able to keep the same service level?
- What happens if I reduce the number of FTEs at a certain process step by one unit?
- What is the average lead time that I can achieve if the percentage of rework increases by X%?
- How much can I reduce the cost per unit sold keeping the same service level?
- What are the saturation levels at a certain process step?
- What is the optimal shift configuration to minimize lead time?
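As an example of how simple some of these questions can be to model, take the rework question above: if each pass through a step must be redone with probability p, the expected number of passes follows a geometric distribution with mean 1 / (1 − p), so the expected lead time scales the same way. The base time and rework rates below are purely illustrative.

```python
def average_lead_time(base_time_h, rework_rate):
    """Expected lead time when each pass fails with probability `rework_rate`.

    The number of passes is geometrically distributed with expectation
    1 / (1 - rework_rate), so the expected lead time scales by that factor.
    """
    return base_time_h / (1.0 - rework_rate)

# What happens to an 8-hour step if rework rises from 10% to 15%?
print(average_lead_time(8.0, 0.10))
print(average_lead_time(8.0, 0.15))
```

Even this back-of-the-envelope model makes the non-linearity visible: each extra point of rework costs more lead time than the previous one, which is often enough to justify a deeper simulation.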
When defining your objective, you should take the following points into consideration:
- Benefits coming from process analytics must justify the time and resources spent on the study;
- People involved in the process analytics project must share the same perception of the problem;
- Do not overshoot; limit the scope to the relevant systems within the process;
- A common terminology to define the process is necessary;
- Everything we want to model, and the outputs we want to study, must be precisely defined;
- Flow diagrams of the process under study must be available.
Second success factor: Use a phased approach
As stated in the client case story, when launching a process analytics project, it is critical to make sure you understand the problem. Making (wrong) assumptions or not fully understanding the problem will most likely lead to wrong conclusions and will hence result in actions with no or even undesired outcome.
Next, be sure to have a clear view of what you will need in terms of data, metrics and expertise. A data requirements overview will come in handy during the data-gathering step. Make sure your data is clean and usable; remember, the quality of the input data determines the quality of the model. Data cleaning and transformation is a time-intensive step, but you will be thankful for taking the time to do it thoroughly once you evaluate the model’s accuracy. Make sure the analysis is understandable: your insights should be able to tell the story.
When launching a process analytics project, remember to respect the 3 main phases: exploration, prototyping, industrializing. Within the prototyping we will use an iterative approach to gather and transform the data, build the model, apply and carry out experiments and analyze the results. As already mentioned, don’t overshoot and use the phased approach for your different scopes (e.g. pilot, scale up).
Third success factor: Get the right (skilled) people on board in an integrated PHDS approach
Supporting your operational excellence strategy with digital projects requires more than a perfectly set-up virtual world. Since people are involved in all processes, we leverage our strong industrial and change management experience to bring value to the company and implement the change where adequate. Traditional models such as Kotter and ADKAR stay relevant to create a climate of success and to involve and support the people. Our strength lies in this integrated approach, where the Process Consultant role bridges the gap between the process expert (on the client side) and the Data Engineer role (typically also a consultant). Our consultants have the ability to challenge process experts, as they combine broad industry knowledge with deep digital expertise.
Fourth success factor: It pays off to visualize the simulation
Process analytics supported by graphic animation of the implemented model has various advantages:
- It helps to immediately verify that the model logic and the material flow are correct
- It allows a real-time correction of any mistakes entered in the program
- The visual representation of process analytics, with dynamic animation, makes the model accessible and easily understandable to people not skilled in the specific environment. In this way, any project can be shown at any level without complex explanations
Where should you start with process analytics?
We hope that by now we have unraveled some of the mysteries on process analytics and showcased in what way it can contribute to boosting your operational excellence program. It is through the deliberate application of the right digital technologies that you will be able to unlock even more operational improvements.
Jonas Ingelbrecht – Senior Consultant
Glenn Rosez – Senior Consultant