Once the business software, with its digital processes and functionality, has been put to good use in the organization, it will start to produce content and data. For a digital enterprise, what you do with this data will largely determine your level of success.
First, it is important to set some evaluation criteria to be able to analyse the output from the processes. The objective and scope of the analysis need to be clarified before diving into the data crunching. The business process itself must be understood, but if the “pyramid” is followed, then the business needs, the business process, and its digital counterpart are already adequately described.
The data and content produced when executing the processes must be harvested, analysed, measured, and managed to provide actionable insight.
Is the quantity of data as expected? If no one uses the processes in the system, then there will not be much content, and it will not matter how well the process is implemented and described.

What is the quality of the data like? Is it complete, accurate, and consistent?

Does the data arrive in a timely manner, so that decisions can be taken based on it, or does it only appear after the fact?
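To make these three questions concrete, here is a minimal sketch of how such checks might look once process output has been exported to a table. The column names, targets, and thresholds are all illustrative assumptions, not taken from any particular business software platform.

```python
import pandas as pd

# Hypothetical export of process-generated records; the column names
# are illustrative assumptions, not from any specific platform.
records = pd.DataFrame({
    "record_id": [1, 2, 3, 4],
    "approved_by": ["anna", None, "jon", "anna"],
    "created_at": pd.to_datetime(
        ["2024-03-01", "2024-03-02", "2024-03-02", "2024-03-10"]),
})

# Quantity: is the process actually being used?
expected_per_week = 50  # assumed target, set per process
span_weeks = max(
    (records["created_at"].max() - records["created_at"].min()).days / 7, 1)
print(f"Records per week: {len(records) / span_weeks:.1f} "
      f"(expected ~{expected_per_week})")

# Quality (completeness): share of records missing a mandatory field.
missing_ratio = records["approved_by"].isna().mean()
print(f"Records missing approver: {missing_ratio:.0%}")

# Timeliness: how stale is the newest record relative to 'now'?
lag = pd.Timestamp("2024-03-12") - records["created_at"].max()
print(f"Newest record is {lag.days} days old")
```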
All these criteria (and there are of course many more) are difficult enough when considering processes executed within one business software platform, but they become a lot more difficult when a process spans multiple platforms, meaning integration and data exchange have to be considered as well. We will, for now, focus on processes within a single platform, as the next chapter will be about data exchange across platforms.
Part of the analysis should also consider whether security and data privacy are sufficient to meet regulatory compliance as well as your internal company standards.
When considering the execution of business processes, most business software today has capabilities to describe what happened, typically in the form of dashboards and reports. Here it is a matter of defining the KPIs one wishes to monitor. Examining why it happened becomes a bit trickier, because it entails some form of root cause analysis.

Moving further, into predictive analysis that answers what might happen if we reduce the throughput of something, for instance the cooling pumps in a cooling system, would depend on the quality of several data sets combined: the design data, the calibrated asset data, the runtime data from integrated control and safety systems, and large amounts of historical data. Then “something/someone” needs to make sense of it all, and here AI with machine learning comes into play.

Furthermore, it would be nice if we could get an answer to what should be done based on all the data that has been analysed, which lands us firmly in the realm of prescriptive analytics.
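As a toy illustration of this ladder, the sketch below moves from a descriptive KPI to a naive prediction on a made-up series of pump sensor readings. A real system would combine design data, calibrated asset data, and runtime data, and would use trained machine learning models rather than a straight-line fit; everything here, including the alarm threshold, is an assumption for illustration only.

```python
import numpy as np

# Made-up daily vibration readings (mm/s) from one cooling pump;
# in reality this would come from the control system's historian.
days = np.arange(10)
vibration = np.array([2.1, 2.2, 2.2, 2.4, 2.5, 2.7, 2.8, 3.0, 3.1, 3.3])

# Descriptive: what happened? A simple KPI over the period.
print(f"Mean vibration, last 10 days: {vibration.mean():.2f} mm/s")

# Predictive: what might happen? Fit a linear trend and extrapolate
# to an assumed alarm threshold (a stand-in for a trained ML model).
slope, intercept = np.polyfit(days, vibration, 1)
threshold = 4.0  # assumed alarm level, not a vendor figure
days_to_threshold = (threshold - vibration[-1]) / slope
print(f"Threshold reached in ~{days_to_threshold:.0f} days at current trend")

# Prescriptive: what should be done? A placeholder rule where a real
# system would optimise across maintenance and production data.
if days_to_threshold < 14:
    print("Recommend: schedule inspection before the next maintenance window")
```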
All of this depends on the quality and amount of data.
The old saying still holds true: “garbage in, garbage out”, so data and process must be monitored, analysed, and optimised in an iterative fashion to allow for continuous improvement of both.
Some examples I have used before highlight the potential value of data: “… real power in the operational phase becomes evident when operational data is associated with each delivered pump. In such a case the operational data can be compared with the environmental conditions the physical equipment operates in. Let us say that the fluid being pumped contains more and more sediments, and our historical records of similar conditions tell us that the pump will likely fail during the next ten days due to wear and tear of critical components. However, it is also indicated that if we reduce the power by 5 percent, we will be able to operate the full system until the next scheduled maintenance window in 15 days. Information like that gives real business value in terms of increased uptime” (from Digital Twin - What needs to be under the hood?)
or
“… Data in itself is not of any value whatsoever, but if the data can be analysed to reveal meaning, trends, or knowledge about how a product is used by different customer segments, then it has tremendous value to product manufacturers.
If we look at the operational phase of a product, and by that I mean everything that happens from manufactured product to disposal, then any manufacturer would like to get their hands on such data, either to improve the product itself or to sell services associated with it.
Such services could be anything from utilizing the product as a platform for an ecosystem of connected products to new business models where the product itself is not the key, but rather the service it provides. You might, for instance, sell guaranteed uptime or availability provided that the customer also buys into your service program.” (from Big Data and PLM, what’s the connection?)
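The pump example in the first quote can be boiled down to a simple decision rule. The sketch below is purely illustrative: the remaining-life estimates and the effect of a 5 percent power reduction are taken as given from the quote, whereas a real implementation would derive them from historical wear data and a trained model.

```python
# Illustrative prescriptive rule for the pump example above. The
# remaining-life figures are assumed inputs; in practice they would
# come from models trained on historical records of similar sediment
# conditions.
days_to_failure_full_power = 10     # predicted at current load
days_to_failure_reduced_power = 16  # predicted at 5% less power
days_to_maintenance_window = 15

def recommend_action() -> str:
    """Pick the least disruptive action that reaches the maintenance window."""
    if days_to_failure_full_power > days_to_maintenance_window:
        return "Continue at full power until scheduled maintenance"
    if days_to_failure_reduced_power > days_to_maintenance_window:
        return "Reduce power by 5% to reach the maintenance window"
    return "Plan an unscheduled shutdown before predicted failure"

print(recommend_action())  # -> "Reduce power by 5% ..."
```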
In short: paying attention to your process-generated data, its quality, and the way you analyse it allows you to make business decisions based on data, not gut feeling.
Bjorn Fidjeland