
The Digital Enterprise - Data Exchange

5/8/2025


With data exchange, in this context, I mean the need for support of processes and data transfer between software platforms.
There is also another aspect, which is the need for interoperability between stakeholders both internal and external to the enterprise.

Would it be possible to combine the two? Yes.
Most companies that do any engineering soon find themselves in a situation similar to the figure below.
The names of the applications might differ, but the underlying problem remains the same: engineering data is created by best-of-breed design tools used by different engineering disciplines, and at some point the data must be consolidated across disciplines and communicated to another discipline, be it procurement, project execution, manufacturing, service and/or supply chain.


As the figure above indicates, this has often ended up in a lot of application-specific point-to-point integrations. For the last 25 years, more or less, so-called Enterprise Service Buses have been available. The Enterprise Service Bus, often referred to as an ESB, is a common framework for integration that allows different applications to subscribe to data published by other applications, thereby creating a standardized “information highway” between different company domains and their software of choice.

Note:
In recent years, more modern alternatives such as iPaaS platforms, microservice orchestration tools and event streaming platforms have become available. They often yield quicker results, but inherently share some of the same challenges at a higher process level, since data still needs to be mapped.

By implementing an Enterprise Service Bus, the situation in the company would look somewhat like the figure above. Now from an enterprise architecture point of view this looks fine, but what I often see in organizations I work with is more depressing.
Let’s dive in and see what often goes on behind the scenes.

Modern ESBs have graphical user interfaces that can interpret the publishing application’s data format, usually by means of XML, or rather its XSD schema. The same is true for the subscribing applications.
This makes it easy to create integrations by simply dragging and dropping data sources from one to the other. Often one will have to combine several attributes from one application into a single attribute in another application, but this is usually supported as well.
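As an illustration of the kind of mapping such connector tools generate behind the scenes, here is a minimal sketch in Python. The application names, attribute names and the combined "revision" attribute are all invented for the example:

```python
# Hypothetical mapping from a publishing CAD tool's record format to a
# subscribing ERP system's format, including two attributes combined into one.
def map_cad_to_erp(cad_item: dict) -> dict:
    """Translate a published CAD record into the ERP connector's format."""
    return {
        # direct one-to-one attribute mappings
        "MaterialNumber": cad_item["PartNumber"],
        "Description": cad_item["Title"],
        # two source attributes combined into one target attribute
        "Revision": f'{cad_item["MajorRev"]}.{cad_item["MinorRev"]}',
    }

cad_record = {"PartNumber": "P-1001", "Title": "Impeller",
              "MajorRev": "B", "MinorRev": "2"}
erp_record = map_cad_to_erp(cad_record)
```

The drag-and-drop mapping in an ESB user interface essentially produces tables equivalent to this function, one per publisher/subscriber pair.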

So far everything is just fine, and integration projects have become a lot easier than before. BUT, and there is a big but. What happens when you have multiple applications integrated?



The problems of point-to-point integrations have effectively been re-created inside the Enterprise Service Bus, because if I change the name of an attribute in a publishing application’s connector, all the subscribing applications’ connectors must be changed as well.
How can this be avoided? Well, several ESBs support the use of so-called dictionaries, and chances are that the Enterprise Service Bus, ironically, is already using one in the background.

So, what is a dictionary in this context?
Think of it as a Rosetta stone. And what is a Rosetta stone, you might ask? The discovery of the Rosetta Stone was the breakthrough in understanding Egyptian hieroglyphs. The stone contained a decree with the same text in hieroglyphs, Demotic script and ancient Greek, allowing us to decipher Egyptian hieroglyphs.
Imagine the frustration before this happened: a vast repository of information carved in stone all over the magnificent finds of an earlier civilization, and nobody could make sense of it. Sounds vaguely familiar in another context.

Back to our more modern integration issues.

If a dictionary or Rosetta stone is placed in the middle, serving as an interpretation layer, it won’t matter if the name of some attribute in one of the publishing applications changes. None of the other applications’ connectors will be affected, since only the mapping to the dictionary must be changed, and that mapping is the responsibility of the publishing application.
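The principle can be sketched in a few lines of Python. All application, attribute and dictionary term names here are invented; the point is only that each application maps to and from the neutral dictionary, never directly to another application:

```python
# A neutral dictionary ("Rosetta stone") of terms shared by all connectors.
DICTIONARY_TERMS = {"part_number", "description", "mass_kg"}

# Each publishing application owns its own mapping TO the dictionary.
cad_to_dictionary = {"PartNo": "part_number", "Title": "description", "Weight": "mass_kg"}
# Each subscribing application owns its own mapping FROM the dictionary.
dictionary_to_erp = {"part_number": "MaterialNumber", "description": "MaterialText", "mass_kg": "NetWeight"}

def publish(record: dict, to_dict_map: dict) -> dict:
    """Translate a source record into neutral dictionary terms."""
    neutral = {to_dict_map[k]: v for k, v in record.items() if k in to_dict_map}
    assert set(neutral) <= DICTIONARY_TERMS  # only neutral terms cross the bus
    return neutral

def subscribe(neutral: dict, from_dict_map: dict) -> dict:
    """Translate neutral terms into a subscriber's local attribute names."""
    return {from_dict_map[k]: v for k, v in neutral.items() if k in from_dict_map}

neutral = publish({"PartNo": "P-1001", "Title": "Impeller", "Weight": 12.5}, cad_to_dictionary)
erp_record = subscribe(neutral, dictionary_to_erp)
# If the CAD tool renames "PartNo", only cad_to_dictionary changes;
# dictionary_to_erp and every other subscriber mapping stay untouched.
```

That last comment is the whole argument: the number of mappings grows linearly with the number of applications, not quadratically.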

If such a dictionary is based on an industry standard, it will also have some very beneficial side effects.

Why?
Because if your internal company’s integration dictionary is standards based, then the effort of generating information sent to clients and suppliers, traditionally referred to as transmittals or submittals, will be very easy indeed.

If we expand our line of thought to the interpretation of data from operational systems (harvesting data from physical equipment in the field), commonly referred to as IoT, or the acquisition of data through SCADA systems, then the opportunities become even greater.

In this case it really is possible to kill two birds with one stone, thereby creating a competitive advantage!

Bjorn Fidjeland


A similar article, “Data Integration – Why Dictionaries…..?”, was published at plmPartner.com several years ago, but in a slightly different context.

The Digital Enterprise - Content and Data

5/6/2025



After the business software with its digital processes and functionality is put to good use in the organization, it will start to produce content and data, and for a digital enterprise, what you do with this data will largely determine your level of success.

First, it is important to set some evaluation criteria to be able to analyse the output from the processes. The objective and scope of the analysis need to be clarified before diving into the data crunching part. The business process itself must be understood, but if the “pyramid” is followed, then the business needs, the business process and its digital counterpart are already adequately described.

The data and content produced when executing the processes must be harvested, analysed, measured, and managed to provide actionable insight.
Is the quantity of data as expected? If no one uses the processes in the system, there will not be much content, and it will not matter how perfectly the process is implemented and described.
What is the quality of the data like? Is it complete, accurate and consistent?
Does the data appear in a timely manner, so decisions can be taken based on it, or does it only arrive after the fact?
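These three criteria (quantity, quality, timeliness) can be expressed as simple automated checks. The sketch below is illustrative only; the field names and thresholds are invented:

```python
# Minimal data-quality checks for process output records:
# quantity, completeness (one facet of quality), and timeliness.
from datetime import datetime, timedelta

REQUIRED_FIELDS = {"id", "status", "owner", "updated"}

def check_quantity(records: list, expected_min: int) -> bool:
    """Is there at least as much content as we expected the process to produce?"""
    return len(records) >= expected_min

def check_completeness(record: dict) -> bool:
    """Are all required fields present and populated?"""
    return REQUIRED_FIELDS <= record.keys() and all(
        record[f] is not None for f in REQUIRED_FIELDS)

def check_timeliness(record: dict, max_age: timedelta, now: datetime) -> bool:
    """Is the record fresh enough to base a decision on?"""
    return now - record["updated"] <= max_age
```

Running such checks on a schedule, and trending the results, is one concrete way to monitor process output iteratively rather than discovering quality problems after the fact.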

All these criteria (and there are of course a lot more) are difficult enough when considering processes executed within one business software platform, but become a lot more difficult when a process spans multiple platforms, meaning integration and data exchange have to be considered as well. We will, for now, focus on processes within a platform, as the next chapter will be about data exchange across platforms.
A part of the analysis should consider if security and data privacy are sufficient to meet regulatory compliance as well as your internal company standards.
​
When considering the execution of business processes, it is common today for business software to have capabilities to describe what happened, typically in the form of dashboards and reports. Here it is a matter of defining the KPIs one wishes to monitor. Examining why it happened becomes a bit trickier, because it entails some form of root cause analysis.
Moving further, into predictive analysis to answer what might happen if we reduce the throughput of something, for instance the cooling pumps in a cooling system, would depend on the quality of several data sets combined: the design data, the calibrated asset data, the runtime data from integrated control and safety systems, and large amounts of historical data. Then something, or someone, needs to make sense of it all, and here AI with machine learning comes into play. Furthermore, it would be nice to get an answer to what should be done based on all the data that has been analysed, which lands us firmly in the realm of prescriptive analytics.
All of this depends on the quality and amount of data.
The old saying still holds true: “garbage in, garbage out”, so data and process must be monitored, analysed, and optimised in an iterative fashion to allow for continuous improvement of both.

Some examples I have used before highlight the potential value of data: “…. real power in the operational phase becomes evident when operational data is associated with each delivered pump. In such a case the operational data can be compared with environmental conditions the physical equipment operates in. Let us say that the fluid being pumped contains more and more sediments, and our historical records of similar conditions tells us that the pump will likely fail during the next ten days due to wear and tear of critical components. However, it is also indicated that if we reduce the power by 5 percent, we will be able to operate the full system until the next scheduled maintenance window in 15 days. Information like that gives real business value in terms of increased uptime” Digital Twin - What needs to be under the hood?

or

“ ..Data in itself is not of any value whatsoever, but if the data can be analysed to reveal meaning, trends, or knowledge about how a product is used by different customer segments, then it has tremendous value to product manufacturers.
If we look at the operational phase of a product, and by that, I mean everything that happens from manufactured product to disposal, then any manufacturer would like to get their hands on such data, either to improve the product itself or sell services associated with it.
Such services could be anything from utilizing the product as a platform for an ecosystem of connected products to new business models where the product itself is not the key but rather the service it provides. You might sell guaranteed uptime or availability provided that the customer also buys into your service program for instance.” Big Data and PLM, what’s the connection?
 
In short: paying attention to your process-generated data, its quality and the way you analyse it allows you to make business decisions based on data, not gut feeling.
 
Bjorn Fidjeland

The Digital Enterprise - Digital Processes and Functionality

4/13/2025


As the business software is selected, the next step is to implement the described processes “your way of working” as digital equivalents in the business software.
These are the typical PLM, ERP, CRM, MES and EAM projects executed in organizations, and they tend to create headlines like “More than 50 percent of IT projects fail”.

How do we avoid ending up in these statistics?
As for any kind of project it is important to define clear objectives. If we have, as per the “pyramid” image above, identified the business needs and defined the business processes, this part should be straightforward. However, be careful not to bite off more than you can chew. Then establish measurable objectives for the implementation.

Pay careful attention to involving the affected stakeholders. They need to be informed, engaged and consulted regularly during the implementation of the digital processes. They will also be very valuable in terms of feedback and guidance.

If your enterprise has history in terms of legacy data, some tough decisions must be made. Does it make sense to migrate all the data into the new system or systems? Is it even possible? What value will it bring to migrate all data compared to the cost?
Do we leave all legacy data in the old system and gradually phase it out? However, “gradually” might take a really long time, depending on the lifecycle of the product or facility data managed in the system.
A third option would be to partially migrate data from the legacy system, meaning, that only the most important data will be migrated to the new system in accordance with the new data model and processes. This option will make the migration exercise less costly and time consuming, but it will still leave the need for a functioning legacy system containing the old data even if very few individuals in the organization need to use it.

Next, decisions will have to be made regarding how much configuration or customization of the business software is needed compared to the needs of the business. In my view it is absolutely crucial that the business is involved in this step and keeps an open mind. Is it really important to our business execution to change the software process, or would it be possible to run it out of the box? If the process is unique to you and gives you a competitive advantage, then I would say yes, it would probably be a good idea to customize. Just bear in mind that the cost is not only implementing the unique process, but also maintaining and upgrading it yourself.

For the execution of the implementation project there are several different schools. Personally, I lean towards an agile approach with SCRUM like frameworks. The main reason for this is that I’ve seen first-hand on multiple occasions the power of the “show and tell” at the end of a sprint where business stakeholders get to see the developed functionality and ask questions to the scrum team and the scrum team get to show what they have achieved as well as asking questions to business stakeholders. It facilitates a powerful “us” dynamic as opposed to “us vs them”, and it also means that the project is better equipped to capture misunderstandings and errors early.

As the system with the defined digital processes and functionality is ready to be deployed and put to real use in the organization the biggest pitfall of all is hidden in the shadows……

Rollout, training, and organizational change management. Tons of literature has been written regarding its importance, and virtually everybody agrees it is important, so why is it so hard?
Because it involves people, and even more difficult, people that must change the way they do things. People, we, do not like to change, we resist change.

Is it then hopeless? Not at all. It just means that attention and resources must be set aside to deal with this important phase of any business software project. As an example, SharePLM offers a lot of great articles on this topic.

After the business software with its digital processes and functionality is put to good use in the organization, it will start to produce content and data. For a digital enterprise, what you do with this data will largely determine your level of success.  

Bjorn Fidjeland



The Digital Enterprise - Business Software

4/7/2025

When business needs are identified and processes are defined, it becomes important to identify a business software strategy to support business process execution.
Implementing the business processes in software allows the enterprise to make sure the process is followed, to measure it, and to ensure feedback on both the process itself and its outcomes.
Effective implementation of the business processes in software is absolutely crucial for a digital enterprise to be able to continuously improve, detect changes, transform and respond to new business opportunities as well as threats.

In recent times I have encountered some interesting points for debate. Some startup companies only roughly define their business processes before scanning the market for business software. The idea is that such software solutions generally come with pre-defined processes, at least at a lower discipline level, and that these may be adopted straight away by the startup, as there are no legacy processes and no legacy data.

I see both pros and cons to this approach, but would love to hear your opinion first.


If we examine some different business software strategy approaches, I would like to focus on the three most common ones.
 
The monolithic approach:
In this approach, one software solution, or at least software from only one vendor, is selected. It does have the advantage that, at least in theory, there is a one-stop shop for Product Lifecycle Management, Enterprise Resource Planning, Manufacturing Execution Systems, after-sales and services etc.
The downside is that all eggs are in one basket, and it will be difficult to ever change systems.
And trust me, the software provider knows this.


A few core enterprise business systems:

The strategy here is to identify best-of-breed platforms for major “chunks” of the business process. Examples could be one platform for design and engineering, another for procurement and manufacturing, and a third for aftermarket and service. For this setup to work it becomes necessary to spend quite a bit of time and money on integration strategies to ensure that sufficient information flows back and forth between the software platforms. A key enabler here is to define a common language across the enterprise, meaning a Reference Data Library (RDL) of master data classes to ensure interoperability between the software platforms. This will greatly aid integrations, as cumbersome data mapping tables can be eliminated from the integrations (Data Integration – Why Dictionaries…..? )
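To make the RDL idea concrete, here is a small sketch. The class identifiers, labels and platform type names are all invented; the point is that each platform keeps only a thin mapping from its local types to stable RDL identifiers:

```python
# Sketch of a Reference Data Library (RDL) entry and how two platforms
# map their local types onto it.
from dataclasses import dataclass, field

@dataclass
class RdlClass:
    rdl_id: str                       # stable identifier shared across platforms
    label: str
    attributes: list = field(default_factory=list)

PUMP = RdlClass("RDL-0042", "Centrifugal pump", ["ratedPower", "designPressure"])

# Each platform maps its local type names to the shared RDL identifiers.
plm_type_to_rdl = {"PumpDesign": "RDL-0042"}
erp_type_to_rdl = {"PUMP_CENTRIF": "RDL-0042"}

def same_class(plm_type: str, erp_type: str) -> bool:
    """Two local types are interoperable if they resolve to the same RDL class."""
    rdl = plm_type_to_rdl.get(plm_type)
    return rdl is not None and rdl == erp_type_to_rdl.get(erp_type)
```

With such a library in place, an integration only needs to agree on RDL identifiers, not on every platform's local naming.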

Orchestrated microservices:
The idea here is to utilize central orchestration, which manages the interactions and workflows between different microservices, while the services themselves are developed to perform the activities.
This approach is flexible and allows for using tools like Kubernetes for container orchestration, and workflow engines like Camunda and Apache Airflow for managing business processes. The downside is that it will require considerable development effort to implement your business processes.

One should always carefully consider the amount of energy and resources put into in-house developed solutions, as they will have to be maintained and technically upgraded to ensure they stand the test of time, both functionally and from a software security standpoint.

Bjorn Fidjeland



The Digital Enterprise - Business Processes

4/4/2025

Business processes should be formally described to convey the way we work in the company. Such process descriptions are typically found in the company’s management system.

Figure 1

There are different policies regarding granularity and how to break down high-level processes into lower-level processes with activities. A rule of thumb is that there should not be much more than three levels from a high-level process down to the activities you actually need to perform to fulfill the overall process.
 
Activities or tasks that need to be performed in a process must have clear descriptions of the required input needed to successfully complete the activity, the tasks to be performed within the activity, the role and discipline performing them, and the expected outcome or output from the activity (figure 2).


Figure 2

Furthermore, it must be clearly specified who should receive the output of the activity and why (figure 3).
Figure 3
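The process-description elements above (inputs, tasks, role, discipline, outputs, and output recipients) can be captured in a simple structure. This is only an illustrative sketch; the field names and the example activity are invented:

```python
# A minimal data structure for describing an activity in a business process.
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    required_input: list          # what is needed to start the activity
    tasks: list                   # what is performed within the activity
    role: str                     # who performs it
    discipline: str
    output: list                  # the expected outcome
    output_recipients: list = field(default_factory=list)  # who receives the output

review = Activity(
    name="Design review",
    required_input=["3D model", "requirement list"],
    tasks=["check clearances", "verify requirements"],
    role="Lead engineer",
    discipline="Mechanical",
    output=["review report"],
    output_recipients=["Project manager", "Procurement"],
)
```

Writing process descriptions down in a structured form like this also makes it possible to check them automatically, for instance that every output has at least one recipient.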

When I interview organizations, especially during different elicitation activities, I very often hear that there are missing or insufficient feedback loops between different disciplines. Such feedback loops are easy to draw in a process, as in figure 4 below, but they are a whole lot more difficult in reality, as they involve people, departments (which is even more difficult, as it involves a different “tribe” of people) and possibly even other companies.


Figure 4

Hard as it may be, feedback loops are crucial for continuous improvement and learning, especially as the company grows to a size where it is hard to know what everyone is doing.

In my view, trying to measure or at least monitor that feedback activities are performed is of the utmost importance.

Bjorn Fidjeland



The Digital Enterprise - Business Needs

4/1/2025

In the article “What is a digital enterprise? And why should you care?” I wrote that “Business needs must be catered for by defined business processes (“the way we work”)”.
The needs of the business may be in support of strategies, i.e. plans for how we intend to achieve a certain goal or mission.
Such strategies are typically there so that we may reach some vision we have of what our enterprise should be in the future.
Furthermore, as business is also conducted in the digital realm, we need to capture business needs that support the enterprise’s policy for IT security. Such a policy, or IT security strategy, also influences budgets and resources.
Questions arise like: “Can we move to the cloud to diminish the need for hardware and resources?” The answer may depend on regulatory requirements as well as industry requirements.
We may see clear business needs that we wish to solve in the best possible way; however, we are more often than not restricted by available budgets and resources.
 
But how do we identify business needs?
Even in well-established organizations I’ve found that performing so-called “business requirement elicitation” initiatives or projects regularly (at least every three years) across the departments of an enterprise is very useful. Not only to capture the business needs within departments, but also to capture what each department needs from other departments to do its work effectively.

When the needs are validated, verified and communicated to all stakeholders, it becomes a very effective way of communicating why, for instance, engineering needs this and that information from sales, or why manufacturing desperately would like information it knows engineering has, but is not being provided.
 
I was once responsible for an enterprise-wide business requirement elicitation initiative regarding an end-to-end process for facility assets. The lifecycle of an asset was defined as “from a need is identified in the facility to decommissioning of the physical asset”. Needless to say, it spanned pretty much all departments and functions in the company, from engineering through procurement, manufacturing, supply, installation, commissioning, operations and maintenance to, ultimately, decommissioning.
Some key stakeholders clearly voiced their discontent with the initiative saying it was a complete waste of their time, and that it was a classical case of shooting sparrows with cannons… Nevertheless, after a bit of persuasion they went along with it.
After the elicitation exercises, the business requirements from all departments and functions were verified, validated and mapped towards an overall end-to-end business process. The findings and recommendations based on the findings were presented to all stakeholders involved.

Interestingly, the very same stakeholders who had voiced discontent now saw the benefit of the exercise and wondered why we had not done it sooner.
 
When the business needs are identified, it becomes a question of how we can fulfill them. How can we organize our way of working to fulfill the business needs?
This leads us to the next level in the pyramid, namely business processes.

Bjorn Fidjeland

What is a digital enterprise? And why should you care?

4/2/2024

Hallmarks of so-called digital enterprises are that they are able to mirror and refine the real-world way of conducting business in the digital world, and in some cases even blur the boundaries between the two.


Such companies design their business processes (their “ways of working”), implement them digitally, and execute and measure their processes continuously. The measurements are then used to optimize and adjust the business processes in both the real and the digital world.



Emphasis is put on company services being protected by cyber security measures, to safeguard both the services and the company itself.



Companies that have focused on this are better able to continuously improve, and to detect, adapt and transform in response to changes, opportunities and threats in their business environment.

To make this work, as per the figure below, business needs must be catered for by defined business processes (“the way we work”).

The defined business processes must be implemented by digital counterparts in enterprise business software

Data and content produced when executing the processes must be analyzed, measured and managed to provide actionable insight

Data exchanges must be interoperable to support cross-cutting processes and data transfer, or even better, linking of data, between enterprise business software when needed.

An infrastructure of hardware and software capable of running the enterprise business software is the foundation for it all




In my article series “PLM Tales from a true mega project”, and more specifically in Chapter 8 - Digital Twin, I wrote:

​"The thing is…. In order to achieve what is described in this article series, most companies would have to change the way they are currently thinking and working across their different departments, which brings us to the fact that real business transformation would also be required. The latter is most of the time a much larger and more time-consuming obstacle than the technical ones because it involves a cultural change."

So while the "PLM Tales from a true mega project" and "Plant Information Management" articles focused on the information that needs to be managed and how this may be done, this and articles to come will focus on how the business may be transformed as well as the governance needed to make it happen.


Bjorn Fidjeland



An advisor’s most important abilities: Knowing when to keep your mouth shut

4/22/2022

I realized through a question from a young consultant the other day that I’ve never really written about the main portion of what I do as advisor. Sure, I write about the strategies and outcomes of those, but not really about the advisory role.

Coming back to the question from the young consultant, she asked me “what would you say is the single most important ability for you as an advisor?” I thought for a while before responding: “the ability to keep my mouth shut”. She looked at me like I just fell out of the sky, so some further explanation was needed.

In my experience, especially when advising management, they have fewer people to confide in. It is a common phrase that “the closer to the top you get, the lonelier it gets”, and it is true in a lot of aspects. These people have a lot of responsibility and power to influence the everyday working life of people in the organization, so most of them are very careful about speaking their mind too early, for fear of being misinterpreted or causing rumors that could be damaging.

I’m going to use Tom as an example. Tom is CIO of a large company, and Tom is not his real name, by the way.
We were reviewing a vision and potential strategies to support it, and what each strategy would mean for internal business processes (ways of working) etc. I soon realized that Tom was in desperate need of a sounding board, and sometimes, not even that. Just the opportunity to formulate his thoughts calmly into the spoken word was enough to unlock several strategies and business opportunities.
My task would then be to simply probe and interrogate those strategies from different angles, and offer possible consequences to both ways of working and viable implementations.
If I had not kept my mouth shut to give Tom time to arrange his thoughts, his creative thinking process would have been interrupted, and the sessions nowhere near as valuable.

I’ve seen this pattern repeat itself with several clients.

Another essential rule is to never, ever relate anything from a conversation like the one described above outside of the room, unless explicitly told to. Such conversations are always to be treated as confidential, to foster an environment of candidness and openness. If that rule is not adhered to, rumors tend to start flying about new directions; they get tweaked along the way, and before you know it the organization is in turmoil.

Outcome: you will have failed as an advisor, your client will never trust you again, you will not work with that company anymore, and the grapevine will ensure that you get fewer opportunities to work with other companies as well.

So, yeah, I would definitely say that an advisor’s most important ability is knowing when to keep your mouth shut.
 
Bjorn Fidjeland

Digital Twin - What’s in it for the facility owner

4/1/2022

All facility owners want a facility that produces 24/7, 365 days a year; however, nobody has one.

Production needs to be halted for all sorts of reasons both scheduled, like planned maintenance, and un-scheduled, like interventions due to failures in critical equipment. The digital twin promises that if you have an exact data-based replica of your physical facility which actually “talks to” and “understands” your physical facility, you will be able to greatly optimize operations, maintenance, risk management, safety etc.

How?
Well, if data is acquired in real time from sensors on equipment in the operating facility and fed to the digital twin for analysis against the design and engineering specifications of both the equipment itself and the facility system in which it operates, then one can learn really interesting things: Is the equipment operating within the thresholds set by the system design? Is it nearing the end of its useful life? Is it behaving oddly? Do we have historical knowledge to predict what happens when the equipment behaves like this? Can we reduce throughput and expect to reach the next scheduled maintenance window before it breaks, thereby limiting production downtime?

These are the sorts of things that can facilitate proactive actions and predictive maintenance instead of reactive and corrective maintenance, thereby increasing operational time.
In addition, if you have such an up-to-date, data-based replica of the facility, it becomes a lot easier to simulate and test everything from hazardous operations to regular work with Lock-Out Tag-Out procedures, installation planning and execution of new equipment, inspection routes etc., because you can train for it in the virtual world of the digital replica. Of course, this is only true if the digital replica is kept up to date.
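The core of the threshold monitoring described above is simple to sketch: runtime sensor readings evaluated against design limits held by the twin. The measurement names and limit values below are invented for illustration:

```python
# Design limits the digital twin holds for a piece of equipment
# (values are made up for the example).
DESIGN_LIMITS = {"bearing_temp_c": 85.0, "vibration_mm_s": 7.1}

def evaluate_reading(reading: dict, limits: dict = DESIGN_LIMITS) -> list:
    """Return the measurements that exceed their design threshold."""
    return [name for name, value in reading.items()
            if name in limits and value > limits[name]]

reading = {"bearing_temp_c": 92.3, "vibration_mm_s": 4.0}
alarms = evaluate_reading(reading)  # bearing temperature is above its limit
```

A real twin would of course go much further, combining such checks with historical patterns to predict failures, but every predictive step builds on this basic comparison of operational data against design data.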
So how can we go about actually getting such a digital replica of our facility?

There are essentially two ways to get there:
​
Facility owners can start specifying that a digital replica of the facility is a part of the EPC (I-C) contract, and that this delivery is just as important as the actual physical facility.
I have seen some facility owners moving in this direction; however, they then have to specify exactly what constitutes a successful digital replica delivery, and then make sure that it is updated continuously during operations.


Figure 1: "Digital Twin" created by EPC and handed over as part of commissioning
Or, if the facility is already built and operational, laser scans can be performed to gain an As-Operated model of the facility. However, this will only give you a pretty model. Data about initial design requirements, installed assets and their design, data from installation and commissioning, and what has happened to the equipment and systems since then must be reverse engineered and connected to achieve a digital twin fit for purpose.
Figure 2: "Digital Twin" created during operations by laser scanning and reverse engineering

Both of these approaches have one important thing in common. They both heavily depend on a properly defined information model that can handle information exchange across multiple disciplines, software tools and even companies throughout the lifecycle of the facility.

To achieve that, interoperability is vital.

What does that mean then, and how can it be done?
The “PLM tales from a true mega-project” series and "Digital Twin - What needs to be under the hood?" offer ways of doing it.
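As a toy illustration of the idea, not any particular standard or product: a discipline-neutral information model can carry one shared identity for a physical asset while preserving each tool's native ID, so data from design, commissioning and maintenance can be connected to the same record. All names below (EquipmentRecord, merge_source, the tags and tool names) are invented for this sketch.

```python
from dataclasses import dataclass, field

# Minimal, hypothetical neutral information model: every discipline tool keeps
# its own native ID, but all views map onto one shared equipment record.
@dataclass
class EquipmentRecord:
    facility_id: str
    functional_tag: str   # discipline-neutral tag, e.g. "P-101A"
    equipment_class: str  # taken from a shared dictionary, e.g. "CentrifugalPump"
    source_ids: dict = field(default_factory=dict)   # tool name -> native ID
    attributes: dict = field(default_factory=dict)   # consolidated neutral attributes

def merge_source(record: EquipmentRecord, tool: str, native_id: str, attrs: dict) -> None:
    """Attach one tool's view of the same physical asset to the neutral record."""
    record.source_ids[tool] = native_id
    record.attributes.update(attrs)

# One physical pump, seen by two different tools across the lifecycle:
pump = EquipmentRecord("FAC-01", "P-101A", "CentrifugalPump")
merge_source(pump, "3D-CAD", "cad-88412", {"design_pressure_bar": 16.0})
merge_source(pump, "MaintenanceSystem", "mx-0071", {"install_date": "2021-06-14"})
```

The point of the sketch is that interoperability is achieved at the model level: each system continues to own its native data, while the neutral record ties the lifecycle together.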
 
A facility owner that owns multiple facilities would benefit even more from having such a defined information model, as it can be shared across facilities and digital twins. This would allow for a new level of insight across facilities. As an example: if a certain type of equipment keeps failing under certain circumstances, it can immediately be analyzed whether that type of equipment is used the same way not only in one facility, but in all other owned facilities as well. This would enable much more effective knowledge sharing across all owned sites, and prevent unnecessary downtime.
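To make that cross-facility example concrete, here is a deliberately simple sketch with invented data: because the shared information model makes equipment class and operating condition mean the same thing at every site, a single query answers the question for the whole portfolio at once.

```python
# Illustrative failure events from several facilities that share one
# information model, so "equipment_class" and "condition" are comparable.
failure_events = [
    {"facility": "FAC-01", "equipment_class": "CentrifugalPump", "condition": "high_ambient_temp"},
    {"facility": "FAC-02", "equipment_class": "CentrifugalPump", "condition": "high_ambient_temp"},
    {"facility": "FAC-02", "equipment_class": "GateValve",       "condition": "normal"},
    {"facility": "FAC-03", "equipment_class": "CentrifugalPump", "condition": "high_ambient_temp"},
]

def facilities_at_risk(events, equipment_class, condition):
    """Which facilities have seen this equipment class fail under the same circumstances?"""
    return sorted({e["facility"] for e in events
                   if e["equipment_class"] == equipment_class
                   and e["condition"] == condition})

print(facilities_at_risk(failure_events, "CentrifugalPump", "high_ambient_temp"))
# -> ['FAC-01', 'FAC-02', 'FAC-03']
```

Without the shared model, the same analysis would require mapping each site's local naming conventions first, which is exactly the work the information model does once, up front.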
​
​In my view, any facility owner embarking on a “digital twin” journey should pay great attention to the information model behind the “digital twin”, and devise strategies for how benefits can be utilized across a portfolio of twins as well as within a single twin.
​

After all, it does not make sense to make the same mistakes at every facility, when the knowledge to prevent them is there.
 
Bjorn Fidjeland

The two facility images used in this post are by Narmada Gharat and purchased at Dreamstime.com.

Opportunities and strategies - Product Configuration Lifecycle Management

4/25/2021


 

This time, an article aimed at the more traditional Product Lifecycle Management domain, and especially at configurable products, so-called Configure To Order (CTO) products. This article is a direct result of discussions I’ve had with Henrik Hulgaard, the CTO of Configit, on Configuration Management in general and Product Configuration Management in particular. Configit specializes in Product Configuration Management or, as they prefer to call it, Configuration Lifecycle Management.
 
Most businesses that design, manufacture and sell products have a system landscape in place to support key areas during the lifecycle of a product pretty much as in the image below (there are of course differences from company to company).

​
Figure 1.

This works well as long as the product lifecycle is linear, as it has mostly been in the past. However, as more and more companies strive to let customers “personalize” their products (that is, configure them to their individual needs), to harvest usage and behavior data from “the field” through sensors to detect trends, and to offer new services while the product is in use (operational), the lifecycle can, in my view, no longer be linear. This is because all phases of the lifecycle need feedback and information from the other phases to some degree. You may call this “a digital thread”, “digital twin” or “digital continuity” if you will (figure 2).
Figure 2.

Such a shift puts enormous requirements on traceability and change management of data all the way from how the product was designed, through to how it is used, how it is serviced and ultimately how it is recycled. If the product is highly configurable, the number of variants of the product that can be sold and used is downright staggering.
Needless to say, it will be difficult to offer a customer good service if you do not know what variant of the product the customer has purchased, and how that particular instance of the product has been maintained or upgraded in the past.
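As a hypothetical sketch of what that traceability minimally requires: an "as-sold / as-maintained" record tying a serial number to the exact variant configuration and its service history. The class and field names below are invented for illustration, not taken from any particular system.

```python
from dataclasses import dataclass, field

# Hypothetical "as-sold / as-maintained" record: without something like this,
# service cannot know which variant the customer actually owns today.
@dataclass
class ProductInstance:
    serial_number: str
    options: dict                                   # variant options selected at sale
    service_history: list = field(default_factory=list)

    def upgrade(self, option: str, new_value: str, date: str) -> None:
        """Record an upgrade so the current configuration stays traceable."""
        self.service_history.append((date, option, self.options.get(option), new_value))
        self.options[option] = new_value

# Sold with one configuration, upgraded in the field later:
unit = ProductInstance("SN-4711", {"motor": "M200", "firmware": "1.0"})
unit.upgrade("firmware", "2.1", "2021-03-01")
# Service now knows both what was sold and what is installed today.
```

The record is trivial on its own; the hard part, as discussed above, is deciding which system domain owns and maintains it through the lifecycle.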
 
So, what can a company do to address these challenges and also the vast opportunities that such feedback loops offer?

If we consider the three system domains that are normally present (there are often more), they are more often than not quite siloed. That is, in my experience, not because the systems cannot be integrated, but rather a result of organizations still working in a quite silo-oriented way (Figure 3).

​


Figure 3.

All companies I’ve worked with want to break down these silos and internally become more transparent and agile, but which domain should take on the responsibility to manage the different aspects of product configuration data? I mean, there is the design & engineering aspect, the procurement aspect, the manufacturing aspect, the sales aspect, the usage/operation aspect, the service/maintenance aspect and ultimately the recycling aspect.
 
Several PLM systems today have configuration management capabilities, and for many companies it would make sense to at least manage product engineering configurations here, but where do you stop? I mean, sooner or later you will have to evaluate whether more transaction-oriented data should be incorporated in the PLM platform, which is not a PLM system’s strong point (figure 4).
Figure 4.

On the other hand, several ERP systems also offer forms of configuration management, either as an add-on or as part of their core offering. The same question needs to be answered here: where does it make most sense to stop, given that ERP systems are transaction-oriented, while PLM systems are much more process- and iteration-oriented (figure 5)?


Figure 5.

The same questions need to be asked and answered for the scenario regarding CRM. Where does it make sense to draw the boundaries towards ERP or PLM, as in figure 6?

​
Figure 6.

I have seen examples of companies wanting to address all aspects with a single software vendor’s portfolio, but in my experience this only masks the same questions within that portfolio of software solutions. Who does what, where, and when, and who is responsible for what type of data, is not tackled by using a single vendor’s software. Those are organizational and work-process questions, not software questions.
 
Another possible solution is to utilize what ERP, PLM and CRM systems are good at in their respective domains, and implement the adjoining business processes there. Full Product Configuration Management or Configuration Lifecycle Management needs aspects of data from all the other domains to effectively manage the full product configuration, so a more domain-specific Configuration Management platform could be introduced.


Figure 7.

Such a platform will have to be able to reconcile information from the other platforms and tie it together correctly, hence it would need a form of dictionary to do that. In addition, it needs to define or at least master the ruleset defining what information from PLM can go together with what information in ERP and CRM to form a valid product configuration that can legally be sold in the customer’s region.

As an example, consider: which product design variant that meets the customer requirements can be manufactured most cost-effectively, nearest the customer, with minimal use of resources, and still fulfill regulatory requirements in that customer’s country or region?
These are some of the questions that must be answered.
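A minimal sketch of what such a ruleset could look like, with invented rules and option names: each rule combines aspects mastered in different systems (an engineering option from PLM, a sales region from CRM, and so on) to decide whether a configuration is valid and sellable.

```python
# Hypothetical ruleset for a cross-cutting configuration platform. Each rule
# spans data mastered in different systems (PLM options, CRM sales region).
RULES = [
    ("High-power motor requires reinforced frame",
     lambda cfg: cfg.get("motor") != "M400" or cfg.get("frame") == "reinforced"),
    ("EU region requires CE-certified controller",
     lambda cfg: cfg.get("region") != "EU" or cfg.get("controller") == "CTRL-CE"),
]

def validate(cfg: dict) -> list:
    """Return the descriptions of all violated rules (empty list = valid configuration)."""
    return [desc for desc, rule_holds in RULES if not rule_holds(cfg)]

cfg = {"motor": "M400", "frame": "standard", "region": "EU", "controller": "CTRL-CE"}
print(validate(cfg))  # the motor/frame rule is violated for this configuration
```

Real configuration engines use constraint solvers rather than flat rule lists, precisely because the number of variants is staggering; the sketch only shows where the rules sit, not how they scale.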

More strategic reasons to evaluate a setup like in figure 7 could be:
  • As the departmental silos in an organization are often closely linked to the software platform domains, it might be easier to ensure collaboration and acceptance by key stakeholders across the organization with a “cross-cutting” platform that thrives on quality information supplied by the other platforms.
  • It poses an opportunity for companies with a strategy of not putting too many eggs in the basket of one particular software system vendor.
  • It could foster quality control of information coming from each of the other domains, as such a CLM solution is utterly dependent on the quality of information from the other systems.
  • Disconnects in the information from the different aspects can be easily identified.

I would very much like to hear your thoughts on this subject.
​
Bjorn Fidjeland 


​The header image used in this post is by plmPartner
