
Who owns what data when…..?

7/7/2017


 
A vital question when looking at cross-departmental process optimization and integration is, in my view: who owns what data when in the overall process?
Usually this question will spark up quite a discussion between the process owners, company departments, data owners and the different enterprise architects. The main reason for this is that depending on where the stakeholders have their main investment, they tend to look at “their” part of the process as the most important and the “master” for their data.

Just think about sales with their product configurators, engineering with CAD/PLM, and supply chain, manufacturing & logistics with ERP and MES. Further along the lifecycle you encounter operations and service with EAM (Enterprise Asset Management) systems, sometimes including MRO (Maintenance, Repair and Operations/Overhaul) for products in operational use. Operations and service is really on the move right now due to the ability to receive valuable feedback from all products used in the field (commonly referred to as the Internet of Things), even for consumer products, but hold your horses on that last one just for a little while.
​
The different departments and process owners will typically have claimed ownership of their particular parts of the process, making it look something like this:
[Figure]
This would typically be a traditional, linear product engineering, manufacturing and distribution process. Each department has also selected IT tools that suit its particular needs in the process.
This in turn leads to information handovers both between company departments and between IT tools, and because of the complexity of IT system integration, usually as little data as possible is handed from one system to the next.
​
So far it has been quite straightforward to answer “who owns what data”, especially for the data that is actually created in the department’s own IT system. The tricky part is the when in “who owns what data when”, because the when implies that ownership of certain data is transferred from one department and/or IT system to the next one in the process. In a traditional linear process, such information would be “hurled over the wall” like this:
[Figure]
Now, since as little information as possible flowed from one department / IT system to the next, each department would make it work as best they could, creating or re-creating information in their own system for everything that did not come directly through the integration.
Only in cases where there were really big problems with missing or clearly faulty data would an initiative be launched to look at the process and any system integrations affected.

The end result is that the accumulated information throughout the process that can be associated with the end product, that is to say the physical product sold to the consumer, is only a fraction of the actual sum of information generated in the different departments’ processes and systems.
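To make the handover problem a bit more concrete, here is a minimal sketch in Python of how such a linear chain could be modeled. All department names, attributes and the integration mapping are hypothetical and only meant to illustrate the mechanism: ownership moves on at each handover, but only a mapped subset of the data travels with it.

```python
from dataclasses import dataclass, field

# Hypothetical, highly simplified model of a product record travelling
# through a linear process. Department names, attributes and the
# integration mapping below are illustrative only.

@dataclass
class ProductRecord:
    item_id: str
    owner: str                               # department that currently owns the data
    data: dict = field(default_factory=dict)

# Only these attributes survive each system-to-system integration;
# everything else must be re-created (or is simply lost) downstream.
INTEGRATION_MAP = {
    ("Engineering", "Manufacturing"): {"revision", "material"},
    ("Manufacturing", "Logistics"): {"weight"},
}

def hand_over(record: ProductRecord, to_department: str) -> ProductRecord:
    """Transfer ownership, passing along only the mapped subset of data."""
    keep = INTEGRATION_MAP.get((record.owner, to_department), set())
    passed = {k: v for k, v in record.data.items() if k in keep}
    return ProductRecord(record.item_id, to_department, passed)

eng = ProductRecord("P-100", "Engineering",
                    {"revision": "B", "material": "steel",
                     "cad_model": "asm-4711", "tolerance_analysis": "..."})
mfg = hand_over(eng, "Manufacturing")
print(mfg.owner, mfg.data)   # Manufacturing {'revision': 'B', 'material': 'steel'}
```

Each receiving department fills in the gaps in its own system, which is exactly why the data set that can be traced back to the delivered product ends up being only a fraction of what was actually created along the way.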
​
Now what happens when operations & service gets more and more detailed information from each individual product in the field, and starts feeding that information back to the various departments and systems in the process?
[Figure]
The process will cease to be a linear one; it becomes circular, with constant feedback of analyzed information flowing back to the different departments and IT systems.

Well, what’s the problem, you might ask?

The first thing that becomes clear is that each department, with its systems, does not have enough information to make effective use of everything coming back from operations, because each of them holds quite a limited set of data, concerning mainly its own discipline.

Secondly, the feedback loop is potentially constant or near real-time, which opens up completely new service offerings. However, the current process and infrastructure going from design through engineering and manufacturing was never built to handle this kind of speed and agility.

Ironically, from a Product Lifecycle Management perspective, we’ve been talking about breaking down information and departmental silos in companies to utilize the L in PLM for as long as I can remember. The way it looks now, however, it is probably going to be operations and the enablement of the Internet of Things and Big Data analytics that will force companies to go from strictly linear to circular processes.

And when you ultimately do, please always ask yourself “who should own what data when”, because ownership of data is not synonymous with the creation of data. Ownership is transferred along the process and accumulates into a full data set for the physically manufactured product, until it is handed back again as a result of a fault in the product or possible optimization opportunities for the product.

 – And it will happen faster and faster
​
Bjorn Fidjeland


The header image used in this post is by Bacho12345 and purchased at dreamstime.com

Linking vision with strategy and implementation, but prepare for disruption

6/24/2016


 
Over the years I’ve seen the importance of defining a vision for where one would like to go.




​Such a vision should serve as a guiding star.
This is true for full corporate visions as well as for smaller areas of the enterprise.

It could be for our product lifecycle management and the internet of things, how we see ourselves benefitting from big data, or harnessing virtual design and construction.
The strategy is our plan for how we intend to get to the promised land of our vision. It should be detailed enough to make clear what kind of steps we need to take to reach our vision, but beware:

Don’t make it too detailed; the first casualty of any battle is the battle plan.
Prepare, but allow for adjustments as maturity and experience grow.

An important part of the strategy work is to acid test the vision itself. Does it make sense? Will it benefit us as a company, and if so, HOW?
During the strategy work, all stakeholders should be consulted. And yes, both internal and external stakeholders. The greatest vision in the world with a good strategy and top-notch internal implementation won’t help much if you have external dependencies that cannot deliver what you need according to your new way of working…
Business processes will simply come grinding to a halt.
The implementation rarely takes the same path as the strategy intended. There are usually many reasons for this.
My best advice regarding implementation is to take it one step at a time and evaluate each step. Try to deliver some real business value in each step, and align the result with the vision and strategy. Such an approach will give you the possibility to respond to changing business needs as well as external threats.

But as we come closer to the promised land of our vision, we can sense the feel of it and can almost touch it… Then, disruption happens…

Such a disruption might present itself in the form of a merger or acquisition where vision, strategy and also implementation must be re-evaluated.
The “new” company might have entered another industry where strategy and processes need to be different and old technical solutions won’t fit.

Today everybody seems to be talking about disruptive technology and/or disruptive companies. Most companies acknowledge that there is a real risk that it will happen, but the problem is that, since most disruptive companies haven’t even been formed yet, it is difficult to identify where the attack will come from and in what form.
​
This leads back to vision, strategy and implementation. Vision and strategy can be changed quickly, but unless the implementation is flexible enough and able to respond to changing needs, the company will be unable to respond quickly enough to meet the disruption.

Bjorn Fidjeland

The header image used in this post is by Skypixel and purchased at dreamstime.com

PLM platforms, the difficult organizational rollout

3/6/2016


 
What is PLM really about? In my view it is about tying relevant information to business processes (you know, the stuff that makes your company truly unique), and then tying your employees to those very same processes throughout the life of a product.

So it’s about information, processes, people and an IT platform, in this case a PLM platform.
​

To be successful, ALL areas must intersect.



It does not matter if you have the perfect PLM system with perfectly defined processes if the information you need to manage is bad.

Nor will it help to have good quality data with perfectly defined processes and an organization ready to adopt them if the PLM platform is unable to scale to your needs.

And it will not help to have good quality data tied to perfectly defined processes and a state-of-the-art PLM system if nobody is using it…

So going back to the headline: PLM platforms, the difficult organizational rollout.
I’ve seen far too many PLM implementations underperform due to unsuccessful rollout in the organization.
I find it strange that although the projects are often run iteratively to develop or customize smaller chunks of functionality in each iteration to ensure success, one expects the end users to devour the full elephant of the project in more or less one big bite…

In my view the rollout of such a large and business-critical platform should also be iterative, with time for the end users to come to terms with what they have learned in each iteration before the next iteration starts.
I would compare it to building a house.
You would never start erecting the walls before the concrete slab is sufficiently cured.
The same is true for an organization. If more functionality and new processes are put on top before the previously learned functionality and processes have had time to settle, you get resistance, and the foundation becomes weak.

Another important factor: do not just train the end users in a classroom environment and then expect them to perform well in their new system… Because they won’t.
They’re still afraid of doing something wrong, and they will struggle to remember what they learned in the classroom.
Then they will try to find solutions in the manuals, growing more and more frustrated by the minute.

If this frustration is allowed to continue for too long, you can be sure that the end result is that they feel the system is too difficult to use and basically sucks. It might sound childish, but holding hands works! Have some super users or trainers available in the everyday work situation to help and guide the users for the first few weeks.
That will mitigate the fear of doing something wrong, and steadily build confidence and ability.

​Bjorn Fidjeland

Challenges when going from entrepreneur to industrialized manufacturer

1/3/2016


 
In my neck of the woods there are a lot of very talented engineers, and a lot of entrepreneurial spirit. My region (the south-western part of Norway) is heavily exposed to the oil & gas industry and the delivery of products to plants, oil platforms etc. This means that it is very project focused and ETO (Engineer To Order) intensive.

The entrepreneurial spirit I mentioned has led to a whole host of startups with good ideas for how to solve some problem with a new product in better and more cost-effective ways.

One story I keep hearing from such product companies, not only in Norway but also in other countries and project-intensive industries, goes something like this:
So we won our first contract and the customer is really impressed with our product and our technology. It became a bit more expensive to deliver the project than we thought, but we managed and we were sure we would have better returns on the next project.
​
As time progresses we expect our cost in the projects to drop significantly.

​
​But what happens in a lot of such product companies?
There is nothing strange in expecting such a development. One would instinctively think that one would be able to shorten the project execution time as one gains experience and has successfully delivered such a product before. The organization knows which suppliers can deliver and which cannot. Engineers and employees in installation and commissioning are becoming more and more experienced, etc.
In reality, though, it becomes very, very hard to drive down the cost of delivering projects, even if the product delivered from project to project is very similar. The transition from entrepreneur to industrialized manufacturer becomes hard for a lot of these companies.
Why is that?

Personally, I think there are several factors:
  • It was too hard to say no to those small, insignificant changes that the client required in the next project… changes that for engineering or manufacturing turned out to be not so insignificant.
  • Product development is constantly being performed in the projects. Engineers will always search for the perfect and most elegant solution. That does not mean that it is the best or most cost-effective way to manufacture the product.
  • Clients’ or operators’ documentation requirements in terms of LCI (LifeCycle Information) deliveries. If the product company is unable to define a process to deal with shifting requirements from operator to operator, this becomes a manual nightmare that constantly diverts resources. Such a process should be an integrated part of the project execution process, and not, as it mostly is today, a separate process.
  • It is in my view paramount that at least smaller parts of the product are standardized and modularized in such a way that the engineering information can be re-used from project to project (you can read more about my views here: “Engineering Master Data - Why is it different?” and “Can PLM help industrializing Oil & Gas projects?”)
  • Last but not least, there is a screaming need to manage project-specific engineering data (tag structures, P&IDs, D&IDs, electrical) together with, but NOT in a one-to-one relation with, more generic product development data (see the sketch after this list).
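As a small illustration of that last bullet, here is a hypothetical sketch in Python (the class and field names are mine, not those of any particular PLM platform) of keeping project-specific tag data and generic product development data as separate but linked structures rather than forcing a one-to-one merge:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ProductItem:
    """Generic, re-usable product development data (the article/part itself)."""
    part_number: str
    revision: str
    description: str

@dataclass
class Tag:
    """Project-specific engineering data, e.g. a functional location from a P&ID."""
    tag_number: str
    project: str
    realized_by: Optional[ProductItem] = None   # a reference, not a one-to-one merge

pressure_tx = ProductItem("PT-4711", "C", "Pressure transmitter, 0-400 bar")

# Several tags in the same project can be realized by the same generic item,
# and that item can be re-used unchanged in the next project.
tags: List[Tag] = [
    Tag("20-PT-1001", "Project-A", pressure_tx),
    Tag("20-PT-1002", "Project-A", pressure_tx),
]
```

The generic item can then evolve at its own pace, while each project keeps its own tag structures and requirements.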

I’ve seen the last three bullets addressed with PLM platforms at various companies; however, the technology itself is just one factor. The organizational processes and how they are enforced in the platform are of far bigger importance.

The first two bullets are a lot harder, as they require a shift in mindset from entrepreneur to industrialized manufacturer across the organization. This includes going from quick and nimble to more standardized processes, and continuous process improvement. If you consider PLM as a mindset rather than just a technology, you will also harvest benefits here, but it is hard work.

Bjorn Fidjeland





PLM and disconnected corporate processes

7/19/2015


 
How many times have you seen flashy corporate “blue books” with impressive process maps of how an organization does its business and performs its work?

I’ve seen quite a few. It’s not that I have anything against them, it’s just… Is this really how the organization works? Are the corporate processes updated to actually reflect how work is performed, or are the processes really enforced in the organization? Are the processes adjusted based on feedback from the project organizations? And last but not least, are projects measured against the processes?
In my experience this very rarely happens, if at all.

I have, however, seen one company take drastic measures to bridge the gap between the corporate processes and how the organization actually worked. This company decided to visualize their corporate processes in a PLM platform for use across all the countries they were involved in. Due to regulatory differences between countries they managed variants of the processes in each country as well.
So how is this different from any other process map, you might ask? Well, they not only visualized the processes, they also instantiated them, so when a particular project was to be executed, the project manager would select the appropriate process and get an instantiated project with a template WBS (Work Breakdown Structure), together with all the document deliverables expected for such a project, related to tasks and milestones.
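A rough sketch of what such instantiation could look like is shown below (in Python, with entirely hypothetical class and template names; a real PLM platform obviously models this far more richly): a country-specific process template, with its WBS tasks and expected deliverables, is copied into a new project object.

```python
from dataclasses import dataclass, field
from typing import List
import copy

@dataclass
class Deliverable:
    name: str
    status: str = "Not started"

@dataclass
class WBSTask:
    name: str
    deliverables: List[Deliverable] = field(default_factory=list)

@dataclass
class ProcessTemplate:
    name: str                                  # e.g. a country-specific process variant
    tasks: List[WBSTask] = field(default_factory=list)

@dataclass
class Project:
    name: str
    country: str
    tasks: List[WBSTask] = field(default_factory=list)

def instantiate(template: ProcessTemplate, project_name: str, country: str) -> Project:
    """Copy the template WBS and its expected deliverables into a new project."""
    return Project(project_name, country, copy.deepcopy(template.tasks))

norway_variant = ProcessTemplate("Project execution (Norway)", [
    WBSTask("Detail engineering",
            [Deliverable("P&ID package"), Deliverable("Design review report")]),
    WBSTask("Commissioning",
            [Deliverable("Mechanical completion dossier")]),
])

project = instantiate(norway_variant, "Plant 42 upgrade", "Norway")
```

Because every project starts from the same template, progress on the same tasks and deliverables can later be compared and measured across projects, which is what made the dashboards described below possible.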

This way they forced the organization to follow the defined processes… As you might expect there was an outrage, because the instantiated processes were not at all how the organization worked. The project organization had, through experience, figured out where the corporate process did not work in the real world and had found ways to overcome the problems.

This was however the intention, because now there was a very clear feedback loop so that the processes could be adjusted to reflect how the organization actually worked. And since the processes were instantiated in real projects, they could now also be analyzed, measured and improved across the entire organization through dashboards in the PLM platform. This practice became a competitive advantage, and it also allowed processes to be verified and tested in one country before being rolled out in other countries.

I’ve also seen examples of what typically happens when there is only a loose coupling between the corporate processes and how projects are executed and measured. In one company they had very impressive process maps with clearly defined input, output, description of responsibility and activities to be performed within each process step. However, the processes were not really instantiated and measured in the projects.

The obvious question then became: “How is it possible to measure and improve the processes if they are not instantiated and measured in actual projects?”

Well, it worked as long as the company was fairly small and key persons managed to have an overview of most projects. Mostly the projects were executed based on experience and a functioning culture at the main site. The challenge became apparent when trying to replicate this at other sites where the culture was different. Those other sites tried to execute their projects solely by the process maps…

And that led to some very real problems.

In such cases it is of paramount importance to harvest all experience from such projects, analyze, validate and update the process maps accordingly.

So where am I going with this?

In my view companies should work hard on closing the gap between their corporate processes and how the organization actually performs its work. Creating feedback loops from the project organization is one way of doing it. Another way is to actually instantiate the processes for use in projects, thereby making sure that the processes are followed. If the latter approach is selected, it becomes very important to have an organization in place to collect feedback, analyze and adjust the processes as the company evolves and develops over time.


Some points to ponder
Bjorn Fidjeland
