plmPartner

PLM platforms, the difficult organizational rollout

3/6/2016

0 Comments

 
What is PLM really about? In my view, it is about tying relevant information to business processes (you know, the stuff that makes your company truly unique) and then tying your employees to those very same processes throughout the life of a product.

So it’s about information, processes, people and an IT platform, in this case a PLM platform.

To be successful, ALL areas must intersect.



It does not matter if you have the perfect PLM system with perfectly defined processes if the information you need to manage is bad.

Nor will it help to have good-quality data, perfectly defined processes and an organization ready to adopt them if the PLM platform is unable to scale to your needs.

And it will not help to have good-quality data tied to perfectly defined processes in a state-of-the-art PLM system if nobody is using it…

So, going back to the headline: PLM platforms, the difficult organizational rollout.
I’ve seen far too many PLM implementations underperform due to an unsuccessful rollout in the organization.
I find it strange that although these projects are often run iteratively, developing or customizing smaller chunks of functionality in each iteration to ensure success, the end users are expected to devour the full elephant of the project in more or less one big bite…

In my view, the rollout of such a large and business-critical platform should also be iterative, with time for the end users to come to terms with what they have learned after each iteration before the next one starts.
I would compare it to building a house: you would never start erecting the walls before the concrete slab is sufficiently cured.
The same is true for an organization. If more functionality and new processes are put on top before the previously learned functionality and processes have had time to settle, you get resistance, and the foundation becomes weak.

Another important factor: do not just train the end users in a classroom environment and then expect them to perform well in their new system… Because they won’t.
They are still afraid of doing something wrong, and they will struggle to remember what they learned in the classroom.
Then they will try to find solutions in the manuals, growing more and more frustrated by the minute.

If this frustration is allowed to continue for too long, you can be sure the end result is that they feel the system is too difficult to use and basically sucks. It might sound childish, but holding hands works! Have some super users or trainers available in the everyday work situation to help and guide the users during the first few weeks.
That will mitigate the fear of doing something wrong and steadily build confidence and ability.

​Bjorn Fidjeland

Customization – Upgradeability

10/24/2015

0 Comments

 
In one of my previous posts, “Customization – Do you fit in the box?”, I touched upon an important aspect of deciding to customize a software platform (the same principles apply whether it is a PLM platform or, for instance, a CRM platform).



How will my customization impact the upgradeability of the platform?

Why this matters becomes obvious when you want to upgrade the platform from an old release to a newer one. Most companies skip one or two releases in each release cycle from the vendor before upgrading, which makes it all the more important to perform an upgrade analysis beforehand.

If no customization is made, it becomes an analysis of:
  • What does our current data model look like, and what will it look like after the upgrade?
  • What does our data look like, and how will the modifications from the vendor impact our data set?


If customizations have been made, the analysis becomes a bit more complex:
  • What did the original data model of the software platform look like?
  • What does our current, customized data model look like?
  • What will the data model look like after the upgrade?
  • Are there any conflicts between our current customized data model and the data model as it will look after the upgrade?
  • Should some of our customizations be removed because the new release covers them?
  • What does our data look like, and how will our customizations and the modifications from the vendor impact our new data set?

In my view there are a few rules that should be followed to avoid too many problems with upgrades.


  • Try as much as possible to avoid changing the OOTB (Out Of The Box) data model itself; it is usually a lot safer to add to the OOTB data model and GUI.
  • Avoid making changes in the OOTB business logic itself. If you have to make changes, override the OOTB logic and create your own separately, BUT make sure to document such overrides so that you can switch back to OOTB. Remember that if you change the business logic directly, chances are those modifications will be overwritten by an upgrade.
  • If you build your own, completely customized data model with GUI and business logic, you are generally safe from an upgrade point of view, but you then run the risk of not being able to benefit from the software vendor’s new releases of the platform in the future if the vendor decides to incorporate the same kind of functionality in the OOTB platform. In such cases you will be faced with a migration project, not an upgrade…
  • One thing that always causes a hassle when upgrading is changes to the user interface (GUI). I would offer the same advice here: do not change the original user interface! Make a copy instead, implement the changes and override the original. If you change the original, your modifications will probably be overwritten during an upgrade. In addition, when performing an upgrade analysis you’ll have to perform a three-way comparison between the old OOTB, your customizations and the new OOTB to reveal the consequences of the changes.
  • If the software platform comes with a framework that lets you rapidly build a user interface, data model and associated business logic, this is often preferable, BUT make sure to always analyze what new functionality is provided in the new release. Can you remove some of your older customizations? If you have overridden or hidden the old OOTB user interface, you will not be able to see the new juicy stuff that came as a result of an upgrade.
  • Never upgrade the production environment without having tested extensively in a sandbox first (a copy of the production environment). Not even the best upgrade analysis in the world will find all the issues or problems in an upgrade.
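To make the three-way comparison concrete, the data-model part of such an analysis can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s API: each data model is reduced to a dict mapping type names to sets of attribute names, and the “Part”/“supplier_code” names in the example are invented.

```python
# Hypothetical sketch of a three-way upgrade analysis on the data model.
# Each model: dict mapping type name -> set of attribute names.

def upgrade_analysis(old_ootb, customized, new_ootb):
    """Compare old OOTB, customized, and new OOTB data models.

    Reports customizations the new release now covers out of the box
    (candidates for removal) and OOTB attributes the vendor has removed
    (existing data may still depend on them).
    """
    report = {"covered_by_new_release": {}, "removed_by_vendor": {}}
    for type_name, attrs in customized.items():
        custom = attrs - old_ootb.get(type_name, set())    # our additions
        covered = custom & new_ootb.get(type_name, set())  # vendor now has them
        if covered:
            report["covered_by_new_release"][type_name] = covered
    for type_name, attrs in old_ootb.items():
        removed = attrs - new_ootb.get(type_name, set())
        if removed:
            report["removed_by_vendor"][type_name] = removed
    return report

# Invented example: we added "supplier_code" to Part; the new release adds it OOTB.
old_ootb = {"Part": {"number", "name"}}
customized = {"Part": {"number", "name", "supplier_code"}}
new_ootb = {"Part": {"number", "name", "supplier_code", "weight"}}
report = upgrade_analysis(old_ootb, customized, new_ootb)
```

Here the analysis would flag “supplier_code” as covered by the new release, i.e. a customization worth retiring before the upgrade.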

Conclusion:
When dealing with software platforms you should always perform an upgrade analysis to determine how the upgrade will impact your installation. In my view, this should be done even if you have gone strictly OOTB. Such an analysis can help you weed out the worst of the problems, and should serve as a decision point for the upgrade project. Test your upgrade and procedures extensively in a sandbox environment before upgrading the production environment.

Some points to ponder
Bjorn Fidjeland

The image used in this post is by Dirk Ercken and purchased at dreamstime.com


PLM and disconnected corporate processes

7/19/2015

2 Comments

 
How many times have you seen flashy corporate “blue books” with impressive process maps of how an organization does its business and performs its work?

I’ve seen quite a few. It’s not that I have anything against them, it’s just… is this really how the organization works? Are the corporate processes updated to actually reflect how work is performed? Are the processes really enforced in the organization? Are they adjusted based on feedback from the project organizations? And last but not least, are projects measured against the processes?
In my experience this very rarely happens, if at all.

I have, however, seen one company take drastic measures to bridge the gap between the corporate processes and how the organization actually worked. This company decided to visualize its corporate processes in a PLM platform for use across all the countries it operated in. Due to regulatory differences between countries, it managed variants of the processes per country as well.
So how is this different from any other process map, you might ask? Well, they not only visualized the processes, they also instantiated them: when a particular project was to be executed, the project manager would select the appropriate process and get an instantiated project with a template WBS (Work Breakdown Structure), together with all the document deliverables expected for such a project, tied to tasks and milestones.

This way they forced the organization to follow the defined processes… As you might expect, there was an outrage, because the instantiated processes were not at all how the organization worked. The project organization had, through experience, figured out where the corporate process did not work in the real world and had found ways to overcome the problems. This was, however, the intention, because now there was a very clear feedback loop so that the processes could be adjusted to reflect how the organization actually worked. And since the processes were instantiated in real projects, they could now also be analyzed, measured and improved across the entire organization through dashboards in the PLM platform. This practice became a competitive advantage, and it also allowed processes to be verified and tested in one country before being rolled out in others.
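The instantiation mechanism described above can be sketched roughly like this. All class and field names are invented for illustration; a real PLM platform would of course manage templates, WBS elements and deliverables as its own managed objects.

```python
# Hypothetical sketch of instantiating a corporate process template into a
# concrete project. Class and field names are invented.
from dataclasses import dataclass, field

@dataclass
class TaskTemplate:
    name: str
    deliverables: list          # expected document deliverables for this task

@dataclass
class ProcessTemplate:
    name: str
    country: str                # regulatory variant of the process
    tasks: list

@dataclass
class Project:
    name: str
    process: str
    wbs: list = field(default_factory=list)

def instantiate(template, project_name):
    """Create a project whose WBS mirrors the template, with one
    placeholder record per expected document deliverable."""
    project = Project(name=project_name, process=template.name)
    for task in template.tasks:
        project.wbs.append({
            "task": task.name,
            "deliverables": [{"title": d, "status": "expected"}
                             for d in task.deliverables],
        })
    return project

# Invented example: a Norwegian variant of a detail design process.
template = ProcessTemplate("Detail Design", "NO",
                           [TaskTemplate("Concept review", ["Design basis"])])
project = instantiate(template, "Plant 42")
```

Because every running project is an instance of a template, the templates themselves become measurable: dashboards can aggregate over all instances of a given process.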

I’ve also seen examples of what typically happens when there is only a loose coupling between the corporate processes and how projects are executed and measured. One company had very impressive process maps with clearly defined inputs, outputs, and descriptions of the responsibilities and activities to be performed within each process step. However, the processes were not really instantiated or measured in the projects.

The obvious question then became: “How is it possible to measure and improve the processes if they are not instantiated and measured in actual projects?”

Well, it worked as long as the company was fairly small and key persons managed to keep an overview of most projects. Mostly, projects were executed based on experience and a functioning culture at the main site. The challenge became apparent when trying to replicate this at other sites where the culture was different. Those other sites tried to execute their projects solely by the process maps…

And that led to some very real problems.

In such cases it is of paramount importance to harvest all the experience from such projects, analyze it, validate it and update the process maps accordingly.

So where am I going with this?

In my view, companies should work hard on closing the gap between their corporate processes and how the organization actually performs its work. Creating feedback loops from the project organization is one way of doing it. Another is to actually instantiate the processes for use in projects, thereby making sure that the processes are followed. If the latter approach is selected, it becomes very important to have an organization in place to collect feedback, analyze it and adjust the processes as the company evolves and develops over time.


Some points to ponder
Bjorn Fidjeland

From digital archive to intelligent data

6/7/2015

1 Comment

 
A lot of companies these days are working hard to turn their big digital archives into more intelligent data. These initiatives usually come from some kind of digitalization strategy that has been formed to support a vision.

We see it every day: data is power. Data can be analyzed and used in different contexts to support end customers, to sell new services or to support internal processes in the company.

However, for this to happen it is not enough to simply store and manage data in digital format. The data must be “connected”, stored in object or information structures that represent the data used in different contexts. Coming from a PLM background, some of the aspects are quite easy to identify. From a product perspective you’ve got a requirements breakdown structure, maybe a model configuration or variant structure, an engineering bill of material that represents the design intent, and supporting CAD structures from various design tools. All of the structures mentioned are managed today as digital information, but very few companies have structured the information and put it all in the context of the other information structures to achieve full traceability and change-consequence control.

Note that I’ve so far only touched the product design aspect. When you also consider the manufacturing intent (the manufacturing bill of material), the manufactured product, the sold product and the installed product, the complexity grows, but so do the benefits of managing it all as connected data structures stored in the context of each other. This data can be used to sell services to the end customers.

An example could be a pump manufacturer with full traceability on all pumps sold to different facilities. The manufacturer could offer maintenance services for the pumps, and if the pumps contain sensors, it could also analyze operational data to schedule preventive maintenance. This data could then serve as valuable input to the design of new and even better pumps. And because all the data structures are connected, the manufacturer knows the location of every pump sold and can offer the new and improved model not just to all customers, but to all customers at all their locations.
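Once the installed base is connected data, the “all customers at all locations” question becomes a simple query. A minimal sketch, with invented records and field names:

```python
# Hypothetical sketch: an installed-base lookup over connected data.
# Records and field names are invented for illustration.

installed_base = [
    {"serial": "P-001", "model": "PX10", "customer": "Acme", "location": "Ship A"},
    {"serial": "P-002", "model": "PX10", "customer": "Acme", "location": "Ship B"},
    {"serial": "P-003", "model": "PX20", "customer": "Beta", "location": "Plant 1"},
]

def customers_and_locations(model):
    """Every (customer, location) pair where a given pump model is installed:
    the query that lets the manufacturer target an improved model at every
    affected site, not just every customer."""
    return sorted({(r["customer"], r["location"])
                   for r in installed_base if r["model"] == model})
```

With disconnected archives, answering the same question means manual searching and interpretation; with connected structures it is one traversal.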

All of a sudden we are touching one of the biggest buzzwords of our day, “the Internet of Things”. What would happen if a large portion of the pumps were installed on ships and contained sensors? The pump manufacturer could set up maintenance offices in large ports. Knowing exactly which pumps will arrive in which ports, at what times, and what maintenance they need would allow the manufacturer or service provider to order the right spare parts just in time and to reduce the maintenance time. This would minimize the risk of fines from the ship owners because a ship had to stay in port longer than scheduled, or worse, of having service personnel perform the service at sea and thereby leaving the service office in the port severely undermanned.

This is only one example of the power of “connected data”, or digitalization. Quite a few companies have business models similar to our pump manufacturer’s, but very few are able to offer such services based on “connected data”. Instead there is a lot of manual work, interpretation and searching for data in different digital archives. This in turn leads to errors, misunderstandings and lost business opportunities.

 Some points to ponder

Bjorn Fidjeland


All images used in this post are purchased at dreamstime.com


VDC, is that like PLM for construction industry?

5/17/2015

2 Comments

 
VDC (Virtual Design & Construction) is a hot topic among construction companies these days, and a quick look at Wikipedia will tell you that “(VDC) is the management of integrated multi-disciplinary performance models of design-construction projects, including the product (i.e., facilities), work processes and organization of the design - construction – operation…”. So it is about harvesting data from multiple disciplines in a project, consolidating it and managing it in order to capitalize on it in processes during the project and through handover to operation. It is very closely dependent on BIM (Building Information Modelling).

However, the initiatives I’ve seen also concentrate a lot on 5D: the consolidation of the design data from multiple disciplines into a virtual 3D representation (BIM), with the added dimensions of time and cost. This is done both for estimation purposes and for project execution purposes, and it of course makes a lot of sense.

So where does PLM fit into this picture?


PLM is also about consolidating design data from multiple disciplines, connecting that data to business processes and people to those processes. The difference, however, is that most PLM platforms are created to support multiple projects within the same database, whereas the VDC tools I’ve seen work project by project, database by database, just like plant design solutions. This has the drawback that it is very difficult to harvest knowledge or re-use data across projects.

So do I mean that PLM platforms are better suited for the job than today’s VDC tools? Not necessarily. The VDC tools I’ve seen have very strong integrations with the authoring tools, which allows them to create a full virtual 3D model. This is partly due to the industry’s standardized format for information exchange, IFC (Industry Foundation Classes), and partly due to very good point integrations. They are also very strong in the domain almost overlapping ERP, the cost and estimation aspects, and they naturally need to integrate with project planning tools in order to get the time dimension.

Now, that sounds very good, so what’s the problem? Well, the problem is that the data management aspect is largely missing. This includes integrated dashboard analysis across projects, change management, document management, and revision and version control on object structures. The things PLM platforms do well.
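As a rough illustration of that last point, revision control on an object structure boils down to keeping every revision retrievable while the object evolves. A minimal, hypothetical sketch (names invented):

```python
# Hypothetical sketch of revision control on an object, one of the
# data-management capabilities discussed above. Names are invented.

class RevisionedObject:
    """An object whose payload evolves through numbered revisions,
    with every earlier revision still retrievable."""

    def __init__(self, name, payload):
        self.name = name
        self.revisions = [payload]      # revision 1 is index 0

    @property
    def current(self):
        return self.revisions[-1]

    def revise(self, payload):
        """Append a new revision and return its revision number."""
        self.revisions.append(payload)
        return len(self.revisions)

# Invented example: a wall object revised from 200 mm to 250 mm thickness.
wall = RevisionedObject("Wall-17", {"thickness_mm": 200})
new_rev = wall.revise({"thickness_mm": 250})
```

A PLM platform does this for whole structures of such objects, which is exactly what makes cross-project change management and traceability possible.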

A few years ago I was responsible for developing a solution for the construction industry on top of a PLM platform, and after a few visits to different BIM centers of excellence I got the impression that the BIM model was important, but that one of the headaches for these companies was keeping track of, and controlling, all the other information in a project that was also specific to the project and to the “data structures” in the BIM model. This included thousands of documents…

So either VDC software companies should implement some of the things that PLM platforms are good at, or PLM software companies must implement some of the things VDC software does very well, if they wish to expand their footprint in the construction industry.

However, there is a third path as well. Maybe a VDC software company could collaborate with a PLM software company to create a killer platform for the construction industry…

Now wouldn’t that be something?

Some points to ponder
Bjorn Fidjeland


Resistance to change, a PLM tale from Oil & Gas

4/21/2015

10 Comments

 
Recently I've been reading quite a lot about resistance to change, both on a personal level and an organizational level.
Individuals don't like the unknown; it might pose a risk, and therefore they resist change.
The same principle seems to apply to organizations as well.
Nobody would deny that the world is changing, however. So what is driving all the changes around us?
In my view there have to be some external incentives to change: either the changes are about making life more comfortable, or they are a necessity to meet some external threat.

15 years ago I was part of an initiative called VisiWorld. The vision was to use a PLM platform to manage all data regarding a facility from early planning through engineering, construction, operations and maintenance. Our virtual reality technology was used to visualize the vast amount of data through all lifecycle stages. The idea was that VisiWorld would integrate with a number of engineering and planning tools to create an information backbone where technical information could be consolidated, visualized and managed in a 3D world as well as in structured 2D object structures.
To ensure full interoperability, the data model and integrations were based on POSC/CAESAR (later merged into ISO 15926).

The main objective was to reduce cost by making information consolidation and traceability easier, and to eliminate all the costly information handovers between EPC (Engineering, Procurement and Construction) companies, product companies and ultimately the owner/operator in capital projects. This would be achieved by working in a fully integrated environment.

So what happened? VisiWorld never took off.

I think our project manager said it best when asked by a reporter, just months before we went under. The reporter asked: “What if this product does not become a success? What would the inscription on its tombstone be?” Our project manager thought for a while before responding: “Too much, too early”.

That was of course a sad ending for VisiWorld, but not for this story.
I moved on to work with PLM in many different industries, but could never really forget the concepts and ideas.

10 years later I again found myself building a “Plant/Facility Lifecycle Management” solution on top of a PLM system. It was nowhere near as ambitious as VisiWorld, but most of the concepts were there. This time the visualization engine was based on XMpLant and ISO 15926. What I found really interesting was that the ideas from 10 years before were still considered innovative and new by the industry. The oil & gas companies we approached with this solution were interested and agreed that this was the necessary way forward, but I still got the feeling that it was “too much, too early”.

The incentives to change were just not strong enough.

Now, 15 years after our initial facility lifecycle management project, I find myself having helped two organizations implement some of these thoughts and concepts. Ironically, neither is in the oil & gas industry.

Why is that?
Change happens very slowly unless there are very strong incentives, and those incentives have not been there in the oil & gas industry. However, they could very well be coming now, if the oil price stays as low as it currently is, or goes even lower, for years to come…
Other industries have been forced to change the way they act and operate due to shrinking margins and competition from others.
And when I say shrinking margins, I mean margins shrinking so much that the company must either find a better way of managing projects and products, or die.

Some points to ponder

Bjorn Fidjeland


Advanced Control – Drilling & Well AS has copyright to the images used in this post, but has graciously allowed me to use them.




The journey of PLM vs the journey of PDM

3/28/2015

0 Comments

 
Inspired by the blog post “Is PLM a Journey? Follow (or Join!) the Blogfight!” in the “The PLM Dojo” group on LinkedIn, I started thinking about the topic of PLM as a journey. In my previous company I wrote the post “PLM – Tool or Mindset”, and a PLM implementation is, in my view, a journey pretty much as Jos Voskuil describes it. If, and there is an if, you think about the scope of PLM (Product Lifecycle Management), then it is crucial to have a vision, a strategy and a clear commitment from the business in order to be able to execute. This is because the initiative involves several different departments, and in bigger organizations also multiple sites on different continents. Such a project becomes a journey, because in order to eat that particular elephant it is very important to do it one bite at a time, and between bites, the business and the organization need to digest and mature.
This is where organizational rollout and communication come in as very important factors. IT must in this case work very closely with the business to deliver functionality after each bite has been swallowed… A lot of eating here, but I think the analogy is a good one.


I’ve been fortunate enough to be part of a few such PLM projects, but in the beginning, I must say, it was the proverbial catfight between business and IT.
But as healthy group processes were promoted and everybody tried to see things from the other party’s perspective, they became ONE team with ONE goal.

This could happen because there were external personnel present to mediate and translate the language of business to IT and vice versa.

After a few months it was impossible to tell who was business and who was IT!
I would describe this process as an important part of the PLM journey. Then, as the business gets more and more of its processes implemented in the PLM system and the solution matures, more and more bites can be introduced and devoured.

So what about PDM (Product Data Management)?
Well, PDM is where PLM originally came from, and it primarily addresses the needs of product engineering and design. It is mostly a one-department effort, although such projects can also span multiple sites on different continents.

However, if we face the facts, most PLM projects today are STILL about implementing PDM functionality in a full-blown PLM platform.

Why is that?
Well, in my view it is because:
a. It started as an engineering or IT effort without an appropriate business vision and strategy.
b. The PLM project, with the business involved and a strategy developed, got constipated because it bit off more than it could chew.

If the PDM project started as an IT- or engineering-department-only effort, then there seems to be a glass ceiling preventing the project from getting acceptance from the business to grow the scope into a full PLM implementation.
I cannot really explain why, so feel free to comment!
My hunch, though, is that it has to do with a certain “not invented here” syndrome…
But I seriously doubt that the business would say that out loud…

Some points to ponder
Bjorn Fidjeland

