
PLM Benchmark – Operator 1: What did they do and why?

3/9/2018

This is the first in a series of articles where I share some experiences with you from different product companies, EPCs and operators.
The articles will cover their motivation for doing as they did, and where they put their main focus in order to achieve their goals.

The experiences span almost 20 years… I would like you to reflect a bit on that and keep in mind some of today’s buzzwords, especially digital twin, IoT and Big Data analytics.
In this series I will use my information structure map, or the “circle of life” as a client jokingly called it, to explain where the different companies put their focus in terms of information management strategy and why.

An overview explaining the different structures can be found in the article:
Plant Information Management - Information Structures, and further details regarding each information structure are discussed in:
Plant Engineering meets Product Engineering in capital projects
Handover to logistics and supply chain in capital projects
Plant Information Management - Installation and Commissioning
Plant Information Management – Operations and Maintenance
​
Operator 1’s first objective was to shorten the project execution time from design through installation and commissioning by letting the project’s information model be gradually built up through all project phases, and by all stakeholders, in one common platform.
By doing it this way there would be no handover of documentation, but rather a handover of access to and responsibility for data. A large focus was put on standardizing information exchange, both between the stakeholders in the capital projects and between computer systems. The entry point to all information was a 3D representation of the data structures!

Makes you think of digital twin… However, this initiative came before anybody had heard of the term. The 3D representation was NOT a design model, but rather a three-dimensional representation of the asset linked to all the information structures, creating different dimensions, or information layers if you will.

So the assets this operator was dealing with had to be quite small, you might think?

Actually no, one of the assets managed comprised about a million tags. Concepts from the gaming industry, like Level of Detail and back-face culling, were used to achieve the performance needed on the 3D side.
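
To make the Level of Detail idea concrete, here is a minimal sketch of how such a scheme can pick which representation of a tagged object to render based on camera distance. It is purely illustrative: the class name, thresholds and mesh references are my own assumptions, not the operator's actual implementation.

# Minimal Level of Detail (LOD) sketch: choose which 3D representation of a
# tagged object to stream, based on its distance from the camera.
from dataclasses import dataclass
import math

@dataclass
class TaggedObject:
    tag: str
    position: tuple      # (x, y, z) in plant coordinates
    lod_models: dict     # LOD level -> mesh reference

def select_lod(obj: TaggedObject, camera_pos: tuple, thresholds=(25.0, 100.0)) -> str:
    """Return the mesh reference to render for this object."""
    distance = math.dist(obj.position, camera_pos)
    if distance < thresholds[0]:
        level = 0        # close up: full detail
    elif distance < thresholds[1]:
        level = 1        # mid range: simplified mesh
    else:
        level = 2        # far away: bounding box only
    return obj.lod_models[level]

pump = TaggedObject("=AB.ACC01.IS01.VS04.EP03", (120.0, 40.0, 8.0),
                    {0: "pump_full.mesh", 1: "pump_simplified.mesh", 2: "pump_bbox.mesh"})
print(select_lod(pump, camera_pos=(118.0, 42.0, 8.0)))   # -> pump_full.mesh
print(select_lod(pump, camera_pos=(500.0, 400.0, 8.0)))  # -> pump_bbox.mesh
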
So why this enormous effort by an operator to streamline just the initial stages of an asset’s lifecycle?
I mean, the operator’s real benefit comes from operating the asset in order to produce whatever it needs to produce, right?
​
Because it was seen as a prerequisite to capitalize on plant information in training, simulation, operations, maintenance and decommissioning. Two words summarize the motivation: maximum up-time. How to achieve it: operational run-time data from sensors linked to, and compared with, accurate and parametric as-designed, as-built and as-maintained data.
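
As a simple illustration of that principle (and nothing more), the sketch below compares a run-time sensor reading against as-designed limits held on the tag it is linked to. The tag attributes, limits and readings are hypothetical.

# Sketch of the principle only: compare an operational sensor reading against
# the as-designed limits recorded on the tag it is linked to.
design_limits = {
    # tag                          attribute -> (min, max) from as-designed data
    "=AB.ACC01.IS01.VS04.EP03": {"bearing_temp_C": (5.0, 85.0),
                                 "vibration_mm_s": (0.0, 7.1)},
}

def check_reading(tag: str, attribute: str, value: float) -> str:
    low, high = design_limits[tag][attribute]
    if low <= value <= high:
        return f"{tag} {attribute}={value}: within design envelope"
    return f"{tag} {attribute}={value}: OUTSIDE design envelope ({low}-{high}), flag for maintenance"

print(check_reading("=AB.ACC01.IS01.VS04.EP03", "bearing_temp_C", 78.2))
print(check_reading("=AB.ACC01.IS01.VS04.EP03", "vibration_mm_s", 9.4))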

​
Figure 2 shows which information structures the operator put most emphasis on. Quite naturally, the functional structure (tag structure and design requirements) and the corresponding physically installed asset information were highly important, and this is what they started with (see figure 3). Reference data, needed to compare and consolidate data from the different structures, was next in line, together with an extensive parts (article) catalog of what could be supplied by whom in different regions of the world.
There was an understanding that a highly document-oriented industry could not shift completely to structured data and information structures overnight for everything, so document management was also included as an intermediate step. The last type of structure they focused on was project execution structures (Work Breakdown Structures). This was not because it was regarded as less important; on the contrary, it was regarded as highly important, since it introduced the time dimension with traceability and control of who should do what, or did what, when. The reasoning was that since work breakdown structures tied into absolutely everything, they wanted to test and roll out the “base model” of data structures in the three-dimensional world (the 3D database) before introducing the fourth dimension.

​Bjorn Fidjeland

​
The header image used in this post is by Jacek Jędrzejowski and purchased at dreamstime.com

Big Data and PLM, what’s the connection?

1/3/2018

I was challenged the other day by a former colleague to explain the connection between Big Data and PLM. The connection might not be immediately apparent if your viewpoint is that of traditional Product Lifecycle Management systems, which primarily have to do with managing the design and engineering data of a product or plant/facility.

However, if we first take a look at a definition of Product Lifecycle Management from Wikipedia:

“In industry, product lifecycle management (PLM) is the process of managing the entire lifecycle of a product from inception, through engineering design and manufacture, to service and disposal of manufactured products. PLM integrates people, data, processes and business systems and provides a product information backbone for companies and their extended enterprise.”
​
Traditionally it has looked much like this
Then let’s look at a definition of Big Data
​

“Big data is data sets that are so voluminous and complex that traditional data processing application software are inadequate to deal with them. Big data challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating and information privacy. There are three dimensions to big data known as Volume, Variety and Velocity.
Lately, the term "big data" tends to refer to the use of predictive analytics, user behavior analytics, or certain other advanced data analytics methods that extract value from data, and seldom to a particular size of data set. "There is little doubt that the quantities of data now available are indeed large, but that’s not the most relevant characteristic of this new data ecosystem." Analysis of data sets can find new correlations to "spot business trends, prevent diseases, combat crime and so on.”

Included in Big Data you’ll find data sets harvested from sensors within all sorts of equipment and products as well as data fed back from software running within products. One can say that a portion of Big Data is the resulting feedback from the Internet of Things. Data in itself is not of any value whatsoever, but if the data can be analyzed to reveal meaning, trends or knowledge about how a product is used by different customer segments then it has tremendous value to product manufacturers.
If we take a look at the operational phase of a product, and by that, I mean everything that happens from manufactured product to disposal, then any manufacturer would like to get their hands on such data, either to improve the product itself or sell services associated with it. Such services could be anything from utilizing the product as a platform for an ecosystem of connected products to new business models where the product itself is not the key but rather the service it provides. You might sell guaranteed uptime or availability provided that the customer also buys into your service program for instance.

 
The resulting analysis of the data should in my view be managed by, or at least serve as input to, the product definition, because the knowledge gleaned from all the analytics of Big Data sets ultimately impacts the product definition itself: it should lead to revised product designs that fulfill customer needs better. It might also lead to the revelation that it would be better to split a product into two different designs aimed at two distinct end-user behavior categories found through analysis of data from the operational phase of the products.
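
To illustrate the kind of analysis meant here, a toy example with hypothetical field names, thresholds and data: usage feedback from the field is segmented into distinct behavior categories that could justify splitting the product line.

# Toy example: group usage data fed back from products in the field into
# distinct end-user behavior categories. Field names, thresholds and data
# are hypothetical.
from statistics import mean

usage_feedback = [
    {"serial": "AL11234-12-15", "run_hours_per_day": 22.5, "load_pct": 90},
    {"serial": "AL11234-12-16", "run_hours_per_day": 2.0,  "load_pct": 35},
    {"serial": "AL11234-12-17", "run_hours_per_day": 21.0, "load_pct": 85},
    {"serial": "AL11234-12-18", "run_hours_per_day": 3.5,  "load_pct": 40},
]

def segment(record, threshold_hours=12.0):
    """Very crude segmentation: continuous heavy duty vs. intermittent light duty."""
    return "continuous_heavy_duty" if record["run_hours_per_day"] >= threshold_hours else "intermittent_light_duty"

segments = {}
for record in usage_feedback:
    segments.setdefault(segment(record), []).append(record)

for name, records in segments.items():
    print(f"{name}: {len(records)} units, average load {mean(r['load_pct'] for r in records):.0f}% of rated")
# Two clearly different usage profiles -> a candidate for splitting the product
# into two designs, as discussed above.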
​
Connected products, Big Data and analysis will to a far greater extent than before allow us to do the following instead:
It will mean that experience throughout the full lifecycle can be made available to develop better products, tailor to new end user behavior trends and create new business models.

Note: the image above focuses on the feedback loops to product engineering, but such feedback loops should also be made available from for instance service and operation to manufacturing.

Most companies I work with tell me that the feedback loops described in the image above are either too poor or virtually nonexistent. Furthermore, they all say that such feedback loops are becoming vital for their survival, as more and more of their revenue comes from services after a product sale and not from the product sale itself. This means that it is imperative for them to have as much reliable and analyzed data as possible about their products’ performance in the field, how their customers are actually using them and how they are maintained.

For these companies at least, the connection between Big Data analysis and its impact on Product Lifecycle Management is becoming clearer and clearer.


Bjorn Fidjeland


The header image used in this post is by garrykillian and purchased at dreamstime.com


Plant Information Management – Operations and Maintenance

1/29/2017

This post is a continuation of the posts in the Plant Information Management series of:
“Plant Information Management - Installation and Commissioning”
“Handover to logistics and supply chain in capital projects”
“Plant Engineering meets Product Engineering in capital projects”
 “Plant Information Management - What to manage?”

During operations and maintenance, the two main structures of information needed in order to operate the plant in a safe and reliable manner are the functional or tag structure and the physically installed structure.
The functional tag structure is a multidiscipline, consolidated view of all design requirements and criteria, whereas the physically installed structure is a representation of what was actually installed and commissioned, together with associated data. It is important to note that the physically installed structure evolves over time during operations and maintenance, so it is vital to make baselines of both structures together to obtain “As-Installed” and “As-Commissioned” documentation.
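
A minimal data-model sketch of that idea (not any particular PLM system's API; class and field names are assumptions): a tag carries the design requirements, a physical item records what is actually installed, and a baseline freezes the pairing of the two at a point in time.

# Minimal data-model sketch: a tag holds the design requirements, a physical
# item records what is installed, and a baseline freezes the pairing.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Tag:                    # functional structure: design requirements
    tag_id: str
    requirements: dict

@dataclass
class PhysicalItem:           # physically installed structure
    serial_number: str
    vendor: str

@dataclass
class Plant:
    installed: dict = field(default_factory=dict)   # tag_id -> PhysicalItem

    def baseline(self, label: str, on: date) -> dict:
        """Freeze the tag/physical pairing, e.g. 'As-Commissioned'."""
        return {"label": label, "date": on.isoformat(),
                "pairs": {t: p.serial_number for t, p in self.installed.items()}}

plant = Plant()
ep03 = Tag("=AB.ACC01.IS01.VS04.EP03", {"capacity_m3_h": 50, "safety_classed": True})
plant.installed[ep03.tag_id] = PhysicalItem("AL11234-12-15", "Vendor A")
print(plant.baseline("As-Commissioned", date(2017, 1, 29)))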
​
Figure 1.
​

Let’s zoom in on some of the typical use cases of the two structures.
Figure 2.
​

The requirements in the blue tag structure are fulfilled by the physical installation, the yellow structures. In a previous post I promised to get back to why they are represented as separate objects. The reason for this is that during operations one would often like to replace a physical individual on site with another physical individual. This new physical individual still has to fulfill the tag requirements, as the tag requirements (system design) have not changed. In addition we need full traceability of not only what is currently installed, but also what used to be installed at that functional location (see figure 3).
Figure 3.

Here we have replaced the vacuum pump during operations with another vacuum pump from another vendor. The new vacuum pump must comply with the same functional requirements as the old one even if they might have different product designs.
This is a very common use case, where a product manufacturing company comes up with a new design a few years later. The new product might be a lot cheaper and still fulfill the requirements, so if the operator of the plant has 500 instances of such products in the facility, it makes perfect sense to replace them when the old products near end of life or require extensive maintenance programs.
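
A small sketch of this replacement use case, under the same kind of illustrative assumptions (identifiers and dates are invented): the functional location and its requirements stay the same while the installed individual is swapped, and the full installation history is retained.

# Sketch: the functional location (tag) and its requirements stay the same,
# the installed physical individual is swapped, and the history is retained.
from datetime import date

class FunctionalLocation:
    def __init__(self, tag_id, requirements):
        self.tag_id = tag_id
        self.requirements = requirements      # unchanged system design
        self.current = None                   # currently installed serial number
        self.history = []                     # (serial, installed_on, removed_on)

    def install(self, serial, on):
        if self.current:
            # close the record for the individual being taken out
            self.history[-1] = (*self.history[-1][:2], on)
        self.current = serial
        self.history.append((serial, on, None))

loc = FunctionalLocation("=AB.ACC01.IS01.VS04.EP03", {"capacity_m3_h": 50})
loc.install("AL11234-12-15", date(2012, 6, 1))    # original vacuum pump
loc.install("BX99881-03-17", date(2017, 3, 9))    # replacement from another vendor
print(loc.current)     # -> BX99881-03-17
print(loc.history)     # both individuals, with installation and removal dates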
 
Another very important reason to keep the tag requirements and the physically installed items as separate objects is if… or rather when, the operator wishes to execute a modification or extension project on the plant.
In such cases one must still manage and record the day-to-day operation of the plant (work requests and work orders performed on physical equipment in the plant) while at the same time running a plant design and execution project. This entails Design, Engineering, Procurement, Construction and Commissioning all over again.
Figure 4.
​

The figure shows that when the blue functional tag structure is kept separate from the yellow physically installed structure, we can still operate the current plant on a day-to-day basis and at the same time perform new design work on the revised system (Revision B).
This allows us to execute all the processes right up until commissioning on the new revision, and when it is successfully commissioned, Revision B becomes operational.
​
This all sounds very good in theory, but in practice it is a bit more challenging, as change orders affecting the design of the previous revision might have been made in the meantime as a result of operations. This is one of the use cases where structured or linked data, rather than a document-centric approach, really pays off, because such a change order would immediately indicate that it affects the new design, and thus appropriate measures can be taken at an early stage instead of nasty surprises popping up during installation and commissioning of the new system.
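
As a small illustration of why linked data helps here (identifiers invented for the example), a change order raised against the operational revision can immediately flag the in-progress revision because the two revisions share the affected tags.

# A change order from operations lists the affected tags; any revision that
# shares those tags is flagged, including the one still in design.
system_revisions = {
    "SYS-001/A": {"status": "operational", "tags": {"EP03", "EP04", "EP05"}},
    "SYS-001/B": {"status": "in design",   "tags": {"EP03", "EP04", "EP06"}},
}

def impacted_revisions(change_order_tags: set) -> list:
    """Return every system revision touching any tag on the change order."""
    return [rev for rev, data in system_revisions.items()
            if data["tags"] & change_order_tags]

# Change order from operations: EP04 gets a design change.
print(impacted_revisions({"EP04"}))
# -> ['SYS-001/A', 'SYS-001/B']  (Revision B is warned early, not at commissioning)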

Bjorn Fidjeland

The header image used in this post is by nightman1965 and purchased at dreamstime.com

Plant Information Management - Installation and Commissioning

1/27/2017

I realize that the last post “Handover to logistics and supply chain in capital projects” went quite a lot further in the information lifecycle than the headline suggested, so here is a brief recap on how structured and linked data can support processes during construction/installation and commissioning.

This post is a continuation of the posts in the Plant Information Management series of:
 “Handover to logistics and supply chain in capital projects”
“Plant Engineering meets Product Engineering in capital projects”
 “Plant Information Management - What to manage?”

Let’s jump in and follow the journey of the manufactured physical products as they move into installation and commissioning phases.
​
Figure 1.
Provided that the information from the different structures and their context in relation to each other is kept, it is possible to trace perfectly what physical items should be installed where, corresponding to the tag requirements in the project (note: I’ve removed the connections from tag to EBOM in this figure for clarity).

We are now able to connect the information from tag =AB.ACC01.IS01.VS04.EP03, the one in the safety classed area, to the physical item with serial number S/N: AL11234-12-15, which carries the documentation proving that it is fit for purpose in a safety classed area.
As the other two tags are not in a safety classed area and have no special requirements, either of the two remaining physical pumps can be used to fulfill their tag requirements; however, we still want full traceability for commissioning, operations & maintenance.
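
A toy allocation sketch of that traceability rule (the tag IDs are reused from the example above, everything else is assumed): the safety classed tag can only be fulfilled by a physical item carrying the required documentation, while the remaining tags can take either of the other pumps.

# Toy allocation: the safety classed tag must get a physical item with the
# required documentation; the other tags can take any remaining pump.
tags = {
    "=AB.ACC01.IS01.VS04.EP03": {"safety_classed": True},
    "=AB.ACC01.IS01.VS04.EP04": {"safety_classed": False},
    "=AB.ACC01.IS01.VS04.EP05": {"safety_classed": False},
}
physical_items = [
    {"serial": "AL11234-12-15", "safety_docs": True},    # e.g. weld X-rays, material certificates
    {"serial": "AL11234-12-16", "safety_docs": False},
    {"serial": "AL11234-12-17", "safety_docs": False},
]

def allocate(tags, items):
    """Assign serial numbers to tags; safety classed tags are served first."""
    assignment, pool = {}, list(items)
    for tag_id, req in sorted(tags.items(), key=lambda t: not t[1]["safety_classed"]):
        candidate = next(i for i in pool
                         if i["safety_docs"] or not req["safety_classed"])
        assignment[tag_id] = candidate["serial"]
        pool.remove(candidate)
    return assignment

print(allocate(tags, physical_items))
# EP03 gets AL11234-12-15; the other two tags take either of the remaining pumps.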
​
Figure 2.
Since we now have a connection between the tag requirements and the physically installed individuals, we can commence with various commissioning tests and verify that what we actually installed works as intended in relation to what we designed (the plant system), and furthermore we can associate certificates, commissioning documentation and processes to the physical individuals.

I’d like to come back to the reason for this split between the tag object and the physical item object in a future post regarding operations and maintenance.


Bjorn Fidjeland

The header image used in this post is by Satori13 and purchased at dreamstime.com


Handover to logistics and supply chain in capital projects

12/12/2016

This post is a continuation of the posts “Plant Engineering meets Product Engineering in capital projects” and “Plant Information Management - What to manage?”
​
As the last post dwelled on how EPCs and product companies are trying to promote re-use in very Engineer To Order (ETO) intensive projects, we will focus on the handover to supply chain and logistics in this post.

The relationship between the tag, containing the project-specific requirements, and the article or part, containing the generic product design, constitutes a project-specific demand that supply chain and logistics should know about. If both the tag and the connected part are released, a “signal” is sent with information regarding both the tag’s requirements and the part’s requirements.
An exception to this rule is typically Long Lead Items (LLI). I’ve seen this handled via a special process that allows transfer of the information to supply chain and logistics even if the specific tag has not been released.
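
A minimal sketch of that release rule (not any specific system's API; statuses, field names and the LLI flag are assumptions): the demand only becomes visible to supply chain and logistics when both sides are released, unless the tag is marked as a long lead item.

# The demand "signal" carries both the tag's and the part's requirements and is
# only sent once both are released, unless the tag is flagged as a long lead item.
def supply_chain_signal(tag, part):
    tag_ready = tag["status"] == "Released" or tag.get("long_lead_item", False)
    if tag_ready and part["status"] == "Released":
        return {"tag": tag["id"], "part": part["id"],
                "tag_requirements": tag["requirements"],
                "part_requirements": part["requirements"]}
    return None   # demand not yet visible to supply chain and logistics

tag = {"id": "=AB.ACC01.IS01.VS04.EP03", "status": "In work",
       "long_lead_item": True, "requirements": {"safety_classed": True}}
part = {"id": "PUMP-TYPE-7", "status": "Released",
        "requirements": {"rated_flow_m3_h": 50}}
print(supply_chain_signal(tag, part))   # sent early because the tag is an LLI
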
Figure 1.
As the project-specific information regarding all three tags and the intended use of the product design is sent to logistics and supply chain, it becomes possible to distinguish which tags need special attention and which can be ordered “off the shelf”.

Let’s say that tag =AB.ACC01.IS01.VS04.EP03 is in a safety classed area and the other two are not. The purchase order for the safety classed tag must then inform the manufacturer that documentation regarding the manufacturing process must follow the produced individual that will be used to implement this specific tag, whereas the other two deliveries can come with standard documentation.
​
Figure 2.
Figure 2 depicts that all three manufactured products or physical items with serial numbers come from the same Engineering Bill Of Material, but that the individual with serial number S/N: AL11234-12-15 has some extra information attached.
Since it is to be used in a safety classed environment, the manufacturer must produce proof that the product fulfills the safety class requirements given on the tag. This could, for instance, be X-ray documentation showing that all welds are up to spec, or that the alloy used has sufficient quality.
As you can see, if the information is kept as information structures, with relationships between the different data sets detailing in what context the different information is used, it becomes possible to trace and manage it all in project-specific processes.
There are some other very important information structures that I mentioned in the post “Plant Information Management - What to manage?”, like the Sales BOM (similar to the manufacturing industry’s Manufacturing BOM), the Supply BOM and warehouse management; however, I would like to cover those in more detail in later posts.
​
For now let’s follow the journey of the manufactured products as they move into installation and commissioning.


Figure 3.
Provided that the information from the different structures and their context in relation to each other is kept, it is possible to trace perfectly what physical items should be installed where, corresponding to the tag requirements in the project (note: I’ve removed the connections from tag to EBOM in this figure for clarity).

We are now able to connect the information from tag: =AB.ACC01.IS01.VS04.EP03, the one in the safety classed area, to the physical item with serial number S/N: AL11234-12-15 that contains the documentation proving that it is fit for purpose in a safety classed area.
As the other two tags are not in a safety classed area and have no special requirements, either of the two remaining physical pumps can be used to fulfill their tag requirements; however, we still want full traceability for commissioning, operations & maintenance.
​
Figure 4
Since we now have a connection between the tag requirements and the physically installed individuals, we can commence with various commissioning tests and verify that what we actually installed works as intended in relation to what we designed (the plant system), and furthermore we can associate certificates and commissioning documentation to the physical individuals.
I’d like to come back to the reason for this split between the tag object and the physical item object in a future post regarding operations and maintenance.

Bjorn Fidjeland


The header image used in this post is by Nostal6ie and purchased at dreamstime.com

Plant Engineering meets Product Engineering in capital projects

9/30/2016

This post is a follow up of “Plant Information Management - What to manage?”.

It focuses on the needed collaboration between Plant Engineering (highly project intensive) and Product Engineering, which ideally should be “off the shelf” or at least Configure To Order (CTO), but in reality is, more often than not, Engineer To Order (ETO) or one-offs.

More and more EPCs (Engineering, Procurement and Construction companies) and product companies exposed to project-intensive industries are focusing hard on ways to re-use product designs from one project to the next, or even internally in the same project, through various forms of configuration and clever use of master data; see “Engineering Master Data - Why is it different?”.
​
However, we will never get away from the fact that the product delivery in a capital project will always have to fulfill specific requirements from Plant Engineering, and especially in safety classed areas of the plant.
If you look at the blue object structure, it represents a consolidated view of multi-discipline plant engineering. The system might consist of several pumps, heat exchangers, sensors, instrumentation and pipes, but we are going to focus on a specific tag and its requirements, namely one of the pumps in the system.
At one point in the plant engineering process the design is deemed fit for project procurement to start investigating product designs that might fulfill the requirements stated in the plant system design.
If the plant design is made by an EPC that does not own any product companies, the representing product is typically a single article or part with associated preferred vendors/manufacturers who might be able to produce such a product or have it in stock. If the EPC does own product companies, the representing product might be a full product design. In other words a full Engineering Bill Of Material (EBOM) of the product.
 
This is where it becomes very interesting indeed because the product design (EBOM) is generic in nature. It represents a blueprint or mold if you will, used to produce many physical products or instances of the product design. The physical products typically have serial numbers, and you are able to touch them. However, due to requirements from the Owner/Operator, the EPC will very often dictate both project and tag specific documentation from the product company supplying to the project, which in turn often leads to replication of the product designs X number of times to achieve compliance with the documentation requirements in the project (Documentation For Installation and Operations).
​
So, even if it is exactly the same product design, it ends up being copied each time there is a project-specific delivery. This often happens even if, let’s say, 40 pumps are being supplied by the same vendor to the same project, as responses to the requirements on 40 different tags in the plant design…
Needless to say, this results in a lot of Engineering Bills Of Materials just to comply with documentation requirements in capital projects. Even worse, for the product companies it becomes virtually impossible to determine exactly what they have delivered each time, since it is a different Engineering Bill Of Materials every time, yet 97% of the information might be the same. The standardized product has now become an Engineer To Order product.
So how is it possible to avoid this monstrous duplication of work?
More and more companies are looking into ways to make use of data structures in different contexts. The contexts might be different deliveries to the same project, or across multiple projects, but if one is able to identify and separate the generic information from the information that needs to be project specific, it also becomes possible to facilitate re-use.
​
The image above shows how a generic product design (EBOM) is able to fulfill three different project-specific tags or functional locations in a plant. Naturally, three physical instances or serial numbers must then be manufactured based on the generic product design, but since we have the link or relationship between the project-specific requirements (the tags) and the generic design (the EBOM), one can generate project-specific data and documentation without making changes to the generic representation of the product (the EBOM).
This approach even enables the product company to identify and manufacture the one pump that happens to be in a safety classed area of the plant design according to regulatory requirements, without having to change or duplicate the product design; however, more on that next time.
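
A small sketch of that linking idea (identifiers and document types are invented for illustration): one generic EBOM is referenced by three tags, and tag-specific documentation is generated from the link rather than from copies of the design.

# One generic EBOM, three project-specific tags; the documentation package is
# generated per tag from the relationship, without copying the EBOM.
generic_ebom = {"id": "PUMP-TYPE-7", "revision": "C",
                "items": ["casing", "impeller", "shaft seal", "motor"]}

project_tags = [
    {"tag": "=AB.ACC01.IS01.VS04.EP03", "safety_classed": True,  "ebom": "PUMP-TYPE-7"},
    {"tag": "=AB.ACC01.IS01.VS04.EP04", "safety_classed": False, "ebom": "PUMP-TYPE-7"},
    {"tag": "=AB.ACC01.IS01.VS04.EP05", "safety_classed": False, "ebom": "PUMP-TYPE-7"},
]

def project_documentation(tag_link, ebom):
    """Generate a tag-specific documentation package from the generic design."""
    docs = ["installation manual", "spare parts list"]
    if tag_link["safety_classed"]:
        docs += ["weld X-ray records", "material certificates"]
    return {"tag": tag_link["tag"],
            "based_on": f'{ebom["id"]} rev {ebom["revision"]}',
            "documents": docs}

for link in project_tags:
    print(project_documentation(link, generic_ebom))
# One EBOM, three deliveries, and no copies of the product design are made.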
 
Bjorn Fidjeland


The header image used in this post is by Nostal6ie and purchased at dreamstime.com

Managing Documentation For Installation and Operations

5/1/2016

​In one of my previous articles “Plant Information Management - What to manage?”
I wrote about different information structures needed in a plant project from early design through commissioning and operations.
​
The article left some questions hanging. One of them was: how can all this information be consolidated, managed and distributed to the various stakeholders in a plant project at the right time and with the right quality?

Traditionally this has been called LCI or LifeCycle Information, at least in the Norwegian oil & gas industry, and DFI/DFO (Documentation For Installation / Documentation For Operations) internationally. In short, it is the operator’s requirements and needs for information from early design through engineering, procurement and construction, up to and including commissioning. The requirements cover safety and regulatory matters, as well as information the operator finds important in order to control, monitor and guide the progress of the project executed by the EPC.

As the figure describes, the operator drives the expectations for deliveries in terms of standardization, safety & regulatory requirements and the documentation needed to operate and maintain the plant after commissioning. All stakeholders in the value chain must abide by these requirements, and it is usually the EPC that has the task of coordinating and consolidating this mountain of information. A successful commissioning includes the operator confirming that it has received all documentation and information required to operate the plant in a safe and regulatory compliant manner. At this point the EPC is excused from the project.

In theory, the documentation handover would look like the figure above; however, operators’ experience has told them that this seldom works well. Therefore a much more frequent information exchange is required between EPC and operator leading up to commissioning. The main reason for this is that it enables the operator to monitor, check and verify the progress of the project. It also makes for a more gradual build-up and maturing of documentation in the project. For the EPC it means frantic activity to secure all required documentation from its own engineering disciplines and from all external companies in the project’s value chain (see the pyramid in figure 1) at each milestone.

Traditionally, a whole host of LCI coordinators has been needed, both on the EPC side and on the operator side, to make sure that all documentation is present, and if not, to make sure it is created… The very “best” LCI coordinators on the EPC side manage to produce the information without “bothering” engineering too much. It has largely been a document-centric process, separated from the plant & product engineering process.
 
As long as EPCs are only active in one country, this approach is manageable for them; however, once they go global, they find themselves having to deal with many different safety standards, regulatory standards and, last but not least, varying requirements and formats from different operators. Even product companies and module suppliers delivering to projects in different parts of the world experience the same thing.

In recent years I have seen more and more interest in leaving the document-centric approach for a more data-centric one. This means that data is created and consolidated from various disciplines in data structures, as described in the article “Plant Information Management - What to manage?”, and that the LCI process becomes an integral part of the engineering, procurement, construction and commissioning processes instead of being a largely separate one.

 Of course there are varying strategies among companies when it comes to how much to manage, and how to hand it over.
  • Some create data structures in PLM-like platforms, consolidate them, manage changes and transfer data to other stakeholders in the projects via generated transmittals. This is more similar to the document-centric approach, only more automated.
  • Some companies target re-use from project to project, in addition to the aspects mentioned above, by creating data structures in catalogs that can be selected in other projects as well. The selected data structure is then replicated in a project-specific context and gets auto-generated project-specific information like tags and documentation (see the sketch after this list).
  • Others again remove or reduce the need for transmittals and document handovers by letting the project stakeholders directly into their platform to work and deliver information there instead of handing over documents.
  • One approach was to not hand over documents at all, but simply give the operator access to the platform, link the information from the data structures as deliverables to the milestones the operator required, and then hand over the entire platform to the operator as Documentation For Operations after successful commissioning.
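
To make the catalog re-use strategy in the second bullet a bit more tangible, here is a minimal sketch; the catalog entry, numbering scheme and document types are assumptions for illustration only.

# A generic structure is selected from a catalog and replicated into a project
# context, where project-specific tags and document placeholders are generated.
from copy import deepcopy

catalog = {"VACUUM-SKID": {"components": ["pump", "separator", "control valve"]}}

def instantiate(catalog_id, project, area):
    """Copy a catalog structure into a project-specific context."""
    instance = deepcopy(catalog[catalog_id])
    instance["project"] = project
    # auto-generate project-specific tag numbers for each component
    instance["tags"] = {c: f"={area}.VS01.{c[:2].upper()}{i:02d}"
                        for i, c in enumerate(instance["components"], start=1)}
    # and placeholders for the documentation each tag must deliver
    instance["documents"] = [f"{t} datasheet" for t in instance["tags"].values()]
    return instance

print(instantiate("VACUUM-SKID", project="P-2016-04", area="AB.ACC01"))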

​Bjorn Fidjeland


The header image used in this post is by Norbert Buchholz and purchased at dreamstime.com

PLM platforms, the difficult organizational rollout

3/6/2016

What is PLM really about? In my view it is about tying relevant information to business processes, you know, the stuff that makes your company truly unique and then tying your employees to those very same processes throughout the life of a product.

So it’s about information, processes, people and an IT platform, in this case a PLM platform.
​

To be successful, ALL areas must intersect.



It does not matter if you have the perfect PLM system with perfectly defined processes if the information you need to manage is bad.

Just as little will it help to have good quality data with perfectly defined processes and an organization ready to adopt them if the PLM platform is unable to scale to your needs.

Nor will it help to have good quality data tied to perfectly defined processes and a state-of-the-art PLM system if nobody is using it…

So going back to the headline: PLM platforms, the difficult organizational rollout.
I’ve seen far too many PLM implementations underperform due to unsuccessful rollout in the organization.
I find it strange that although the projects are often run iteratively to develop or customize smaller chunks of functionality in each iteration to ensure success, one expects the end users to devour the full elephant of the project in more or less one big bite…

In my view a rollout of such a large and business critical platform should also be considered iterative and with time for the end users to come to terms with what they have learned after each iteration before the next iteration starts.
I would compare it to building a house.
​ You would never start erecting the walls before the concrete slab is sufficiently cured.
The same is true for an organization. If more functionality and new processes are put on top before the previously learned functionality and processes have had time to settle, you get resistance, and the foundation becomes weak.

Another important factor is not to just train the end users in a classroom environment and then expect them to perform well in their new system… Because they won’t.
They are still afraid of doing something wrong, and they will struggle to remember what they learned in the classroom.
Then they will try to find solutions in the manuals, growing more and more frustrated by the minute.

If this frustration is allowed to continue for too long, you can be sure that the end result is that they feel the system is too difficult to use and basically sucks. It might sound childish, but holding hands works! Have some super users or trainers available in the everyday work situation to help and guide the users for the first few weeks.
That will mitigate the fear factor of doing something wrong, and steadily build confidence and ability.

​Bjorn Fidjeland

Challenges when going from entrepreneur to industrialized manufacturer

1/3/2016

In my neck of the woods there are a lot of very talented engineers and a lot of entrepreneurial spirit. My region (the south-western part of Norway) is very much exposed to the oil & gas industry and the delivery of products to plants, oil platforms etc. This means that it is very project focused and ETO (Engineer To Order) intensive.

The entrepreneurial spirit I mentioned has led to a whole host of startups with good ideas of how to solve some problem with a new product in better and more cost effective ways.

One story I keep hearing from such product companies, not only in Norway but also in other countries and project-intensive industries, goes something like this:
So we won our first contract and the customer is really impressed with our product and our technology. It became a bit more expensive to deliver the project than we thought, but we managed and we were sure we would have better returns on the next project.
​
As time progresses we expect our cost in the projects to drop significantly.

​
​But what happens in a lot of such product companies?

There is nothing strange in expecting such a development. One would instinctively think that one would be able to shorten the project execution time as one gains experience and has successfully delivered such a product before. The organization knows which suppliers can deliver and which cannot. Engineers and employees in installation and commissioning become more and more experienced, and so on.
Yet in reality, it becomes very, very hard to drive down the cost of delivering projects, even if the product delivered from project to project is very similar. The transition from entrepreneur to industrialized manufacturer becomes hard for a lot of these companies.
Why is that?

Personally, I think there are several factors:
  • It was too hard to say no to those small, insignificant changes that the client required in the next project… which for engineering or manufacturing turned out to be not so insignificant.
  • Product development is constantly being performed in the projects. Engineers will always search for the perfect and most elegant solution. That does not mean that it is the best or most cost-effective way to manufacture the product.
  • Clients’ or operators’ documentation requirements in terms of LCI (LifeCycle Information) deliveries. If the product company is unable to define a process to deal with shifting requirements from operator to operator, this becomes a manual nightmare that constantly diverts resources. Such a process should be an integrated part of the project execution process, and not, as it mostly is today, a separate process.
  • It is in my view paramount that at least smaller parts of the product are standardized and modularized in such a way that the engineering information can be re-used from project to project (you can read more about my views here: “Engineering Master Data - Why is it different?” and “Can PLM help industrializing Oil & Gas projects?”).
  • Last but not least, there is a screaming need to manage project-specific engineering data (tag structures, P&IDs, D&IDs, electrical) together with, but NOT in a one-to-one relation with, more generic product development data.

I’ve seen the three last bullets addressed with PLM platforms at various companies; however, the technology itself is just one factor. The organizational processes, and how they are enforced in the platform, are of far bigger importance.

The first two bullets are a lot harder, as they require the organization to shift its mindset from entrepreneur to industrialized manufacturer. This includes going from quick and nimble to more standardized processes, and continuous process improvement. If you consider PLM as a mindset rather than just a technology, you will also harvest benefits here, but it is hard work.

Bjorn Fidjeland





Plant Information Management - Information Structures

11/8/2015

Plant or facility projects are demanding and require enormous volumes of detailed information from various information sources and information structures, from different stakeholders and, in most cases, from different companies.
This blog post is an attempt to describe the different information structures and how they tie into each other at a high level. I would be very happy if you could comment, state your views, and maybe help me complement the picture.
Information structures throughout a plant project:
​
The project execution structure or WBS (Work Breakdown Structure) is key to plan, staff, execute and measure the progress of the project. It is usually developed in a solution like Primavera or Safran.​

The functional structure or tag structure represents the functional decomposition of the multi-discipline plant design, which includes design from process engineering, mechanical, piping material & installation, electrical, instrumentation and civil. The information from these disciplines usually consists of 2D process flow diagrams (PFD), piping and instrumentation diagrams (P&ID), general arrangement drawings (GA), ducting and instrumentation diagrams (D&ID), material lists etc. In addition, there is usually a more or less multi-discipline 3D model of the plant as well. All of this information needs to be consolidated and monitored against the project execution plan (WBS).

Information in the functional structure is usually classified according to some kind of standard to better aid consolidation of data and interoperability between disciplines and/or companies in the supply chain.

The location structure represents the physical location (where stuff is supposed to be installed). This structure is typically an area structure divided into areas and zones, or buildings, floors and rooms. By separating the functional location structure (tag structure) from the location structure, it becomes possible to trace the tag for a cable that spans multiple locations, like multiple floors, rooms or areas.

The more civil & structural or conventional facilities there are in a project, the more important this structure becomes, as these disciplines to a lesser degree have tag information in their deliverables to the project, but a lot of location-centric information.
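
As a small illustration of why this separation helps (locations and tag IDs are made up), a single cable tag can be related to several physical locations and remain traceable in each of them.

# One cable tag routed through several physical locations; an ordinary
# equipment tag sits in exactly one location.
tag_to_locations = {
    "=AB.EL01.CABLE.WC101": ["Building A / Floor 1 / Room 101",
                             "Building A / Floor 2 / Riser shaft",
                             "Building A / Floor 3 / Room 305"],
    "=AB.ACC01.IS01.VS04.EP03": ["Building A / Floor 1 / Room 101"],
}

def tags_in_location(location):
    """Reverse lookup: everything (partly) installed in a given room or area."""
    return [tag for tag, locs in tag_to_locations.items() if location in locs]

print(tags_in_location("Building A / Floor 2 / Riser shaft"))
# -> ['=AB.EL01.CABLE.WC101']  (the cable stays traceable on every floor it passes)
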
Already now, the number of information structures and their interactions is becoming quite complex, and we have still only covered plant engineering and execution.

As the plant design starts to mature, discussions begin with equipment vendors and service providers who can supply equipment that fulfills the requirements stated in the plant design (functional structure). These can be pure off-the-shelf products or engineer-to-order (ETO) products. What they all have in common is a product design or engineering bill of material (which is far more detailed than either the EPC or, ultimately, the operator is interested in, but vital to the product development). Right now there is a debate (in Norway) regarding how much project-specific information is really needed, as even off-the-shelf products need to be documented quite rigorously for each delivery (tag) to a project. This has led to an explosion in cost.

Based on the product design, a commercial structure or Sales BOM is devised by the product manufacturer. This structure includes what needs to be procured and how the company intends to sell the product in question. If it is an off-the-shelf product, this structure is, along with the engineering bill of material, quite generic, but if it is an engineer-to-order product, both are tailored quite specifically to each project delivery.

The result of a product sale in a project is that the product manufacturer creates a more or less structured Supply BOM. This information structure captures how the product is intended to be supplied, including whether it should be assembled in the product manufacturer’s workshop and then shipped to site, or whether the product needs to be shipped in crates and assembled at the site itself.

The warehouse structure or system at the project site must be populated with information from the product suppliers regarding which product deliveries are scheduled for installation and which products are going into the warehouse as spares.

The physical devices with serial numbers are recorded (usually in an MRO system) and “mated” with the tag or functional location they are supposed to fulfill in the plant. The physical delivery for the product to be installed includes supplier documentation, product documentation, spare parts lists and certificates.

The relationships between the functional structure (tags) and the physical structure (serial numbers) constitute the basis for installation and commissioning procedures. Each product delivery (serial number) undergoes mechanical completion tests to make sure it fulfills the requirements set by the tag. As the plant nears completion, subsystems are tested and undergo commissioning procedures, then bigger and bigger systems, until hot commissioning is performed on the entire plant.
All the information represented by the functional structure and the physical structure in the end constitutes the documentation for operation (DFO), and is the basis from which the operator can operate and maintain the plant in a regulatory compliant and safe manner.
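
A tiny sketch of that progression (statuses and identifiers are invented for the example): each serial number mated to a tag must pass mechanical completion before the subsystem grouping those tags can start commissioning.

# Each serial number is mated to a tag and must pass mechanical completion
# before the subsystem that groups those tags can start commissioning.
mated = {   # tag -> (serial number, mechanical completion passed?)
    "EP03": ("AL11234-12-15", True),
    "EP04": ("AL11234-12-16", True),
    "EP05": ("AL11234-12-17", False),
}
subsystems = {"Vacuum system": ["EP03", "EP04"],
              "Cooling system": ["EP05"]}

def ready_for_commissioning(subsystem):
    """True only when every tag in the subsystem has a mechanically completed individual."""
    return all(mated[tag][1] for tag in subsystems[subsystem])

for name in subsystems:
    print(name, "ready for commissioning" if ready_for_commissioning(name)
          else "waiting on mechanical completion")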

​​Now this all looks very sequential and nice, but the reality is that information needs to flow back and forth between operator, EPC and product companies all the way from early design through installation, commissioning and ultimately operations & maintenance.

So how can all this information be consolidated, managed and distributed to the various stakeholders in a plant project at the right time and with the right quality?

Some points to ponder
​Bjorn Fidjeland

The header image used in this post is by Nostal6ie and purchased at dreamstime.com