
PLM Benchmark 2 – EPC 1: What did they do and why?

4/27/2018

This is the second article in the series on PLM benchmarking among operators, EPCs and product companies, in which I share experiences from different companies.
The articles cover each company's motivation for doing what they did, and where they put their main focus in order to achieve their goals.
In this series I use my information structure map, or the "circle of life" as a client jokingly called it, to explain where the different companies put their focus in terms of information management, and why.
Figure 1.
EPC 1's first objective was to replace an in-house-built engineering data hub. The reason for this was that, over the years, needs and requirements had changed, both from customers (operators) as the EPC went global, and internally within the organization. This situation led to more and more customizations of the engineering data hub, resulting in skyrocketing cost of ownership and, ironically, less and less flexibility.

This is by no means a unique situation, as many EPCs were forced to build such hubs in the late nineties for consolidation and control of multidiscipline plant information, since no software vendor at the time could support their needs.

Secondly, it was considered crucial to enable standardization and re-use of previously delivered designs and engineering data.
A huge effort was put into building reference data for sharing and alignment across plant engineering disciplines, procurement, and ultimately client handover of Documentation For Installation & Operations (DFI/DFO). An ISO 15926 ontology was put in place for this purpose.
The main reason for enabling standardization and re-use of engineering data, however, was to reduce the enormous number of engineering hours spent in the early phases of each project delivery, especially during the FEED (Front End Engineering and Design) phase. Another important reason was to connect engineering with procurement and the wider supply chain more seamlessly.
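To make the reference data idea a bit more concrete, here is a minimal Python sketch of what "reference data for sharing and alignment" can amount to in practice: one shared definition per equipment class, plus mappings from discipline-specific names onto it so that data from different disciplines can be compared. The classes, synonyms and units are invented for illustration and are not taken from ISO 15926.
```python
# Illustrative reference data: a shared class definition plus discipline-
# specific synonyms that resolve to it. Invented content, not ISO 15926.
REFERENCE_CLASSES = {
    "CENTRIFUGAL_PUMP": {
        "attributes": {"rated_capacity": "m3/h", "rated_head": "m", "motor_power": "kW"},
    },
}

DISCIPLINE_SYNONYMS = {                      # local name -> shared reference class
    "process": {"pump, centrifugal": "CENTRIFUGAL_PUMP"},
    "procurement": {"PUMP-CENTR": "CENTRIFUGAL_PUMP"},
}

def to_reference_class(discipline: str, local_name: str) -> str:
    """Resolve a discipline-specific name to the shared reference class."""
    return DISCIPLINE_SYNONYMS[discipline][local_name]

# Two disciplines using different local names end up on the same definition.
print(to_reference_class("process", "pump, centrifugal"))
print(REFERENCE_CLASSES[to_reference_class("procurement", "PUMP-CENTR")]["attributes"])
```
The value lies less in the code than in the agreement it encodes: every discipline resolves to the same class and attribute definitions before data is compared or handed over.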
Figure 2.
Figure 2 shows which information structures EPC 1 put most emphasis on. Quite naturally, the Functional Location structure (tag structure, multidiscipline plant design requirements) received a lot of focus. To enable re-use and efficient transfer of data, both the reference data and a library of re-usable design structures using that reference data were built.

Extensive analysis of previously executed projects revealed that even though the EPC had a lot of engineering concepts and data that could be re-used across projects, they more often than not created everything from scratch in the next project. In order to capitalize on and manage the collective know-how of the organization, the re-usable design structures received a lot of focus.

EPC 1 also faced different requirements from operators with respect to tagging standards, depending on which parts of the world they delivered projects to; as a consequence, multiple tagging standards needed to be supported. It was decided that no matter what format the operator wanted to receive, all tags in all projects would be governed by an internal "master-tag" in the EPC's own system and communicated to the customer in the customer's specified format.
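A minimal sketch of that "master-tag" idea, assuming hypothetical tag formats: each tag is governed by one internal master record, and the customer-facing identifiers are derived views rather than separately maintained data.
```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MasterTag:
    """The internal, governing representation of a tag (hypothetical fields)."""
    plant: str        # plant or facility code
    system: str       # system code
    discipline: str   # discipline code
    sequence: int     # running number within the system

def render_tag(tag: MasterTag, standard: str) -> str:
    """Render the master tag in the format a given operator expects.
    The two formats below are illustrative, not real operator standards."""
    if standard == "OPERATOR_A":
        return f"={tag.plant}.{tag.system}.{tag.discipline}-{tag.sequence:04d}"
    if standard == "OPERATOR_B":
        return f"{tag.plant}/{tag.system}/{tag.discipline}/{tag.sequence}"
    raise ValueError(f"Unknown tagging standard: {standard}")

pump_tag = MasterTag(plant="AB", system="ACC01", discipline="EP", sequence=3)
print(render_tag(pump_tag, "OPERATOR_A"))   # =AB.ACC01.EP-0003
print(render_tag(pump_tag, "OPERATOR_B"))   # AB/ACC01/EP/3
```
With this split, supporting a new operator means adding a rendering rule, not re-tagging the project.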

The third focus area was an extensive part (or article) library with internal part numbers and characteristics, showing what kinds of products could fulfill the tag requirements in the functional structure. Each part was then linked via a relationship to objects representing preferred suppliers of that product in different regions of the world. This concept greatly aided engineering procurement when performing Material Take-Off (MTO), since each tag would be linked to a part for which a preferred supplier could be selected.
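A simplified illustration of how such a part library can support MTO: each part carries preferred suppliers per region, and the take-off resolves every tag to a part plus the preferred supplier for the project's region. All part numbers, regions and suppliers below are hypothetical.
```python
from dataclasses import dataclass, field

@dataclass
class Part:
    """Article from the part library; suppliers are mapped per region."""
    number: str
    description: str
    preferred_suppliers: dict[str, str] = field(default_factory=dict)  # region -> supplier

@dataclass
class Tag:
    """Functional requirement in the plant structure, linked to a part."""
    tag_id: str
    fulfilled_by: Part | None = None

def material_take_off(tags: list[Tag], region: str) -> list[tuple[str, str, str]]:
    """Return (tag, part number, preferred supplier) rows for procurement."""
    rows = []
    for tag in tags:
        if tag.fulfilled_by is None:
            continue  # engineering has not yet selected a part for this tag
        supplier = tag.fulfilled_by.preferred_suppliers.get(region, "no preferred supplier")
        rows.append((tag.tag_id, tag.fulfilled_by.number, supplier))
    return rows

pump = Part("P-100234", "Centrifugal pump, 75 kW",
            {"EMEA": "Supplier A", "APAC": "Supplier B"})
tags = [Tag("=AB.ACC01.IS01.VS04.EP03", pump), Tag("=AB.ACC01.IS01.VS04.EP04", pump)]
for row in material_take_off(tags, "EMEA"):
    print(row)
```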
Figure 3.
EPC 1 chose to focus on the reference data first, in order to reach a common agreement on the data needed across their disciplines during the EPC project lifecycle. Next in line was the catalog of re-usable engineering structures; these could be selected and used as a starting point in any EPC project.
The third delivery in the project centered on the capabilities to create and use the different plant engineering structures (the functional structure and tags, with connected parts, where both entities used the same reference data).
 
An overview explaining the different structures can be found in the article:
Plant Information Management - Information Structures, and further details regarding each information structure are discussed in:
Plant Engineering meets Product Engineering in capital projects
Handover to logistics and supply chain in capital projects
Plant Information Management - Installation and Commissioning
Plant Information Management – Operations and Maintenance

Bjorn Fidjeland

The header image used in this post is by Viacheslav Iacobchuk and purchased at dreamstime.com

PLM Benchmark – Operator 1: What did they do and why?

3/9/2018

This is the first in a series of articles where I share some experiences from different product companies, EPCs and operators.
The articles will cover each company's motivation for doing what they did, and where they put their main focus in order to achieve their goals.

The different experiences span almost 20 years… I would like you to reflect a bit on that, and keep in mind some of today's buzzwords, especially digital twin, IoT and Big Data analytics.
In this series I will use my information structure map, or the “circle of life” as a client jokingly called it, to explain where the different companies put their focus in terms of information management strategy and why.

An overview explaining the different structures can be found in the article:
Plant Information Management - Information Structures, and further details regarding each information structure are discussed in:
Plant Engineering meets Product Engineering in capital projects
Handover to logistics and supply chain in capital projects
Plant Information Management - Installation and Commissioning
Plant Information Management – Operations and Maintenance
​
Figure 1.
Operator 1's first objective was to shorten project execution time from design through installation and commissioning by letting the project's information model be built up gradually through all project phases, and by all stakeholders, in one common platform.
By doing it this way, there would be no handover of documentation, but rather a handover of access to, and responsibility for, data. A large focus was put on standardizing information exchange, both between stakeholders in the capital projects and between computer systems. The entry point to all information was a 3D representation of the data structures!

Makes you think of a digital twin… However, this initiative came before anybody had heard of the term. The 3D representation was NOT a design model, but rather a three-dimensional representation of the asset linked to all the information structures, creating different dimensions, or information layers if you will.

So the assets this operator was dealing with had to be quite small, you might think?

Actually no, one of the assets managed comprised about a million tags. Concepts from the gaming industry, like Level Of Detail and back-face culling, were used to achieve the performance needed on the 3D side.
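As a toy illustration of the Level Of Detail idea borrowed from gaming: the representation chosen for each object can depend on its distance from the viewer, so a million-tag model never has to be rendered in full detail all at once. The thresholds and representation names below are arbitrary assumptions, not the operator's actual setup.
```python
# Toy level-of-detail selection: which representation to load for an object,
# based on its distance from the viewer. Thresholds are arbitrary assumptions.
def choose_lod(distance_m: float) -> str:
    if distance_m < 25:
        return "full-detail mesh"
    if distance_m < 150:
        return "simplified mesh"
    if distance_m < 600:
        return "bounding box"
    return "not rendered"  # effectively culled at this distance

for d in (10, 80, 400, 2000):
    print(f"{d} m -> {choose_lod(d)}")
```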
So why this enormous effort by an operator to streamline just the initial stages of an asset's lifecycle?
I mean, the operator's real benefit comes from operating the asset in order to produce whatever it needs to produce, right?

Because it was seen as a prerequisite for capitalizing on plant information in training, simulation, operations, maintenance and decommissioning. Two words summarize the motivation: maximum up-time. How to achieve it: operational run-time data from sensors, linked to and compared with accurate and parametric as-designed, as-built and as-maintained data.
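A hedged sketch of what "linked and compared" can mean at its simplest: run-time sensor readings from the installed item checked against the as-designed limits stored on the functional location (tag). The attribute names, limits and readings are invented for illustration.
```python
from dataclasses import dataclass

@dataclass
class DesignLimits:
    """As-designed limits held on the functional location (tag)."""
    max_vibration_mm_s: float
    max_bearing_temp_c: float

@dataclass
class SensorReading:
    """Run-time data reported by the physically installed individual."""
    vibration_mm_s: float
    bearing_temp_c: float

def check_against_design(tag: str, limits: DesignLimits, reading: SensorReading) -> list[str]:
    """Compare operational data with the design data linked to the same tag."""
    alarms = []
    if reading.vibration_mm_s > limits.max_vibration_mm_s:
        alarms.append(f"{tag}: vibration {reading.vibration_mm_s} mm/s exceeds design limit")
    if reading.bearing_temp_c > limits.max_bearing_temp_c:
        alarms.append(f"{tag}: bearing temperature {reading.bearing_temp_c} C exceeds design limit")
    return alarms

print(check_against_design("=AB.ACC01.IS01.VS04.EP03",
                           DesignLimits(max_vibration_mm_s=7.1, max_bearing_temp_c=85.0),
                           SensorReading(vibration_mm_s=9.3, bearing_temp_c=78.0)))
```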

​
Figure 2.
Figure 2 shows which information structures the operator put most emphasis on. Quite naturally, the Functional structure (tag structure and design requirements) and the corresponding physically installed asset information were highly important, and this is what they started with (see Figure 3). Reference data, needed to compare and consolidate data from the different structures, was next in line, together with an extensive parts (article) catalog of what could be supplied by whom in different regions of the world.
Figure 3.
There was an understanding that a highly document-oriented industry could not shift completely to structured data and information structures overnight, so document management was also included as an intermediate step. The last type of structure they focused on was project execution structures (Work Breakdown Structures). This was not because it was regarded as less important; on the contrary, it was regarded as highly important, since it introduced the time dimension with traceability and control of who should do what, or who did what, when. The reasoning was that since work breakdown structures tie into absolutely everything, they wanted to test and roll out the "base model" of data structures in the three-dimensional world (the 3D database) before introducing the fourth dimension.

​Bjorn Fidjeland

​
The header image used in this post is by Jacek Jędrzejowski and purchased at dreamstime.com

Digital Twin - What needs to be under the hood?

10/22/2017

In the article Plant Information Management – Information Structures, and the following posts on Plant Information Management (see the Archive), I explained in more detail the various information structures and the importance of structuring data as object structures with interconnecting relationships, creating context between the different information sources.

What does all of this have to do with the digital twin? Let's have a look.

Information structures and their interconnecting relationships can be described by one of the major fashion words of the day: the digital thread, or digital twin.
The term and concept of the digital twin were first coined by Michael Grieves at the University of Michigan in 2002, but have since taken on a life of their own in different companies.
 
Below is an example of what information can be accessed from a digital twin or rather what the digital twin can serve as an entry point for:
Picture
If your data is structured in such a way, with connected objects, attributes and properties, an associated three-dimensional representation of the physically delivered instance is a tremendously valuable asset as a carrier of information. It is, however, not a prerequisite that it is a 3D model; a simple dashboard giving access to the individual physical items might be enough. The 3D aspect is always promoted in glossy sales presentations by various companies, but it is not needed for every possible use case. In a plant or an aircraft it makes a lot of sense, since the volume of information and the number of possible entry points to the full data set are staggering, but it might not be necessary to have individual three-dimensional representations of every mobile phone ever sold. It might suffice to have each data set associated with each serial number.
 
On the other hand, if you have a 3D representation, it can become a front end used by end users for finding, searching and analyzing all the connected information from the data structures described in my previous blog posts. Such insights take us to a whole new level of understanding of each delivered product's life, its challenges and opportunities in different environments, and the way it is actually being used by end customers.
 
Let's say that, via the digital twin in the figure above, we select a pump. The tag of that pump uniquely identifies the functional location in the facility. An end user can pull information about the system the pump belongs to in the form of a parametric Piping & Instrumentation Diagram (P&ID), the functional specification for the pump in the designed system, information about the actually installed pump with serial number, manufacturing information, supplier, certificates, performed installation & commissioning procedures, and actual operational data of the pump itself.
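A minimal sketch of what that traversal could look like in data terms, using plain dictionaries: one tag as the entry point, with relationships leading to the P&ID reference, the functional specification, the installed serial number and its records. The object model and all contents are illustrative only.
```python
# Two connected structures: tags (functional locations) and physical items.
# Everything below is invented to illustrate the relationships, not real data.
facility = {
    "tags": {
        "=AB.ACC01.IS01.VS04.EP03": {
            "system": "Cooling water system 01",
            "pid_document": "PID-ACC01-VS04 rev C",
            "functional_spec": {"duty_m3_h": 120, "head_m": 45},
            "installed_item": "AL11234-12-15",
        }
    },
    "physical_items": {
        "AL11234-12-15": {
            "manufacturer": "Pump vendor X",
            "certificates": ["material certificate 4711", "pressure test 0815"],
            "commissioning": ["leak test passed", "run-in test passed"],
            "operational_data": "historian series 'pump-ep03'",
        }
    },
}

def pump_dossier(data: dict, tag_id: str) -> dict:
    """Collect everything reachable from one tag by following its relationships."""
    tag = data["tags"][tag_id]
    item = data["physical_items"][tag["installed_item"]]
    return {"tag": tag_id, **tag, **item}

print(pump_dossier(facility, "=AB.ACC01.IS01.VS04.EP03"))
```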
 
The real power in the operational phase becomes evident when operational data is associated with each delivered pump. In that case the operational data can be compared with the environmental conditions the physical equipment operates in. Let's say that the fluid being pumped contains more and more sediment, and our historical records from similar conditions tell us that the pump will likely fail within the next ten days due to wear and tear on critical components. However, the data also indicates that if we reduce the power by 5 percent, we will be able to operate the full system until the next scheduled maintenance window in 15 days. Information like that gives real business value in terms of increased uptime.
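The decision rule implied by that example can be made explicit. Here is an illustrative sketch, assuming the failure predictions themselves come from elsewhere (historical records, a model, an engineer's judgment); the numbers mirror the scenario above.
```python
def recommend_action(days_to_failure_full_power: float,
                     days_to_failure_derated: float,
                     days_to_maintenance_window: float,
                     derate_percent: float) -> str:
    """Pick an operating strategy from externally supplied failure predictions."""
    if days_to_failure_full_power >= days_to_maintenance_window:
        return "Continue at full power until the scheduled maintenance window."
    if days_to_failure_derated >= days_to_maintenance_window:
        return (f"Reduce power by {derate_percent:.0f}% to reach the maintenance "
                f"window in {days_to_maintenance_window:.0f} days.")
    return "Plan an immediate intervention; derating alone is not sufficient."

# The scenario from the text: likely failure in ~10 days at full power, next
# window in 15 days, and a 5% derate assumed to stretch predicted life past it.
print(recommend_action(10, 16, 15, 5))
```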
 
Let’s look at some other possibilities.
If we now consider a full facility with a three-dimensional representation:
During the EPC phase it is possible to associate the 3D model with a fourth dimension, time, turning it into a 4D model. By doing so, the model can be used to analyze and validate different installation execution plans, or to monitor the actual ongoing installation of the facility. We can actually see the individual parts of the model appearing as time progresses.
 
A fifth dimension can also be added, namely cost. Here, the cost development over time, according to one or several proposed installation execution plans or the actual installation itself, can be analyzed or monitored.
This is already being done by some early movers in the construction industry, where it is referred to as 5D or Virtual Design & Construction.
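A small sketch of the 4D/5D idea in data terms: each 3D model object carries a link to a planned installation date (time) and a cost, so the model can be "played" through the schedule and the cumulative cost curve read off. Object ids, dates and costs are invented.
```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ModelObject:
    object_id: str         # id of the corresponding object in the 3D model
    planned_install: date  # 4th dimension: time
    cost: float            # 5th dimension: cost

def visible_objects(model: list[ModelObject], at: date) -> list[str]:
    """Objects that have appeared in the 4D playback by a given date."""
    return [o.object_id for o in model if o.planned_install <= at]

def cumulative_cost(model: list[ModelObject], at: date) -> float:
    """Planned spend up to a given date (the 5D view)."""
    return sum(o.cost for o in model if o.planned_install <= at)

model = [
    ModelObject("pipe-rack-01", date(2017, 3, 1), 120_000),
    ModelObject("pump-ep03", date(2017, 4, 15), 80_000),
    ModelObject("cable-tray-07", date(2017, 5, 10), 15_000),
]
print(visible_objects(model, date(2017, 4, 30)))   # two objects installed so far
print(cumulative_cost(model, date(2017, 4, 30)))   # 200000
```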
 
The model can also serve as an important asset when planning and coordinating space claims made by different disciplines during the design as well as during the actual installation. It can easily give visual feedback if there is a conflict between space claims made by electrical engineering and mechanical engineering, or if there is a conflict in the installation execution plan in terms of planned access by different working crews.
More and more companies are also making use of laser scanning in order to get an accurate 3D model of what has actually been installed so far. This model can easily be compared with the design model to see if there are any deviations. If deviations are found, they can be acted upon: how will they impact the overall system if left as they are, or will they require re-design? Does the decision to leave them as they are change the performance of the overall system? Are we still able to perform the rest of the installation with less available space?
Answers to these questions might entail that we have to dismantle the parts of the system that have deviations. It is, however, a lot better and more cost-effective to identify such problems as early as possible.
 
This is just great, right? Such insights would have a huge impact on how EPCs manage their projects, how operators run their plants, and how product vendors can operate or service their equipment in the field, as well as on feeding information back to engineering to make better products.
​
New business models can be created along the lines of: "We sell power by the hour, dear customer; you don't even have to buy the asset itself!"
(Power-by-the-Hour is a trademark of Rolls-Royce; although the concept itself is 50 years old, you can read about a more recent development here.)
 
So why haven’t more companies already done it?
 
Because in order to get there, the underlying data must be connected, and in the form of… yes, data, as in objects, attributes and relationships. To be at its most effective, it requires a massive shift from document orientation to connected data orientation.
 
On the bright side, several companies in very diverse industries have started this journey, and some are already starting to harvest the fruits of their adventure.
​
My advice to any company thinking about doing the same would be along these lines:
When eating this particular elephant, do it one bite at a time, remember to swallow, and let your organization digest between bites.

Bjorn Fidjeland

The header image used in this post is by Elnur and purchased at dreamstime.com

​

Digitalization - sure, but on what foundation?

4/7/2017

For the last couple of years I've been working with some companies on digitalization projects and strategies. Digitalization is, of course, very attractive in a number of industries:

  • Equipment manufacturers, where digitalization can be merged with the Internet of Things to create completely new service offerings and relationships with customers
  • Capital project EPCs and operators, where a digital representation of the delivery can be handed over as a "digital twin" to the operator, who can hook it up to EAM or MRO solutions to monitor the physical asset in real time in a virtual world. The real value for the operator here is increased up-time and lower operational costs, whereas EPCs can offer new kinds of services and, in addition, mitigate project risks better.
  • Construction industry, where the use of VDC (Virtual Design & Construction) technology can be extended to help the facility owner minimize operational costs and optimize comfort for tenants by connecting all kinds of sensors in a modern building and adjusting accordingly.
But hang on a second: If we look at the definition of digitalization, at least the way Gartner views it

“Digitalization is the use of digital technologies to change a business model and provide new revenue and value-producing opportunities; it is the process of moving to a digital business.” (Source: Gartner)

…The process of moving to a digital business….

The digitalization strategies of most of the companies I've been working with focus on the creation of new services and revenue possibilities on the service side of the lifecycle of a product or facility, so AFTER the product has been delivered, or the plant is in operation.
There is nothing wrong with that, but if the process from design through engineering and manufacturing is not fully digitalized (by which I do not mean documents in digital format, but data as information structures linked together), then it becomes very difficult to capitalize on the promises of the digitalization strategy.
​
Consider two examples.
Figure 1.
​
Figure 1 describes a scenario where design and engineering tools work more or less independently, and where the result is consolidated in documents or Excel before being communicated to ERP. This is the extreme scenario to illustrate the point; most companies have some sort of PDM/PLM or engineering register to perform at least partial consolidation of data before sending it to ERP. However, I often find some design or engineering tools operating as "islands" outside the consolidation layer.

So let's switch viewpoint to the new digital service offering promoted to end customers. What happens when a sensor reports back a fault in the delivered product? The service organization must know exactly what has been delivered, where the nearest spare parts are, how the product is calibrated, and so on, in order to quickly fix the problem with a minimum of resources, make a profit, and exceed customer expectations to gain a good reputation.
​
How likely is that to happen with the setup in figure 1?

​
Figure 2.
​
The setup in Figure 2 describes a situation where design and engineering information is consolidated together with information about the actually delivered physical products. This approach does not necessarily dictate that the information is available in one and only one software platform; the essence is that the data must be structured and consolidated.

Again, let's switch viewpoint to the new digital service offering promoted to end customers. What happens when a sensor reports back a fault in the delivered product?
When data is available as structured and linked data, it is instantly available to the service organization, and appropriate measures can be taken while the customer is kept informed with accurate data.
​
My clear recommendation is that if you are embarking on a digitalization journey to enhance your service offering and offer new service models, make sure you have a solid digital foundation to build those offerings on. Because if you don't, it will be very difficult to achieve the margins you are dreaming of.
​
Bjorn Fidjeland


The header image used in this post is by kurhan and purchased at dreamstime.com

Plant Information Management – Operations and Maintenance

1/29/2017

This post is a continuation of the following posts in the Plant Information Management series:
“Plant Information Management - Installation and Commissioning”
“Handover to logistics and supply chain in capital projects”
“Plant Engineering meets Product Engineering in capital projects”
 “Plant Information Management - What to manage?”

During operations and maintenance, the two main structures of information needed in order to operate the plant in a safe and reliable manner are the functional (tag) structure and the physically installed structure.
The functional tag structure is a multidiscipline, consolidated view of all design requirements and criteria, whereas the physically installed structure is a representation of what was actually installed and commissioned, together with associated data. It is important to note that the physically installed structure evolves over time during operations and maintenance, so it is vital to make baselines of both structures together to obtain "As-Installed" and "As-Commissioned" documentation.
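A minimal illustration of that baselining idea: the two structures evolve independently, so an "As-Commissioned" record is a frozen snapshot of both, taken together at one point in time. The structures and values below are toy examples.
```python
import copy
from datetime import datetime

# Toy versions of the two structures; real ones would be far richer.
functional_structure = {"=AB.ACC01.IS01.VS04.EP03": {"duty_m3_h": 120}}
installed_structure = {"=AB.ACC01.IS01.VS04.EP03": {"serial": "AL11234-12-15"}}

def take_baseline(name: str) -> dict:
    """Freeze both structures together so the pair can be recalled later."""
    return {
        "name": name,
        "taken": datetime.now().isoformat(timespec="seconds"),
        "functional": copy.deepcopy(functional_structure),
        "installed": copy.deepcopy(installed_structure),
    }

as_commissioned = take_baseline("As-Commissioned")

# Later, during operations, the installed pump is replaced...
installed_structure["=AB.ACC01.IS01.VS04.EP03"]["serial"] = "BX99001-03-18"

# ...but the baseline still shows what was installed at commissioning.
print(as_commissioned["installed"]["=AB.ACC01.IS01.VS04.EP03"]["serial"])
```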
​
Figure 1.
​

Let’s zoom in on some of the typical use cases of the two structures.
Figure 2.
​

The requirements in the blue tag structure are fulfilled by the physical installation, the yellow structures. In a previous post I promised to get back to why they are represented as separate objects. The reason is that during operations one would often like to replace a physical individual on site with another physical individual. This new physical individual still has to fulfill the tag requirements, as the tag requirements (the system design) have not changed. In addition, we need full traceability not only of what is currently installed, but also of what used to be installed at that functional location (see Figure 3).
Figure 3.

Here we have replaced the vacuum pump during operations with another vacuum pump from another vendor. The new vacuum pump must comply with the same functional requirements as the old one, even though they might have different product designs.
This is a very common use case, where a product manufacturing company comes up with a new design a few years later. The product might be a lot cheaper and still fulfill the requirements, so if the operator of the plant has 500 instances of such products in the facility, it makes perfect sense to replace them when the old product nears end of life or requires extensive maintenance.
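A sketch of why the tag and the installed individual are kept as separate objects: the functional location keeps its requirements while installed individuals come and go, and the history of what used to be installed is preserved. The classes, dates and serial numbers are hypothetical.
```python
from dataclasses import dataclass, field

@dataclass
class Installation:
    serial_number: str
    installed: str             # date installed (ISO string for simplicity)
    removed: str | None = None

@dataclass
class FunctionalLocation:
    """The tag: its requirements are unchanged when the hardware is swapped."""
    tag_id: str
    requirements: dict
    history: list[Installation] = field(default_factory=list)

    def install(self, serial_number: str, on: str) -> None:
        if self.history and self.history[-1].removed is None:
            raise ValueError("Remove the current individual before installing a new one")
        self.history.append(Installation(serial_number, on))

    def remove_current(self, on: str) -> None:
        self.history[-1].removed = on

    def currently_installed(self) -> str | None:
        last = self.history[-1] if self.history else None
        return last.serial_number if last and last.removed is None else None

loc = FunctionalLocation("=AB.ACC01.IS01.VS04.EP03", {"duty_m3_h": 120})
loc.install("AL11234-12-15", "2015-06-01")   # original vendor's pump
loc.remove_current("2018-02-10")
loc.install("BX99001-03-18", "2018-02-11")   # replacement from another vendor
print(loc.currently_installed())
print([i.serial_number for i in loc.history])  # full history at this location
```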
 
Another very important reason to keep the tag requirements and the physically installed items as separate objects is if, or rather when, the operator wishes to execute a modification or extension project on the plant.
In such cases one must still manage and record the day-to-day operation of the plant (work requests and work orders performed on physical equipment in the plant) while at the same time running a plant design and execution project. This entails Design, Engineering, Procurement, Construction and Commissioning all over again.
Figure 4.
​

The figure shows that when the blue functional tag structure is kept separate from the yellow physically installed structure, we can still operate the current plant on a day-to-day basis and, at the same time, perform new design on the revised system (Revision B).
This allows us to execute all the processes right up until commissioning on the new revision, and once successfully commissioned, Revision B becomes operational.
​
This all sounds very good in theory, but in practice it is a bit more challenging, as change orders affecting the design of the previous revision may in the meantime have been made as a result of operations. This is one of the use cases where structured or linked data, instead of a document-centric approach, really pays off, because such a change order would immediately indicate that it affects the new design, and appropriate measures can be taken at an early stage instead of nasty surprises popping up during installation and commissioning of the new system.

Bjorn Fidjeland

The header image used in this post is by nightman1965 and purchased at dreamstime.com

Plant Information Management - Installation and Commissioning

1/27/2017

I realize that the last post “Handover to logistics and supply chain in capital projects” went quite a lot further in the information lifecycle than the headline suggested, so here is a brief recap on how structured and linked data can support processes during construction/installation and commissioning.

This post is a continuation of the following posts in the Plant Information Management series:
 “Handover to logistics and supply chain in capital projects”
“Plant Engineering meets Product Engineering in capital projects”
 “Plant Information Management - What to manage?”

Let’s jump in and follow the journey of the manufactured physical products as they move into installation and commissioning phases.
​
Figure 1.
Provided that the information from the different structures, and their context in relation to each other, is kept, it is possible to trace exactly which physical items should be installed where, corresponding to the tag requirements in the project (note: I've removed the connections from tag to EBOM in this figure for clarity).

We are now able to connect the information from tag =AB.ACC01.IS01.VS04.EP03, the one in the safety-classed area, to the physical item with serial number S/N: AL11234-12-15, which carries the documentation proving that it is fit for purpose in a safety-classed area.
As the other two tags are not in a safety-classed area and have no special requirements, either of the two remaining physical pumps can be used to fulfill their tag requirements; however, we still want full traceability for commissioning, operations & maintenance.
​
Figure 2.
Since we now have a connection between the tag requirements and the physically installed individuals, we can commence with the various commissioning tests and verify that what we actually installed works as intended in relation to what we designed (the plant system). Furthermore, we can associate certificates, commissioning documentation and processes with the physical individuals.

I'd like to come back to the reason for this split between the tag object and the physical item object in a future post on operations and maintenance.


Bjorn Fidjeland

The header image used in this post is by Satori13 and purchased at dreamstime.com


Handover to logistics and supply chain in capital projects

12/12/2016

This post is a continuation of the posts "Plant Engineering meets Product Engineering in capital projects" and "Plant Information Management - What to manage?".
​
As the last post dwelled on how EPCs and product companies are trying to promote re-use in very Engineer To Order (ETO) intensive projects, this post will focus on the handover to supply chain and logistics.

The relationship between the tag, containing the project-specific requirements, and the article or part, containing the generic product design, constitutes a project-specific demand that supply chain and logistics should know about. If both the tag and the connected part are released, a "signal" is sent with information regarding both the tag's requirements and the part's requirements.
A typical exception to this rule is Long Lead Items (LLI). I've seen this handled via a special process that allows transfer of the information to supply chain and logistics even if the specific tag has not been released.
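A hedged sketch of that release "signal" rule, including the long-lead-item exception: a demand is forwarded to supply chain only when both the tag and its connected part are released, or when the LLI route applies. The statuses, fields and queue are hypothetical.
```python
from dataclasses import dataclass

@dataclass
class Demand:
    """The tag/part relationship seen as a project-specific demand."""
    tag_id: str
    tag_released: bool
    part_number: str
    part_released: bool
    long_lead_item: bool = False

def forward_to_supply_chain(demand: Demand, queue: list[dict]) -> bool:
    """Send the signal when both objects are released, or via the LLI exception."""
    if (demand.tag_released and demand.part_released) or demand.long_lead_item:
        queue.append({
            "tag": demand.tag_id,
            "part": demand.part_number,
            "early_release": demand.long_lead_item and not demand.tag_released,
        })
        return True
    return False

queue: list[dict] = []
forward_to_supply_chain(Demand("=AB.ACC01.IS01.VS04.EP03", True, "P-100234", True), queue)
forward_to_supply_chain(Demand("=AB.ACC01.IS01.VS04.EP05", False, "P-200778", True,
                               long_lead_item=True), queue)
print(queue)
```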
Figure 1.
As the project-specific information regarding all three tags, and the intended use of the product design, is sent to logistics and supply chain, it is possible to distinguish which tags need special attention and which tags can be ordered "off the shelf".

Let's say that tag =AB.ACC01.IS01.VS04.EP03 is in a safety-classed area and the other two are not. The purchase order for the safety-classed tag must then inform the manufacturer that documentation regarding the manufacturing process must follow the produced individual that will be used to implement this specific tag, whereas the other two deliveries can come with standard documentation.
​
Figure 2.
Figure 2 depicts that all three manufactured products, or physical items with serial numbers, come from the same Engineering Bill Of Material, but that the individual with serial number S/N: AL11234-12-15 has some extra information attached.
This is because, since it is to be used in a safety-classed environment, the manufacturer must produce proof that the product fulfills the safety class requirements given on the tag. This could, for instance, be X-ray documentation showing that all welds are up to spec, or that the alloy used is of sufficient quality.
As you can see, if the information is kept as information structures, with relationships between the different data sets detailing the context in which each piece of information is used, it becomes possible to trace and manage it all in project-specific processes.
There are some other very important information structures that I mentioned in the post "Plant Information Management - What to manage?", like the Sales BOM (similar to the manufacturing industry's Manufacturing BOM), the Supply BOM and warehouse management; however, I would like to cover those in more detail in later posts.
​
For now let’s follow the journey of the manufactured products as they move into installation and commissioning.


Figure 3.
Provided that the information from the different structures, and their context in relation to each other, is kept, it is possible to trace exactly which physical items should be installed where, corresponding to the tag requirements in the project (note: I've removed the connections from tag to EBOM in this figure for clarity).

We are now able to connect the information from tag =AB.ACC01.IS01.VS04.EP03, the one in the safety-classed area, to the physical item with serial number S/N: AL11234-12-15, which carries the documentation proving that it is fit for purpose in a safety-classed area.
As the other two tags are not in a safety-classed area and have no special requirements, either of the two remaining physical pumps can be used to fulfill their tag requirements; however, we still want full traceability for commissioning, operations & maintenance.
​
Figure 4.
Since we now have a connection between the tag requirements and the physically installed individuals, we can commence with the various commissioning tests and verify that what we actually installed works as intended in relation to what we designed (the plant system). Furthermore, we can associate certificates and commissioning documentation with the physical individuals.
I'd like to come back to the reason for this split between the tag object and the physical item object in a future post on operations and maintenance.

Bjorn Fidjeland


The header image used in this post is by Nostal6ie and purchased at dreamstime.com

Plant Engineering meets Product Engineering in capital projects

9/30/2016

This post is a follow up of “Plant Information Management - What to manage?”.

It focuses on the collaboration needed between Plant Engineering (highly project-intensive) and Product Engineering, which ideally should be "off the shelf" or at least Configure To Order (CTO), but which in reality is, more often than not, Engineer To Order (ETO) or one-off.

More and more EPCs (Engineering, Procurement and Construction companies) and product companies exposed to project-intensive industries are focusing hard on ways to re-use product designs from one project to the next, or even within the same project, through various forms of configuration and clever use of master data; see "Engineering Master Data - Why is it different?".
​
However, we will never get away from the fact that the product delivery in a capital project will always have to fulfill specific requirements from Plant Engineering, especially in safety-classed areas of the plant.
If you look at the blue object structure, it represents a consolidated view of multidiscipline plant engineering. The system might consist of several pumps, heat exchangers, sensors, instrumentation and pipes, but we are going to focus on a specific tag and its requirements, namely one of the pumps in the system.
At some point in the plant engineering process, the design is deemed fit for project procurement to start investigating product designs that might fulfill the requirements stated in the plant system design.
If the plant design is made by an EPC that does not own any product companies, the representing product is typically a single article or part with associated preferred vendors/manufacturers who might be able to produce such a product or have it in stock. If the EPC does own product companies, the representing product might be a full product design; in other words, a full Engineering Bill Of Material (EBOM) for the product.
 
This is where it becomes very interesting indeed, because the product design (EBOM) is generic in nature. It represents a blueprint, or mold if you will, used to produce many physical products, or instances, of the product design. The physical products typically have serial numbers, and you can touch them. However, due to requirements from the owner/operator, the EPC will very often dictate both project- and tag-specific documentation from the product company supplying to the project, which in turn often leads to the product design being replicated X number of times to achieve compliance with the documentation requirements in the project (Documentation For Installation and Operations).
​
So, even if it is exactly the same product design, it ends up being copied each time there is a project-specific delivery. This often happens even if, say, 40 pumps are supplied by the same vendor to the same project, as responses to the requirements on 40 different tags in the plant design…
Needless to say, this results in a lot of Engineering Bills Of Materials just to comply with the documentation requirements in capital projects. Even worse, for the product companies it becomes virtually impossible to determine exactly what they have delivered each time, since it is a different Engineering Bill Of Materials every time, yet 97% of the information might be the same. The standardized product has now become an Engineer To Order product.
So how is it possible to avoid this monstrous duplication of work?
More and more companies are looking into ways of using the same data structures in different contexts. The contexts might be different deliveries to the same project, or deliveries across multiple projects, but if one is able to identify and separate the generic information from the information that needs to be project-specific, it is also possible to facilitate re-use.
​
The image above shows how a generic product design (EBOM) can fulfill three different project-specific tags, or functional locations, in a plant. Naturally, three physical instances with serial numbers must then be manufactured based on the generic product design, but since we have the link, or relationship, between the project-specific requirements (the tags) and the generic design (the EBOM), one can generate project-specific data and documentation without making changes to the generic representation of the product (the EBOM).
This approach even enables the product company to identify and manufacture the one pump that happens to sit in a safety-classed area of the plant design according to regulatory requirements, without having to change or duplicate the product design; however, more on that next time.
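A simplified sketch of that re-use pattern: one generic EBOM, several project-specific tag assignments linked to it, and project documentation generated from the relationship rather than from copies of the EBOM. Product ids, project names and record names are illustrative only.
```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EBOM:
    """Generic product design; never copied per project in this model."""
    product_id: str
    revision: str

@dataclass(frozen=True)
class TagAssignment:
    """The relationship from a project-specific tag to the generic EBOM."""
    tag_id: str
    project: str
    safety_classed: bool
    ebom: EBOM

def generate_tag_documentation(assignment: TagAssignment) -> dict:
    """Derive project-specific documentation from the relationship, not from a copy."""
    doc = {
        "project": assignment.project,
        "tag": assignment.tag_id,
        "product_design": f"{assignment.ebom.product_id} rev {assignment.ebom.revision}",
        "required_records": ["datasheet", "installation manual"],
    }
    if assignment.safety_classed:
        doc["required_records"] += ["weld X-ray report", "material certificates"]
    return doc

pump_design = EBOM("PUMP-75KW", "C")
assignments = [
    TagAssignment("=AB.ACC01.IS01.VS04.EP03", "Project Alpha", True, pump_design),
    TagAssignment("=AB.ACC01.IS01.VS04.EP04", "Project Alpha", False, pump_design),
]
for a in assignments:
    print(generate_tag_documentation(a))
```
The generic EBOM stays untouched; only the relationship and the generated, project-specific records differ between deliveries.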
 
Bjorn Fidjeland


The header image used in this post is by Nostal6ie and purchased at dreamstime.com

Managing Documentation For Installation and Operations

5/1/2016

In one of my previous articles, "Plant Information Management - What to manage?", I wrote about the different information structures needed in a plant project from early design through commissioning and operations.
​
The article left some questions hanging. One of them was: how can all this information be consolidated, managed and distributed to the various stakeholders in a plant project at the right time and with the right quality?

Traditionally this has been called LCI, or LifeCycle Information, at least in the Norwegian oil & gas industry; internationally it is known as DFI/DFO, Documentation For Installation / Documentation For Operations. In short, it is the operator's requirements and needs for information from early design through engineering, procurement and construction, up to and including commissioning. The requirements cover safety and regulatory matters, as well as information the operator finds important in order to control, monitor and guide the progress of the project executed by the EPC.
Figure 1.
As the figure describes, the operator drives the expectations for deliveries in terms of standardization, safety & regulatory compliance, and the documentation needed to operate and maintain the plant after commissioning. All stakeholders in the value chain must abide by these requirements, and it is usually the EPC that has the task of coordinating and consolidating this mountain of information. A successful commissioning includes the operator confirming that it has received all documentation and information required to operate the plant in a safe and regulatory-compliant manner. At this point the EPC is released from the project.
In theory, the documentation handover would look like the figure above; however, operators' experience has told them that this seldom works well. Therefore, a much more frequent information exchange between the EPC and the operator is required leading up to commissioning. The main reason is that it enables the operator to monitor, check and verify progress in the project. It also makes for a more gradual build-up and maturing of documentation in the project. For the EPC, it means frantic activity at each milestone to secure all required documentation from its own engineering disciplines and from all external companies in the project's value chain (see the pyramid in Figure 1).

Traditionally, a whole host of LCI coordinators has been needed, both on the EPC side and on the operator side, to make sure that all documentation is present, and if not, to make sure it is created… The very "best" LCI coordinators on the EPC side manage to produce the information without "bothering" engineering too much. It has largely been a document-centric process, separated from the plant & product engineering process.
 
As long as EPCs are only active in one country, this approach is manageable for them; however, once they go global, they find themselves having to deal with many different safety standards, regulatory standards and, last but not least, varying requirements and formats from different operators. Even product companies and module suppliers delivering to projects in different parts of the world experience the same thing.

In recent years I've seen more and more interest in leaving the document-centric approach for a more data-centric one. This means that data is created and consolidated from the various disciplines into data structures, as described in the article "Plant Information Management - What to manage?", and that the LCI process becomes an integral part of the engineering, procurement, construction and commissioning processes instead of being a largely separate one.

Of course, there are varying strategies among companies when it comes to how much to manage and how to hand it over.
  • Some create data structures in PLM-like platforms, consolidate them, manage changes and transfer data to the other stakeholders in the projects via generated transmittals. This is similar to the document-centric approach, only more automated.
  • Some companies target re-use from project to project, in addition to the aspects mentioned above, by creating data structures in catalogs that can be selected in other projects as well. The selected data structure is then replicated in a project-specific context and gets auto-generated project-specific information such as tags and documentation.
  • Others remove or reduce the need for transmittals and document handovers by letting the project stakeholders work and deliver information directly in their platform instead of handing over documents.
  • One approach was to not hand over documents at all, but simply give the operator access to the platform, link the information from the data structures as deliverables to the milestones the operator required, and then hand over the entire platform to the operator as Documentation For Operations after successful commissioning.

​Bjorn Fidjeland


The header image used in this post is by Norbert Buchholz and purchased at dreamstime.com

Why manage multiple BIMs together?

2/7/2016

In the construction industry today there is a lot of talk, and quite a few interesting initiatives, around how companies can utilize VDC (Virtual Design & Construction) and BIM (Building Information Model or Modelling) to bring down the cost of projects, or to estimate the cost of projects more accurately; that is to say, to avoid some of those nasty surprises that so often occur at some point in construction projects.

One company has taken the following approach: they use VDC very actively in the proposal phase, where they create several project alternatives in the virtual world with BIM models, and then use VDC software to simulate optimal cost compositions, both in terms of materials used and labor. In addition, several different project execution schedules are tested.
The philosophy is that one should weed out the errors and mistakes commonly found in construction projects in the virtual world, where it is cheap, instead of at the construction site, where it is very expensive.

If you can understand Danish you can learn more here: MT Højgaards lille revolution

I would agree that this approach will optimize and help a lot in the actual project. It is, in my view, a big step in the right direction. However, big projects today are still executed in "silos". There is very little communication between projects and little re-use of concepts, knowledge, data or documentation. In other words, each project is its own world, and one becomes very dependent on knowledgeable key people to pull off successful projects. Historically this is not so strange (think about the Master Builders), and the same thing is true for plant-related industries; however, some interesting changes are happening there.
More and more EPCs (Engineering, Procurement and Construction companies) have moved in a direction where concepts, processes, knowledge, engineering data, documentation, and supply and logistics data are shared across projects as a form of structured master data. This is achieved by defining the smaller, modular building blocks that are needed in projects and allowing them to be adjusted in the project-specific context. Each project is of course still a unique "one-off", but with more and more re-use, traceability and controlled change.

So how could this translate to construction industry?

Well, I think a first step would be to manage the portfolio of projects and BIM models together in some sort of information backbone. This could be a PLM-like solution (there are examples of those) where data can be analyzed, managed and executed with process support across all projects. You can read more about this in one of my earlier blog posts: "VDC is that like PLM for construction industry". Such a solution would also bring control to the thousands of documents that still specify the projects and the BIM models themselves. An approach like this would be a big step towards linked or structured data.
Another approach could be more federated; however, it would still require some sort of engine to orchestrate data and information from all the systems involved in all the projects.

If you'd like to read more about a data-oriented approach, I would recommend these three articles written by Jos Voskuil:
“The difference between files and data-oriented – a tutorial part 1”
“The difference between files and data-oriented – a tutorial part 2”
“The difference between files and data-oriented – a tutorial part 3”

​Bjorn Fidjeland

The image used in this post is by Adam121 and purchased at dreamstime.com

