It is safe to say that a lot is riding on asset data. Asset data is required for proper maintenance and operation, and proper maintenance and operation are essential to safety and production.

One obvious challenge is getting the data right in the first place.

In capital projects, data challenges become apparent when ownership of asset data is transferred from engineering contractors to the Owner/Operator. The transfer often occurs abruptly near the commissioning of the asset, a pattern commonly known as the "dump truck effect".

The Owner/Operator is handed an overwhelming volume of data just before operation, leaving little or no time for feedback or structured validation. When data quality issues are finally identified, the contractors have typically already moved on to another project.

[Figure: the asset lifecycle]

To avoid the “dump truck” effect, organizations in asset-intensive industries are adopting cloud-based technologies that enable continuous data handover. The cloud acts as a data hub that brings contractors and the Owner/Operator together for ongoing data acquisition and building activities. Engineering tags are converted to a maintenance hierarchy, spare parts are allocated to equipment, and maintenance planning is performed on critical assets, all in a data hub that promotes connectivity and transparency. Data integrity checks, such as detecting missing BOMs on safety-critical equipment, are assisted by automation, allowing people to focus on actual maintenance planning. Data is validated as it is developed, and it all starts as early as the front-end engineering design (FEED) phase that follows conceptual development of the asset.
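
To make that kind of automated integrity check concrete, here is a minimal sketch in Python. It assumes a simple in-memory record per equipment tag; the Equipment class, tag names and part numbers are illustrative, not taken from any specific product.

```python
from dataclasses import dataclass, field

@dataclass
class Equipment:
    tag: str                       # engineering tag, e.g. "P-101A" (illustrative)
    safety_critical: bool = False
    bom_items: list[str] = field(default_factory=list)   # spare-part numbers on the BOM

def missing_bom_report(equipment: list[Equipment]) -> list[str]:
    """Return tags of safety-critical equipment with no BOM assigned."""
    return [e.tag for e in equipment if e.safety_critical and not e.bom_items]

# Tiny example: the first pump is flagged, the transmitter with a BOM passes.
fleet = [
    Equipment("P-101A", safety_critical=True),                       # no BOM -> flagged
    Equipment("PT-2001", safety_critical=True, bom_items=["SP-778"]),
    Equipment("HV-3005"),                                            # not safety critical
]
print(missing_bom_report(fleet))   # ['P-101A']
```

In a real data hub the same rule would run against the shared asset register on every update, so gaps surface while the contractors who own the data are still engaged.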

Employing an open approach to asset data management also creates opportunities for reuse. There are, for example, many instances where the same equipment is deployed in multiple parts of the organization, or even in multiple parts of the same plant. There is no reason to reinvent the way identical equipment is maintained when operating conditions are comparable. The open approach I have been describing enables experts in different parts of the organization to share data templates and best practices.
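
As a rough illustration of that reuse, the sketch below instantiates maintenance plans for identical pumps at two sites from one shared template. The template fields, model number, tags and site names are hypothetical.

```python
# Hypothetical shared template for one pump model, reused wherever that model is installed.
MAINTENANCE_TEMPLATE = {
    "model": "XYZ-CentrifugalPump-200",
    "tasks": [
        {"name": "Vibration check", "interval_weeks": 4},
        {"name": "Mechanical seal replacement", "interval_weeks": 52},
    ],
    "spare_parts": ["SEAL-200", "BEARING-6204"],
}

def apply_template(tag: str, site: str, template: dict) -> dict:
    """Instantiate a maintenance plan for one equipment tag from a shared template."""
    return {
        "tag": tag,
        "site": site,
        "model": template["model"],
        "tasks": [dict(task) for task in template["tasks"]],   # copy, so plans can diverge later
        "spare_parts": list(template["spare_parts"]),
    }

# The same template drives plans for identical pumps at two different plants.
plans = [
    apply_template("P-101A", "Plant North", MAINTENANCE_TEMPLATE),
    apply_template("P-405B", "Plant South", MAINTENANCE_TEMPLATE),
]
print(plans[0]["tasks"][0])   # {'name': 'Vibration check', 'interval_weeks': 4}
```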

To measure data quality before it is too late, contractors and Owner/Operators have established data completeness objectives for each phase of the asset life cycle. An example is the completeness of the asset data hierarchy by the end of FEED. Such measurements can only be made when data is shared and continuously fine-tuned in a central environment.
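
One way such a completeness measurement could be computed is sketched below, using made-up tag sets and an illustrative 90% objective; the function and thresholds are assumptions for the example, not a prescribed metric.

```python
# Hypothetical completeness check: the share of engineering tags that have been
# placed in the maintenance hierarchy by the end of a lifecycle phase such as FEED.
def hierarchy_completeness(engineering_tags: set[str], mapped_tags: set[str]) -> float:
    """Fraction (0.0-1.0) of engineering tags already mapped into the hierarchy."""
    if not engineering_tags:
        return 1.0
    return len(engineering_tags & mapped_tags) / len(engineering_tags)

tags_from_feed = {"P-101A", "P-101B", "PT-2001", "HV-3005"}   # illustrative tag list
tags_in_hierarchy = {"P-101A", "PT-2001"}

score = hierarchy_completeness(tags_from_feed, tags_in_hierarchy)
objective = 0.90   # example objective: 90% of the hierarchy built by the end of FEED
print(f"Completeness {score:.0%} vs objective {objective:.0%}: "
      f"{'on track' if score >= objective else 'gap to close'}")
```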

The best is yet to come.

The cloud is the ideal place for a global community of asset data experts to collaborate and share lessons learned. As the cloud continues to absorb asset data and usage behaviours, it is only a matter of time before patterns emerge around how people maintain and operate their assets. These are patterns that go beyond what is documented in industry standards and user manuals. With such patterns, the cloud will be able to make intelligent recommendations for data building, remediation and validation.

Considering the importance of data quality, it is difficult to justify the traditional approach of managing data in silos. The open approach offers many benefits today while laying the groundwork for exciting possibilities.

 
Alfred Yang | @Alfredhubhead
Vice-President, Global Customer Service
 
