But one of the points Henning made struck me as key to helping organizations benefit from analytics. He made it in the context of predictive analytics, but I believe it applies to analytics generally. He said that one of the things that surprised him was that the hardest part of getting the technology to work was not figuring out the AI algorithms but getting the input data right.
In the spot welding case study, the challenge had something to do with making sure that production data was correctly aligned to time stamps. At least that's my 30,000-foot understanding of the issue, since I'm definitely not an OT guy. The point is that there wasn't anything really wrong with how they had historically collected data, but they needed it to be a bit more accurate to support predictive analytics. So it wasn't really a case of garbage in, garbage out; it was more a need to improve an existing big data set to support a new use case: predictive analytics.
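To make the alignment idea concrete, here is a minimal sketch of matching sensor readings to production events by nearest timestamp. All names, values, and the tolerance parameter are invented for illustration; the actual case study's data and method are not described in detail here.

```python
# Hypothetical sketch: align sensor readings to weld events by nearest
# timestamp. Event times and readings are invented example data.
from bisect import bisect_left

def align_to_events(event_times, readings, tolerance=1.0):
    """Match each (timestamp, value) reading to the closest event
    within `tolerance` seconds; unmatched readings are dropped."""
    aligned = {}
    for ts, value in readings:
        i = bisect_left(event_times, ts)
        # candidates: the event just before and just after the reading
        candidates = event_times[max(i - 1, 0):i + 1]
        if not candidates:
            continue
        nearest = min(candidates, key=lambda t: abs(t - ts))
        if abs(nearest - ts) <= tolerance:
            aligned.setdefault(nearest, []).append(value)
    return aligned

events = [10.0, 20.0, 30.0]  # weld event timestamps (seconds)
readings = [(9.8, 410), (10.3, 415), (25.0, 390), (29.9, 402)]
print(align_to_events(events, readings))
# → {10.0: [410, 415], 30.0: [402]}  (25.0 is too far from any event)
```

Even a toy version like this shows why small timestamp errors matter: a reading that drifts outside the tolerance window simply never gets attributed to its event.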
This is something we see all the time in our business when we discuss maintenance optimization with our customers. Analytics are an important first step because they give the business guidance on where to start in addressing maintenance optimization and data issues. To find out more about how to run an effective Data Cleansing Project, read our whitepaper.
But the number one issue our customers run into is that although there is a lot of good information in the historical work orders, the data is not consistent, perfectly accurate, or amenable to easy analysis. It's not garbage, but it's not ready for analysis yet. To prepare historical work order data for analysis, you first need to undertake a data quality improvement exercise. Being able to load the data into a staging area and correct data quality issues efficiently prior to analysis is extremely useful. Some of the data quality issues result from human error or lack of training, for example, not entering failure information correctly or not tracking hours worked accurately enough. But many of the historical work order data issues we see aren't errors or garbage, just the result of inconsistency across large data sets: businesses that have grown by merger and acquisition, different countries with different standards for doing work, or ongoing improvements to maintenance processes and practices that result in data being collected differently. In any case, the data needs to be standardized for effective analysis.
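The staging-and-standardize step described above can be sketched in a few lines. This is a simplified illustration, not any particular tool's implementation: the failure codes, the mapping table, and the record layout are all invented examples of the kind of cross-site inconsistency described in the text.

```python
# Hypothetical sketch: standardizing inconsistent failure codes in a
# staging area before analysis. Codes and the mapping are invented.
RAW_ORDERS = [
    {"id": 1, "site": "DE", "failure": "LKG"},       # one plant's code
    {"id": 2, "site": "US", "failure": "LEAK"},      # another plant's code
    {"id": 3, "site": "US", "failure": "leak"},      # same code, wrong case
    {"id": 4, "site": "UK", "failure": "Seal leak"}, # free-text entry
]

# Mapping table built up during the data quality improvement exercise
CODE_MAP = {"lkg": "LEAKAGE", "leak": "LEAKAGE", "seal leak": "LEAKAGE"}

def standardize(orders, code_map):
    """Return (cleaned, review): cleaned copies with failure codes mapped
    to one standard, plus unmapped records flagged for manual review."""
    cleaned, review = [], []
    for order in orders:
        key = order["failure"].strip().lower()
        if key in code_map:
            cleaned.append({**order, "failure": code_map[key]})
        else:
            review.append(order)  # never guess: route to a human
    return cleaned, review

cleaned, review = standardize(RAW_ORDERS, CODE_MAP)
```

The design point is the `review` list: records the mapping can't resolve are surfaced for a person to classify rather than silently forced into a category, which is what keeps "not garbage, but not ready" data from becoming garbage.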
To find out more about how HubHead can help you with historical work order analysis contact us and book a meeting.