Nine use cases that justify more data for your business


Common refrains for businesses operating in a data-driven world include: “Where’s the data?”, “Could we just see the data?”, and “Can you back it up with data?” Very few organizations operate these days without relying on data, to some degree, for their success. Yet some still debate whether to improve data collection and reporting, or worry about how to defend the cost if they do.

It’s increasingly clear that not investing in data will cost more—in terms of profit, efficiency, and brand reputation—than doing so. With this in mind, we offer nine use cases to help you justify expanded data investment to your stakeholders and accounting team. Improving your corporate relationship with data can result in:

  • Improved quality management
  • Better inventory management
  • Reduced reporting costs
  • Real two-way communication between head office and the plant floor
  • Improved insights
  • Better linkage between discrete parts of the factory floor
  • Quick responses to changing conditions
  • Improved energy efficiency
  • Improved maintenance planning and scheduling

Let’s review these in detail.

[Image: Making glass bottles]

Improved quality management

From a data perspective, quality management means two things: how accurately organizations measure and record product quality on the factory floor, and whether they properly report quality issues when they arise.

With respect to measuring and recording, organizations that use paper-based records for any aspect of quality management are wasting money. Paper-based recording requires double entry: first to paper, then from paper into a digital system. This is expensive and time-consuming, and because it is also error-prone, it adds still more unnecessary cost. In Cost of Quality: Not Only Failure Costs, Arne Buthmann notes that quality-related costs quickly add up in the form of rework, delays, and lost customer goodwill, to name only a few.

Timely reporting is critical. For example, the average recall in the food manufacturing industry costs a staggering $10 million, and the longer a recall takes to be announced, the more expensive it becomes: more compromised product ends up in-market, more people get sick, and so on. The same phenomenon applies across industries: the longer it takes to identify quality issues in your plant, the bigger the final cost of a recall. Tools that centralize data collection and provide insightful, real-time reports are critical for accurate and effective quality management.

Better inventory management

If you use Excel for inventory management, you probably know how quickly it can become an endless loop of unnecessary complexity. If you use paper-based inventory management, the problem is amplified further.

An inventory-entry system with intuitive digital forms usable across devices (including mobile) will get the right data into your system. Centralizing this data is also critical, not only to ensure you work from a single source of truth, but also to make it easier for staff to analyze the entire operation rather than only small parts of it.

Inventory ages, deteriorates, and spoils. Poor management decisions, often driven by missing or inaccurate data, result in real losses. American retailers took an estimated $300 billion in inventory markdowns in 2018, and inventory obsolescence typically accounts for 6-12% of a company’s inventory, even in non-pandemic years. Identifying inventory issues, minimizing working capital, and reporting progress to staff so they can quickly improve the situation will save you money both today and over the long term. Given that most companies have been destabilized to some degree by COVID-19, data accuracy and efficiency of use are even more crucial.

Reduced reporting costs

Reporting costs incurred for finance, supply chain, or operations can be hefty. One 3AG client had 300 employees split between head office and its main operating site; at least three full-time resources were dedicated to creating weekly status reports. This work involved downloading data snapshots from various systems (immediately rendering them obsolete), collecting and electronically transcribing paper records, and maintaining fragile Excel spreadsheets (the loss of which would have been catastrophic). The team was performing a wide range of error-prone manual data entry and cleaning tasks, a waste of the skills of those involved.

Just as importantly, this reporting work consumed over 1% of the company’s total operating budget—and the opportunity cost was much higher.

[Image: Industrial juicing machine]

Real two-way communication between head office and the plant floor

When you “download to Excel” or perform any other data export, you get a snapshot of that data at a point in time. This made sense when collecting data required herculean effort, and keeping historical records remains important. But in our digital world, we want everything in real time, and we have the tools and technology to get it that way. We expect stock market information, traffic conditions, and weather forecasts to be correct and current right now; corporate information should be just as accurate and up to date.

Traditionally, the cost and effort to deliver a company’s real-time operations and finance data were just too high. With modern data warehouse and reporting tools, however, this concern is moot. If staff are telling you otherwise, your reporting gatekeepers may require extra training or better tools to get them—and your business—up to speed.

Improved insights

Having more data will give your team better insights. Adding a vibration sensor to the factory floor could reveal periodic excess vibrations correlated with a particular load. A maintenance team entering more detailed descriptions into ticket resolutions might identify a subset of repairs resolvable with higher-quality screws. And so on. Increasing the amount of data you collect about your plant or broader operations improves the resolution of your reporting.

That said, increasing data volume will also increase analysis cost and complexity. Not only will it take more time to wade through more data, but you might also struggle to determine if a particular dataset has a signal buried in the noise, or vice versa.

To manage the increased data load, you need the right infrastructure to make your data accessible, accurate, and up to date; you also need to be able to determine whether the patterns you find are statistically significant. Add accessible reporting, and your team will be truly competitive in a crowded, data-first market.
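As a minimal sketch of what such a significance check might look like, consider the vibration example above. Everything here is illustrative: the readings, the 0.05 threshold, and the variable names are assumptions, not data from a real plant.

```python
# Minimal sketch: is excess vibration actually correlated with machine load,
# or is the apparent pattern just noise? All numbers are invented for illustration.
from scipy.stats import pearsonr

# Hourly load (tonnes) and vibration (mm/s) readings from a hypothetical line
load      = [12.1, 14.8, 13.5, 18.2, 17.9, 19.4, 11.7, 16.3, 15.0, 18.8]
vibration = [ 2.1,  2.9,  2.4,  4.1,  3.8,  4.5,  1.9,  3.3,  3.0,  4.2]

r, p_value = pearsonr(load, vibration)
print(f"correlation r = {r:.2f}, p-value = {p_value:.4f}")

# Only surface the pattern to the maintenance team if it is unlikely to be noise.
if p_value < 0.05:
    print("Vibration appears to track load in this sample -- worth a closer look.")
else:
    print("No statistically significant relationship in this sample.")
```

In practice, the same check would run against your centralized data warehouse rather than hard-coded lists, but the principle holds: quantify whether a pattern is signal before acting on it.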

Better linkage between discrete parts of the factory floor

Adding more data can improve head office’s view of the factory floor; more data can also, crucially, link discrete parts of the factory floor to one another. Some floor connections should be automatic: for example, sensor readings on one side of the plant should automatically adjust a downstream machine via a programmable logic controller (PLC) or manufacturing execution system (MES).
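As a rough illustration of that kind of automatic linkage, the sketch below polls an upstream flow sensor and nudges a downstream machine’s speed setpoint toward a target. The functions, target values, and gain are hypothetical placeholders for whatever PLC, MES, or historian interface your plant actually exposes.

```python
# Sketch of an automatic upstream-to-downstream adjustment loop.
# read_upstream_flow() and set_downstream_speed() are hypothetical stand-ins
# for a real PLC/MES integration layer; all numbers are illustrative only.
import random
import time

TARGET_FLOW = 100.0  # units per minute the downstream machine is tuned for
GAIN = 0.5           # how aggressively to correct deviations from target

def read_upstream_flow() -> float:
    # Placeholder: a real plant would query a sensor via its PLC or historian.
    return random.gauss(TARGET_FLOW, 10.0)

def set_downstream_speed(speed_pct: float) -> None:
    # Placeholder: a real plant would write a setpoint to the downstream machine.
    print(f"downstream speed setpoint -> {speed_pct:.1f}%")

def control_loop(cycles: int = 5, poll_seconds: float = 1.0) -> None:
    speed_pct = 100.0
    for _ in range(cycles):
        flow = read_upstream_flow()
        # Proportional correction: speed up when upstream overfeeds, slow down otherwise.
        speed_pct += GAIN * (flow - TARGET_FLOW)
        speed_pct = max(0.0, min(120.0, speed_pct))  # clamp to a safe operating range
        set_downstream_speed(speed_pct)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    control_loop()
```

In a real deployment this logic typically lives in the PLC or MES itself; the point is simply that richer sensor data makes this kind of closed-loop linkage possible.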

Adding more data and building use-specific, real-time reports for floor workers can have a huge impact. Without accurate, timely on-floor data and connections between departments, workers may never have more than a general sense of how the plant as a whole is operating. This can mean something as minor as a floor employee noticing that her station’s feed is slower than usual, but it can also mean major issues, like not realizing the plant is shutting down until the shutdown reaches her work area.

Global plant operation dashboards at all workstations can keep floor employees informed. With a better feel for the plant, they will begin noticing patterns and making better suggestions for improving efficiency.

[Image: Man in a factory]

Quick responses to changing conditions

The sooner a problem is identified, the sooner it can be fixed; this is at the heart of modern data reporting. It’s not enough, however, simply to make source data visible. If it takes significant effort to read the data, or to analyze it well enough to understand what’s happening, you lose the benefit of responding quickly.

To detect and take advantage of rapid changes, whether common or rare, you need solid reporting in place. Consider the collision-avoidance auto-braking systems in most new cars: they interpret signals coming from many sensors. They handle a lot of data, but if they can’t interpret incoming readings quickly enough to flag a dangerous pattern in time, they are effectively useless. Further, the triggers (braking rapidly to avoid a collision) are rare events; the system must monitor constantly without sending either the driver or the braking system unnecessary information.
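The same principle applies to plant reporting: monitor continuously, but only alert on readings that are genuinely unusual. Below is a minimal sketch of that idea using a rolling baseline; the sensor stream, window size, and threshold are all invented for illustration.

```python
# Sketch: flag only statistically unusual readings in a continuous stream,
# so operators are not flooded with noise. All values here are illustrative.
from collections import deque
from statistics import mean, stdev

WINDOW = 30        # number of recent readings kept as the rolling baseline
Z_THRESHOLD = 4.0  # how far from baseline a reading must be before we alert

def monitor(readings):
    history = deque(maxlen=WINDOW)
    for value in readings:
        if len(history) >= 5 and stdev(history) > 0:
            z = (value - mean(history)) / stdev(history)
            if abs(z) > Z_THRESHOLD:
                yield value, z  # rare event: surface it to the operator
        history.append(value)

# Simulated sensor stream: steady around 50 with one sudden spike
stream = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2, 50.1, 78.4, 50.0, 49.7]
for value, z in monitor(stream):
    print(f"ALERT: reading {value} is {z:.1f} standard deviations from baseline")
```

The details matter less than the design choice: the system watches everything, but only rare, significant deviations reach a human.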

Data access, statistical measurements of data relevance, and infrastructure to manage all this data together enable rapid responses to changing conditions. More data is better, as long as you have properly designed reporting set up to manage it.