Is there 20% excess inventory hidden in your data?

22 Sep, 2020


Supply chain management is all about ferreting out inefficiencies. In the quest for a 100% perfect order rate, though, one very powerful factor is often overlooked: data quality.

These days, the world is running lean. Customer lead-times are getting shorter, as expectations for on-time delivery are increasing. For years, supply chain managers have focused on increasing the accuracy of demand forecasts and optimizing inventory levels. To do that effectively, you need to have a firm grasp on your data.

Volatile demand is the enemy of smoothly operating supply chains, and in the age of COVID-19, volatility is everywhere. As companies scramble to deliver goods at a higher velocity than ever before, both the risk and the impact of data quality issues increase significantly.

Garbage in, garbage out.

What’s the impact? At best, you’re running your business based on inaccurate planning parameters. At worst, you end up with obsolete inventory that diminishes in value by the day.

If you don’t have a proactive plan in place to manage data quality (DQ), you’re missing an opportunity to drive increased efficiency in your supply chain.

How Data Quality Impacts SCM

To meet customer demand and service levels, the right inventory must be positioned in the right place at the right time. Given the number of SKUs, suppliers, warehouse locations, and product attributes that most companies must deal with, there's a lot that can go wrong.

If recorded stock levels are incorrect, there are obvious implications for your organization's ability to fill orders. If the system thinks stock is lower than it really is, you miss opportunities to sell inventory that is actually on hand. If it thinks stock is higher, you promise orders you can't fill and fail to meet customer expectations.

If you use safety-stock calculations or inventory optimization tools, the problem gets worse: errors in the input data can produce wildly inaccurate safety stock levels. That distorts planning, and it increases the likelihood of shortages or excess inventory.
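To see how a single bad parameter propagates, consider the textbook safety-stock formula, z · σ(demand) · √(lead time). The formula is standard; the numbers below are purely illustrative. A lead-time typo of 40 days instead of 4 silently inflates the computed safety stock by a factor of √10, roughly 3.2x:

```python
import math

def safety_stock(z: float, demand_std: float, lead_time_days: float) -> float:
    """Textbook safety stock: z * sigma_demand * sqrt(lead_time)."""
    return z * demand_std * math.sqrt(lead_time_days)

z = 1.65          # service factor for roughly a 95% service level
demand_std = 120  # demand standard deviation, units per day

correct = safety_stock(z, demand_std, lead_time_days=4)   # true lead time
garbled = safety_stock(z, demand_std, lead_time_days=40)  # data-entry error

# The bad record inflates safety stock by sqrt(40/4) = sqrt(10) ~ 3.16x,
# and every downstream reorder calculation inherits the error.
inflation = garbled / correct
```

Nothing in the planning system looks "broken" here; the math runs cleanly on a wrong input, which is exactly why a data-quality check on the parameter itself is needed.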

Planning is supposed to increase efficiency. Planning with bad data does the opposite; it compounds the very problem that inventory management is supposed to solve.

Proactive data quality management attacks this problem head-on.  By preemptively monitoring inventory levels and verifying that they fall within expected ranges, SCM managers can guard against the kind of compound errors that result from poor data quality.
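Monitoring of this kind can start as a simple rule set that flags records falling outside expected ranges. A minimal sketch, where the field names and thresholds are hypothetical, not part of any particular system:

```python
# Each rule maps a field name to the (min, max) range it is expected to fall in.
EXPECTED_RANGES = {
    "on_hand_qty": (0, 10_000),
    "lead_time_days": (1, 60),
}

def check_record(record: dict) -> list:
    """Return human-readable violations for one inventory record."""
    violations = []
    for field, (lo, hi) in EXPECTED_RANGES.items():
        value = record.get(field)
        if value is None or not (lo <= value <= hi):
            violations.append(f"{record['sku']}: {field}={value} outside [{lo}, {hi}]")
    return violations

records = [
    {"sku": "A-100", "on_hand_qty": 250, "lead_time_days": 14},
    {"sku": "B-200", "on_hand_qty": -3, "lead_time_days": 400},  # bad data
]
alerts = [v for r in records for v in check_record(r)]
```

In practice these alerts would feed a workflow (notify the owner, block the record from planning runs) rather than just a list, but the rule itself stays this simple.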

Effective DQ management, driven by automated alerts and workflows, provides a systematic quality control framework that can benefit the entire SCM process from end-to-end.

“When DQ goes high, inventory goes low.”

Stock Attributes

Incorrect inventory quantities are just one example.  Here’s another: inaccurate product attributes will result in bad MRP calculations.

Imagine that your company employs a mix of build-to-stock, configure-to-order, and build-to-order strategies. If an item is incorrectly flagged as build-to-stock, your planning and MRP systems may prompt you to produce inventory that is not needed; it should not be produced unless or until it shows up on a customer purchase order. With bad data, you end up making product that sits on the shelf.
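The effect of that one wrong flag can be sketched in a few lines. In this simplified (and hypothetical) MRP pass, a build-to-order item generates supply only against open customer orders, while a build-to-stock item is replenished up to a target; mis-flagging the item silently creates supply nobody asked for:

```python
def planned_order_qty(item: dict, open_customer_orders: int) -> int:
    """Simplified MRP logic: build-to-stock replenishes up to a target level;
    build-to-order only produces against actual customer demand."""
    if item["strategy"] == "build_to_stock":
        return max(item["target_stock"] - item["on_hand"], 0)
    return max(open_customer_orders - item["on_hand"], 0)

item = {"sku": "C-300", "on_hand": 20, "target_stock": 500}

# Correct flag: no open customer orders, so nothing should be produced.
correct = planned_order_qty({**item, "strategy": "build_to_order"},
                            open_customer_orders=0)

# Wrong flag: MRP dutifully plans 480 units of product that will sit on the shelf.
wrong = planned_order_qty({**item, "strategy": "build_to_stock"},
                          open_customer_orders=0)
```

A real MRP run nets across lot sizes, lead times, and pegged demand, but the failure mode is the same: the system executes the flag it is given, not the one that is true.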

Likewise, if the numbers for safety stock, lead-time, forecast variability, or supply lead-time are incorrect, it will drive bad decisions that make your inventory problems even worse.

Consistency of Inventory Flags

Products move through a lifecycle: from new to active, then to discontinued or inactive. In the "new product" stage, a good deal of attention is focused on getting the data right. Later in the product lifecycle, things often go wrong.

Imagine, for example, that your company has a policy of maintaining discontinued inventory only at distribution centers or other upstream locations. If an item is incorrectly flagged as active, or if there is a delay in flagging it as discontinued, then inventory of that product will be pushed downstream when it shouldn’t be. 

The result? Inventory accumulates at downstream locations, where the risk of it becoming obsolete is significantly higher. That translates into lost margin, something a proactive DQ management program could have prevented.

Well-defined data exception rules, combined with a system of alerts and workflows to ensure timely updates of inventory flags, can ensure that the company doesn’t continue to build inventory for discontinued products.
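An exception rule for the scenario above is straightforward: flag any item marked discontinued that still holds stock at a downstream location. A hypothetical sketch, with illustrative location types and field names:

```python
# Policy: discontinued stock may sit only at upstream locations (e.g. DCs).
DOWNSTREAM_LOCATIONS = {"store", "regional_hub"}

def discontinued_downstream_exceptions(stock_rows: list) -> list:
    """Flag rows where a discontinued SKU still sits downstream with stock."""
    return [
        row for row in stock_rows
        if row["status"] == "discontinued"
        and row["location_type"] in DOWNSTREAM_LOCATIONS
        and row["qty"] > 0
    ]

rows = [
    {"sku": "D-1", "status": "discontinued", "location_type": "dc", "qty": 90},
    {"sku": "D-1", "status": "discontinued", "location_type": "store", "qty": 15},
    {"sku": "E-2", "status": "active", "location_type": "store", "qty": 40},
]
exceptions = discontinued_downstream_exceptions(rows)
```

The same rule, run on a schedule and wired to an alert, is what turns a policy on paper into one the data actually follows.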

SCM Data Quality in Practice

At DvSum, we've worked with some of the largest companies in the world to reduce waste and inefficiency in the supply chain. We've seen our clients achieve inventory reductions of 10% to 20% just by applying a systematic approach to data quality throughout their inventory and supply chain systems. In other words, DQ alone, without any changes to existing inventory planning systems or methods, can drive significant efficiencies and lower inventory levels.

Now instead of garbage in, garbage out, we have quality in, quality out.

Here are the three key steps to making it happen:

  1. Modeling your business-context exception rules to detect the right anomalies. The process begins with identifying the key factors that drive inaccuracies in the supply chain, then establishing rules to detect anomalies. With our extensive background in large-enterprise supply chain management, we understand the many nuances of inventory management, demand forecasting, procurement, and execution, and we have first-hand experience building DQ strategies and rule sets that produce meaningful results.
  2. Building automation and workflows for data quality. Efficient processes are built around automation and management-by-exception, and effective data quality initiatives are no different. DvSum enables SCM professionals to build custom workflows and processes that make remediation quick and repeatable.
  3. Making the quality of your data a key operational metric. Given the value-generating impact of DQ initiatives on SCM, we believe data quality is a key leading indicator of accurate planning, forecasting, and inventory management. We've built data quality dashboards into our product that provide clear visibility and transparency for the entire organization.
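Treating data quality as an operational metric can start with something as simple as the share of records that pass every exception rule, tracked over time. An illustrative sketch (the rules and records are made up for the example):

```python
def dq_score(records: list, rules: list) -> float:
    """Fraction of records passing every rule: a simple leading indicator
    suitable for trending on a dashboard."""
    if not records:
        return 1.0
    passing = sum(1 for r in records if all(rule(r) for rule in rules))
    return passing / len(records)

rules = [
    lambda r: r["qty"] >= 0,                      # no negative stock
    lambda r: 1 <= r["lead_time_days"] <= 60,     # plausible lead time
]
records = [
    {"qty": 10, "lead_time_days": 5},
    {"qty": -2, "lead_time_days": 5},    # fails the quantity rule
    {"qty": 7,  "lead_time_days": 90},   # fails the lead-time rule
    {"qty": 3,  "lead_time_days": 12},
]
score = dq_score(records, rules)  # 2 of 4 records pass
```

Plotted week over week, a dip in this score is an early warning that planning and forecasting accuracy are about to degrade.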

This leads us to the final recommendation: SCM leaders should proactively develop and nurture a commitment to data quality throughout their organizations. Here at DvSum, the business case for DQ is crystal clear, but we don't expect others to take our word for it… we bring data. Verifiable results provide a solid foundation for building organizational consensus around a commitment to data quality. DvSum produces tangible, measurable results for our customers, and we'd love to demonstrate that for you. To find out more about how DvSum can support your efforts to optimize your supply chain, contact us for a free demo.


About DvSum

DvSum's cloud platform enables a disruptive approach that not only validates and aligns your data, but actually fixes it. The patented technology scans, checks, and compares multiple data sources simultaneously, without moving or consolidating the data. DvSum's engine leverages machine learning and artificial intelligence to auto-discover and solve issues proactively. With its socially driven rules library, companies can connect and begin fixing live data within hours.

