In today’s world of high-velocity, down-to-the-minute logistics, understanding, controlling, and reliably measuring the performance and real cost of freight is a challenge. It involves collecting and matching data from a half-dozen different systems, and accuracy, timeliness, and completeness are constant obstacles to effective cost-savings initiatives. Even a small error in a transaction record, if left uncorrected, eventually corrupts the accuracy of your spend and service analysis.

As shippers strive to make better decisions, their teams need confidence in the information guiding them. In a hyper-competitive market with skyrocketing consumer demands, improving the shipper-carrier relationship is essential, and that means shippers need to understand precisely how the quality of their insights shapes their decisions.

If you are justifying decisions on bad data, it doesn’t matter what the system is or how much you spend on technology. You won’t get the results you truly need to manage the business and control costs effectively.

Here are five common reasons cost-savings initiatives fail:

1. It takes too long to get the required information

It shouldn’t, but collecting all your data and ensuring you are receiving and incorporating all the proper sources takes time. Sources can include your TMS (with contracted rates and accessorials), your carrier’s systems (for bills of lading, in-transit tracking history, a delivery receipt showing any potential exceptions, and the freight invoice), and your financial accounting or payables system, which records how, when, and how much was paid.

Delays cost precious time and money and undermine your ability to decide with confidence and accuracy. A mere week can change the entire outcome of your decisions. Sure, you may have made the right decision based on the best data available. But what has changed in the meantime? And how might that new data change your decision and the success of your cost-savings initiatives?

2. You don’t trust the information

If that sounds familiar, look at why you don’t trust it. Sure, the master data was once pretty good. But from the moment master data is created, it has to be rigorously maintained. Things change frequently — locations, vendors, item masters, SKUs, and packaging needs. Inventory moves around; locations open and close. Maintaining master data is not one and done — it is a continuous, objective process.

Once an error is introduced (or data goes stale and timeliness erodes), if it is left uncorrected, every process that uses the data will make the exact same mistake, over and over. It only takes one piece of bad data to create a cascading negative effect on decisions. Unless the details around each element are complete and accurate, trust in the data will be an issue. Improperly maintained master data means your cost assumptions and spend analysis will be off the mark, and without all the proper details, identifying the root cause of the issue will be nearly impossible.

If you don’t trust the insights you’re given, you will lack confidence and question the credibility of your outcomes. Risk increases and success rates decrease. A decision may be better than no decision, but will your next decision cost you your job?

3. You continually have to “fix” your datasets

There’s a common malpractice here. In performing analyses, people often “clean” the data by deleting the points that appear to be wrong. But those data points provide context: they describe the exceptions that actually occur. Removing them invalidates the entire dataset, because it no longer represents the real world it was meant to describe.

Let’s clear a misconception: Data cleansing means fixing an error’s root cause, not deleting.

Seasonality is an example. Your routing guide from Chicago to Sacramento estimates mileage based on the most direct route. That’s fine — in the summer. But now it’s winter, and trucks can’t navigate I-70 through the Rocky Mountains. Instead, they have to go south across I-40, and your TMS flags the extra miles as “out of route” and over budget. Deleting these “outliers” removes valid scenarios from your analysis; what looks like an error may actually be crucial information for making the best decision in your cost-savings initiatives.
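To make that concrete, here is a minimal sketch of how deleting flagged shipments skews a lane analysis. The pandas DataFrame, the mileage and cost figures, and the 15 percent out-of-route threshold are all illustrative assumptions, not figures from the article.

```python
import pandas as pd

# Hypothetical shipment records; the winter reroute shows up as "excess" miles.
shipments = pd.DataFrame({
    "lane": ["CHI-SAC"] * 4,
    "planned_miles": [2050, 2050, 2050, 2050],
    "actual_miles": [2060, 2055, 2380, 2395],   # last two rerouted via I-40
    "cost": [3690, 3680, 4280, 4310],
})

# Illustrative out-of-route rule: flag anything 15% over the planned miles.
OOR_THRESHOLD = 1.15
shipments["out_of_route"] = (
    shipments["actual_miles"] > shipments["planned_miles"] * OOR_THRESHOLD
)

# "Cleaning" by deleting the flagged rows hides the seasonal reality...
cleaned_avg = shipments.loc[~shipments["out_of_route"], "cost"].mean()
# ...while keeping them reflects what the lane actually costs in winter.
true_avg = shipments["cost"].mean()

print(f"Average cost with outliers deleted: ${cleaned_avg:,.0f}")
print(f"Average cost with all shipments:    ${true_avg:,.0f}")
```

The flagged loads are not errors; they are the cost of moving the same freight in a different season, and dropping them understates what the lane really costs.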

4. Balancing staffing levels with high rate tolerances

Exception management is a battle many organizations face. It’s a business-culture problem that’s exacerbated by traditional freight-audit firms, whose model fosters an adversarial relationship between shipper and carrier. They’re always looking for something wrong with the carrier’s billing. Their job — and the very basis of their existence — is finding errors, never keeping those errors from recurring. The more mistakes they discover, the more value they think they are providing.

Rate tolerances are put in place to manage the volume of exceptions for both parties. The higher the tolerance, the fewer the exceptions, and the fewer people the shipper needs to work them. But it’s only an illusion: the shipper ends up paying the audit firm more money to find, and never correct, the same recurring mistakes. Fixing the root cause is therefore a disincentive for the freight-audit firm.
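As a rough illustration of that trade-off, here is a minimal sketch of a tolerance check. The function name, the $1,850 contracted rate, the $1,860 invoice, and the tolerance values are hypothetical and not drawn from any particular audit system.

```python
# Minimal sketch of a freight-audit tolerance check (illustrative field names).
def audit_invoice(expected_rate: float, invoiced_amount: float,
                  tolerance: float) -> str:
    """Auto-approve an invoice if it is within `tolerance` of the expected rate;
    otherwise raise it as an exception for a person to work."""
    if invoiced_amount <= expected_rate * (1 + tolerance):
        return "auto-approved"
    return "exception"

# The same $1,860 invoice against an $1,850 contracted rate:
for tol in (0.0025, 0.01, 0.02):
    result = audit_invoice(1850.00, 1860.00, tol)
    print(f"tolerance {tol:.2%}: {result}")

# Raising the tolerance makes the exception queue shorter, but the overbilled
# $10 is silently absorbed into "actual" spend instead of being corrected.
```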

Plus, the shipper is paying the carrier late, or short, which frustrates the carrier. As a result, the carrier deprioritizes the shipper, shifting capacity away from it and toward its competitors. Eventually, the shipper must settle for less-than-optimal alternatives and/or pay more to ship the same freight.

Don’t leave it to chance

It’s understandable that, once in a while, a carrier makes a mistake applying a rate. But when you look at the data, the vast majority of errors occur because a rule or contract term is poorly defined, left open to interpretation, or simply missing pieces of data. Clarifying rules and terms to eliminate subjectivity and ambiguity is critical to ensure no one is guessing.

Exception management doesn’t have to bog down your supply chain. Your processes should take advantage of artificial intelligence and machine learning tools, which can find and correct the causes of exceptions and resolve them in ways human teams can’t. Applied early in the process, these systems can automate much of what your teams are doing today and free them for value-added work.

5. Despite technology upgrades, costs keep increasing

Over time, legacy TMS platforms used by shippers need periodic updates and patches to keep working. These maintenance items involve extra fees paid to the software provider, plus internal IT time and cost to test and implement them. And as an organization’s supply chain and freight spend evolve, and demands for more velocity and flexibility increase, the platform often requires additional plug-ins or add-ons.

It all comes down to the data: how complete, accurate, and timely it is, and how effective the governance process is at maintaining it and validating its quality.

You might ask yourself, “Am I using data to make decisions or to justify my decisions?” It is surprising how many choose the latter. If you are justifying decisions on bad data, it doesn’t matter what the system is or how much you spend on technology; you won’t get the results you truly need to manage the business and control costs effectively.

Small amounts really add up

The overall impact of incomplete or poor data and the absence of data quality and data integrity governance processes can easily run into the millions of dollars, not to mention the impact of inaccurate spend analysis and management.

Just think about a national retailer moving 1,200 truckloads a day at an average cost of $1,850 per load. If your freight audit and pay tolerance is 1 percent, that’s $22,200 by which you are potentially inflating your actual freight spend, every day. When that historical data is used to measure how much your cost-savings initiatives will save, you are comparing accurately rated shipments against a potentially inflated historical rate. You ended up saving $500,000 instead of $1,000,000 not because the initiative was bad, but because the rate tolerance kept you paying 1 percent more than you should have.
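The arithmetic is simple enough to sketch. The per-day figures come straight from the example above; the 250 operating days used to annualize is an illustrative assumption, not a figure from the article.

```python
# The article's numbers: 1,200 truckloads/day at an average of $1,850 per load.
loads_per_day = 1_200
avg_cost_per_load = 1_850.00
tolerance = 0.01  # 1% freight audit and pay tolerance

daily_spend = loads_per_day * avg_cost_per_load   # $2,220,000
daily_inflation = daily_spend * tolerance         # $22,200 per day

# Assuming ~250 operating days a year (an illustrative figure):
annual_inflation = daily_inflation * 250          # $5,550,000 per year

print(f"Potential daily inflation:  ${daily_inflation:,.0f}")
print(f"Potential annual inflation: ${annual_inflation:,.0f}")
```

At that scale, a “small” 1 percent tolerance is where the millions of dollars referenced above come from.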

Technology and Service Providers

At the end of the day, the freight data picture is complicated, and the challenge of ensuring quality, trust, and confidence in your freight insights can be overwhelming. Shippers and carriers each have their own version of the problem and of what caused it. Because of these challenges, focusing on the three pillars of data quality becomes a game-changer (a simple validation sketch follows the list):

  1. Accurate — the data is the right value and the right format
  2. Complete — the data contains all the characteristics, i.e., nothing has been “cleaned”
  3. Timely — the data is available and accessible
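Here is one way those pillars might translate into a record-level check. The field names, the seven-day freshness window, and the sanity checks are all hypothetical; real accuracy validation would compare the invoice against the contract and the rated shipment.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical shipment record and the fields we expect it to carry.
REQUIRED_FIELDS = ["shipment_id", "origin", "destination", "contracted_rate",
                   "invoiced_amount", "delivery_date", "last_updated"]

def check_quality(record: dict, max_age_days: int = 7) -> dict:
    """Score one record against the three pillars: accurate, complete, timely."""
    # Complete: every expected characteristic is present (nothing "cleaned" away).
    complete = all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS)

    # Accurate: right value and right format (a simple sanity check here).
    accurate = (
        isinstance(record.get("contracted_rate"), (int, float))
        and record.get("contracted_rate", 0) > 0
        and isinstance(record.get("invoiced_amount"), (int, float))
    )

    # Timely: the record has been refreshed recently enough to act on.
    last_updated = record.get("last_updated")
    timely = (
        last_updated is not None
        and datetime.now(timezone.utc) - last_updated < timedelta(days=max_age_days)
    )

    return {"accurate": accurate, "complete": complete, "timely": timely}

record = {
    "shipment_id": "S-1001", "origin": "CHI", "destination": "SAC",
    "contracted_rate": 1850.00, "invoiced_amount": 1860.00,
    "delivery_date": "2024-01-12",
    "last_updated": datetime.now(timezone.utc) - timedelta(days=2),
}
print(check_quality(record))  # {'accurate': True, 'complete': True, 'timely': True}
```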

Successful cost-savings initiatives require detailed insights and results that can be measured and monitored. We see time and again that complete, accurate, and timely logistics data enables shippers to make better decisions; without it, a successful cost-savings initiative is out of reach. Today’s cloud-based supply chain technology gives shippers a way to improve their insights and finally achieve the sustainable cost savings they need.