Accuracy or Precision?

 

It is the mark of an educated mind to rest satisfied with the degree of precision which the nature of the subject admits and not to seek exactness where only an approximation is possible. – Aristotle (384 BC – 322 BC)

[Illustration: accuracy vs. precision target quadrants]

My favourite part about writing these articles is finding just the right quote to introduce them. Before we get started, go back and read the quote from Aristotle above if you happened to skip past it – I think it both accurately and precisely summarizes my argument.

Now, in the context of forecasting for the supply chain, let's talk about what each of these terms means:

Accuracy: Ability to hit the target (i.e. how close is the actual to the forecast?)

Precision: Size of the target you’re aiming at (i.e. specificity of product, place and timing of the forecast)

I’m sorry to be a total downer, but the reason this article is titled Accuracy or Precision is because you can’t have both. The upper right quadrant in the illustration above ain’t happening (a bit more on that later).

In the world of forecasting, people seem obsessed with accuracy and often ask questions like:

  • What level of forecast accuracy are you achieving?
  • How should we be benchmarking our forecast accuracy?
  • Are we accurate enough? How can we be more accurate?

The problem here is that any discussion about forecast accuracy that does not at the same time account for precision is a complete waste of time.

For example, one tried and true method for increasing forecast accuracy is by harnessing the mystical properties of The Law of Large Numbers.

To put it another way – by sacrificing precision.
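To make the tradeoff concrete, here's a quick simulation (illustrative numbers only, not anyone's real data): 100 slow-selling items forecast at the precise item/day level versus one aggregate forecast for all items over a full year. The demand rate and the WMAPE error measure are assumptions chosen for the sketch.

```python
import math
import random

random.seed(42)

# Illustrative setup: 100 items, each with a true mean demand of
# 0.05 units/day (about 18 units/year). The forecast is the true
# mean; actual sales are noisy Poisson draws around it.
N_ITEMS, N_DAYS = 100, 365
MEAN_DEMAND = 0.05

def poisson(lam):
    # Knuth's algorithm - fine for small lambda
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

actuals = [[poisson(MEAN_DEMAND) for _ in range(N_DAYS)]
           for _ in range(N_ITEMS)]

# Error at the precise level: forecast 0.05 for every item/day
granular_abs_err = sum(abs(a - MEAN_DEMAND) for row in actuals for a in row)
total_actual = sum(a for row in actuals for a in row)
granular_wmape = granular_abs_err / total_actual

# Error at the aggregate level: one forecast, all items, full year
agg_forecast = N_ITEMS * N_DAYS * MEAN_DEMAND
agg_wmape = abs(total_actual - agg_forecast) / total_actual

print(f"item/day WMAPE:  {granular_wmape:.0%}")   # huge
print(f"aggregate WMAPE: {agg_wmape:.0%}")        # tiny
```

The aggregate forecast looks spectacularly "accurate" because the noise cancels out across items and days, which is exactly the Law of Large Numbers at work, and exactly why that accuracy number is useless for replenishing a specific item at a specific store.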

Or to put it in the most cheeky way possible (many thanks to Richard Sherman for this gem, which I quote often):


Sherman’s Law:
Forecast accuracy improves in direct correlation to its distance from usefulness.


So how do we manage the tradeoff between precision and accuracy in forecasting? You must choose the level of precision that is required (and no more precise than that) and accept that in doing so, you may be sacrificing accuracy.

For a retailer, the only demand that is truly independent is customer demand at the point of sale. Customers choose specific items in specific locations on specific days. That’s how the retail business works.

This means that the precision of the forecasting process must be by item, by location, by day – full stop.

Would you be able to make a more accurate prediction by forecasting in aggregate for an item (or a group of items) across all locations by month? Without a doubt.

Will that help you figure out when you need to replenish stock for a 4-pack of 9.5-watt A19 LED light bulbs at store #1378 in Wichita, Kansas?

Nope. Useless.

I can almost see the wincing and hear the heart palpitations that this declaration will cause.

“Oh God! You’ll NEVER be able to get accurate forecasts at that level of precision!” To that I say two things:

  1. It depends on what level of accuracy is actually required at that level of precision.
  2. Too damn bad. That's the requirement as per your customers' expectations.

With regard to the first point, keep in mind that it's not uncommon for an item in a retail store to sell fewer than 20 units per YEAR. On top of that, there are minimum display quantities and pack rounding that will ultimately dictate how much inventory will be available to customers to a much greater degree than the forecast.

Forecasts by item/location/day are still necessary to plan and schedule the upstream supply chain properly, but it's only necessary for forecasts at that level of precision to be reasonable, not accurate in the traditional sense of the word. This is especially true if you also replan daily with refreshed sales and inventory numbers for every item at every location.

There are those out there who would argue that my entire premise is flawed. That I’m not considering the fact that with advances in artificial intelligence, big data and machine learning, it will actually be possible to process trillions of data elements simultaneously to achieve both precision and accuracy. That I shouldn’t even be constraining my thinking to daily forecasting – soon, we’ll be able to forecast hourly.

Let’s go back to the example I mentioned earlier – an item that sells 20 units (give or take) in a location throughout the course of a year. Assuming that store is open for business 12 hours out of every day and closed 5 days per year for holidays, there are 4,320 hours in which those 20 units will sell. Are we to believe that collecting tons of noise (whoops, I meant “data”) from social media, weather forecasting services and the hourly movement of soybean prices (I mean, why not, right?) will actually be able to predict with accuracy the precise hour for each of those 20 units in that location over the next year? Out of 4,320 hours to choose from? Really?
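The arithmetic behind those numbers is easy to check (the inputs are the article's stated assumptions: 12 open hours per day, closed 5 days per year):

```python
# Checking the hourly-forecasting numbers in the example above.
units_per_year = 20
hours_per_day = 12
open_days = 365 - 5  # closed 5 days per year

selling_hours = hours_per_day * open_days
rate_per_hour = units_per_year / selling_hours

print(f"Selling hours per year:   {selling_hours}")       # 4320
print(f"Expected demand per hour: {rate_per_hour:.4f}")   # ~0.0046 units
# The honest hourly forecast for every single one of those 4,320
# hours is "almost certainly zero" - there is no signal at that
# level of precision to fit a model to.
```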

(Let's put aside the fact that no retailer I've ever seen even measures how accurate their on-hand records are right now, let alone thinks they can predict sales by hour.)

I sometimes have a tendency to walk the middle line on these types of predictions. “I don’t see it happening anytime soon, but who knows? Maybe someday…”

Well, not this time.

This is utter BS. Unless all of the laws of statistics have been debunked recently without my noticing, degrees of freedom are still degrees of freedom.

Yes, I’m a loud and proud naysayer on this one and if anyone ever actually implements something like that and demonstrates the benefits they’re pitching, I will gleefully eat a wheelbarrow of live crickets when that time comes (assuming I’m not long dead).

In the meantime, I’m willing to bet my flying car, my personal jetpack and my timeshare on the moon colony (all of which were supposed to be ubiquitous by now) that this will eventually be exposed as total nonsense.

I’m From Missouri

 

“I am from a state that raises corn and cotton and cockleburs and Democrats, and frothy eloquence neither convinces nor satisfies me. I am from Missouri. You have got to show me.” – William Duncan Vandiver, US Congressman, speech at 1899 naval banquet


“How are you going to incorporate Big Data into your supply chain planning processes?”

It’s a question we hear often (mostly from fellow consultants).

Our typical response is: “I’m not sure. What are you talking about?”

Them: “You know, accessing social media and weather data to detect demand trends and then incorporating the results into your sales forecasting process.”

Us: “Wow, that sounds pretty awesome. Can you put me in touch with a retailer who has actually done this successfully and is achieving benefit from it?”

Them: <crickets>

I’m not trying to be cheeky here. On the face of it, this seems to make some sense. We know that changes in the weather can affect demand for certain items. But sales happen on specific items at specific stores.

It seems to me that for weather data to be of value, we must be able to accurately predict temperature and precipitation far enough out into the future to be able to respond. Not only that, but these accurate predictions need to also be very geographically specific – markets 10 miles from each other can experience very different weather on different days.

Seems a bit of a stretch, but let’s suppose that’s possible. Now, you need to be able to quantify the impact those weather predictions will have on each specific item sold in each specific store in order for the upstream supply chain to respond.

Is that even possible? Maybe. But I’ve never seen it, nor have I even seen a plausible explanation as to how it could be achieved.

With regard to social media and browsing data, I have to say that I’m even more skeptical. I get that clicks that result in purchases are clear signals of demand, but if a discussion about a product is trending on Twitter or getting a high number of page views on your e-commerce site (without a corresponding purchase), how exactly do you update your forecasts for specific items in specific locations once you have visibility to this information?

If you were somehow able to track how many customers in a brick and mortar store pick up a product, read the label, then place it back on the shelf, would that change your future sales expectation?

Clearly there’s a lot about Big Data that I don’t know.

But here’s something I do know. A retailer who recently implemented Flowcasting is currently achieving sustained daily in-stock levels between 97% and 98% (it was at 91% previously – right around the industry average). This is an ‘all in’ number, meaning that it encompasses all actively replenished products across all stores, including seasonal items and items on promotion.

With some continuous improvement efforts and maybe some operational changes, I have no doubt that they can get to be sustainably above 98% in stock. They are not currently using any weather or social media Big Data.

This I have seen.

Small data

Lowes Foods, a family-owned grocery chain with stores located throughout North and South Carolina, is one of the region's largest retailers, but in the late 2000s they had a problem.

Declining revenues, triggered by the onslaught of ultra-competitors like Walmart and Amazon, threatened the very existence of this chain of roughly 100 stores. Unless something was done, Management would have to contemplate closing a number of stores – which, as everyone knew, would flip the switch to the inevitable death spiral of cost cutting and downsizing.

Fortunately, Management took action. They turned to data analytics to help – even before analytics was in vogue. But instead of utilizing what is now known as Big Data, they retained an analytics expert in a different, and more important, field. They retained the services of Martin Lindstrom.

Martin is one of the world’s leading branding experts and, arguably, the leading guru in a different form of analytics.

He's a genius at using Small Data to uncover stunning and brilliant insights that, in turn, form the basis of new strategies and tactics that help organizations, like Lowes Foods, thrive.

Instead of slicing and dicing volumes of data – which, as a retailer, Lowes had – his work focuses on very small learnings and observations about customers and indeed, Americans in general.

It's his contention that Small Data, done well, provides the insights and clarity that are almost impossible to find in volumes of data.

Big Data is about information. Small Data is about people – finding the needle in a haystack.

For Lowes, he studied American culture – everything from values and beliefs to, importantly, the very small clues that helped formulate his insight and strategy for Lowes.

As an example, he noticed that in American hotels the windows are locked. Coupling that with the number of gated communities and a few other small clues, he concluded the following: despite what they tell you, Americans live in fear.

Studying people around the world, one person at a time, he concluded that the last time people were not afraid was when they were children.  Kids, regardless of culture, are by and large care-free.

So his strategy centered on making Lowes Foods more kid-like.  More fun. More entertaining. The place to go.

The entrance was revamped to include both ChickenWorks and SausageWorks – where busy shoppers could buy ready-made meals. However, in keeping with the kids theme, the purveyors behind each offering were dressed-up characters, complete with costumes, who put on a show all day long. They would argue, shout at each other and generally give each other the gears. It was pure retail-tainment.

Now, Lowes Foods was always known for the quality of its fresh prepared chicken. In keeping with his insights, Lowes also implemented a new ritual. When batches of new chickens were removed from the oven, a notice came over the loudspeaker and all employees, including Management, would break into their "happy chicken dance", accompanied by a specific ditty to celebrate hot, fresh, quality chicken.

Another important piece of small data Martin leveraged was the passing of business cards. In many cultures, how you hand your business card to someone is an important sign of respect: you bow slightly and pass the card with two hands.

This small data insight led to a change in how ChickenWorks dealt with customers. Now, the purveyors would pass the chicken to customers with two hands, slightly bowing to acknowledge the customer in the process. This signals respect to the customer and that what is being bought is of high value – both for the customer, and also for the employee.

All small insights. All based on Small Data – observations and learnings by watching and talking to one person at a time.

And it worked. Sales of ChickenWorks and SausageWorks skyrocketed. And Lowes Foods became known as the place to shop – a fun, unpredictable establishment where customers could buy good quality products, but enjoy themselves in the process.

For us supply chain planners, we’re bombarded every day with people touting the virtues and significance of Big Data. And, to be fair, Big Data is and will be important.

But so is Small Data. Small Data provides insights about people. Small Data opens clues to problem resolution that Big Data would struggle to uncover.

Small Data often provides the tiny clues and insights that drive real, significant change. Here’s a great example from supply chain planning.

For long-time readers of our blog, you know that the capability now exists to forecast and plan slow and very slow selling products at store level (or any final point of consumption). Hopefully you're also aware that this allows us to Flowcast every product – that is, create time-phased sales, inventory, supply and dollar projections, thereby providing the business with a consistent planning process and a single set of numbers across the organization.

Did you ever wonder where the solution for slow selling products came from? It came from Small Data (a data point from a single person).

The story goes like this. The architect of the solution was talking about planning at store level with a retail store manager in Canada, when the manager proclaimed: "I have no idea when these products will sell; all I know is that I'll sell about 2 every quarter."

Boom!

And the idea for integer forecasting and forecasting using different planning horizons (e.g., weekly, monthly, quarterly) was planted. Eventually this nugget of Small Data would be parlayed into the world’s leading and, to date, best solution for planning slow and very slow selling products at store level.
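One way to picture what integer forecasting does with a rate like "about 2 every quarter" is to accumulate the fractional rate and release a whole unit only when a full unit is due. This is a hypothetical sketch to illustrate the idea, not the actual algorithm behind the slow-seller solution:

```python
# Hypothetical sketch of integer forecasting (NOT the actual
# slow-seller algorithm): spread "about 2 units per quarter" into
# whole-unit weekly buckets by rounding the cumulative expectation,
# so no fractional volume is lost along the way.

def integer_forecast(rate_per_period, n_periods):
    forecasts, released = [], 0
    for t in range(1, n_periods + 1):
        cumulative = round(rate_per_period * t)  # whole units due by period t
        forecasts.append(cumulative - released)
        released = cumulative
    return forecasts

# 2 units per 13-week quarter -> 2/13 units per week
weekly = integer_forecast(2 / 13, 13)
print(weekly)       # mostly zeros, with single whole units spaced out
print(sum(weekly))  # 2 - the quarter's total survives intact
```

Instead of a meaningless 0.15 units every week, the plan carries whole units at sensible intervals, and the quarterly total still matches the store manager's "about 2".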

The architect of the slow selling solution didn’t get all sorts of slow selling data and then slice and dice the data to try to uncover a solution. Had he tried that approach, he’d likely still be working on it – just like most technology firms and academics.

When it comes to planning slow selling products, we owe Small Data some thanks:

First, Ken for the small data insight.

And then, Darryl for turning insight to solution.