Changing the game

In 1972, for my 10th birthday, my Mom bought me a wooden chess set and a chess book to teach me the basics of the game.  Shortly after, I was hooked, and the timing was perfect: it coincided with Bobby Fischer’s ascent to chess immortality in September 1972, when he became the 11th World Champion.

As a chess aficionado, I was recently intrigued by a new and different chess book, Game Changer, by International Grandmaster Matthew Sadler and International Master Natasha Regan.

The book chronicles the evolution and rise of computer chess super-grandmaster AlphaZero – a completely new chess algorithm developed by British artificial intelligence (AI) company DeepMind.

Until the emergence of AlphaZero, the king of chess algorithms was Stockfish.  Stockfish was architected by providing the engine with the entire library of recorded grandmaster games, along with the entire library of chess openings, middle-game tactics and endgames.  It would rely on this incredible database of chess knowledge and its monstrous computational abilities.

And, the approach worked.  Stockfish was the king of chess machines and its official chess rating of around 3200 is higher than any human in history.  In short, a match between current World Champion Magnus Carlsen and Stockfish would see the machine win every time.

Enter AlphaZero.  What’s intriguing and instructive about AlphaZero is that the developers took a completely different approach to enabling its chess knowledge.  The approach would use machine learning.

Rather than try to provide the sum total of chess knowledge to the engine, all that was provided were the rules of the game.

AlphaZero would be architected by learning from examples, rather than drawing on pre-specified human expert knowledge.  The basic approach is that the machine learning algorithm analyzes a position and determines a probability for each possible move in order to assess the strongest one.

And where did it get examples from which to learn?  By playing itself, repeatedly. Over the course of 9 hours, AlphaZero played 44 million games against itself – during which it continuously learned and adjusted the parameters of its machine learning neural network.

In 2017, AlphaZero played a 100-game match against Stockfish, and the match resulted in a comprehensive victory for AlphaZero.  Imagine: a chess algorithm, architected on a probabilistic machine learning approach, taught itself how to play and then smashed the reigning algorithmic world champion!

What was even more impressive to the plethora of interested grandmasters was the manner in which AlphaZero played.  It played like a human, like the great attacking players of all time – a more precise version of Tal, Kasparov, and Spassky, complete with pawn and piece sacrifices to gain the initiative.

The AlphaZero story is very instructive for us supply chain planners and retail Flowcasters in particular.

As loyal disciples know, retail Flowcasting requires the calculation of millions of item/store forecasts – a staggering number.  Not surprisingly, people cannot manage that number of forecasts and even attempting to manage by exception is proving to have its limits.

What’s emerging, and is consistent with the AlphaZero story and learning, is that algorithms (either machine learning or a unified model approach) can shoulder the burden of grinding through and developing item/store specific baseline forecasts of sales, with little to no human touch required.

If you think about it, it’s not as far-fetched as you might think.  It will facilitate a game changing paradigm shift in demand planning.

First, it will relieve demand planners of the burden of learning and understanding different algorithms and approaches for developing a reasonable baseline forecast. Keep in mind that I said a reasonable forecast.  When we work with retailers helping them design and implement Flowcasting, most folks are shocked that we don’t worship at the feet of forecast accuracy – at least not in the traditional sense.

In retail, with so many slow selling items, chasing traditional forecast accuracy is a bit of a fool’s game.  What’s more important is to ensure the forecast is sensible and assess it on some sort of a sliding scale.  To wit, if you usually sell between 20-24 units a year for an item at a store with a store-specific selling pattern, then a reasonable forecast and selling pattern would be in that range.

Slow selling items (indeed, perhaps all items) should be forecasted almost like a probability…for example, you’re fairly confident that 2 units will sell this month, you’re just not sure when.  That’s why, counter-intuitively, daily re-planning is more important than forecast accuracy to sustain exceptionally high levels of in-stock…whew, there, I said it!
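For the technically inclined, here’s a minimal sketch of what a “sliding scale” sanity check might look like. The function name, the two-year averaging and the 25% band are purely illustrative assumptions, not how any particular Flowcasting solution works:

```python
def is_reasonable(annual_history, annual_forecast, tolerance=0.25):
    """Flag a forecast only if it falls outside a band around recent annual sales.

    annual_history:  annual unit sales for this item/store, most recent last
    annual_forecast: total forecast units for the next 52 weeks
    tolerance:       hypothetical +/- band (25%); in practice wider for slower sellers
    """
    recent = annual_history[-2:]
    recent_avg = sum(recent) / len(recent)
    return recent_avg * (1 - tolerance) <= annual_forecast <= recent_avg * (1 + tolerance)

# An item/store that sells 20-24 units a year: a 22-unit forecast is sensible,
# while a 40-unit forecast would surface as an exception for a planner to review.
print(is_reasonable([20, 24], 22))   # True
print(is_reasonable([20, 24], 40))   # False
```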

What an approach like this means is that planners will no longer be dilly-dallying around tuning models and learning intricacies of various forecasting approaches.  Let the machine do it and review/work with the output.

Of course, demand planners will sometimes need to add judgment to the forecast – in situations where the future will be different and where this information and its resulting impacts would be unknowable to the algorithm.  Situations where planners have unique market insights, be they national or local.

Second, and more importantly, it will allow demand planners to shift their role/work from analytic to strategic – spending considerably more time on working to pick the “winners” and developing strategies and tactics to drive sales, customer loyalty and engagement.

In reality, spending more time shaping the demand, rather than forecasting it.

And that, in my opinion, will be a game changing shift in thinking, working and performance.

Killing Your Sales With Stock

Can one desire too much of a good thing? – William Shakespeare (1564-1616)

Here is one of the most widely accepted logical propositions in retail:

  1. Customers can’t buy product that’s out of stock in the store.
  2. Inventory doesn’t sell when it’s sitting in the warehouse.
  3. Ergo, the more stock you have in your stores, the better it is for sales.

It makes some sense, so long as you don’t think about it too hard.

While this thought process can manifest in good ways – reorganizing the supply chain to flow product quickly through a stockless DC based on what’s needed at the store, for example – it can (and often does) result in behaviour that can actually harm sales and productivity.

The old “You can’t sell it out of the warehouse!” chestnut is most often trotted out when the warehouse is packed and they need to make room.

Tell me if this chain of events sounds familiar:

  • The warehouse is running out of space
  • The decision is made to clear out some stock
  • Products are identified that are the biggest contributors to the capacity issue (i.e. they’re taking up a lot of space and not being drawn out as quickly as everyone would like)
  • Push it out to the stores!

A couple weeks later, you run some reports:

  • Warehouse picking efficiency has skyrocketed as a result of shipping oodles of pallets out to the stores – SUCCESS!
  • Warehouse is unclogged and has sufficient space to maneuver for the next few weeks – SUCCESS!
  • Stores now have all kinds of stock to support sales – SUCCESS!

If we just stop there, we’re feeling pretty good about ourselves. Unfortunately, there’s usually a bit more to the story:

  • The store receives way more stock than can fit on the shelf, so they need to put it somewhere – stores don’t have the luxury of being able to push product out the door to unwilling recipients.
  • Where the stock ultimately ends up is scattered throughout the store – on promotional end caps, in the back room, on overhead storage racks, shoved into a corner in receiving, sometimes even in offsite storage – solving a capacity issue in one location has just created capacity issues in dozens of other locations.

In the best case scenario after this has happened, stores are extremely disciplined and organized in their stock management and can always replenish the shelf from their overstock once it starts to get empty. But protecting sales comes at a significant cost. After the initial receipt of the overstock goods, the product will need to be moved around many times again before it leaves the store:

  • Shelf gets empty, go to the back room and bring out some more, fill the retail displays, bring what didn’t fit back to the back room again, repeat.
  • The overstock product is finally cleared out of the back room, but now you need to start taking down secondary displays as they deplete to replenish the home and fill them up with something more deserving that should have been there in the first place.

In the second best case scenario, the stock is within the 4 walls of the store – somewhere. When the shelf is empty, the vast majority of your customers will seek out a staff member to find the product and wait patiently while said staff member recruits other staff members to go on a costly scavenger hunt that hopefully… eventually… turns up the stock that the customer is waiting for. Crisis averted! Sale retained! But again, at a steep cost.

In the worst case (and most common) scenario, the customer sees an empty shelf and just leaves the store without alerting anyone to his/her dissatisfaction. A couple days later, a staff member walks by, sees the empty shelf and thinks “I’m sure the replenishment system will take care of that.” But it won’t. According to the stock ledger, the store has tons of stock to sell. After a couple more weeks of lost sales, someone realizes that they need to try to find the stock somewhere within the store. After an hour of searching, they give up and just write the stock off in the hopes that more will be sent to fill the hole in the shelf, further exacerbating the overstock problem until it turns up months later during the physical count.

And in all of the above scenarios, the management of overstock consumes finite store resources, which can negatively impact sales for all products in the store, not just the problem children.

So there you have it – rather than an enabler, inventory can be an impediment to sales. Even though inventory is in the store, it might as well be on Mars if it’s not accessible to the customer.

In an ideal world, you would set up your processes, systems and constraints so that what flows into the back door of the store can largely go directly to the shelf with minimal overstock. It’s not super easy to accomplish this, but it’s not advanced calculus either.

But in the event that you do end up with overstock in your supply chain, the best place to have it is upstream, where the product is not yet fully costed, where better processes and tools exist to manage it, and where you still have options to dispose of it or clear it out as cost effectively as possible – you know, postponement and all that.

Arbitrarily pushing stock out to the stores in the hopes that they’ll figure out what to do with it is about the worst thing you can do.

Fossilized thinking


In August 1949 a group of fifteen smokejumpers – elite wildland firefighters – descended from the Montana sky to contain an aggressive fire near the Missouri River.  After hiking for a few minutes, the foreman, Wagner Dodge, saw that the fire was raging – flames stretching over 30 feet in the air and blazing forward fast enough to cover two football fields every minute.

The plan was to dig a trench around the fire to contain it and divert it towards an area with little to burn.

Soon it became clear that the fire was out of control and the plan was out the window.  The fire was unstoppable so, instead, they’d try to outrun it, to safer ground.

For the next ten minutes, burdened by their heavy gear and tiring legs, the team raced up an incline, reaching an area that was only a few hundred yards from safety.  But the fire was unflinching, gaining ground like a wolf chasing down a wounded animal.

Suddenly, Dodge stopped.  He threw off his gear and, incredibly, took out some matches, lit them, and tossed them onto the grass.  His crew screamed at him but to no avail – when Dodge didn’t listen, they had no choice and turned and ran as fast as they could, leaving their foreman to what they believed to be certain peril.

But Dodge had quickly devised a different survival strategy: an escape fire.  By torching an area in front of him, he choked off the fuel for the fire to feed on.  Then, he poured water on a rag, put it over his mouth and lay down, face first, on the freshly burnt grass while the fire raged and sped past and over him.  In total, he’d spend close to 15 minutes living off the oxygen close to the ground he’d just torched.

Sadly, of the rest of his crew that tried to outrun the blaze, only two would survive.

Wagner Dodge was able to survive not because of his physical fitness, but his mental fitness – the ability to rethink and unlearn.  The prevailing paradigm was that, at some stage for an out-of-control blaze, your only option is to try to outrun it.  But Dodge was able to quickly rethink things – believing that, perhaps, by choking off its fuel and providing his own small wasteland area, the fire might avoid him.  The ability to rethink had saved his life.

As it turns out, the ability to rethink and unlearn is also crucial for retail’s survival and revival.

It’s no secret that many retailers are struggling.  The same is true of many retail supply chains.

Do you ever really wonder why?

Lots of people blame retail’s generally slow adoption of new technologies and business models as the main factor, but I think it’s a deeper, more fundamental and chronic problem.

Technology is not eating retail.  Fossilized thinking is.

What’s fossilized thinking?  It’s people – at all levels in an organization – who are unwilling or unable to challenge their long-held beliefs.  Not only challenge them, but be able to rethink, unlearn and change them often.

As a case in point, many people who work in the retail supply chain don’t include the consumer as part of the supply chain.  Yet, if you think about it, the retail supply chain begins and ends with the consumer.  There are even a number of folks who don’t consider the store part of the supply chain.  Once the product has shipped from the DC to the store, then, incredibly, job done according to them.

Don’t believe me?

I won’t embarrass them, but just recently I read a “thought leadership” article from one of the world’s pre-eminent consulting firms regarding the top trends in retail supply chain management.  At #3, and I kid you not, was the growing view that the store was a key part of the supply chain.

Flowcasting tribe members know better and think differently.  The consumer and store always have been, and always will be, part of the supply chain.  That’s why we understand that, in retail, there is no such thing as a push supply chain – since you can’t push the product to the consumer.

In my opinion (and I’m not alone), fossilized thinking, not technology adoption, is the real disruptor in retail.

If you want to improve, innovate or disrupt then you must…

Constantly rethink, unlearn and challenge your own thinking!

Agile or Waterfall?

 

Just because something doesn’t do what you planned it to do doesn’t mean it’s useless. – Thomas A. Edison (1847-1931)


So, which is a better approach to project management? Agile or Waterfall?

The answer is simple: Agile is better. It’s newer, it sounds cooler and it has better terminology (scrums and sprints – that’s awesome!).

Still not convinced? Then check out some of these dank memes that pop up when you do a Google search on “agile or waterfall”.

In just a few simple pictures, you can plainly see that the Agile approach is easier…


…less risky…


…quicker to achieve benefits…

…all with a greater likelihood of success.

So if it’s been settled for the ages, then why am I writing this piece? And why are you detecting a very slight hint of sarcasm in my tone thus far?

The Agile framework has its roots in software development and in that context, I have no doubt about its superiority. That’s because new software is (generally speaking) easy to modularize, easy to test and easy to change when you find errors, vulnerabilities or awkwardness for the user – all with very little risk or significant capital outlay.

I’m certainly no expert, but intuitively, Agile project management seems to be exactly the right tool for that kind of job.

Where I take issue is when I hear it being bandied about as an outright replacement for all previous project management methods because it’s trendy, regardless of whether or not it’s the right fit.

Some Dude: “We’re embarking on an Agile transformation!”

Me: “That sounds great! What do you mean exactly?”

Same Dude: “It means we’ll be more agile!”

Me: “Uh, okay.”

Building a new skyscraper is a project that consumes time, materials and resources and is expected to achieve a benefit upon completion.

You can’t approach such a project with the mindset that “we’ll buy some land, start building upwards and make adjustments as we go”. When you get up to the 20th storey, it’s not so easy to make a decision at that point to go up another 20 storeys (did you put in a foundation at step 1 to support that?). Everything needs to be thoroughly thought through and planned in a significant amount of detail before you even purchase the land, otherwise the project will fail. Yes, there are opportunities to make small changes along the way as needs arise, but you really do need to be following a “grand plan” and you can’t start renting it out until it’s done.

Similarly, if you had a medical condition requiring several complex surgeries to be performed over several months, would you opt instead for a surgeon who says “I’ll just cut you open, start messing around in there and quickly adapt as I go – we can probably shave 20% off the total time”?

This brings me to supply chain planning in retail. On the face of it, the goal is to get people out of spreadsheets and into a functional planning system that can streamline work and improve results. It would seem that an Agile approach might fit the bill.

But building out a new planning capability for a large organization is much more like building a skyscraper than coding a killer app. Yes, the numbers calculated in a planning system are just data that can be easily changed, but that data directly drives the deployment of millions of dollars in physical assets and resources. It requires:

  • Tons of education and training for a large pool of people, grounded in principles that they will initially find unfamiliar. The unlearning is much harder than the learning.
  • A thorough understanding of what data is needed, where it resides and what needs to be done to improve quality and fill gaps.
  • A thorough understanding of how the new planning process and system will fit in with existing processes and systems in the organization that won’t be changing.

All of this needs to happen before you “flip the switch” to start moving goods AND the business has to keep running at the same time. A former colleague described projects of this nature as “performing open heart surgery while the patient is running a marathon”.

After this foundation is in place and stabilized, there are opportunities aplenty to apply Agile techniques for continuous improvement, analysis and a whole host of other things. But there needs to be a firm base on which to build. Even constructing a killer app with the Agile approach still requires a programming language to exist first.

So what am I saying here? That large scale organizational change programs are complex, risky, take a lot of time and require significant upfront investment before benefits can be realized?

Yeah, pretty much.

Sorry.

 

Is the juice worth the squeeze?


A little over 10 years ago I was on a project to help one of Canada’s largest grocery and general merchandise retailers design and implement new planning processes and technology. My role was the co-lead of the Integrated Planning, Forecasting & Replenishment Team and, shockingly, we ended up with a Flowcasting-like design.

The company was engaged in a massive supply chain transformation and the planning component was only one piece of the puzzle. As a result of this, one of the world’s preeminent consulting firms, Accenture, was retained to help oversee and guide the entire program.

One of the partners leading the transformation was a chap named Gary. Gary was a sports lover, a really decent person, great communicator and good listener. He also had a number of “southern sayings” – nuggets of wisdom gleaned from growing up in the southern United States.

One of his sayings that’s always stuck with me is his question, “is the juice worth the squeeze?”, alluding to the fact that sometimes the result is not worth the effort.

I can remember the exact situation when this comment first surfaced. We were trying to help him understand that even for slow and very slow selling items, creating a long term forecast by item/store was not only worth the squeeze, but also critical. As loyal and devoted Flowcasting disciples know, this is needed for planning completeness and to be able to provide a valid simulation of reality and work to a single set of numbers – two fundamental principles of Flowcasting.

The good news was that our colleague did eventually listen to us and understood that the squeeze was not too onerous and today, this client is planning and using Flowcasting – for all items, regardless of sales velocity.

But Gary’s question is an instructive one and one that I’ve been pondering quite a bit recently, particularly with respect to demand planning. Let me explain.

The progress that’s been made by leading technology vendors in forecasting by item/store has been impressive. The leading solutions essentially utilize a unified model/approach (sometimes based on AI/ML, and in other cases not), essentially allowing demand planners to largely take their hands off the wheel in terms of generating a baseline forecast.

The implications of this are significant, as it allows the work of demand planning to be more focused and value-added – that is, instead of learning and tuning forecasting models, demand planners are working with Merchants and Leaders to develop and implement programs and strategies to drive sales and customer loyalty.

But, I think, perhaps we might be reaching the point where we’re too consumed with trying to squeeze the same orange.

My point is this: how much better, or more accurate, can you make an item/store forecast when 60% or more of most retailers’ assortments sell fewer than 26 units per year by item/store? It’s a diminishing return for sure.

Delivering exceptional levels of daily in-stock and inventory performance is not solely governed by the forecast. Integrating and seamlessly connecting the supply chain from the item/store forecast to factory is, at this stage, I believe, even more crucial.

Of course, I’m talking about the seamless integration of arrival-based, time-phased, planned shipments from consumption to supply, and updated daily (or even in real time if needed) based on the latest sales and inventory information. This allows all partners in the supply chain to work to a single set of numbers and provides the foundation to make meaningful and impactful improvements in lead times and ordering parameters that impede product flow.
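To make the idea a bit more concrete, here’s a deliberately over-simplified sketch of the time-phased logic (the numbers, pack size and function name are hypothetical, and real planning engines layer in safety stock, lead-time offsets and much more): project the store’s inventory forward using the forecast, and plan an arrival in any week the projection would otherwise dip below zero. Those planned arrivals, in turn, become the demand on the DC, and so on back to the factory.

```python
def plan_arrivals(on_hand, weekly_forecast, order_multiple=12):
    """Plan time-phased arrivals for one item at one store (grossly simplified).

    on_hand:         current store inventory
    weekly_forecast: forecast unit sales per week
    order_multiple:  hypothetical case-pack size
    """
    planned = []
    projected = on_hand
    for demand in weekly_forecast:
        projected -= demand
        arrival = 0
        while projected < 0:   # plan enough cases to keep the projection above zero
            arrival += order_multiple
            projected += order_multiple
        planned.append(arrival)
    return planned

# 10 on hand and a forecast of 4 per week: arrivals land in the weeks the shelf
# would otherwise run dry, and those planned arrivals become the DC's demand.
print(plan_arrivals(10, [4, 4, 4, 4, 4, 4]))   # [0, 0, 12, 0, 0, 12]
```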

The leading solutions and enabling processes need to produce a decent and reasonable forecast, but that’s not what’s going to make a difference, in my opinion. The big difference, now, will be in planning flexibility and agility – for example, how early and easily supply issues can be surfaced and resolved and/or demand re-mapped to supply.

You and your team can work hard on trying to squeeze an extra 1-3% in terms of forecast accuracy. You could also work to ensure planning flexibility and agility. Or you could work hard on both.

It’s a bit like trying to get great orange juice. To get the best juice, you need to squeeze the right oranges.

Which ones are you squeezing?

The Potemkin Village

 

The problem with wearing a facade is that sooner or later life shows up with a big pair of scissors. – Craig D. Lounsbrough


Russia had recently annexed Crimea from the Ottoman Empire, and after a long war, the region of New Russia found itself under the rule of Empress Catherine II (a.k.a. Catherine the Great).

In 1787, Catherine embarked on a 6 month journey down the Dnieper River to New Russia to survey her new territory. Accompanying her on this journey was her boyfriend, Grigory Potemkin.

Unbeknownst to Catherine, the region had been devastated by the war. According to folklore, Potemkin – in an effort to placate Catherine – sent ahead an “advance team” to erect a fake village bustling with people before Catherine’s flotilla sailed by. After she had passed, the village would be taken down, rushed further downstream and reassembled to give Catherine the false impression that New Russia was a vibrant and welcome addition to her empire and that all of the treasure and bloodshed to obtain it was not in vain.

It’s been over 230 years, but the tradition of the Potemkin Village is alive and well today.

Don’t believe me?

Try visiting a retail store on a day when the store manager (Grigory) has just been informed that the bigwigs from home office (Catherine) will be stopping by for a visit. In all likelihood, an advance communication went to the store telling them that they don’t need to do anything to prepare in advance and they should just carry on as usual – the bigwigs don’t want to get in the way.

Yeah, right.

A flurry of activity soon ensues. The receiving area and back room are cleaned up and all stock is run out to the floor. Shelves and pegs are filled up, faced up and looking neat. Any aisle clutter is either put away or hidden. This is the kind of stuff that should be happening daily if people had the time – and yet, oddly, the time can be found to do two weeks’ worth of work in 3 days ahead of a VIP visit.

Sidebar: I once worked at a retailer (who shall remain nameless) with hundreds of stores each stocking thousands of products. But there was one store in particular that had its own unique set of stocking policies and ordering rules. This same store was always the top priority location when stock was low in the DC and needed to be rationed. What made this store so special? It happened to be located near the CEO’s home and he was known to shop there frequently. Not making that up.

Okay, back to the VIP visit. The big day arrives and the store is looking fantastic. The VIP entourage arrives and the store manager is waiting at the entrance to give the grand tour. Pleasantries are exchanged. How have sales been? Lots of customers in today! Any issues we need to know about?

Then comes the much anticipated Walking of the Aisles. The VIPs are escorted throughout the store, commenting on the attractiveness of the displays, asking questions and making suggestions….

Then someone in the entourage sees a shelf tag with no stock above it. “Why don’t you have stock? That’s sales we could be losing!”

The sheepish store manager replies: “I dunno. The ordering is centralized at headquarters. We just run product to the shelf when it arrives. We actually haven’t had that item in weeks and I can’t get a straight answer as to why not.”

“We need to support the stores better than this!”, exclaims one of the VIPs. “I’ll get this straightened out!”. Out comes the cell phone to snap a picture of the shelf tag below the void where stock should be. And for good measure, a few more pics of other holes in the same aisle.

A couple of taps and the pics are on their way to whichever VP is in charge of store replenishment with the subject line: “please look into this” (no time for proper capitalization or punctuation).

Ten minutes later, a replenishment analyst receives an email from her manager with the subject line: “FW: FW: FW: FW: please look into this”.

Another sidebar: I happened to be shadowing a replenishment analyst for another retailer for the purpose of learning her current state processes when one of those emails with pictures came in. There were 6 or 7 pictures of empty shelf positions and she researched each one. For all but one of the items, the system showed that there was stock in the store even though there was apparently none on the shelf. The last one was indeed stocked out, but a delivery was due into the store on that very same day. Was this a good use of her time?

Look, I know the tone of this piece is probably a bit more snarky than it needs to be. And although this whole scenario is clearly absurd when laid out this way, I’m not projecting malice of intent on anyone involved:

  • The VIP spotted a potential customer service failure and wanted to use her power to get it rectified. It never occurred to her that the culprit might be within the 4 walls of the store because: a) the store looked so nice and organized when she arrived; and b) the organization doesn’t measure store inventory accuracy as a KPI. If shrink is fairly low, it’s just assumed that stock management is under control.
  • In all likelihood, the store manager truly has no idea how replenishment decisions are made for his store. And while there’s a 4 inch thick binder in the back office with stock management procedures and scanning codes of conduct, nobody has actually properly connected the dots between those procedures and stock record accuracy more generally.
  • The replenishment analyst wants to help by getting answers, but she can’t control the fact that the wrong question is being asked.

The problem here is that there are numerous potential points of failure in the retail supply chain, any of which would result in an empty shelf position for a particular item in a particular store on a particular day. Nothing a senior manager does for 20-30 minutes on the sales floor of a store will do anything to properly identify – let alone resolve – which process failures are contributing to those empty shelves.

Jumping to the conclusion that someone on the store replenishment team must have dropped the ball is not only demoralizing to the team, it’s also a flat-out wrong assumption a majority of the time.

If you happen to be (or are aspiring to be) one of those VIPs and you truly want the straight goods on what’s happening in the stores, you need to change up your game:

  • Every so often, visit a store unannounced – completely unannounced – and spend some time in the aisles by yourself, soaking in the true customer experience for a while before speaking to store management.
  • When it’s time to get a feel for what can be done to keep the shelves full, put down the phone and pick up a handheld reader. Just because the stock isn’t on the shelf right now, that doesn’t mean that it isn’t elsewhere in the store or on its way already.
  • Spend the time you would normally spend on pleasantries and somewhat meaningless measures on a deep dive into some of those shelf holes with the store manager in tow:
    • Shelf is empty, but the system says there’s 6 in the store? Let’s go find it!
    • Truly out of stock with 0 reported on hand and none to be found? Let’s look at what sales have been like since the last delivery.
    • Can’t figure out why the replenishment system doesn’t seem to be providing what’s needed? Work through the calculations and see if there’s something wrong with the inputs (especially the on hand balance).

Will looking past the facade of the Potemkin Village solve the problems that it’s been hiding? Probably not. But you need to start somewhere.

In the words of George Washington Carver: “There is no shortcut to achievement. Life requires thorough preparation – veneer isn’t worth anything.”

Stoplights and Roundabouts


As someone who’s been doing project work for a long time, anytime I read something that makes me ponder, I take note.

Consider stoplights and roundabouts.

Stoplights are the dominant way we manage intersections and flows of traffic where two roads cross.  Have you ever thought about the assumptions behind this approach?

  1. People can’t make decisions on their own approaching an intersection and need to be told what to do
  2. The intersections must be managed with complex rules and technology with cables, lights, switches and a control center
  3. A plan and logic must be determined for every scenario, thus requiring a solution with multi-colored signals, arrows, etc

Now, think about roundabouts.  In a roundabout, cars enter and exit a shared circle that connects travel in all four directions.  The assumptions for this method are significantly different:

  1. People make their own decisions on entry and exit and trust one another to use good judgment
  2. The intersections are managed with simple rules and agreements: give the right of way to cars already in the circle and go with the flow
  3. Lots of scenarios happen, but co-ordination and common sense will be good enough to handle them

How about the performance of each approach?  Ironically, the roundabout outperforms the more complicated and sophisticated system on the three key performance metrics:

  1. They have 75% fewer collisions and 90% fewer fatal collisions;
  2. They reduce delays by 89%; and
  3. They are between $5,000 and $10,000 less costly to operate/maintain each year (and, of course, function as normal during power outages)

There are some pretty profound insights and learnings from this comparison.  Obviously, if you’re involved in designing and implementing new thinking and technology, keep it as simple as possible and don’t try to automate every decision.

The other key insight from this example is actually more profound and speaks to the nature of work, innovation and teams.

I’ve been very fortunate to have led two fairly important projects with respect to retail Flowcasting.  This dichotomy between stoplights and roundabouts highlighted why we were successful and paints a picture for how projects, and indeed work, could be organized better.

About 25 years ago I was the leader of a team at a large, national Canadian retailer whose mandate was to design a better way to plan the flow of inventory from supplier to store.  We would eventually design what we now call Flowcasting and would implement retail DRP and supplier scheduling for the entire company, including all suppliers – a first in complete integration from retailer to supplier.

As luck would have it, our team would eventually report up to a Director, who was, like the team, a bit of a maverick.  Let’s call him Geoff.

What Geoff did that was brilliant – and consistent with the roundabout philosophy – was to give me and the team almost complete decision making authority.  I remember him telling me, “This team knows what they’re doing and the design is solid.  My job is to clear trail for you, shelter you from unnecessary bureaucracy and make sure you can deliver”.

And he did.  The team had virtually the entire say in all decisions that affected the design and implementation.  That’s not to say we didn’t communicate with Senior Management and give updates and ask for opinions – we did, it’s just we felt like we were given ultimate say.  It was exhilarating and, as it turns out, a model for project work.

Fast forward 20 years and I’m a consultant on another Flowcasting project – this time for a mid-sized national hardgoods Canadian retailer.

In another stroke of good fortune, the business sponsor for the team inherently had a similar view about work and how projects should be delivered.  Let’s name him Ken.

Ken’s operating style also gave the team the latitude to make the key decisions regarding design and implementation – of course he kept abreast of things and contributed his input and advice but ultimately we were in charge.  His role, he said, was to educate and help the Senior people make the journey.

As an example, I remember Ken telling me before our first steering committee meeting, words to the effect…”We’re not going in looking for approval.  We know what we’re doing and why.  These sessions are about educating and informing the group, and every now and then asking for their opinions and advice”.

It was how the entire project operated.

In one example that demonstrated the team’s authority, I remember one of the analysts on the team helping Ken change his thinking on our implementation approach.  It was a great example of the team working with psychological safety and proof positive that ideas were more important than hierarchy.

I’ve been doing a lot of reading lately on the future of work and how companies can innovate.  And what I’m seeing is that a model for work (day to day, and also project work) is starting to emerge.

It’s based on the principle of turning people into self-organized, self-managing teams and giving them the space, freedom and authority to work and innovate – treating them like small, micro-enterprises.

Principles I’ve been fortunate enough to have experienced in two of the most successful and rewarding projects I’ve been involved with.

You can manage, innovate and drive change using the operating principles of the stoplight or the roundabout.

Choose wisely.

Keep Calm And Blame It On The Lag

 

A good forecaster is no smarter than everyone else, he merely has his ignorance better organized. – Anonymous


I’ve written on the topic of forecast performance measurement from many different angles, particularly in the context of forecasting sales at the point of consumption in retail.

Over the years, I’ve opined that:

  • Forecast accuracy (in the traditional sense) is a useless measure
  • Reasonableness is more important than accuracy, given that forecasts are, by their nature, forgiving planning elements
  • The outsized importance placed on forecast accuracy in supply chain planning is a myth
  • Accuracy and precision must be considered simultaneously
  • Forecasts should be judged against what is a reasonable expectation for accuracy
  • Forecasting at higher levels of aggregation to achieve higher levels of “accuracy” is a waste of time

After going back and re-reading all of that stuff, I realize they are all really just different angles and approaches for delivering the same message: “popular methods of comparing forecasts and actuals may not be as useful as you think, especially in a retail context”.

But in all of this time there is one key aspect of forecast measurement that I have not addressed: forecast lags. In other words, which forecast (or forecasts) should you be comparing to the actual?

Assuming, for example, that you have a rolling 52 week forecasting process where forecasts and actuals are in weekly buckets, then for any given week, you would have 52 choices of forecasts to compare to a single actual. So which one(s) do you choose?

Let’s get the easy one out of the way first. Considering that the forecast is being used to drive the supply chain, the conventional wisdom is that the most important lag to capture for measurement is the order lead time – the point at which a firm commitment to purchase must be made based on the forecast. For example, if the lead time is 4 weeks, you’d capture the forecast for 4 weeks from now and measure its accuracy when the actual is posted 4 weeks later.

Nope. To all of that.

While it’s true that measuring the cumulative forecast error over the lead time can be useful for determining safety stock levels, it’s not very useful for measuring the performance of the forecasting process itself, for a couple of reasons:

  1. It is a flagrant violation of demand planning principles. Nothing on the supply side of the equation (inventory levels, lead times, pack rounding, purchasing constraints, etc.) has anything to do with true demand. Customers want the products they want, where they want them and when they want them, at a price they’re willing to pay, period. The amount of time it happens to take to get from the point of origin to a customer-accessible location is completely immaterial to the customer.
  2. A demand planner’s job is to manage the entire continuum of forecasts over the forecast horizon. If they know about something that will affect demand at any point (or at all points) over the next 52 weeks, the forecasts should be amended accordingly.

Suppose that you’re a demand planner who manages the following item/location. The black line is 3 years’ worth of demand history and a weekly baseline forecast is calculated for the next 52 weeks.


Because you’re a very good demand planner who keeps tabs on the drivers of demand for this product, you know that:

  • The warm weather that drives the demand pattern for this item/location has arrived early and it looks like it’s going to stay that way between now and when the season was originally expected to start.
  • There are 2 one week price promotions coming up that have just been signed off and all of the pertinent details (particularly timing and discount) are known.
  • For the last 3 years, there have been 3 similar products to this one being offered at this location. A decision has just been made to broaden the assortment with 2 additional similar products half way through the selling season.

On that basis, I have 2 questions:

  1. How does the baseline forecast need to change in order to incorporate this new information?
  2. How would your answer to question 1 change if you also knew that the order-to-delivery lead time for this item/location was 1 week? 2 weeks? 12 weeks?

Hint: Because it was established at the outset that “you’re a very good demand planner who keeps tabs on the drivers of demand for this product”, the answer to question 2 is: “Not at all.”

So if measuring forecast error at the lead time isn’t the right way to go, then what lag(s) should be captured for measurement?

As with all things forecasting related, there is no definitive answer to this question. But as a matter of principle, the lags chosen to measure the performance of a demand planning process should be based on when facts become “knowable” that could affect future demand and would prompt a demand planner to “grab the stick” and override a baseline forecast modeled on historical patterns.

In some cases, upstream processes that create or shape demand can provide very specific input to the forecasting process.

For example, it’s common for retailers to have promotional planning processes with specific milestones, for example:

  • Product selection and price discounts are decided 12 weeks out
  • Final design of media to support the ad is decided 8 weeks out
  • Last minute adds, deletes and switches are finalized 3 weeks out

At each of those milestones, decisions can be made that might impact a demand planner’s expectation of demand for the promotion, so in this case, it would be valuable to store forecasts at lags 3, 8 and 12. Similar milestone schedules generally exist for assortment decisions as well.
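As a rough illustration (the structure and names below are hypothetical, not a prescription), “storing forecasts at lags 3, 8 and 12” simply means archiving, each week, the forecast for the weeks that are 3, 8 and 12 weeks out, and then comparing the posted actual against those snapshots – and only those snapshots – when the week closes:

```python
from collections import defaultdict

# archive[target_week][lag] = the forecast for target_week that was made `lag` weeks earlier
archive = defaultdict(dict)

def snapshot(current_week, rolling_forecast, lags=(3, 8, 12)):
    """Archive this week's forecast for the weeks that sit at the milestone lags.

    rolling_forecast[i] is the forecast for current_week + i (a 52-week rolling horizon).
    """
    for lag in lags:
        if lag < len(rolling_forecast):
            archive[current_week + lag][lag] = rolling_forecast[lag]

def lag_errors(target_week, actual):
    """Once the actual posts, compare it to every snapshot archived for that week."""
    return {lag: actual - forecast for lag, forecast in archive[target_week].items()}

# Week 100: archive the forecasts for weeks 103, 108 and 112 at lags 3, 8 and 12.
snapshot(100, [5.0] * 52)
# Week 112 closes with 9 units sold; only the lag-12 snapshot exists for it so far.
print(lag_errors(112, 9))   # {12: 4.0}
```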

In other cases, what’s “knowable” to the demand planner can be subject to judgment. For example, if actuals come in higher than forecast for 3 weeks in a row, is that a trend change or a blip? How about 4 weeks in a row?

Lags that are closer in time (say 0 through 4) are often useful in this regard, as they can show error trends forming while they are still fresh.

Unless tied to a demand shaping process with specific milestones as described above, long term lags are virtually useless. Reviewing actuals posted over the weekend and comparing them to a forecast for that week that was created 6 months ago might be an interesting academic exercise, but it’s a complete waste of time otherwise.

The point of measuring is to inform, so as to improve the process over the long term.

With the right tools and mindset, today’s “I wish I knew that ahead of time” turns into tomorrow’s knowable information.

Limit your inputs


Marcus Aurelius is widely considered to be one of the wisest people of all time. His classic and colossal work, ‘Meditations’, is a bible for crystal clear thinking, famously outlining the principles of Stoicism.

In Meditations, he asks a profound question that all designers and implementers should ponder often…

“Is this necessary?”

Knowing what not to think about. What to ignore and not do. It’s an important question, especially when it comes to designing and implementing new ways of working.

As an example, consider the process of developing a forward looking forecast of consumer demand, by item, by store. Of course, loyal readers and disciples know that this is the forecast that provides a key input to allow a retail supply chain to be planned using Flowcasting.

You might get your knickers in a knot to learn that in one of our most recent and successful implementations of Flowcasting, the store level forecasting process uses only two key inputs:

  1. The actual sales history by item/store in units
  2. An indication of whether that sales history occurred during an abnormal period (e.g., a promotion, an unplanned event, a stock-out period, a different selling price, etc.) – see the simple sketch below.
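Here’s a bare-bones sketch of how little that can involve (the names and logic are purely illustrative – real implementations fit a selling pattern rather than a flat average): take the recent weekly unit sales, substitute a typical week for any weeks flagged as abnormal, and project the cleaned history forward.

```python
def baseline_forecast(weekly_sales, abnormal_flags):
    """Build a simple baseline forecast from history, ignoring abnormal weeks.

    weekly_sales:   recent weekly unit sales for one item/store
    abnormal_flags: True for weeks distorted by promotions, stock-outs,
                    price changes or other one-off events
    """
    normal_weeks = [s for s, flagged in zip(weekly_sales, abnormal_flags) if not flagged]
    typical_week = sum(normal_weeks) / len(normal_weeks) if normal_weeks else 0.0
    # Substitute a typical week for the abnormal history, keep the rest as-is,
    # then project the cleaned history forward as the baseline.
    return [typical_week if flagged else s
            for s, flagged in zip(weekly_sales, abnormal_flags)]

# Shortened to 12 weeks for illustration: week 7 was a promotion, so its spike
# is replaced with a typical week rather than being projected forward.
history = [0, 1, 0, 0, 2, 0, 6, 1, 0, 0, 1, 0]
flags   = [False] * 6 + [True] + [False] * 5
print(baseline_forecast(history, flags))
```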

Now, I know what you’re thinking. What about all those ‘other’ things that influence consumer demand that many people espouse? You know, things like the weather, competitor activities and any other causal variables?

Counter-intuitively, all these additional ‘factors’ are not really required at the retail store level and for very good reasons.

First, did you know that for most retailers, 50-60% of the item/store combinations sell 24 or fewer units per year? That’s less than one unit every two weeks. Furthermore, about 70-80% of the item/store combinations sell fewer than 52 units per year, or about 1 unit per week.

Consider the very slow sellers – those selling 24 or fewer units a year. If the last 52 weeks’ sales were 24 units, and so were the previous year’s, it would stand to reason that a reasonable forecast for the upcoming 52 weeks would be around 24 units.

Keep in mind that as actual sales happen, the forecasting process would always be re-forecasting – looking ahead and estimating the upcoming 52 weeks of consumer demand based on the most recent history.

Now, consider a 52 week forecast of 24 units. That breaks down to a weekly forecast of 0.46.

Factoring in additional variables is not likely to make actual sales of 30 units happen (assuming stock outs were not excessive) – which would be the equivalent of increasing previous year’s sales by 25%. A higher forecast does not mean sales will actually happen.

Even a 10% increase in the forecast would only increase the annual forecast by about 2 units, or about .04 units per week.

Is there really a difference between an average weekly forecast of 0.46 and 0.50 units? Isn’t that essentially the same number? In terms of a forecast, they are both reasonable (in reality, the forecast would also have a pattern to the expected sales and would be expressed as integers, but you get the point).
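And for what it’s worth, here’s a tiny sketch of that last parenthetical point (the carry-forward rounding scheme shown is just one illustrative way to do it): the fractional weekly forecast is carried internally, and whole units are released only as the cumulative total rolls past an integer.

```python
def to_integer_weeks(weekly_fractions):
    """Express fractional weekly forecasts as whole units without losing total volume.

    The running remainder is carried forward, so a 0.46/week forecast shows up as
    a unit roughly every second week instead of rounding to zero everywhere.
    """
    integer_weeks, carried = [], 0.0
    for fraction in weekly_fractions:
        carried += fraction
        units = int(carried + 1e-9)   # release whole units as the cumulative total allows
        integer_weeks.append(units)
        carried -= units
    return integer_weeks

# 24 units spread evenly across 52 weeks (~0.46/week):
plan = to_integer_weeks([24 / 52] * 52)
print(sum(plan))    # 24 -- the annual total is preserved
print(plan[:6])     # [0, 0, 1, 0, 1, 0] -- roughly one unit every other week
```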

One of the keys of the retail Flowcasting process is using only a limited number of inputs to build the item/store forecast, while allowing people to easily understand it and thus manage it by exception – handling only the very limited number of exceptions that cannot be automatically resolved using system rules.

Add some basic supply information (like inventory balances and ordering rules) to the forecast and voila – the entire supply chain can be calculated/planned from store to supplier – every day, for every planned inventory flow and projection for a rolling 52 weeks into the future.

What’s elegant and inherently beautiful about Flowcasting is that daily re-planning of the entire supply chain provides the agility to adjust current and planned inventory flows and ensures everyone is working to a single set of numbers – based on the drumbeat of the consumer. It negates the need to find the ‘perfect forecast’ and, as such, allows us to limit our inputs to the bare essentials.

Marcus Aurelius was right.

Limit your inputs and always ask, “Is this necessary?”

It’s good advice in business, in life and especially store level forecasting.

The Great Lever of Power

I shan’t be pulling the levers there, but I shall be a very good back-seat driver. – Margaret Thatcher


A number of years ago, I saw a television interview with President Ronald Reagan after he left office. In that interview, he reminisced on his political career, including when he first stepped into the Oval Office in 1981.

I can’t find any transcripts or direct quotes from that interview, but I do distinctly remember him saying something to the effect of: “Before I assumed the presidency, I imagined a great lever of power on the Resolute Desk. When I took office, I learned that the lever actually existed – but it wasn’t connected to anything.” (If anyone out there has the exact quote, please share!)

I think of that whenever I hear senior leaders in retail say things like “our inventory is too high – we need to get it under control”.

What often follows this declaration is a draconian set of directives to “bring the inventory down”:

  • “Look at all of our outstanding purchase orders and cancel anything that’s not needed”
  • “We can’t sell excess stock out of the DCs, so return as much as possible and push the rest out to the stores where it can sell”

[One quarter later…]:

  • “Oh shit, our in-stock has nosedived and we’re losing sales! Buy! Buy! Buy!”

Rinse and repeat.

It has been described to me as a “swinging pendulum” in terms that would lead one to believe that these inventory imbalances are cyclical in nature, like the rate of inflation in the economy. When it gets too high, the central bank steps in with an interest rate hike to steer it to an acceptable range.

A couple of problems with that:

  1. The behaviour of consumers drives the inflation rate and this behaviour can’t be directly controlled. In contrast, the processes that drive inventory flow are internal to the retailer and, as such, are directly controllable.
  2. The pendulum swings themselves are caused by management’s efforts to control the pendulum swings – that popping sound you heard was my head exploding.

I should note that I rarely hear “We need to review our inventory management policies and processes to determine what’s causing our inventory levels to be higher than expected, so that we can improve the process to ensure that we can flow stock better in the future without sacrificing in stock.”

Inventory is not an “input variable” that can be directly manipulated by management and brought to “the right level” in the aggregate. It is an output of policies and processes being executed day in, day out for every item at every location over a period of time. Believing that inventory levels can be directly controlled with blunt instruments is like believing that you can directly impact your gross margin without changing the price or the cost (or both).

It may sound trite, but if management doesn’t like the output of the process, then they must necessarily be taking issue with the process inputs or the process itself (both of which, by the way, are owned by management).

On the input side:

  • Are your stocking policies excessive compared to variability in demand?
  • Are you purchasing in higher quantities or with higher lead times than you used to (e.g. container loads from overseas versus pallets from a domestic source)?
  • Are you buffering poor inbound performance from suppliers with more safety stock?

On the process side:

  • Are demand planners striving to predict what will happen in an unbiased way or are they encouraged to be optimistic?
  • Are people buying first and figuring out how to sell it later?
  • Is your inventory higher because your sales have been increasing?

Management does not “own results”.

Management owns the processes that give rise to the results. If you make the determination that “inventory is too high” and you don’t know why, then you’re not doing your job.

Or to put it another way:

The aim of leadership should be to improve the performance of man and machine, to improve quality, to increase output, and simultaneously to bring pride of workmanship to people. Put in a negative way, the aim of leadership is not merely to find and record failures of men, but to remove the causes of failure: to help people to do a better job with less effort. – W. Edwards Deming