A dynamic digital twin

Frank Gehry is widely acclaimed as one of the world’s greatest architects. His most famous and celebrated building, the Guggenheim Museum in Bilbao, is the project whose design and construction elevated him to superstardom.

The story of how Gehry designs, and of the technologies he used to develop this and subsequent masterpieces, is instructive and highly relevant to supply chain planning and management.

Gehry usually begins by sketching ideas on paper with scrawls that would mystify most folks. Then he works mostly with models – stacking and restacking wooden building blocks of different sizes, always looking for something that is both functional and visually appealing.

Until recently, these physical models were how he worked. His studio is filled with them – the culmination of decades of model building. He usually starts at one scale, then tries another and another to see the project from varying perspectives. He zeroes in on some aspects of the design, zooming in and out until he better understands it from many different viewpoints and angles. He’s always trying new ideas, reviewing his designs with his team and the client, deciding what works and what doesn’t. Eventually he settles on the design, and then they get on with it.

After landing the project to design the Guggenheim Bilbao, he and his team spent the better part of two years working through these iterative models, using the decidedly analog world of building blocks and cardboard to visualize the result.

Then, our old friend technology made a house call and changed his design capability forever.

Gehry was introduced to design software called CATIA, which allowed him to build his designs on a computer. The software was originally built to help design jets, but was adapted to allow buildings to be designed – on a computer – in three dimensions. Early in his career his designs were mostly straight lines and box-like shapes, but this technology would allow him to design beautiful, aesthetically stunning curves and spirals.

CATIA’s capabilities proved incredible. Gehry and his team could alter the design quickly, change curves or shapes, and the system would instantly calculate the implications for the entire design – from structural integrity to electrical/plumbing requirements, to overall cost. They could iterate new ideas and concepts on the computer, simulate the results, then rinse and repeat, and only then, once happy, begin construction.

The Guggenheim Bilbao was fully designed on a computer before construction began.

In a moment of foreshadowing, the resulting digital design was labelled a “digital twin”. Only once the digital twin was finalized and agreed upon did construction begin.

The term “digital twin” has become somewhat fashionable and, to be honest, quite important in supply chain. So what do people mean by “digital twin” when thinking about the supply chain? Here’s one definition…

A digital twin is a digital replica of a physical supply chain. It helps organizations recreate their real supply chain in a virtual world so they can test scenarios, model different nodes, modes, flows, and policies and understand how decisions and disruptions will impact network operations.

For most supply chain folks, the digital twin is relatively static – a snapshot representing the current state of the supply chain, as of right now.

But, like Gehry’s ability to dynamically change design elements and immediately see the impact overall, wouldn’t the best digital twin for supply chains also be dynamic, complete, and forward-looking?

It would.

And isn’t that what Flowcasting is?

It’s a future-dated, up-to-date, complete model of the business. It depicts all current and projected demand, supply, inventory, and financial flows and resource requirements, based on the strategies and tactics that are driving a retailer and their trading partners. If something changes, then the dynamic model re-calculates the projections – so the forward-looking digital twin is always current. Everyone can see the projections in their respective language of the business (e.g., units, cases, dollars, capacities, resources) and work to a single set of numbers.
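The recalculation at the heart of this dynamic model can be illustrated with a minimal sketch. This is classic time-phased (DRP-style) logic for a single item/store, not any vendor’s actual implementation; the function and parameter names are purely illustrative. Given on-hand inventory, a forecast by period, and ordering parameters, it projects inventory forward and plans arrivals (and the shipments that support them) wherever the projection would drop below safety stock.

```python
def project_flows(on_hand, forecast, safety_stock, order_qty, lead_time):
    """Time-phased projection for one item/store (illustrative DRP-style logic).

    For each future period: consume the forecast, and whenever projected
    inventory would fall below safety stock, plan an arrival in that period
    and the corresponding shipment lead_time periods earlier.
    """
    planned_arrivals = [0] * len(forecast)
    planned_shipments = [0] * len(forecast)
    projected = []
    balance = on_hand
    for t, demand in enumerate(forecast):
        balance -= demand
        while balance < safety_stock:
            balance += order_qty
            planned_arrivals[t] += order_qty
            # the shipment supporting this arrival must leave lead_time periods earlier
            planned_shipments[max(0, t - lead_time)] += order_qty
        projected.append(balance)
    return projected, planned_arrivals, planned_shipments
```

The “dynamic” part is simply that when actual sales or inventory change, the projection is rerun with the latest numbers – so the forward-looking picture everyone works to is always current.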

The architectural “digital twin” was a breakthrough approach for Frank Gehry and architecture in general.

The forward-looking, dynamic “digital twin” – that is, Flowcasting – is a similar breakthrough approach for supply chain planning.


“Life can be improved by adding, or by subtracting. The world pushes us to add because that benefits them. But the secret is to focus on subtracting.”

                    - Derek Sivers

People don’t subtract.

Our minds add before even considering taking away.

Don’t believe me?

Leidy Klotz is a Behavioral Science Professor at the University of Virginia and a student of “less”. He conducted a series of experiments demonstrating that people think “more” instead of “less”.

Consider the following experiment and the ask.

Thousands of participants were asked to make the patterns on the left and right sides of the dark middle vertical line match each other, with the fewest changes.

There are two best answers. One is to add four shaded blocks on the left and the other is to subtract four shaded blocks on the right.

Only about 15 percent of participants chose to subtract.

Intrigued, Professor Klotz and his research assistants concocted numerous additional experiments to test whether people would add or subtract. They all produced the same result and conclusion – people are inclined, addicted even, to adding. It wasn’t close.

Big fucking deal, right?

Not so fast. Unfortunately, adding almost always makes things more complicated, polluted, and worse. You’d be better off subtracting.

A great example in supply chain is demand planning.

Demand planning, according to many, is becoming the poster child of adding. Let’s factor in more variables to produce an even more beautiful and voluptuous forecast. Are you sure all these additional variables will improve the demand plan?

I doubt it.

First, many companies are forecasting what should be calculated. It’s been proven that the farther away from end consumption you’re trying to forecast, the more variables you’ll try to add. And the resulting forecast usually gets worse the more you add – since you’re often adding noise.

We have a retail client that is forecasting consumer demand at the item/store level only and calculating all inventory flows from store to supplier – what we call Flowcasting. Their demand planning process only considers two variables to calculate the baseline forecast:
• the sales history in units
• an indication of whether sales were influenced by something abnormal (e.g., promotions, clearance, out-of-stocks)
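To make the two-variable idea concrete, here’s a deliberately naive sketch. The client’s actual method isn’t described in detail, so this is only an assumption of the simplest possible version: average the weekly sales, excluding any week flagged as abnormal so promotions, clearances, and stockouts don’t distort the baseline.

```python
def baseline_forecast(history):
    """Baseline weekly forecast from just two inputs per week:
    units sold, and a flag marking abnormal weeks (promotion,
    clearance, out-of-stock, etc.).

    history: list of (units_sold, was_abnormal) tuples.
    """
    # keep only the "clean" weeks for the baseline
    normal_weeks = [units for units, abnormal in history if not abnormal]
    if not normal_weeks:
        return 0.0  # no clean history; a real system would fall back to something else
    return sum(normal_weeks) / len(normal_weeks)
```

A production version would weight recent history, handle seasonality, and so on – but the point stands: two inputs, everything else subtracted.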

All “other” variables that the “experts” say should be included have been subtracted.

Yet their planning process consistently delivers industry leading daily in-stocks and inventory flows to the store shelf.

The idea is simple, profound, and extremely difficult for us all. For process and solution designs, and pretty much everything, you need to remove what’s unnecessary.

You need to subtract.

Is the juice worth the squeeze?


A little over 10 years ago I was on a project to help one of Canada’s largest grocery and general merchandise retailers design and implement new planning processes and technology. My role was the co-lead of the Integrated Planning, Forecasting & Replenishment Team and, shockingly, we ended up with a Flowcasting-like design.

The company was engaged in a massive supply chain transformation, and the planning component was only one piece of the puzzle. As a result, one of the world’s preeminent consulting firms, Accenture, was retained to help oversee and guide the entire program.

One of the partners leading the transformation was a chap named Gary. Gary was a sports lover, a really decent person, great communicator and good listener. He also had a number of “southern sayings” – nuggets of wisdom gleaned from growing up in the southern United States.

One of his sayings that’s always stuck with me is the question, “is the juice worth the squeeze?” – alluding to the fact that sometimes the result is not worth the effort.

I can remember the exact situation when this comment first surfaced. We were trying to help him understand that even for slow and very slow selling items, creating a long-term forecast by item/store was not only worth the squeeze, but critical. As loyal and devoted Flowcasting disciples know, this is needed for planning completeness, to provide a valid simulation of reality, and to work to a single set of numbers – fundamental principles of Flowcasting.

The good news is that our colleague did eventually listen and understood that the squeeze was not too onerous. Today, this client is planning with Flowcasting – for all items, regardless of sales velocity.

But Gary’s question is an instructive one and one that I’ve been pondering quite a bit recently, particularly with respect to demand planning. Let me explain.

The progress made by leading technology vendors in forecasting by item/store has been impressive. The leading solutions essentially utilize a unified model/approach (sometimes based on AI/ML, in other cases not), allowing demand planners to largely take their hands off the wheel in generating a baseline forecast.

The implications are significant: the work of demand planning can become more focused and value-added. Instead of learning and tuning forecasting models, planners can work with Merchants and Leaders to develop and implement programs and strategies that drive sales and customer loyalty.

But perhaps we’re reaching the point where we’re too consumed with trying to squeeze the same orange.

My point is this: how much better, or more accurate, can you make an item/store forecast when, in most retailers’ assortments, 60%+ of items sell fewer than 26 units per year by item/store? It’s a diminishing return for sure.

Delivering exceptional levels of daily in-stock and inventory performance is not governed solely by the forecast. At this stage, I believe, seamlessly connecting the supply chain from the item/store forecast back to the factory is even more crucial.

Of course, I’m talking about the seamless integration of arrival-based, time-phased, planned shipments from consumption to supply, and updated daily (or even in real time if needed) based on the latest sales and inventory information. This allows all partners in the supply chain to work to a single set of numbers and provides the foundation to make meaningful and impactful improvements in lead times and ordering parameters that impede product flow.
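A small sketch of how that integration works, continuing the earlier illustrative logic (function and parameter names are assumptions, not a vendor API): each store’s time-phased planned arrivals imply shipments that must leave the DC earlier by the lead time, and summing those across stores gives the DC its own time-phased demand. Repeating the same step rolls DC shipments up to the supplier – so every echelon plans against the same numbers.

```python
def cascade_to_dc(store_shipments, store_to_dc_lead_time):
    """Roll store-level planned arrivals up into time-phased demand on the DC.

    A quantity arriving at a store in period t must leave the DC
    store_to_dc_lead_time periods earlier; the DC (and, by repeating
    this step, the supplier) therefore works to the same set of numbers
    the stores do.

    store_shipments: list of per-store lists of planned arrival qtys by period.
    """
    horizon = max(len(s) for s in store_shipments)
    dc_demand = [0] * horizon
    for shipments in store_shipments:
        for t, qty in enumerate(shipments):
            if qty:
                # offset by lead time; clamp to period 0 for already-due needs
                dc_demand[max(0, t - store_to_dc_lead_time)] += qty
    return dc_demand
```

Re-running this cascade daily (or in real time) with the latest sales and inventory is what keeps every trading partner’s view current.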

The leading solutions and enabling processes need to produce a decent and reasonable forecast, but that’s not what’s going to make a difference, in my opinion. The big difference, now, will be in planning flexibility and agility – for example, how early and easily supply issues can be surfaced and resolved and/or demand re-mapped to supply.

You and your team can work hard on trying to squeeze an extra 1-3% in terms of forecast accuracy. You could also work to ensure planning flexibility and agility. Or you could work hard on both.

It’s a bit like trying to get great orange juice. To get the best juice, you need to squeeze the right oranges.

Which ones are you squeezing?

Marketplace Perceptions Rarely Reflect Reality (at least in this case)


What’s wrong with this picture?

Back in 2014, Lora Cecere (a well-regarded supply chain consultant, researcher and blogger) wrote a post called Preparing for the Third Act. She said “JDA has used the maintenance stream from customers as an annuity income base with very little innovation into manufacturing applications. While there has been some funding of retail applications, customers are disappointed.”

Our experience is consistent with Lora’s assessment. We’ve been working in the trenches on this for the last 20 years and that opinion is also shared by many of our colleagues in the consulting ranks.

In fact, we have first-hand experience with customers who have been disappointed and with customers who are delighted. That puts us in a unique position.

What’s going on now makes no sense. You can buy the software that disappoints, but you can’t buy the software that does not disappoint. To make things even more absurd, the same company (JDA) has both software packages in its stable.

Retailers agree that planning all of their inventory and supply chain resources based on a forecast of sales at the retail shelf makes perfect sense. Additionally, manufacturers agree that getting time-phased replenishment schedules based on those plans from their retail customers provides significant additional value across the extended supply chain.

But because you can only buy the software that disappoints, most of the implementations will likewise be disappointing. Consequently, the perception of these systems in the marketplace is fairly negative.

It shouldn’t be. These systems can work very well.

First, a brief history of where this all started and how we got where we are today.

Initially, retailers had a choice between time-phased planning software designed for the manufacturing/distribution market or software designed for retail that couldn’t do time-phased planning.

Later, software was developed specifically for the mission: time-phased planning at store level. It can handle gigantic data volumes economically, and it’s easy to use and easy to implement. It’s suitable for a small company (a handful of stores), yet it has been tested with volumes up to 450 million item/store combinations on inexpensive hardware.

Unsurprisingly, a square peg forced into a round hole (systems initially designed for use in manufacturing plants and distribution centres being applied to store level) yielded the disappointing results.

Also unsurprisingly, a system designed specifically to plan from store level back to manufacturers works just fine. In fact, our client (a mid-market Canadian hard goods retailer) is now planning every item at every store and DC, sharing schedules with suppliers, managing capacities and achieving extraordinary business results.

No doubt, the problem has been solved.

So what’s the path forward?

It’s in everybody’s best interest to work together.

From JDA’s perspective, this is a new wide-open market for them – and it’s enormous. But it won’t be developed if the marketplace perceives that implementing these systems delivers disappointing results.

Even if JDA were to develop a new system with new technology and features appropriate to a retail business, they would still need to build it, sell it to some early adopters, get it working well, and rack up a few unequivocal success stories before they could begin to overcome the current level of customer disappointment.

How long will all of that take? An optimistic estimate would be 2 years. A realistic one is more like 3-5 years. Will there still be a market then?

From the retailer’s and manufacturer’s perspective, they could be saving tens to hundreds of millions of dollars per year (depending on size) while providing a superior consumer experience.

From the consultants’ point of view (the people who recommend and implement these systems for a living), having the ability to implement the software that isn’t being sold – but has been proven to work – will increase the number of implementations with outstanding results (rather than disappointing ones). The net effect is that JDA will have more revenue and more success than if they continue to keep this software off the market. JDA has made this type of arrangement with other partners to their mutual benefit, without head-butting or causing confusion in the marketplace.

It could be that JDA is too close to the problem to see this as a solution.

Maybe Blackstone can look at situations like this more objectively and without bias, unencumbered from all that’s transpired to date.