Supply Chain Summit

Tomorrow, Tuesday, September 10, the US Department of Commerce and the Council on Foreign Relations will host a Supply Chain Summit, organized to explore proactive strategies for strengthening global supply chain resilience. You can still register at the link to participate virtually.

Rana Foroohar, one of my favorite thinkers and writers, will moderate a morning panel. In today’s FT she writes:

US commerce secretary Gina Raimondo… told me last week that the biggest surprise of her tenure was learning “just how unprepared the federal government was to identify and react to supply chain disruptions, and how unsophisticated the approach to this has been for so long”. Part of this is down to the fact that the entities holding the best and most granular information about supply chains are private companies. They tend to be looking for individual risks in specific areas, rather than systemic issues across the economy. Governments, on the other hand, may be able to identify the need for more resilience in areas that are crucial for economic or national security — such as semiconductors or pharmaceuticals — but have little understanding of the particulars of each supply chain, or how they might interact with areas like logistics, transport, energy or power in the midst of a crisis.

This summit is being hosted to enhance that understanding. The Department of Commerce has created a Supply Chain Center. Other federal agencies have created similar functions. Various data-gathering efforts and analytical approaches are underway. Over the years I have contributed some thoughts about how to be more data-informed regarding big flows under serious stress.

It is important work. We are early in the work. We need to work smart — both near-term and long-term.

Contemporary high-volume, high-velocity demand and supply networks are Complex Adaptive Systems, not neat Newtonian machines. The more granular our flow data, the more dynamic, and paradoxically the more unpredictable, our outcomes appear. Probabilities can be estimated and assessed. More and better data will improve our sense of probabilities. But strict predictability will remain elusive.
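A toy illustration of this point (my own sketch, not from any of the efforts described here): a Monte Carlo simulation of a hypothetical multi-stage flow, with made-up stage counts and disruption rates, can estimate the probability of a serious end-to-end delay quite well, even though no single run is predictable.

```python
import random

random.seed(42)  # fixed seed so the estimate below is reproducible

def simulate_flow(n_stages=4, disrupt_p=0.1, delay_days=5):
    """One end-to-end run: each stage may independently suffer a disruption.
    The stage count, disruption probability, and delay are illustrative."""
    total_delay = 0
    for _ in range(n_stages):
        if random.random() < disrupt_p:
            total_delay += delay_days
    return total_delay

def estimate_late_probability(threshold=5, trials=100_000):
    """Estimate P(total delay >= threshold) by Monte Carlo sampling."""
    late = sum(1 for _ in range(trials) if simulate_flow() >= threshold)
    return late / trials

p_late = estimate_late_probability()
print(f"Estimated probability of a significant delay: {p_late:.3f}")
# For these toy parameters the analytic answer is 1 - 0.9**4 ≈ 0.344,
# and the estimate lands close to it. Which *particular* shipment is
# delayed on any given run remains unpredictable.
```

More trials tighten the probability estimate; they do nothing to make an individual outcome foreseeable, which is the distinction the paragraph above is drawing.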

Several weeks ago a client asked me to help them look at an emerging risk to some crucial flows. Other outsiders were also asked to offer insights. I tend to focus on midstream flows, especially the confluence of several flows (aka capacity concentrations). One of the others generated a fairly granular sizing and siting of downstream demand. I deduce midstream probabilities from diverse indicators. They induce demand dynamics from specific, carefully curated data sources. I hope to be data-informed. These others are definitely data-driven.

In this particular case, for a host of idiosyncratic reasons, the data sources available were seriously wrong. Unfortunately, the data curators and analysts involved did not have a sufficient contextual understanding to recognize several signals of data deficiency. Instead of running the results, recognizing problems, and trying a different way to slice and dice their data, they generated an authoritative report of initial results.

The client did not need me to point out problems. The client understood their own demand well enough to immediately suss out fundamental problems in the data-driven analysis. It was embarrassing for everyone. But quickly recognizing the problems was much better than making decisions off wildly skewed angles on reality.

The client was wise in involving several different angles on their potential problem. The client was wise in not expecting a silver bullet. This client's senior decision-makers are well-informed regarding context. More than many others, these decision-makers are very curious and consistently humble (somehow those two characteristics are all too seldom found together).

I have seen this client be decisive. But they are, if anything, unusually patient. Their corporate culture values shaping and recognizing the “right time” for making and executing a decision. They watch their flows — upstream, midstream, and downstream. They listen to their suppliers, stakeholders, and customers.

This client brings well-informed, well-conceived deniable hypotheses to potential problems. They actively probe context. They adapt their hypotheses to emerging signals. They act courageously, continue to watch/listen for outcomes, and adjust accordingly. They are not afraid of being wrong. They assume that even when they are right, their actions will create outcomes requiring adaptation. The goal is less a matter of being right or avoiding wrong and much more about being effective in context.

Of course, data is needed to make constructive decisions. More quantitative and qualitative data is usually better than less. But no matter how much granular data is eventually gathered by all the fabulous sensors distributed across our 2054 Internet of Things, the decisions to be made, whether by human wetware or the most advanced AI, will still require a sense of context, purpose, and emergence.

Data gathering, creation, and analysis are tough. Synthesizing outcomes in meaningful context to advance constructive purposes is crucial.