Blissful Ignorance is Blissfully Deceptive
You are receiving this post/email because you are a patron/subscriber to Of Two Minds / Charles Hugh Smith. This is Musings Report 2024-47.
In many cases, it's true that ignorance is bliss. It's certainly true for the systems that underpin everyday life: energy, food, water, and the other essentials listed in last week's Musings, The 8 Essentials We Need to Control.
If we know nothing about these systems, it's easy to feel complacently confident in their reliability and stability, as all we have to go on is our experience of interruptions being rare and brief.
Knowing too much ruins the blissful confidence of not knowing. For example, some years ago a reader who works in the gasoline / diesel supply chain that keeps your local gas station stocked with fuel explained that gas stations are never very far from running dry. If any glitch delays any link in the supply chain, stations quickly run out of fuel.
Knowing this changes one's perception. Where other customers see the tanker truck refilling the service station tanks as mundane and unworthy of notice, I can't help but see it as the final leg of a global supply chain that is inherently vulnerable to disruption at many points.
We take it for granted that our electricity, fuel, water, Internet, groceries, etc., will always be fully stocked / fully functional 24/7 as the backdrop of everyday life, freeing us to focus on other things. We take all this for granted for two reasons:
1) that's been our experience our entire lives, and
2) we know somebody somewhere "owns" the maintenance of these systems: if a fallen tree takes down a power line, somebody somewhere owns the removal of the tree and the restoration of the line. Since somebody else owns it, we don't have to think about it.
As I've endeavored to explain in various posts, there are limits on the capabilities of those we count on to maintain our essential systems come what may.
It's against this backdrop of the blissful confidence of not knowing that we read that the tap water in Asheville is finally drinkable again after 55 days of no doubt strenuous effort by those working to restore water service.
From the perspective of blissful ignorance, we might assume the storm that laid waste to the water system was a one-time outlier event that will never happen again in our lifetimes. But this might not be an accurate projection. Hundred-year storms, floods and droughts seem to be occurring every year or two now.
All these essential systems have been optimized for "normal" uneventful life to reduce costs. They are not designed for an increase in massively disruptive events. To take one example of many, spare parts are kept to a minimum to reduce costs. This means that there are only so many transformers, etc., available nationally and regionally, and only so much production capacity available should the stock of transformers be depleted by storm damage.
Furthermore, these production facilities and storage of essential components have been heavily centralized to reduce costs / optimize efficiencies, so one storm can cripple national production.
In other words, should abnormal events deplete the inventory of transformers, the system is not capable of producing new ones at the rate needed to replace what was damaged or lost. Someone "owning" the maintenance isn't enough to fix what's broken, as the entire supply chain is inadequate.
As I explained in The Catastrophic Consequences of Under-Competence, the competence of those tasked with maintaining essential systems is another limit that's invisible until the system fails. The point here is that staff are competent to handle "normal" events but unprepared to deal with unusual events that are beyond their training and experience.
Automated systems amplify this "missing competence" because once those who designed / coded the automation retire, there's nobody left who truly understands how to do the tasks that were automated or how the automation actually works.
Once this experiential knowledge has been lost, it cannot be recovered by those who only have experience of using the automated system, not modifying it from the ground up. This sets systemically consequential limits on the capacity of those who "own" the maintenance of essential systems to fix what breaks in even "normal" accidents / events.
We can understand this as a narrowing of the band of what's "normal" due to optimization, and a narrowing of the experiential capital that is the only real foundation of the deep knowledge needed to deal with crises. As Donald Schon explained in his book The Reflective Practitioner, our experiential knowledge accretes slowly and intuitively, in ways we cannot fully describe.
We can also understand the limits of knowledge gained from "normal" operations in terms of what correspondent Tom D. describes as "getting inside their OODA loop": the feedback-driven response cycle of Observe, Orient, Decide, and Act.
CHS NOTE: I understand some readers object to paywalled posts, so please note that my weekday posts are free and I reserve my weekend Musings Report for subscribers. Hopefully this mix makes sense in light of the fact that writing is my only paid work/job. Who knows, something here may change your life in some useful way. I am grateful for your readership and blessed by your financial support.