The Catastrophic Consequences of Under-Competence
You are receiving this post/email because you are a patron/subscriber to Of Two Minds / Charles Hugh Smith. This is Musings Report 2024-33.
We all understand human error: someone was tired and misread the situation, or they were impatient. We also understand incompetence: the individual simply didn't have the knowledge and experience needed to make the right decisions and take corrective action.
Author Charles Perrow studied organizational weaknesses that generate flawed responses to what he calls "normal accidents," responses that make the situation far worse. In other words, the system itself increases the risk of normal accidents becoming catastrophic accidents.
"Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks."
Essayist Samo Burja has provided a nuanced account of how the knowledge and skills needed to design industrial organizations and modify their complex systems decay as knowledge/experience are proceduralized, and more recently, automated by algorithms.
The End of Industrial Society
In his analysis, industrialization was fundamentally the reduction of skilled labor to procedures that unskilled labor could perform at a fraction of the cost of workshop-based technologies. This required mass schooling to prepare the labor force to work in factories, what Burja describes as "human capital of the industrial type."
As the economy shifted from industrial to post-industrial, the corporate office demanded the proceduralization of white-collar work, much as factories had proceduralized the production of goods. This required mass schooling in the university system, which produced "human capital of the white-collar type."
In the same way that craft skills were viewed as unnecessary in the specialized, proceduralized factory, white-collar work was also de-skilled into siloed specialties and knowledge of procedures.
Systemic (deep) knowledge went by the wayside in favor of what anthropologist David Graeber famously called "BS work": following procedures of data processing, compliance, oversight, etc. This dynamic led to the over-production of white-collar work, for there is no end to the layers of complexity that can be added, which then require their own layers of precautions, compliance, etc.
Here is how Burja summed it up: "The solution of overproducing white-collar jobs is at first natural and then dysfunctional. Bureaucracies decay in a way that is much less visible than the decay of factories."
In other words, the knowledge embedded in the culture and the workforce decays, impoverishing the intangible networks of experience, skills, and shared values that Burja calls "Social Technology."
Correspondent Robert R. coined a phrase for this erosion and eventual loss of knowledge and experience: under-competence. It names something that goes unrecognized: the worker is competent at following procedures in normal situations and normal accidents, but completely incompetent when the situation demands an experiential working knowledge of the entire system and the ability to respond to situations outside the norm.