Why Quantism Is a Toxic Religion We Must All Abjure
“You can’t improve what you don’t measure.” We’ve all heard this mantra, which is often mistakenly attributed to management gurus like Peter Drucker and W. Edwards Deming. And many of us embrace this principle as the basis upon which to press forward with the advancement of data-driven analytics, machine learning, and AI.
There’s just one problem. It’s horsecrap. We improve stuff without measuring it all the time.
There are lots of ways to logically refute the poisonous myth of absolutist quantism. One way is to simply observe our own lives. I, for one, don’t log the total minutes per month I spend arguing with my fiancée. Nor do I track mean time between aggravated eyerolls. Nonetheless, through effort and communication, she and I have achieved a level of intimacy, trust, and delight that I’m sure would be the envy of many.
Another piece of empirical evidence is that much of human progress has had nothing to do with data. The discovery of vaccination, for example, was based on insight and bold action—not the analysis of quantitative data. Sure, data has been used to incrementally improve the efficacy of vaccines and vaccination programs. But the innovation of the vaccine itself was not based on data inputs.
A third refutation is simply to reference the purported disciples of data, Drucker and Deming. In his masterwork The New Economics, Deming actually calls the above pseudoprinciple “a costly myth.” And Drucker asserted that the functions only a manager can personally perform—relationships with people, development of mutual confidence, creation of community, etc.—couldn’t be measured at all.
Yet we persist in clinging to and promulgating the false gospel of quantification. In fact, as we disseminate our technologies across all aspects of human endeavor, we are aggressively poisoning those human endeavors with a dogma that is doomed to failure: that salvation can be found in governance by data—or, as I choose to call it, “datarchy.”
Why datarchy is failing
I’ll probably write a separate piece on how datarchy wreaks destruction, but let’s briefly consider a few basic problems with the fetishization of quant:
- Datarchy privileges the measurable above the important. Sure, data is super-useful for many purposes. But there’s a reason employers use data to screen job applicants, then interview the most promising candidates in person before making their final selection. Data alone is simply insufficient. We know this instinctively. Yet because we’ve gotten so good at capturing and analyzing data, we seem determined to pretend that data alone can solve our most pressing problems.
- Datarchy answers small questions and begs big ones. Market indices like the S&P and Dow register the expectations of an extremely small number of human beings with relatively narrow interests regarding a handful of public corporations. These indices have almost nothing to do with anything except flows of aggregated capital. Yet they’re treated as a reliable daily indicator of the state of the cosmos. It’s patently absurd. But because we have been indoctrinated to “believe” in data, we act as though these numbers yield knowledge about our economy that we in reality lack. Worse yet, our false belief makes us intellectually complacent—so we keep failing to ask fundamental questions about capital and its purposes.
- Datarchy doesn’t lead inexorably to the good. We have more data and use it in more ways than ever. Despite this, the climate crisis escalates unabated. Economic disparity keeps getting worse. Political institutions are failing. And so on. I know those of us working in data-related fields are tempted to claim that this is not our fault, that if people would just utilize the amazing tools we develop in more effective ways, everything would be fine. But that’s a cop-out. Being good at engineering isn’t an excuse for being bad at intellectual honesty, civic responsibility, and the ethical use of one’s gifts. Unless, of course, we as an industry simply want to plead guilty to moral bankruptcy while we cash out of our third startup.
And this is without addressing issues such as bad data, skewed data sampling, and poor data taxonomy.
Dealing in realities
None of the above is meant as an indictment of data-related technologies themselves. I’ve been working in the field for more than forty years. Wikipedia even credits me with coining the term “DataOps.” And I continue to advocate for the aggressive use of data for the purposes it is suited to serve.
But we can’t claim to be smart while acting dumb. When you hear someone in power say “We will go where the facts lead us,” you might think we’ve somehow succeeded as an industry. Unfortunately, just the opposite is true. Such an absurd statement only demonstrates that we’ve poisoned people with our own special brand of Kool-Aid. Facts don’t and can’t lead. Leaders lead. And they lead by their vision, values, and moral capacity—things which data does not and cannot provide.
So, yes, let’s keep using data engineering as a tool. But let’s wield that tool wisely—and not impute to it power and authority it doesn’t possess just because such hype may serve our commercial purposes. We can and must use data to make better decisions. However, it is also our responsibility to rein in datarchy by properly understanding and communicating its limits. Failure to do so will have dire consequences for which we—and we alone—will be accountable.
-Lenny Liebmann, 11/17/20
(Note: The term “datarchy” is entirely unrelated to any of the several companies using the trade name “Datarchy” in the U.S., Slovenia, India, and elsewhere.)