How Numbers Can Lie

It’s easy to laugh off an academic squabble. When overeducated combatants square off in an arena that most people don’t even know exists, few take notice. Yet some of these squabbles reverberate outside the academic world, and I suspect that Paul Romer’s assault on “mathiness,” ably summarized by Justin Fox at Bloomberg View, will be one of them.

The issue at hand is the tendency of economists to cloak ideology in obscure equations to give their views a false appearance of rigor. Well, you might say, that’s just what overeducated eggheads do, but seemingly practical-minded businesspeople have their own version of “mathiness.”

When managers say they are data-driven and ROI-focused, they are usually more intent on professing a belief than on delivering results. They are, essentially, accidental theorists, putting their faith in an abstract idea rather than engaging in any true analysis of cause and effect. Despite what many will tell you, numbers can lie, and only fools follow them blindly.

The Engineering Of Efficiency

Frederick Winslow Taylor must have been a strange sight at the Midvale Steel Works. Unlike most factory foremen, he didn’t bark at his men to work harder and faster, but stood by with a stopwatch, pen and ledger, observing and timing their movements. His aim was to find the one best way to perform every task.

We now know Taylor as the father of scientific management, which later spawned the best-practices and Six Sigma movements. In the 1960s and ’70s, Wall Street got into the act, developing an efficient markets hypothesis that led to the capital asset pricing model (CAPM) to guide capital investments and the Black-Scholes model to mitigate risk.

The goal of these models was to engineer optimized solutions to common business problems. If, as Taylor preached, there was a “one best way” of doing things, then managers armed with math could hone their industrial machines by identifying best practices and deploying them throughout their organizations.

Alas, this was as much “mathiness” as it was math.  Underlying all the numbers and complicated formulas were human assumptions.  These were not, in fact, “scientific,” but mere guesses that made the math simpler and more manageable.  Unfortunately, they were also very, very wrong.

The Mathematics of “Anything Can Happen”

The idea that results can be “engineered” is an attractive one.  It suggests that by knowing inputs we can predict, with a remarkable degree of accuracy, what outputs will be.  If true, then performance is mostly a matter of getting the data right.  With better measurement and analysis, we should be able to get better results.

Yet Benoit Mandelbrot cast doubt on this neat little story.  Although much of the time these models worked and past results did indicate future performance, sometimes they were far off the mark.  He argued that economic analysis was too dependent on “Joseph effects,” which supported continuity and neat models, but ignored "Noah effects" which created discontinuity and blew those same models to bits.

In a sense, he wasn’t telling anybody anything they didn’t know. Statisticians had long been aware that no model is perfect, but they dismissed stray data points as “outliers” that could be safely disregarded. Yet Mandelbrot pointed out that the efficiency engineers’ models were failing: outliers like market crashes happened far more often than the models predicted.
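To put rough numbers on that gap, here is a minimal sketch in Python using SciPy. The 1% daily volatility, the 5-sigma crash threshold, and the Student-t tail are all illustrative assumptions of mine, not figures from Mandelbrot; the point is only to show how differently a thin-tailed and a fat-tailed model treat the same extreme event.

```python
# A minimal sketch of Mandelbrot's point, under illustrative assumptions:
# how often does each model expect a "5-sigma" one-day market drop?
from scipy import stats

sigma = 0.01            # assumed ~1% daily volatility (illustrative)
crash = 5 * sigma       # a 5-sigma one-day drop
df = 3                  # fat tails: Student-t with 3 degrees of freedom

# Probability of a one-day drop of 5 sigma or worse under a Gaussian model
p_gauss = stats.norm.cdf(-crash, loc=0.0, scale=sigma)

# Same question under a fat-tailed Student-t, rescaled so both models
# have the same standard deviation (t with df dof has variance df/(df-2))
scale_t = sigma * ((df - 2) / df) ** 0.5
p_fat = stats.t.cdf(-crash, df, loc=0.0, scale=scale_t)

days_per_year = 252
print(f"Gaussian model:   one such crash every {1 / (p_gauss * days_per_year):,.0f} years")
print(f"Fat-tailed model: one such crash every {1 / (p_fat * days_per_year):,.1f} years")
```

Under these assumptions, the Gaussian model expects such a crash roughly once in ten thousand years, while the variance-matched fat-tailed model expects one every few years, which is far closer to what markets actually deliver.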

So while everyone else was preaching about the wonders of “scientific management,” Mandelbrot was arguing for the mathematics of “anything can happen.” In his mind, it was the outliers (market crashes, innovations like electric cars and iPhones, world wars and the like) that determined the course of history.

The System Crashes

For most of his career, Mandelbrot was seen as an iconoclast to be listened to and then ignored. Paul Cootner, one of the pioneers of financial engineering, wrote that Mandelbrot forced his colleagues “to face up in a substantive way to those uncomfortable empirical observations that there is little doubt most of us have had to sweep under the carpet until now.”

Then he added, “but surely before consigning centuries of work to the ash pile, we should like to have some assurance that all of our work is truly useless.” The message was clear: the train had already left the station. The dream of a clockwork universe was far too alluring, and there was far too much money to be made, to stop it.

The financial crisis of 2008, just a few years before his death, largely redeemed Mandelbrot in financial circles. The risk models of the financial engineers failed us all. Unforeseen defaults in mortgage markets cascaded through the system and reverberated throughout the entire economy. Risk wasn’t being managed; in fact, it was being amplified.

Unfortunately, the lessons remain surprisingly unlearned. Today’s managers, driven by data and focused on ROI, continue to believe that, with just a little more effort and precision, they can keep Mandelbrot’s “Noah effects” at bay and consistently engineer results.

Robustness And Resilience

As he describes in his new book, Team of Teams, when General Stanley McChrystal first took command of U.S. special operations forces in Iraq, he presided over a magnificently engineered military machine. No force in the world could match their efficiency, expertise and effectiveness. Yet, although they won every battle, they were losing the war.

The problem, he now explains, is that although his force had robust capabilities (they could perform any task they were given), they failed to be resilient in the face of unforeseen circumstances. It wasn’t that they weren’t doing their jobs right, but that they weren’t doing the right jobs, and that was the Achilles’ heel of his elite force.

As McChrystal puts it, “In complex environments, resilience often spells success, while even the most brilliantly engineered fixed solutions are often insufficient or counterproductive.” Or, in business terms, they were performing to plan, but the plan itself was flawed.  It was based on assumptions that turned out not to be true.

And that’s the problem with mathiness. For almost any endeavor, there is a simple model that should, in principle, lead to good results. But every model rests on assumptions, and unless we understand and account for those assumptions, the model will eventually blow up in our faces. Obscuring that reality with Greek letters and abstract symbols only compounds the problem.

So while mathiness conveys a certain authority, and the idea of “scientifically engineered” solutions sounds attractive, we should remember that science isn’t about certitude, but skepticism.  There is never a magic formula that can solve all our problems.  A leader’s job is to deal with uncertainty, not ignore it.
