You are here
I often find myself in meetings where serious executives start complaining to their minions that those minions’ assertions are actually mere opinions, not “facts” supported by “numbers.” The underlings typically react by scurrying off and digging up numbers that support their opinions. This generally satisfies upper management, and the world marches on.
But sometimes the “cold, hard facts” can be as misleading as any opinion. I was working on a project aiming for a targeted success rate, when Manager Bob suggested that a 95 percent accuracy rate sounded pretty good to him, since in school a 95 percent score was an A or even an A+. Manager Pete’s response was telling: “If you’re a kindergarten teacher and you take 20 five-year-old children to the park, and bring back only 19, that’s a 95 percent success rate, but a horrible outcome.”
Another example: In an article in The New Yorker, Jerome Groopman, writing about intensive care units in hospitals, noted that recent studies show 99 percent of all actions taken (the delivery of a pill, the taking of an x-ray, the insertion of an IV) in an intensive care unit are executed properly. But a different study reported that, on average, 178 different actions are taken per patient per day. This means that, on average, nearly two of those actions go wrong for each patient each day, and some of these errors could lead to significant health problems.
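The arithmetic behind that ICU example can be sketched in a few lines, using only the two figures the article gives (178 actions per patient per day, 99 percent executed properly):

```python
# Why a 99 percent per-action success rate still produces daily errors
# when the volume of actions is high. Figures are from the article.
actions_per_day = 178
error_rate = 0.01  # 1 percent of actions go wrong

expected_errors = actions_per_day * error_rate
print(f"Expected errors per patient per day: {expected_errors:.2f}")
# Roughly 1.78 -- nearly two errors per patient, every single day
```

The same per-action rate looks very different once you multiply it by how often the action happens; that multiplication is the missing context.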
The point is, an accuracy claim, whether it’s “four out of five,” or 99.44 percent pure, or 99 percent on time, is fundamentally meaningless until you place this “number” in context. For a food retailer, this all means that while numbers are part of any answer, they’re not the whole answer. When you see a statistic that you don’t trust, ask about the context—where the number comes from, and what it really means.
Watch out for spin
The suspicion you feel deep in your bones when vendors use numbers to sell you things is probably a reasonable instinct that shouldn’t be dismissed. I’m not suggesting that your vendors are intentionally misleading you, but I will suggest that good salespeople are good at using numbers to explain things in ways that work to their own advantage.
Case in point: A salesperson shows a buyer a simple analysis supporting the idea that item X should be stocked in your store because baskets with item X in them are larger than average. This analysis holds water until you realize that the average basket size is only $12; any item costing about $5 or more will almost inevitably push the baskets that contain it above that average. So, while this analysis is “legitimate,” the same argument works for roughly half of the items in your store.
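You can see the selection effect with made-up numbers. In this sketch the basket values are illustrative, chosen so the store-wide average is $12; adding a $5 item to a basket mechanically lifts it above average, whether or not the item caused any extra spending:

```python
# Hypothetical basket totals; the store-wide average works out to $12.
baskets_without_x = [8, 10, 13, 17]
# The same shoppers, each basket now also containing the $5 item X,
# with no change in what else they bought.
baskets_with_x = [b + 5 for b in baskets_without_x]

avg_without = sum(baskets_without_x) / len(baskets_without_x)
avg_with = sum(baskets_with_x) / len(baskets_with_x)
print(avg_without, avg_with)  # 12.0 17.0
# "Baskets with X are bigger" -- yet X generated zero incremental demand.
```

The comparison proves nothing about whether item X drives larger baskets; it mostly restates the item’s own price.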
Demand analysis, not numbers. Quality decision-making isn’t driven by more numbers or better numbers; it’s driven by deeper, more insightful analysis. You don’t want your people to do more adding, subtracting, multiplying, and dividing; you want them to do the thinking. Don’t conclude that an analysis is weak if it lacks numbers; conclude that it’s weak if it features undisciplined thinking.
When you do have to evaluate numbers on your own, do so in a way that confirms the numbers’ reality.
I was taught this little test by Steve Donovan, my first big, big boss when I was at Procter & Gamble. Whenever you were summoned to Steve’s corner office to defend a proposal, he would pick one number from your document, and then trace it to something he could confirm. For example, if you were defending a promotion by claiming it built volume 20 percent in the stores where it ran, he would casually ask you how many stores it ran in (let’s say 5,000), and then confirm that there are 20,000 stores, and that the promotion ran the week of March 1.
He would then say, “So, that means if I open up my fact book and look at the week of March 1, national volume should be up 5 percent (one-quarter of the 20 percent, because it ran in one-quarter of the stores).”
You would nervously agree, and then, together, you would look it up. If volume was up 5 percent, or even 3 percent, everyone would breathe a sigh of relief, and the proposal would be approved. But if volume was down, you would, as Ricky Ricardo would say, have some “’splainin’” to do.
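Steve’s check is just proportionality, and it can be written out in a few lines using the figures from the story (a claimed 20 percent lift in 5,000 of 20,000 stores):

```python
# The drill-down test: scale a claimed local lift by the share of
# stores it ran in, and predict the national number you could look up.
claimed_lift = 0.20       # 20 percent volume lift in promoted stores
stores_running = 5_000
total_stores = 20_000

expected_national_lift = claimed_lift * (stores_running / total_stores)
print(f"National volume should be up about {expected_national_lift:.0%}")
# One-quarter of the stores, so one-quarter of the 20 percent: 5%
```

If the fact book for the week of March 1 shows anything close to that figure, the claim survives contact with reality; if national volume was flat or down, it doesn’t.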
This drill-down test isn’t the be-all and end-all, but it is a good way of taking a number and tying it to reality, to the context in which it took place. And it serves as a reminder of the slipperiness of numbers when you try to confirm them as “facts.”
There’s nothing wrong with using numbers as the basis for understanding something, but numbers alone don’t create true understanding.