Is any managerial platitude more often repeated, and less often observed, than "What's measured is what matters"?
Before you count another line of code written, another percentage point of server uptime or another dollar of infrastructure capital spending, I ask you to spend some time counting the various ways that you measure the efforts you make and the results that those efforts produce.
I ask you to look unblinkingly at the list of things that you count with razor-sharp precision, and then to visualize—I warn you, this may be painful—the phantom list of things that are tremendously important but that you don't measure at all.
If yours is a typical business, you know to the penny what you spend on office supplies, but you have no idea what you spend on unproductive hours caused by inadequate training in office software skills.
You know the amount of vacation time that an employee has accrued—to the nearest tenth of an hour—but you have no idea of how much time it takes to respond to a customer e-mail, and even less of an idea of the relationship between response time and the customer's contribution to your earnings.
You know how much revenue you earn from sales of your company's products, but you have no consistent way of associating revenue growth with customer-facing Web site application performance. You have even less of an idea of the connection between sales improvements on the one hand, and the difference between the work of your best and your worst application developers on the other.
You're like a driver whose fuel gauge and speedometer function perfectly, but whose windshield is so dirty and scratched that he can barely see if it's light or dark ahead of him—let alone which way the road leads. You'd never tolerate this situation in any other part of your life, but in the realm of enterprise performance measurement, it's the norm.
One of the earliest strategic metrics that I remember encountering was 3M's explicit measurement of its success in maintaining a fresh portfolio of innovative products.
I'm pretty sure I saw this mentioned first in Tom Peters and Robert Waterman's 1982 book "In Search of Excellence," but 3M was still using this measure in 1997—when it achieved the impressive statistic of generating 30 percent of annual revenue from products that had been introduced within the previous four years.
I suspect that calculating the comparable figure of merit for most companies, even in the supposedly fast-paced IT sector, would yield much less vigorous values—especially if one doesn't count as "new" a product that seems, to all but the expert user, only superficially different from its previous release.
A product such as 3M's Post-It wasn't a re-release of anything, but rather a basically new approach to a common need; a product such as Microsoft's Windows Vista solves Microsoft's problem of maintaining a revenue stream more than it solves the problems of any customers I can identify.
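The arithmetic behind that freshness measure is simple enough that any company could track it. Here is a minimal sketch of the calculation, assuming a list of product records with introduction dates and revenue figures; the data structure, field names and sample numbers are hypothetical, not anything 3M has published.

```python
from datetime import date

# Hypothetical product records: name, introduction date, annual revenue.
products = [
    {"name": "Product A", "introduced": date(2004, 3, 1), "revenue": 40.0},
    {"name": "Product B", "introduced": date(1998, 6, 1), "revenue": 55.0},
    {"name": "Product C", "introduced": date(2006, 1, 15), "revenue": 5.0},
]

def fresh_revenue_share(products, as_of, window_years=4):
    """Fraction of total revenue earned by products introduced
    within the last window_years years (the 3M-style measure)."""
    cutoff = date(as_of.year - window_years, as_of.month, as_of.day)
    total = sum(p["revenue"] for p in products)
    fresh = sum(p["revenue"] for p in products if p["introduced"] >= cutoff)
    return fresh / total if total else 0.0

# Prints the share of revenue from products less than four years old.
print(f"{fresh_revenue_share(products, as_of=date(2006, 12, 31)):.0%}")
```

The hard part, of course, is not the division but the judgment call embedded in the introduction date: deciding when a release is genuinely new rather than a repackaging of the old.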
Another metric that's surfaced much more recently is Sun's proposed SWAP (Space, Watts and Performance) composite measure of server design. The SWAP figure is computed by dividing performance (choose a relevant measure, please) by the product of rack space and power consumption. At least these things can be objectively measured—it's just a matter of changing the viewpoint, not of learning to live with a softer focus.
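A rough sketch of that ratio, with hypothetical benchmark figures standing in for whatever performance measure you prefer, looks like this:

```python
def swap_score(performance, rack_units, watts):
    """SWAP-style figure: performance divided by the product of
    rack space and power consumption. Higher is better."""
    return performance / (rack_units * watts)

# Hypothetical comparison of two server configurations.
old_box = swap_score(performance=1000, rack_units=4, watts=800)
new_box = swap_score(performance=1200, rack_units=2, watts=500)
print(f"old: {old_box:.3f}  new: {new_box:.3f}")
```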
There are two challenges facing anyone who wants to change the measures that guide enterprise decisions. First, the costs of collecting data have to be recognized as cheap compared with the costs of not doing so. Second, the discomfort of responding to any surprising results will require top management's buy-in.
It's a mistake to underestimate these challenges, but it's a bigger mistake if you just keep on driving—and hope that you're still on the road.
Technology Editor Peter Coffee can be reached at peter_coffee@ziffdavis.com.