Years ago, I worked for a small tech startup. We were scrappy and energetic, and we hadn't quite decided how finished a new product had to be before we could ship it.
- Did it have to be bug-free? There were always more bugs.
- What if it had no serious bugs? That sounds nice, but what counts as "serious"?
In all these discussions, our head of engineering usually wanted to ship now and not later. (Of course he also saw the financial statements, and knew that we needed the revenue!) His argument was that if the product was basically good enough, then what we needed was for it to operate in a real-world environment so that we could identify which remaining defects really mattered. Then when we fixed those, the product would be ready. Other nominal defects might exist, but they would be merely cosmetic. He called this process "hardening in the field."
*(Image: "Let's see: eggs, cheese, filling. I guess it's ready to serve!" Or maybe not.)*

*(Image: "But wait—this is OK?")*
Rapid innovation is a more or less constant feature of the high-tech market landscape. Everybody knows that brand-new implementations of a new technology are usually full of bugs; stable, reliable implementations take longer. So what do you do? Partly it depends on the inherent risks of the exact product you are designing. Is it a car or a rocket that can hurt people if it fails? Or is it a toy, where failure will merely disappoint them? How easy is it to recover from a failure? And what does the regulatory environment look like? Obviously you have to take all of these factors into account.
Beyond those factors, though, you may just have to decide where you want your organization to fit in the ecosystem of high-tech products: do you want to be first to market with innovative technology, or are you willing to trade speed of innovation for product reliability?
And then, if possible, you would like to design your Quality Management System so that it supports your decision—so that it nudges you into being the kind of company you want to be.
If it is important for you to be first to market, you should measure your development process with KPIs that track (among other things) how fast new releases reach the field. Since your initial releases are likely to be buggy, your customer support process should monitor KPIs that track the speed with which customer issues are resolved. You may wish to implement an Agile development model, or offer customers the opportunity to work with you as partners in exchange for providing their feedback as active members of the development process.
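As a minimal sketch of what those speed-oriented KPIs might look like in practice, the snippet below computes two of them from hypothetical data: average days between releases, and mean time to resolve customer issues. The data, function names, and thresholds are all illustrative assumptions, not a prescription for any particular tracking tool.

```python
from datetime import date
from statistics import mean

# Hypothetical data: dates of recent releases, and (opened, resolved)
# date pairs for customer-reported issues.
releases = [date(2024, 1, 15), date(2024, 2, 1), date(2024, 2, 20)]
issues = [
    (date(2024, 1, 16), date(2024, 1, 18)),
    (date(2024, 2, 3), date(2024, 2, 4)),
    (date(2024, 2, 21), date(2024, 2, 26)),
]

def release_cadence_days(dates):
    """Average number of days between consecutive releases."""
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]
    return mean(gaps)

def mean_time_to_resolution_days(pairs):
    """Average days from an issue being opened to its resolution."""
    return mean((resolved - opened).days for opened, resolved in pairs)

print(release_cadence_days(releases))           # prints 18.0
print(mean_time_to_resolution_days(issues))
```

A team optimizing for speed would watch these numbers trend down over time; the point is simply that once you choose the KPI, the arithmetic behind it is trivial to automate.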
Conversely, if it is more important to you that your products be fully reliable before they reach a customer, then you should not measure speed of delivery as one of your development KPIs. What you measure is what you optimize; if you are willing to sacrifice speed for reliability, don’t measure speed. In this case, you are more likely to set metrics around the extent and comprehensiveness of testing, and the number of known bugs at time of release. You might also choose a waterfall development model (instead of an Agile one) so that testing is done on one version at a time, reducing the number of variables in the development process, and presumably some of its risk.
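A reliability-oriented release gate of the kind this paragraph describes can be sketched just as simply. The severity scale, field names, and test-completion criterion below are illustrative assumptions; the idea is only that "known bugs at time of release" and "extent of testing" become explicit, checkable conditions.

```python
# Hypothetical release gate: ship only when no open bug exceeds an agreed
# severity threshold and every planned test case has been executed.
SEVERITY_RANK = {"cosmetic": 0, "minor": 1, "major": 2, "critical": 3}

open_bugs = [
    {"id": 101, "severity": "cosmetic"},
    {"id": 102, "severity": "minor"},
]

def ready_to_release(bugs, planned_tests, executed_tests, max_severity="minor"):
    """True when no open bug is worse than max_severity and testing is complete."""
    worst_allowed = SEVERITY_RANK[max_severity]
    no_blockers = all(SEVERITY_RANK[b["severity"]] <= worst_allowed for b in bugs)
    tests_done = executed_tests >= planned_tests
    return no_blockers and tests_done

print(ready_to_release(open_bugs, planned_tests=200, executed_tests=200))  # prints True
```

Note that the gate deliberately has no notion of a deadline: that is the "don't measure speed" choice made concrete.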
It's interesting to realize that "Quality" doesn't always mean the same thing—or rather, that it can mean two different things (in this case both speed of innovation and reliability of performance) which are incompatible, and which you have to choose between. And that single choice can have ripple effects across your metrics, your processes, and your strategy.


