Thursday, August 31, 2023

Accuracy and precision

Calibration is one of the basic methods in any Quality Management System, but for years my understanding of it was not deep. Of course I knew it was important. If you are making a product that requires precision measurements, but your measuring tools aren't calibrated, you have no guarantee that your measurements are right. Concretely, if you need a part to be 0.250" ± 0.001" but your tool is off and the part is actually (let's say) 0.234" instead, it's not going to fit. So yes, it matters.
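
To put that check in concrete terms, here is a minimal sketch in Python. It's my own illustration, not anything from the standard, and the numbers are just the ones from the example above.

    # Hypothetical tolerance check for the 0.250" +/- 0.001" part above.
    NOMINAL = 0.250  # target dimension, in inches
    TOL = 0.001      # allowed deviation, in inches

    def in_tolerance(measured: float) -> bool:
        """Return True if a measured value is within NOMINAL +/- TOL."""
        return abs(measured - NOMINAL) <= TOL

    print(in_tolerance(0.2505))  # True: within a thousandth of nominal
    print(in_tolerance(0.234))   # False: the out-of-spec part above

Of course, the whole point of calibration is that this check is only as trustworthy as the number you feed into it.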

This is why clause 7.1.5 of ISO 9001:2015 requires that you figure out whether you need calibrated equipment; and then, if you do, that you calibrate it. And in all the years that I did internal audits, I made sure to turn over any measuring equipment that was actually in use, so I could check the calibration stickers.

But then I got a chance to work for a calibration laboratory, and I began to appreciate the huge amount of mathematical theory that underlies the whole job of calibration. I didn't work there long enough to be able to regenerate all the calculations on my own from first principles. But I did learn some of the more important concepts.

One of these is the distinction between accuracy and precision. Those both sound like good things, and of course they are. But they are different things, and the exact nature of the difference matters.

Whenever you measure something, you get a reading of some kind. But you can never be sure that the reading is exactly right. In order to make sure that the reading is as good as possible, you want to ensure two different things:

On the one hand, you don't want your measuring tool to bounce around. You want it to give you the same answer every time you measure the same thing. I've got a scale in my kitchen that isn't very good at this. If I set a weight on it all at once (like a bag of onions) the needle bounces up to a certain reading. If I add the same weight gradually (for example, by pouring in rice until I get the right amount) the needle tends to stick on the way up; then I'll add a little more and it jumps up to a higher number. This scale is good enough to make dinner with, but I would never dream of using it in a production facility to build product.

In calibration terminology, my kitchen scale is not precise: I can put the same weight on it and get two different readings, depending on whether I add the weight all at once or gradually. A precise scale would give the same reading for the same weight no matter what.

But precision is only half the battle. I remember a professor of mine once told me about visiting the NIST lab that stored the nation's first official atomic clock, which the tour guide called "the most perfect clock in the world." And my prof saw that someone had set the hands to the wrong time. As an atomic clock, it was more precise than any other clock in the country just then. But on that afternoon it wasn't accurate, because the hands were set wrong. It was telling the wrong time, but it told that wrong time with unequaled precision.

In other words, precision means that you get the same measurement each time. Accuracy means that the measurement you get is correct. You need them both.

You can have accuracy without precision. That's like my kitchen scale: there's a lot of fluctuation in the readings, but when nothing is on the scale it reliably shows zero. It's not set wrong. There's just room for error when you weigh something on it.

And you can have precision without accuracy. That's like the atomic clock on the day my prof joined the tour.
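
If you want to put rough numbers on the difference, here is a small Python sketch. This is my own simplification, not the formal metrological definitions: it treats the average error of repeated readings of a known reference as a stand-in for accuracy, and the scatter of those readings as a stand-in for precision.

    import statistics

    def bias_and_spread(readings, true_value):
        """Average error (bias) stands in for accuracy;
        standard deviation of the readings stands in for precision."""
        bias = statistics.mean(readings) - true_value
        spread = statistics.stdev(readings)
        return bias, spread

    # Hypothetical repeated readings of a 500 g reference weight:
    bouncy_scale = [510, 492, 505, 488, 503]  # like my kitchen scale
    offset_scale = [620, 621, 620, 620, 621]  # like the mis-set clock

    print(bias_and_spread(bouncy_scale, 500))  # bias ~ -0.4 g, spread ~ 9 g
    print(bias_and_spread(offset_scale, 500))  # bias ~ 120 g, spread ~ 0.5 g

The first instrument is accurate on average but not precise; the second is exquisitely precise but not accurate.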

A quick search on the Internet turns up dozens of pictures that all show the difference in basically the same way. Here's one, for example.

[Image: one of the usual diagrams contrasting accuracy and precision]
Or, if you prefer, Randall Munroe explained the difference this way in his webcomic xkcd.

[Image: xkcd comic on precision and accuracy]
