"And yes, we want both accuracy and precision in scales. But, in your example of the 77 vs 75 grain charge weight, if you know the scale is off by two grains, consistently, you simply adjust till the scale shows 75. The numbers it displays don't really matter, they're just a reference (akin to the "well, just adjust your scope" above. But yes, we want the number displayed to be the true value."
But when weighing a charge during cartridge development, the number the scale displays is extremely important. Do you really want to use a scale that is consistently off by 2 grains and just "adjust" for the difference? How reliable is that scale? How do you know the error doesn't wander further from, or back toward, the accurate number?
Here is a definition from a search on the net: "Accuracy Versus Precision of Measurement. Both accuracy and precision reflect how close a measurement is to an actual value, but accuracy reflects how close a measurement is to a known or accepted value, while precision reflects how reproducible measurements are, even if they are far from the accepted value."
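That definition maps directly onto two simple statistics: accuracy is the bias of the readings (how far their average sits from the known value), and precision is their spread (how tightly they cluster). A minimal sketch, using made-up readings for two hypothetical scales checked against a certified 75.0-grain check weight:

```python
import statistics

TRUE_VALUE = 75.0  # certified check weight, in grains

# Hypothetical readings: ten measurements of the same check weight per scale.
scale_a = [77.0, 77.1, 76.9, 77.0, 77.0, 77.1, 76.9, 77.0, 77.0, 77.0]  # precise, NOT accurate
scale_b = [75.0, 74.9, 75.1, 75.0, 75.0, 74.9, 75.1, 75.0, 75.0, 75.0]  # precise AND accurate

def bias(readings, true_value):
    """Accuracy: how far the average reading is from the known value."""
    return statistics.mean(readings) - true_value

def spread(readings):
    """Precision: how reproducible the readings are (standard deviation)."""
    return statistics.stdev(readings)

print(f"Scale A: bias = {bias(scale_a, TRUE_VALUE):+.2f} gr, spread = {spread(scale_a):.3f} gr")
print(f"Scale B: bias = {bias(scale_b, TRUE_VALUE):+.2f} gr, spread = {spread(scale_b):.3f} gr")
```

Both scales here have the same small spread (precision), but Scale A reads a consistent 2 grains high (poor accuracy), which is exactly the situation debated above.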
So in the example given by Reloadron: if you expect group 2 to be in the bullseye, it is precise but not accurate. The shots in the bull are both accurate and precise. But neither observation has anything to do with using scales to weigh charges.
In contrast to Hounddawg's position, I want my scale to be accurate AND precise, not merely precise. On nhyrum's position, if the scale consistently shows 5 grains when he wants 3 grains, he is satisfied simply knowing it is off by two grains.