Digital Reloading Scales DON'T change when you trickle

From my experience with only a few digitals, I've come to the conclusion that they can only register in discrete steps. That is to say, they can only read 10.1 gr once the 10.1 gr level is reached; they cannot read a half tenth like a balance beam. You can't change that unless you get a scale that reads to smaller increments (0.0n gr).

My latest is a Dillon that has served me well, although it's like a bird dog or retriever: you have to learn to read its idiosyncrasies. I find it works flawlessly if you dump about 98% of the charge and add a granule at a time until it "breaks" over into the next tenth. This break will be accurate and consistent nearly all the time. If it misbehaves, it's because the A/C or heat is on. Waiting forever, as stated, gets it confused, so a fast reboot is needed: lift the pan and reweigh. I always check the tare weight when lifting the pan; if it's changed, I'll re-zero. I use the same pan for it as for the balance beam, and just drop the pan on the beam every so often to confirm.
 
Even on a beam, reading to 0.05 grain is a guess at best. And, frankly, if you are getting that anal about your loads you are burning a lot of calories for almost no measurable gain. But, if that floats your boat - have at it.

My Chargemaster works wonderfully and it is one of the single best reloading tools I've ever bought. Sometimes I wish it would dispense faster, but I easily live with the speed and accuracy.
 
I have never used a beam scale. From the day I got my first press, I got a digital scale to go with it. I set the beam up once or twice just to give it a chance. My take: way too Mickey Mouse, slow, and cumbersome to mess with. My loads, time and time again with a digital scale, will have a deviation of 10 FPS or less, consistently. So good or bad, that level of error is more than good enough for me. The digital is just way too easy to use. I will say nothing against the beam, as the Mickey Mouse, slow, and cumbersome verdict comes from a guy (me) who has spent maybe 20 minutes working with one. I'm sure if I had to use it, in time it would become easier, but the question still would be: why waste my time on it? I have powder scoops for every load. One scoop and a trickle and I am done.
Now if someone can tell me a beam scale will get me better than the deviation I am getting now, I will give it another shot. To me it's like a digital caliper compared to a manual caliper. They both work great, but digital for me.
 
My cheap Frankford scale works just fine.

I charge a bit low, trickle in till it's within a tenth plus or minus (thank you, Bart B), and good to go.

Sometimes it's a bit slow to register, but I know what it takes for a given increase and just wait a second. No big deal.

I can charge powder far faster than I can with the beam.

As noted above, I check the pan weight (tare) each time it's removed, and if it shifts off its calibrated weight, I re-tare (zero) and am good to go.

Understand how they work and how to work with them, and they are fine. Fight 'em and you are better off with a beam.
 
wogpotter said:
My PACT doesn't lock, nor does it flicker.

You got a good one. I didn't. Mine had drift issues, and ultimately I had to keep it temperature-controlled by parking it on a granite surface plate, and interference-free by running it through a filtered AC supply.

This has been a pattern with a lot of digital scales: you might get a good one or you might not. But my PACT is pretty old at this point and the technology keeps improving. I got a battery-operated CED Pocket Scale at one point, which was not cheap, but it has proper four-point pan support, so it has zero sensitivity to where the weight sits in the pan. That little battery-operated tool is virtually perfectly repeatable and dead steady. But the company I bought it from, RSI, stopped carrying them because of complaints about some units being far less well-behaved than mine. My Acculab scale's digits also move a little on most days, but since it has an extra decimal place of resolution, that doesn't matter much. Once you've turned it on, it keeps power supplied to its strain gauge bridge as long as power is uninterrupted, so it doesn't need to warm up again.
 
I don't really get the complaints in this thread... I have a "cheap" Frankford Arsenal digital scale and I've never had a problem trickling powder into it. Of course the scale (most likely) won't change when you add a flake or two of powder, because for most of our powders a single flake weighs almost nothing.

I've never specifically built a digital scale, but I've worked with similar electronics, so I can make an educated guess about how they work. There are three limiting factors that should affect the accuracy of a digital scale:

  1. The accuracy of the sensor. Typically a sensor works by varying an output voltage. For a weight sensor this would mean, for example, that 0 V output is no load and 5 V is max load. The sensor will also have an error factor: we might expect a 20% load to output 1 V, but in reality it is going to vary between something like 0.975 V and 1.025 V.
  2. The resolution of the analog-to-digital converter used. 10-bit and 12-bit resolution is common in the microcontrollers I've used, but an application like a scale might use a higher resolution like 16-bit. If we assume a 12-bit ADC, that means we can read 4096 distinct values from the sensor. So if the sensor reads between 0 and 250 grains (IIRC this is the range for my scale), then we are measuring in increments of about 0.061 grains.
  3. Deciding how to display the data. Our sensor and ADC still have finer resolution than our display, which is limited to a precision of 0.1 grain. As weight is added, the first 5 readings we get from the ADC will be 0 gr, 0.061 gr, 0.122 gr, 0.183 gr, 0.244 gr. But how do those map to the display? Presumably the system does normal rounding, so a value greater than 0.05 gr and less than 0.15 gr will display as 0.1 gr, and > 0.15, < 0.25 will display as 0.2. But of course the programmers could have done something totally different or off the wall. (A rough sketch follows this list.)
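To make items 2 and 3 concrete, here's a minimal sketch in plain C of how a reading might flow from ADC counts to the display. It assumes the 12-bit converter, the 0-250 gr full scale, and simple round-to-nearest from the list above; it is not pulled from any real scale's firmware.

    /* Hypothetical sketch: map raw 12-bit ADC counts to a 0.1 gr display.
       The 0-250 gr full scale and round-to-nearest behavior are assumptions. */
    #include <stdio.h>

    #define ADC_CODES     4096.0   /* 12-bit converter: 4096 distinct values */
    #define FULL_SCALE_GR  250.0   /* assumed full-scale load in grains */

    /* One ADC count is worth about 250 / 4096 = 0.061 gr. */
    static double counts_to_grains(int counts)
    {
        return (counts / ADC_CODES) * FULL_SCALE_GR;
    }

    /* Round to the nearest 0.1 gr for the display (item 3's guess). */
    static double display_grains(double grains)
    {
        return (long)(grains * 10.0 + 0.5) / 10.0;
    }

    int main(void)
    {
        for (int counts = 0; counts <= 4; counts++) {
            double gr = counts_to_grains(counts);
            printf("count %d -> %.3f gr -> display %.1f gr\n",
                   counts, gr, display_grains(gr));
        }
        return 0;
    }

Note how counts 1 and 2 both display 0.1 gr: at this resolution a couple of extra kernels can land inside the same displayed tenth, which lines up with the "doesn't change when you trickle" complaint in the thread title.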

There are other questions as well, such as how stable the sensor's output is, how often you should poll it for new readings, the "native" units of the sensor (I imagine most off the shelf parts actually measure grams), and how the sensor is affected by temp/humidity.
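On the native-units point: if the load cell really does report in grams, the grains shown on the display are just a fixed conversion, since 1 grain is exactly 0.06479891 g. A trivial sketch, in the same assumptions-only spirit as above:

    /* Grams-to-grains conversion, assuming the sensor's native unit is grams.
       The constant is exact by definition: 1 grain = 0.06479891 g. */
    #include <stdio.h>

    #define GRAMS_PER_GRAIN 0.06479891

    int main(void)
    {
        double grams  = 2.969;                    /* hypothetical raw reading in grams */
        double grains = grams / GRAMS_PER_GRAIN;  /* about 45.8 gr */
        printf("%.3f g = %.1f gr\n", grams, grains);
        return 0;
    }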

Anyway, I haven't seen any evidence that my digital scale isn't functioning properly or in a way I would expect. And yes, digital scales are "imperfect" unless you pay out the behind for a very, very high end model. Good eyesight and a good beam scale will give you "better" measurements, if that's what you're interested in. The thing is, the digital scale is plenty good enough for everything I do.
 
I use a powder trickler with a Sartorius AY-123 milligram digital scale. Even with fine powder like CFE 223 or Win 748 it changes with one or two granules of powder.
 
This model has been discontinued by the manufacturer; it's still available on Amazon for $305, but with no warranty or parts available. The replacement model (Practum 213-1S) is a cool $1,040!

Here's a writeup on 3 scales:
http://bulletin.accurateshooter.com...-comparison-gempro-500-ay123-sartorius-gd503/

Follow the link to the GD503. It will read out kernel by kernel for $900; the powered trickler is $70.
 
I did the same thing with a cheap scale the other night: 100 rounds of 54R that I wanted dead-on. I just used a Lee PPM and zeroed the pan from the beam scale. I set the PPM, did my pour into the pan, and weighed it; if it wasn't 45.8 gr, I would just dump the pan and try again. I would say one in four had to be dumped and re-poured because its weight was off by 0.1-0.2 gr. It didn't seem overly difficult; not sure what you're trying to do differently.
 