This is rather interesting.
The gentleman who suggested starting with a headspace "go" gage is, IMO, the most correct.
If you feel the need to verify the calibration of the hardened and ground steel gage, OK. Not wrong.
But that done, it's a repeatable standard to go back to: what a minimum SAAMI chamber should be cut to. In other words, it represents a standard to size ammo to that will fit in ANY SAAMI-spec chamber.
Now the handloader can set his tool to that baseline, and THEN, from his notes, decide he wants plus .003 for THIS rifle.
The idea that any aluminum bushing with a less-than-razor-sharp corner, clamped to a caliper, can more accurately represent a standard than a hardened and ground gage is amusing.
You certainly CAN set your Hornady bushing on the headspace gage, hit "zero," and then measure an approximate + or - from there.
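If it helps to see that baseline-plus-offset arithmetic written out, here is a rough sketch in Python. The numbers and names are made up purely for illustration, not real load data or any particular tool:

```python
# Rough sketch of the arithmetic only -- names and numbers are made up, not real data.
# The comparator (or Hornady bushing) is zeroed on the hardened GO gage, so every
# reading is an offset, in inches, from SAAMI minimum headspace.

def shoulder_offset(reading, gage_zero=0.000):
    """Reading taken off a case, relative to the GO-gage zero."""
    return reading - gage_zero

# From my notes for THIS rifle: size cases to +0.003 over the gage.
fired_case = 0.0055     # fired brass reads .0055 over the GO gage (example number)
target = 0.003          # desired head-to-shoulder length for this chamber
bump = shoulder_offset(fired_case) - target
print(f"bump the shoulder back {bump:.4f} in")   # 0.0025 in
```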
I have made a lot of military, aerospace, and other high-precision parts.
There are different levels of precision, but generally calipers and micrometers are "navigation" tools for the machinist.
Often, they are not acceptable to qualify a part. Holes get accepted/rejected with a calibrated plug gage. Round parts are qualified good/bad with a ring gage.
There is a device called a "comparator" that is a stand with a very flat stage. A very sensitive indicator is attached to the stand, often graduated in 50 millionths of an inch. You set the gage with carbide Johansson blocks, traceable to the Bureau of Standards.
And dial bore gauges, etc., are calibrated to a Johansson block stack and set to zero for each job, the indicator showing + or - down to 50 millionths.
Point: the yes-or-no qualifying standard is a gage.
Now, with all due respect for a feeler gage, how flat/straight is it?
I have, on my loading bench, a flat of black granite with a post on it and a clamp for a dial indicator. If I take my Wilson case gage and stand it on the granite, I can set my dial indicator zero on the min or max step of the gage, and set the little limit flags on the indicator dial face to whatever tolerance I want from there.
I can drop the brass in the Wilson gage and pass it under the indicator to get a read of, for example, -.003 from max length, or actually +.003 to reduce head clearance for a particular rifle.
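For what it's worth, the limit flags are just doing a pass/fail check; here it is spelled out in Python, with made-up flag settings that are not a recommendation for any particular cartridge:

```python
# The indicator is zeroed on the Wilson gage's max-length step; readings on brass
# are + or - from that zero. The flag values below are made-up examples.
LOW_FLAG = -0.003    # allow up to .003 under the max step
HIGH_FLAG = 0.000    # nothing over the max step

def case_ok(reading):
    """True if the sized case falls between the limit flags."""
    return LOW_FLAG <= reading <= HIGH_FLAG

for reading in (-0.0030, -0.0015, 0.0005):
    print(f"{reading:+.4f}  {'pass' if case_ok(reading) else 'fail'}")
```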
It's just another way.
Mic setting standards are OK; I always used Jo blocks.
A pretty good bargain for a decent set of "standard" blocks to use in a shop is a set of "space blocks." They are about 3/4 in in dia, round, with a threaded hole in the middle. They are steel, hardened and ground.
I think maybe $50 or $60 will buy a set.
You get a set that can be stacked, like .101, .102, .103, etc., then .110, .120, .130, and a .050, a .0625,
a .100, .200, .300, a .500, a 1 in, a 2.00 in, etc. You can stack them on an Allen set screw. Use them with lathe and quill stops, sine bars, measuring slots, etc.
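And just to show how the stacking math works out, here's a quick Python check of one stack. The target dimension is an arbitrary example, not from any real job:

```python
# Block sizes follow the set described above; the target is an arbitrary example.
blocks = [1.000, 0.300, 0.0625, 0.050]   # blocks stacked on one Allen set screw
target = 1.4125                          # say, a quill-stop setting in inches

height = sum(blocks)
print(f"stack = {height:.4f} in, off by {height - target:+.4f} in")  # 1.4125 in, +0.0000 in
```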