22 Centerfire bore diameter tolerances

geetarman

New member
In another thread, I have been talking about my 22-250 suddenly going from a good shooter to an awful one.

The gun is a Remington 700 BDL that was re-barreled in 2009 with a Douglas barrel.

It has been glass bedded, action trued, restocked with new glass and new mounts.

The rifle has around a thousand rounds through it.

Took it to another gunsmith (the gunsmith at Rio Salado died suddenly) and he has some gage pins: a set of minus .0002 pins and a set of plus .0002 pins.

These pins are working pins and are not in the same league as Deltronic gage pins. There is no information on each pin's actual diameter, and I no longer have access to lab equipment that would let me accurately measure them.

I do not know what the tolerances are on a centerfire 22 barrel and am hoping someone here can tell me or point me to where the information can be found.


We tested the barrel: it will accept a minus .220 pin (.2198") from the muzzle, and the pin falls through the barrel freely.
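
As a rough illustration of what a freely passing minus-class pin actually proves, here is a minimal Python sketch. It assumes the usual minus-pin convention (actual diameter anywhere from nominal minus the class tolerance up to nominal); the only numbers used are the ones from the test above.

```python
# A minus-class .220 pin may actually measure anywhere from .2198" to .2200".
# If it drops through the bore freely, the only thing proven is that the bore
# is at least as large as the pin's minimum possible diameter.

def bore_lower_bound(nominal: float, class_tol: float = 0.0002) -> float:
    """Smallest bore diameter consistent with a minus-class pin passing freely."""
    return nominal - class_tol

print(f'Bore is at least {bore_lower_bound(0.2200):.4f}"')
```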

Does anyone know what the diameter tolerances are for the bore as well as the rifling diameter?

Thanks
 
1,000 rounds of ammo through a .22-250 barrel is a bit over its limit for best accuracy, in my opinion.

And the bore diameter at the origin of the rifling after that many rounds may well be several thousandths larger than the bore diameter at the muzzle.
 
bore dia'

Sirs;
There is NO tolerance! SAAMI may give one but I think SAAMI is full of crap!
Bullets are so precise now you can check your micrometers with them!
Greywolf
 
Harry,

Does it matter that 30 caliber bullets range in diameter from about .3068" to .3095"? They may be within .0001" or less spread across a given make and lot, but I've seen as much as .0005" spread in 30 caliber ones.

30 caliber barrels' groove diameters vary from .3065" to almost .3100", and SAAMI's spec for .308 Win. bullets is .3090" -.0030".

I've not mic'd 22 caliber ones, but I'd bet two bits they vary, too. SAAMI's spec (link's above from Brian P.) says bullet diameter for the .22-250 is .2245" -.0030".

All cartridges listed in SAAMI specs have about the same bullet diameter tolerance. It's a voluntary standard; nothing says everyone has to follow it. Same for the barrel's chamber, bore and groove dimensions as well as cartridge case dimensions.
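
For anyone who wants the minus-only convention spelled out, here is a small Python sketch that checks measured diameters against a SAAMI-style spec (a nominal value with a minus-only tolerance, e.g. .3090" -.0030" for .308 Win.). The sample diameters are simply the figures quoted in this thread, used as examples.

```python
# SAAMI-style bullet diameter spec: nominal with a minus-only tolerance.
# Anything over nominal, or more than minus_tol under it, is out of spec.

def within_spec(measured: float, nominal: float, minus_tol: float) -> bool:
    """True if measured falls in the range [nominal - minus_tol, nominal]."""
    return (nominal - minus_tol) <= measured <= nominal

# Diameters mentioned in this thread, checked against .3090" -.0030"
for d in (0.3068, 0.3079, 0.3084, 0.3088, 0.3095):
    verdict = "in spec" if within_spec(d, nominal=0.3090, minus_tol=0.0030) else "out of spec"
    print(f'{d:.4f}"  {verdict}')
```

Since the standard is voluntary, "out of spec" here only means outside SAAMI's book figure, not necessarily a bad bullet.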

Meanwhile, why do you think SAAMI's so full of #&(%? I've never known anybody to feel that way about SAAMI. If you give an answer, I'll start a new thread possibly titled "SAAMI; Full of #&(% or Worthwhile?" so as not to get off this thread's topic.
 
bullets

Sirs;
No, I do not calibrate my micrometers with bullets! Of course not.
In 50 years of reloading I have never had a bullet measure other than printed on the box!
Of course there are always all kinds of stories to be told, but I have never experienced it. Many people read a mike very differently and could get .001 to .002 difference in readings!
Nuf said by me!
Greywolf
 
In 50 years of reloading I have never had a bullet measure other than printed on the box!

That is not surprising. It is due to the sensitivity of the measuring instrument. In your case, a micrometer.

In measurement science, there is a hierarchy of standards. The highest-order standards are used to transfer their value, and the associated uncertainties, to lower-order transfer standards. Those lower-order transfer standards and THEIR uncertainties are used to calibrate working standards. Those working standards and their uncertainties are used to calibrate things like micrometers. In every case, a higher-order standard is used to transfer to, or calibrate, a lower-order standard. And in every case, working standards can never be more accurate than the standards used to transfer their value, nor may they be used in reverse to calibrate a higher-echelon standard.

Another way of saying it is you may use a very precise cesium standard to calibrate Big Ben. You cannot then assign a value to the cesium standard by validating it against Big Ben.

The continuity and hierarchy of standards is what allows interchangeability of parts from different manufacturers.

Your micrometer was probably originally calibrated to +/- .0002 inch with a gage block set whose accuracy was on the order of a few millionths of an inch.

That gage block set in turn had the values assigned to it by a transfer standard and that transfer standard had its value assigned to it by a master.

The masters are often calibrated by NIST, and those standards derive from a natural physical constant such as the wavelength of a particular helium-neon laser.
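
To make that chain concrete, here is a toy Python sketch of such a hierarchy. The uncertainty figures are invented for illustration, not real lab values; the point is only that uncertainties accumulate down the chain (combined root-sum-square here), so a working instrument can never be better than the standards behind it.

```python
import math

# Invented standard uncertainties (inches) for each level of the chain,
# listed from the top down. Real values would come from calibration certs.
chain = [
    ("HeNe laser wavelength (NIST)", 0.000_000_1),
    ("master gage block",            0.000_001),
    ("transfer gage block set",      0.000_002),
    ("working gage block set",       0.000_004),
    ("shop micrometer",              0.000_1),
]

accumulated = 0.0
for name, u in chain:
    accumulated = math.hypot(accumulated, u)   # root-sum-square combination
    print(f"{name:30s} +/- {accumulated:.7f} in (cumulative)")

# Accuracy ratio between the micrometer's +/- .0002" tolerance and the
# working blocks used to calibrate it -- the kind of accuracy ratio
# discussed later in the thread.
print(f"ratio: {0.0002 / 0.000_004:.0f}:1")
```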

Failure to abide by these rules will almost always lead to a departure from the desired attributes of a produced part: what you have in your hand no longer matches what is defined on the part specification drawing.

It is much like a band tuning up when the guitar player is tuned to A=445, the bass player is tuned to A=435 and the piano player is tuned to the A=440 standard. Each player is in tune with himself, but together the music sounds like crap.

The higher you go in calibrating standards, the more complex it is.
 
Geetarman, I like your style.

My mic (zeroed and verified with a Johansson .30000" gauge block) has measured Sierra 30 caliber bullets marked ".308", across different boxes, types and weights, at anywhere from .3079" to .3084".

Same mic on Western Cartridge Company ".308" diameter bullets: .3075" to .3088". The .3088" ones, as well as another at .3087", were 200 gr. FMJBT, 197 gr. BTHP and 180 gr. FMJBT match bullets made for use in Win. 70 target rifles whose barrels' groove diameters were typically about .3084". WCC knew what bullet diameters shot most accurately in those barrels.

What's in a number, anyway? Especially when one measures stuff with a rubber ruler. Bullets are so precise now you can check your rubber ruler with them! ;)
 
Bart,

I worked for McDonnell-Douglas and Boeing for over 40 years. I measured things for size, flatness, roundness, torque, force, air and liquid flow. I have been around the track a few times. Invariably, when one takes shortcuts in maintaining the accuracy ratio between masters and standards, things get hosed up and parts that should interchange don't fit.

That is probably the biggest reason why parts from different lots and manufacturers sometimes have assembly problems. The first place you want to look is at their calibration lab and procedures governing handling of standards.

I worked part time for a lab that did gage block calibration for Boeing in Mesa.

I also worked for Boeing in Mesa. At Boeing, we would send gage blocks out for recalibration because we did not (at the time) have equipment to calibrate them ourselves. We would send a set of blocks to this house, and a lot of blocks in unpopular sizes (those not needed for stack buildups) would be rejected for being oversize or undersize. These were blocks that had not been used.

It just happened that the guy who owned the business with a partner was an elder at the church I attend, and we were elders together. He needed some short-term help as the business built up, and I hired on part time.

What I saw was a business partner who, in order to meet schedule, would take a MASTER block from the calibration house's reference standard set and send that to the customer to complete their order. That destroyed the traceability of the standard used to calibrate the set Boeing sent in, which is what accounted for unused blocks being thrown out. The whole chain of traceability from end-use item back to the working standard was destroyed.

What I did was ensure that gage block calibration equipment was purchased for Boeing-Mesa and we stopped sending blocks to this calibration house.

Long story short, the cal house was bought out and then went out of business. And it went out of business because they did not pay attention to the history and uncertainties that derive from Grand Masters, Masters, Transfer and working standards and started sending out bogus data.

Most companies keep logs of their gage blocks and I know for a fact the transfer standards at Boeing in Saint Louis have history going back to the 1940s.

Some gage blocks, depending on how they are constructed, will grow a microinch over a year or lose a microinch a year, but if a log is kept long enough, you will see those blocks settle out. If the standard being used to derive the value of YOUR standard is continually being changed out, all those uncertainties can push your results positive or negative, and the history on your blocks is compromised. More than one company has gone belly up because it did not pay attention to its calibration system.
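
As a small illustration of why the log matters, here is a Python sketch that fits an average drift rate to a hypothetical gage block history. The readings are invented; a real log would span decades, like the Saint Louis transfer standards mentioned above.

```python
# Hypothetical calibration log: (year, measured deviation from nominal in
# microinches). Invented values showing a block that drifts and then settles.
history = [(2005, 0.0), (2006, 0.9), (2007, 1.8), (2008, 2.4),
           (2009, 2.8), (2010, 3.0), (2011, 3.1)]

# Simple least-squares slope: average drift in microinches per year.
n = len(history)
mean_x = sum(x for x, _ in history) / n
mean_y = sum(y for _, y in history) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
         / sum((x - mean_x) ** 2 for x, _ in history))
print(f"Average drift: {slope:.2f} microinch/year over {n} calibrations")
```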

It still irritates me to think how we were hurt because we paid attention to a calibration certificate that meant nothing. Sad to say, a lot of calibration houses are that way and the only one that is going to tell you the truth is NIST. You are going to pay a lot for their service but they know what they are doing.

If all you want is a cert to satisfy an auditor, set up your own cal lab in your business. The problem is you might just get an auditor like me: I do know what I am doing, I know where to look to see if you are on the up and up, and I would shut you down in a heartbeat if you were giving me a line of baloney.

Calibration is serious business and in the aerospace industry a "valid" cert for a catastrophic failure is no comfort to grieving families.

Too many people view calibration as a necessary evil and try to cheat any way they can and hope they don't snag a savvy auditor who can shut them down.

I guess you can say I am still passionate about the discipline. I really got into it and enjoyed it.
 
This whole thing got out of hand. I have shot .308 bullets out of .311 barrels with really good results. Sometimes. It usually depends on the combination of variables involved. As for the rest, "a micrometer reading is in the eye of the beholder," or some such old parable.
 
This whole thing got out of hand. I have shot .308 bullets out of .311 barrels with really good results. Sometimes.
And at other times, they'll group just barely into a milk bucket. Right? ;)
 
they'll group just barely into a milk bucket. Right?

Funny:D


The coolest thing I have walked away with in my years of measuring "stuff" is that paying attention to the little things pays big dividends.

Controlling the variables as best you can gives more consistent performance.

Consistency is REQUIRED before MEANINGFUL changes can be made to a process. Shooting is a process, and once it is consistent, it can be adjusted to move from where it is to where you want it.

That is the biggest reason people get into hand loading. You can make your ammo give consistent results and tweak until the results are where you want them.

People used to tell me they did not believe in process control and I would ask them if they were bowlers or archers. A lot of them were. I would ask them if they ever counted "boards" from the edge of the lane or weighed arrows.

Invariably, they would say they did. In simple terms, they were learning how to control variables and improve their consistency.
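
A bare-bones way to put numbers on that consistency-before-change idea: compare two loads by their group sizes only when each load's own spread is small next to the difference you hope to see. This is just a Python sketch; the group sizes are invented examples.

```python
from statistics import mean, stdev

# Invented 5-shot group sizes (inches) for two hypothetical loads.
load_a = [0.62, 0.71, 0.58, 0.66, 0.69]
load_b = [0.55, 0.93, 0.48, 1.10, 0.60]

for name, groups in (("load A", load_a), ("load B", load_b)):
    print(f"{name}: mean {mean(groups):.2f} in, std dev {stdev(groups):.2f} in")

# If the standard deviations swamp the difference between the means, any
# "improvement" you think you see is probably noise -- the process is not yet
# consistent enough to evaluate the change.
```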

Bart, I read your posts a lot. I really wish I had the opportunity to pick your brain. You bring a lot to the forum and I like the fact that you refrain from one liners and give some depth to your replies.

More people absorb new information like that than they do with a terse, "Do it this way and it will work."

Thanks, my friend.

PS: Plus you are a Navy dude, and that isn't a bad thing.
 
geetarman says controlling the variables as best you can gives more consistent performance. Amen, or other resounding accolades, to that.

But the icing on that cake is often forgotten. That is first knowing what all the variables are, then picking the ones you can control that cause the greatest deviation from "perfect," starting with the biggest one and working your way down to the smaller ones, and forgetting about those you cannot control.

Consistent performance? Geez; what a good thing to have. For example, let's say one wants their cartridge overall length to be exactly 3 inches. So they measure a bunch they just loaded. Their caliper reads between 2.989" and 3.006" across all of them. That's a .017" spread; about 0.57% of 3.006 inches. Some folks will accept that as very consistent and get on with their shooting plans. Others will sulk in disgust and post queries on reloading forums seeking advice on how to make the spread smaller; it's not consistent enough for them.

Then they shoot that ammo, testing it for accuracy. Five 5-shot groups ranging from .4 inch extreme spread to 1.5 inch extreme spread. They just "measured" the accuracy that ammo produces, and the spread is 1.1 inches; a whopping 73% of the largest group. Is that 73% spread consistent performance? Their caliper measured cartridge overall length far more precisely than they and their rifle measured the ammo's accuracy. Shouldn't the results of measuring anything have a very, very small spread across several measurements? If each group fired with that ammo isn't close to the same dimension, is any one of them anywhere close to being representative of how accurate the ammo is? And if all 25 shots were fired in one string aimed at the same place, that group would very likely be somewhat bigger than 1.5 inches.
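
For the curious, the arithmetic above works out like this; the figures are straight from the post, nothing else is assumed.

```python
# Cartridge overall length: spread as a fraction of the longest round.
coal_spread = 3.006 - 2.989
print(f"COAL spread: {coal_spread:.3f} in ({100 * coal_spread / 3.006:.2f}% of 3.006 in)")

# Accuracy "measurement": spread of five 5-shot groups vs. the largest group.
group_min, group_max = 0.4, 1.5
group_spread = group_max - group_min
print(f"Group-size spread: {group_spread:.1f} in ({100 * group_spread / group_max:.0f}% of the largest group)")
```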
 