What point on that curve are you using for the reference point? and, why?
Why? 'again'; the question should be 'HOW?'.
No, the question is "why"; I already know how. "Why", in context, meaning: why pick that particular point on the ogive to use as your reference point?
The OP was a very open-ended question, I think; all it asked was whether there is a term for "measuring from the ogive...".
"From" implies a "to".
From the ogive to some unspecified point. Many are assuming that point is the rifling; some, the base of the case, or the bolt face. The OP DID NOT SAY.
From where on the ogive??
What point on the ogive is it being measured from? The most forward point where the bullet is still full caliber? The point where it is .010" smaller than full caliber? .020"? .030"?
Again, the OP did not say. Without more information, it is neither simple nor perfectly clear, to me.
I also cannot see what the point of such a measurement might be. OK, so you want to keep the bullet a given distance off the rifling; fine. Each bullet type, style, brand, and weight is slightly different, so the measurement can only apply to one specific bullet. That being the case, why measure from an unspecified (and possibly difficult to repeat) point on a curve, when you could just use the overall length and get the same result (bullet .xx" off the rifling), once you know what that distance is for the specific bullet?
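To illustrate the arithmetic with made-up numbers for one hypothetical bullet: whichever datum you pick (a point on the ogive, or the bullet tip via overall length), the computed jump to the rifling comes out the same once you have measured the "touching the lands" length with that same datum for that specific bullet. All figures below are invented for illustration, not real load data.

```python
# Hypothetical measurements for ONE specific bullet, in inches.
# Both datums differ only by a constant offset for a given bullet,
# so the jump (distance off the rifling) works out identical.

# Comparator (base-to-ogive-datum) measurements:
cbto_touching_lands = 2.250   # seated length at which this bullet touches the rifling
cbto_loaded = 2.230           # our loaded round

# Base-to-tip (overall length) measurements for the SAME bullet:
coal_touching_lands = 2.820
coal_loaded = 2.800

jump_via_ogive = cbto_touching_lands - cbto_loaded
jump_via_oal = coal_touching_lands - coal_loaded

print(f'jump via ogive datum:    {jump_via_ogive:.3f}"')
print(f'jump via overall length: {jump_via_oal:.3f}"')
```

Either way the bullet sits .020" off the lands; the ogive datum only matters when you switch bullets, which is exactly the poster's point about the measurement applying to one specific bullet.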
What keeps popping up in my mind is the example of a .30 cal Hornady 220gr RN and their 150gr spire point. Measuring "from the ogive" (to any given point) will give you different results for the two (IF you are consistent about where on the ogive your data point is...) because of the vastly different ogives of these two bullets.