
View Full Version : Two Micrometers Don't Agree



alamogunr
05-03-2019, 12:26 PM
I hope there are several machinists here that might know what I can do about the problem in the title. I have two 0-1" micrometers, a Brown & Sharpe and a Scherr-Tumico, that differ when checking against a .310 plug gauge. Both zero out when closed on the anvil but differ by .0004" when I measure the plug gauge. The B&S measures exactly .3100". The Scherr-Tumico measures .3096". Both have a vernier for the 4th place. I have cleaned both measuring surfaces but don't know what else to do.

1. Is there any adjustment that would make both measure the same?

2. Is that difference meaningful in reloading/casting?

M-Tecs
05-03-2019, 12:51 PM
Two separate issues.

First, what is the tolerance of your plug gage? If it is a .3100" gage pin it will be a specific class with a plus or minus tolerance. If it's a ZZ class, the tolerance is .0002".

While it is correct that a plus .3100" pin could be anywhere from .3100" to .3102", the actual size will be very close to .3102". On a minus gage (which is the most commonly purchased) the actual size will be very close to .3098". Ring gauges are just the opposite. This does hold true for quality name-brand gages; not sure about the cheaper imports. I have to say, though, I have never measured a pin that wasn't .0002" under nominal for a minus pin or .0002" over for a plus pin.

Second, when both zero when closed but read differently at .3100", one or both have wear or a manufacturing error. Using an actual standard or gage block you can zero a mic for a specific size, but it will then be off at other sizes.
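As a rough illustration of the pin-class arithmetic above, here is a Python sketch (my own, not from the thread): it computes the allowed band for a ZZ minus-class pin and checks which of the two readings could be the pin's true size.

```python
# Hypothetical sketch: bounds for a ZZ minus-class gage pin (tolerance
# is .0002" below nominal) and whether each mic reading could be the
# pin's actual size.

def zz_minus_band(nominal, tol=0.0002):
    """Bounds for a minus-class pin: (nominal - tol, nominal)."""
    return (nominal - tol, nominal)

low, high = zz_minus_band(0.3100)
for name, reading in [("B&S", 0.3100), ("Scherr-Tumico", 0.3096)]:
    inside = low <= reading <= high
    print(f"{name}: {reading:.4f}  within [{low:.4f}, {high:.4f}]? {inside}")
```

With the .3096" reading falling below the band, at least one mic (or the pin) must be off.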

https://www.threadcheck.com/technical-documents/fixed-limit-gage-fact-sheet-pg28.pdf

Petrol & Powder
05-03-2019, 01:00 PM
Two Micrometers Don't Agree

One of them is wrong :bigsmyl2:

Jedman
05-03-2019, 01:09 PM
For any reloading or casting situation I can't even think of anywhere that .0004 would be meaningful.

Jedman

Kenstone
05-03-2019, 01:12 PM
I hope there are several machinists here that might know what I can do about the problem in the title. I have two 0-1" micrometers, a Brown & Sharpe and a Scherr-Tumico, that differ when checking against a .310 plug gauge. Both zero out when closed on the anvil but differ by .0004" when I measure the plug gauge. The B&S measures exactly .3100". The Scherr-Tumico measures .3096". Both have a vernier for the 4th place. I have cleaned both measuring surfaces but don't know what else to do.

1. Is there any adjustment that would make both measure the same?

2. Is that difference meaningful in reloading/casting?

Yes. Remove the spindle; there's a tapered/threaded nut over slots in the shaft.
Tightening it removes any play/excessive clearance in the threads.
Good luck,
:smile:

Smoke4320
05-03-2019, 01:19 PM
"Two Micrometers Don't Agree
I hope there are several machinists here that might know what I can do about the problem in the title. "

from the headline and first line I was thinking they are both female. Solution no man knows :) :)

country gent
05-03-2019, 01:25 PM
What I suspect is an out-of-square anvil on the second mic, from wear or otherwise. If you have pin gages, check them at every 1/4 turn of the spindle (a mic moves .025" per revolution, so every .00625"). This will show any out-of-square condition in the anvil. Another issue I've seen in production tooling that gets used a lot in one spot or range is localized wear. We had mics used in production that measured diameters of the same size 24/7. After a few years, wear would affect accuracy at that range and they would be moved to a different line and size range. The localized wear couldn't be adjusted out with the adjustment collet, since that would make it too tight on either side of the worn spot. By moving them from line to line and keeping wear more even, they could be lapped and readjusted. Another quick test is to use two pieces of shim stock of the same thickness, close the mic to feel and check size, then try to slide each piece, seeing if one side is looser; repeat at 90°. When checking for square, use pins at .xx0, .xx6, .x12, .x18. We also checked through the full range of travel.
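The quarter-turn check described above can be sketched in code (a hedged illustration; `quarter_turn_sizes` is my own name, not the poster's):

```python
# A 40 TPI micrometer screw advances .025" per revolution, so quarter
# turns land every .00625". Checking pins at those sizes exposes
# out-of-square anvil error that whole-turn checks can hide.

PITCH = 0.025  # inches per revolution (40 TPI screw)

def quarter_turn_sizes(start, stop):
    """Sizes at every quarter turn of the spindle from start to stop."""
    sizes, s = [], start
    while s <= stop + 1e-9:
        sizes.append(round(s, 5))
        s += PITCH / 4
    return sizes

print(quarter_turn_sizes(0.300, 0.325))
# [0.3, 0.30625, 0.3125, 0.31875, 0.325]
```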

Mics can wear from use; grinding and some processes are harder on the faces than others. Frames can be sprung from heavy-handed use or from being dropped, and the threads wear from heavy use in the same range. Another effect is temperature: frame materials differ in how much they expand, and some mics have plastic shields to limit heat transfer from the hand to the frame.

On a 0-1", close the anvils lightly on a cigarette paper and slide it out; this cleans the faces. Do this a couple of times, then check zero. The paper will show any dirt or crud removed. The last resort is to disassemble and clean with fresh solvent, flushing and very gently swabbing with soft utensils. Blow dry with low-pressure air, lightly oil with a good instrument oil, and reassemble carefully. After this they will need to be re-zeroed and set.

Petrol & Powder
05-03-2019, 01:45 PM
"Two Micrometers Don't Agree
I hope there are several machinists here that might know what I can do about the problem in the title. "

from the headline and first line I was thinking they are both female. Solution no man knows :) :)

:bigsmyl2::bigsmyl2:

alamogunr
05-03-2019, 02:59 PM
Two separate issues.

First, what is the tolerance of your plug gage? If it is a .3100" gage pin it will be a specific class with a plus or minus tolerance. If it's a ZZ class, the tolerance is .0002".

While it is correct that a plus .3100" pin could be anywhere from .3100" to .3102", the actual size will be very close to .3102". On a minus gage (which is the most commonly purchased) the actual size will be very close to .3100", to allow for wear and still be in size. Ring gauges will be just the opposite. This does hold true for quality name-brand gages. Not sure about the cheaper imports.

Second, when both zero when closed but read differently at .3100", one or both have wear or a manufacturing error. Using an actual standard or gage block you can zero it for a specific size, but it will then be off at other sizes.

https://www.threadcheck.com/technical-documents/fixed-limit-gage-fact-sheet-pg28.pdf

I bought the set of pin gauges on eBay. It is a Meyer minus set and I have to assume it is ZZ tolerance, since there isn't any indication on the box. I think I got it cheap because it was listed as having several pins missing. When I got it, the missing pins were not ones that I was likely to need.

I will measure several pin sizes in the range (.250"-.750") of the micrometer and see how the different measurements compare. If the measurements are within the range of the first comparison, I think I will just accept whatever the reading is. As Jedman said, I don't think that difference would be meaningful.

JSnover
05-03-2019, 03:02 PM
If you have gage blocks, gage pins, or other items you can use as standards, check them at different sizes. See if the difference is .0004" at every size or if it changes.
If .0004" is constant at every size it probably won't ever be much of a problem.
If it doubles at .620 and triples at .930, at least it's a linear error; it's predictable.
If it's all over the place as you measure different sizes, the mic is probably damaged or dirty inside.
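The diagnosis above can be sketched as a small classifier (my own illustration; the function name and the .0001" threshold are assumptions, not from the post):

```python
# Classify micrometer error from (standard size, reading) pairs:
# a constant offset, a proportional (linear, predictable) error,
# or an erratic pattern suggesting dirt or damage inside.

def classify_error(pairs, tol=0.0001):
    errors = [reading - size for size, reading in pairs]
    if max(errors) - min(errors) <= tol:
        return "constant offset"
    ratios = [e / s for (s, _), e in zip(pairs, errors)]
    if max(ratios) - min(ratios) <= tol:
        return "linear (proportional) error"
    return "erratic: inspect/clean the mic"

# Error that doubles at .620 and triples at .930, as in the post:
print(classify_error([(0.310, 0.3104), (0.620, 0.6208), (0.930, 0.9312)]))
# linear (proportional) error
```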

RED BEAR
05-03-2019, 06:40 PM
I would have to say wear or manufacturing error, as stated above. It could also be from someone that didn't know how to use a mic and tried to use it as a C-clamp. I got to the point where I refused to allow anyone to use my tools. Just got tired of replacing them.

Conditor22
05-03-2019, 06:51 PM
Some brands you can take apart to clean the contact area. Not familiar with those two brands :(

Hickory
05-03-2019, 06:53 PM
One of the two is a democrat micrometer, that one is lying to you.

Jeff Michel
05-03-2019, 07:16 PM
As pointed out by CG, face wear on the anvil is not unusual, especially with steel faces; carbide, less so. They probably just need cleaning and re-calibration. A standard or gage block will be much more precise than a plus/minus plug gage.

EDG
05-03-2019, 08:50 PM
It seems that possibly both micrometers are having problems.
A .310 minus pin should be about .3098", meaning one of your mikes is over and one is under.

Rather than compare the two mikes, I would pick one, remove the spindle, clean out both male and female threads, lubricate, and re-assemble. Check again, then adjust the clearance in the female thread and check again. You might be able to improve the reading by adjustment. If not, pick where you want the error minimized and adjust your zero to put the mike's error at the other end of the measurement range.

I collect mostly high end mikes and at one time supervised 2 different calibration labs.
I have both B&S as well as STI mikes and have never bought a mike off of Ebay with more than about .0001 error end to end. Most read exactly right end to end. I will add that most mikes I buy are based on higher quality brands that appear to be in excellent condition so I expect them to be right. I have bought a number of clunkers and was surprised to find they are still accurate even when beat up and ugly.
I have no idea why your mikes would be off, considering the following.
I bought 3 mikes in a lot on Ebay $9 expecting junk.
1. Well used Slocomb 1-2 inch mike missing half the paint. The mike is probably 60 or 70 years old.
It still had the anti-backlash device though it was not installed correctly. This mike was cleaned up and works perfectly.
2. A Reed Small Tool Works 0-1 mike: the worst rust-pitted mike I have ever worked on. Looks horrible, works perfectly and reads correctly end to end. This mike might be 70 years old and is one of the predecessor brands of what is now Scherr Tumico.
3. Starrett #238 This mike was the reason for buying the lot. This is a super heavy duty boat anchor of a mike that is about 75 or 80 years old. It has seen very heavy use over many decades and has no original finish left. It needed a little cleaning and it reads as close as any mike I have ever used.

If your mikes appear to be in good condition I could not guess why they would be off. There is just not much chance for error in the spindle threads.
My most used mikes were about 20 to 30 years old when I bought them more than 50 years ago. In spite of constant use they check out perfectly end to end every time I check them out. Though I worked with calibration personnel a good bit I never let them touch my personal tools.




I bought the set of pin gauges on eBay. It is a Meyer minus set and I have to assume it is ZZ tolerance, since there isn't any indication on the box. I think I got it cheap because it was listed as having several pins missing. When I got it, the missing pins were not ones that I was likely to need.

I will measure several pin sizes in the range (.250"-.750") of the micrometer and see how the different measurements compare. If the measurements are within the range of the first comparison, I think I will just accept whatever the reading is. As Jedman said, I don't think that difference would be meaningful.

RED BEAR
05-03-2019, 09:46 PM
I will say, in my opinion, there is not a better mic than Brown & Sharpe. Starrett makes a good mic, but Brown & Sharpe always had a better feel to me.

M-Tecs
05-03-2019, 09:53 PM
A .310 minus pin should be about .3098 meaning one of your mikes is over and one is under.

I collect mostly high end mikes and at one time supervised 2 different calibration labs.


In my experience I have never measured a pin that wasn't .0002" under nominal for a minus pin or .0002" over for a plus, for a ZZ class pin. That is what I always believed based on my experience and that is what I have always posted.

I did post this on another forum awhile back and a gentleman claiming to run a metrology lab ripped me a new one, claiming that with a minus gage pin the actual size will be very close to nominal, to allow for wear and still be in size. By his claim, a .3100" pin would allow for .0002" of wear and still be in tolerance. It sounded logical enough that he convinced me I was wrong. After your post and a little research, I cannot find any support for his claims. From 1982 to 1994 I worked for a company that had 4 full-time metrologists for in-house calibration. From 1994 till 2016 I sent everything out for calibration, so I am far from a subject matter expert on metrology.

Based on your comment, my experience, and what I can find on the web, I edited my post above. When you supervised calibration labs, did you ever see anything to support his claim that minus pins would be closest to nominal to allow for pin wear?

EDG
05-03-2019, 11:23 PM
I agree with your experience. The centerless grinders use almost none of the tolerance. The .0002" undersize pins are always exactly .0002" under in my experience. The undersize is intended to permit the pin to be a slip fit in an exactly nominal hole.
The only time I have seen minus pins vary from that is when a toolmaker polished one further undersize for some specific reason, or when the shop people just wore out certain pins. The pins do wear when heavily used, but when replacements are only $3 each it is easy enough to keep a full set of good pins. The shops I worked in built a lot of military equipment held together with #4, #6 and #8 fasteners. The pins used to gauge the common tap drill diameters got used up pretty fast and had to be replaced regularly, so the guys inspecting the parts had a separate set of pins. Instead of the pins being used to check the mikes, the mikes were always checked at zero and with gage blocks. The mikes were used to verify the pins were what they said they were.


In my experience I have never measured a pin that wasn't .0002" under nominal for a minus pin or .0002" over for a plus, for a ZZ class pin. That is what I always believed based on my experience and that is what I have always posted.

I did post this on another forum awhile back and a gentleman claiming to run a metrology lab ripped me a new one, claiming that with a minus gage pin the actual size will be very close to nominal, to allow for wear and still be in size. By his claim, a .3100" pin would allow for .0002" of wear and still be in tolerance. It sounded logical enough that he convinced me I was wrong. After your post and a little research, I cannot find any support for his claims. From 1982 to 1994 I worked for a company that had 4 full-time metrologists for in-house calibration. From 1994 till 2016 I sent everything out for calibration, so I am far from a subject matter expert on metrology.

Based on your comment, my experience, and what I can find on the web, I edited my post above. When you supervised calibration labs, did you ever see anything to support his claim that minus pins would be closest to nominal to allow for pin wear?

barrabruce
05-04-2019, 12:44 AM
Could be how you use your mike.
Some ratchet clickers on the end are set harder than others, and some people tend to jackhammer a mike to get the reading they want.

The only thing I could add to the excellent posts above is to pick one mike and just use it for the job at hand.
Changing measuring tools while working adds in more tolerance error.

Different if you need to order something to half a hair on a gnat's knacker, I guess, though.

Then again you may need a set of mikes like mine.

(attached photo)

Can only be found in classy machine shops.

Since we are talking about such things.

EDG
05-04-2019, 02:18 AM
There are a number of very good to excellent brands.
Using many different brands taught me an important lesson.
Most decent brand name mikes will measure your work accurately.
The real difference that becomes noticeable is the ease of seeing and reading the mike.
The satin chrome finished Brown & Sharpe mikes with black ink filled lines and numerals are among the easiest to see and read.
If you are in a location where the light source is behind your back, it can be very difficult to see the lines on the barrel and thimble of some mikes. Natural steel with a smooth, bright, reflective surface is a terrible finish for the thimble and barrel: in good light you get a bright reflection back into your eyes.




I will say in my opinion there is not a better mic than brown and sharpe. Starret makes a good mic but brown and sharpe always had a better feel.to me.

1hole
05-04-2019, 08:08 AM
For any reloading or casting situation I can't even think of anywhere that .0004 would be meaningful.

Jedman

Ditto.

I've been reloading since '65 and have never seen anything that needed absolute accuracy to a tenth. Mostly I use my old B&S for comparison measurements, such as fired to unfired case heads. In real world reloading I don't think a matter of 4 tenths exists.

EDG
05-04-2019, 10:20 AM
I suspect that if you shoot cast bullets .0004" undersize you might find a case for having a little more accurate mike. When shooting a tight-neck benchrest rifle you need to use an accurate mike, since the clearance between the loaded cartridge neck and the chamber neck is so small.
Most of the time there is little reason to tolerate a mike being off .0004". It costs nothing to set the zero properly.


Ditto.

I've been reloading since '65 and have never seen anything that needed absolute accuracy to a tenth. Mostly I use my old B&S for comparison measurements, such as fired to unfired case heads. In real world reloading I don't think a matter of 4 tenths exists.

alamogunr
05-04-2019, 11:49 AM
I've gone back and done another series of measurements with both micrometers. I tried to be much more careful this time.

Plug Gauge    Brown & Sharpe    Scherr-Tumico
.310          .3099             .3097
.325          .3249             .3247
.360          .3599             .3596
.375          .3748             .3747
.430          .4298             .4300
.454          .4539             .4537
.478          .4779             .4778

I hope that this doesn't get all mixed up when I post it.
Well! I previewed it and it did get compressed. I don't know how to fix it so I hope you can still read it.
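One way to read the numbers in the table above (a sketch of my own; it assumes each minus pin actually runs .0002" under nominal, per the discussion earlier in the thread):

```python
# Deviation of each mic reading from the likely true pin size
# (nominal minus .0002" for a ZZ minus pin -- an assumption).

readings = {  # nominal: (Brown & Sharpe, Scherr-Tumico)
    0.310: (0.3099, 0.3097),
    0.325: (0.3249, 0.3247),
    0.360: (0.3599, 0.3596),
    0.375: (0.3748, 0.3747),
    0.430: (0.4298, 0.4300),
    0.454: (0.4539, 0.4537),
    0.478: (0.4779, 0.4778),
}

for nominal, (bs, st) in readings.items():
    expected = nominal - 0.0002  # assumed actual pin size
    print(f"{nominal:.3f}: B&S {bs - expected:+.4f}  S-T {st - expected:+.4f}")
```

On that assumption, both mics sit within a tenth or two of the expected pin size at most points, with the Scherr-Tumico a little further out at .360.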

uscra112
05-04-2019, 01:13 PM
I was about to say that you should check the mikes at a number of different sizes which do NOT coincide with an integral number of turns of the spindle. You did that. If, as you say, you've zeroed the barrels with the mikes closed, I'd simply retire the Scherr-Tumico if it were under my gage management program. The only other thing I would mention is that our MIL-I-45208 protocol specified gage blocks, not pins, as the reference standard.

country gent
05-04-2019, 01:21 PM
Looks like it might be as simple as old oil that has thickened up; the recorded changes between measurements are very consistent. I'm betting a good cleaning with clean, fresh solvent and very light oiling will fix them right up. This appears to be the oil film being thinned with the repeated measurements.
If it were the faces or thread-play adjustment, the measurements would be off, but all by the same amount. Old oil or some coolants will cause this. A good cleaning is the fix.

What I prefer about the Brown & Sharpe mics over others is the slant-line graduations on the barrel; they make them much easier to read. Mics are a simple tool and there's really not a lot to go wrong with them. It's all in the threads: 40 TPI, ground and fitted with a take-up nut for play; the rest is engraved graduations. I made a few mic heads for gages when a "special" range or size was required. On the gages we sent them out for black anodize, then used white paint in the graduations when they came back. Easy to read and use.

alamogunr
05-04-2019, 01:25 PM
I grant you that gauge blocks would be better (easier to use) since they are rectangular, but pins are what I have and blocks are not available to me.

I don't use the Scherr-Tumico very often since it is harder to use; the ratchet is small and on the end of the barrel. I feel that I should use the ratchet to close on the object being measured because, as an occasional user, my feel is not well developed.

My post was in response to uscra112. Country gent posted while I was typing.

country gent: The black anodizing sounds great. You can do a lot when it is being paid for with company funds. I had to leave that behind 13 years ago.

country gent
05-04-2019, 02:05 PM
Yep, company funds go a long way. I gave it up about the same time you did; medically retired 10 years ago.

All of our gages specified anodizing on the prints. The white was just a finishing touch. Some prints even specified colors. This was for gages; parts were bare or blued in-house.

uscra112
05-04-2019, 03:49 PM
Looks like it might be as simple as old oil that has thickened up; the recorded changes between measurements are very consistent. I'm betting a good cleaning with clean, fresh solvent and very light oiling will fix them right up. This appears to be the oil film being thinned with the repeated measurements.
If it were the faces or thread-play adjustment, the measurements would be off, but all by the same amount. Old oil or some coolants will cause this.

Can't write off the old-oil hypothesis. I was once called in to evaluate a Moore CNC jig grinder with my laser interferometer, and found a kind of "dip" in the X-axis, right where they'd been using it for years and years, grinding one particular small die part day in and day out. Just a couple of tenths, but that's a big lump of error for a Moore. Moore told them to cycle the axis end to end a few times every day, and sure enough, a year later the "dip" had disappeared. The shop super was kinda disappointed; he'd been angling with his management for money to get the machine rebuilt, based on my findings.

gnostic
05-04-2019, 08:09 PM
I'm definitely not a machinist and I'm OK with being wrong. But I've always measured a feeler gauge to proof my micrometer and dial caliper. My feel isn't skillful enough to be super accurate....

country gent
05-04-2019, 08:35 PM
Here's another little thing. When doing precision measurements, always use the same mic, calipers, or measuring tool. You develop your feel with the most-used of these tools, and others just plain feel different. My Brown & Sharpe mics are a part of my hand and I know when I have the right "touch" with them. It's one of those things that's hard to explain. For truly accurate measurements, use a height gage and indicator with gage blocks, or a CMM. Even the CMM has a "touch" or feel to it, through the electrical connection used to sense the probe touching the piece.

William Yanda
05-04-2019, 09:03 PM
I have read through this thread but don't find repeatability addressed. Did I miss something?

3006guns
05-04-2019, 11:29 PM
I have read through this thread but don't find repeatability addressed. Did I miss something?

.......or temperature. Most precision tools and their test standards are supposed to be used at room temperature. An ice cold standard and a room temperature mike don't jibe too well....

1hole
05-05-2019, 09:14 PM
Seems the reality of obturating bullets (upset and swelling on firing), especially cast bullets but even jacketed ones, has been lost. The truth is, it would take a very hard bullet and/or a very mild load for a cast bullet to fail to upset by 4 tenths. So such absolute accuracy is nice but effectively meaningless to reloaders, because bullets usually exit at bore diameter.

Oil, paint or anything else on a mic's anvil faces obviously affects its zero, but the OP stated both his mics zeroed the same, so they both must be clean. I took him at his word and didn't question his intelligence.

uscra112
05-05-2019, 10:46 PM
A long-time truth in the metrology field is that your instruments should be an order of magnitude more precise than the tolerance you're trying to measure.

----------------

Repeatability is addressed in a way at post #30. Given a micrometer as the subject, it is very much a matter of the operator's skill (feel) with his tool.

Yes, temperature is always an issue when trying to verify accuracy. It is also a small factor in repeatability, if the heat of the operator's hand is enough to make the tool grow over the course of several trials. Bench check stations often mount the mic in some sort of stationary clamp to minimize this.
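The temperature point can be put in rough numbers (a sketch of my own; the expansion coefficient is an approximate handbook value for steel, not from the thread):

```python
# Steel expands roughly 6.4 millionths of an inch per inch per degree F,
# so hand warmth on a 1" standard matters at the tenths level.

ALPHA_STEEL = 6.4e-6  # in/in/degF, approximate handbook value

def growth(length_in, delta_f):
    """Change in length of a steel standard warmed by delta_f degrees F."""
    return length_in * ALPHA_STEEL * delta_f

# A 1" standard warmed 15 F by handling:
print(f"{growth(1.0, 15):.6f} in")  # 0.000096 in, about one tenth
```

That is why bench check stations clamp the mic and handle standards with insulation.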

EDG
05-06-2019, 01:38 AM
It seems you are trying to rationalize lack of accuracy with a mike.
A .0004" undersize bullet may permit blow-by and cause leading.
If it does, how would you know what caused the problem, since you seem to be happy with an inaccurate mike?


Seems the reality of obturating bullets, especially cast bullets, i.e., upset and swelling on firing, even jacketed bullets, has been lost. Truth is, it would take a very hard bullet and/or a very mild load to fail to upset a cast bullet 4 tenths. So, such absolute accuracy is nice but effectively meaningless to reloaders because bullets usually exit at bore diameter.

Oil, paint or anything else on a mic's anvil faces obviously affect their zero but the OP stated both his mics zeroed the same so they both must be clean; I took him at his word and didn't question his intelligence.

JSnover
05-06-2019, 08:19 AM
A .0004" undersized boolit shouldn't happen at all if we're casting .002"-.003" oversized, unless the mic is way off.

1hole
05-06-2019, 10:17 AM
It seems you are trying to rationalize lack of accuracy with a mike.

IF someone is working in a precision machine shop with many mics scattered around a production line being used on identical parts, THEN absolute accuracy between work stations might be important. But even then it would matter ONLY if the work itself required better accuracy (and the aforementioned repeatability!) than 4 tenths.

Also mentioned above is the FACT that even modest temperature variations quickly become important in highly precise machine work, but for most of us that level of absolute accuracy is meaningless. That level of accuracy simply isn't often needed even in a common machine shop, and virtually never on a home reloading bench. I don't believe any reloader's work needs such fine measurement accuracy that he should work in a temperature-controlled atmosphere; we have no need for it and would gain nothing from it.


A .0004 undersize bullet may permit blow by and cause leading. If it does how would you know what caused the problem since you seem to be happy with an inaccurate mike.

You don't know what I'm happy with.

We may prove anything we wish with "mays" or "mights," but that doesn't make it true. IF our bullets were cast iron, your example could be so. But our bullets are cast lead, and even the hardest of lead alloys are quite malleable. Most bullet weights and powder charges will upset them to fully obturate our bores. Gas cutting will surely lead a bore, but the common reasons for leading are a too-soft alloy, poor lube and/or an excessive powder charge, not being 4 tenths undersize.

Bottom line, handguns aren't bench guns and cast handgun bullets aren't precision BR projectiles. Nothing we can do will make them so and we would never see any difference on our targets even if we could.

There is no "rationalizing the inaccuracy of a micrometer" in any of that. It's just recognizing the facts of precision mechanical measuring, and a common-sense recognition of the level of accuracy we reloaders actually need, not what may be needed for the space program. :)

JeepHammer
05-06-2019, 10:59 AM
This is one of my 'Pet Peeves',

If you have precision measuring equipment, you NEED a 'Standard' to check the accuracy of that measuring equipment...

The 'Standard' here would be a 'Gauge Block', or a Rod, to verify your micrometer/caliper (whatever) isn't lying to you.

https://en.m.wikipedia.org/wiki/Gauge_block

You can buy these blocks by the piece, you don't need to buy the entire set,
And I suggest you buy one from a major manufacturer, like Starrett, Mitutoyo, Brown & Sharpe, etc.

DO NOT over handle these, keep them protected!
Just handling wear or rust will ruin them.

To do this CORRECTLY,
You will need two precise measurements, like 1/4" and 3/4" or 1".
If the micrometer shows correct at 1/4" AND 1", it's most likely correct over its entire range.

Keep in mind the heat from your hand can skew readings, the standard can expand with heat, so do as little 'Touching' as you can.

Some standards have a plastic or foam insulation on them so the heat from your hand doesn't skew readings.
I use a plastic clamp to hold standards when I'm doing QC on the micrometers, calipers...
Plastic/nylon jaws on the clamp won't wear on the standards.

I would just about bet the Brown & Sharpe is the closest to correct, they have VERY hard measuring faces and are VERY precisely ground.
The internals of Brown & Sharp equipment are built with the very best materials and are very precise.
Unless some idiot has had ahold of them, they are very hard to ruin or knock out of adjustment.

Like I said, you need TWO standards, one 'short' and one 'long' (both within the range of measurement), to make sure the micrometer/caliper is correct OVER ITS RANGE OF MOTION/MEASUREMENT.

Rods are fine for what 99.9999% of reloaders do, but gauge blocks are a requirement for gunsmiths, machinists, etc.
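The two-standard check described above can be sketched as follows (my own illustration; the function name and the .0002" acceptance limit are assumptions, not from the post):

```python
# Two-point verification: check the mic against a short and a long
# standard, and pass only if both errors stay within an acceptance
# limit. The .0002" limit here is an assumed value.

def two_point_check(readings, limit=0.0002):
    """readings: {standard_size: mic_reading}; True if every error <= limit."""
    return all(abs(r - s) <= limit for s, r in readings.items())

print(two_point_check({0.250: 0.2501, 1.000: 1.0001}))  # True  (both within .0002")
print(two_point_check({0.250: 0.2501, 1.000: 1.0004}))  # False (.0004" error at 1")
```

Passing at both ends makes a purely linear screw error unlikely, which is the point of checking short AND long.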

----------------

I can't deal with the 'Close Enough' bunch...

You don't break out a caliper or micrometer for 'Close Enough'.
The display on a caliper/micrometer doesn't say 'Close Enough', 'Relative', or 'Doesn't Matter';
it gives a SPECIFIC READING, and *IF* you know what you are doing and have standards, the reading is specific and accurate.

If 'Close Enough' is what you are looking for out of a Horror Fright caliper with no standards, then why not just bend a coat hanger?
That's 'Close Enough', coat hanger, rock & stick, holding your fingers 'About That' far apart...

The 'use a feeler gauge' bunch...
You ASSUME the feeler gauge is PRECISELY ground (not likely when you pay $8 a set at Horror Freight), and they do ONE check at 0.010" and ASSUME the rest of the scale is correct also...

That's a LOT of making an A** out of U and ME (ASSUME)

Yes, I'm OCD, but when you want something machined correctly, or you want ammo that's PRECISELY sized, you want a little OCD in the mix.
I don't use a yard stick to measure/cut your headspace, and you wouldn't want me to ASSUME it's correct or just stop at 'Close Enough'...

alamogunr
05-06-2019, 12:10 PM
This is going in a direction I did not foresee. Thanks to country gent and others. I hope the rest have fun.

EDG
05-06-2019, 07:26 PM
You might really be peeved to know that you do not need a standard to properly set up most 0-1 mikes.
Think about it a minute. The zero setting of the mike is set by running the spindle out to touch the anvil.

Of course you can go OCD and check it at a larger dimension, but what is your plan if it shows to be off at the second dimension? When the mike is off at one end and not the other, it may be due to a sprung frame, excessive lubricant or a mis-set thread collar on the female thread.
If you are unable to correct the error the mike can be set up to be accurate in the zone you use the most or the error can be split between each end of the travel. For the most part your 0-1 mike will be dead on end to end when the zero is properly set and there is no need to use a standard. I have more than a dozen 0-1 mikes. Only 1 was bought new and that was in 1969. All these mikes are right on end to end.
The precision of gage blocks is for the most part worthless for everyday checking of a 0-1 mike, since it is normally checked at zero. The twenty-millionths-or-so tolerance on the gage block is so much more precise than the resolution of the micrometer that it provides no benefit.

Finally, if you want to be OCD, your gage block is not really known to be accurate unless you have it calibrated on an annual basis. That is the requirement for using the gage block to calibrate any equipment used to accept product sold to the US government, or sold under the claim of having an ISO 9001 quality management system or MIL-Q-9858.




This is one of my 'Pet Peeves',

If you have precision measuring equipment, you NEED a 'Standard' to check the accuracy of that measuring equipment...

The 'Standard' here would be a 'Gauge Block', or a Rod, to verify your micrometer/caliper (whatever) isn't lying to you.

https://en.m.wikipedia.org/wiki/Gauge_block

You can buy these blocks by the piece, you don't need to buy the entire set,
And I suggest you buy one from a major manufacturer, like Starrett, Mitutoyo, Brown & Sharpe, etc.

DO NOT over handle these, keep them protected!
Just handling wear or rust will ruin them.

To do this CORRECTLY,
You will need two precise measurements, like 1/4" and 3/4" or 1".
If the micrometer shows correct at 1/4" AND 1", it's most likely correct over its entire range.
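That two-point check can be written out as a simple pass/fail comparison. A sketch only; the readings and the ±0.0001" acceptance limit are assumptions, not anyone's spec:

```python
# Two-point micrometer check: compare readings taken on known
# standards (e.g. 1/4" and 1" gauge blocks) against the nominal
# sizes. If both ends agree, the screw is most likely good over
# its whole range; if only one end agrees, suspect wear or a
# mis-set thread collar.

TOLERANCE = 0.0001  # inches -- assumed acceptance limit, one "tenth"

def check_micrometer(readings):
    """readings: dict mapping nominal size -> observed reading."""
    results = {}
    for nominal, observed in readings.items():
        error = observed - nominal
        results[nominal] = abs(error) <= TOLERANCE
    return results

# Hypothetical example: good at the short end, off at the long end
readings = {0.2500: 0.2500, 1.0000: 0.9996}
print(check_micrometer(readings))  # {0.25: True, 1.0: False}
```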

Keep in mind the heat from your hand can skew readings; the standard can expand with heat, so do as little 'Touching' as you can.

Some standards have a plastic or foam insulation on them so the heat from your hand doesn't skew readings.
I use a plastic clamp to hold standards when I'm doing QC on the micrometers, calipers...
Plastic/nylon jaws on the clamp won't wear on the standards.
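The hand-heat warning is easy to put numbers on. A back-of-envelope sketch; the expansion coefficient below is a typical figure for gauge steel, not a quoted spec:

```python
# Thermal growth of a steel standard: delta_L = alpha * L * delta_T.
# Alpha for gauge-block steel is roughly 11.5e-6 per deg C; the
# exact value varies by alloy, so treat this as a ballpark figure.

ALPHA_STEEL = 11.5e-6  # per degree C, approximate

def thermal_growth(length_in, delta_t_c):
    """Change in length (inches) for a temperature rise in deg C."""
    return ALPHA_STEEL * length_in * delta_t_c

# A 1" standard warmed 10 C above the room by handling grows about
# 0.000115" -- more than a full "tenth", which is exactly the size
# of error being argued about in this thread.
print(f"{thermal_growth(1.0, 10.0):.6f}")  # 0.000115
```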

I would just about bet the Brown & Sharpe is the closest to correct, they have VERY hard measuring faces and are VERY precisely ground.
The internals of Brown & Sharpe equipment are built with the very best materials and are very precise.
Unless some idiot has had ahold of them, they are very hard to ruin or knock out of adjustment.

Like I said, you need TWO standards, one 'Short' and one 'Long' (both within the range of measurement) to make sure the micrometer/caliper is correct OVER ITS RANGE OF MOTION/MEASUREMENT.

Rods are fine for what 99.9999% of reloaders do, but gauge blocks are a requirement for gunsmiths, machinists, etc.

----------------

I can't deal with the 'Close Enough' bunch...

You don't break out a caliper or micrometer for 'Close Enough'.
The display on calipers/micrometers don't say 'Close Enough', 'Relative', or 'Doesn't Matter',
It gives a SPECIFIC READING, and *IF* you know what you are doing and have standards, the reading is specific & accurate.

If 'Close Enough' is what you are looking for out of a Horror Fright caliper with no standards, then why not just bend a coat hanger?
That's 'Close Enough', coat hanger, rock & stick, holding your fingers 'About That' far apart...

The 'Use a feeler gauge' bunch...
You ASSUME the feeler gauge is PRECISELY ground (not likely when you pay $8 a set at Horror Freight), and then do ONE check at 0.010" and ASSUME the rest of the scale is correct also...

That's a LOT of making an A** out of U and ME (ASSUME)

Yes, I'm OCD, but when you want something machined correctly, or you want ammo that's PRECISELY sized, you want a little OCD in the mix.
I don't use a yard stick to measure/cut your headspace, and you wouldn't want me to ASSUME it's correct or just stop at 'Close Enough'...

jmorris
05-08-2019, 09:24 AM
A man with one clock always knows what time it is, a man with two is never sure.

Maybe you need three and ditch the one that doesn’t agree. Or you can do like most machine shops and have them calibrated by an outside source.

https://www.transcat.com/calibration-services/instruments/micrometer-calibration-labs

Decent micrometers come with standards so you can test them yourself at either end of their range and you can use gauge blocks somewhere else if you like.

https://i664.photobucket.com/albums/vv5/qvideo/IMG_20150105_141304_010_zps3b8e6ae7.jpg

A precision instrument that’s not precise isn’t worth much around here, about as much as a Chinese C clamp.

EDG
05-08-2019, 12:26 PM
You can have a lot of sayings and rules, but standards, Jo-blocks and calibration all cost money. If you have a mike that you got at a good price, having it calibrated by a specialty house will cost more than the mike is worth. It is better if you learn enough to set zero on your 0-1 mike yourself. You are not likely to have to set it at all, but if you do and take care of the mike you will never need to set it again.
For the most part, mikes larger than 0-1 never get used because much of the work can be measured with calipers. Few reloaders have the slightest use for gage blocks, nor would they know how to use them, nor would they have a surface plate, height gage or a sine bar. Common sense is a lot more economical than spending money like water just because you think you can make the results perfect. You can't, and it will cost you a lot of money to find that out.

If you only have one clock, you only THINK you know what time it is. You cannot prove anything for sure.



jmorris
05-08-2019, 05:43 PM
I can prove that one of the OP’s micrometers is reading incorrectly because he claims they don’t read the same.

You can say it doesn’t matter, and it might not; you can say a caliper (not a precision instrument in many circles) is “good enough”, and that too may be correct, but one of his micrometers, if not both, is not telling the truth. Some folks like to know what is right and wrong; others can build things with no measuring tools at all.

I only have one clock in the shop; it receives the amplitude-modulated time signals the .gov sends out, so it can only be expected to mark the beginning of a second with a practical uncertainty of ± 0.1 second.

Doesn’t matter much because it doesn’t have resolution that precise and I wouldn’t need it, if it did.

Likely like the OP’s different measurements won’t hurt him but if he wants to know what one is correct, there are methods that can eliminate his uncertainty.

EDG
05-08-2019, 08:50 PM
You might want to re-read what I said. I said nothing of what you claim.
Now you want to try again?
Where exactly did I say it does not matter? Where did I say it was good enough?

This is what it really says

It is better if you learn enough to set zero on your 0-1 mike yourself.

What I really said is your suggestion to have micrometers calibrated by an outside calibration house is a waste of money if you know how to set the zero of your 0-1 micrometer yourself.

Second, it matters little how accurate you THINK your clock is; you have no way to prove it.
I have set my time pieces to NIST time also. That does not mean they hold that accuracy for more than a second. It is humorous that you admitted that perfect accuracy of your clock is not necessary. So it does appear that less than perfect can be good enough for you.

You might want to re-read what I said because it is not what you are claiming.





jmorris
05-08-2019, 09:28 PM
So it does appear that less than perfect can be good enough for you.

Sadly it is; no matter what resolution you are able to measure, it’s still not perfect.

You can have calipers that will get you close to 3 decimal places. Other indicators can get you much closer to perfect, yet so far away.

https://i664.photobucket.com/albums/vv5/qvideo/Mobile%20Uploads/IMG_20160906_132741_676-1_zps80ox45ci.jpg

Perfect is a never-ending number of zeros, though. That’s why we work in tolerances. Some things need to be close, some don’t matter much at all and some need to be “really” close. “Really” to a guy on a backhoe might be a number of feet; to a carpenter it could be inches; to a machinist, tens of thousandths of an inch. Kind of like how long a minute is depends on what side of the bathroom door you’re on. :)

I should have said “one” vs “you” in my statements but the reply was across the board of things posted in this thread (vs quoting each and every post or context).

This “you” falls into the “might not matter” category, in my book.


I've been reloading since '65 and have never seen anything that needed absolute accuracy to a tenth.

If you can load and not tell the difference between a .100 and .001, you don’t need calipers, micrometers or anything more accurate.

uscra112
05-08-2019, 09:59 PM
Being a guy who has written not one but TWO MIL-I-45208 manuals for my various employers, I'm enjoying all this. :popcorn:

M-Tecs
05-08-2019, 10:01 PM
I have to say this thread is turning into a P*****g contest.

Zeroing a mic and calibrating a mic are not the same thing. When you zero a mic you are setting it to your feel against a standard of a known size. Calibration is a different animal. It ensures that the tool is within the industrial and Mil Spec standards for the type/class of measuring or gaging tool. The current standard is https://www.quality-control-plan.com/shoptalk/dla-replaces-mil-i-45208-with-iso-9001/ Very few high end new measuring tools will fully meet these standards when new and they don't get better with usage.

Most new tools come back from the initial calibration with limitations and some are flat out rejected. For indicators, Brown & Sharpe tend to be the best. I prefer Interapid; however, out of approx. 30 purchased for the shop, none ever made it through initial cal. without needing a limited-cal sticker. For Brown and Sharpe, about 1/2 would fully pass calibration. For mics and dial calipers, B&S and Etalon tended to have the fewest limitations.

For gage blocks three sets of new SPI were rejected on initial cal. The replacement Starrett sets all fully passed the initial cal.

My shop was required to follow DLA Aviation standards.

My most accurate and expensive measuring tool is my Mahr 20-millionths 0-1" pressure mic. I also have several free plastic dial calipers, and yes, they are good enough for measuring group size at the range or telling whether that's a 6mm bolt or a 1/4". I have about 10K worth of measuring tools in between the two extremes, selected depending on requirements.

EDG
05-09-2019, 12:11 AM
The usual preference for precision measurement is to have an instrument with 10X the resolution of the measurement needed. When that is not available the default is an instrument with 4X the resolution needed.
Not only are the parts we make NOT perfect due to manufacturing tolerances, but the tools we use to measure them are not perfect either. Sooner or later you have to accept less than perfect because nothing is ever perfect. However, there is not one thing to be sad about. The entire world runs fine on everything being less than perfect.
If you confuse .100 and .001 no tool is going to be much use to you unless you are framing a house or something of similar precision.
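The 10X/4X rule of thumb above can be stated as a one-liner (the example tolerance is made up):

```python
# Rule of thumb: the measuring instrument should resolve 1/10th of
# the tolerance being checked (preferred), or 1/4th at minimum.

def required_resolution(tolerance, ratio=10):
    """Smallest adequate instrument resolution for a tolerance."""
    return tolerance / ratio

# Checking a 0.002" total tolerance band:
print(round(required_resolution(0.002, 10), 6))  # a tenths mike is adequate
print(round(required_resolution(0.002, 4), 6))   # a plain .001" mike falls short even at 4:1
```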



EDG
05-09-2019, 12:55 AM
Zeroing an 0-1 mike is setting it to a known standard and that standard is ZERO length.
That is perfectly adequate for a handloader with a decent mike that shows no apparent error end to end.
The usual handloader with one ordinary 0-1 mike does not need to know 9858, 45208, ISO 9001 or any of that stuff. He does not need gage blocks to set his 0-1 mike either. We do not need to drag in dial indicators when the subject is an ordinary 0-1 micrometer.

You may feel your 10K worth of measuring instruments was a big deal for your shop. I have bought Renishaw CMM probes that cost 3X that. I have bought 50-inch optical comparators and CMMs even more expensive. No matter how much you spend there is always some shop somewhere that has bigger or better equipment. But none of that equipment is needed to set a simple mike to zero so it can be used by a handloader.




EDG
05-09-2019, 01:32 AM
I know what 45208 means to a handloader at home with an ordinary 0-1 mike - exactly nothing.



M-Tecs
05-09-2019, 01:39 AM
You may feel your 10K worth of measuring instruments was a big deal for your shop..

No, those are personal tools that I paid for out of my own pocket for my civilian position before I worked for the government. They are now utilized in my home shop. What I purchased for the government shop was more like 1.5 million in inspection tools.

Knowledge is free and I am more than a little surprised that you seem to have a problem with sharing how industry deals with this issue.

For a very large percentage of reloaders a plastic dial caliper is adequate. Same for the very low cost mics. Some of those are actually very good.

We could have a much larger discussion on what is realistic accuracy of various tools in novices' hands and what accounts for inaccuracies in both novice and experienced hands, but you argue that setting zero is all a 0-1 needs, yet you also state there is no reason to tolerate an error of .0004" in a mic. Those two positions don't go together very well.

This covers basic calibration of a mic

https://www.qualitymag.com/articles/83880-quality-101-calibrating-micrometers

These results mirror my experiences on how the average mics track. Not sure I agree with the cause. While not always the case, the more you pay the more precision you tend to get. In the late 80's and early 90's the Polish mic imports were junk. They are much better today, so I think the price-versus-precision gap is smaller than it used to be. None of the B&S or Etalon mics ever failed or had limited cal. on the initial cal. SPI always came back with limitations and a couple failed initial cal.
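A multi-point check of the kind being described can be sketched as follows. The test sizes and the acceptance limit are illustrative choices, not drawn from the linked article or any standard:

```python
# Check a 0-1" mike at several standards spread over the screw's
# travel and report the worst-case error and a pass/fail verdict.

TEST_POINTS = [0.125, 0.250, 0.500, 0.750, 1.000]  # nominal, inches
LIMIT = 0.0001  # assumed acceptance limit, one "tenth"

def check(readings):
    """readings: observed values, same order as TEST_POINTS."""
    errors = [r - n for r, n in zip(readings, TEST_POINTS)]
    worst = max(abs(e) for e in errors)
    return worst, worst <= LIMIT

# Hypothetical mike that reads progressively short toward 1": a
# classic sign of screw wear rather than a single damaged spot.
worst, ok = check([0.1250, 0.2499, 0.4998, 0.7497, 0.9996])
print(round(worst, 6), ok)  # 0.0004 False
```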

With gage pins or blocks the class or grade spells out the level of precision. That is not the case with mics. Just because a mic reads in tenths doesn't mean it's accurate.

https://rick.sparber.org/am.pdf

Gage blocks also have different grades depending on intended usage.

https://www.prattandwhitney.com/Content/Gage_Block_Calibration_Tolerances.asp

For those interested in the history of gage blocks

https://www.nist.gov/sites/default/files/documents/calibrations/mono180.pdf

1hole
05-15-2019, 11:06 PM
If you confuse .100 and .001 no tool is going to be much use to you unless you are framing a house or something of similar precision.

Who's confused?

Come on man, your defense of your expertise is getting funny! Surely you know that neither .100" nor .001" is a "tenth" of anything to a machinist. Every machinist I've ever known understands that a tenth means .0001", exactly a tenth of a thou. (And everyone knows .001" is a thou. At least I think I know that but you may have different definitions! :))

uscra112
05-15-2019, 11:24 PM
I have to say this thread is turning into a P*****g contest.


For gage blocks three sets of new SPI were rejected on initial cal. The replacement Starrett sets all fully passed the initial cal.

Forgive me please, but what instrumentation did your shop have that was precise enough to "calibrate" gage blocks? What was your reference standard? Serious question, not snark.

alamogunr
05-15-2019, 11:55 PM
I have to say this thread is turning into a P*****g contest.


See my post #39

M-Tecs
05-16-2019, 12:12 AM
Forgive me please, but what instrumentation did your shop have that was precise enough to "calibrate" gage blocks? What was your reference standard? Serious question, not snark.

Definitely a fair question. We didn't have any calibration capabilities. Everything had to be sent off to Air Force PMEL labs for initial calibration and all subsequent scheduled calibrations. For items too large to be sent out, PMEL technicians would come in-house with the necessary equipment.

https://www.matsolutions.com/Blog/tabid/275/entryid/1069/pmel-setting-standards-for-metrology-and-calibration.aspx

What Is a Precision Measurement Equipment Laboratory (PMEL)?
A Precision Measurement Equipment Laboratory (PMEL) is a United States Air Force (USAF) facility that calibrates and repairs test and measurement equipment. There are PMEL labs throughout the Air Force, overseen by AFMETCAL (Air Force METrology and CALibration) Program Office, located in Heath, Ohio.
PMEL technicians are responsible for the calibration and preventative and corrective maintenance of all the avionic, navigational, radar, test and intelligence equipment used by the USAF. They maintain, calibrate and certify the equipment in coordination with AF Primary Standards Laboratory, and to the National Institute of Standards and Technology (NIST) and other AFMETCAL-approved standards.

john.k
05-16-2019, 06:02 AM
We fired an apprentice once. He knew he was getting the boot, and systematically altered the setting on all the length rods for the big inside micrometers... he was smart enough not to alter the shorty ones used all the time, and just did the long ones... the first stuff-up cost thousands of dollars... unfortunately, even a meter rule could detect the difference in the finished work. Everything had to be recalibrated.

uscra112
05-16-2019, 06:04 AM
@M-Tecs: And that's the only answer, unless you worked at NIST or a gage block mfgr. :smile::smile: (Or PTB in Germany.)

For the benefit of them as might be interested: https://www.usna.edu/Users/oceano/raylee/SP211/gauge%20blocks.pdf

145 pages about nothing but gage blocks. Obviously well off the original topic, but interesting. Foundational tech for almost everything we do.

uscra112
05-16-2019, 06:16 AM

There is a story, perhaps apocryphal, perhaps not, that alteration of standards that way was a sabotage trick used by forced laborers in Nazi arms factories.

M-Tecs
05-16-2019, 01:39 PM


Before anything could be put into service it had to pass the initial calibration. I never kept track of what percent of the new unused items came back from initial cal. with limitations or outright failed but it was a surprisingly high percentage.


For those interested in the history of gage blocks

https://www.nist.gov/sites/default/files/documents/calibrations/mono180.pdf


Different link but same source document as I posted above.

uscra112
05-16-2019, 04:10 PM
The cynic would of course say that the vendors, having won with the lowest bid, sent the Grade B stuff.

Metrology was a large part of my engineering career. My old Dad, who was wiser than I ever gave him credit for, once said: "It's a fascinating profession, but you'll never have cocktail party conversation." (Except with other engineers, I hasten to add.)

M-Tecs
05-16-2019, 10:00 PM
The cynic would of course say that the vendors, having won with the lowest bid, sent the Grade B stuff.

Most of the items I purchased were local purchase. The exception would be the Starrett Dial Calipers. I purchased my first Starrett Dial Cal. in 1978 when I went to tech. school. I never liked it. Compared to a B&S it had no feel. At the time Etalon made them for B&S. Still use it today.

Later when I was purchasing for the shop I could get Starretts thru the GSA stock system. Different pots of money so I didn't have to justify them. I still don't like the feel but they are robust, and out of the 30 or so purchased I never had one fail the initial cal. and only a couple had limited initial cal., so the GSA stuff did better than most.

I did local purchase three cased sets of carbide-faced B&S mics, 0-12", with zero limits on initial cal. Worst for initial cal. was Interapid indicators, all locally purchased thru three different sources. They tended to fail the full-range test. They are still my favorite and I truly don't care about full range since they are never used in that manner.

JeepHammer
05-20-2019, 10:59 AM
I'm going with M-Tecs & Mr. Morris...

I'm also a BIG fan of Brown & Sharpe!
I have a lot of Starrett, which is accurate, but Brown & Sharpe is my personal favorite.

At anvil (Zero) and 1" on a 1" micrometer you know it's accurate at anvil (which measures nothing) and at 1".
If you are measuring 1" pieces and nothing else, you know it's accurate.

When you check range of motion, you find the pesky details,
Pulled threads where some idiot used it for a 'C' clamp,
You find grit, crud, burrs & wear, particularly where it's been used to measure the same size parts thousands of times, wear at just ONE part of the scale.

There is a reason machinists keep their tools clean, usually in padded drawers, and LOCKED UP!
After I spend the money I do on extreme accuracy I don't want some floor sweeping knuckle head to use my micrometer for a hammer or 'C' clamp!
Or in my case, the idiot was using it as a ground clamp & 'C' clamp while stick welding 1" from the anvil!
(Ever REALLY want to kill someone? To the point you look around for 'Witnesses'?)

---------

If you are bending brass back into shape 0.0001" isn't a big deal.
0.0001" off in chamber headspace isn't a big deal.
If your total tolerance is 0.0001", an entire production run gets rejected.
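That point, that the same deviation can be trivial or fatal depending on the tolerance band, can be shown with a toy accept/reject check (all sizes invented):

```python
# The same parts pass a loose, reloading-grade tolerance and fail
# a tight production one. All sizes here are invented.

def accept(parts, nominal, tol):
    """True for each part within +/- tol of nominal."""
    return [abs(p - nominal) <= tol for p in parts]

parts = [0.3100, 0.3101, 0.3099]
print(accept(parts, 0.3100, 0.0005))   # [True, True, True]
print(accept(parts, 0.3100, 0.00005))  # [True, False, False]
```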

To NOT have some kind of standard for a caliper doesn't make sense to me unless your measurements are 'Relative', like using a tapered throat case gauge adapter on a caliper.
Your 'Standard' is a case that fits your chamber, and you are just looking to see if your cases are +/- what fits.
A chamfered/radiused throat on the adapter makes a direct reading impossible, so what's the point?

The chamfer/radius makes the case easier to insert/remove and doesn't mark the case, so all the adapters have chamfered/radiused mouths... No direct read possible.

There is a reason I rotate the gauge standards, so I can find problems before they find me.
Using different standards in the range of scale finds those pesky problems that like to screw up my work pieces and make me start over with something properly QC'ed.

Not everyone is going to need extreme accuracy, so if Horror Fright is working for you, or a single standard is working for you, that's the tool for the job you are doing.

-------

The point was, the OP went beyond the usual 'Close Enough' measuring and ask about precise measurement issues,
To be precise, one or both micrometers are lying to him and he wanted to know which one, and presumably how to correct the situation...

I'm probably like every other machinist, when it fails QC it goes home for 'General Use', that includes reloading.
Still gets standard checks, still more accurate than Horror Freight, but won't screw up tight tolerance work at the shop anymore.
I also pass them down to newbies, guys working on 'Close Enough' projects like seeing if a shaft is going to fit a bearing or finding the correct size of stock to start machining... Grunt work...
He's learning to use one, so you KNOW it's going to get dropped, left lying around, beat up.

uscra112
05-20-2019, 11:26 AM
As long as we're expressing opinions......My chest has had mikes of all sorts of brands over 50 years. Ones I like stay, others drift away. The preponderance after all that time is "vintage" Brown & Sharpe. Never could warm up to Starrett. "Fit & finish" of their mikes just never measured up in my view. I have Starrett steel rules and squares and a 12" dial caliper, but their mikes joined the throng that emigrated. There's one mechanical-digital Fowler mike near the bench lathe, which I admit to using pretty often due to my weakening eyesight. But when it comes to "tenths" the B&S rules, even though I need a magnifier to read it.

JeepHammer
05-20-2019, 05:05 PM
Yup!
Lights & magnifiers, plus tri-focal glasses that make you look like a chicken bobbing its head around every time you focus on something...
I'm nearsighted, so I often lose my glasses on top of my head; probably spent 100 hours 'looking' for my glasses over the years.

When I got the last inspection table, I got the BIG remote readout! ;)

GONRA
05-20-2019, 06:25 PM
GONRA's "Go To Mike" is a Sears 1" Craftsman from 1950's High School Days.
Have fancier stuff, but it verks Just Fine for usual Slam Bang machining we all do.