
45 70 slugged 456



cptjack
03-03-2017, 05:53 PM
Slugged my bore on my Marlin 45-70 and it came out at .456. Are .459 bullets too big? Thanks cj

NSB
03-03-2017, 06:38 PM
I suspect your barrel isn't truly .456". It sounds like you might have a tight spot somewhere in your barrel and you're measuring the constriction at that spot. A spot like that will show up as the smallest reading in the barrel. Take a tight-fitting patch on a ramrod and slide it up and down the barrel....do you feel a tight spot? If so, you can relieve that spot somewhat with bore paste. Marlins are generally a bit looser than the Win/Miroku barrels, and all my Miroku barrels are slugging .4575".

cptjack
03-03-2017, 08:56 PM
Used a pure lead .490 ball, tapped it into the muzzle and tapped it out from the breech. Calipers read .456, and the low spot was .451.

Skipper
03-03-2017, 09:48 PM
Try upsetting the slug while it's in the bore. I use a brass rod inserted from the breech and one from the muzzle. A good whack, smashing the slug between the rods, will give a good reading. How many grooves does your barrel have?

NSB
03-03-2017, 10:35 PM
Used a pure lead .490 ball, tapped it into the muzzle and tapped it out from the breech. Calipers read .456, and the low spot was .451.
If your gage only reads to three decimal places, you won't be getting an accurate reading. Does it read to four places? If it only reads to three, you could be at .4569, which is closer to .4570 than to .4560.
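
A rough, purely hypothetical illustration of that point (the numbers below are made up, not measurements from this thread), sketched in Python: a bore that is really .4569" shows up very differently on a gage that displays three places versus one that displays four.

# Hypothetical illustration of display resolution (not a real measurement).
true_diameter = 0.4569  # assumed "true" slug diameter, inches

# What a gage that rounds to its last displayed digit would show:
print(f'3-place gage: {round(true_diameter, 3):.3f}"')   # 0.457"
print(f'4-place gage: {round(true_diameter, 4):.4f}"')   # 0.4569"

# A 3-place gage that truncates instead of rounding would show .456",
# a full thousandth away from the better estimate of .457".
print(f'Truncated to 3 places: {int(true_diameter * 1000) / 1000:.3f}"')  # 0.456"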

dbosman
03-03-2017, 11:05 PM
Good point!


If your gage only reads to three decimal places, you won't be getting an accurate reading. Does it read to four places? If it only reads to three, you could be at .4569, which is closer to .4570 than to .4560.

buckshotshoey
03-04-2017, 06:01 AM
Even if the caliper only reads to 1/1000, you can make out 1/4s, 1/2s, and 3/4s of a thousandth on the dial, with a mechanical caliper that is. Probably not with a digital. You really don't need to be accurate to 1/10000 for this purpose.

NSB
03-04-2017, 10:01 AM
Even if the caliper only reads to 1/1000, you can make out 1/4s, 1/2s, and 3/4s of a thousandth on the dial, with a mechanical caliper that is. Probably not with a digital. You really don't need to be accurate to 1/10000 for this purpose.
The fourth decimal place isn't used to determine a need; it determines the accuracy of the third decimal place. It's called the "discriminating decimal place", and it tells you how accurate you are to the nearest thou.

buckshotshoey
03-04-2017, 03:58 PM
The fourth decimal place isn't used to determine a need; it determines the accuracy of the third decimal place. It's called the "discriminating decimal place", and it tells you how accurate you are to the nearest thou.
Yes, that is correct. But with a dial caliper I can easily see whether it's closer to .358 or .359, or exactly between the two. No need to be more exact than that in this situation.

NSB
03-04-2017, 05:11 PM
Yes, that is correct. But with a dial caliper I can easily see whether it's closer to .358 or .359, or exactly between the two. No need to be more exact than that in this situation.
I'm trying to educate the OP, not you. He hasn't responded to the measuring comment. We don't know how he's measuring it. I'm sure your method is fine for you, but he needs some help or advice.

Larry Gibson
03-04-2017, 05:48 PM
Also, I suggest you use a micrometer instead of a caliper. I've checked the accuracy of numerous calipers in the .308 - .454 range and have found several that were off by .001" to as much as .004".

Larry Gibson

buckshotshoey
03-05-2017, 10:23 AM
I'm trying to educate the OP, not you. He hasn't responded to the measuring comment. We don't know how he's measuring it. I'm sure your method is fine for you, but he needs some help or advice.

And so am I. All I'm saying is that you don't have to spend hard-earned money on a caliper or mic that reads to .0001 for this purpose. With an accurate dial or mic, the naked eye can see a difference of .00025 on the dial face; it's about the width of the indicator needle. The OP can take both of our recommendations and decide for himself how he wants to do it.

NSB
03-05-2017, 02:31 PM
And so am I. All I'm saying is that you don't have to spend hard-earned money on a caliper or mic that reads to .0001 for this purpose. With an accurate dial or mic, the naked eye can see a difference of .00025 on the dial face; it's about the width of the indicator needle. The OP can take both of our recommendations and decide for himself how he wants to do it.
I retired as a quality engineer/manager, and one of my jobs prior to retiring was to supervise two large metrology labs and maintain calibration and certifications for all the gages in use in the manufacturing area. For three years I checked and recorded many hundreds of gages used in the facility. During that time, the only gages I ever rejected were ones that were damaged and no longer accurate. Even the cheaper gages passed calibration checks against registered standards. I don't recall ever seeing a gage in use that didn't measure to at least four decimal places when the requirement was to measure accurately to three decimal places. We had mostly machinists, tool and die makers, and inspectors using these gages, and many of the gages were relatively inexpensive and didn't require anyone to waste "hard earned money" to purchase.

If the OP were to get a gage to measure accurately to three decimal places, and wanted to do it with certainty, he would get a gage that measures to four decimal places. That's just how it is. If "Bubba close" is good enough, then use a wooden ruler. If you want to do it right, get an accurate, affordable gage that measures correctly: for accurate three decimal places, you need to read four decimal places. As another responder mentioned above, use a micrometer, not a caliper, for this.

Not looking to start a fight; I'm just pointing out that there is a right way and a wrong way to measure things accurately. For those that don't think they need to be accurate, use anything you want.
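
To put numbers on that "discriminating decimal place" idea, here is a small, hypothetical Python sketch (my own illustration, not anything measured in this thread; the possible_range helper is made up for the example): a reading only pins the true value down to plus or minus half of the last digit the gage displays, so a four-place gage narrows the band tenfold.

# Hypothetical sketch: uncertainty band implied by a gage's displayed resolution.
def possible_range(displayed, decimal_places):
    """Band of true values consistent with a reading, assuming correct rounding."""
    half_step = 0.5 * 10 ** (-decimal_places)
    return displayed - half_step, displayed + half_step

low, high = possible_range(0.456, 3)   # three-place gage reading .456"
print(f'.456" on a 3-place gage -> true value between {low:.4f}" and {high:.4f}"')

low, high = possible_range(0.4560, 4)  # four-place gage reading .4560"
print(f'.4560" on a 4-place gage -> true value between {low:.5f}" and {high:.5f}"')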

buckshotshoey
03-05-2017, 02:39 PM
I retired as a quality engineer/manager, and one of my jobs prior to retiring was to supervise two large metrology labs and maintain calibration and certifications for all the gages in use in the manufacturing area. For three years I checked and recorded many hundreds of gages used in the facility. During that time, the only gages I ever rejected were ones that were damaged and no longer accurate. Even the cheaper gages passed calibration checks against registered standards. I don't recall ever seeing a gage in use that didn't measure to at least four decimal places when the requirement was to measure accurately to three decimal places. We had mostly machinists, tool and die makers, and inspectors using these gages, and many of the gages were relatively inexpensive and didn't require anyone to waste "hard earned money" to purchase.

If the OP were to get a gage to measure accurately to three decimal places, and wanted to do it with certainty, he would get a gage that measures to four decimal places. That's just how it is. If "Bubba close" is good enough, then use a wooden ruler. If you want to do it right, get an accurate, affordable gage that measures correctly: for accurate three decimal places, you need to read four decimal places. As another responder mentioned above, use a micrometer, not a caliper, for this.

Not looking to start a fight; I'm just pointing out that there is a right way and a wrong way to measure things accurately. For those that don't think they need to be accurate, use anything you want.

We are not talking about making trigger parts here. But,
You are 100 percent right....
I am 100 percent wrong....
There...it's over.
Oh, and take your bubba comment and stuff it.