Since this thread concerns tin...
Quote:
Originally Posted by
cbrick
It's not that over 3% tin isn't good, it's just wasteful and will accomplish little to nothing, at least nothing that you're trying to accomplish.
6-8% antimony is getting up in brittleness
Rick
Some good responses on this thread. I certainly agree with the information presented. I realize Marlin Hunter started this thread concerning the use of tin as an aid in casting, but I have a question about how tin relates to bullet integrity.:hijack:
Allow me to give a little background concerning my situation.
I have spent the past several years doing dedicated research and testing on the modern 45-70 loads, specifically with pressures associated with the Marlin lever actions. A friend of mine is into wildcat cartridge development, and owns one of those "portable ballistic labs" that utilize strain gages and software. I use the T/C Encore as a test mule to develop the maximum pressure loads before transferring the loads over to the levers. (Not only is testing easier, but if I really mess things up with insane pressure, a stretched frame is easier to replace than a levergun!)
The loads are developed specifically for hunting purposes. I realize that there are commercial outfits who specialize in this, and some may consider me to be "reinventing the wheel", but I prefer to develop loads specific to my situations rather than settle for a few choices from the manufacturers. I have developed loads from mild to wild, 250 grain bullets to 550 grains in various designs, both jacketed and cast. However, my most-tested loads involve bullets around 350 grains at velocities above 2000 fps. Not that these are my "everyday" loads (ouch!), but I purposely attempt to push the bullet to its limits. The number of alloy combinations I have tried is mind-boggling.
The major stumbling block has been finding the correct alloy for desired terminal ballistics. Something I have definitely learned is that the alloy that may seem ideal for the load (accuracy/control of leading) is not always ideal upon impact, dependent upon the desired characteristics.
Most of my testing is done with wet paper, simply because that is the most accessible and economical for me. I feel that it is a good standard, comparatively speaking. I have done one test with ballistic gelatin, but that was to get an idea of "wound channel" to see if the paper testing was leading me in the right direction.
One of the early tests involved comparing Lyman's No.2 (5-5-90) against the hardball alloy (2-6-92). The two different alloys react the same as far as the load goes, yet recovering the fired bullets revealed better weight retention with the No.2 alloy.
Drawing conclusions from that, later testing was done with wheel weight metal from an 800-pound lot. This lot had a tested composition of 3.2% antimony and 0.18% tin. Following the accepted procedure of incrementally adding tin, the bullet recovery testing showed the same trend: better weight retention and penetration whenever I added tin.
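For anyone wanting to repeat that incremental-addition step, the arithmetic is just a mass balance. This is my own sketch, not something from the original testing notes; the 20 lb pot size and the 2% tin target are assumptions picked purely for illustration, using the 0.18% Sn / 3.2% Sb lot quoted above:

```python
def tin_to_add(melt_lbs, sn_now, sn_target):
    """Pounds of pure tin to add to a melt to reach sn_target.

    Fractions are by weight, e.g. 0.02 for 2%. Derived from the
    mass balance (melt_lbs*sn_now + x) / (melt_lbs + x) = sn_target.
    """
    return melt_lbs * (sn_target - sn_now) / (1.0 - sn_target)

# Hypothetical example: 20 lb pot of the WW lot above (0.18% Sn, 3.2% Sb),
# bumped to a 2% Sn target.
added = tin_to_add(20.0, 0.0018, 0.02)

# Note the added tin also dilutes the antimony slightly:
sb_after = 20.0 * 0.032 / (20.0 + added)

print(f"add {added:.2f} lb tin; Sb drops to {sb_after * 100:.2f}%")
```

Run as written, this says roughly 0.37 lb (about 6 oz) of tin for the 20 lb pot, with antimony diluting only from 3.2% to about 3.14%, so each increment changes the Sn:Sb ratio without meaningfully changing the antimony content.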
I am not a metallurgist, so I can only base any conclusions upon the old "try it and see" methods. I make every attempt to be very methodical in my testing. Every lot of lead in my shed is of known composition. The loads are developed so as to ensure similar impact velocities; I even use the chronograph as I'm firing into the test medium just to make sure. Even with low-grade antimonial alloys (as low as 2.3%), the trend seems to be better weight retention as the percentage of tin is increased.
My questions are:
1. Can anyone explain what is actually happening concerning the effect of tin as it pertains to alloy integrity? I am not interested in the effects on casting characteristics, nor am I concerned with hardening/softening characteristics. I want to know why the bullets are holding together better after impact. The additional tin seems to help counteract the effect of the antimony; is this what is actually happening, or am I way off base?
2. Are my findings in line with what should be happening "in theory"? If not, maybe I need to reconsider my approach.
3. Keeping in mind that these alloys contain some amount of antimony, is the above quoted 6-8% the starting point of actual brittleness, or is it the accepted standard of what is considered brittle for our normal range of utility? My reason for asking is this: if my WW metal mentioned above only has 3.2% Sb, why does it consistently shed more weight than 10:1 lead/tin alloy? The two alloys are nearly identical in initial hardness, but the 10:1 cannot be quenched or heat-treated to a high enough level of hardness to prevent leading. However, I did indeed run a test between the two in their as-cast state at 2200 fps just to see how they held up on impact (and dealt with the leading afterwards!). The WW retained 62% of its original weight; the 10:1 retained 83%.
This has led me to believe that the introduction of any antimony begins to introduce some degree of "brittleness"; it's just that we don't consider it a problem until it rises above our accepted level of compromise, i.e., when we easily start noticing the difference.
4. Does anyone else have any experience similar to my findings above?
I am open to all thoughts pertaining to this topic.