I am trying to figure out how much taper per inch is needed on a base to equal 20 MOA at 100 yards. If one ignores the distance between the center of the bore and the center of the scope, I come up with the following.
3600 inches = 100 yards
A taper of .006" per inch would lower the sight line by 21.6 inches at 100 yards (3600 × .006 = 21.6).
Then divide 21.6 by 1.047 (inches per MOA at 100 yards): 21.6 / 1.047 ≈ 20.6 MOA.
Does this sound close, to those who might have a better handle on the subject?
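As a sanity check, here is a minimal Python sketch of the same arithmetic. The function name and the 1.047 in/MOA figure are my own choices, and it uses the same small-angle shortcut that ignores the bore-to-scope height:

```python
# Rough check of taper-per-inch to MOA at a given range.
# Assumes 1 MOA = 1.047 inches at 100 yards and treats the taper
# (inches of drop per inch of base length) as the slope of the sight line.

def taper_to_moa(taper_in_per_in, range_yards=100.0, inch_per_moa_100=1.047):
    """Convert base taper (in/in) into MOA at the given range."""
    range_inches = range_yards * 36.0              # 100 yd = 3600 in
    drop_inches = range_inches * taper_in_per_in   # 3600 * 0.006 = 21.6 in
    inch_per_moa = inch_per_moa_100 * (range_yards / 100.0)
    return drop_inches / inch_per_moa

print(taper_to_moa(0.006))    # ~20.6 MOA
print(taper_to_moa(0.00582))  # ~20.0 MOA, the taper for an even 20 MOA
```

Working it backward the same way, an exact 20 MOA would need about 20 × 1.047 / 3600 ≈ .0058" of taper per inch, so .006" per inch comes out slightly over at roughly 20.6 MOA.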