I don't know all the dates from the manufacturer. We had a bigger class of Park Service guys with them and several failed; I don't know the dates. They are .gov rifles, and one was as bad as 89%.
We did not do dates
> @lowlight, when you said the older Leupold Mk4s had a lot of tracking error, how old are we talking? I have a 2007 Mk4; are the ones you evaluated older than that? I have never gotten around to testing the tracking on mine.

We don't have time to go that deep into the scope's manufacturing history. Thankfully, they are beginning to disappear from the firing line. We had much trouble with them before we started tracking the results.
> Schott glass compensates for any mechanical tracking errors.

They're hawking the Schott because they know you're gonna need to see all your misses with that scope.
> Great info.

The 99% was a BTR. Not certain what the other two were.
Are the Athlon Ares listed ETR's or BTR's? I'm assuming they were the 34mm tube ETR's. If so that's a good showing for a 1K scope so far.
> Frank, are you going to collect and record the data for student scopes with other mechanical issues? Fails to hold zero, reticle canted relative to turret, etc.?

We have started to do that but don't have enough info to release anything subjective or substantive as of yet.
I really like what you and Frank are doing. Honestly, a database to track test results would be awesome. For that matter, if you documented a repeatable test methodology so that others could submit results, that would be huge. Obviously you would have to prevent people and vendors from manipulating the data, with some sort of vetting process for those conducting the testing and submitting results.
> Kinda surprised to see the Minox ZP5 at 98.3.

It is only a single scope, but yeah, kind of disappointing. I like my 2 Minox ZP5s very much.
Not sure if anyone has pointed it out yet but both Athlons are misspelled. Supposed to be Ares and Cronus.
Good info!

As many have read, during our Sniper's Hide Training Courses we perform a tall-target test on many of the students' scopes. Originally we did this only in our PR 2 Class, which usually runs only about twice a year. This year we expanded it to our PR 1 Classes because of the results and the benefits.
Methods:
We use our Sniper's Hide tracking fixture, set up at exactly 100 yards; in our case we use 300 ft to fine-tune it. We also use a 4-ft level and a ruler to look at the reticles and measure the variation if we see it. The fixture weighs 30 lbs so as not to influence the results, and any negative result is double-checked.
View attachment 7456804
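The tall-target arithmetic behind the test is straightforward; here is a minimal sketch (the function and variable names are my own, not from any Sniper's Hide tooling). One milliradian subtends 1/1000 of the range, so at 100 yards (3,600 inches) one mil is 3.6 inches on the target.

```python
# 1 mil subtends 1/1000 of the range; 100 yd = 3600 in, so 1 mil = 3.6 in.
RANGE_IN = 100 * 36  # 100 yards expressed in inches

def expected_travel_in(dialed_mils: float) -> float:
    """Inches the point of aim should move for a given dial at 100 yd."""
    return dialed_mils * RANGE_IN / 1000.0

def tracking_pct(dialed_mils: float, measured_in: float) -> float:
    """Measured movement versus expected movement, as a percentage."""
    return 100.0 * measured_in / expected_travel_in(dialed_mils)

print(expected_travel_in(10))            # 36.0 inches for a 10 mil dial
print(round(tracking_pct(10, 35.4), 1))  # 98.3 (percent tracking)
```

A value over 100 simply means the erector moved farther than the amount dialed, not that the scope is "better than perfect."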
We consider 2% within spec for most scopes, and the minor variations we see can and should be inserted into ballistic software to correct the solvers. Scopes are our weakest link in the system; if something breaks, it's usually the scope first. If you dope your rifle by shooting it, the variations are minor, but if you are using software to predict, the computer must know there is a deviation. Not all software lets you do this (turret correction), but it should.
Software like Coldbore 2.0, FFS, and Now Genesis Ballistics have the Scope Tracking Utility built into the program. Here is an example:
View attachment 7456809
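The turret-correction idea above can be sketched in a few lines (the names here are mine and do not reflect the actual API of Coldbore, FFS, or Genesis Ballistics). Given a measured tracking percentage, the solver divides its true correction by the tracking factor to get the value you actually dial:

```python
# Hypothetical sketch of a turret-correction utility.
def corrected_dial(solver_mils: float, tracking_pct: float) -> float:
    """Dial value that produces the solver's true correction on a turret
    tracking at `tracking_pct` percent. A value over 100 means the scope
    moves more than dialed, so the corrected dial comes out smaller."""
    return solver_mils / (tracking_pct / 100.0)

print(round(corrected_dial(5.0, 98.0), 2))   # 5.1: dial extra on a 98% scope
print(round(corrected_dial(5.0, 100.0), 2))  # 5.0: no correction needed
```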
The added value we found in doing this in every class is that we can find scopes with canted reticles, as well as demonstrate parallax in the student's optic by adding and removing it while the scope is on the fixture. Then, once the scope has been tested, an instructor guides the student through rifle setup so the scope is placed at the optimal point on the rail for their body type. About 1/3 of the scopes per class were found to be canted in the rings to varying degrees. We also see reticles not properly aligned to external levels. It's here, too, that we can fix the reticle-to-level relationship if the level is off. Sometimes it's an error in initial mounting; sometimes it happens during installation and tightening. Rifle setup is an important part of our class; fitting the rifle to the shooter solves a lot of problems down the road.
So as of now, we have 150 scopes on our list that have been tested and recorded. The results are here for all to see:
View attachment 7456820
Moving forward, expect more data, more scopes tested, and more classes incorporating this process.
Don't own one, sorry. Trust me when I say it's the scope you think it is.
Thanks. We are only interested in tracking the scopes that come through our course. One tracker, one list.
If you did open up submissions to the community, I think you could prevent multiple submissions for the same scope model from the same person, to keep people from padding the data. Perhaps bind each submission to a serial number in addition to the model?
However, I think a lot could be gained by tracking this data over a larger sample size.
Ultimately you are testing and quantifying the success of a manufacturer's QA. That data should stabilize into a predictable value for each line of scopes from a manufacturer.
Cool stuff.
Just a small correction on the Athlon scopes. They are spelled:
Ares (likely a BTR)
Cronus (likely a BTR)
Perhaps I am a bit slow on the uptake but, if the numbers represent % of tracking, just what do numbers >100 mean?
> Bushnell kicked it all around!

Yeah, the worst Bushy tested at 99.8%; the rest were 100%.
I only have one, but that may change....
Many thanks!
Jordan