
Objective Testing on Rifle Scope optics

Gentlemen,

Do any of you know of any OBJECTIVE tests that are performed on rifle scopes?

Are there tests for light transmission, clarity, contrast, accuracy of magnification, accuracy of tracking, etc.?

I do not mean the clarity test patterns that the Air Force created in 1951, but some form of measurable and quantifiable standard testing that can really explain with numbers the differences between a low, medium and high quality scope. It appears that whenever we are confronted with questions regarding brand "A" vs. brand "B" we get so many different subjective opinions. Is there any research out there that we can point to and discuss objective findings?

I understand that there will be preferences to style for the scope constructions (turrets, parallax adjustment, etc.), but what about the actual optical quality of the scope itself? When someone states that "optically" brand "A" and brand "B" are the same, what are they basing that off of?

So far, this is one of the few tests I have found: Multiple Rifle Scope tests from Europe..REAL TESTS - The Optics Talk Forums

However, this is just a secondary source (a forum post), and I could not access the original 2008 link since it had expired or changed. Any help would be appreciated. This way, maybe, just maybe, people can back up their statements/comments? And where is Ilya???
 
To my knowledge, there is no standardized system for testing multiple brands and types of riflescopes. Ilya Koshkin would be the man to ask; if anyone has done more extensive scope testing than him, I'd be surprised. I don't know the extent of his testing or whether he has equipment to directly measure light transmission. In a lab setting, light transmission would seem like the only readily quantifiable thing to measure, along with maybe color spectrum representation. It would be difficult to assign a meaningful test or number to resolution, other than the Air Force's chart as compared by human eyes.

In the end, even if there were quantifiable differences in light transmission, color rendition, contrast, and resolution, it may mean little to human observers, because what kind of image someone prefers is so subjective. That's why guys like Ilya and Bigjimfish are so nice to have around; they are, for now, the gold standard as objective scope testers, with of course their own subjective preferences. There are also qualities of a scope image that go beyond specs and numbers, and they are really hard to pin down. Sometimes someone will hate an S&B and love a Premier, or vice versa, with no real logical explanation, and both are state-of-the-art optics. People are hard to predict.
The free market does a good job of classifying scope quality through pricing: it is tough to offer a $500 scope that is on par with a $2,000 scope, because sooner or later the market will equalize and the $500 scope will be worth $1,200, with the $2,000 scope dropping in value to the same level, at least in theory. Some guys will take exception to that statement, citing examples like Leupold's Mk4 cost-to-value vs. Vortex's PST.
Well, that's my stab at answering, for what it's worth.
 
We at FinnAccuracy have been using such an in-house test for years already. The plan is to finalize it very soon for public release.
The test consists of a test table and an Excel file with input fields for the measured numbers. It is simplified; the limitations of laser printers and varying lighting conditions force it to be so.
 
Camera lenses are commonly tested for resolution, light transmission, color fidelity, chromatic aberration, spherical aberration, etc.

No reason that scopes cannot be tested the same way. And it would probably be VERY interesting.
 
Yes, but IF the test results must be comparable with results obtained in a different place, at a different time, under different lighting and environmental conditions, and by a different person, then it's not so straightforward anymore.
Also, creating, for example, an aberration test with a numeric performance result is hard if not impossible.
 
Camera lenses are commonly tested for resolution, light transmission, color fidelity, chromatic aberration, spherical aberration, etc.

No reason that scopes cannot be tested the same way. And it would probably be VERY interesting.

There is absolutely a reason we can't do the same... camera lenses are tested by having a computer program analyze images of a test chart, or subjectively by looks. I have spoken to the creator of the program used to test lenses and he said it's an uphill battle that would not work. I have spoken to several people, including these: http://www.imatest.com

You'd have to use a camera, chart the camera, and test that first, then put the scope in the same spot and test it, subtracting the camera from the equation. And still nobody agrees on which lens is best, just that we like the "look" of this camera/lens combo better than that one.
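To make the "subtracting the camera from the equation" step concrete, here is a minimal sketch. It assumes you already have MTF curves measured once with the camera alone and once through the scope; the curves below are made-up placeholders, not real data:

```python
import numpy as np

# Placeholder MTF curves sampled at the same spatial frequencies.
# In practice each curve would come from analyzing photos of a test chart:
# one set taken with the camera alone, one set taken through the scope.
freqs = np.linspace(0.0, 2.0, 21)          # spatial frequency axis (arbitrary units)
mtf_camera = np.exp(-1.5 * freqs)          # made-up camera-only response
mtf_system = np.exp(-2.3 * freqs)          # made-up camera-plus-scope response

# MTFs of cascaded optics multiply, so dividing the combined curve by the
# camera-only curve isolates the scope's own contribution.
eps = 1e-6                                 # guard against dividing by ~0
mtf_scope = np.clip(mtf_system / np.maximum(mtf_camera, eps), 0.0, 1.0)

for f, m in zip(freqs, mtf_scope):
    print(f"{f:4.1f} : scope MTF ~ {m:.2f}")
```

Even then, keeping the camera, alignment, and lighting identical between the two measurements is the hard part.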

It's subjective; it's a fancy eye test without expensive equipment. Sure, you can get pretty close, but even perfect eyes only see so much, and less-than-perfect eyes will see something completely different. And any time you have someone who doesn't have a clue (wrong ocular setting, wrong focus, etc.), his opinion throws the whole thing out of whack.

What I look for in a scope is not what others look for; the "look" is 4th or 5th down my list of what is important. Then again, I am not expecting bargain optics to perform like an S&B, nor am I saying a scope like an IOR that looks good can compete with an NF that may not "pop" but is 100x more reliable.

To each their own.

When FinnAccuracy releases their method we'll use that as a standard, but understand it's all very subjective. And frankly, I would rather it track than try to explain to someone what I see. I can show you the accuracy in the tracking; showing you the accuracy in the "look" is near impossible.

People who want to test the glass or are concerned about the "look" of the scope have their priorities in the wrong place.
 
I back Lowlight on this. If a buddy or I were in the market for another scope, first and foremost I would look at tracking and reliability history; from there it is mostly personal preference (glass, reticle, turret type, etc.). There are certainly different tiers of glass, and generally your wallet dictates what you can afford. There are not that many people who can truly tell the difference between Hensoldt, Kahles, and Schmidt glass; you get what you pay for.
 


At present there will be subjectivity in any optical comparison. As Ilya has discussed in his writings, each person has different preferences and may see differently when comparing optics. Additionally, several very high-end scopes may appear to have substandard glass to some people due to the specificity of the optical design (i.e., optical designs which target competition benchrest shooters). Some people favor optics maximized for resolution rather than contrast.

Scope designers must compromise in some area of optical performance in order to increase other areas of performance. For example, let's assume that a potential buyer has narrowed it down to two different scopes with equally clear glass (both are 5-25x56 with a 34mm tube). If the makers of one of the scopes prioritized resolution with less concern for contrast, then contrast must be compromised somewhat in order to maximize resolution. To some people the image may appear flat and less colorful than scopes with higher contrast. If contrast is maximized over resolution, the same viewer who was unimpressed by the higher-resolving scope may prefer the higher contrast without thinking much about resolving power. Unknowingly, the buyer believes the glass to be better in the higher-contrast scope based upon a quick look-through... and for that person it may be. Everyone has different needs and preferences.

As Frank has already mentioned, many times one may give a scope a quick glance and offer an opinion that the scope has poor glass without properly adjusting the ocular and parallax.
 

Jtrax, you are right on the money with your discussion and I agree. You are also so right regarding proper adjustment of the scope. Without proper adjustment of the ocular and parallax, the optic may appear substandard. To confirm, I shot last night with my eyeglasses when I normally shoot with contact lenses. My point-of-impact shift was 1 inch left at 100 yards and my parallax adjustment was off by 50 meters. So yes, proper adjustment is VERY important.

Others, thank you so much for the responses. I am particularly interested in the tests to be performed by the gentlemen at Finn Accuracy. Their development of the MSR reticle was the big selling point in my purchase of the Schmidt and Bender. I know they will not let us down. It would also appear that there are opinions about opinions, which is interesting by itself.
 
On top of all this, how many are testing two different scopes thinking they are on the same power... because the power ring says 10x, meanwhile one is being viewed at 9.6x and the other at 11.2x, which then "looks" better to the tester.

Magnification rings are not all perfectly calibrated to be the same, so that is another variable.
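One rough way to check actual magnification, rather than trusting the power ring, uses the well-known relationship magnification ≈ objective diameter / exit pupil diameter. The sketch below uses made-up measurements, but it shows how two scopes both marked "10x" could really be running at different powers:

```python
def estimated_magnification(objective_mm: float, exit_pupil_mm: float) -> float:
    """Approximate actual magnification of a telescopic sight.

    For a scope, magnification ~= objective diameter / exit pupil diameter.
    The exit pupil (the bright disc floating behind the ocular) can be
    measured with a caliper or a ruled card held at the eyepiece.
    """
    return objective_mm / exit_pupil_mm

# Hypothetical readings: two 56 mm objective scopes, both set to "10x" on the ring.
print(f"Scope A: ~{estimated_magnification(56.0, 5.8):.1f}x")   # ~9.7x
print(f"Scope B: ~{estimated_magnification(56.0, 5.0):.1f}x")   # ~11.2x
```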
 

Frank hits on the "emperor's new clothes" truth about scopes here. I have noticed this over and over. Two scopes are both marked 20x on top and are *CLEARLY* not the same magnification. I have taken to setting the mag ring by eye when evaluating, as the marks on the scopes mean little or nothing.

John
 
Maybe I'm way off track with this idea, but I have written software before that can put solid numbers to image quality. The algorithm was used to prove the quality of a new fingerprinting technology. What I did was use C# with the XNA libraries to log every pixel in the image and then count the number of unique colors. The second part of the algorithm measured the rate of change across user-defined areas to get the rate of gradient change. Knowing how well a standardized image is transmitted through a scope could tell you a lot about clarity, definition, and transmission.
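For readers who want a feel for what those two measurements might look like in code, here is a minimal sketch. The original software was C# with XNA; this is only a rough Python equivalent, and the file names are purely hypothetical:

```python
import numpy as np
from PIL import Image

def image_quality_metrics(path: str) -> dict:
    """Rough stand-ins for the two metrics described above:
    unique-color count and mean gradient magnitude (rate of change)."""
    img = np.asarray(Image.open(path).convert("RGB"))        # H x W x 3, uint8

    # 1. Unique colors: a washed-out image transmits fewer distinct tones.
    unique_colors = np.unique(img.reshape(-1, 3), axis=0).shape[0]

    # 2. Gradient energy: a sharper, higher-contrast image changes faster
    #    from pixel to pixel, so its mean gradient magnitude is larger.
    grey = img.mean(axis=2)
    gy, gx = np.gradient(grey)
    mean_gradient = float(np.mean(np.hypot(gx, gy)))

    return {"unique_colors": unique_colors, "mean_gradient": mean_gradient}

# Hypothetical usage: the same standardized chart photographed through two scopes.
# print(image_quality_metrics("through_scope_A.png"))
# print(image_quality_metrics("through_scope_B.png"))
```

As the replies below point out, the hard part is not the math but getting a repeatable through-the-scope image to feed it.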
 

An interesting idea. I'm sure software could scan an ultra-high-resolution image, distinguish details, and spit out numbers and so on. But I think the problem would lie with getting a pure and consistent image from the scope to the testing medium, and then further accurate pixelation. It is insanely hard to get a decent through-the-scope shot with a camera, and everyone makes sure to mention when they post one that the picture doesn't do justice to the real image in person. Some complex and expensive equipment may be able to do a much better job of it, but the logistics would be very tough, it would be very hard to repeat consistently, and it would have to be in a lab setting for sure. Any deviation or signal distortion from the scope to the testing equipment would be more drastic than the minuscule differences in resolution and light transmission between an S&B and a Premier, for instance. Probably not impossible, just expensive and difficult. A cool idea, though.
 

Certainly one of the biggest problems is getting that image. As a person who has taken countless through-the-scope pictures, and even devised two separate adapters for taking such photos, I can tell you that I have yet to find a way to do so that I would consider reliable and repeatable enough to compare photos and judge optical performance. Simply put, there is often more difference from picture to picture than from scope to scope. While I am sure that some device could be made that would flip the scenario so that shot-to-shot differences were far smaller than scope-to-scope differences, I expect that this would be quite an undertaking, not only from the machine-shop point of view but as regards the software and even the test facility. I would be surprised if some scope makers did not have such tech. But then, you wouldn't really believe what any scope maker had to say about their product vs. anybody else's, would you?
 
What I look for in a scope is not what others look for; the "look" is 4th or 5th down my list of what is important. Then again, I am not expecting bargain optics to perform like an S&B, nor am I saying a scope like an IOR that looks good can compete with an NF that may not "pop" but is 100x more reliable.
This is precisely the way I see it. When I see guys in Afghanistan crawling around in the shit with a Nightforce, making world-record sniper shots, and getting holes shot through the tube with the optic still working and keeping them in the fight, that is real-world lab analysis IMO and it has sold me on that brand. Even though NF glass isn't as good as some other top-tier scopes and they weigh more than the competition, the fact that the turrets track accurately and the scopes can take a beating is more important to me than the small advantage in glass that other scopes provide.
 

I hate to bust your bubble, but just because you see guys running around in the box with them doesn't mean guys or the sniping community like them. I can tell you first hand that the Nightforce scopes were on the list for being shit-canned in favor of an FFP scope during the SOF sniper conference / force modernization meeting some time back. In fact, there was a specific lot of NF scopes that had to be recalled and fixed due to mechanical issues. The problem was resolved, but it was a hiccup in the fielding. What you are seeing on some guns now is legacy contract-issued weapons systems. The NF scope has been deemed not to meet the requirements for what the force needs, as an FFP scope is one of the requirements. So basing your purchase of a scope on what you see guys carrying in the box is flawed, as it isn't indicative of what they want and/or would choose if given a choice.
 
Ilya Koshkin does some pretty thorough reviews here
Reviews: Riflescopes » OpticsThoughts
He does not use test equipment, but he approaches the comparison with an open mind and has others participate.
He seems to have a fair amount of experience with optics and a good understanding of optical design.

His conclusions seem rational and valid to me.

Other than that, all I've got is my own eyes and experience.

Joe
 
I hate to bust your bubble, but just because you see guys running around in the box with them doesn't mean guys or the sniping community like them. I can tell you first hand that the Nightforce scopes were on the list for being shit-canned in favor of an FFP scope during the SOF sniper conference / force modernization meeting some time back. In fact, there was a specific lot of NF scopes that had to be recalled and fixed due to mechanical issues. The problem was resolved, but it was a hiccup in the fielding. What you are seeing on some guns now is legacy contract-issued weapons systems. The NF scope has been deemed not to meet the requirements for what the force needs, as an FFP scope is one of the requirements. So basing your purchase of a scope on what you see guys carrying in the box is flawed, as it isn't indicative of what they want and/or would choose if given a choice.


Exactly. Look at the Horus; it shows what some well-placed lobbying can do. :)
 
I can tell you first hand that the Nightforce scopes were on the list for being shit-canned in favor of an FFP scope during the SOF sniper conference / force modernization meeting some time back. In fact, there was a specific lot of NF scopes that had to be recalled and fixed due to mechanical issues. The problem was resolved, but it was a hiccup in the fielding. What you are seeing on some guns now is legacy contract-issued weapons systems. The NF scope has been deemed not to meet the requirements for what the force needs, as an FFP scope is one of the requirements.

Wait a minute? The Army NF scopes I have handled were ALL FFP scopes. This was almost 2 years ago. I am probably not as connected as you are, but has this changed? Is NF now providing F2s to the Army? The guys I spoke with that used them seemed to like them very much (although they probably never got to use the S&Bs that the Marines were using). Not saying you are wrong, but the NF optics that I personally saw were all FFP.
 

I should have been more thorough in my explanation. FFP is one of several requirements for the scope, and NF has supplied some, but the legacy systems that are under contract and at some units are SFP. The new requirement asks for an optic in the 3x/5x-20x/25x range.

One of the bullet points from the conference slides:

–Fact = NF scopes fielded for MK13s having 15-20% quality control failures (see AAR) – has second focal plane and hollow mil dot reticle - low confidence


I am positive there are some guys who like/dislike certain features on the SWS we have, be it the scope, the reticle, etc., so I can't say everyone dislikes a specific piece of gear. Some guys like the Horus reticle, some hate it; flip a coin and that's the chance of it being one or the other. The guys who had NF scopes shit the bed on them are obviously not going to be as keen on them as those who have not had issues.
 
Doing the classes that had both NF and Leupold, the Leupolds were failing 3 to 1 compared to the Nightforce. So if you had 20% failures on the NF side, you certainly saw over 60% on the other side.

I think that is a bit of cherry picking.

Crane used Nightforce scopes, and they continue to use the 2.5-10x; the 25x call is why the BEAST is coming out, albeit far too late to compete with the S&B. There was a pretty solid mix (3-15x, 5-22x, etc.), and the change to FFP will certainly matter, as NF is late to the party in that respect. But for a long, long time they worked out great, especially when compared to the Leupolds.
 
Doing the classes that had both NF and Leupold, the Leupolds were failing 3 to 1 compared to the Nightforce. So if you had 20% failures on the NF side, you certainly saw over 60% on the other side.

I think that is a bit of cherry picking.


No cherry picking as far as I can tell, as this bullet was also included. I didn't include it because it didn't pertain to the topic at hand. However, it does support your statement of a higher fail rate.

"Leupold – 20% fail rate per class. Replaced 49 of 99 in last 18 months – regardless of replacement policy - low confidence"


Crane used Nightforce scopes, and they continue to use the 2.5-10x; the 25x call is why the BEAST is coming out, albeit far too late to compete with the S&B. There was a pretty solid mix (3-15x, 5-22x, etc.), and the change to FFP will certainly matter, as NF is late to the party in that respect. But for a long, long time they worked out great, especially when compared to the Leupolds.

I agree and you get no argument from me in the comparison of the two.
 
Okay folks, I asked and the people at Finn Accuracy DELIVERED!!

http://www.snipershide.com/shooting...5-finnaccuracy-optics-test-available-now.html

This is something we can do as a group, helping tabulate our results into one main Excel spreadsheet. That way, when someone says Brand "A" has better optical quality than Brand "B", they had better back it up with the testing procedures laid out in the thread above.

Please check out that thread and start getting to work proving your opinions. Now there are no excuses.
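As a rough illustration of what that group tabulation could look like once results come in, here is a small sketch. The column names and numbers are purely hypothetical, not FinnAccuracy's actual format:

```python
import csv
from collections import defaultdict
from statistics import mean

# Hypothetical rows: one entry per tester, per scope, per measured item.
rows = [
    {"tester": "user1", "scope": "Brand A 5-25x56", "item": "resolution_score", "value": 7.5},
    {"tester": "user2", "scope": "Brand A 5-25x56", "item": "resolution_score", "value": 8.0},
    {"tester": "user1", "scope": "Brand B 5-25x56", "item": "resolution_score", "value": 6.5},
    {"tester": "user2", "scope": "Brand B 5-25x56", "item": "resolution_score", "value": 7.0},
]

# Pool everyone's numbers per scope and per item, then average them.
pooled = defaultdict(list)
for r in rows:
    pooled[(r["scope"], r["item"])].append(r["value"])

with open("pooled_results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["scope", "item", "samples", "average"])
    for (scope, item), values in sorted(pooled.items()):
        writer.writerow([scope, item, len(values), round(mean(values), 2)])
```

Averaging across several testers at least smooths out individual eyes and lighting, which is the point of doing it as a group.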
 
I hate to bust your bubble, but just because you see guys running around in the box with them doesn't mean guys or the sniping community like them. I can tell you first hand that the Nightforce scopes were on the list for being shit-canned in favor of an FFP scope during the SOF sniper conference / force modernization meeting some time back. In fact, there was a specific lot of NF scopes that had to be recalled and fixed due to mechanical issues. The problem was resolved, but it was a hiccup in the fielding. What you are seeing on some guns now is legacy contract-issued weapons systems. The NF scope has been deemed not to meet the requirements for what the force needs, as an FFP scope is one of the requirements. So basing your purchase of a scope on what you see guys carrying in the box is flawed, as it isn't indicative of what they want and/or would choose if given a choice.

I love the string "What the pros use"; I believe they do have a choice. I doubt these guys just read an ad in the latest "tactical rag" and decided which scope to use.
 


I'm not sure you are tracking what I said; I think you misunderstood what and who I am talking about, or perhaps I don't understand the context of your response. I am talking about military guys using military-issued weapons systems; once it's a program of record there is no choosing, you get to use what is on the awarded contract.
 

It is a good effort by Finn Accuracy, but use it with caution, because their test is subjective and the results are not comparable between different users, different environments, different lighting conditions, etc. There are also some methodology differences, and when you get into the discussion of high-end scopes, which are very good to start with, their tests are also quite incomplete.

There is no magic to an objective scope test. My company builds electro-optical test equipment. That is all we do, and we have built enough testers for all manner of cameras, payloads, weapon sights, lasers, etc. to know that for any DVO, the "magic" is not in the measurement but rather in translating what that measurement means for the user.

If all you are trying to do is figure out how each scope objectively compares to any other scope, all you have to do is spend some money on semi-custom test equipment (a fair amount of money, but nothing insane by aerospace standards) and get the results. I can measure everything you want on a scope using the test equipment we build, and the testers we supply are agnostic of atmospheric conditions, user eyesight, and other subjective factors. There is a variety of tests you can run, but anything you use starts with the polychromatic and monochromatic MTF curves on-axis and off-axis, spectral transmission, distortion, flare, color accuracy, color discrimination, FOV, XP as a function of X-Y position, XP as a function of Z position, and a few others. I can similarly measure tracking accuracy with greater precision than you ordinarily can on the range.
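ILya's list starts with MTF curves, which are not exotic to compute once you have a good measurement of how the scope renders a sharp edge. The following is only a back-of-the-envelope sketch of that one idea, with a synthetic edge profile standing in for real data; it is not a description of his lab equipment or procedure:

```python
import numpy as np

def mtf_from_edge_profile(esf: np.ndarray, sample_pitch: float):
    """Toy MTF estimate from a measured edge-spread function (ESF).

    esf          : intensity samples taken across a dark-to-light edge
    sample_pitch : spacing between samples (in whatever angular or linear unit you use)
    Returns (frequencies, mtf), with the MTF normalized to 1 at zero frequency.
    """
    lsf = np.gradient(esf)                   # derivative of the edge = line-spread function
    lsf = lsf * np.hanning(lsf.size)         # window to reduce truncation ripple
    spectrum = np.abs(np.fft.rfft(lsf))
    mtf = spectrum / spectrum[0]             # normalize the DC term to 1
    freqs = np.fft.rfftfreq(lsf.size, d=sample_pitch)
    return freqs, mtf

# Synthetic edge: 128 samples across a slightly blurred black/white transition.
x = np.linspace(-3.0, 3.0, 128)
esf = 0.5 * (1.0 + np.tanh(x / 0.4))         # made-up data, not a real scope
freqs, mtf = mtf_from_edge_profile(esf, sample_pitch=0.05)
print(", ".join(f"{m:.2f}" for m in mtf[:8]))
```

A sharper optic keeps that curve higher out to finer detail; the real work in a proper tester is in the fixturing, calibration, and repeatability, not in this arithmetic.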

If push comes to shove, I can even add a vibration stage to test how all this survives recoil. There are even people out there who make scope testers and market them. They do not test everything I mentioned here, but I have seen their test sets in action and they measure the important stuff.

In order to figure out how all those results translate into real world performance, you have to take those objective test results in hand for a bunch of scopes of varying quality and go look through them (and use them) with your own eyes.

And then you have to take into account that how those objective tests translate into actual usage is a little different from person to person, although most people are fairly close to each other with a few outliers here and there.

When I first started looking at scopes I took them to work and did some instrumental testing. After doing it for a couple of years I worked out pretty well how objective results I get with the equipment translate into subjective results I get with actual usage of the scopes. Over time, I phased out instrumental testing except for some special cases where I see subtle effects that I can't easily explain.

ILya
 

Ilya,

Thank you for all of your testing and reviews in the past. You really are a benefit to our community, especially in assisting people with making informed decisions for rifle scope purchases.

I see your point regarding objective vs. subjective testing. Meanwhile, any progress on evaluating the "Heavy Weights" as discussed in another thread? Or are you waiting for the Nightforce B.E.A.S.T. to be included with these evaluations?

I always look forward to your write-ups Ilya and thank you so much for your thoughts, insight and assistance!!
 

Thank you for your kind words.

The first part is largely done and I have looked at Steiner, S&B (two of them), Premier, March and Kahles. There is one remaining issue to iron out and I will publish it.

I will try to follow up with the scopes that were not easily available to me this time around: Leupold Mark 8 and Nightforce BEAST, and maybe something else.

One last word on subjective testing: it is valuable if processed in the right context. Also, subjective testing becomes fairly objective if it is done by the same user, with several scopes side by side, so that they are all affected by the subjective factors (the user's eyes, the environment, etc.) equally.

ILya
 
Honestly, while I completely dislike the idea of testing "glass", especially the subjective way, I understand why people want it. Everyone has a different reason for owning a scope and for how they are gonna use it, so it makes sense in a self-destructive way. In my opinion it does more harm than good, and really does a disservice to the buying public.

But having a controlled, standardized way of doing it will at least give people some idea of what is going on when someone says they like Brand X better than Brand Y.

Really, the concentration should be on the operation of the mechanics, with a secondary emphasis on the "look", and then you can really just reduce it to contrast and clarity on a distant target: can you see target image A at X distance? I go back to everyone crowing about the Premier while overlooking the fact that it was only designed to work with one set of rings at 15 inch-pounds; anything else was binding the parallax. I would walk down the line at a match and see that 50% of the Premiers were bound up in the rings. It was a well-known issue being disregarded because everyone's only consideration was the glass.

Honestly, I think a lot is overlooked when people are busy talking about glass, and let's face it, out of the box a lot of stuff looks great. I would want to know how it holds up over time. Does rain or sunlight degrade the coatings faster on Scope A than on Scope B? In other words, 4 years from now does it still look good, or has it lost its shine?

If we concentrated on shooting them, I think we would be serving the buying public better. If you want to test them, do a twilight test: at what time can you no longer hit that 600-yard target? This would be relevant to shooters and not collectors. As well, by focusing on tracking as priority one, we'd be moving people towards truing their software, which starts with first calibrating your scope. Much of the software has inputs for adjusting for your scope's tracking, yet everyone assumes the scopes adjust true, when we know this to be false. I would rather know that my scope's average movement over 20 mils is 0.99 mil per mil dialed than assume it is right.
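That calibration factor is just arithmetic once you run a tall-target style test. A minimal sketch, with made-up numbers that happen to land on the 0.99 figure mentioned above:

```python
def tracking_factor(dialed_mils: float, shift_cm: float, distance_m: float) -> float:
    """How many mils the scope actually moved per mil dialed.

    One milliradian subtends 1/1000 of the range, i.e. 10 cm at 100 m,
    so the measured shift on the target converts directly to mils.
    """
    measured_mils = (shift_cm / 100.0) / distance_m * 1000.0
    return measured_mils / dialed_mils

# Hypothetical tall-target test: dialed 20.0 mils, the group moved 198.0 cm
# on a plumb target at a laser-measured 100 m.
factor = tracking_factor(dialed_mils=20.0, shift_cm=198.0, distance_m=100.0)
print(f"scope moves {factor:.2f} mil per mil dialed")   # -> 0.99
# That factor is what a ballistic solver's scope-correction input would be set to.
```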

I applaud the FinnAccuracy guys; they are working towards a standard we all can see and understand. They are giving you the tools to at least follow a logical order. That is better than someone saying they looked out the window of the house and saw a license plate better with Scope A than with Scope B. I do it too, but I never take what I see too much to heart; heck, I at least walk outside and mount the thing in a rest. The best case would be for viewing parties to form, like Ilya has done in the past, where 10 people all reach a consensus on what they saw. But it needs to be done without any input while it is happening. It is too easy to influence people.

It's easy to look through a scope and give an opinion, but I would rather see something concrete behind it. Combine the look of the scope with some solid testing of the mechanics; after all, the glass does nothing on its own, and the turrets are the weak link.