Rifle Scopes: Should We... Review Criteria?

For me personally, the scorecard needs two parts:

The specs and details:
What exactly are we looking at?

The list of criteria covered in the review.

Criteria will have subjective elements, because people don't have machines to put the scopes through a test, so they will be subjective.

But consider that these Review Cards are only for reference, as any review is. How many people look at only one source for data? I look at as many as appear relevant. So with that in mind, we have to accept the fact that home field reviews are subjective.

Let's just assume every review is, and move past that part of it.

If we can boil the criteria down to 10 or so topics and then outline the steps, or at least recommendations, for each, we can move forward with a model. Will the model be the end-all? Absolutely not; it can be a living document that grows and changes over time.

What I would like to see, to help move this along, is a set of bullet-point lists we can refine; as noted, 10 or so areas of consideration.

That might be a better way to look at it: areas of consideration.
Does the website have a database/forms module? Then the guys who are making reviews fill out the form (with definitions/criteria under a (?) button) and the data is saved. A gatekeeper can review submissions and determine the format of the output, or even make it a searchable database.

Open access or members only? As a member you get exclusive access to the most detailed data and reviews, all in one place?
 
Just brainstorming here...
  • Price (simple range, like low, medium, and high)
  • Weight
  • Glass clarity/brightness (this is somewhat subjective, but it's also important)
  • Number of mils/moa of adjustment in one revolution
  • Availability of mounts and accessories??
  • Where scope is made
  • Customer service/support (a simple good or bad rating maybe?)
  • Reticle subtensions? (reticles themselves are too subjective, but most people agree that .2 mil hashmarks are "better" than .5 mil hashmarks)
  • Trying to come up with more.....


Does anyone think tube diameter being larger is a "better" thing? If so, that could be added to the list. Larger is better...
How about internals? Scope X has aluminum internals while Scope Z has steel... ??
 
I may get into the weeds with this next question, but I feel compelled to ask:

If we narrow it to 10 or so variables, how "inclusive" do we want to be with researching what makes the list or doesn't make the list? I could create an anonymous/open survey where we ask Hide members to rack and stack what's most important to them and use that as a foundation to evaluate.
 
[Attached screenshot: example of a ranking survey]



It would look like this, with as many options as we wanted. The survey taker would simply drag and drop by preference, and then Qualtrics would spit out the top 10 once we closed it.
 
Don’t need a Hide survey. Frank’s already got the subjective review criteria laid out:


Fit & Finish
Turret Feel
Click Spacing
Magnification Ring Movement
Eye Box
Mounting
Glass Quality
Light Transmission Dusk
Light Transmission Dawn
Reticle Design / Features
Overall Score

A 1-5 score is plenty since these are all subjective, but 1-10 makes people think they're really homing in on perfection. Oh well, either works. It won't matter once a scope has more than a handful of reviews.
 
People are going to be doing it anyway; some will see it as a way to get attention.

Look at it this way: a lot of companies over the last few years have moved to a competition model. They have invested heavily in the PRS or NRL and now market via PRS influencers or through sponsorships. That is the model; they want to reach a specific class of shooters, in this case potential match shooters or people who follow match shooting. It's an eggs-in-one-basket approach, and now that our activities are being restricted, they are gonna have to change tactics.

These same influencers who were getting free or discounted products to talk about in conjunction with their competition season are gonna be sidelined in some respects. No more "hey, look at my score" to focus on, so the direction is gonna change. How do you hold your spot while we wait for some return to normalcy? By becoming subject matter experts, so to speak. The die is cast, it's just a matter of time.

Fast forward, and we are gonna see more of these pop-ups. We have a ton of scope companies out there, more than ever before. How does Company A separate itself from Company B? Either one drops out faster than the other, or they get creative. The no-money creative approach is to give a product to a future or potential influencer and let them run with it.

This is gonna muddy the waters even more, so creating a simplified standard, or at least an SH-specific program for these reviews, will help wade through the Hype Machine.

If we can hold people to a standard, if we create an easy-to-understand system that covers the basics, we are ahead of the game and can potentially change this part of the sport/industry for the better. Instead of a wild west approach, create a format.

Subjective stuff will never go away, and let's face it, nobody bullet-points a review; they all add in their own opinion. Heck, I do it all the time: in my opinion I see X, where in Ilya's opinion he sees Y. But if we both test tracking and find our results are within 2% of each other, the reader will know it's probably a solid bet. If we both score an optic to within 3 points of each other, you have an idea it's pretty close. What it does is move the outliers to where they belong and give someone who might want to do this a format to follow.

Please don't get into the weeds before a model and method are discussed. "What if the virus explodes because an airplane of infected people blows up over the US and the airborne virus rains down on us" - sure, I can do it too.

Stick to the plan, start at Number 1, AND THEN we can worry about What If or how it will work moving forward. Stop jumping around with nonsense talk.
 
Don't need a Hide survey. Frank's already got the subjective review criteria laid out ...
 
That goes a long way toward making it less of a subjective thumbnail number pulled out of thin air.
 
Looking at the lists so far, 'Reticle' would also be an example where it could potentially break into sub-categories of:

1.) FFP or SFP
2.) # of Reticle Options Avail from Manufacturer
3.) Illuminated or Non-Illuminated
4.) MRAD or MOA
5.) etc.

Would these sub-categories be too much, or would they stay nested under "reticle" as one of the primary 10 for evaluation?
 
We can do 1-10 as the scoring.

I was saying for Glass we should lower it, but maybe split it up instead:

30 Minute Twilight Test: 1-10
Resolution Test: 1-10
Contrast Test: 1-10

Instead of just saying "Glass"...

That's nice and simple and much better than just looking at a resolution chart in good light.

There is only really one thing I would add to this: look at something with small color variations. In the real world, it is kinda like looking at a wall of green foliage or, if in a dry area, at some yellow/brown terrain. Note how much subtle detail you see there. That corresponds surprisingly well to being able to find something that is hard to see in the real world.

It is sort of an extension of a general contrast test, except rather than looking at contrasting colors, you are looking at similar color shades. Scopes that do this well are also good at seeing stuff in the shadows when it is bright everywhere else.

ILya
 
Looking at the lists so far, 'Reticle' would also be an example where it could potentially break into sub-categories ...

Those could be in the “specs” part of the card
 
One of the things I saw someone mention above is ranking based on maximum magnification. I probably would not do that. I think magnification ratio is a better metric, since that directly aids flexibility. To be fair, we kinda start getting into diminishing returns once the ratio gets past 6x or so, but for general-purpose use a larger erector ratio is a good thing to have if other stuff is not compromised too much. The things that are usually compromised are FOV and Depth of Field, so if we keep track of the three we'll isolate the scopes that make the best compromise.

ILya
 
For a compact bullet-point list, one I'd like to see on it is:

  • Eye box 1-10
For reviewing, each bullet point could be defined in more detail, sort of a how-to-rate guide. If I'm understanding correctly, the end goal is to have a simplified, short bullet-point list to reference. It could really help someone narrow down to 1 or 2 scopes, and then they can drill down on the minutiae as much as they want.
 
Gotcha, last few posts are helping me track better on the overall intent.
 
Just brainstorming here ...
I am not sure facts should be submitted per user. For example, if I want to review an S&B PMII 5-25x56, I should find it in a drop-down, and that entry should have the facts (weight, length, does/doesn't have zero stop, elevation/windage range, etc.). If it doesn't exist, then you can click an option to create an entry for it, and it can be reviewed before being added to the list of scopes to review.

Also, there needs to be a discussion about how scopes will be grouped. If you include the reticle in the name, then you can be missing out on the big picture. Let's say you look up the S&B PMII GR2ID and there are 10 reviews. But unbeknownst to you there are 80 reviews for the S&B PMII H2CMR, and all that info is relevant except for the reticle section. S&B is also a good example of what happens when you have different options for turrets (MTC/DT etc.); the glass, parallax, magnification, eyebox, etc. are all the same except for the section on turrets. In other words, instead of having the reviews of the parallax spread out across 6 different scopes (S&B PMII 5-25x56 H2CMR MTC, S&B PMII 5-25x56 GR2ID MTC DT, S&B PMII 5-25x56 P4LF MTC DT, S&B PMII 5-25x56 H2CMR, S&B PMII 5-25x56 Tremor3, S&B PMII 5-25x56 LRR-MIL), they could all be grouped under one S&B PMII 5-25x56, so you don't have to manually look at all of the different models that use the same glass, parallax, magnification adjustment, etc. just to get the larger picture. (I am not 100% sure what is exactly the same between different models of S&B, but you get the idea: depending on how this is implemented, you can be missing data that is specific to the scope you want just because you happened to choose a less popular reticle/turret config.)

It would be nice for the overall score to work like typical reviews: show the overall score, but let me mouse over it and see a banded graph of how many people gave it 80-100, 60-80, 40-60, 20-40, or 0-20. That way I can see if 90% of the reviews are in 60-80 and 10% are in 0-20, which helps give perspective on whether people are trying to tank reviews, or on the rate of lemons for that particular scope. Same thing for the sub-categories: a banded bar-graph breakdown of the number of people that gave (for example) Turret Feel an 8 and how many gave it a 0. Or not; more options isn't always better.
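A rough sketch of that banded breakdown (the function name and the sample scores are made up for illustration):

```python
def band_counts(scores):
    """Count how many overall scores land in each 20-point band."""
    bands = {"0-20": 0, "20-40": 0, "40-60": 0, "60-80": 0, "80-100": 0}
    for s in scores:
        if s < 20:
            bands["0-20"] += 1
        elif s < 40:
            bands["20-40"] += 1
        elif s < 60:
            bands["40-60"] += 1
        elif s < 80:
            bands["60-80"] += 1
        else:
            bands["80-100"] += 1
    return bands

# e.g. one tanked review among mostly 60-80 scores stands out immediately
print(band_counts([72, 68, 75, 12, 81, 66]))
# {'0-20': 1, '20-40': 0, '40-60': 0, '60-80': 4, '80-100': 1}
```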

Format:

Overall Score (Add up Major Categories)
Major Category (Avg of sub categories) - Sub Categories
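As a quick sketch of the format above, with hypothetical category names and sub-scores: each major category is the average of its sub-categories, and the overall score is the sum of the majors.

```python
def score_card(card):
    """card: {major category -> list of 1-10 sub-scores}."""
    majors = {name: sum(subs) / len(subs) for name, subs in card.items()}
    return majors, sum(majors.values())  # overall = sum of the major categories

majors, overall = score_card({
    "Glass":   [8, 7, 9],        # placeholder sub-scores
    "Turrets": [9, 8, 7, 9],
    "Reticle": [7, 8],
})
print(majors, overall)  # Glass 8.0, Turrets 8.25, Reticle 7.5 -> overall 23.75
```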

(I am not going to pretend I know much about glass; this is just for example's sake, and you can add or remove sub-categories, or remove the whole major category altogether if you think it isn't useful.)

Glass - CA, Edge Distortion, Brightness, Clarity, Resolution, (FOV, DOF {are these facts?}), Low Light usefulness?, Ability to cut through mirage? Or ability to see mirage?

Turrets - Crispness(?), Consistency (overshoot because of detents being too deep), Ability to determine which revolution the user is on, Ease of lock/unlock if applicable, Ease of setting zero stop, Grippiness (?, like stippling or cuts to make it slip less, IDK... again, just examples). You can have a rating for the markings (hard to read, easy-to-read 1 mil marks but doesn't have .5 mil marks larger than the .1s, or something, IDK). Tracking error in %.

Reticle - Ability to hold over, Ability to hold wind, Ability to range target (like if it has a section of the reticle with .1 mil hashes or something), Ease of use at high magnification, Ease of use at low magnification, how distracting it is(?), center aim point (like if you highly rate small center dots or prefer open center, this is very subjective and might fall under the ease of use at high/low mag), thickness (also might fall under ease of use at high/low mag).

Parallax - how forgiving it is, or how easy it is to reach proper parallax

Reticle Illumination - Might be more appropriate under the facts section? Auto shutoff goodness (6 min is too short, 2 hours is too long, etc.)? Extra points for off ticks between levels? # of levels? Again, a 6 min auto shutoff and no off position in between are facts, but how much you value these is subjective, sooo IDK.

Diopter - Also a combo of facts and subjectiveness; a locking diopter is a fact, but you can rate how well it locks up (e.g., constantly loosens up). IDK much about diopters, but I assume there is a +/- range and some scopes might have a wider range than others? If true, the range would be a fact, but how much an extra-wide range matters to you is subjective.

Accessories - Sun Shade, ARD, Caps, cleaning kit, tools?

Magnification - Ring Stiffness (Or ease of adjustment?), Maybe how accurate it is to the markings?, Speaking of S&B: Tunneling (Or does this belong with glass), POI change when adjusting? Is no longer parallax free when adjusting by x amount?

Ruggedness - Not sure of a good way to test this aside from like the crazy scope checker thing the BR guys do where they put a frozen scope and a test scope on the dual rail thing and shoot it under recoil and see if the reticle moves compared to the frozen one. Or maybe how well it works under cold conditions? (Mag ring broke at 0 C or something like that). Maybe a checkbox for ever needing to be sent in for warranty work?

Warranty? I am not sure if any companies have a reputation for giving better service for a specific model or what, which means this may not be worth mentioning.

--------------------------------------------------

The biggest problem with a lot of what I mentioned above is that it is difficult to quantify a lot of them. What number between 1-10 do you give to "My TT parallax binds when rings are above 15 inch lbs", or "Mag ring broke when cold", or "Easy to cross-thread zero screws", or "Reticle was crooked" etc.
 
Good point: You’ll get a broader and bigger sample of reviews for a given make/model if you leave reticle out.
 
So my advice, as most of this is subjective, would be to move to Likert 1-7 scores.

So, for example, for turrets it would be...
1 - turret frequently skipped between clicks and was hard to place at location
2 - turret sometimes skipped
3 - turret rarely slipped and/or was mushy
4 - turret tracked within 5% up to 5 mil, clicks were not audible or precise
5 - turret tracked within 2% up to 10 mil, clicks were not audible or clicks were not precise
6 - turret tracked within 2% up to 10 mil, clicks were audible but a bit mushy
7 - turret tracked within 1% up to 10 mil, clicks were audible and precise

Another way would be to define criteria and rate each one.

So for turrets there could be an accuracy scale, an audible scale, and a feel scale.

There will need to be guidance on what a score means. I could think a Vortex Gen 2 or Mk 5 is 10/10 because I am used to a PST Gen 1, but I may have never heard or felt something like a ZCO or S&B turret (assuming they are better).
 
But first, how about the features and criteria that are important to everyone vs. just a few.

You have tracking, which is my #1, but how about:
  • Reticle choice - By this I mean do they offer a wide variety of reticles?
  • Eyebox - No explanation necessary
  • Advanced Zero stop - By this I mean, how difficult is it to set? Does it offer some adjustment range below the stop? One of the things I love about my NF is you can set the adjustment below the ZS wherever you want. This helps with a multi-caliber rifle like my AXMC
  • Turret legibility - Many always talk about turret feel but how legible are the markings and how close together are the clicks?
 
^^^^
Next one up: LOL

Stuff like this, on top of a Review Card for review; we are actually building a standard, a way to look at a product and define it for all.

So there is a guide to follow: what does something mean, definitions that everyone can review or at least understand.
 
We have so many people at home looking for ways to entertain themselves, and I have been thinking about this for a while, so I will throw this out there.

We need a standard by which people look at a new optic and rate it. We have tons of bay window reviewers and guys who want to "look" at scopes and give their opinion of them compared to others.

So why not create a repeatable system, sort of like the Caliber Matrix Brian Whalen did with me at Blue Steel Ranch. I am thinking a 1-10 scale in most cases. I had written this down and have a list I somehow misplaced, but I can start it off until I find it.

So without saying a whole lot to start, I was thinking a 1-10 scale, although I want to do a 1-5 on glass, because too many put too much weight here and I think less will show it's not the most important feature in 2020 and beyond. We have so much better glass now, and almost everyone sources it from the same places, so why bother chasing it, because we know people will.

So off the top of my head, until I find my list:

Turrets
Tracking
Parallax
Finish
Design
Movement (turrets, magnification, parallax)
Glass
Overall

A matrix that can be shared and used to help wade through the host of subjective reviews out there. If we can create a list of features and give them a value, we can then more accurately compare two optics to each other. Because scopes are so subjective, most people tend to go by the look through the scope vs. the core features or overall quality. On top of that, too many want to compare a $1500 scope to a $3000 one, so let's put a real number on it.

What features do we grade 1-10, and then with Glass let's do 1-5. Now with that, attack the questions and flesh out the problem.
FOV is super underrated and people sometimes buy a 3-18 to have better FOV than a 5-25. Yet, if they look at the numbers they might be completely surprised.

I think FOV needs much more attention and needs a place in your matrix.
 
So my advice, as most of this is subjective, would be to move to Likert 1-7 scores ...
I like this.
 
Guys, there have to be two components to this; it's informational after all.

There has to be a quick Spec or Brand section, so we know upfront what exactly we are looking at. Then we have to have the review side of things. However we lay out the specs, maybe it's length, weight, objective, tube size, etc.; there you can address the numbers that are part of the design.

How do we score FOV between a 5-25x and a 3-18x via a number score? Or do you just put FOV @ 100 yards on 10x, or do you give the spec as posted in the manual? Where do we go with these things? FOV is a personal choice, not necessarily a reviewable one. I can't really compare it to another optic beyond saying true or false.

Also, I am not giving 100% complete answers in every reply, so they are spread across the other replies; my first post doesn't mean that is my list, it's just examples of what might be contained in my list.

But we need to establish a General Spec List and then the Criteria for Grading list.

Not everything is graded, and you can't always compare apples to oranges. We want it to be a bit more broad, not something meant to completely replace an actual review. Think about it this way: we come on here and ask as many people as possible for their opinions on a particular product. We get 100 replies as to which is the best; well, how do we weigh these answers, or better, how can we apply a standard to the responses?

Read that again, and then think about a quick review card with a simple 1-10 grading system that gives you a broad overview of the product you are looking at. From there we can insert our own opinions on things, the Philip Factor: I love my scope because everyone on here hates it, I love my scope because it makes my wife jealous. But for my score, I rated it 85 out of 100 points given the Hide factors.

Instead of "the glass is clear as crystal and sharp as a tack," you would instead say:

30 Minute Twilight - 8
Contrast Score - 9
Resolution - 6
Day Chart - 7

Glass Score is 30

Stuff like that. We can make it quick and accessible, and it still tells everyone a story.
 
Guys, there have to be two components to this; it's informational after all ...
I am saying FOV needs to be listed at a minimum.

For hunting it might be critical, but even for shooting matches or whatever it is, one scope feels super generous at 12x and another one feels horribly tight.

The reason I bring this up is that most new people have no clue until they've been through a bunch of scopes, and that doesn't leave many of us.
 
I think this is a great idea and a way to get closer to a standard way to view scopes and their reviews. Does the Hide/scorecard solution that will house the data have the ability to look something like the attached screenshot?

Taking a shot at this; obviously feel free to chime in or make corrections/additions, as I'm sure I missed something.

As far as criteria, if I captured all or most of the past responses, it would be something like:

Overall Score:
###.##

Mechanical Assessment
- Elevation Tracking
- Windage Tracking
- Zero Stop (repeatable?)

Optical Assessment
- Parallax
- Resolution
- Light Transmission Dusk
- Light Transmission Dawn
- Eye box
- CA control
- Depth of Field
- Contrast
- Microcontrast

Ergonomic Assessment
- Turret Feel
- Mag ring ease of operation
- Diopter Adjustment
- Scope Finish
- Click Quality

Other Features
- Illumination brightness
- Illumination color options (#)
- Accessories included (sunshade, mini wrench, flip caps, etc.)
- Reticle options (#)
- Locking Turrets?
- MSRP in $
- # of years scope has been in production
- Scope mount options
- Customer Service
- Warranty
- Country of Origin


The above category names are borrowed from the Precision Rifle Blog, which did a great multi-scope review in 2014. Perhaps we can use those or similar definitions for the criteria we designate, to keep some synergy with that work. I totally agree that criteria definitions are key, so that every reviewer knows how to grade and what aspects they are to be grading. I think the above criteria are the scored options, with the other part being the spec sheet.

Spec sheet (not in a particular order, pulled from the Eurooptic template):

Weight
Length
Mag range
Objective diameter
Tube size
Turret adjustment
Total Elevation Travel
Total Windage Travel
Parallax down to: (#m or #y)
Focal Plane
Adjustments in MRAD or MOA
Reticle in MRAD or MOA
Eye Relief
Illuminated?
Turret Rotation Direction
Field of View
Elevation MILS or MOA per 1 turret rev


The attached image was borrowed from Google, but the layout could be a nice starting template for comparing multiple optics at one time. It would be nice to eventually get the scorecards in an Excel-ish format so the user could filter on scopes that have an 8.5 or greater overall score, or other shared criteria (once we are on our way with the data collection effort).
 

[Attachment: Overall Compare View.jpg]
On FOV: compare them when at the same magnification. Listing the FOV on lowest power is not always useful because some scopes tunnel.

Since magnification ring markings on many scopes are useless, for scopes with fairly constant eye relief a good metric is the calculated FOV at 10x or 20x. Basically, take the FOV at the highest power, multiply by the highest power, and divide by 10 for the calculated 10x FOV. It is not perfect, but it is a consistent metric.
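A quick sketch of that calculation, with made-up numbers for illustration:

```python
def calculated_fov(fov_at_max_power, max_power, reference_power=10):
    """Normalize the published FOV at max power to a common reference magnification."""
    return fov_at_max_power * max_power / reference_power

# e.g. a 5-25x listing 4.8 ft @ 100 yd at 25x -> 12.0 ft calculated FOV at 10x
print(calculated_fov(4.8, 25))  # 12.0
```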

If you want to evaluate whether the scope tunnels, list the highest magnification that achieves max FOV.

ILya
 
4) Apparent FOV plays a huge role in how easy a scope is to get behind.

I like the idea of the apparent FOV.
Some scopes have a wide FOV but still have a tunnel-vision effect, whereas with other scopes the whole eyepiece seems to disappear.
 
It should be Amazon-style, so you can see who added a review and their stats, to eliminate people just voting for what they bought.
 
Reading through this my brain keeps jumping to this video. Reviews might not work the same as he is discussing but still relevant to the world we are in.
 
I think this is a great idea and a way to get closer to a standard way to view scopes and their reviews ...


This is the direction I was envisioning!

Yes, I want to study this more and look at the image. I have a podcast to record this AM, but I will be back.

Nice work.
 
I definitely think we need some form of forum-wide ranking that has some uniformity to it.
I think each person giving their opinion on scopes also needs to give their personal experience with what scopes they have used extensively, because let's be honest, the guy that has only ever used a Barska and then goes and buys an Argos is going to rate it as "the best glass and function" he's ever seen! Cuz it is, at least in his experience.
I know for me personally, before coming on this site, my experience with scopes stopped with a Leupold MK4 and a Springfield Gen3!
I've owned and used a LOT more scopes since then and I've learned quite a lot.
So I agree with @lowlight on a standardized ranking system of some kind.
Too many of us have fallen victim to the over-excited review of a scope that's not nearly as great as the review states. But that could be that reviewer's lack of experience with other higher-end glass. Just my 2c.
 
Finnaccuracy has a method they use for optics testing. Could be an option to use to attach a score to glass quality and maybe remove some subjective aspects?
 
How about value? While value is subjective to the individual, I think we can all agree that there are excellent scopes that cost a ton but offer very little in terms of % "better" compared to another model that costs significantly less. Some have the budget to spend whatever it takes to own the best; others are more interested in the best value to perform task X or Y. I think there are scopes at both ends of the spectrum that offer very poor value, and some that are the exact opposite.
 
I suggest using weight factors to balance the data and make it meaningful.

For example, say you have 5 different categories that deal with optical quality and rate each 1-10, but only have one category for tracking, rated 1-10 (let's say 10 is 100%, 9 is 0.5% error, etc., and 1 is doesn't track correctly at all). When you calculate the overall score for that scope, a scope with great glass and utterly useless turrets would still get a lot of points and could still get a decent overall score, even though it's useless in reality. Turrets have to somewhat work, or the scope is a waste of money.

The solution is to weight the categories, and the weight is a multiplier. Tracking could get a weight of 10, but maybe each optical category only gets a weight of 2. So for a theoretical perfect scope, it gets 100 optical points and 100 turret points, even though 5 categories of optics were evaluated and only 1 category of turret data was evaluated.

The beauty of this is you can allow reviewers to use 1-5 or 1-10, and rate as many categories as is appropriate for each aspect of a scope, without having one category dominate others in importance. The weight factors would be decided beforehand as part of the system, and never change. The reviewer would not be able to change weight factors.

Deciding on what weights to use is the hard part. A good way is to take data from reviews of scopes with a lot of history and fiddle with the weights until the system ranks them in accordance with conventional wisdom. The system can then be applied to new, unknown scopes to see how they compare.
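A rough sketch of the weight-factor idea, with placeholder categories, weights, and scores; the point is that the multiplier, not the number of sub-scores, sets how much each category moves the total:

```python
# Fixed weights decided up front; reviewers can't change them.
WEIGHTS = {"tracking": 10, "glass": 2, "turret_feel": 4}

def weighted_total(ratings):
    """ratings: per-category score (a single value, or a list of 1-10 sub-scores to average)."""
    total = 0.0
    for category, weight in WEIGHTS.items():
        value = ratings[category]
        score = sum(value) / len(value) if isinstance(value, list) else value
        total += weight * score  # the weight, not the sub-score count, sets the influence
    return total

print(weighted_total({"tracking": 9, "glass": [8, 7, 9, 6, 8], "turret_feel": 7}))  # 133.2
```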
 
Since I am only a couple of years removed from an outsider's perspective, having this scope info in a consistent format would have been HUGE for me in properly selecting a scope. When I first visited the Hide in 2017, I got more than a little overwhelmed when trying to select a scope, as everyone seemed to have 102 answers to 100 questions on scopes. Anecdotal opinions will always be there, but to a total newbie like myself it was almost paralyzing, as I had no friends who were experienced rifle shooters. As a result of listening to too many opinions, I ended up with a Nikon X1000 4-16x50, which was a POS. I now have an NF ATACR 4-16 and couldn't be happier.

Suffice it to say, having a resource like what is being suggested will be a godsend to those who, like myself, found themselves drowning in a sea of opinions from people we do not know. Thanks for suggesting this. I am looking forward to seeing the end result. Much appreciated.
 
I think we need to keep it informative but at the same time simplified. I really like what was posted above, but I would make minor changes. Like the Country of Origin: who cares? Half the people lie about it or try to hide it, and not all the parts are sourced from one country or two; some might have 3 sources, so the origin is lame.

But I would do this

Overall Score:
###.##

Mechanical Assessment
Tracking = % - the percentage can have a 1-10 score
Turret Movement, (Click Spacing)
Lock Features (Ease of lock vs unlock)
Reference Alignment (does engraving line up)


Optical Assessment
- Parallax
- Resolution
- Light Transmission LowLight
- Eye box
- CA control
- Depth of Field
- Contrast


Ergonomic Assessment
- Turret overall
- Mag ring ease of operation
- Diopter Adjustment
- Scope Finish

Other Features. Or spec sheet
- Illumination brightness
- Illumination color options (#)
- Accessories included (sunshade, mini wrench, flip caps, etc.)
- MSRP in $
- Scope mount options
- Customer Service
- Warranty

Some things are just repeating the spec as the manufacturers list it; other things can be valued and weighed.

I want it to be practical, not over the top or overly difficult, and it should be informative enough for the basic shooter, not so much as to satisfy the Finnegan out there. If you want to know where your glass is from, look it up; we don't need to go down too many rabbit holes that invite debate or disagreement.
 
We are definitely going to have to somehow quantify the "level" of the scope.
Maybe Entry or Budget, mid-tier, top-tier.
Let's face it, something like a Midas Tac has excellent glass for its price range, but it does not compete with the likes of the $1200-and-up crowd, let alone the $3K bunch.
 
No, we don't have to quantify it.

The scopes are standalone; you are not comparing a Midas to a ZCO, you are looking at the score only.

Again, as noted on the first page, others do this. You can have a list of apples and oranges; the difference is, you go in yourself and compare the oranges and forget the apples. When a shooter knows his budget is $1500, if he is looking at reviews of the ZCO, that is his fault.

We want this to be universal; it should not depend on the price, the brand, or anything else, dollars or sense wise. We test every scope the same way; budget is irrelevant to the information posted.
 
I suggest using weight factors to balance the data and make it meaningful ...
This can be done by averaging the sub-categories, so each major category can only ever be a max of 5 or 10 or whatever, and the overall score is the sum of all the major categories. That way you can add or remove sub-categories and the weight of each one does not change. You can also change the weight of the major categories by changing the max for each, like giving glass a max of 5 so its sub-categories are 1-5, but turret sub-categories are 1-10, etc.
 
Okay here is how I see this card looking,

A listing of the specs can come after the overall score, so I see each card reading:

Scope: Nightforce ATACR 5-25x MOAR Reticle, Price Paid $2800

Mechanical Assessment
Turret Tracking
Turret Click Spacing
Turret Alignment
Illumination

Optical Assessment
Resolution
Contrast
Chromatic Aberrations
Depth of Field
10x FOV
Eyebox
Twilight Transmission
Parallax Adjustment
Reticle Usability

Ergonomic Assessment
Magnification Ring Movement
Parallax knob Movement
Diopter Adjustment
Mounting
Overall Fit & Finish

Final Score:

Full Specs:
[Attached screenshot: full spec sheet]


Something like this, that scores each block 1-10 and then gives an overall score.

Once we have several scopes compiled this way, we can create buckets to place the scopes in via their price point.
 
While I see the point @fdkay made, I would agree that not quantifying the "level" of scope is the way to go, for what I interpret the goal of this to be. I think if the goal is to break scopes out by a certain criterion like price or perceived quality, it could lead to a lot of assumptions that may not paint an unbiased comparison. There will of course be some bias, but the criteria definitions should limit that, and multiple reviews will show a truer score. By keeping all scopes in one big pot and applying the same defined criteria, you hopefully get to really see how scope A compares to scope B, based on the criteria, when their MSRPs are $2,000 apart.
 
This can be done by averaging the sub-categories so each major category can only ever be a max of 5 or 10 or whatever ...

You are correct, there are multiple ways to achieve the same goal. The nice part about the weight factors is that you can change the balance between categories by changing only one number per category. This allows you to easily tweak the various weight numbers with a sample dataset until the desired balance is achieved. Not all categories are as important as others. But exactly how to weight the categories is the most important part of the system, and the hardest decision to make. Whether a scope is a 9 or a 10 optically matters little (despite the endless horse beating online of Schmidt vs Nightforce vs Kahles vs Quigley Ford). Let the reviewers decide that. Also, weight factors can be adjusted after some data is gathered, if it becomes apparent it's necessary.

In developing a system, what does matter is how important glass is vs. turrets vs. reticle, etc. When a scope like the ZCO excels in all areas, of course it's gonna get a great score. We don't need a spreadsheet for that. But the system will shine when comparing mid-level scopes with a lot of tradeoffs, where sometimes the tradeoffs are harder to balance mentally.
 
Here is an example of how weighting will be the tough question. How much weight do you put on features, and how much on price? Because I bet the first time the numbers are crunched, scopes like the Nightforce NXS, with great turrets, good reticles, great durability, and "good enough" glass, are gonna outscore some of the top tier scopes, once price is factored in. And that may not be wrong, but could wrinkle some underpants.

This is why it might be best to leave price out of it.
 
When I was in sales, it was always important to talk about benefits rather than features. A 34mm scope tube is a feature; 25 mils of tracking is a benefit. A good feature produces a benefit, a bad feature produces profits. I wouldn't want a scope with great features to be scored higher than a scope with great benefits.

If we're collecting multiple scores to average them, providing the standard error of the reviews will account for the number of reviews and help mitigate the effects of outlier reviews, or at least give the end user of the data insight into the accuracy of the cumulative reviews.
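A minimal sketch of that, with made-up scores; the standard error shrinks as the review count grows:

```python
import statistics

def review_summary(scores):
    """Mean plus standard error of the mean for a set of overall review scores."""
    mean = statistics.mean(scores)
    se = statistics.stdev(scores) / len(scores) ** 0.5 if len(scores) > 1 else float("nan")
    return mean, se

print(review_summary([82, 76, 88, 79, 85]))  # mean 82, SE about 2.1
```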
 
Is this like an evolving crowdsource project or will you have an "expert panel" of reviewers?

Being able to also compare scopes' hard data would be a good feature.

Wouldn't the criteria be subjective to the reviewer's own opinions, or at least his experience with certain scopes, vs. measurable ones?
On that note, since not everyone can have every brand of scope, the data almost needs to be compiled as comparisons of different scopes.

For example:

Reviewer A compares a Nightforce and a Razor
Reviewer B compares a Razor and a Nikon

One should then be able to deduce how a Nightforce compares to a Nikon. Over time this would build a trend as compared to other scopes.

Versus someone who only has Nightforce giving them all 10s because they think they are the best.
 
Here is an example of how weighting will be the tough question ... This is why it might be best to leave price out of it.
Or just give it a $$ rating like Yelp.
 
I think it's a way to create a standard for anyone looking to "review" their scopes.

This is more to help standardize the process when trying to determine product A vs. product B, even if an individual comes on and does a mini-review, which happens all the time.

It's time to create a standard we can look at, and it's a way to help people identify the important points.

It's more to address the endless questions we get. The fact is, if you own the scope and want to review it using this type of standard, great, add it to the list. If you have 10 customer-side reviews and the scope you are looking to buy averages 7.2 and your second choice averages 8.1, maybe you pick the second choice over the first. This is meant to add balance to the question.

Is it perfect? No. Is it meant to replace a review Ilya might conduct on a scope? No. It's meant as a reference guide that may be user-generated but is held to a standard everyone can understand.

How we absorb information is changing: we skim, we bullet-point, we read headlines more than we tend to study anymore. So changing the way we look at this information, in what I am calling a Review Card format, will help those with short attention spans compare multiple scopes quickly and easily.

We do it every day already; we just do it in an opinion-based format. "Hey, I want to buy a Vortex Razor, what do you all think?" Well, we can add the same opinions we always have, but include a link, or someone can say, "Hey, the SH review average is 8.0." It's just another point of validation.

Think about how many people come on here to validate their purchases. Now we can put a value to it.

Seems odd this is hard to grasp for some; everyone is avoiding what I believe to be the obvious in this. Strange.
 