
Rifle Scopes: Should We, Review Criteria

Lowlight

HMFIC of this Shit
Staff member
Moderator
Supporter
Minuteman
  • Apr 12, 2001
    35,609
    40,130
    Base of the Rockies
    www.snipershide.com
    We have so many people at home looking for ways to entertain themselves, and I have been thinking about this for a while, so I will throw this out there.

    We need a standard by which people look at a new optic and rate it. We have tons of bay window reviewers and guys who want to "look" at scopes and give their opinion of them compared to others.

    So why not create a repeatable system, sort of like the Caliber Matrix Brian Whalen did with me at Blue Steel Ranch. I am thinking a 1-10 scale in most cases. I haven't found where I wrote this down; I have a list and somehow misplaced it, but I can start it off until I find it.

    So without saying a whole lot to start, I was thinking a 1-10 scale, although I want to do a 1-5 on glass, because too many put too much weight here, and I think a smaller scale will show it's not the most important feature in 2020 and beyond. We have so much better glass now, and almost everyone sources it in the same places, so why bother chasing it, because we know people will.

    So off the top of my head until I find my list,

    Turrets
    Tracking
    Parallax
    Finish
    Design
    Movement (turrets, magnification, parallax)
    Glass
    Overall

    A matrix that can be shared and used to help wade through the host of subjective reviews out there. If we can create a list of features and give each a value, we can then more accurately compare two optics to each other. Because scopes are so subjective, most people tend to go by the look through the scope vs the core features or overall quality. On top of that, too many want to compare a $1500 scope to a $3000 one, so let's put a real number on it.

    What features do we grade 1-10, and then with glass let's do 1-5. Now with that, attack the questions and flesh out the problem.
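    A quick sketch of how those numbers might roll up into one overall figure, assuming equal weights and the 1-5 glass scale described above; the criteria names, weights, and rounding are placeholder assumptions, not a settled standard:

```python
# Minimal sketch of rolling criteria scores into one overall number.
# Scales follow the discussion above: everything 1-10 except Glass,
# which is 1-5 so it can't dominate. Equal weights and the rounding
# are assumptions for illustration only.

SCALE_MAX = {
    "Turrets": 10, "Tracking": 10, "Parallax": 10, "Finish": 10,
    "Design": 10, "Movement": 10, "Glass": 5,
}

def overall_score(ratings: dict) -> float:
    """Average each criterion as a fraction of its scale, mapped to a 1-10 number."""
    fractions = [ratings[name] / SCALE_MAX[name] for name in SCALE_MAX]
    return round(10 * sum(fractions) / len(fractions), 1)

example = {"Turrets": 8, "Tracking": 9, "Parallax": 7, "Finish": 8,
           "Design": 7, "Movement": 8, "Glass": 4}
print(overall_score(example))  # 7.9
```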
     
    Are we talking about having a database of sorts, where a member can click on, let's say, a Minox ZP5 and see the ratings other members have given? Because that would be pretty damn awesome! I was a plank owner at this site, so that's the type of review I was thinking of.

     
    Seems like a good idea. I'd be into doing reviews in that format.
     
    That is getting ahead of ourselves, but it can be. If we create a number system, it's easy to compile the Review Card and then host it.

    Can I do that? Yes, but we need criteria first; you are putting the cart before the horse.
     
    I would post the video I am referring to, but it would sidetrack the discussion; everyone will start chasing squirrels and change the subject.

    But what I am envisioning is a Review Card. It's basically a table of the criteria and the score for the scope, so you can easily reference it.
     
    This is not right, but you might get the idea:

    [attached screenshot: rough mock-up of a Review Card]


    We can make Score Cards, or Comparison Cards, super easy and repeatable
     
    Great idea to see trends and similarities in viewpoints.

    A question though. Would there be a way to account for the outliers?

    Meaning people that just don't give a fair evaluation.

    Or are you thinking more along the lines of say a group of testers to keep things consistent?

    If a group is the concept it makes a little more sense.

    Manufacturers, dealers, sponsored shooters should possibly be DQ'd... maybe. Unless they have a track record that shows no bias.
     
    This, or that, is why the criteria have to be fleshed out, as much of this is subjective to begin with.

    Nobody has machines to test with, and really the only thing a person can test is tracking. Tracking should be Pass/Fail with a percentage of movement.

    The rest is certainly going to be swayed, which is why I say, for glass as an example, we use 1-5 vs 1-10, so the "clear as crystal, sharp as a tack" crowd can give their number and not necessarily push the score over the top.

    This would blunt the Philips of the group who only score an IOR on glass and nothing else. Maybe we can include a "group score" as things move forward.

    By group score I am thinking an offset number, a balancing factor for the outliers.

    This will never be perfect but it's not meant to be, it's a quick reference for some, and a way to follow a standard for others.
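    For what it's worth, one simple way that balancing factor could work is a trimmed mean; a sketch, under the assumption we just drop the extremes rather than try to model reviewer bias:

```python
# One possible "group score": drop the highest and lowest ratings before
# averaging, so a single outlier reviewer can't drag the number far.
# A sketch of a trimmed mean, not a settled method.

def group_score(scores: list[float], trim: int = 1) -> float:
    """Discard `trim` scores from each end, then average the rest."""
    if len(scores) <= 2 * trim:
        return round(sum(scores) / len(scores), 1)  # too few ratings to trim
    kept = sorted(scores)[trim:-trim]
    return round(sum(kept) / len(kept), 1)

print(group_score([7, 8, 8, 9, 3]))  # 7.7 -- the lone 3 barely moves it
```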
     
    @lowlight
    If this takes off and you want to do it with a tracking fixture to hold the scopes, I may be able to fabricate a few if needed.
     
    I think we should just kidnap @koshkin and force him to evaluate every scope fairly.

    Otherwise retards like me might sway the results. I won't input on the results beyond this though. Lol.

    It's a great idea though
     
    Also, we probably need a timestamp or something (year of the review?), so that a scope can be weighted correctly against scopes available at the time of the review/grading, and not have a scope from 5 years ago graded against some current-year scope with newer tech.
     
    Along these lines there is a newer car review channel on YouTube called Grand Test Auto. Might give some ideas of what this ends up like?
     
    Needs a column to identify if they are poor or not.

    Makes a huge difference in regards to perception.

    ETA: OK, let me clear this up a little. Yes, I was serious, but it was also supposed to help, lol.

    What I was suggesting was some sort of basis as to what they normally use, what they were using prior to reviewing the new optic, and their usual budget, because a review from a guy who usually uses Barska optics on his shiny new Leupold Mk4 would be a hell of a lot different than from someone who has 10 NF/SB optics and reviews it. You need some sort of reviewer baseline.
     
    For the glass, you could use a resolving test: how fine a line spacing can it resolve at X power and at X distance?
    Some of the places that review cameras and lenses have some detailed test targets that let you compare resolving power and clarity as well as changes in colour hue.

    If you picked a standardized one to test all the scopes on, then you could be a bit closer on comparing that part.
     
    Guys, this is not hard,

    Why not focus on the features and criteria? Once we have that, I can do the rest and you can see what I mean.

    But try to focus. Ilya is not gonna be able to do 1/4 of what is out there. Every day people talk about which scope to buy; let's help this process by focusing on a solution to wade through it.

    Figure, with everyone having a video camera on their phone and being home, working from home, social distancing, this is going to open things up to even more people doing reviews. Why not beat the rush and put something in place?

    Features and Criteria to Score:
     
    @lowlight, you might consider throwing a PM out to @koshkin, @wjm308, and @BigJimFish and have them collectively provide some guidance or feedback here. I think you're on an excellent track to get a standard review format laid out for the rest of us to follow, but those three especially are guys who know quite a bit or have a lot of hands-on experience with a slew of optics. If you had them hash out some details (assuming they're willing to do so), I think you'd wind up with a pretty solid foundation to build off of. I'd love to help as well (admitted optics whore), but I'll freely admit I'm not in the same class as those three gentlemen. I don't mean to suggest they're the only ones to DO the reviews, but rather that, with their experience, they can help you narrow down which criteria are most important, as well as a good way to rate them. Once you have those details, sticky it in the Optics section as a "This is the Official SH Optic Review Format" and then let us mouth-breathers give it a try. :ROFLMAO: Between those three and yourself, I think a fantastic baseline for reviews could be established here!
     
    For what it's worth, I think a numerical rating system would be a smart way to go. A lot of the same questions get asked for every optic that's out there, and if people don't search for a particular scope thread, because they don't know any better, the experienced people have to re-explain what they've explained hundreds of times before. Redundancy abounds. Some people, me included, when something comes to mind or we're interested in a certain optic, or anything for that matter, just create a new thread and start the process all over again, then get the "stink eye" from the experienced. Sounds to me like a great way to get the basic pluses and minuses out there; then if someone has a more nit-picky question, they can ask it. Mac (y)
     
    I get that,

    But first, how about the features and criteria that are important to everyone vs just a few?

    We have a huge resource here and still we are wasting it by changing the subject.

    Why make it just what I or Ilya say is important? I can roll out an "official review" on my own, but the more hands in it, the better it might get accepted.

    But if you can't do it, I will just do it myself. I figured everyone being home would enjoy putting this together, but every post is deflecting to someone else to do it.

    noted,
     
    I believe that in addition to a raw score, there should be a place for a sentence or two describing why. However, there is a need to keep it simple and concise as well. Sadly, what's good for one may not be good for another, due to differences in eyes/intended use.

    I do like the idea, because we all simply don't have the opportunity to get behind all the options available to us.

    I have cheap scopes, a couple Gen II Razors, and a couple in the middle. I have not been behind an S&B or ZCO.
    I'm just trying to wrap my head around a baseline scope that has all the necessary features, maybe not the best in all categories.

    Too many boxes to check could muddy the field as well.

    Let's do this!

    Reticle Options/ Evaluation?

    Value/Price or perhaps broken down by MSRP into broad categories
     
    I created a rough matrix in Excel and included a PDF here: https://www.snipershide.com/shooting/threads/ultimate-long-range-hunting-scope.6967966/post-8064849. Don't know if this helps but it provides the specifications of some scopes in a specific range.


    This is a good reference, but I only see specs there. Putting the specs in one place is handy, but it's the same problem.

    A guy wants to know Scope A or Scope B; the specs say what the physical or feature differences might be, including price, but that does not solve this issue.

    We can do that in the title:

    NF ATACR 5-25x = Average Retail: $3100

    Putting a few spec details into the card would be good, but I see it like a Player Card: details for sure, but with a review score.
     
    What a great idea. The system could accumulate the total number of individual ratings for each scope, so the reader can discount a score that has only a few ratings and put more weight on a score with a higher number of ratings.
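    One common way to bake that discounting into the number itself is to shrink thinly rated scopes toward the site-wide average; a sketch, where the strength of the pull is an arbitrary assumption:

```python
# Sketch of discounting scores with few ratings: shrink each scope's
# average toward the site-wide mean, with the pull weakening as ratings
# accumulate. PRIOR_WEIGHT is an arbitrary assumption.

PRIOR_WEIGHT = 10  # roughly, ratings needed before a scope's own average dominates

def weighted_score(scope_avg: float, n_ratings: int, site_avg: float) -> float:
    w = n_ratings / (n_ratings + PRIOR_WEIGHT)
    return round(w * scope_avg + (1 - w) * site_avg, 2)

print(weighted_score(9.5, 2, 7.0))    # 7.42 -- two glowing reviews carry little weight
print(weighted_score(9.5, 200, 7.0))  # 9.38 -- many reviews, the score stands
```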
     
    (Quoting @lowlight's post above: "This is a good reference, but I only see specs there … I see it like a Player Card: details for sure, but with a review score.")
    Understood. I am relatively new with scopes so I would not be able to provide scores accurately. I could expand the list on the spec side if you would like.

    Even though I am in a "stay at home" state, I have been deemed "essential" so I am working but can squeeze the research in at night.
     
    Those eight criteria would seem like enough to start. I can't really think of what else would be relevant. Durability, perhaps, but impossible to really test.
     
    Should there be something used as a baseline for each category? For example, use one brand and model of scope as the baseline: Scope X has Glass: 4, Turrets: 8, Tracking: 10, etc. Then you can make sure evaluations are more consistent across reviewers. Just a thought.
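    A sketch of how that could work in practice, assuming we rescale each reviewer's numbers so their ratings of the agreed baseline scope land on agreed reference values (all numbers below are made up for illustration):

```python
# Sketch of anchoring reviewers to a shared baseline scope: rescale a
# reviewer's numbers so their ratings of the baseline land on the agreed
# reference values. All values here are made up for illustration.

BASELINE_REF = {"Glass": 4, "Turrets": 8, "Tracking": 10}  # agreed scores for "Scope X"

def normalize(reviewer_scores: dict, reviewer_baseline: dict) -> dict:
    """Scale each criterion by (reference / reviewer's baseline rating), capped at 10."""
    return {
        crit: round(min(10, score * BASELINE_REF[crit] / reviewer_baseline[crit]), 1)
        for crit, score in reviewer_scores.items()
    }

# A harsh grader rated the baseline Scope X at {Glass: 3, Turrets: 6, Tracking: 8}
adjusted = normalize({"Glass": 3, "Turrets": 7, "Tracking": 9},
                     {"Glass": 3, "Turrets": 6, "Tracking": 8})
print(adjusted)  # {'Glass': 4.0, 'Turrets': 9.3, 'Tracking': 10}
```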
     
    Perhaps some performance criteria, such as light gathering. But some standard would be needed, as demonstrated by a recent thread on when the sun went down.
     
    Wow,

    Still nothing; keep wasting time talking about shit for down the road, when a bunch of reviews are in, versus day one.

    We can't do anything until the card is made; without criteria for the card, all this extra is meaningless.

    Look at the softball review posted above,

    [attached screenshots: the ratings section from the softball review referenced above]


    See the Ratings? We need this before any of the other stuff; pages, handling, etc., come after.
     
    • Like
    Reactions: drglock and D_TROS
    Possible additional things to rank:
    1. Weight? Or weight/max power to get an even playing field.
    2. Mils or MOA per revolution?
    3. Size of eye box: set standards for measuring at 15x or 12x rather than max mag, or at 75% of the scope's max magnification setting. This is measuring the max and min eye relief needed for said set magnification.
    4. Company customer service? (maybe warranty?)
     
    Not trying to be stupid, I got lost with a lot of posts going in different directions. But simply, you want to make a cheat card, like Athlon Ares ETR: Glass 4, Turret 5, etc., so that people can reference that place to find scopes rather than asking. Now, Lowlight, do you want input from each individual on ratings? Or rather everyone come together to rate a scope and input it in the card?
     
    If we agree to become reviewers, when can we expect scopes to ship? :D
     
    Yes,

    A quick reference card or page with the most important information at your fingertips,

    Nightforce ATACR 5-25x
    Reticle
    Magnification
    Size
    Weight
    Parallax Close Focus


    We can front-load the important details and then, under the card:

    Reviewer Score

    Fit & Finish
    Turret Feel
    Click Spacing
    Magnification Ring Movement
    Eye Box
    Mounting
    Glass Quality
    Light Transmission Dusk
    Light Transmission Dawn
    Reticle Design / Features
    Overall Score

    Stuff like that, understanding we'd have to explain it
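    If this ends up as a page or database, the card might map to a record like the sketch below; the field names follow the lists above, but the types and example values are assumptions for illustration, not a finished schema:

```python
# Rough shape of a Review Card record: specs front-loaded, reviewer
# scores underneath. Field names follow the lists above; the types and
# example values are illustrative assumptions, not a finished schema.

from dataclasses import dataclass, field

@dataclass
class ReviewCard:
    # Front-loaded details
    model: str
    reticle: str
    magnification: str
    weight_oz: float
    parallax_close_yds: int
    # Reviewer scores (1-10, glass 1-5 per the earlier discussion)
    scores: dict = field(default_factory=dict)
    overall: float = 0.0

card = ReviewCard(
    model="Nightforce ATACR 5-25x56",
    reticle="Mil-XT",
    magnification="5-25x",
    weight_oz=38.0,
    parallax_close_yds=45,
    scores={"Fit & Finish": 9, "Turret Feel": 9, "Glass Quality": 4},
    overall=8.8,
)
print(card.model, card.overall)
```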
     
    I think that would be a great idea, a great place to make it simple for people to compare scopes rated by real-life people. Are you going to design a special area for that input?
     
    (Quoting @lowlight's earlier reply: "But first, how about the features and criteria that are important to everyone vs just a few … every post is deflecting to someone else to do it.")

    Gotcha. I think I misunderstood your intent, so my apologies there, but I'm tracking now.

    For me, I'd rate things (in order) as follows:

    -Tracking (Accuracy, repeatability)
    -Parallax (how high/low it goes, how forgiving, some may say accuracy of markings but I very rarely rely on those myself)
    -Turret feel/design (how many clicks per turn, rev indicator, translating or stationary, etc.)
    -Zero Stop (how it functions, ease of setting up, can it go X number of clicks below)
    -Controls overall (feel, function of mag ring, illumination, etc.)
    -Design (form factor/size/weight)
    -Reticle options (subtensions, hold-overs, thickness, illumination quality, etc.)
    -Eye Focus (ease of use, locking/non-locking)
    -Glass (CA control/clarity/color)
    -Optic overall (everything taken together, total perceived value/return on investment, etc.)

    Is that more of what you're looking for?
     

    ^Awesome above. I'd add in some random things off the top of my head for consideration:

    1.) Customer Support/Maintenance Turnaround by manufacturer
    2.) Resale Value
    3.) Color Options from Manufacturer

    #1 may not generate a lot of data, but when shit breaks, people want to know how it is handled.
     

    Good points. Resale will be hard to quantify, as the market fluctuates wildly. New optic out? Old stuff takes a dive. Somebody holds a fire sale for stupid low prices? Market takes a dive. And so on and so forth, but I guess a general statement might be useful somewhat.

    The CS/warranty comment is definitely helpful, however. I realize the ol' Tommy Boy "We could take a shit in a box and put a warranty on it" applies, but EVERY manufacturer will have to deal with failed gear at some point. How they handle it has a pretty big effect on perceived value.
     
    (Quoting @lowlight's earlier post: "Guys, this is not hard … Features and Criteria to Score:")
    I like your list. Maybe add something with dimensional pros and cons, like
    eye relief
    front-back movement available

    Could be sub-categories for the different features and criteria.

    Guys can get in the weeds via subcategories or keep it general.

    Requirement: you had to have it in hand or use it in order to review it.
     
    Requirement: you had to have it in hand or use it in order to review it.

    ^This. In a perfect world we'd collect how long the reviewer owned it or how many rounds they'd put through it. But I digress.
     
    I like the idea, and think that with the baseline features mentioned in the OP above, this could be really successful. A potential pitfall to avoid might be a pure "listing" of the features without a uniform "value" in terms of rating methodology for each feature (e.g., what would make turrets score a "10" as opposed to a "1" on the scale; what are the evaluation criteria?).

    I do think that it's hard with optics to come up with a "scorecard" that can be universally applied without getting too deep into the parameter "weeds".

    @chrome's #1 is actually an important metric in my mind, because it "encourages" manufacturers not to have poor customer service (I might be biased right now though, as I am about to send an optic in for repair)

    Maybe adding an "eye relief at minimum magnification", "eye relief at maximum magnification", as well as eyebox at min/max on a 1-10 scale could be useful to many?
     
    I threw something together real quick. Each category has a note in parentheses indicating whether it is quantifiable, relative, or subjective. I would add some different ISO views and then the specs. Also on the card would be the overall score.

    [attached image: draft scorecard mock-up (USO.jpg)]
     
    Generally, I like the idea, but as Frank correctly pointed out, there is no way I can do it myself. This will take a significant amount of effort and I am kinda swamped already.

    I'll be happy to chime in with my take on different scopes and their qualities, but I can't be driving this. This whole coronavirus thing has a strongly detrimental effect on the economy overall, but for my dayjob it only adds work (by the way, I am looking to hire a mechanical engineer in the Dallas area if you know anyone interested).

    As far as the criteria to look at go, I think most of the suggestions above are quite reasonable. A few comments I would make are as follows:

    1) With optical performance, do not concentrate on resolution too much. Nowadays, a lot of scopes have very good resolution in good light with good air conditions, but once the light gets iffy or the air gets turbulent, things change. Contrast and microcontrast are very important and become prominent when the conditions get worse

    2) Turret feel is very subjective; rather, think of it as click quality. I use a simpler criterion: how easy is it for me to run the turret without skipping or miscounting clicks, without looking at it?

    3) Depth of Field and Field of View are easy to eyeball and are important.

    4) Apparent FOV plays a huge role in how easy a scope is to get behind.

    5) Light transmission and, often confused for it, image brightness, with modern optics, are absolutely useless criteria unless you are trying to market a scope. For marketing they are great.

    ILya
     
    @DustBun I think that it's difficult to create a universal/standardized "scorecard" when mixing subjectively or contextually scored categories with quantitatively scored ones. I would argue that there must be a baseline/benchmark set for each category.

    Once the categories are finalized, I think it's imperative to create scoring criteria and explain the methodology behind the criteria for each variable. This is very common practice in academia, as well as in science/engineering-related fields, to keep testing consistent and confirm/reconfirm prior testing results.
     
    Criterion (scoring type) | Scope A | Scope B | Scope C | etc.
    Tracking (quantitative)
    Parallax (quantitative)
    Turret Feel (qualitative)
    Zero Stop (quantitative)
    Controls (qualitative)
    Weight (quantitative)
    Reticle Options (quantitative)
    Diopter Ease (qualitative)
    Glass Quality (both qualitative/quantitative)
    Price (quantitative)
    Average Resale Value (quantitative)
    Customer Support (or Warranty?) (qualitative/quantitative)
    Availability (quantitative)
    Aesthetics (qualitative)
    Robustness (qualitative)
    Depth of Field (quantitative)
    Magnification Range (quantitative)
    Tool-Less Turrets (Y/N)
    FOV @ X yds, Y yds, Z yds (quantitative)
     
    Now we are getting it

    Sorry I had a podcast to record,

    But this is more like what I am talking about.

    Yes, for resolution, I personally think we can take a USAF chart and modify it into a basic, and I mean basic, simplified resolution chart for people to use. On the podcast we were talking optics today, and it was mentioned that one of the best things to use is a $1 bill.

    I think we can make a 10-point resolution chart similar to the USAF one and put "bars" on it; how many bars you can resolve might be a good way to throw that into the Review Card.

    10 bars that can be put at 100 yards and resolved down; with Photoshop I can make some damn small lines.
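    For rough sizing, here is a sketch of the math, assuming the bars shrink by a cube-root-of-two step per level; the starting width and the progression are assumptions, not a finished target design. At 100 yards, 1 MOA subtends about 1.047 inches:

```python
# Rough sizing for a 10-level bar chart at 100 yards, loosely modeled on
# USAF-style targets. Assumptions: line width starts at 0.5 MOA and
# shrinks by a factor of 2**(1/3) per level; 1 MOA subtends ~1.047 in
# at 100 yards. A starting point, not a finished target design.

START_MOA = 0.5
STEP = 2 ** (1 / 3)  # each level is ~21% finer than the last

for level in range(1, 11):
    moa = START_MOA / STEP ** (level - 1)
    inches = moa * 1.047
    print(f"Level {level:2d}: {moa:.3f} MOA  ({inches:.4f} in at 100 yd)")

# Level 10 works out to ~0.063 MOA lines, which should separate the very
# best glass from the rest at 100 yards.
```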
     
    (Quoting the post above: "I think that it's difficult to create a universal/standardized 'scorecard' when mixing subjectively or contextually scored categories with quantitatively scored ones…")
    As an Engineer, I completely agree with you. I was actually going to work on some criteria and standards this weekend. I was just trying to get something going here.
     
    Do we envision the scorecard being limited to a certain number of variables? If yes, thoughts on how we would narrow them down?
     
    For me personally the scorecard needs two parts,

    The Specs and details,
    What exactly are we looking at

    The list of criteria covered in the review,

    Criteria will have subjective elements, because people don't have machines to put the scopes through a test.

    But consider that these Review Cards are only for reference, just as any review is only for reference. How many people look at only one source for data? I look at as many as appear relevant. So with that in mind, we have to accept that home-field reviews are subjective.

    Let's just assume every review is, and move past that part of it.

    If we can boil the criteria down to 10 or so topics and then outline the steps, or at least recommendations, for each, we can move forward with a model. Will the model be the end-all? Absolutely not; it can be a living document that grows and changes over time.

    What I would like to see, to help move this along, is a set of bullet-point lists we can refine; as noted, 10 or so areas of consideration.

    That might be a better way to look at it: areas of consideration.