
Precision Rifle Gear: New Athlon Rangecraft Chronograph - Garmin Xero Killer?

I work in data analysis, and the obsession with extreme spread is strange to me in this context.

ES is just an equation; it’s not a measure of accuracy. At best, it’s a proxy for consistency.

Case in point, you can have a single digit ES, and still miss your target entirely for a multitude of reasons.

What’s the proper statistical design for multiple, concurrent variables which may or may not have an association of causality?

Multivariate analysis?
 
That depends: are we talking real statistics, or the kind where someone says “my buddy chrono’d it once and it was sub-MOA”?

We already have an abundance of Bubbasayso here 🤣

Honestly, I don’t even know if our results follow a true normal distribution, or if we have the proper resolution to document some of our interventions 🤣

How does one mix multiple, graduated and single “I did this” interventions?

How does one properly define group size / scatter?

I really dunno…
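To put a number on the ES-vs-SD point: a quick stdlib-only simulation (all velocities made up - 2800 fps mean, 10 fps true SD) shows that ES keeps growing as the string gets longer, while the sample SD settles toward the true value. That's exactly why a single-digit ES on a short string tells you very little.

```python
import random
import statistics

random.seed(42)

MU, SIGMA = 2800.0, 10.0  # made-up mean velocity (fps) and true SD

def string_stats(n):
    """Fire n simulated shots; return (extreme spread, sample SD)."""
    shots = [random.gauss(MU, SIGMA) for _ in range(n)]
    return max(shots) - min(shots), statistics.stdev(shots)

for n in (3, 5, 10, 30, 100):
    es, sd = string_stats(n)
    print(f"n={n:3d}  ES={es:5.1f} fps  SD={sd:4.1f} fps")
```

Run it a few times with different seeds and the pattern holds: ES is a function of how many shots you took, SD is a property of the ammo/rifle.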
 
At least we’re honest about the challenges.

A multivariate model would be a good direction to move towards, since we’re trying to determine which factors may or may not cause one another, or just the end outcome.

Either way I still believe SD is of lower importance in the bigger picture.

I shot a 5-shot group with factory Federal Gold Medal Sierra MatchKings, with a 5.9 SD.

That came out of a 20” fluted barrel that already showed signs of too much pressure after a 3-shot string.

Accurate enough and will be used for hunting, but clearly not a good rifle to compete with, regardless of the elusive single digit SD.
 
I'm not a big "put another frakin app on your phone" guy.
MacBook here, and the app connects directly to the laptop where I'll actually use the data; no need to hop the data from phone to laptop.

I managed to get in some rounds with an air gun on mine today, real range will still have to wait.

This is the first chrono I've ever owned that reliably tracks low-speed .177 projectiles without fussy setup and numerous test shots.

Out of the box and set up a quick start session in < 5 minutes. Ran two sessions of 5 shots each and it didn't miss a beat.

The controls are intuitive enough that no manual reference was needed to sort it out.

I'm still interested in following up on the fps differences I'm seeing in some of the videos but ease of setup, laptop connectivity and (so far) reliability makes it a win for me.
 
Did you get the Athlon ballistics app from the Mac App Store, then just use the laptop's Bluetooth to sync? That's pretty slick. Didn't even consider doing that..
 
Did you get ...
App from the App Store; Bluetooth, yes.

The setup instructions call it the "Lite" app but it doesn't say "Lite" in the app store.

Also, even when you open it on the laptop it stays tiny phone sized :ROFLMAO:

BUT! ... it works. Sessions can be renamed, shots and projectiles can be edited. I find the app itself a little clunky to navigate but I've seen worse.
 
I saw that as well. Also saw a search option for Mac-specific apps, instead of "iOS". Learned something new: a Mac can run iPhone/iPad apps that weren't originally made for running on a Mac. That explains the tiny app screen size.
 
From what I saw, the Athlon is reading higher (about 20 fps) than other units.
Even if that’s the case, it’s a whole lot better than my Caldwell. It reads 70-100 fps high. I compared it directly with a used Magnetospeed I bought, and damn it reads high. I might throw the Magnetospeed on with the Athlon to directly compare, but the Athlon gave very similar numbers. Whole lot better than the Caldwell.

I’m not a huge fan of having to back clear out of a string to start a new string, but that’s really the only major complaint I’ve had so far.

On a total side note, I ran some 115 gr PMC vs 147 AE from my PX4 while I was still at the range. PMC is loaded crazy light (~1050 fps) while the 3.3” barrel is getting box velocity from AE 147’s.
 
I have yet to get my phone to connect to the device.
 
I have yet to get my phone to connect to the device.
The app's design is squirrelly. Not sure if you figured that out yet.

Apologies in advance if this is all obvious to you. Not judging just detailing what works for me.

ETA: check in the settings on the radar if "connection" is "on" ... mine was on by default

The steps to connect from the app are

click the hamburger at the top right

select "connect device" - you should see two things "athlon thermals" with "connect" under it and "velocity pro radar" with "enter" under it

click "enter" under "velocity pro radar"

on mine there's a "reconnect" button that's grayed out but yours should have a "connect" button IIRC

I have no idea why there are two different fkn connect buttons, I clicked the "connect" button under thermals like 3 times before I decided to see what was on the other side of "enter" under radar

 
Ok thanks. I think that’s what I was doing, however I was more concerned with actually shooting that day. I’ll mess with it more when I’m not actually trying to do something semi important. I do, however, believe I see how it’s slightly squirrelly.
 
Nitpicking that Athlon Rangecraft app and maybe the Rangecraft firmware.

The Rangecraft chronograph itself I'm happy with in terms of performance and ease of use.

Nits

The xlsx data exported includes a time for each shot at minute resolution, but does not include a date, which might be useful. I'm assuming the Rangecraft firmware would be responsible for recording the date. It wouldn't need to be a date for each shot; even a single date somewhere in the file would be useful. The date of the session is clearly visible in the chronograph session history; it just never makes it from the chronograph to the exported data.

The app is kludgy to navigate for Rangecraft purposes. The ballistic portion of the app is front stage with the chronograph portion basically spackled into it.

The default session name for every rifle session in the app is just "RIFLE". It would be nice if the default name included a unique identifier but for this to really matter the export function needs to be improved. This is the same for "PISTOL" and all the rest.

The export function. Every session exported defaults to the exact same name, "output.xlsx", not even the default session name in the app. Not "Rifle", not "Pistol", and absolutely no unique identifier, just "output.xlsx". Every session exported has to be individually renamed as it's being exported EVEN THOUGH the app gives you the ability to rename each session inside the app.

The session name isn't even included in the exported spreadsheet. It's as if the ability to rename the session in the app serves no purpose whatsoever. Including the session name from the app in the spreadsheet might be too much to ask since the xlsx is probably generated on the chronograph.
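For what it's worth, the missing date and the generic "output.xlsx" name are easy to patch over on the computer side. A sketch (the session name and date arguments are whatever you logged yourself - the export carries neither, and this renaming scheme is my own invention, not anything the app does):

```python
from datetime import date
from pathlib import Path

def rename_export(path: Path, session_name: str, session_date: date) -> Path:
    """Give a generic 'output.xlsx' export a unique, dated filename.

    The session name and date come from your own notes -- the exported
    file itself carries neither (hypothetical workaround).
    """
    # replace anything that isn't filename-safe with underscores
    safe = "".join(c if c.isalnum() or c in "-_" else "_" for c in session_name)
    target = path.with_name(f"{session_date:%Y-%m-%d}_{safe}.xlsx")
    return path.rename(target)
```

So `output.xlsx` for a rifle session shot on May 27 becomes something like `2025-05-27_Rifle_6_5CM.xlsx`, and nothing gets silently overwritten by the next export.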
 
After many failed attempts over multiple days and system resets, one of my Athlons will connect, but simply will NOT sync sessions to my phone. That's been kind of frustrating for me - especially when I'm trying to compile data from several chronographs at the same time and have to do manual data entry for the simple fact that one unit will not sync.

I've also experienced co-channel interference with all 3 of the Athlons I have used - I'm not sure if they're accepting NEAR-channel interference instead of co-channel, but it's strange that I've had such bad luck in getting 3 out of 3 to interfere, whereas I've not had that experience with Garmins or LabRadar LX's as pervasively as I do with the Athlons (you can see the 1400fps reading on the bottom left Garmin - that ONLY happens when I have co-channel interference).

It CAN happen to any of them - the V1 and the VelociRadar pictured here interfere both directions with the 2 Athlons pictured (interfering 1 each), but acoustic triggers (which the V1 and VR are using) hide the interference better than radar triggered units (which the Athlons use). Eliminating the co-channel units, the Athlons operate perfectly fine and offer more consistent readings with the rest of the units.

[Attached photo: the chronograph lineup, with the bottom-left Garmin showing a 1400 fps interference reading]
 

I think you have officially gone too far.
 
I think you have officially gone too far.

Eh, folks wanna know stuff, and I like experimenting when I have time and ability. I kinda accidentally ended up with a bunch of chronos, 8 units of 7 different make/model, folks asked questions, so I designed some experiments for comparison - then the preliminary testing revealed additional testing to be done, especially with multiple units of the same brand, and someone had a VelociRadar I could borrow, so I actually have 11 chronos of 8 different models right now to do a lot of comparison.

But yeah, it's a rather large pain in the ass that I can't run all 8 radar units concurrently due to the near/co-channel interferences. So I have to pick and choose which can be concurrently operated and then design the tests around minimizing the round count (running 100 round strings across 11 units, even in pairs, is a LOT of rounds). But I'm trying to answer questions which different folks have had about them.
 
sync sessions to my phone

Apple or Android? Mostly just curious about this point.

I'm running the app on my MacBook and it "syncs" between the chronograph and the app on the laptop, if that's what you mean. All the sessions are visible in the app and I can rename them and mess with them to the limits of what the app allows. The new names haven't yet been synced back to the chrono, that I've noticed.

Downloading the sessions to my laptop for use by something else like a spreadsheet app is not accomplished with "sync". Each file has to be saved to the device individually renaming each one in the process.

co-channel interference

Yes. The video of the interview with Dustin Harding of Athlon linked by someone in a previous post addressed this but Dustin dodged the question. Basically, by not directly answering the question, Dustin confirmed the Rangecraft is not as good as other devices at filtering out competing signals from other radars. Maybe it will be addressed in a future update, maybe not.

Athlons operate perfectly fine and offer more consistent readings with the rest of the units

Interesting you got this result. It seems to differ from some other results.

I'm having a hard time spotting the Magnetospeed. Is it wearing grass camo?
 
What are the units second from the bottom?
 
Apple or Android?

I use iOS.

I'm running the app on my macbook and it "syncs" between the chronograph and the app on the laptop if that's what you mean. All the sessions are visible in the app and I can rename them and mess with them to the limits of what the app allows. The new names haven't yet been synced back to the chrono that I have noticed.

When opening the "manage tab" of each Athlon Velocity Pro in the app - on iOS - the units generally automatically sync sessions into the phone. When this doesn't happen automatically, typically there is a "sync sessions" button at the bottom of the app interface page. One of my Athlons will readily do that transfer, or will accept the instruction to sync. My second unit cannot be made to sync - it spins the wheel of death for hours without syncing. I'm going to do another hard reboot and reinstall the updated firmware and hope it clears, now that I have extracted the data manually, but I have not had luck so far.

Downloading the sessions to my laptop for use by something else like a spreadsheet app is not accomplished with "sync". Each file has to be saved to the device individually renaming each one in the process.

I can't speak to the mac based App, but once sessions are synced from the unit to the phone, they can be exported as .csv, at which point the iOS interface allows the destination of the export to be an email (or any of a dozen other options, such as opening with specific spreadsheet apps or saving as pdf, or opening with other apps), so I simply export to email and then open the .csv's in Excel. Other than the fact they all export as individual files without descriptive filenames, this works quickly and easily.
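Since every file exports individually without a descriptive name, a small merge script helps on the computer side. A hedged sketch (stdlib only; the real export's column layout is assumed, not verified - it just tags every row with its source filename so the sessions stay distinguishable after merging):

```python
import csv
from pathlib import Path

def merge_session_csvs(folder: Path, out_path: Path) -> int:
    """Combine per-session CSV exports into one file, tagging each row
    with its source filename, since the exports themselves carry no
    session name. Returns the number of data rows written.
    """
    rows_written = 0
    with out_path.open("w", newline="") as out:
        writer = None
        for src in sorted(folder.glob("*.csv")):
            if src.resolve() == out_path.resolve():
                continue  # don't re-read our own output file
            with src.open(newline="") as f:
                reader = csv.reader(f)
                header = next(reader, None)
                if header is None:
                    continue  # skip empty files
                if writer is None:
                    writer = csv.writer(out)
                    writer.writerow(["session"] + header)
                for row in reader:
                    writer.writerow([src.stem] + row)
                    rows_written += 1
    return rows_written
```

After that, one combined file opens in Excel with a "session" column, instead of a dozen anonymous one-string files.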

Yes. The video of the interview with Dustin Harding of Athlon linked by someone in a previous post addressed this but Dustin dodged the question. Basically, by not directly answering the question, Dustin confirmed the Rangecraft is not as good as other devices at filtering out competing signals from other radars. Maybe it will be addressed in a future update, maybe not.

I'm not terribly certain at this point if the Athlon is actually worse than "other devices" or not. The "other devices" I have tested include 3 brand/models which are acoustic or recoil trigger only, which means they would only exhibit bad behavior AFTER a shot has been fired. The Athlons, alternatively, are radar triggered, so comparatively, their triggers are always open. So when I have known interfering units operating (such as my LabRadar V1 and my first Athlon, or the Caldwell VelociRadar I have and my second Athlon), the Athlon will trigger repeatedly, non-stop, as if it were constantly receiving shots, even if there is no gun on the firing line. However, the VelociRadar will SEEM as if it is not experiencing interference, but ONLY because it has not been actively triggered. However, once it "hears" a shot with the acoustic trigger, it will experience the same co-channel interference and remains at risk of reading a false echo (the echo from the OTHER co-channel unit rather than their own) and displaying a false velocity. Both LabRadar V1 and LabRadar LX work this same way - acoustic or recoil trigger only, so they too would also SEEM to be operating normally until the shot is fired, but then once triggered, they'd accept the false echo and report a false velocity.

I've seen false echo receipt a few times with Garmins - sometimes we'll see shots reporting over 4,000fps, which indicates they received a false echo from another unit, but I have NEVER seen the Garmins be triggered by Radar, AND it has been very rare to see Garmins interfering. We end up with a bunch of Garmins on the same firing line at PRS match zero boards, and it is exceedingly rare to see interference indicators, whereas 3 out of 3 of the Athlons I have used have fallen into constant feedback loops where they are being triggered and displaying "analyzing" and even reporting velocities for shots which did not happen.

So I'm suspicious but have not yet confirmed that Athlons are accepting NEAR-channel interference rather than simply accepting co-channel interference (which cannot be avoided by ANY of the radar units on the market).

BUT I WILL OFFER A WARNING TO VIEWERS - I believe it will be common for people to see video of folks using LabRadars or VelociRadars to show a lopsided and false representation that the Athlons are unilaterally accepting co-channel interference because we can make them spaz out by turning on the co-channel units. The Athlon will go into a continuous loop, whereas the LabRadar and VelociRadar will APPEAR to be fine... but ONLY because they have not yet been triggered to read, whereas the Athlon IS getting triggered by radar. The LabRadars and VelociRadar will still experience the false velocity output once they are triggered - but it's easy to make a video which seems very, very one sided. I have this kind of video myself, but I'm not yet sharing them until I can properly demonstrate the bi-directional interference instead of making the Athlon look like it is failing.

[Varminterror: "Eliminating the co-channel units, the Athlons operate perfectly fine and offer more consistent readings with the rest of the units."] Interesting you got this result. It seems to differ from some other results.

Not sure what to tell you, other than to describe that by the end of that preliminary interference test pictured above (a test conducted to evaluate the validity of the experimental design for future tests), I fired a 100 round string with 6 units operating, removing the LabRadar V1 and the Caldwell VelociRadar, which had demonstrated interference with the 2 Athlons. I'm compiling that information to share in threads like this around the internet and in a forthcoming video (series of videos, maybe), but here's an example of the alignment of results when the interferences are eliminated - all 6 units, 2 Athlons, 2 Garmins, and 2 LabRadar LX's, agree within 3 fps, 1165 fps to 1168 fps (the left side LX was not updated for firmware and was displaying only whole integer velocities, rather than the updated LX on the right side which displays decimal value velocities).

This has been my consistent experience when interference is not happening - observationally, the units are reading within a few fps of each other overall, and within each brand, typically the variance is less than 1fps. As the rain subsides in the next days and weeks, I will share more data to confirm or correct this observation, knowing now which units can and cannot be operated concurrently.

[Attached photo: all six units displaying readings between 1165 fps and 1168 fps]


I'm having a hard time spotting the Magneto Speed. Is it wearing grass camo?

The test pictured was a preliminary operability test to determine interferences for the Radar units to let me determine the volume of sampling I will have to do in order to capture the necessary dataset for this comparison. I want to be able to display data from concurrent operation, but I knew the opportunity for co-channel interference existed, so the test above was the opportunity to test which units can be concurrently operated and which units would interfere. I have to trade out the interfering units one at a time to enable the future testing. Not pictured there, as they were not included in the preliminary radar interference test, are my Magnetospeed V3, MacDonald 2 Box, and CE ProChrono Digital. Since these 3 units aren't radars, I did not include them in the radar interference test.
 
this video starting around 10:30 ... Dustin calls it an "analyzing loop" around 11:30 ... caused by interference



I might be picking nits here, but I don't really like "analyzing loop," to describe what we're seeing. Judd used the same language - "analyzing loop" - on the phone when I called after the first instance of interference on the first day with my Athlon, but I'm not terribly certain I'd agree with that particular language - I don't necessarily perceive that it's just "analyzing" and then picking a high velocity, but rather the units are in a re-triggering loop because of radar feedback from the interfering units.

In that Paramount video above, we can't really tell which unit is interfering with that visible Athlon, since we can only see 2 of the 5, but I would bet it was the LabRadar. The LabRadar would appear to be fine, since it wasn't yet triggered between shots, but it's quite likely that it was displaying false velocities when triggered. The Athlon is radar triggered, so it's being repeatedly triggered by the interference. The Garmin could exhibit the same issue - potentially - since it is also radar triggered, so the fact the Garmin screen isn't constantly recycling in the same way as the Athlon, I'm betting the co-channel interference was with the LabRadar and being hidden by the fact the LabRadar uses an acoustic trigger rather than Radar.

The high velocities being displayed are almost assuredly false-echo registrations. The signal from another unit comes back and the timing offset is just wrong, so then the unit displays an incorrect velocity. Alternatively, when the Athlons are showing the chronic "analyzing" on the screen, I believe that is multiple trigger events, rather than just persistent processing. You'll see the screen flash back and forth, indicating the analysis is ending, skipping some false echoes, registering others. As I mentioned, any radar triggered unit could exhibit that issue, it's just odd that we're seeing it so much more commonly from the Athlons than the Garmins, which makes me think the Athlon may be more susceptible to NEAR-channel interference rather than strict co-channel (this COULD also describe a potential difference for inequitable interference from Athlon to another unit brand, but that's pure speculation on my part at this time).
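For anyone wanting the arithmetic behind the false-echo idea: these units infer velocity from the Doppler shift, f_d = 2·v·f_tx/c, so a competing signal that arrives with the wrong offset simply reads back as the wrong speed. The 24.125 GHz carrier below is an assumption (a common K-band frequency, not a confirmed spec for any of these units), and the 1.5x offset is purely illustrative:

```python
C = 299_792_458.0          # speed of light, m/s
FPS_PER_MPS = 1 / 0.3048   # m/s -> feet per second

def doppler_shift_hz(v_fps: float, f_tx_hz: float) -> float:
    """Doppler shift for a target moving at v (classic 2*v*f/c)."""
    return 2 * (v_fps * 0.3048) * f_tx_hz / C

def velocity_fps(f_d_hz: float, f_tx_hz: float) -> float:
    """Invert the shift back to a velocity in fps."""
    return (f_d_hz * C / (2 * f_tx_hz)) * FPS_PER_MPS

F_TX = 24.125e9  # assumed K-band carrier; actual band unverified

own = doppler_shift_hz(2800.0, F_TX)  # echo from a real bullet
print(f"true echo reads back as {velocity_fps(own, F_TX):.0f} fps")

# a neighboring unit's signal landing with the wrong offset just looks
# like a bigger shift, and the displayed number is garbage
print(f"false echo reads back as {velocity_fps(own * 1.5, F_TX):.0f} fps")
```

An echo shifted half again past the true one reads back as 4200 fps - the same ballpark as the bogus 4000+ fps readings described above.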
 
I might be picking nits here, but I don't really like "analyzing loop"

You're not. The analyzing loop is simply inferior engineering, either hardware or firmware. If it's firmware they can fix it if they actually have the skills. If it's hardware then that will eventually be the death of the product unless they offer a recall to fix it.

It bugs me a little that Dustin visibly squirmed in his chair and gathered his thoughts when the issue was presented. Maybe it's just me, but it seems like it was a well known problem and they actually decided not to fix it in a rush to get it out the door.

That squirm in that chair, that answer to the analyzing issue, the misspelled "Muzzel" on the splash screen and the issues I nitpicked earlier all give me second thought about their entire approach.

"How you do anything is how you do everything." -- Unknown

If I hadn't jumped in early and got it for $350, I'd be second guessing the purchase even at $399. But I'm not sure I'd be jumping on a Xero for $549 either. If I was going to spend that kind of money, I'd be taking a hard look at the Labradar LX because of that one extra downrange data point, which I'd use once in a blue moon, but why not get it for only $50 more ... as long as they all work, that is.

we can't really tell which unit

I haven't looked at the FCC statements from all the various units but Dustin mentioned in the video there are a limited number of "channels" these devices can operate on.

It's basically the same problem as your home Wi-Fi router. If you were having lousy connections/bandwidth with your home Wi-Fi, you used to have to put an analyzer on your phone to see if your neighbor's unit was on the same frequencies and move yours. Nowadays all these devices take care of this themselves. It's a known problem with a known solution; there is no technical reason they shouldn't have this solved.
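The router-style fix is conceptually tiny: scan, then camp on the quietest channel. A toy sketch (channel numbers and power levels are made up, and nothing here reflects how any of these chronographs actually work internally):

```python
def pick_clean_channel(interference_by_channel: dict) -> int:
    """Return the channel with the least measured interference power.

    Mimics what a Wi-Fi router's auto-channel feature does at startup;
    channels and dBm values below are invented for illustration.
    """
    return min(interference_by_channel, key=interference_by_channel.get)

# hypothetical scan results: noise power (dBm) seen on each channel
scan = {1: -62.0, 6: -85.0, 11: -70.0}
print(pick_clean_channel(scan))  # channel 6 is quietest
```

The interesting engineering question isn't the selection itself, it's whether the unit re-scans when a new interferer shows up mid-session.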

false-echo registrations

Seems reasonable. Kinda like the old jammers you could buy for po-po radar.

I hope it's something they can address in firmware, but I'm not worried a ton about any of these problems because I don't see myself being in a situation where I'm sharing a bench/range with a bunch of other FPS nerds. I could be wrong and I'll be mad when it happens, but willing to take that chance at the moment.
 
If I hadn't jumped in early and got it for $350, I'd be second guessing the purchase even at $399. But I'm not sure I'd be jumping on a Xero for $549 either. If I was going to spend that kind of money, I'd be taking a hard look at the Labradar LX because of that one extra downrange data point, which I'd use once in a blue moon, but why not get it for only $50 more ... as long as they all work, that is.
Street price for the Xero is around $475 these days.
 
You're not. The analyzing loop is simply inferior engineering, either hardware or firmware. If it's firmware they can fix it if they actually have the skills. If it's hardware then that will eventually be the death of the product unless they offer a recall to fix it.

Like I mentioned, observationally, I do not think these units are really stuck constantly analyzing, but rather are retriggering and simply analyzing many, many triggers. So it’s like having a thousand small math problems to do instead of being stuck on one long math problem.

It bugs me a little that Dustin visibly squirmed in his chair and gathered his thoughts when the issue was presented. Maybe it's just me, but it seems like it was a well known problem and they actually decided not to fix it in a rush to get it out the door.

Co-channel interference is an issue for any civilian radar unit. That’s unavoidable, other than frequency hopping, which none of them are doing (to my knowledge, after discussing with each manufacturer).

I haven't looked at the FCC statements from all the various units but Dustin mentioned in the video there are a limited number of "channels" these devices can operate on.

Only around 100 different channels are available. So having 8 units on the line in such close proximity, I have a relatively high likelihood of interference.
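That likelihood is just the birthday problem. Assuming each unit independently picks one of ~100 channels uniformly at random (which is itself an assumption about how they select channels), 8 units collide surprisingly often:

```python
def collision_probability(units: int, channels: int = 100) -> float:
    """P(at least two of `units` radars land on the same channel),
    assuming independent, uniform channel selection."""
    p_all_distinct = 1.0
    for i in range(units):
        p_all_distinct *= (channels - i) / channels
    return 1.0 - p_all_distinct

print(f"{collision_probability(8):.0%}")  # ~25%: about a 1-in-4 chance
```

So with 8 units on the line, roughly one setup in four would have at least one co-channel pair purely by chance.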

It's basically the same problem as your home Wi-Fi router. If you were having lousy connections/bandwidth with your home Wi-Fi, you used to have to put an analyzer on your phone to see if your neighbor's unit was on the same frequencies and move yours. Nowadays all these devices take care of this themselves. It's a known problem with a known solution; there is no technical reason they shouldn't have this solved.

Most of those Wi-Fi devices are multi-channel with frequency-hopping capability; that simply doesn’t apply to the ballistic chronographs, because they aren’t frequency hoppers.

Seems reasonable. Kinda like the old jammers you could buy for po-po radar.

Essentially, yeah. True jammers are transmitter/receivers which issue an off-sequence signal, measuring and returning a true interference signal. They CAN be simplified by issuing broad channel signals which do interrupt accurate reading of a radar gun.

I hope it's something they can address in firmware but I'm not worried a ton about any of these problems because I don't see myself being in a situation where I'm sharing a bench/ranch with a bunch of other FPS nerds. I could be wrong and I'll be mad when it happens but willing to take that chance at the moment.

I don’t expect they can fix co-channel interference via firmware. The units would have to have the ability to frequency hop to “run away” from co-channel interference, which would still only offer them the same list of ~100 channels, and have the same odds of interfering with another unit - and maybe inconveniently, two units might hop back together when trying to hop apart. It adds complexity and cost, for something which only happens for us experimenters who bring a bunch of radars together, or potentially for competitors where we’d accumulate a bunch of shooters with radars at a zero board - but the physical spacing and beam divergences really limit that opportunity as well.
 
I don’t expect they can fix co-channel interference via firmware.

From the Xero manual I get the impression the Xero auto configuration step of frequency selection is done via firmware. I don't see any advantage to implementing this feature in hardware.

The Rangecraft may not do this frequency selection at setup and stick with a clean channel, or it may just not be as good at detecting new interference and moving away from it.

Automatic Radar Configuration

The Xero® C1 chronograph automates some setup steps to help you get started faster.

Frequency Selection
When you start a session, the chronograph automatically checks for interference from other radars. If the chronograph detects interference from other radar systems, it switches to a clean channel automatically.

Only around 100 different channels are available

Maybe a few more, I haven't checked the list for duplicates.

 
I haven’t seen it any lower than the current sale price of $549

Just searched again at gun.deals, haven't checked in a couple three weeks. At least one place listing it for $474.95



Deals (6 found)

1 day ago: Garmin Xero C1 Pro Chronograph w/ Tri-Pod Mount & USB A/C - $474.95 (Add to cart for best price)
Not Just Gu
 
I'm working now to compile some data from one of my preliminary comparisons depicted above. I had to eliminate two units from the test to prevent co-channel interference, but while I was conducting the interference test, I shot a 100rnd string of 22LR ammo (rot gut stuff, just for preliminary evaluation of the method) across 3 pairs of LabRadar LX, Garmin Xero C1, and Athlon Rangecraft Velocity Pro's.

Overall, for 100rnds, after my first few outings with the Athlons, this dataset was much more compelling than I expected. We can see here that the ES and SD's are relatively similar. The ES's appear to vary more than they really do, since the numbers are so ridiculously big for this terrible ammo, but 231.7 vs. 242.3 is still only a ~4% spread, and 32.7 vs. 34.3 SD is only a ~4.8% spread, so considering this ammo, I'm not terribly disappointed at this point. More rimfire and similar centerfire testing to come, which may confirm or correct that observation.
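For anyone wanting to sanity-check those percentages, ES and the percent spread between two summary numbers are one-liners. A quick sketch (only the 231.7/242.3 and 32.7/34.3 figures come from the test above; the helper names are mine):

```python
# Illustrative only: how the ES/SD figures quoted above reduce to percent
# spreads. The 231.7/242.3 and 32.7/34.3 values are from the test; the
# function names are invented for this sketch.

def extreme_spread(velocities):
    """ES is just max minus min - a proxy for consistency, not accuracy."""
    return max(velocities) - min(velocities)

def percent_spread(a: float, b: float) -> float:
    """Spread between two readings as a percentage of the smaller one."""
    lo, hi = sorted((a, b))
    return (hi - lo) / lo * 100

print(round(percent_spread(231.7, 242.3), 1))  # ES spread → 4.6
print(round(percent_spread(32.7, 34.3), 1))    # SD spread → 4.9
```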

1748640767096.png


I expected at the outset of designing this experiment that I would see relatively persistent offsets between units - meaning one unit would regularly read higher or lower than another. This held both within each brand (less so for the Athlons) and between the 3 brands. Reading the data itself isn't important, but I applied a heat map to the data for all 6 units over 100 rounds to reflect which units read the highest speed for each shot (highlighted green below) vs. the lowest (highlighted red below). The two left columns are Garmin units, the two center columns are the readings from the LabRadar LX's, and the rightmost two columns are the readings from my two Athlons. For an overwhelming majority of shots, the LabRadar LX's displayed the higher speed readings and the Athlons the lower, with the Garmins floating in the middle:

--> One or both of the LabRadar LX's represented the fastest reading for 90 of the 100 shots, and on only ONE shot out of 100 did a LabRadar LX represent the slowest reading (including ties).

--> One or both of the Athlon units represented the SLOWEST velocity reading for 89 out of 100 shots, and only represented 10 of the fastest readings (including ties).

--> One or both of the Garmins represented the mid-range reading between the other brands for 85 out of 100 shots, representing the fastest reading for only 5 out of 100 (including ties), and the slowest for only 10 out of 100 shots.

So 90% of the time, the LX was faster than the other two brands, and 89% of the time the Athlon was slower than the other two brands. The LX's averaged 0.8fps faster than the Garmins, which averaged 1.2fps faster than the Athlons.

***Note: Reading the specific data here isn't so important as the heat mapping - majority red in a column shows that unit reads slower more often, majority green shows that unit reads faster more often.***

1748639182296.png


Comparing each brand to itself, there was also an offset between the units, although less decisive. One of my Garmins read faster than the other for 62 of the 100 shots and only slower for 29 of the 100, with the two units matching the displayed speed for 9 shots - meaning one unit read faster than the other more than twice as often. The two Athlons agreed 21 times, then one unit displayed faster on 47 shots and only slower for 32 shots, reading faster than the other roughly 50% more often. The LabRadar LX's display to 0.01fps rather than 0.1fps, which makes it less common to see perfect agreement (and on that date, they were not operating on the same firmware version), so there were no shots for which they displayed the exact same speed, but one unit displayed higher speed 63 times and only displayed slower 37 times, so one unit read faster about 70% more often than the other.
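The heat-map tally described above is easy to reproduce in a spreadsheet or a few lines of code. A sketch with invented readings (one row per shot, one column per unit), counting ties the same way I did:

```python
# Sketch of the heat-map tally: for each shot, count which unit(s) read
# fastest and slowest, ties included. The readings below are invented,
# not from the actual 100-round dataset.

def tally_extremes(readings):
    """readings: one row per shot, one column per unit.
    Returns (fastest_counts, slowest_counts) per unit, counting ties."""
    n_units = len(readings[0])
    fastest = [0] * n_units
    slowest = [0] * n_units
    for shot in readings:
        hi, lo = max(shot), min(shot)
        for i, velocity in enumerate(shot):
            if velocity == hi:
                fastest[i] += 1
            if velocity == lo:
                slowest[i] += 1
    return fastest, slowest

shots = [
    [1203.2, 1203.0, 1201.9],  # unit 0 fastest, unit 2 slowest
    [1197.5, 1198.1, 1196.4],  # unit 1 fastest, unit 2 slowest
    [1205.0, 1205.0, 1203.3],  # units 0 and 1 tie for fastest
]
print(tally_extremes(shots))  # → ([2, 2, 0], [0, 0, 3])
```

Majority green (fastest) or red (slowest) in a column falls straight out of these counts.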

1748639581835.png


The readings within each brand were very close together. The worst spread displayed across ALL chronographs was 11.1fps, but 80% of the shots were separated by less than 4.2fps. The 2 LabRadars were, on average, 2.1fps faster than the 2 Athlons and 0.8fps faster than the 2 Garmins. But within brands:

--> The 2 Garmins read within 0.54fps average from one another, never more than 1.9fps apart.

--> The 2 LX's averaged within 0.83fps of one another, never more than 3.9fps apart.

--> The Athlons averaged 1.1fps apart, never more than 6.5fps.

So this suggests, at least in this test, that the Garmins track together ~50% tighter than the LabRadars, and about 96% tighter than the Athlons. But overall, whether the average spread is within +/-0.5fps or +/-1.1fps, eh, that's not a huge difference in practical application performance.

1748639880488.png


Unfortunately, the ammo I chose was just terrible, so the noise from one shot to the next really drowns out the differences between the chronograph readings for any given shot. It's really difficult to display a dataset with only ~3.5fps average spread between chronographs but a 242fps spread between the fastest and slowest shots registered. This is all 6 trends depicted together; you can see they track very well up and down at the macro level, but there's slight feathering and a few crossings of the trends at the peaks and valleys, where the few fps between the readings are revealed:
1748640080887.png


In an attempt to better visualize the differences between units, and the prevailing offset trends, I ranked the shots by average velocity across the units, then charted the trendlines for a smaller velocity window, choosing 1200-1220 relatively randomly - it's a small enough velocity window to let the ~3.5fps spread between units reveal itself within the ~20fps window, while still keeping ~25-30 shots for comparison. So here is a ranked velocity depiction (by average velocity) of 30 rounds.

--> The pink and purple trends floating typically near the bottom edge are the Athlon units
--> The light and dark green trends floating typically in the middle are the Garmin units
--> The peach & orange trends riding predominantly on the top are the LX units

We can see the relative noise, but also see the prevailing offset trends, and the near-parallel tracking of all of the units as the ranked velocity increases. Tighter together in some spots, looser and noisier in others, but just randomly so, since this represents a non-chronological series of shots.
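The ranking step above can be sketched in a few lines. Invented two-unit readings here just to show the mechanics; the real set has six columns:

```python
# Sketch of the ranked-velocity windowing described above: average each shot
# across the units, sort shots by that average, then keep only shots whose
# average falls in a chosen window (1200-1220fps here). Data is invented.

def ranked_window(readings, lo=1200.0, hi=1220.0):
    """readings: one row per shot, one column per unit."""
    averaged = [(sum(shot) / len(shot), shot) for shot in readings]
    averaged.sort(key=lambda pair: pair[0])  # rank by per-shot average
    return [shot for avg, shot in averaged if lo <= avg <= hi]

shots = [[1225.0, 1226.0], [1204.0, 1206.0], [1199.0, 1198.0], [1212.0, 1210.0]]
print(ranked_window(shots))  # → [[1204.0, 1206.0], [1212.0, 1210.0]]
```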

1748640400471.png


Overall, the results were very consistent for all 3 brands, all 6 units, and the expected behavior was demonstrated. Without yet going into volatility evaluation, I'm glad to have this info as I move this weekend and next week into further exploration.
 
The LX's averaged 0.8fps faster than the Garmins, which averaged 1.2fps faster than the Athlons.

Averages are fine and all, but I tended to use my old Beta Chrony for two purposes: ladder tests and load consistency tests.

In a ladder test, one or two shots recorded incorrectly by 5 or 10 or more fps can significantly skew the results.

In all this testing, is it possible to figure out which unit produces the most accurate results shot to shot?
 
Averages are fine and all

All 600 data points are depicted there, for your analytical pleasure.

In all this testing, is it possible to figure out which unit produces the most accurate results shot to shot?

In general, for that data set, considering the variability of the ammunition, the resulting disparity among units, and the inherent precision limitation of these units, no, it's not really possible to tell which unit is correct.

However, it's relatively well proven that velocity ladders are snake oil, so with the units only +/-1.75fps from one another on average in this preliminary test, we'd be pretty certain that the result for any given shot is of substantial value for our ballistic calculator. It's easy enough to prove where the average is and whether the ammo is sufficiently consistent or not. But based on the data trends in this evaluation, either all of them are sufficient or none of them are.
 
All 600 data points are depicted there, for your analytical pleasure.

Right, the data points are there but which data points are more accurate than the others?

How to identify and weed out the bad data points?

One unit may record lower ES or SD than the other but which one is correct?

I recently took my Rangecraft out for a spin recording some 45-70 subs I loaded. It recorded 2 shots at 1023.x fps, 2 shots at 1025.x fps, and one shot at 924.x fps. On the paper at 100 yards there were 4 holes within a 1" vertical spread and a 1.5" horizontal spread, and 1 hole a few inches lower than all the others. More or less a decent indication of consistency, but still not proof the speeds were 100% "accurate".

Vertical spread is a half-decent litmus test for validating recorded speeds and figuring out which unit is lying. Barrel whip notwithstanding.
 
Right, the data points are there but which data points are more accurate than the others?

How to identify and weed out the bad data points?

One unit may record lower ES or SD than the other but which one is correct?

The spread between all 6 units for 80% of the shots was less than 4.2fps. A 4.2fps error in a ballistic engine for my 6 Creed load would be a MAXIMUM potential of 0.84", and a simple RSS (root-sum-square) assessment on a rifle shooting .5moa at 100yrds shows it would only account for less than 0.1" at 1,000yrds. Equally, realize the averages displayed by the 6 units were only 2.7fps apart from fastest to slowest, so HALF of that potential difference at 1,000yrds.

Within this dataset, ALL of the singular results were incorrect, because of the variability of the ammo itself, but for each data point, the potential error from one unit to another was very tight. So as I mentioned above, either ALL of them are good enough, or none of them are. A 99% confidence interval for this particular dataset, however, shows any given one of these units would only be known to within ~9fps of the true average - even after 100rnds - so this dataset isn't seeking truth, it's seeking comparison. When we're only an average of 3.5fps apart among the 6 units within a set which only held +/-9fps for a 99% confidence interval, eh, nah, they're either all telling the truth, or none are.

Even if I only took ONE shot, the odds of being "wrong" have nothing to do with the unit being used to measure the velocity, and everything to do with the consistency of the ammunition and the relative insensitivity of our trajectory to such small variations in velocity.
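Both figures above fall out of textbook formulas, under the usual normal-distribution assumptions. A back-of-envelope sketch (my own check, not the original spreadsheet; the 34fps SD is representative of the ~33fps SDs in the 22LR dataset, and z=2.576 is the standard 99% value):

```python
# Hedged sanity check: the RSS (root-sum-square) combination of independent
# error components, and the 99% confidence half-width for the mean of an
# n-shot string, z * SD / sqrt(n). Assumes normally distributed velocities.
import math

def ci_halfwidth(sd: float, n: int, z: float = 2.576) -> float:
    """Confidence half-width for the mean; z = 2.576 gives ~99%."""
    return z * sd / math.sqrt(n)

def rss(*components: float) -> float:
    """Root-sum-square combination of independent error components."""
    return math.sqrt(sum(c * c for c in components))

print(round(ci_halfwidth(34.0, 100), 1))  # → 8.8, i.e. the "~9fps" above
print(rss(3.0, 4.0))                      # → 5.0
```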

Vertical spread is a half-decent litmus test for validating recorded speeds and figuring out which unit is lying. Barrel whip notwithstanding.

I'll readily admit, I'm not sufficiently skilled as a shooter to be able to see a difference of 0.06" at 1,000 yards in my groups, to determine truth between two chronographs showing a 4fps spread between them. Certainly, my subsequent testing using a better-performing centerfire load instead of this cheap and shitty 22LR ammo will improve the ability to determine validity of the results, but again, I'd have to be able to hold less than 1/10th of an inch of difference at 1,000yrds to tell.

Anecdotally, I've used my Garmin velocity over the last 2 seasons to hit 1.5moa targets with 1st round impacts as far as 2200yrds. This is with a load which typically displays 5-9fps SD's for 10-shot strings, and 7-10fps SD's for 60-shot strings. A 99% confidence interval for this load would be +/-3.2fps. That potential 6.4fps variability in the 1% uncertainty describes a difference of 19.16mils vs. 19.25mils at 2200yrds... One click on my scope, on a target about 5 clicks tall - and I'll be honest, I just don't shoot well enough to hold my shots within 1 click on the turret at 2200yrds.

So I struggle a little to conceive of any application where I REALLY have the ability to differentiate with live fire between any results as close together as the above dataset describes. I generally have control at 2000-2200 where my waterline is within ~0.3mils using velocity obtained via my Garmin, so to think I'd dial off 0.02mil less if I were using the LabRadar or dial on 0.03mil more if I were using the Athlon, eh, I'm pretty sure I can't shoot the difference between the 3 brands.

I do, however, have a MacDonald TwoBox acoustic chronograph unit which claims to be true within +/-1-2fps, which is tighter than the potential +/-0.1% of these radar units. Setup is a massive pain in the ass, requiring level and straight installation across a 15ft span, but if it works as described, it would be my most accurate opportunity for "true" velocity within subsequent testing out of the 8 different chronograph units I have on hand. Data forthcoming with sunny weather. But remember, as we see here, we have high confidence that the "truth" is somewhere between the min and max of these 6 units, and even if only as a minority result, each of the 6 units demonstrated the highest, lowest, and median result at some point in this dataset.
 
I was just trying to figure out a comparison to what end if it's not to figure out if one is more accurate.

Just within the limited scope of that particular preliminary comparison - a test only conducted to determine validity of the experimental design - I've already presented here multiple aspects of comparison, as well as demonstrations of relative behavior.

Many folks have asked, during the secondary phase of crowd-sourced aspects of comparison, simply for determination or validation of relative performance between these units. "Which one should I buy? And why?" seem pertinent questions for folks with money in hand and all 3 of these options in front of them on the computer screen. Folks are questioning whether the Athlon at the lower price will perform as well as the Garmin and LabRadar units - which have longer market provenance than the Athlon. Above, I presented a data set showing that, at least in this ONE instance of 100 rounds, the consistency between multiple Garmin units is defensibly better than the consistency within the other two brands. The comparison also presented strong evidence that each brand measures notably faster or slower relative to the others - which, with the Garmin falling in the middle in this particular data set, combined with the sensitivity analysis above, would suggest choosing the Garmin might offer the most moderate risk profile. But... again, when we really digest the sensitivity analysis, either all of them are right or none are.

Certainly through this particular test, I've also witnessed that frequency/channel hopping or manual channel assignment is a relatively important consideration for buyers who will be on busy firing lines, AND that the acoustic or recoil trigger options will only HIDE interference, not actually prevent it. A lot of folks may be misled by YouTube videos of Athlons caught in "analysis loops" into thinking the LX or even the VelociRadar is a better option - but once they fire, both have the same risk of interference; the Athlon is simply more honest about it.