What is the best way to figure out your particular load's velocity change as a function of temperature? The end data would tell me, for example, 1.4 FPS per degree (F).
Would I just chrono at a high temp and then a low temp, and divide the FPS change by the temp change? Would that be an accurate enough calculation? If so, what would be the best baseline temp spread: 50 degrees or more, or is something like 20 degrees adequate?
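For what it's worth, the arithmetic I'm describing boils down to something like this sketch (the shot strings and temps below are made-up numbers just to illustrate, not real chrono data):

```python
# Sketch: estimate a load's velocity sensitivity (FPS per degree F)
# from two chrono sessions at different ambient temperatures.
# All numbers below are hypothetical examples.

def avg(values):
    return sum(values) / len(values)

hot_shots = [2615, 2622, 2618, 2620, 2617]    # fps, recorded at 95 F
cold_shots = [2560, 2555, 2558, 2562, 2557]   # fps, recorded at 45 F
hot_temp, cold_temp = 95.0, 45.0

# Slope of velocity vs. temperature between the two sessions
fps_per_degree = (avg(hot_shots) - avg(cold_shots)) / (hot_temp - cold_temp)
print(f"{fps_per_degree:.2f} fps per degree F")  # 1.20 fps per degree F
```

Averaging a full string at each temp (rather than a single shot) keeps normal shot-to-shot spread from swamping the temperature effect, which is partly why I'm asking how big the temp spread needs to be.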
Probably a simple question, but I just want to be sure.
I am trying to gauge this for the 175SMK and 168SMK for use with the FDAC, to make sure I have the right card installed, and then to put a printed label on each card that would say, for example, "175SMK 75 to 100 (F)", with the 168SMK and its temp range on the same card, so the velocity stays close to the charts.
Thanks for any useful response in advance.