drg, on 06 August 2012 - 01:44 PM, said:
I'm not complaining about your methodology, just pointing out it doesn't prove what you say it does. The calculated drop at 10 yards with a 30 fps fluctuation is very small, less than the width of a paintball, i.e. your test actually cannot show the difference. The differences magnify further downrange and do end up being significant. You will notice that cockerpunk also noted the same issue in the thread.
EDIT: Rntlee's test also occurred at a distance of fairly minimal calculated drop (about 1.5" @ 30 fps change) and the majority of his values were within about 12 fps so ... the drop is not expected to be detectable there either.
By George, he's starting to get it.... Keep in mind that when you increase the range, the background "noise" increases as well. You can see the rate of expansion by comparing my results at the different distances. I didn't put a whole lot of effort into analyzing the different distances (I'll go over it again when I get home), but I suspect that the noise grows roughly exponentially (which makes sense), whereas the possible spread due to velocity fluctuations grows at a much lower rate. Any effect on the Y axis from speed fluctuations is completely buried in the noise produced by vortex shedding, so all significance is lost.

Further testing isn't going to find point-of-impact changes correlated with velocity changes, simply because the background noise will be so high. So, again, no significant correlation. This doesn't mean that Newtonian physics has flown out the window, but if you are using a simple model to predict the ball's endpoint, you are neglecting some of the largest forces that affect the ball's accuracy. Even a model that ignored some fairly major velocity fluctuations wouldn't lose any detectable accuracy.
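For anyone who wants to sanity-check the drop numbers quoted above, here is a minimal sketch using a drag-free (vacuum) trajectory for a flat shot. This is a deliberate simplification: a real paintball sheds speed to drag, so actual flight times and drops are larger, but the trend with range is the same. The 300 fps / 270 fps speeds are just illustrative values for a 30 fps spread.

```python
# Drag-free drop difference for a 30 fps velocity spread.
# Assumption: flat shot, no drag or lift; real drops are larger,
# but the velocity-induced *difference* scales the same way.
G = 32.174  # gravitational acceleration, ft/s^2

def drop_in(v_fps, dist_ft):
    """Vertical drop in inches over dist_ft for a flat shot at v_fps."""
    t = dist_ft / v_fps           # time of flight in the level model
    return 0.5 * G * t * t * 12   # feet -> inches

for yards in (10, 20, 30, 40):
    d = yards * 3
    diff = drop_in(270, d) - drop_in(300, d)
    print(f"{yards:2d} yd: drop difference = {diff:.2f} in")
```

At 10 yards the difference comes out to roughly 0.45 in, under the ~0.68 in diameter of a paintball, and it grows quickly with range, consistent with drg's point that the calculated differences only become significant further downrange.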
Furthermore, to reiterate the point that I made initially: if you can't detect the differences between shot velocities at ranges that people ACTUALLY shoot at, then what the hell is the point of arguing that a marker that shoots +/-5 fps over the chrono is worse, from an accuracy standpoint, than one that shoots +/-1 fps over the chrono?
I would much rather talk about the things that actually affect accuracy, and not get caught up in undetectable minutiae.
This post has been edited by Troy: 06 August 2012 - 03:07 PM