Why the way you zero your sights might be making you miss (warning: contains math)

I've created this image using graphing software and MS Paint to help illustrate the path your bullet takes out of your gun and how your sight line interacts with that path at different zeroes.

equations shown at the bottom of the post

The interesting thing to note is that zeroing your gun to 50m is almost always worse than zeroing to 100m, even when the target is closer than 50m. You can confirm this odd behavior in the practice range by firing at the furthest target (it seems to be around 25-30m away). With a 50m zero, your shot lands slightly below the point you were aiming at. With a 100m zero, your shot lands almost exactly on target. (Firing at closer targets yields the same result, but the difference is harder to notice; regardless of your zero, your shots will land very low on the first target.)

Both shots were aimed at the red dots along the top edge of the target
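If you'd rather check numbers than eyeball targets, the practice-range comparison falls out of the bullet-path equation at the bottom of the post. A quick sketch; the straight sight line through the sight and the zero point is my assumption of how zeroing works, while the heights and velocity are from the post:

```python
# Bullet path straight from the post: y = 1.4478 - 9.8*(x/800)^2
def bullet(x):
    return 1.4478 - 9.8 * (x / 800) ** 2

# Assumed sight model: a straight line from the sight (1.4986 m high)
# through the bullet's position at the zero distance.
def sight(x, zero):
    return 1.4986 + (bullet(zero) - 1.4986) / zero * x

# cm above (+) or below (-) the point of aim at range x
def deviation_cm(x, zero):
    return 100 * (bullet(x) - sight(x, zero))

# Furthest practice-range target, roughly 25 m out
for zero in (50, 100):
    print(f"{zero} m zero: {deviation_cm(25, zero):+.2f} cm")
# → 50 m zero: -1.58 cm   (noticeably low)
# → 100 m zero: -0.94 cm  (nearly on target)
```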

This happens because when you zero at 50m, the line from your eyes, through your sight, to the target effectively touches the path of the bullet only once (at 50m). However, when you zero to 100m, the line crosses the path of your bullet twice (once at around 33 meters and once at 100m).
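Setting the bullet path equal to the sight line gives a quadratic whose roots are exactly these crossing distances. A sketch under the same assumptions (the post's bullet-path equation, plus my assumed straight sight line through the zero point):

```python
import math

A = 9.8 / 800 ** 2        # drop coefficient from y = 1.4478 - 9.8*(x/800)^2
H = 1.4986 - 1.4478       # sight height above the muzzle (0.0508 m)

def crossings(zero):
    """Ranges where the sight line meets the bullet path for a given zero.

    bullet(x) = sight(x) reduces to A*x^2 - (H/zero + A*zero)*x + H = 0.
    """
    b = H / zero + A * zero
    root = math.sqrt(b * b - 4 * A * H)
    return ((b - root) / (2 * A), (b + root) / (2 * A))

print(crossings(100))   # first crossing ≈ 33.2 m, second at 100 m
print(crossings(150))   # first crossing ≈ 22.1 m, second at 150 m
```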

Because of this phenomenon, you actually become MORE accurate at extremely close range (<30m) the higher you set your zero. As you can see from the 150m graph, the sight line first meets the path of the bullet at around 22 meters.

However, this comes at a cost: increasing your zero makes your sight more accurate at extremely close and long ranges, but worse in the mid range (the range most engagements are fought in). From 40-80 meters the 50m zero is clearly the most accurate. Looking at the 150m zero, it is clear that as you go to higher and higher zeroes, shots fired at mid-range targets start to land extremely high.
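Plugging a representative mid-range distance into the same model (the post's bullet-path equation plus an assumed straight sight line through the zero point) makes the trade-off concrete:

```python
def deviation_cm(x, zero):
    """cm above (+) or below (-) the point of aim at range x for a given zero."""
    bullet = lambda d: 1.4478 - 9.8 * (d / 800) ** 2  # the post's equation
    slope = (bullet(zero) - 1.4986) / zero            # assumed straight sight line
    return 100 * (bullet(x) - (1.4986 + slope * x))

for zero in (50, 100, 150):
    print(f"{zero:>3} m zero at 60 m: {deviation_cm(60, zero):+.1f} cm")
# →  50 m zero at 60 m: +0.1 cm
# → 100 m zero at 60 m: +1.6 cm
# → 150 m zero at 60 m: +5.2 cm
```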

After doing these calculations, I'm going to zero all my sights to 100m from now on. I don't like to flip the zero very often, so if I have to pick one, 100m seems best: it improves the accuracy of my sights under 40m and over 90m. In the 40-90m range my shots will land slightly high, but missing high is better than missing low, since it might turn some accidental stomach shots into accidental head/thorax shots. Zeroes higher than 100m seem less optimal, since shots start landing VERY high at mid range, and as I mentioned before, we don't want to sacrifice much at this very common engagement distance. At worst, shots fired at mid range with a 100m zero will land a centimeter or two high, so unless you were aiming at the fuzzy part of their pompom, it should very rarely cause a correctly aimed shot to miss high.
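The "centimeter or two" claim is easy to verify: scan the whole 0-100 m window for the highest point the bullet reaches above the sight line with a 100 m zero (same assumed model: the post's bullet-path equation with a straight sight line through the zero point):

```python
bullet = lambda x: 1.4478 - 9.8 * (x / 800) ** 2   # the post's equation
slope = (bullet(100) - 1.4986) / 100               # assumed straight sight line
dev = lambda x: bullet(x) - (1.4986 + slope * x)   # + means the shot lands high

# Scan 0-100 m in 0.1 m steps for the worst-case high miss
worst = max(dev(i / 10) for i in range(1001))
print(f"max ~{100 * worst:.1f} cm high")            # → max ~1.7 cm high
```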

Just to preempt any comments saying I used numbers that fit my argument, I want to say a few things about the numbers I used. This phenomenon is exaggerated by sights that sit significantly higher than the muzzle of the gun and by low bullet velocities. I assumed the sight is 5cm (2 inches) above the muzzle and a muzzle velocity of 800 m/s. I know 5cm might be higher than many sights, but I did the same thing with a 2.5cm-high sight and the phenomenon still exists; with a lower sight it is just less obvious when graphed. 800 m/s is a fairly standard muzzle velocity (only considerably slower than the 5.56 rounds in Tarkov, which clock in at 900-1000 m/s), and graphing slower ammunition simply exaggerates the phenomenon.

For those of you that want to check the math, I used the following information:

Assumptions:

initial bullet velocity of 800 m/s

muzzle height of 1.4478 meters

sight height of 1.4986 meters

negligible air resistance

path of the bullet:

y=1.4478-9.8*(x/800)^2
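The assumptions above are enough to regenerate the whole figure numerically. A sketch that tabulates how high or low shots land at each range (the straight sight line through the sight and the zero point is my assumption; everything else is from the list above):

```python
def bullet(x):
    return 1.4478 - 9.8 * (x / 800) ** 2             # path of the bullet

def deviation_cm(x, zero):
    slope = (bullet(zero) - 1.4986) / zero           # assumed straight sight line
    return 100 * (bullet(x) - (1.4986 + slope * x))  # cm high (+) or low (-)

print("range |  50m zero | 100m zero | 150m zero")
for x in range(10, 151, 10):
    cells = " | ".join(f"{deviation_cm(x, z):+9.1f}" for z in (50, 100, 150))
    print(f"{x:>5} | {cells}")
```

By construction, each zero's column passes through 0 at its own zero distance, which is a quick sanity check on the table.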