Ok, I've been down this road, so I'll give you the technically correct answer and the real-world answer. Your BDC is calibrated for a specific bullet weight with a specific BC (ballistic coefficient) at a specific HOB (height over bore) and MV (muzzle velocity); shoot any other ammo/rifle combination, or the same one in different atmospheric conditions, and the holds will deviate from the marks. That's the technically correct answer.
In real life, shooting ball ammo from a rack-grade rifle, those differences won't matter at any realistic range. Now, if you're shooting some precision setup at 600m for groups, then yeah, it matters, but you probably shouldn't be using an ACOG for that anyway. People mostly freak out about 55gr vs 62gr when they're shooting silhouettes inside 300m, and for that you'll be fine. I run my 31F on a 12.5" SBR shooting 77gr out to 300m, which deviates from the calibrated load far more than a 55gr/62gr swap does, and I've never noticed a difference; you'll likely notice even less than I do (i.e., none).
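If you want to put rough numbers on that, here's a quick Python sketch of the sight-line geometry. The sight height and both drop tables are made-up placeholders (swap in real numbers from a ballistic calculator), but they show the shape of the problem: two fairly different loads zeroed through the same optic.

```python
# Quick sanity check: how far apart do two different loads land at 300 m
# when both are zeroed at 100 m through the same optic?
# SIGHT_HEIGHT_IN and both drop tables are made-up placeholders, not
# published data; plug in real numbers from a ballistic calculator.

SIGHT_HEIGHT_IN = 2.9  # assumed ACOG height over bore, inches

# Gravity drop from the bore line at each range, inches (placeholders).
DROPS = {
    "BDC-ish reference load": {100: 2.5, 200: 11.0, 300: 27.0},
    "slower/draggier load":   {100: 2.7, 200: 12.2, 300: 30.5},
}

def path(drop, x, zero=100):
    """Bullet path vs line of sight (inches) at range x (m), zeroed at `zero` (m)."""
    angle = (drop[zero] + SIGHT_HEIGHT_IN) / zero  # launch slope, inches per metre
    return angle * x - drop[x] - SIGHT_HEIGHT_IN

for name, drop in DROPS.items():
    holds = ", ".join(f'{r} m: {path(drop, r):+.1f}"' for r in (100, 200, 300))
    print(f"{name}: {holds}")

spread = abs(path(DROPS["BDC-ish reference load"], 300)
             - path(DROPS["slower/draggier load"], 300))
print(f'difference at 300 m: {spread:.1f}"')
```

With those placeholder tables the two loads land about 3" apart at 300m, which is nothing on a silhouette. Quibbling over 55gr vs 62gr is well inside that.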
As to zeroing, why would you not just zero it as designed? I guess you could zero the dot at 50m, but it dicks up the rest of the reticle. I'd just zero the dot at 100m as prescribed and use the BDC as intended. If you're doing a rough zero at 25m, adjust until your POI sits on the 300m hold; that's the Army method for a rough zero, and it'll get you on paper at 100m, where you'll of course want to confirm.
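Here's the geometry behind the 25m trick, same placeholder numbers as above. The assumption (mine, not Trijicon's published spec) is that the 300m hold sits below the dot by exactly the drop the BDC compensates for at 300m on a 100m zero; on that assumption, the bullet's path at 25m lands very close to where the 300m hold points at 25m.

```python
# The geometry behind the 25 m rough zero. Assumption: the 300 m hold in
# the reticle sits below the dot by the drop the BDC compensates for at
# 300 m on a 100 m zero. Drop table and sight height are placeholders.

SIGHT_HEIGHT_IN = 2.9
DROP = {25: 0.15, 100: 2.5, 300: 27.0}  # inches from the bore line (made up)

def path(x, zero):
    angle = (DROP[zero] + SIGHT_HEIGHT_IN) / zero
    return angle * x - DROP[x] - SIGHT_HEIGHT_IN

# Angular gap between dot and 300 m hold = path deficit at 300 m on a 100 m zero.
hold_slope = -path(300, zero=100) / 300   # inches per metre below the dot
hold_point_at_25 = -hold_slope * 25       # where the 300 m hold points at 25 m

print(f'POI at 25 m on a correct 100 m zero: {path(25, zero=100):+.1f}"')
print(f'300 m hold aim point at 25 m:        {hold_point_at_25:+.1f}"')
```

With these placeholder numbers the two agree within about half an inch, which is why matching POI to the 300m hold at 25m gets you on paper at 100m.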
Lastly, if you're really worried about trajectory differences, zero an inch or so high at 100m. This splits the difference for a less efficient round: M855 by published BC/MV (which is what Trijicon calibrates for) is usually more efficient than most loads I've actually shot. Zeroed that way, from 0-300m I usually end up about +1.5" at 100m, dead on at 200m, and -1.5" at 300m. That's subjective, but I find it more useful than dead on at 100m, -2" at 200m, and -5" at 300m. I'm making these exact numbers up, but hopefully it illustrates the point.
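Same sketch one more time, comparing a dead-on 100m zero against zeroing a bit high (which works out to roughly a 200m zero). The drop numbers are still placeholders, so the exact inches will differ from mine above, but the pattern is the point: the offset zero trades a little high at 100m for a noticeably smaller worst-case deviation across 0-300m.

```python
# Dead-on 100 m zero vs zeroing a bit high at 100 m (~a 200 m zero).
# Same placeholder drop table and sight height as before; the pattern,
# not the exact inches, is what matters.

SIGHT_HEIGHT_IN = 2.9
DROP = {100: 2.5, 200: 11.0, 300: 27.0}  # inches from the bore line (made up)

def path(x, zero):
    angle = (DROP[zero] + SIGHT_HEIGHT_IN) / zero
    return angle * x - DROP[x] - SIGHT_HEIGHT_IN

for zero in (100, 200):
    devs = {r: path(r, zero) for r in (100, 200, 300)}
    line = ", ".join(f'{r} m: {v:+.1f}"' for r, v in devs.items())
    worst = max(abs(v) for v in devs.values())
    print(f'zero @ {zero} m -> {line} (worst: {worst:.1f}")')
```

With these placeholders the worst-case deviation over 0-300m shrinks from about 14" to about 9", at the cost of hitting about 1.5" high at 100m. Plug in your own load's table and decide if the trade is worth it to you.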