Please stay on point and be polite in responding to this thread, thanks
|
|
Are you going to use a control in regards to optics? If so, what?
Weapons and ammo specifics may cause shifts that are not optic-related, so again a control should be considered. Very interested in the results. |
|
Quoted:
Are you going to use a control in regards to optics? If so, what? Weapons and ammo specifics may cause shifts that are not optic-related, so again a control should be considered. Very interested in the results.
That is an interesting question. I'm not sure how we could establish a "control" optic. Especially since all red dots are subject to parallax. But since the goal is to measure the deviation and compare it among the sample group- it may not be needed for that purpose. Thoughts? This test is on the optic through visual observations alone. No ammo will be fired. |
|
Quoted:
That is an interesting question. I'm not sure how we could establish a "control" optic. Especially since all red dots are subject to parallax. But since the goal is to measure the deviation and compare it among the sample group- it may not be needed for that purpose. Thoughts? This test is on the optic through visual observations alone. No ammo will be fired.
I like Science as well as the next guy, but if no ammo is being fired, all of this seems like an arbitrary exercise in Theory. |
|
Quoted:
I like Science as well as the next guy, but if no ammo is being fired, all of this seems like an arbitrary exercise in Theory.
Without getting into the error that started this effort, the goal is to visually measure any deviation or irregular movement. The weapon/sight will remain fixed in position, pointed at the target. Excessive or irregular movement will be indicated by the dot appearing to move away from the intended target. This visual error would cause a shooter with imprecise head placement to re-reference the aiming dot back to the point of aim and point the weapon off line. Of course, firing live rounds would also demonstrate this, but it would add a very significant number of controls and protocols to rule out all of the other factors influencing the results, and a significant amount of time. My current line of thought is for this test to confirm, refute, and/or measure aiming dot movement and deviation due to parallax or irregular movement due to aberration. I think we can agree that you will point your rifle in a direction that references your dot to the desired POA, so if the dot moves and we can measure the deviation, this would generally indicate (but not confirm) a POI shift. It would be appropriate, if the visual test indicates something, to do a follow-on test afterward and confirm the visual observations with live rounds. Again, that would be a MUCH more involved process. Based on what I've seen, I expect the results will be very interesting. |
|
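Since the protocol measures how far the dot appears to move against a target at a known distance, the observed linear shift can be converted to an angular figure so results compare across test distances. A minimal sketch of that conversion (the function name and example numbers are illustrative, not from the test plan):

```python
import math

def displacement_to_moa(shift_inches, distance_yards):
    """Convert an observed dot shift on the target to minutes of angle."""
    distance_inches = distance_yards * 36
    angle_deg = math.degrees(math.atan2(shift_inches, distance_inches))
    return angle_deg * 60  # 60 MOA per degree

# e.g. an apparent 1 inch dot shift observed at 25 yards:
print(round(displacement_to_moa(1.0, 25), 2))  # ~3.82 MOA
```

Reporting deviations in MOA rather than raw inches would let readers compare optics even if individual stations end up at slightly different distances.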
Quoted:
That is an interesting question. I'm not sure how we could establish a "control" optic. Especially since all red dots are subject to parallax. But since the goal is to measure the deviation and compare it among the sample group- it may not be needed for that purpose. Thoughts? This test is on the optic through visual observations alone. No ammo will be fired.
Are you testing the optic, or the validity of this type of optic in a certain situation? If the former, the control could be a similar optic of known quality, i.e. an Aimpoint T1. If the latter, the control optic needs to be of a different type, i.e. irons or a quality magnified optic. My $0.02 |
|
I agree. Use a scope with adjustable parallax to establish a baseline where eye orientation does not determine POA shifts once parallax is dialed in. Then set the parallax off focus to see whether the apparent movement is truly counter to head movement, or whether there is any movement off the axis of head movement.
|
|
Quoted:
Are you testing the optic, or the validity of this type of optic in a certain situation? If the former, the control could be a similar optic of known quality, i.e. an Aimpoint T1. If the latter, the control optic needs to be of a different type, i.e. irons or a quality magnified optic. My $0.02
To your first sentence: I'm testing to measure deviation or irregular movement due to non-centered eye/optic alignment, such as can occur with inconsistent head alignment. I know I'm not being specific about what I am looking to prove. This is because I have personally seen what I am looking for, and as a result would be biased in the testing. This is why I will not be one of the testers myself. While the issue I'm dancing around here is mentioned in another thread and the FB post that started this, it is bad practice to name the defect that you are attempting to prove or disprove, because it can cause the conditions for the test to be compromised. So I am setting uniform testing conditions and letting the more independent observations and data drive the outcome. I hope that makes sense. To the second point: the T-1 will not be the only optic tested. So far, although I haven't finished compiling the list of commitments and what they are bringing, we are looking at multiple EXPS 3.0's, multiple T-1's, T-2's, Aiming Pro's, M68's, Comp M4's, Primary optic T-1 clones, and a number of other red dot sights. Of course I would love to have a very large sample group of maybe 20-50 of each optic type and 20-30 testers, but I think that the diversity and number of the sample group being tested will give us reasonable comparative results, if not at least a starting point for others to remotely replicate the test and post their results, or for someone else to take the time and effort to set up a much larger and controlled test to validate what we will do this weekend. What are your thoughts here?
To the third point: I'm not sure that irons or LPVOs would be a good comparison or control, because they are such a different form factor from reflex or red dot optics and may not be a fair basis for comparison. I'm completely open on this one and it would be easy to add, just not sure it complements the issue that I believe will be presented in the results. Thoughts? |
|
Quoted:
I agree. A scope with an adjustable parallax to get at least a base of eye orientation not determining POA shifts when adjusted for parallax. Then off focus parallax to see if parallax is truly counter to head movement or if there is any movement off the axis of head movement.
Are you talking about using the magnified optic as a control to confirm head alignment? I'm not sure how I could integrate that into the test in a way that would rule out user error. The only control I have come up with, that wouldn't involve spending $$$ to make an apparatus that keeps the user's head fixed and moves the optic on a perfect vertical and horizontal axis, is to hang a plumb line behind the optic for the tester to use as a travel reference, and to have an additional tester observe the active tester, who can notify if the tester moved inconsistently and ask him to repeat. I am definitely interested in hearing ideas on how to effectively control this possible point of user-induced inconsistency. |
|
In for the results. Hypothetically, if the test does end up confirming your observations, and assuming this issue was fixed on the T2, one would have to think Aimpoint was aware of the issue and possibly withheld the info.
I'm more of a M68CCO Aimpoint guy so no dog in this fight for me. But definitely interested in seeing where this goes. |
|
Quoted:
In for the results. Hypothetically, if the test does end up confirming your observations, and assuming this issue was fixed on the T2, one would have to think Aimpoint was aware of the issue and possibly withheld the info. I'm more of a M68CCO Aimpoint guy so no dog in this fight for me. But definitely interested in seeing where this goes.
I've discussed my observations in the other thread in this forum and in the various social media posts I've made on this. I don't want to "contaminate" this thread with them, though. I too am looking forward to seeing where this goes. I don't expect the results to be the end of this, but I hope it is a starting point for more data-driven analysis under controlled and repeatable conditions. |
|
Quoted:
That is an interesting question. I'm not sure how we could establish a "control" optic. Especially since all red dots are subject to parallax. But since the goal is to measure the deviation and compare it among the sample group- it may not be needed for that purpose. Thoughts? This test is on the optic through visual observations alone. No ammo will be fired.
Irons won't work: three items to visually align versus the two in red dots. LPVOs typically have more parallax issues than 1x red dots, so they won't help here either. I had assumed that you would be firing to see what the downrange effect of the parallax would be on all the sample optics. Obviously, this makes the experiment design much more in-depth and complicated, so I understand you sticking with your design as is. |
|
"..I had assumed that you would be firing to see what the down range effect of the parallax would be on all the sample optics.."
This. Anyway, thanks for bringing this possible issue forward. |
|
Quoted:
"..I had assumed that you would be firing to see what the down range effect of the parallax would be on all the sample optics.." This. Anyway, thanks for bringing this possible issue forward.
I can see that the live-fire thing may cause some doubts, as there may be some people who contend that, for some reason, a change in perceived position of the aiming reference would not correlate to a POI shift. I think I will add a second stage to this test. At the conclusion of the test, I will consult with the volunteer test committee. If we have any optic models that indicate irregular or excessive movement, we will take those optics and repeat the test. However, in this second iteration, an LBS-300c will be properly collimated and adjusted to the POA at the test distance that indicated the greatest irregularities. The tester will keep the red dot sight referenced at the center of the target during the head movement, and a third tester will record any laser position change at the target. This should clearly show whether the barrel is indeed deflected due to a perceived change of position of the red dot and the resulting re-aiming, without adding external ballistics and marksmanship ability into the variables. |
|
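One note on the proposed laser stage: a recorded spot displacement implies an angular deflection of the barrel, so to first order (ignoring ballistics) it scales linearly with range. A hypothetical illustration with made-up numbers, not figures from the protocol:

```python
def predicted_shift(observed_shift_in, test_dist_yd, target_dist_yd):
    """Scale a linear shift observed at one range to another range,
    assuming the underlying error is purely angular."""
    return observed_shift_in * (target_dist_yd / test_dist_yd)

# a 0.5 inch laser displacement recorded at 25 yards would imply:
print(predicted_shift(0.5, 25, 100))  # 2.0 inches at 100 yards
```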
FYI... Good video of parallax on various red dots: https://youtu.be/waeYp90OpHk
|
|
Quoted:
FYI... Good video of parallax on various red dots: https://youtu.be/waeYp90OpHk
Good video. I want to be a bit more data-driven than that, though. I would ask if we could keep this thread "clean" of any opinions of certain optics, previous test results, or discussion about specific issues. It is just good practice to prevent an expected outcome from entering into the process. I will also not be declaring a "winner". I will just post the data and summarize it in a comparative manner; the reader will have to evaluate the results. |
|
Good protocol. One does not use a control optic to measure parallax. This is an accuracy test. The grid is the control, the one factor that does not vary.
|
|
Test complete. My time estimate for the test was off. It took much more time than anticipated. We had very good results, and more importantly- the recorded observations of each individual optic were consistent across all testers.
There was one optic that surprised us all... The video of the aiming dot deviation looks pretty good as well. I'll have some time after a training class I'm running Monday to start compiling data and drafting the report. After the board of testers reviews it and concurs with the results, I'll post them. |
|
I look forward to seeing the results. I love my T1 but no optic is perfect. At the very least, knowing your optic's weak points will help you learn the best way to use it.
|
|
Interesting.
Quote from Aimless above: "Please stay on point and be polite in responding to this thread, thanks." As this subject matter is of interest to me, I will also be reading the posts here. |
|
Quoted:
Interesting. Quote from Aimless above: "Please stay on point and be polite in responding to this thread, thanks." As this subject matter is of interest to me, I will also be reading the posts here.
I am glad to hear of everyone's genuine interest- I appreciate it. I have a training engagement tomorrow, but I'll get to work on compiling the report as soon as possible. Then, as soon as the testers concur with my draft, I'll post it. |
|
Quoted:
I am glad to hear of everyone's genuine interest- I appreciate it. I have a training engagement tomorrow, but I'll get to work on compiling the report as soon as possible. Then, as soon as the testers concur with my draft, I'll post it.
You're posting technical data in a technical forum... Who'd a thunk it. Thanks for your effort. |
|
Just want to make sure we are all on the same page about what you are testing....
"Parallax Free" would refer to the dot always matching with the proper point of impact based on the original zero so that no matter where the shooter positions their head, as long as they can see the dot on the target, the round goes to the right place: Attached File Attached File Attached File Some interesting discussion here: http://www.breachbangclear.com/parallax-free-isnt/ People keep referencing EoTech's debacle as if it's similar to what this testing is attempting to measure...it's not. http://www.ar15.com/forums/t_1_5/1814710_-ARCHIVED-THREAD----F--k-Eotech-Eotech-offering-Refunds-for-HWS-Products.html&page=1 If I am recalling the allegations made in federal court against EoTech that were supported by internal documents from EoTech employees found by the .mil government staff after they starting noting issues were the following:
From the federal lawsuit's documents: "61. EOTech quickly confirmed the Norwegians’ findings. On February 2, 2007, the CTO emailed a memo explaining the defect to other EOTech employees and suggested that it be forwarded to the Norwegians. The CTO’s memo admitted that “[w]e had never looked at the sight performance at very low temperature. We had assumed the sight performed about the same at 20 degrees C ± 40 degrees C. We were quite surprised by how poorly the sight performed at -20 degrees C.” The CTO’s memo also admitted that the sight demonstrated “a completely unacceptable performance.” 62. In replying to the CTO, one sales employee asked, “do we really want to admit that we never tested the HWS at cold temperature when we list on the published specs that it operates to -40 C? . . . Also, temperatures in Afghanistan are very, very cold in the winter. We should say that our HWS performs very well (as a 1x) in both temperature extremes.” Another sales employee responded that “[s]tating that we never did low temperature testing would not make me feel comfortable. It begs the question [] what else have we not tested." ............ "71. By early 2007, Defendants thus realized that every product they had shipped to Crane and other customers failed to perform in accordance with the product’s specifications concerning cold temperature operation. If the sight was taken even to moderately cold temperatures, ranging from 32 degrees to 5 degrees Fahrenheit, the user’s aim would be affected by error ranging from 12 MOA to more than 20 MOA, i.e., more than 20 inches per 100 yards." ............ "106. In June of 2009, EOTech sent a sample of newly-manufactured sights to an independent testing laboratory to determine whether the seal on the sight was an effective barrier against ambient air and moisture. 
The laboratory checked for “gross leaks” (typically indicating that the seal can be penetrated by a liquid) by immersing the sights in fluid and checking for bubbles emanating from the seal area. The laboratory reported to Defendants that “[t]he results of the gross leak testing revealed all four devices failed hermetic testing. . . . Bubbles were evident from the devices at various locations. . . .” The laboratory identified three to four leak paths on each sight. 107. In August of 2009, the same laboratory conducted an internal vapor analysis on six sights, specifically those in use by the U.S. Special Operations Command operators, to determine whether the sights leaked the nitrogen gas that was injected into the sights to keep out moisture. The laboratory concluded that all of the sights in the sample leaked nitrogen. When EOTech then asked the laboratory to conduct additional testing to assess the rate and the path of the leaks, the laboratory reported that it was unable to complete the testing because the “[l]eak rate was greater than the instruments’ detection limit.” 108. An EOTech test engineer subsequently prepared a single-page summary of the results, reporting that two “Significant Findings” of the testing were that “[n]itrogen & moisture content reach[] near ambient levels within a 1 month period” and that the use of a “[d]essicant [sic] provides a significant improvement to moisture content but only for a short period of time (< 1 month).” According to a former EOTech optics engineer, all of the senior managers were aware of the results, including Mangano. 109. In other words, by 2009, EOTech knew that ambient air, with its relative humidity, filled the optical cavity almost immediately. Although the damaging test results pertained to the very units being shipped to Crane, EOTech did not disclose them. ............. 
Further, EoTech thought so much of their own "fix" that they began asking Law Enforcement agencies to essentially sign a waiver/letter of acknowledgement that they are aware of the temperature-related POI variation issue, as they haven't resolved it: http://www.thefirearmblog.com/blog/2016/03/18/eotech-asks-le-customers-acknowledge-poi-shift-issue-shipping-sights/ |
|
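As a quick arithmetic check on the figures quoted above from the lawsuit: 1 MOA subtends roughly 1.047 inches at 100 yards, so the stated 12 to 20+ MOA error range does work out to the "more than 20 inches per 100 yards" cited.

```python
import math

# size of 1 MOA (1/60 of a degree) at 100 yards (3600 inches)
inch_per_moa_100yd = 3600 * math.tan(math.radians(1 / 60))
print(round(inch_per_moa_100yd, 3))       # 1.047
print(round(12 * inch_per_moa_100yd, 1))  # 12.6 inches at 100 yards
print(round(20 * inch_per_moa_100yd, 1))  # 20.9 inches at 100 yards
```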
Quoted:
Just want to make sure we are all on the same page about what you are testing... [...]
I would really like to post some of the video I currently have, but I want to release it in a complete package. You and I are talking about the same thing: the parallax effect I am speaking of is the dot changing position relative to the target when the viewing angle changes. Also, this is absolutely not an EoTech vs Aimpoint issue, and I did not evaluate for thermal drift. I would really be interested in seeing test data on that effect, but that is getting off topic a bit. |
|
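For readers following along, the effect being described (the dot changing position relative to the target as the viewing angle changes) can be sketched with a simple first-order geometric model: an idealized reflex sight is collimated so the dot tracks the target at exactly one distance, and a lateral eye offset produces an apparent shift at any other distance. The numbers here are invented for illustration; this is not a model of any optic in the test:

```python
def apparent_shift(eye_offset_in, d_parallax_yd, d_target_yd):
    """Apparent dot shift (inches) on a target at d_target_yd for a sight
    collimated (parallax-free) at d_parallax_yd, first-order geometry."""
    return eye_offset_in * (d_target_yd / d_parallax_yd - 1)

# 0.25 inch lateral eye offset behind a sight collimated at 50 yards:
print(apparent_shift(0.25, 50, 50))   # 0.0  -> no shift at the collimation distance
print(apparent_shift(0.25, 50, 100))  # 0.25 -> shift grows beyond it
```

Real optics deviate from this ideal (lens aberrations, emitter geometry), which is exactly the kind of irregular movement the test is trying to observe.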
Quoted:
Without getting into the error that started this effort, the goal is to visually measure any deviation or irregular movement. [...]
Agreed. The moment you go live fire, ammo, barrel type, windage, and dirtiness/build-up would all come into question to try and invalidate a basic premise: without firing a round, if I move my head behind an RDS and the dot moves to varying degrees at varying distances, it should translate to a POI shift as well. |
|
Also, the results should be tagged, as they could answer so many questions on "which RDS is right for me?"
|
|
Quoted:
Also, the results should be tagged, as they could answer so many questions on "which RDS is right for me?"
I'll defer to the judgment of others as to how it is tagged, but I am going to stay 100% away from making any comments or recommendations toward any models in this. I'm just going to summarize the data and explain what we saw and how it was evaluated. |
|
Thanks for taking the time to do this test. I'm very interested in the results.
Max |
|
Quoted:
I would really like to post some of the video I currently have, but I want to release it in a complete package. You and I are talking about the same thing: the parallax effect I am speaking of is the dot changing position relative to the target when the viewing angle changes. Also, this is absolutely not an EoTech vs Aimpoint issue, and I did not evaluate for thermal drift. I would really be interested in seeing test data on that effect, but that is getting off topic a bit.
Feel free to amend your first post with the images I used so that people can quickly understand what you're testing. In the other thread that started before this one, several posters took your "banning of T1's due to parallax issues" to be the same problem/reason EoTech got sued. Parallax as you are attempting to measure it is a factor in all sight systems; obviously the introduction of lenses and varying lens designs can have a significant impact on how severe the parallax is. Aimpoint never pushed the T1 as a military optic, only as a more robust version of the H1 that was night vision compatible. When the FBI HRT dumped their EoTechs, they went with the T2: http://www.thefirearmblog.com/blog/2015/11/30/fbi-drops-eotech-switches-to-aimpoint/ The fact that Aimpoint introduced the T2 not too long after the introduction of the T1 says that they continued to innovate their design and found a way to make a better mousetrap in the same footprint, so if you told me it has less parallax deviation, I wouldn't be shocked.
As to your testing methods, the only way you could get a consistent sight picture while varying where the dot is in relation to the target would be to take the human element out of the equation: use a mounted camera on a single gun that can be moved to various repeatable angles and locked into place, feeding live footage to a monitor where you can take measurements to ensure that the dot is in the same relative location in each optic you test at each target you test. A human body can inject too many variables. |
|
Quoted:
Feel free to amend your first post with the images I used so that people can quickly understand what you're testing. [...]
I'm going to think about your suggestion. My only worry is that, while we are talking about it now, posting photos of what we are looking for could be inferred as bias, when we should be evaluating the data, video, etc. and coming to a conclusion about what is observed. What are your thoughts? |
|
I can see where you are coming from, but I think even if you don't use all of the photos, or if you choose to create your own diagram showing what you are testing from the shooter's perspective through the optic, it would help people understand that you are checking various optics to see which ones, if any, have issues with parallax deviation based on placement of the reticle relative to the optic's field of view at varying target distances.
|
|
Quoted:
I can see where you are coming from, but I think even if you don't use all of the photos, or if you choose to create your own diagram showing what you are testing from the shooter's perspective through the optic, it would help people understand that you are checking various optics to see which ones, if any, have issues with parallax deviation based on placement of the reticle relative to the optic's field of view at varying target distances. View Quote Do you mean for the test itself? If that's the case, yes, I do have diagrams, but more importantly, video footage through each optic that clearly shows what the testers observed. |
|
I meant in the opening post. Something that shows where you are checking to see if deviation occurs.
As to your comment about what "testers" saw: if you are going to eliminate human error factors, which you really should try to do when looking at this kind of testing, you should consider doing the following:
Testing done at a single indoor range location with a known elevation and no wind, on days and times with similar temperature and barometric pressure.
Testing done using one of these rigs that includes a remote trigger activator: Attached File
Aiming during the testing done using a live-view camera with the feed going to a large monitor where you can ensure that you are placing the reticle in the correct location for what you are trying to test across all platforms (perfectly centered / offset to the 8 o'clock position 80% of the way to the edge of the objective tube / offset to the 12 o'clock position 50% of the way to the edge of the objective tube / etc.)
Camera mounted to a quality ball head that you can repeat angles/positions with: Attached File
Camera ball attached to a clamp and an adjustable monopod of proper length. The clamp would attach to the buffer tube in an indexed location that would place the camera lens in the same area that a shooter's eye would be positioned: Attached File
Now all you have to do is swap optics out, zeroing each optic at the start per the manufacturer's recommended distances and techniques. Once zeroed, start putting holes in targets, adjusting the point of aim and camera view for each shot before you change optics and repeat.
When you leave the human in the equation you can get everything from ocular distortion to movement due to breathing, as well as the normal anticipation/pushing of the trigger/gun and so on. If you don't take those variables out of the equation your test isn't repeatable. |
|
Quoted:
I meant in the opening post. Something that shows where you are checking to see if deviation occurs. As to your comment about what "testers" saw: if you are going to eliminate human error factors, which you really should try to do when looking at this kind of testing, you should consider doing the following: Testing done at a single indoor range location with a known elevation and no wind, on days and times with similar temperature and barometric pressure. Testing done using one of these rigs that includes a remote trigger activator: https://www.AR15.Com/media/mediaFiles/114798/remote-trigger-164416.JPG Aiming during the testing done using a live-view camera with the feed going to a large monitor where you can ensure that you are placing the reticle in the correct location for what you are trying to test across all platforms (perfectly centered / offset to the 8 o'clock position 80% of the way to the edge of the objective tube / offset to the 12 o'clock position 50% of the way to the edge of the objective tube / etc.) Camera mounted to a quality ball head that you can repeat angles/positions with: https://www.AR15.Com/media/mediaFiles/114798/camera-mount-164420.JPG Camera ball attached to a clamp and an adjustable monopod of proper length. The clamp would attach to the buffer tube in an indexed location that would place the camera lens in the same area that a shooter's eye would be positioned: https://www.AR15.Com/media/mediaFiles/114798/Clamp-164421.JPG Now all you have to do is swap optics out, zeroing each optic at the start per the manufacturer's recommended distances and techniques. Once zeroed, start putting holes in targets, adjusting the point of aim and camera view for each shot before you change optics and repeat. When you leave the human in the equation you can get everything from ocular distortion to movement due to breathing, as well as the normal anticipation/pushing of the trigger/gun and so on. If you don't take those variables out of the equation your test isn't repeatable.
View Quote I understand what you are saying. I actually looked into several different setups like the ones you linked. We did use rests to secure the rifles and ensure they remained oriented. I did contemplate the human factor and explored a few camera apparatuses to rule out inconsistent head position. But then I stepped back and restated the purpose of the test: to measure deviation due to head position as it presents in user inconsistencies. So I asked myself- why eliminate user inconsistency when it comes to viewing angle? At least in my assessment, if the user was not moving perfectly on the x and y axes, he would at least be consistent in his inconsistency on that sight. We set mild controls to limit this a bit by having another user observe the tester, and by having the tester complete the axis movement three consecutive times to ensure that if there was deviation, it was consistent all three times. I felt a camera apparatus would be useful for measuring the precise angle at which deviation occurred, and to what extent, but I saw this as a starting point. So my decision was to use a more user-based process to see if any deviation was actually present and, if there was, whether we could confirm it with multiple testers. Would the testers' data sheets (which the other testers did not see) match each other's perceived deviation? Would the data sheets match the deviation recorded in the video? Would the deviation change at variable distance? While the testing process definitely could have been better equipped to rule out some of these variables, I think the consistency or inconsistency of the individual tester forms will be the best judge. I didn't receive any funding for this test, so the wife would have killed me if I went shopping....I did take pictures of all of our setups, so you will be able to see it all in the report. So far, the data doesn't point to any inconsistencies, even with the basic setup we used. |
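[Editor's note: the "consistent in his inconsistency" check described above, comparing three repetitions per tester and agreement between testers, could be summarized with a short script like the following hypothetical sketch. The tester names and MOA values are invented for illustration and are not the actual test data.]

```python
# Hypothetical summary of per-tester repeatability: small per-tester spread
# with agreeing means suggests the deviation is inherent to the optic;
# small per-tester spread with differing means suggests a user-dependent shift.
from statistics import mean, stdev

# Deviation in MOA recorded across three repetitions of the same off-axis
# sight picture (illustrative values only).
recorded = {
    "tester_1": [2.0, 2.1, 1.9],
    "tester_2": [2.2, 2.0, 2.1],
    "tester_3": [1.8, 2.0, 1.9],
}

for tester, reps in recorded.items():
    print(f"{tester}: mean {mean(reps):.2f} MOA, spread {stdev(reps):.2f} MOA")

means = [mean(reps) for reps in recorded.values()]
print(f"between-tester spread: {stdev(means):.2f} MOA")
```

This is only a way of organizing the blind data sheets mentioned above; it does not replace the rig-based controls other posters suggested.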
|
Actually, if you swing by the Green Eye Tactical FB page or IG, you'll see a picture of one of the test optics on a rest during testing.
|
|
Quoted:
I meant in the opening post. Something that shows where you are checking to see if deviation occurs. As to your comment about what "testers" saw: if you are going to eliminate human error factors, which you really should try to do when looking at this kind of testing, you should consider doing the following: Testing done at a single indoor range location with a known elevation and no wind, on days and times with similar temperature and barometric pressure. Testing done using one of these rigs that includes a remote trigger activator: https://www.AR15.Com/media/mediaFiles/114798/remote-trigger-164416.JPG Aiming during the testing done using a live-view camera with the feed going to a large monitor where you can ensure that you are placing the reticle in the correct location for what you are trying to test across all platforms (perfectly centered / offset to the 8 o'clock position 80% of the way to the edge of the objective tube / offset to the 12 o'clock position 50% of the way to the edge of the objective tube / etc.) Camera mounted to a quality ball head that you can repeat angles/positions with: https://www.AR15.Com/media/mediaFiles/114798/camera-mount-164420.JPG Camera ball attached to a clamp and an adjustable monopod of proper length. The clamp would attach to the buffer tube in an indexed location that would place the camera lens in the same area that a shooter's eye would be positioned: https://www.AR15.Com/media/mediaFiles/114798/Clamp-164421.JPG Now all you have to do is swap optics out, zeroing each optic at the start per the manufacturer's recommended distances and techniques. Once zeroed, start putting holes in targets, adjusting the point of aim and camera view for each shot before you change optics and repeat. When you leave the human in the equation you can get everything from ocular distortion to movement due to breathing, as well as the normal anticipation/pushing of the trigger/gun and so on. If you don't take those variables out of the equation your test isn't repeatable.
View Quote From what I understand, the OP is measuring perceived user error when using the subject sights. IOW, the OP is seeking to demonstrate that there is a certain amount of optic-induced error on the part of the user, and which error is common to most users given the specified optic device. Eliminating the human element from the equation is impossible, since the whole point of the demonstration is to show how the human element interacts with such optics. I have no dog in this hunt, and have been contemplating buying a high-end RDS for some time. Don't own an Aimpoint right now. I await the results of the test, and will not speculate till then. |
|
Quoted:
From what I understand, the OP is measuring perceived user error when using the subject sights. IOW, the OP is seeking to demonstrate that there is a certain amount of optic-induced error on the part of the user, and which error is common to most users given the specified optic device. Eliminating the human element from the equation is impossible, since the whole point of the demonstration is to show how the human element interacts with such optics. I have no dog in this hunt, and have been contemplating buying high-end RDS for some time. Don't own an Aimpoint right now. I await results of the test, and will not speculate till then. View Quote If memory serves me right...and please speak up if it doesn't... This thread started because of another thread where "dopushups" was quoted as saying there was a deficiency in the design of the T1 Aimpoint which contributes to parallax shift of such significance that he will not allow people to use it in the classes he instructs. "dopushups" mentioned that the T1 has an additional lens that's been removed from the T2 design. As to the testing methods... If you are going to make a claim and issue a standing ban on the use of a product because of your assessment of it's design, then yes, you do take the human element out of the testing to see if the physical design is the issue or if it's a user induced issue. As it stands, before OP ever started the testing, he already had anecdotal observations that there was a problem, the testing should be designed to isolate it down to a hardware test. If you determine there's no hardware issue then you move onto shooter technique and you can examine ergonomics as a component of the shooter technique. ...At least that's how I see it from a scientific approach. Dopushups- I appreciate you trying to tackle this assessment and I recognize there are financial constraints. 
I'd only ask that you consider that you are putting quite a bit of text/time/info out that has the appearance of scientific approach and as a result of that appearance, your "findings" could have a significant impact on the optic market. The last thing you want to do is call it "good enough because it's all I could afford to do" without being more upfront about that, especially when you are telling people they can't use "xyz" product because your "testing and findings" showed that their product had "issues/problems/faults". I say all of that after watching the "Fireclean" thing play out. Fireclean ended up suing a few of the posters here who did "testing" here at AR15.com that seemed to indicate that the product was corn oil. Now consider that you are doing the same with some big name brands out there like EoTech, Aimpoint, Zeiss, Trijicon, Burris, Vortex, Bushnell, ect as well as some of the smaller players like Holosun, Primary Arms, Trueglo, ect. If your testing paints a picture that a company's product was put through a through scientific test and you found it faulty enough that you don't allow people to use it or you don't recommend it...and your testing method can be challenged...you may be getting more attention than you were planning on. Frankly, I wouldn't be shocked if you were able to find a few manufactures who would be willing to partner with you to run this test, providing you with the equipment and location to conduct the test. Reaching out to them with what you've already laid out and what you may already have, along with providing a brief plan of what you want to do to improve/finalize your results before releasing to the public, you'd probably have them lined up willing to help. Good luck and thanks for putting this project together! |
|
Quoted:
If memory serves me right...and please speak up if it doesn't... This thread started because of another thread where "dopushups" was quoted as saying there was a deficiency in the design of the T1 Aimpoint which contributes to parallax shift of such significance that he will not allow people to use it in the classes he instructs. "dopushups" mentioned that the T1 has an additional lens that's been removed from the T2 design. As to the testing methods... If you are going to make a claim and issue a standing ban on the use of a product because of your assessment of its design, then yes, you do take the human element out of the testing to see if the physical design is the issue or if it's a user-induced issue. As it stands, before the OP ever started the testing, he already had anecdotal observations that there was a problem; the testing should be designed to isolate it down to a hardware test. If you determine there's no hardware issue, then you move on to shooter technique, and you can examine ergonomics as a component of the shooter technique. ...At least that's how I see it from a scientific approach. Dopushups- I appreciate you trying to tackle this assessment and I recognize there are financial constraints. I'd only ask that you consider that you are putting quite a bit of text/time/info out that has the appearance of a scientific approach, and as a result of that appearance, your "findings" could have a significant impact on the optic market. The last thing you want to do is call it "good enough because it's all I could afford to do" without being more upfront about that, especially when you are telling people they can't use "xyz" product because your "testing and findings" showed that their product had "issues/problems/faults". I say all of that after watching the "Fireclean" thing play out. Fireclean ended up suing a few of the posters here who did "testing" here at AR15.com that seemed to indicate that the product was corn oil.
Now consider that you are doing the same with some big-name brands out there like EoTech, Aimpoint, Zeiss, Trijicon, Burris, Vortex, Bushnell, etc. as well as some of the smaller players like Holosun, Primary Arms, Trueglo, etc. If your testing paints a picture that a company's product was put through a thorough scientific test and you found it faulty enough that you don't allow people to use it or you don't recommend it...and your testing method can be challenged...you may be getting more attention than you were planning on. Frankly, I wouldn't be shocked if you were able to find a few manufacturers who would be willing to partner with you to run this test, providing you with the equipment and location to conduct the test. Reaching out to them with what you've already laid out and what you may already have, along with providing a brief plan of what you want to do to improve/finalize your results before releasing to the public, you'd probably have them lined up willing to help. Good luck and thanks for putting this project together! View Quote I appreciate that feedback and share many of your concerns. I'm going to mull this over today and tomorrow. When I get back to the office Tuesday, I'll compile what I have and get it into a logical format. I'm still going to get the testers' approval of the draft before moving forward. At that point, I had intended to pass the draft off to some select people for pre-review. I'm going to think about adding some people here to that distro list. Once those concerned see what I have, I'll listen to their feedback as to whether it should go as is or if there needs to be additional testing. Due to this time of year being busy for me with training courses, it may extend the release of the results significantly. I would be willing to include others in the testing that are interested in improving the results, though. |
|
I'm also quite interested in this testing! Looking forward to the results.
|
|
Just to update those following: my timeline for getting the report up has slid right a day. Had a short-notice training event today. I should be able to start compiling the test reports tomorrow.
|
|
Quoted:
Camera mounted to a quality ball head that you can repeat angle/positions with: Attached File View Quote You wouldn't want to use a ball head if you want repeatable camera movements. Movements of the shooter's eye in the eyebox are mostly lateral. You'd want something like a 2-axis or 3-axis macro head that lets you finely move the camera left, right, up, and down. Bolt that to a block of wood or aluminum with a steel pic rail for mounting optics at the same distance from the camera. Also, instead of looking at the live view, I would just capture high-resolution images for review later. You can count the grid or pixels more accurately in software than off a monitor with a ruler. |
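[Editor's note: a minimal sketch of the offline image-review step suggested above: convert a reticle shift counted in pixels into an angular deviation in MOA, assuming a pixels-per-inch calibration taken from a grid of known size in the target plane. The function name and all numbers are illustrative assumptions, not part of the original test.]

```python
import math

def shift_to_moa(shift_px: float, px_per_inch: float, distance_yd: float) -> float:
    """Angular deviation (MOA) for a measured pixel shift at a known target distance."""
    shift_in = shift_px / px_per_inch      # apparent shift in the target plane, inches
    distance_in = distance_yd * 36.0       # target distance converted to inches
    angle_deg = math.degrees(math.atan2(shift_in, distance_in))
    return angle_deg * 60.0                # 1 degree = 60 MOA

# e.g. a 20-pixel shift at a 10 px/inch calibration, target at 50 yards:
print(round(shift_to_moa(20, 10, 50), 2))  # → 3.82
```

Counting pixels against a calibrated grid and converting in software like this avoids the ruler-on-monitor error the post above describes.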
|