Imprecision of measuring tool versus thickness specification and allowed spread of data.

#1
We are manufacturing a device with a plastic coating on a lens. The thickness specification is 42 microns +/-6, and we have a maximum allowed spread of 6 microns.

Data values taken at 4 points might be 42, 39, 41, 36. The spread (or precision per the "standard" below) of these values is 6.

The only measuring device we currently have has a repeatability and accuracy of +/-1.5 microns. For determining how much of the thickness specification this device takes up, I think it's just 3/12 (3 microns of gauge range against 12 microns of tolerance range), which is 25% of that +/-6 tolerance. For the precision range, would that be calculated the same way, resulting in 50%?
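
As a sanity check, here is that arithmetic written out in Python (a rough sketch; I am assuming the full gauge range of 3 microns is compared against the full widths of the tolerance and the spread limit):

# Gauge error as a fraction of the tolerance and of the allowed spread.
# Assumes worst-case ranges: +/-1.5 um gauge -> 3 um, 42 +/-6 um -> 12 um.
readings = [42, 39, 41, 36]                  # microns, 4 measurement points
spread = max(readings) - min(readings)       # 42 - 36 = 6 um

gauge_range = 2 * 1.5                        # +/-1.5 um -> 3 um total
tolerance_range = 2 * 6                      # 42 +/-6 um -> 12 um total
spread_limit = 6                             # maximum allowed spread, um

print(spread)                                # 6
print(gauge_range / tolerance_range * 100)   # 25.0 -> % of the tolerance
print(gauge_range / spread_limit * 100)      # 50.0 -> % of the spread limit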

I am posting an ISO standard since I am trying to use standard definitions, which seem to vary quite a bit - "trueness" instead of "accuracy", for example.



https://en.wikipedia.org/wiki/Accuracy_and_precision

#2
Do you have a set of calibration standards? http://www.elcometerusa.com/Coating-Inspection/Calibration-Foils-Standards/

Once you know the repeatability of your measuring device, you can adjust your specs at inspection. Do you care about the overall thickness of the glass, or just the variation in the thickness?

If you do have a target thickness, let's say 40 microns +/-3, then you know the measuring device could add or subtract 1.5 microns. You'd then have to adjust the specs to 40 +/-1.5 microns, which boxes you in to the range (38.5, 41.5) - that's pretty tight.

If you don't care about the thickness, just the variation, then again the device's error is subtracted and the +/-3.0 becomes +/-1.5, essentially giving you a range of 3 microns from max to min.
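
As a rough sketch of that adjustment in Python (using the example 40 +/-3 numbers; this is only a worst-case subtraction of the gauge error, not a formal gauge R&R study):

# Guard-band the acceptance limits by the gauge error (worst-case only).
target = 40.0        # um, example target thickness
tolerance = 3.0      # um, +/- spec limit
gauge_error = 1.5    # um, +/- accuracy/repeatability of the gauge

guard_band = tolerance - gauge_error              # 1.5 um
lower, upper = target - guard_band, target + guard_band
print(lower, upper)                               # 38.5 41.5 -> tightened inspection window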
 


#3
Much thanks on the standards.

We can't adjust our specs since they are required for correct operation, which makes finding a more precise thickness gauge valuable for reducing false rejects and false accepts. Without knowing any other statistics, I believe the gauge takes up 25% of the 42 +/-6 um tolerance.
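
For illustration, here is the same worst-case guard-banding applied to our actual 42 +/-6 um spec (a sketch only; it ignores the process statistics):

# Tighten the acceptance window by the gauge error so a reading near the
# limit cannot hide a truly out-of-spec part (worst-case guard banding).
nominal = 42.0       # um
spec_tol = 6.0       # um, +/- design tolerance (cannot be changed)
gauge_error = 1.5    # um, +/- gauge accuracy/repeatability

accept_tol = spec_tol - gauge_error                    # 4.5 um
print(nominal - accept_tol, nominal + accept_tol)      # 37.5 46.5

Parts measuring between 36 and 37.5 or between 46.5 and 48 could be either good or bad, which is where the false rejects come from.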

My uncertainty was whether I should apply the same calculation to the spread, but perhaps so: the measurement error is applied to the values first, and then the "worst case" spread is calculated. In this case I think you are agreeing that a measuring device with +/-1.5 microns accuracy and repeatability takes up 50% of that spread specification.
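
As a rough sketch of why (the true values below are made up for illustration): the gauge can read the thickest point up to 1.5 microns high and the thinnest point up to 1.5 microns low, so it can inflate the measured spread by up to 3 microns, i.e. 50% of the 6 micron spread limit.

# Worst-case effect of the gauge on the measured spread.
# Assumes the thickest point reads 1.5 um high and the thinnest 1.5 um low.
gauge_error = 1.5                            # um, +/- per reading
true_values = [40.5, 39.0, 40.0, 37.5]       # made-up true thicknesses, um

true_spread = max(true_values) - min(true_values)    # 3.0 um
worst_case_spread = true_spread + 2 * gauge_error    # 6.0 um

print(true_spread, worst_case_spread)        # 3.0 6.0
print(2 * gauge_error / 6 * 100)             # 50.0 -> % of the 6 um spread limit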

Is it true, per the graph in the ISO spec, that this "spread" (upper limit minus lower limit of the measurements) is precision, and also repeatability and reproducibility? From what I have read those are different, but the implication in this ISO spec is that they are the same.




The gauge we are using is very accurate for the price. There are thickness sensors with 10 times the precision and accuracy, but at 30 times the price.