Two different temperature measurements - Comparing results - Relative error

Hello everyone,

I have two questions on how to compare measurements of what is theoretically the same measurand (the gas-phase temperature), obtained with two different approaches:

Method 1) mean temperature over 30 experiments: 733 K
- Adiabatic compression: the gas-phase temperature is calculated from the initial pressure, initial temperature, heat capacity, etc.
- Combined uncertainty from error propagation: ±4 K
- Std. deviation / repeatability: ±0.36 K

==> 733 ± 4 K

Method 2) mean temperature over 30 experiments: 740 K
- Laser thermometry (N2-RS)
- No combined uncertainty, due to the complexity of the method
- Std. deviation from the same 30 individual measurements: ±56 K
--> Resulting standard error of the mean (short: SE): ±10 K

==> 740 ± 10 K
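For what it's worth, the quoted SE follows directly from the standard-error formula SE = SD / √N, with the numbers taken from the post:

```python
import math

sd = 56.0  # standard deviation of the 30 laser-thermometry readings, in K
n = 30     # number of experiments

# standard error of the mean: SE = SD / sqrt(N)
se = sd / math.sqrt(n)
print(f"SE = {se:.1f} K")  # ~10 K, matching the quoted +-10 K
```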

Question 1)
Is the overall approach shown for both methods generally correct?

Question 2)
How can I compare the two methods in relative terms (i.e., as a % error)?
Simply calculating the deviation of method 1 from method 2 as (733-740)/733 = -0.95% is probably wrong, since it completely ignores the uncertainties of both results?
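One standard way to compare two results that each carry an uncertainty (used, e.g., in interlaboratory comparisons) is the normalized error E_n = |x1 − x2| / √(u1² + u2²): a value well below 1 means the two results agree within their combined uncertainty. A sketch with the numbers from the post (note this assumes the ±4 K and ±10 K are stated at the same coverage level, which is worth checking):

```python
import math

t1, u1 = 733.0, 4.0   # method 1: adiabatic compression, K
t2, u2 = 740.0, 10.0  # method 2: laser thermometry (SE of the mean), K

# normalized error: difference divided by the combined uncertainty
# (assumes u1 and u2 are quoted at the same coverage level)
e_n = abs(t1 - t2) / math.sqrt(u1**2 + u2**2)
print(f"E_n = {e_n:.2f}")  # below 1 -> the two results are consistent

# plain relative difference, if a percentage is still wanted
rel = (t1 - t2) / t1 * 100
print(f"relative difference = {rel:.2f} %")
```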

Best and thanks


Well-Known Member
I assume that the idea is to possibly use the (cheap) laser gadget in place of the (expensive) traditional method 1. The big question is how the %uncertainty in a determination using method 1 compares with the %uncertainty using method 2.
Both methods give comparable levels, so it won't matter much whether you use 733 or 740, or anything around there, to get the relative SD. The laser gadget can probably be calibrated anyway.
If I have read your post right, Method 1 gives a %SD of ±4/737 = ±0.54% for any particular determination, compared with the laser gadget, which has a %SD of ±56/737 = ±7.6%.
So, accurate and expensive or cheap and rough?
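Those relative figures can be reproduced in a couple of lines (737 K being roughly the midpoint of the two means):

```python
mid = 737.0  # rough midpoint of the two mean temperatures (733 K and 740 K)

# method 1: combined uncertainty of a single determination
pct_1 = 4.0 / mid * 100
# method 2: single-shot standard deviation of the laser measurement
pct_2 = 56.0 / mid * 100
print(f"method 1: +-{pct_1:.2f} %, method 2: +-{pct_2:.1f} %")
```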