I have two questions on how to compare measurements of what is theoretically the same measurand (gas-phase temperature), obtained with two different approaches:

Method 1) mean temperature over 30 experiments: 733 K

- Adiabatic compression: using initial pressure, initial temperature, heat capacity, etc. to calculate the gas-phase temperature

- combined uncertainty from error propagation: ±4 K

- Std. deviation / repeatability: ±0.36 K

==> 733 ± 4 K
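A minimal sketch of how the Method 1 value and its combined uncertainty could be computed, assuming the isentropic relation T2 = T1·(p2/p1)^((γ−1)/γ) and first-order (GUM-style) error propagation; all input values and uncertainties below are hypothetical placeholders, not the actual experimental data:

```python
import math

def adiabatic_T2(T1, p1, p2, gamma):
    """Isentropic compression: T2 = T1 * (p2/p1)**((gamma-1)/gamma)."""
    return T1 * (p2 / p1) ** ((gamma - 1.0) / gamma)

def combined_uncertainty(f, x, u, eps=1e-6):
    """First-order (GUM) propagation: u_c = sqrt(sum((df/dx_i * u_i)^2)),
    with partial derivatives estimated by central finite differences."""
    uc2 = 0.0
    for i in range(len(x)):
        h = eps * max(abs(x[i]), 1.0)
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        dfdx = (f(*xp) - f(*xm)) / (2.0 * h)
        uc2 += (dfdx * u[i]) ** 2
    return math.sqrt(uc2)

# Hypothetical inputs (NOT the actual experimental values):
T1, p1, p2, gamma = 300.0, 1.0e5, 2.0e6, 1.35   # K, Pa, Pa, -
u = [1.0, 1.0e3, 2.0e4, 0.005]                  # standard uncertainties

T2 = adiabatic_T2(T1, p1, p2, gamma)
u_T2 = combined_uncertainty(adiabatic_T2, [T1, p1, p2, gamma], u)
print(f"T2 = {T2:.0f} K, u_c(T2) = {u_T2:.1f} K")
```

This mirrors the usual GUM recipe: evaluate the model once for the best estimate, then sum the squared sensitivity-weighted input uncertainties.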

Method 2) mean temperature over 30 experiments: 740 K

- Laser thermometry (N2-RS)

- no combined uncertainty budget, due to the complexity of the method

- Std. deviation: ±56 K over the same 30 individual measurements

--> resulting standard error of the mean (short: SE): ±10 K

==> 740 ± 10 K
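The standard-error step for Method 2 can be checked directly from SE = s/√n, using the quoted sample standard deviation and number of repeats:

```python
import math

s = 56.0   # sample standard deviation of the 30 N2-RS measurements, K
n = 30     # number of repeated measurements

se = s / math.sqrt(n)   # standard error of the mean
print(f"SE = {se:.1f} K")  # ~10.2 K, consistent with the quoted +-10 K
```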

Question 1)

Is the overall approach shown for both methods generally correct?

Question 2)

How can I compare both methods in relative terms (i.e., a %-error)?

Simply calculating the relative deviation of Method 2 from Method 1 as (733 − 740)/733 ≈ −0.95 % is probably wrong, since it completely ignores the uncertainties of both results?
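The arithmetic above can be sketched as follows; alongside the naive relative difference, one uncertainty-aware option (borrowed from interlaboratory comparisons, e.g. the ζ-score of ISO 13528, not necessarily the only valid choice here) is to normalize the difference by the combined standard uncertainty of both results:

```python
import math

T1, u1 = 733.0, 4.0    # Method 1: adiabatic compression, K
T2, u2 = 740.0, 10.0   # Method 2: N2-RS thermometry, K

# Naive relative difference (ignores both uncertainties):
rel = (T1 - T2) / T1
print(f"relative difference = {rel:+.2%}")   # about -0.95 %

# Normalized difference (zeta-score style):
#   zeta = (T1 - T2) / sqrt(u1^2 + u2^2)
# |zeta| <~ 2 suggests the two results agree within their
# combined standard uncertainties.
zeta = (T1 - T2) / math.sqrt(u1**2 + u2**2)
print(f"zeta = {zeta:+.2f}")   # about -0.65, i.e. consistent
```

With these numbers, the ±56 K scatter of Method 2 dominates, so the 7 K difference in the means is well within the combined uncertainty.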

Best and thanks