My problem has two steps:

Step 1- I place both temperature sensors in the same environment. Even though they should be reading the same temperature, I can expect a small offset due to internal differences between the sensors.

Step 2- I place both temperature sensors in different environments. I now expect there to be differences in the data from each sensor as they are in different environments, but I also know that some of that difference is due to the offset found in Step 1.

My question becomes:

How can I quantify the error found in Step 1 and create a correction factor? (Perhaps using a regression analysis?)

How can I apply the above correction factor to Step 2?

As an example, here is some dummy data:

Step 1-

Sensor 1   Sensor 2
   10         10
   10          9
   11         11
   12         11.8
   13         12.9
   14         13.9

It looks like Sensor 2 consistently reads a little low (or, conversely, that Sensor 1 reads a little high). Since I don't know the actual temperature, this is a relative problem. The question for Step 1 is: how do I calculate a correction factor that would make these two sets of numbers roughly equal (as they should be, since they were reading the same temperatures)?
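For what it's worth, the simplest correction I can think of is just the mean offset between the two sensors over the Step 1 data. This is only a sketch of that idea, not a full calibration:

```python
# Step 1 calibration data: both sensors in the same environment.
s1 = [10, 10, 11, 12, 13, 14]
s2 = [10, 9, 11, 11.8, 12.9, 13.9]

# Simplest possible correction factor: the mean offset between the sensors.
# A positive value means Sensor 2 reads low by that amount on average.
offset = sum(a - b for a, b in zip(s1, s2)) / len(s1)
print(offset)
```

Adding `offset` to every Sensor 2 reading would then put both sensors on the same scale, but only if the disagreement is a constant shift rather than one that grows with temperature.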

Step 2-

Sensor 1   Sensor 2
   10         13
   11         14
   12         14
   12         14
   13         14

Now that the sensors are in different environments, how would I apply the correction factor from Step 1 to these numbers?

My first thought would be to run a linear regression. In this case, for the first set of numbers, the regression analysis yields the equation y = 1.0925x - 1.3125. But I'm not sure how to apply this equation to the second set of numbers.
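One way to apply that fit, sketched below: the regression predicts what Sensor 2 would read given a Sensor 1 reading, so inverting it maps Sensor 2's Step 2 readings onto Sensor 1's scale, and any difference that remains after that correction can be attributed to the environments rather than the sensors:

```python
import numpy as np

# Step 1 calibration data (same environment).
s1 = np.array([10, 10, 11, 12, 13, 14], dtype=float)
s2 = np.array([10, 9, 11, 11.8, 12.9, 13.9], dtype=float)

# Ordinary least squares fit predicting Sensor 2 from Sensor 1.
slope, intercept = np.polyfit(s1, s2, 1)
# slope ~ 1.0925, intercept ~ -1.3125, matching the equation above.

# Step 2 data (different environments).
s1_new = np.array([10, 11, 12, 12, 13], dtype=float)
s2_new = np.array([13, 14, 14, 14, 14], dtype=float)

# Invert the fit to put Sensor 2's readings on Sensor 1's scale,
# then compare with Sensor 1 directly.
s2_on_s1_scale = (s2_new - intercept) / slope
env_difference = s2_on_s1_scale - s1_new
print(env_difference)
```

This assumes the Step 1 relationship between the sensors stays the same in Step 2, which seems to be the premise of the question.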

And as a second thought, an ordinary linear regression assumes all of the error is in the Y column. But in this case the error can be assumed to be in both columns, so perhaps I should be using a total least squares (orthogonal regression) analysis instead (though I know very little about this kind of stats).
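If the errors in both sensors are assumed to have equal variance, orthogonal regression (Deming regression with a variance ratio of 1) treats them symmetrically. A minimal sketch, using the first principal component of the centered data via SVD:

```python
import numpy as np

# Step 1 calibration data (same environment).
s1 = np.array([10, 10, 11, 12, 13, 14], dtype=float)
s2 = np.array([10, 9, 11, 11.8, 12.9, 13.9], dtype=float)

# Center the data, then take the first right singular vector:
# this is the total least squares (orthogonal regression) line,
# which minimizes perpendicular distances and so treats error
# in both sensors symmetrically.
x = s1 - s1.mean()
y = s2 - s2.mean()
_, _, vt = np.linalg.svd(np.column_stack([x, y]))
direction = vt[0]                      # direction of the fitted line
slope = direction[1] / direction[0]
intercept = s2.mean() - slope * s1.mean()
print(slope, intercept)
```

The slope comes out slightly different from the ordinary least squares slope; with only six calibration points, though, neither fit should be over-interpreted.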