Best way to determine delta trends - linear gradient?

Hi all - very much a novice here.

I have two different systems that analyze the concentration of x in a sample multiple times in a given period (e.g. one month). I'm looking for a method to determine how the delta between the two measurements is trending. The delta should be fairly consistent over the range of measured values, and I need to identify whether the trend of the deltas is changing above some level of significance over the period.

As the deltas should be consistent, is looking at the gradient of the linear trendline the right approach? E.g. if I expect a consistent delta, then in a plot of delta vs. reading # the trendline would have a gradient of zero in the perfect case. A non-zero gradient would imply some trend in the deltas, and I just need to work out what my threshold of concern is. My worry is that a trend in one direction early in the period could be offset by a trend in the other direction later in the period, leaving the overall gradient near zero even though the deltas were drifting.
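For what it's worth, here's a rough sketch of the gradient idea as I understand it: fit an ordinary least-squares line to delta vs. reading number and look at the slope and its standard error. The function and the sample deltas below are just made up for illustration, not my real data.

```python
# Sketch of the trendline-gradient idea: OLS fit of delta vs. reading
# number, returning the slope and its standard error so the slope can
# be compared against some significance threshold. Pure Python.

def delta_trend(deltas):
    """Return (slope, stderr) of the OLS line through (reading #, delta)."""
    n = len(deltas)
    x = list(range(n))
    x_mean = sum(x) / n
    y_mean = sum(deltas) / n
    sxx = sum((xi - x_mean) ** 2 for xi in x)
    sxy = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, deltas))
    slope = sxy / sxx
    intercept = y_mean - slope * x_mean
    # Residual sum of squares -> standard error of the slope (needs n > 2)
    ss_res = sum((yi - (intercept + slope * xi)) ** 2
                 for xi, yi in zip(x, deltas))
    stderr = (ss_res / (n - 2) / sxx) ** 0.5
    return slope, stderr

# Hypothetical deltas with a slight upward drift
deltas = [1.02, 0.98, 1.05, 1.10, 1.07, 1.15, 1.12, 1.20]
slope, stderr = delta_trend(deltas)
print(f"slope = {slope:.4f} per reading, stderr = {stderr:.4f}")
```

My thinking is that a slope several standard errors away from zero would flag a real drift, but as noted above this overall gradient could still come out near zero if the trend reverses partway through the period.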

Is there a better approach - a different type of trendline perhaps?

Thanks - Chris