goodness of linear fit when the slope is flat

Lukas

New Member
#1
I am working with some concentration vs time data --> exponential decay.

If I take the log of my concentration data, I can easily fit a linear regression to it, and I get good R-squared values.
However, I suspect R-squared is not telling me what I want to know: how well the fitted line actually matches the data.

R-squared gets smaller as the slope of the fitted line gets flatter, even when the fit actually looks quite good.
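
Here is a quick simulation of what I mean (just a rough sketch with made-up slopes and noise, not my real data): the same scatter around the line gives a much lower R-squared when the slope is nearly flat, even though the residual standard error barely changes.

```python
# Sketch: identical noise, one steep and one nearly flat slope on the log scale.
# All numbers here are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 30)                   # time points
noise = rng.normal(0, 0.05, t.size)          # same noise in both cases

for slope in (-0.5, -0.02):                  # steep vs. nearly flat decay rate
    log_c = 2.0 + slope * t + noise          # log(concentration) over time
    fit = stats.linregress(t, log_c)
    resid = log_c - (fit.intercept + fit.slope * t)
    rse = np.sqrt(np.sum(resid**2) / (t.size - 2))   # residual standard error
    print(f"slope={slope:+.2f}  R^2={fit.rvalue**2:.3f}  residual SE={rse:.3f}")
```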

How else can I describe goodness of fit for a linear curve fit when the slope of the line is very small?

thanks for your help
 

TheEcologist

Global Moderator
#2
Lukas said:
I am working with some concentration vs time data --> exponential decay.

If you don’t trust R-squared, you could try the following:

Calculate “predicted” values with your linear model using a large and a small slope. Then run a chi-squared goodness-of-fit test for each set of predictions, treating the predictions as the expected values and your original data as the observed values (once for the large-slope and once for the small-slope predictions). The model that gives the lower p-value has the worse fit. This way you can check whether the trend you see in R-squared holds up.
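
Something along these lines (just a rough sketch: the data, the two candidate models, and the degrees-of-freedom choice of n minus the 2 fitted parameters are all placeholder assumptions; swap in your own measurements and fitted coefficients):

```python
# Sketch: Pearson chi-squared goodness-of-fit statistic for two candidate models.
# The arrays and coefficients below are invented placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 30)                                            # time points
conc = 7.0 * np.exp(-0.3 * t) * np.exp(rng.normal(0, 0.05, t.size))   # "observed" data

# Two candidate linear models on the log scale, as (intercept, slope).
candidates = {"steep": (np.log(7.0), -0.3), "flat": (np.log(7.0), -0.02)}

for name, (b0, b1) in candidates.items():
    expected = np.exp(b0 + b1 * t)                           # back-transformed predictions
    chi2_stat = np.sum((conc - expected) ** 2 / expected)    # Pearson chi-squared statistic
    dof = t.size - 2                                         # n minus the two fitted parameters
    p_value = stats.chi2.sf(chi2_stat, dof)
    print(f"{name}: chi2 = {chi2_stat:.2f}, p = {p_value:.3g}  (lower p = worse fit)")
```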

However, before you do the above:

You mention exponential decline/decay; have you tried a log transformation before doing the regression?
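
If not, the transform-then-fit step would look something like this (again a sketch, with invented numbers standing in for your time and concentration arrays):

```python
# Sketch: log-transform the concentrations, then fit a straight line.
# Replace t and conc with your own measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 30)                                            # placeholder time points
conc = 7.0 * np.exp(-0.3 * t) * np.exp(rng.normal(0, 0.05, t.size))   # placeholder decay data

fit = stats.linregress(t, np.log(conc))                      # linear regression on the log scale
print(f"decay rate k = {-fit.slope:.3f}, half-life = {np.log(2) / -fit.slope:.2f}")
print(f"R^2 on the log scale = {fit.rvalue**2:.3f}")
```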