I'm running a bunch of regressions on a series of stock returns (decimal values).

One of the models is:

r_t = a0 + b1*rm_t + e_t

where r_t is the stock return and rm_t is the market return (both decimal values).
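For context, here is a minimal sketch of how I'm fitting this kind of model (simulated data and the NumPy least-squares call are just for illustration; the coefficient values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
rm = rng.normal(0.0005, 0.01, 250)           # hypothetical daily market returns (decimals)
e = rng.normal(0.0, 0.01, 250)               # idiosyncratic noise
r = 0.001 - 0.011 * rm + e                   # simulated stock returns

# OLS fit of r_t = a0 + b1*rm_t + e_t
X = np.column_stack([np.ones_like(rm), rm])  # first column of ones estimates the intercept a0
(a0, b1), *_ = np.linalg.lstsq(X, r, rcond=None)
print(f"a0 = {a0:.6f}, b1 = {b1:.4f}")
```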

My interpretation is that a0 is the mean return of r_t, and that b1 is the sensitivity to the market return.

So if my constant is 0.00108623, I have a daily return of 0.11 pct. b1 is -0.0110737; how would I interpret this, and is my interpretation of the constant correct?
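To double-check the percentage conversion, this is just arithmetic on the two estimates quoted above (both in decimal units, so the slope is unitless):

```python
a0 = 0.00108623   # estimated intercept (decimal daily return)
b1 = -0.0110737   # estimated slope on the market return

print(f"intercept as a daily percentage: {a0 * 100:.2f}%")   # prints 0.11%
# since r and rm are both decimals, a 1 pp market move is associated
# with a b1 = -0.0111 pp move in the stock return
```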