Time series degree of slope: Calculating what I see

Good morning Crew Genius!

I want to calculate the degree of slope at each point in a time series. Different time series have different scales, so the final number should be normalized to the range of +/-90 degrees. Basically, when I plot a time series in Excel, I can see the degree of slope up or down: 0 = flat, 70 = very steep up, -20 = gradual slope down. I want to calculate the "number" for what I am seeing.

I thought that arctangent(P - P1), where P = the current point and P1 = the previous point, would work. Not at all. For example, on one time series atan(1.166031374 - 1.168266667) yields -0.00224, while on another time series atan(11373.92 - 11342.05) = 1.539431. Certainly not normalized across different value scales, nor producing values between +/-90.
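
To make it concrete, here is a minimal Python sketch of what I have been trying (the function name and the assumption of evenly spaced points one x-unit apart are just placeholders for illustration). Converting the atan result from radians to degrees does put the output in the +/-90 range, but the value still depends entirely on the raw y-differences, so it swings wildly between series on different scales:

```python
import math

def slope_degrees(series):
    """Angle of the segment between consecutive points, in degrees.

    Assumes points are evenly spaced one x-unit apart, so the result
    depends entirely on the raw y-differences -- which is why it
    changes so much between series on different value scales.
    """
    angles = []
    for prev, curr in zip(series, series[1:]):
        # math.atan returns radians; convert to the +/-90 degree range
        angles.append(math.degrees(math.atan(curr - prev)))
    return angles

# The two examples from above:
print(slope_degrees([1.168266667, 1.166031374]))  # ~ -0.13 degrees
print(slope_degrees([11342.05, 11373.92]))        # ~ 88.2 degrees
```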

Visually, it’s so easy to see the degree of slope in my chart! Yet over the last year I’ve tried more than a hundred workarounds, mostly complex. They approximate what I want, but they seem very convoluted and inelegant. I’d appreciate any insights into solving this problem.

Thank you very much!
alexander