Significance Issue in Regression

#1
Hi everyone!
I have a statistical question that I hope someone can answer.
I am doing research on the relationship between satisfaction and loyalty. Each variable was measured through 5 dimensions (5 dimensions determining loyalty and 5 determining satisfaction).
I ran a multiple regression in SPSS, but 4 out of the 5 dimensions turn out not to be significant. I really think all of the dimensions are important for my research and I don't want to drop them from the model.
Is it possible to leave them all in as they are, and then do a simple regression of overall loyalty on overall satisfaction?

Thanx in advance,

Iva
 
#2
Are you regressing satisfaction on the 5 dimensions of loyalty? i.e. regressing satisfaction on 5 variables (L1, L2, ..., L5)? If so, you could run an auxiliary regression, e.g. regress L1 on L2, and see if they are closely related; if the r^2 is high, they are. If that is the case, then you have multicollinearity, and that's why you can't get your variables to be statistically significant.
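Here is a rough sketch of that auxiliary-regression check in Python with statsmodels (the DataFrame `df`, the file name, and the column names L1..L5 are hypothetical; the original analysis in this thread was done in SPSS):

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey.csv")  # hypothetical data file with columns L1..L5

# Regress one predictor on another and inspect r^2: a high value
# (say above 0.8-0.9) suggests the two are multicollinear.
aux = sm.OLS(df["L1"], sm.add_constant(df["L2"])).fit()
print("r^2 of L1 on L2:", aux.rsquared)

# More generally, regress each predictor on all the others; the r^2 of
# these auxiliary regressions is what the variance inflation factor
# (VIF = 1 / (1 - r^2)) is based on.
cols = ["L1", "L2", "L3", "L4", "L5"]
for c in cols:
    others = [x for x in cols if x != c]
    r2 = sm.OLS(df[c], sm.add_constant(df[others])).fit().rsquared
    print(c, "r^2 on the others:", round(r2, 3), "VIF:", round(1 / (1 - r2), 2))
```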

I guess combining multicollinear variables into an index is ok, but I don't know.
 
#3
No, I wanted to compare the 5 dimensions of satisfaction with overall loyalty. When I do multiple regression this way, only one variable is significant. When I take overall satisfaction (without excluding any variable) vs. overall loyalty, I get a very significant result. So should I play dumb and ignore the insignificance, and just do a simple linear regression?
 
#4
OK. So you are regressing overall loyalty on the 5 dimensions of satisfaction?

OL = a + B1*S1 + B2*S2 + ... + B5*S5 + e

B -- beta coefficients, OL -- overall loyalty, and the S's are the individual dimensions of satisfaction.

In this model only one of the S's is statistically significant?

Overall satisfaction is a combination, perhaps just a sum, of all 5 dimensions, right? Nothing is excluded.

OL = a + B*OS + e

And in this model OS is statistically significant?
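For concreteness, a minimal sketch of the two models being compared, again using statsmodels (the DataFrame `df`, the file name, and the column names OL, S1..S5 are hypothetical, and OS is assumed here to be the sum of the five satisfaction dimensions):

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey.csv")  # hypothetical data file with columns OL, S1..S5

# Model 1: OL = a + B1*S1 + ... + B5*S5 + e
X_multi = sm.add_constant(df[["S1", "S2", "S3", "S4", "S5"]])
m1 = sm.OLS(df["OL"], X_multi).fit()
print(m1.summary())  # check which of the S coefficients are significant

# Model 2: OL = a + B*OS + e, where OS aggregates the dimensions
df["OS"] = df[["S1", "S2", "S3", "S4", "S5"]].sum(axis=1)
m2 = sm.OLS(df["OL"], sm.add_constant(df["OS"])).fit()
print(m2.summary())  # OS as a single predictor
```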
 
#6
What is meant by 5 "dimensions" of satisfaction? Are they simply 5 different metrics/variables that each attempt to measure satisfaction in another way (to predict loyalty)?

P.S.: Maybe it's a common expression used in statistics and I'm just not used to it, because my first language is German.

@Akavall: The test you suggested... is that the one called a RESET test (trying to think back to my last stats class)?
 
#9
Yes! Everything is just as you wrote. So is it OK to leave it as it is?
This depends on what you are looking for. If you want to show that OS has an effect on OL (assuming there is no endogeneity problem), then you can run a simple regression (OL = a + B*OS + e). But if you want to see how each dimension of satisfaction affects OL, you can try something else. Run an auxiliary regression between the dimension variables, e.g. S1 = a + B*S2 + e, and look at the r^2 from this regression. If it is high, around 0.9, then you have multicollinearity: high correlation between the independent variables makes the t-statistics low, so your results are not statistically significant.

If you find that two variables are multicollinear, you can combine them into one, say S6 = S1 + S2, and then regress OL on S3, S4, S5 and S6. Maybe they will show up as statistically significant?
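A sketch of that "combine the collinear dimensions" idea, continuing the hypothetical DataFrame with columns OL, S1..S5 used above:

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey.csv")  # hypothetical data file

# Suppose the auxiliary regression showed S1 and S2 are highly collinear:
# merge them into a single index S6 (a simple sum here; a mean or a
# factor score would work as well) and refit the model without S1 and S2.
df["S6"] = df["S1"] + df["S2"]
m = sm.OLS(df["OL"], sm.add_constant(df[["S3", "S4", "S5", "S6"]])).fit()
print(m.pvalues)  # see whether the remaining coefficients are now significant
```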

Of course, it is possible that only one variable really is statistically significant, and that is what is showing up in the simple regression.

Analytics, if you are talking about the multicollinearity test, I don't know what it's called.