However, I was wondering: what is the theoretical justification for wanting high degrees of freedom (or, conversely, for wanting a low number of parameters)?

Thanks



So would I be correct in saying that:

We are interested in maximising degrees of freedom because this minimises the generalization error and the Type I error of the model. This is because more parsimonious models have smaller standard errors, and so are more likely to predict well for out-of-sample data.

I'm a bit confused by your use of the word 'adequate'. We need to find the best/most predictive models, and we are always forced to choose between better-fitting models with more parameters and weaker-fitting ones with fewer parameters. As such, we need to justify why we are rejecting those better-fitting models, since you can virtually always improve the fit by adding another parameter.
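To see the "adding a parameter virtually always improves the fit" point concretely, here's a quick sketch with made-up data (the true relationship is linear, the noise level and sample size are arbitrary choices): fitting polynomials of increasing degree to the same sample, the in-sample residual sum of squares only goes down, even though the extra terms are just chasing noise.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 2 * x + rng.normal(scale=0.3, size=x.size)  # true model is linear plus noise

# In-sample residual sum of squares for polynomial fits of increasing degree
rss = []
for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    rss.append(float(resid @ resid))

# rss is non-increasing: each extra parameter fits the sample at least as well,
# which is exactly why in-sample fit alone can't justify the more complex model.
```

The in-sample fit improving is guaranteed for nested least-squares models, which is why model selection needs a penalty for parameters (or an out-of-sample check) rather than fit alone.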

We need to find the best/most predictive models,

Another approach to parsimony, and I think this is the one you are using, is that if two models have approximately the same predictive power, then you choose the one with fewer variables. Commonly you do a chi-square difference test between a more complex model (which has more variables) and a less complex one. Only if the result of the test is statistically significant do you choose the more complex model.