Search results

  1. spunky

    Negative values in a Pattern Matrix after Exploratory Factor Analysis

    No. Corrected item-total correlations are... well, exactly that. You take the column for each item and correlate it with the sum score or average score or however you're creating a 'score' for your scale. And they are "corrected" because the total score is calculated without the item you're...
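The corrected item-total correlation described in that snippet can be sketched in a few lines of Python. This is a toy illustration (the function name and the data are made up, not from any package): each item is correlated with the total of the *remaining* items.

```python
import numpy as np

def corrected_item_total(items):
    """Corrected item-total correlations: correlate each item with the
    sum of the remaining items (the item itself is excluded from the total)."""
    items = np.asarray(items, dtype=float)
    n_items = items.shape[1]
    out = []
    for j in range(n_items):
        rest_total = np.delete(items, j, axis=1).sum(axis=1)
        r = np.corrcoef(items[:, j], rest_total)[0, 1]
        out.append(r)
    return out

# toy data: 5 respondents x 3 items (illustrative only)
scores = [[1, 2, 1], [2, 3, 2], [3, 3, 3], [4, 5, 4], [5, 4, 5]]
print(corrected_item_total(scores))
```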
  2. spunky

    Negative values in a Pattern Matrix after Exploratory Factor Analysis

    Care to elaborate on what this "Monte Carlo" is? Is this Horn's Parallel Analysis? This is weird because it honestly reads like a classic case of "oops, forgot to reverse code". If you run (corrected) item-total correlations treating each factor as an independent subscale, do you observe more...
  3. spunky

    Factor analysis with nested data

    Yes, you are violating the independence assumption (a more general problem than heteroskedasticity). Because you have the same people responding to multiple surveys, you have measurements (surveys) nested within respondents. For this kind of data you would need to perform multilevel factor analysis...
  4. spunky

    American Horror Story

    Is it good, though? Cuz I was a devout follower until Jessica Lange left. AHS-hotel with Lady GaGa just didn't do it for me :/
  5. spunky

    What is the best dip for tortilla chips?

    There is only one right answer to this question and that is GUAC. GUAC IS WACK.
  6. spunky

    relative impact of predictors

    I think I answered a question like this a while ago:
  7. spunky

    Hello from Washington, DC

    Hi! Welcome to our humble digital abode! We hope you'll have a great time here!
  8. spunky

    Frequentist v Bayesian

  9. spunky

    Frequentist v Bayesian

    I think the main issue to keep in mind in the whole "Frequentist VS Bayesian" 2.0 debate (because it re-surfaces every time Bayes becomes popular) is how each side conceptualizes probability. For Frequentists, it's the long-run ratio of expected events to total sample size VS the Bayesian...
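The two conceptions of probability mentioned in that snippet can be sketched numerically with a toy coin example (all numbers here are made up for illustration): the frequentist side as a long-run relative frequency, the Bayesian side as a belief updated by data.

```python
import random

random.seed(0)
# Frequentist view: probability as long-run relative frequency
flips = [random.random() < 0.5 for _ in range(100000)]
print(sum(flips) / len(flips))   # close to 0.5

# Bayesian view: probability as degree of belief updated by data.
# With a flat Beta(1,1) prior and 7 heads in 10 flips,
# the posterior is Beta(1+7, 1+3) by conjugacy.
a, b = 1 + 7, 1 + 3
print(a / (a + b))               # posterior mean, 8/12 ≈ 0.667
```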
  10. spunky

    What does this F test nomenclature mean?

    It looks like the F-distribution from ANOVA (or regression, in this case) where the degrees of freedom of the model are the p predictors and the degrees of freedom of the error are sample size (n) - # of predictors (p)
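The F statistic behind that notation can be sketched directly from R². One caveat: with a fitted intercept the error degrees of freedom are n - p - 1; the snippet's "n - p" holds when the intercept is counted among the p parameters. The function name and numbers below are illustrative.

```python
def overall_f(r2, n, p):
    """Overall regression F statistic, F(p, n - p - 1), computed from R^2
    for a model with p predictors plus an intercept, fit to n observations."""
    df_model = p
    df_error = n - p - 1
    f = (r2 / df_model) / ((1 - r2) / df_error)
    return f, df_model, df_error

# toy numbers: R^2 = 0.40, n = 50 observations, p = 3 predictors
print(overall_f(0.40, 50, 3))
```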
  11. spunky

    Frequentist v Bayesian

    What exactly do you mean by this?
  12. spunky

    SPSS - MIXED Specifying covariate at level 2 or higher?

    SPSS (much like R, STATA and SAS) neither uses nor recognizes this "Level 1" vs "Level 2" language. That's a direct consequence of the Raudenbush and Bryk software and the whole notation/framework they developed for mixed-effects models. If you want to use SPSS to fit a mixed-effects model, you...
  13. spunky

    variance of simple linear regression coefficients
  14. spunky

    How can time series be used in sport?

    Because OLS regression assumes independence of observations over time and you don't have that type of data.
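The lack of independence over time can be illustrated with a simulated AR(1) series, where each observation depends on the previous one (the series and numbers are made up for illustration):

```python
import random

random.seed(42)
# AR(1) series: today's value depends on yesterday's,
# so the observations are not independent over time
y, prev = [], 0.0
for _ in range(5000):
    prev = 0.8 * prev + random.gauss(0, 1)
    y.append(prev)

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a series."""
    m = sum(x) / len(x)
    num = sum((a - m) * (b - m) for a, b in zip(x[:-1], x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

print(lag1_autocorr(y))   # close to the true 0.8, far from 0
```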
  15. spunky

    Weighted Cohen Kappa (R)

    Dason once offered this as an all-purpose, code-based solution to inference and stuff: runif(1,0,1). Perhaps this may be of use to you now? :D :p
  16. spunky

    Weighted Cohen Kappa (R)

    Found this in the documentation: "(Note, that the vector interface can not be used together with weights)" So I guess you're stuck using the matrix of weights as opposed to the default calculations it does? I looked at the documentation and the original article by Fleiss, Cohen & Everitt...
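For what the weight-matrix route actually computes, here is a sketch of weighted Cohen's kappa from a confusion matrix with linear or quadratic disagreement weights. This is an illustrative re-implementation, not the R package's own code:

```python
import numpy as np

def weighted_kappa(conf, weights="quadratic"):
    """Weighted Cohen's kappa from a k x k confusion matrix of rater counts.
    Disagreement weights: w_ij = (|i-j|/(k-1))^2 for quadratic,
    |i-j|/(k-1) for linear."""
    conf = np.asarray(conf, dtype=float)
    k = conf.shape[0]
    i, j = np.indices((k, k))
    d = np.abs(i - j) / (k - 1)
    w = d ** 2 if weights == "quadratic" else d
    p = conf / conf.sum()                    # observed proportions
    expected = np.outer(p.sum(axis=1), p.sum(axis=0))  # chance agreement
    return 1 - (w * p).sum() / (w * expected).sum()

# toy 2x2 table: two raters, 8 of 10 agreements
print(weighted_kappa([[4, 1], [1, 4]]))
```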
  17. spunky

    What is the difference between sampling algorithms and algorithms for estimating parameters?

    Well, you can still use MCMC to estimate parameters by computing summaries of the posterior distribution after you've run all your chains. In the (I'll admit, limited) work I did on this for my MA thesis, I used a type of MCMC (Gibbs sampler) to get a posterior distribution and then I calculated...
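A toy version of that workflow: draw from a posterior with a simple MCMC sampler, then summarize the draws into a point estimate. For brevity this uses random-walk Metropolis rather than a Gibbs sampler, on a Bernoulli model with a flat prior; all names and numbers are illustrative.

```python
import math
import random

def metropolis_bernoulli(successes, n, draws=20000, seed=1):
    """Random-walk Metropolis for the success probability p of a Bernoulli
    model with a flat prior; the point estimate is a summary (here the
    mean) of the retained posterior draws."""
    random.seed(seed)

    def log_post(p):
        if not 0 < p < 1:
            return float("-inf")   # outside the support: always reject
        return successes * math.log(p) + (n - successes) * math.log(1 - p)

    p, chain = 0.5, []
    for _ in range(draws):
        prop = p + random.gauss(0, 0.1)          # random-walk proposal
        if math.log(random.random()) < log_post(prop) - log_post(p):
            p = prop                              # accept
        chain.append(p)
    chain = chain[draws // 4:]                    # discard burn-in
    return sum(chain) / len(chain)                # posterior mean

# 7 successes in 10 trials -> posterior is Beta(8, 4), mean 8/12
print(metropolis_bernoulli(7, 10))
```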
  18. spunky

    Logistic Reg Complete Separation

    Booo! I mean... YAAY! :D
  19. spunky

    Logistic Reg Complete Separation

    I thought separation is a property of the variables, not the subjects.
  20. spunky

    Logistic Reg Complete Separation

    I think that is the case. Remember that the intercept in regression models is a function of the mean of the covariates and the regression coefficients. If one of them is Inf then any arithmetic function of that would result in an Inf too.
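That Inf propagation is easy to check directly. A toy sketch (not fitted output from any package): under complete separation the slope estimate diverges, arithmetic built on it stays infinite, and the fitted logistic curve degenerates into a step function.

```python
import math

inf = float("inf")
# finite arithmetic with an infinite estimate stays infinite
print(inf + 5.0, 0.5 * inf)   # inf inf

def logistic(x, b0, b1):
    """Logistic response curve 1 / (1 + exp(-(b0 + b1*x)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# with an infinite slope the curve is a step: 0 below the
# separating point, 1 above it -- i.e. perfect separation
print(logistic(-1.0, 0.0, inf), logistic(1.0, 0.0, inf))   # 0.0 1.0
```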