Recent content by stats20

  1. should my model include separate predictors?

    Thanks for your help @hlsmith. To make it simple, let's say I collected 300 data points of heart rate and 300 data points of blood pressure, acquired at the same time for each person. Time is in seconds, so that's 300 seconds of continuously recorded data. If my dat is: ID heart_rate...
  2. should my model include separate predictors?

    Yes, a person has 300 data points in day1 and 300 data points in day2. It is time series data that I'm trying to model with a mixed model.
      ind day1 day2 time
        1   90  141    1
        1  113  123    2
        1  122  134    3
        1   86  112    4...
  3. should my model include separate predictors?

    My actual data set has about 700 individuals (ind) and each individual has about 300 time points (time) for each of the two predictors (day1 and day2).
  4. should my model include separate predictors?

    Thanks @hlsmith. So with mod1, I can explain day1 by day2 (or the other way round), which I cannot do with mod2. But then what is the advantage of mod2? What does it answer that mod1 cannot? You mentioned above that it "gives you predictions for time of day", but I thought for...
  5. should my model include separate predictors?

    Thank you for your answer @hlsmith. Yes, I'd do this with lme4: mod1 <- lmer(day1 ~ day2 + (1|ind), data=dat) and mod2 <- lmer(value ~ day + (1|ind), data=dat2). So fitting either mod1 or mod2 is correct depending on the question. But what question would mod1 and mod2 each answer? You mention...
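
    A minimal sketch of how those two fits might look end to end, assuming dat is the wide layout (ind, day1, day2) from the example and dat2 is its long reshape with columns ind, day and value; the pivot step is an assumption, not something stated in the thread:

        # Sketch only: reshape the wide data (ind, day1, day2) to long (ind, day, value),
        # then fit the two models from the post with a random intercept per individual.
        library(lme4)
        library(tidyr)

        dat2 <- pivot_longer(dat, cols = c(day1, day2),
                             names_to = "day", values_to = "value")

        mod1 <- lmer(day1 ~ day2 + (1 | ind), data = dat)    # day2 predicting day1
        mod2 <- lmer(value ~ day + (1 | ind), data = dat2)   # day (as a factor) predicting value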
  6. question regarding grouping several variables

    Hi obh, a follow-up question: what if I use a mixed model (e.g., lme4::lmer) for mod2 to account for the fact that the data come from the same ind? Would it still be incorrect?
  7. should my model include separate predictors?

    Let's say I measured blood pressure on day1 and day2 three times a day (morning, afternoon and evening).
        dat <- data.frame(ind=c(1,1,1,2,2,2,3,3,3,4,4,4),
                          day1=c(90,113,122,86,84,95,114,126,123,115,92,103),
                          day2=c(141,123,134,112,112,115,92,100,121,133,124,89)...
  8. regression coefficient as average effect

    In OLS, given the regression equation y = B0 + B1X, why do I often read that B1 represents the average effect of a predictor? I don't get that. For example,
        data <- data.frame(sex=c("male","female","male","female","male","female","male","female"),
                           DV=c(22,32,34,16,66,34,77,23))
    The average...
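
    One way to see the "average effect" reading with exactly this example, sketched in R (an editorial check, not part of the post): for a single binary predictor, the OLS slope is the difference between the two group means, i.e. the average change in DV when moving from one level of sex to the other.

        # Sketch: with a binary predictor, the fitted slope equals the difference in group means.
        data <- data.frame(sex = c("male","female","male","female","male","female","male","female"),
                           DV  = c(22,32,34,16,66,34,77,23))
        tapply(data$DV, data$sex, mean)      # female 26.25, male 49.75
        coef(lm(DV ~ sex, data = data))      # intercept 26.25, sexmale 23.5 = 49.75 - 26.25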
  9. error term in linear regression

    Let's say I have a simple model like this: y_i = beta1*x_i + error_i
        > dat <- data.frame(y=c(10,20,30,40), x=c(1,2,5,8))
        > m <- lm(y~x, data=dat)
    summary(m) gives me this information:
        Residuals:
             1  2  3  4
            -3  3  1 -1
        Coefficients:
                    Estimate Std. Error t value Pr(>|t|)...
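
    A small sketch of how those residuals relate to the error term, reproducing the same numbers (an editorial addition): the residuals are the observed y minus the fitted values, i.e. the sample estimates of the unobserved error_i.

        # Sketch: residuals = observed y - fitted values, the estimates of error_i
        dat <- data.frame(y = c(10,20,30,40), x = c(1,2,5,8))
        m <- lm(y ~ x, data = dat)
        fitted(m)              # 13 17 29 41
        dat$y - fitted(m)      # -3  3  1 -1, identical to residuals(m)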
  10. orthogonalization question

    Thanks @noetsi, that makes sense. However, now that I think about it a bit more, when we include predictors in the model, say IV1 and IV2, isn't IV2, for example, partialled out with respect to IV1, so that IV1 reflects its unique contribution? How is it possible that variables that are included in...
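
    A minimal sketch of the partialling-out idea (the Frisch-Waugh-Lovell result), using simulated, correlated predictors; all names and numbers below are illustrative, not from the thread:

        # Sketch: the coefficient on IV1 in the full model equals the coefficient from
        # regressing DV on the part of IV1 that is not explained by IV2.
        set.seed(1)
        IV1 <- rnorm(100)
        IV2 <- 0.6 * IV1 + rnorm(100)              # correlated with IV1
        DV  <- 1 + 2 * IV1 - 1.5 * IV2 + rnorm(100)
        full      <- lm(DV ~ IV1 + IV2)
        IV1_resid <- resid(lm(IV1 ~ IV2))          # IV1 with the IV2 overlap removed
        partial   <- lm(DV ~ IV1_resid)
        coef(full)["IV1"]; coef(partial)["IV1_resid"]   # the two slopes match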
  11. help in setting up model with confounding variables

    @hlsmith, sorry, that was supposed to be R-like syntax. DV is the dependent variable, IV the independent variable, and COV the confounding variable. Say my data frame (df) looks like:
        DV IV1 IV2 COV
        43   x   a  21
        11   x   b  32
        53   y   a  32
        44   y   b  12...
  12. help in setting up model with confounding variables

    Hi! If I have a multiple regression model with two IVs, such as DV ~ IV1 + IV2 + IV1:IV2, what is the appropriate way to include a confounding variable (COV) in the model? DV ~ IV1 + IV2 + IV1:IV2 + COV, or do I need to specify all interactions with the COV as well? DV ~ IV1 + IV2 + COV +...
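
    For concreteness, the two candidate specifications being compared could be written as below; whether COV needs its own interaction terms is exactly the open question here, so this is only a sketch of the syntax (df and the variable names follow the thread, the expanded second formula is an assumption based on "all interactions with the COV"):

        # Option 1: COV enters as a main effect only
        m_main <- lm(DV ~ IV1 + IV2 + IV1:IV2 + COV, data = df)
        # Option 2: COV also interacts with the IVs (equivalently lm(DV ~ IV1*IV2*COV, data = df))
        m_int  <- lm(DV ~ IV1 + IV2 + COV + IV1:IV2 + IV1:COV + IV2:COV + IV1:IV2:COV, data = df)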
  13. orthogonalization question

    Hi, is it a good idea to orthogonalize regressors in a regression model if they happen to be correlated? What are the pros and cons of orthogonalization? Thanks!
  14. best fitting lines in PCA

    Hi, I'm learning about PCA and I don't get the point that PC2 is the next-best-fitting line after PC1. Wouldn't the second-best-fitting line be a line that is close to PC1? This second line would not be orthogonal to PC1 (which I'm aware is required), but it's likely to have a sum of squares of...
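
    A small R sketch of why the orthogonality constraint matters (simulated data, an editorial illustration): PC2 is the best-fitting direction among those perpendicular to PC1, and the resulting scores are uncorrelated.

        # Sketch: PC1 and PC2 directions are orthogonal, and the component scores are uncorrelated.
        set.seed(3)
        x <- rnorm(200)
        y <- 0.8 * x + rnorm(200, sd = 0.5)         # two correlated variables
        p <- prcomp(cbind(x, y))
        sum(p$rotation[, 1] * p$rotation[, 2])      # ~0: the two directions are orthogonal
        cor(p$x[, 1], p$x[, 2])                     # ~0: the scores are uncorrelated

    A line close to PC1 would indeed have nearly as small a sum of squared distances, but its scores would be almost perfectly correlated with the PC1 scores, so it would add almost no new information; requiring orthogonality forces PC2 to describe variation PC1 has not already captured.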
  15. orthogonal lines

    Hi! I'm trying to learn PCA and am struggling with the concept of orthogonal lines, and why they need to be orthogonal. Is it correct to say that if two lines are orthogonal to each other then they are uncorrelated, independent or both? Thanks!