Odds ratio of 1.00 while the IV is statistically significant

I apologize in advance for the stupid question. I am new to statistics.

I got an odds ratio of 1.00 while the independent variable is statistically significant. How do I explain this?


Less is more. Stay pure. Stay poor.
Yes, please provide your software output; this will clear up many questions. Those two values should align, so something is missing here.


Fortran must die
With enough power (say, millions of cases) you can get statistical significance for nearly anything. Significance says nothing about the effect size per se, other than that the estimate is likely close to the true effect size in the population.

That said, an odds ratio of 1 is essentially equivalent to a regression slope of zero, so the results don't make a lot of sense. They are telling you there is no relationship between your X and your ability to predict Y.
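To sketch why that equivalence holds: the odds ratio is the exponentiated logistic-regression coefficient, so a log-odds slope of zero maps to an odds ratio of exactly 1.

```python
import math

# The odds ratio is exp(beta), the exponentiated logistic
# regression coefficient; a slope of 0 gives an OR of exactly 1.
beta = 0.0
print(math.exp(beta))  # 1.0
```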


TS Contributor
The confidence interval can still exclude 1 even when the point estimate (and maybe even the CI bounds) display as 1.00 after rounding. This is the same phenomenon as a reported p-value of .0000: the program rounds the display, not that the value is actually 0.
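A quick numerical sketch of the rounding explanation, using hypothetical values: a tiny but precisely estimated log-odds coefficient (as you might get from a very large sample) produces an OR that displays as 1.00 at two decimals, while its 95% CI still excludes 1.

```python
import math

# Hypothetical numbers for illustration only: a tiny log-odds
# coefficient estimated very precisely (e.g. from a huge sample).
beta = 0.004   # logistic regression coefficient (log odds ratio)
se = 0.001     # standard error (small because n is large)

or_point = math.exp(beta)
ci_lo = math.exp(beta - 1.96 * se)
ci_hi = math.exp(beta + 1.96 * se)

# The point estimate rounds to 1.00, yet the CI excludes 1,
# so the software reports the IV as statistically significant.
print(f"OR = {or_point:.2f}")                   # OR = 1.00
print(f"95% CI = ({ci_lo:.3f}, {ci_hi:.3f})")   # 95% CI = (1.002, 1.006)
```

Asking the software for more decimal places (or for the raw coefficient and its CI on the log-odds scale) will show the discrepancy is purely display rounding.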