Hi Everyone,
I am estimating a linear model with OLS and have two independent variables, x1 and x2. When I run regress y x1, x1 is significant. When I run regress y x1 x2, x1 loses statistical significance, which is what I expect, since I argue that x2, not x1, explains the changes in y. Although the correlation between x1 and x2 is only -0.25 (so there should be no multicollinearity problem), it was suggested to me that I orthogonalize the two variables to confirm that it is really x2, and not x1, that significantly explains y. When I orthogonalize with x2 first (orthog x2 x1, gen(newx2 newx1)), the t values remain almost the same. When I orthogonalize with x1 first (orthog x1 x2, gen(newx1_alt newx2_alt)), the coefficient on x1 becomes significant again. Can anyone help me interpret these results? Should I use orthog here at all, and if so, which ordering should I rely on? I read the help file but could not work out which one fits my case better.
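For concreteness, here is a sketch of the command sequence I am describing. The regress calls on the orthogonalized variables are my assumption of how they would be run, using the generated variable names above:

regress y x1
regress y x1 x2

* orthogonalize with x2 first: orthog uses Gram-Schmidt, so newx2 spans x2
* and newx1 is the part of x1 that is orthogonal to x2
orthog x2 x1, gen(newx2 newx1)
regress y newx2 newx1

* orthogonalize with x1 first: newx2_alt is the part of x2 orthogonal to x1
orthog x1 x2, gen(newx1_alt newx2_alt)
regress y newx1_alt newx2_alt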
Thanks in advance.
Best,
Ulas