Hi Everyone,
I estimate a linear model with OLS. I have two independent variables, x1 and x2. When I run regress y x1, I find that x1 is significant. When I run regress y x1 x2, x1 loses statistical significance, which is what I expect, since I argue that it is x2, not x1, that matters for the changes in y.

Although the correlation between x1 and x2 is only -0.25 (so there is no multicollinearity problem), it was suggested to me that I orthogonalize the two variables to make sure that it is really x2, not x1, that significantly explains y. When I orthogonalize x2 and x1 (orthog x2 x1, gen(newx2 newx1)), the t values remain almost the same. When I orthogonalize x1 and x2 (orthog x1 x2, gen(newx1_alt newx2_alt)), the coefficient on x1 becomes significant again.

Can anyone help me interpret these results? Should I use orthog here, and if so, on which ordering should I rely? I read the help file but couldn't work out which one fits my case better.
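For clarity, here is roughly the sequence of commands I ran (a sketch; variable names are as above, and the comments summarize what I observed in my data):

    * baseline regressions
    regress y x1                              // x1 is significant on its own
    regress y x1 x2                           // x1 loses significance once x2 enters

    * pairwise correlation between the regressors
    correlate x1 x2                           // about -0.25 in my data

    * orthogonalize in the two possible orders (modified Gram-Schmidt)
    orthog x2 x1, gen(newx2 newx1)            // x2 first: t values almost unchanged
    regress y newx2 newx1

    orthog x1 x2, gen(newx1_alt newx2_alt)    // x1 first: x1 becomes significant again
    regress y newx1_alt newx2_alt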
Thanks in advance.
Best,
Ulas