Good afternoon,
Suppose my task is to obtain the residual vector from a multiple regression of Y on X and Z; call this residual vector R. Here X and Z are two groups of variables.
Experimenting with the case where X and Z each contain only one variable, it seems to me that the following algorithm achieves the task:
1. Regress Y on X, predict the residual vector R1, say.
2. Regress R1 on Z, predict the residual vector R2.
3. Regress R2 on X, predict the residual vector R3.
4. Regress R3 on Z, predict the residual vector R4.
... and so on, repeating the procedure until the residual sum of squares no longer changes from one iteration to the next.
I did this manually with the auto data, where Y is price, X is mpg, and Z is headroom.
With each iteration the residual got closer and closer to the residual from the joint regression of price on mpg and headroom.
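In code form, the procedure looks roughly like this (a sketch only; the cap of 100 passes, the 1e-8 tolerance, and the final comparison against the joint regression are my own choices for illustration, not part of the description above):

sysuse auto, clear

* Step 1: regress Y (price) on X (mpg) and keep the residual
quietly regress price mpg
predict double r, residuals

local lastrss = .
forvalues i = 1/100 {
    * Regress the current residual on Z (headroom) and update it
    quietly regress r headroom
    predict double tmp, residuals
    drop r
    rename tmp r

    * Regress the updated residual on X (mpg) and update it again
    quietly regress r mpg
    local rss = e(rss)
    predict double tmp, residuals
    drop r
    rename tmp r

    * Stop once the residual sum of squares no longer changes
    if `i' > 1 & abs(`rss' - `lastrss') < 1e-8 {
        continue, break
    }
    local lastrss = `rss'
}

* Compare with the residual from the joint regression of price on mpg and headroom
quietly regress price mpg headroom
predict double r_joint, residuals
generate double diff = abs(r - r_joint)
summarize diff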
My questions are:
a) Is it just a coincidence that I obtained convergence, or does such an algorithm exist?
b) What is the name of this algorithm?