Hi!

I am trying to estimate three consecutive models using the user-written -diff- command in Stata:

1. Diff model:
Code:
diff `var', period(treatperiod) treated(treatment) robust
2. Diff model with covariates:
Code:
diff `var', period(treatperiod) treated(treatment) cov(`indepvars') robust
3. Diff model with covariates & kernel PSM:
Code:
diff `var', period(treatperiod) treated(treatment) cov(`indepvars') robust kernel id(idnum) support
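For reference, my understanding is that model 1 is numerically equivalent to a pooled OLS regression with the period-by-treatment interaction, where the DID estimate is the interaction coefficient (here y is just a placeholder for one of my outcome variables):

Code:
* My understanding of what model 1 estimates: the DID effect is the
* coefficient on 1.treatperiod#1.treatment
* (y is a placeholder for one of my outcomes)
regress y i.treatperiod##i.treatment, vce(robust)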
I am not sure whether this is a problem with my code or with my basic understanding of the estimation, so I hope it is okay to ask this question:

My outcomes in the first model (without covariates) produce significant results. When I estimate the second model (with the covariates), the coefficient shrinks substantially. Because the treatment and control observations differ on observables, the third model adds kernel PSM matching on top of the covariates. Yet, as far as I can see, the coefficients change only very slightly from model 1 to model 3, despite the large change from model 1 to model 2. It is also curious that the R-squared drops sharply again from model 2 to model 3.

To give a concrete example:
Model 1: B = 0.65, R2 = 0.04
Model 2: B = 0.24, R2 = 0.54
Model 3: B = 0.66, R2 = 0.06
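In case it is useful, I also tried the balancing test option, which (if I read the -diff- help file correctly) reports covariate balance between treated and controls in the baseline period under the kernel weights:

Code:
* Balancing t-test of the baseline covariates under kernel PSM
* (my reading of the -test- option; syntax otherwise as in model 3)
diff `var', period(treatperiod) treated(treatment) cov(`indepvars') kernel id(idnum) support test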

Is there something I am missing in my Stata code, or am I misunderstanding something about these estimations?

Below are the key variables, in case they are relevant:

Code:
* Example generated by -dataex-. To install: ssc install dataex
clear
input float idnum byte panelwave float treatperiod byte treatment float did
10009 1 0 . .
10009 2 1 0 0
10009 3 1 0 0
10010 1 0 0 0
10010 2 1 0 0
10010 3 1 0 0
10011 1 0 0 0
10011 2 1 1 1
10011 3 1 1 1
10012 1 0 1 0
10012 2 1 0 0
10012 3 1 0 0
10013 1 0 0 0
10013 2 1 0 0
10013 3 1 0 0
10014 1 0 0 0
10014 2 1 0 0
10014 3 1 0 0
10015 1 0 0 0
10015 2 1 0 0
10015 3 1 0 0
10018 1 0 0 0
10018 2 1 0 0
10018 3 1 0 0
10019 1 0 0 0
10019 2 1 0 0
10019 3 1 0 0
10038 1 0 0 0
10038 2 1 1 1
10038 3 1 1 1
10051 1 0 1 0
10051 2 1 0 0
10051 3 1 0 0
10052 1 0 0 0
10052 2 1 0 0
10052 3 1 0 0
10054 1 0 0 0
10054 2 1 0 0
10054 3 1 0 0
10055 1 0 0 0
10055 2 1 0 0
10055 3 1 0 0
10056 1 0 0 0
10056 2 1 0 0
10056 3 1 0 0
10057 1 0 0 0
10057 2 1 0 0
10057 3 1 0 0
10058 1 0 0 0
10058 2 1 0 0
10058 3 1 0 0
10061 1 0 0 0
10061 2 1 0 0
10061 3 1 0 0
10062 1 0 0 0
10062 2 1 0 0
10062 3 1 0 0
10065 1 0 0 0
10065 2 1 0 0
10065 3 1 0 0
10081 1 0 0 0
10081 2 1 0 0
10081 3 1 0 0
10083 1 0 0 0
10083 2 1 0 0
10083 3 1 0 0
10105 1 0 0 0
10105 2 1 0 0
10105 3 1 0 0
10106 1 0 0 0
10106 2 1 0 0
10106 3 1 0 0
10107 1 0 0 0
10107 2 1 0 0
10107 3 1 0 0
10108 1 0 0 0
10108 2 1 0 0
10108 3 1 0 0
10110 1 0 0 0
10110 2 1 0 0
10110 3 1 0 0
10111 1 0 0 0
10111 2 1 0 0
10111 3 1 0 0
10112 1 0 0 0
10112 2 1 0 0
10112 3 1 0 0
10114 1 0 0 0
10114 2 1 0 0
10114 3 1 0 0
10115 1 0 0 0
10115 2 1 0 0
10115 3 1 0 0
10118 1 0 0 0
10118 2 1 1 1
10118 3 1 1 1
10141 1 0 1 0
10141 2 1 0 0
10141 3 1 0 0
10142 1 0 0 0
end

All my best.