I would like to ask for your help with two things:
- I am working with a three-level multilevel model (region, firm, time), and I want to determine whether it is relevant to include region as a level. Following the usual practice in the literature, I fit an empty model and check the ICC, but I am having trouble with the command. First, when I run melogit, it gives the following error:
Code:
melogit y || region_id: || id: , or intpoints(30)

Fitting fixed-effects model:

Iteration 0:   log likelihood = -25315.955
Iteration 1:   log likelihood = -25274.806
Iteration 2:   log likelihood = -25274.769
Iteration 3:   log likelihood = -25274.769

Refining starting values:

Grid node 0:   log likelihood = -20408.483

Fitting full model:

initial values not feasible
r(1400);
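One workaround I have seen suggested elsewhere (I am not sure it is the proper fix for this error) is to change how melogit builds its starting values, or to feed it estimates from a cruder fit via from(). A rough, untested sketch:
Code:
* Untested ideas for the r(1400) "initial values not feasible" error:
* (a) build starting values from the fixed-effects fit only
melogit y || region_id: || id: , or intpoints(30) startvalues(fixedonly)

* (b) fit a cheaper Laplace approximation first, then pass its estimates on
melogit y || region_id: || id: , intmethod(laplace)
matrix b0 = e(b)
melogit y || region_id: || id: , or intpoints(30) from(b0)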
Next, I add the evaltype(gf0) option to the command, and then it converges:
Code:
melogit y || region_id: || id: , or evaltype(gf0) intpoints(30)

Integration method: mvaghermite                 Integration pts.  =        30

                                                Wald chi2(0)      =         .
Log likelihood = -19512.201                     Prob > chi2       =         .
------------------------------------------------------------------------------
           y | Odds Ratio   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       _cons |   .0830499   .0114779   -18.00   0.000     .0633431    .1088879
-------------+----------------------------------------------------------------
region_id    |
   var(_cons)|   .2556345   .1207838                      .1012601    .6453577
-------------+----------------------------------------------------------------
region_id>id |
   var(_cons)|   4.913874    .194306                      4.547424    5.309852
------------------------------------------------------------------------------
Note: Estimates are transformed only in the first equation.
LR test vs. logistic model: chi2(2) = 11525.14          Prob > chi2 = 0.0000
Note: LR test is conservative and provided only for reference.

. estat icc

Intraclass correlation
------------------------------------------------------------------------------
                       Level |        ICC   Std. Err.     [95% Conf. Interval]
-----------------------------+------------------------------------------------
                   region_id |   .0302191   .0138461      .0121931    .0729267
                id|region_id |    .611098   .0106392      .5900539    .6317361
------------------------------------------------------------------------------
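For reference, the estat icc figures line up with the usual latent-response formula, which fixes the level-1 variance of a logit model at _pi^2/3 (about 3.29), so I can reproduce them by hand:
Code:
* Latent-scale ICCs implied by the variance components above
di .2556345/(.2556345 + 4.913874 + _pi^2/3)               // region_id:    .0302191
di (.2556345 + 4.913874)/(.2556345 + 4.913874 + _pi^2/3)  // id|region_id: .611098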
However, if I run the same estimation with GLLAMM (which I once read is more rigorous, although more time-consuming), the results differ depending on whether I use the adaptive quadrature option, especially the ICC, as you can see below.
Code:
gllamm y , i(id region_id) family(binomial) link(logit) nrf(1 1) eq(inter inter) nip(30) adapt eform

gllamm model

log likelihood = -19512.201

------------------------------------------------------------------------------
           y |     exp(b)   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       _cons |   .0830527   .0114777   -18.01   0.000      .063346      .10889
------------------------------------------------------------------------------
Note: Estimates are transformed only in the first equation.

Variances and covariances of random effects
------------------------------------------------------------------------------
***level 2 (id)
    var(1): 4.9138833 (.19430896)
***level 3 (region_id)
    var(1): .25556668 (.12077743)
------------------------------------------------------------------------------

. di .25556668/(.25556668+4.9138833+3.29)
.03021079

. gllamm y , i(id region_id) family(binomial) link(logit) nrf(1 1) eq(inter inter) nip(30) eform

gllamm model

log likelihood = -19564.722

------------------------------------------------------------------------------
           y |     exp(b)   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       _cons |   4.878615   2.387978     3.24   0.001     1.869182    12.73332
------------------------------------------------------------------------------
Note: Estimates are transformed only in the first equation.

Variances and covariances of random effects
------------------------------------------------------------------------------
***level 2 (id)
    var(1): 4.8848945 (.17888784)
***level 3 (region_id)
    var(1): 7.1880202 (1.8145096)
------------------------------------------------------------------------------

. di 7.1880202/(7.1880202+4.8848945+3.29)
.46788128
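In case the gap comes from the quadrature itself, one check I can think of (an untested sketch, and possibly slow on my data) is to see whether the non-adaptive fit stabilizes as the number of integration points grows:
Code:
* Crude sensitivity check: does the non-adaptive fit settle as nip grows?
foreach k of numlist 30 50 80 {
    quietly gllamm y , i(id region_id) family(binomial) link(logit) ///
        nrf(1 1) eq(inter inter) nip(`k') eform
    display "nip = `k': log likelihood = " e(ll)
}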
My first question is: what exactly does the evaltype(gf0) option do?
Why is there such a difference between the two GLLAMM estimations?
Which estimation should I use to assess how relevant the regional context is: GLLAMM, or melogit with the evaltype(gf0) option?
- Let's say I am forced to use GLLAMM for this model (hypothetically, because melogit gives some other error). If I wanted the margins with the melogit command, I would do the following:
Code:
melogit y x1 x2##c.z1 || region_id: || id: , or intpoints(30)
estat icc
margins, dydx(x2) at(z1 = (0(5)15))
marginsplot
margins, at(z1 = (0(5)15))
marginsplot
How can I obtain the margins from a GLLAMM estimation and plot them?
Code:
gllamm y x1 x2 z1 x2Xz1, i(id region_id) family(binomial) link(logit) nrf(1 1) eq(inter inter) nip(30) adapt eform
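My best guess so far, from reading the gllamm documentation, is to build the margins by hand with gllapred's mu marginal prediction (the expected probability integrated over the random effects), evaluated with z1 fixed at each value of interest. The sketch below is untested, and the mu0, mu5, ... variable names are my own:
Code:
* Hand-rolled stand-in for -margins, at(z1 = (0(5)15))- after gllamm (untested)
preserve
foreach v of numlist 0 5 10 15 {
    quietly replace z1 = `v'
    quietly replace x2Xz1 = x2*`v'
    gllapred mu`v', mu marginal fsample
    quietly summarize mu`v'
    display "z1 = `v': average marginal probability = " r(mean)
}
restore
The four averages could then be plotted against z1 with twoway connected as a stand-in for marginsplot. Is this the right approach?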
Here are the descriptive statistics of the variables, in case they help (the dataset is too large for dataex).
Code:
    Variable |       Obs        Mean    Std. Dev.       Min        Max
-------------+---------------------------------------------------------
           y |    47,411    .2249478    .4175523          0          1
          x1 |    32,122    .3176639    .4655752          0          1
          x2 |    47,306     .218133    .4129826          0          1
          z1 |    98,165    5.869033    5.543298          0   16.89129
       x2Xz1 |    47,306    1.285256     3.41989          0   16.89129