Dear all,
We are running a journal club on Diggle et al., Analysis of Longitudinal Data.
In it, a parametric covariance structure is discussed that is the sum of three components per subject:
  1. Random effect (mu)
  2. Serial (auto)correlation (rho, tau)
  3. Measurement error (sigma)
The covariance matrix per subject is mu^2 * J + tau^2 * H(rho) + sigma^2 * I, where J is a matrix of ones, H(rho) is a matrix of autocorrelations, and I is the identity matrix (a small numeric sketch follows below).
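To make the structure concrete, here is a minimal Mata sketch that builds this per-subject covariance matrix for four measurement times; all numeric values are made up purely for illustration and are not estimates from any model:
Code:
mata:
    // illustrative values only, not estimates
    mu2    = 0.02        // random-effect variance (mu^2)
    tau2   = 0.08        // serial-correlation variance (tau^2)
    rho    = 0.5         // autocorrelation parameter
    sigma2 = 0.01        // measurement-error variance (sigma^2)

    t = (1, 2, 3, 4)                    // measurement times for one subject
    n = cols(t)
    H = rho :^ abs(t' :- t)             // H(rho): rho^|t_i - t_j|
    V = mu2*J(n, n, 1) + tau2*H + sigma2*I(n)
    V                                   // the per-subject covariance matrix
end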

A code example using -mixed- is (dataset attached):
Code:
cls
use milk.dta, clear

* cap the time variable at week 3 (piecewise-linear trend in week)
generate week_b1 = cond(week <= 3, week, 3)

* random intercept per cow; AR(1) correlation among residuals indexed by week
mixed protein bn.diet c.week_b1, nocons || cow:, residuals(ar 1, t(week)) reml nolog
The random-effects parameters are:
Code:
------------------------------------------------------------------------------
  Random-effects Parameters  |   Estimate   Std. Err.     [95% Conf. Interval]
-----------------------------+------------------------------------------------
cow: Identity                |
                  var(_cons) |      0.020      0.006         0.011       0.034
-----------------------------+------------------------------------------------
Residual: AR(1)              |
                         rho |      0.548      0.030         0.486       0.605
                      var(e) |      0.076      0.005         0.067       0.086
------------------------------------------------------------------------------
It seems that in this model the serial correlation and the measurement error are merged into a single AR(1) residual term, so only two of the three components are estimated.
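For reference, the within-cow covariance implied by this fit can be displayed after -mixed- with -estat wcorrelation- (if I remember correctly, the covariance option shows covariances rather than correlations):
Code:
* implied within-cow marginal covariance from the model above
estat wcorrelation, covariance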

Does anyone know whether it is possible to estimate the three components separately in Stata?
If so, how can it be done?

Looking forward to hearing from you.