Hi,
I have a methodological question about how optimize() works with the Newton-Raphson algorithm.
At each iteration optimize updates the parameters according to the following formula:
new parameters = old parameters - lambda*(Hessian^-1)*gradient
At the beginning the scalar lambda is set equal to some constant lambda0, but if the objective function fails to improve, lambda is reduced.
However, if this does not work, after a while the initial lambda is actually increased and the search starts over.
I was wondering how the increase of lambda works. Is there a maximum number of iterations after which, if decreasing lambda does not improve the objective function, the initial lambda is increased? Or is this handled iteratively, alternating between increasing and decreasing lambda until the objective function is maximized? Any help would be greatly appreciated.
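For what it's worth, the step-halving part of the scheme described above can be sketched as follows. This is only an illustrative sketch of the general idea, not Stata's actual optimize() implementation; the choice of lambda0, the halving rule, and the restart logic are assumptions, and the increase-lambda behavior asked about is deliberately left out since that is exactly the undocumented part.

```python
# Illustrative sketch of a Newton-Raphson update with step-halving on
# lambda, in the spirit of:
#     new parameters = old parameters - lambda * (Hessian^-1) * gradient
# NOT Stata's actual implementation; lambda0 and the halving rule are
# assumptions for illustration (scalar case for simplicity).

def newton_maximize(f, grad, hess, x0, lambda0=1.0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:          # gradient (near) zero: stop
            break
        step = -g / hess(x)       # -(Hessian^-1) * gradient, scalar case
        lam = lambda0
        # If the objective fails to improve, shrink lambda and retry.
        while f(x + lam * step) <= f(x) and lam > 1e-12:
            lam /= 2
        x = x + lam * step
    return x

# Example: maximize f(x) = -(x - 2)^2, whose maximum is at x = 2.
f = lambda x: -(x - 2) ** 2
grad = lambda x: -2 * (x - 2)
hess = lambda x: -2.0
print(newton_maximize(f, grad, hess, x0=10.0))  # -> 2.0
```

For this concave quadratic the full Newton step (lam = lambda0 = 1) already improves the objective, so no halving occurs and convergence takes a single step; the halving loop only kicks in when the full step overshoots.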
Thank you very much.
Simone