Hi,
I have a methodological question about how optimize works with the Newton-Raphson algorithm.
At each iteration optimize updates the parameters according to the following formula:
new parameters = old parameters - lambda*(Hessian^-1)*gradient
At the beginning, the scalar lambda is set equal to some constant lambda0, but if the objective function fails to improve, lambda is reduced.
However, if this does not work, then after a while the initial lambda is increased and the search starts over.
I was wondering how the increase of lambda works. Is there a maximum number of failed reductions after which, if decreasing lambda does not improve the objective function, the initial lambda is increased? Or is this handled iteratively, alternating between increasing and decreasing lambda until the objective function is maximized? Any help would be greatly appreciated.
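For concreteness, here is a minimal sketch of the kind of damped Newton-Raphson step control described above, written in Python. The step-halving loop and the "enlarge lambda0 and retry" recovery rule are one plausible reading of the behaviour, not Stata's documented internals; all names (`damped_newton_max`, `max_halvings`, the factor of 10) are my own assumptions for illustration.

```python
import numpy as np

def damped_newton_max(f, grad, hess, x0, lam0=1.0, tol=1e-8,
                      max_iter=50, max_halvings=15):
    """Maximize f via damped Newton-Raphson.

    Step: x_new = x - lam * H^{-1} g.  If f does not improve, lam is
    halved; if every halving fails, the initial lam is increased and
    the search continues from the current point.  This recovery rule
    is a hypothetical sketch, not necessarily what optimize() does.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = np.atleast_1d(grad(x))
        if np.linalg.norm(g) < tol:            # gradient small: done
            break
        H = np.atleast_2d(hess(x))
        direction = np.linalg.solve(H, g)      # H^{-1} * gradient
        lam, improved = lam0, False
        for _ in range(max_halvings):
            x_try = x - lam * direction
            if f(x_try) > f(x):                # objective improved
                x, improved = x_try, True
                break
            lam *= 0.5                         # shrink the step
        if not improved:
            lam0 *= 10.0                       # restart with larger lam0
    return x

# Example: maximize f(x) = -(x - 2)^2, whose optimum is at x = 2.
f = lambda x: -(x[0] - 2.0) ** 2
g = lambda x: np.array([-2.0 * (x[0] - 2.0)])
H = lambda x: np.array([[-2.0]])
x_star = damped_newton_max(f, g, H, [10.0])    # converges to x = 2
```

On a concave quadratic like this the full Newton step (lam = 1) lands exactly at the maximum, so no halving occurs; the halving and lambda0-increase logic only kicks in when the full step overshoots.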
Thank you very much.
Simone