Hi,
I have a methodological question about how optimize works with the Newton-Raphson algorithm.
At each iteration optimize updates the parameters according to the following formula:
new parameters = old parameters - lambda*(Hessian^-1)*gradient
At the beginning the scalar lambda is set equal to some constant lambda0, but if the objective function fails to improve, lambda is reduced.
However, if this does not work, after a while the initial lambda is actually increased and the search starts over.
I was wondering how the increase of lambda works. Is there a maximum number of halvings after which, if decreasing lambda does not improve the objective function, the initial lambda is increased? Or is this handled iteratively, alternating between increasing and decreasing lambda until the objective function is maximized? Any help would be greatly appreciated.
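For concreteness, here is a minimal Python sketch of the kind of lambda schedule described above: take the Newton step, halve lambda while the objective fails to improve, and if no improvement is found within a fixed number of halvings, restart with a larger initial lambda. This is only an illustration of that logic; the function name, the shrink/grow factors, and the restart rule are my own assumptions, not Mata's actual optimize() implementation.

```python
import numpy as np

def newton_with_step_halving(f, grad, hess, x0,
                             lam0=1.0, shrink=0.5, grow=10.0,
                             max_halvings=20, max_iter=100, tol=1e-8):
    """Maximize f by Newton-Raphson with a scaled step.

    Hypothetical sketch of the lambda schedule discussed in the post;
    NOT the actual algorithm inside Mata's optimize().
    """
    x = np.asarray(x0, dtype=float)
    lam_init = lam0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        step = np.linalg.solve(hess(x), g)   # Hessian^-1 * gradient
        lam = lam_init
        improved = False
        for _ in range(max_halvings):
            x_new = x - lam * step           # new = old - lambda*H^-1*g
            if f(x_new) > f(x):              # did the objective improve?
                x, improved = x_new, True
                break
            lam *= shrink                    # reduce lambda and retry
        if not improved:
            lam_init *= grow                 # restart with a larger lambda
    return x
```

With a well-behaved concave objective (e.g. f(x) = -(x - 3)^2), the very first full Newton step already improves the objective, so lambda is never reduced and the maximizer is reached in one iteration.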
Thank you very much.
Simone