Hi,
I have a methodological question about how optimize works with the Newton-Raphson algorithm.
At each iteration optimize updates the parameters according to the following formula:
new parameters = old parameters - lambda*(Hessian^-1)*gradient
At the beginning the scalar lambda is set equal to some constant lambda0, but if the objective function fails to improve, lambda is reduced.
However, if this does not work, after a while the initial lambda is actually increased and the search starts over.
I was wondering how the increase of lambda works. Is there a maximum number of step-halvings after which, if decreasing lambda fails to improve the objective function, the initial lambda is increased and the search restarted? Or does the algorithm alternate between increasing and decreasing lambda iteratively until the objective function is maximized? Any help would be greatly appreciated.
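To make my question concrete, here is a generic sketch (in Python, not Mata, and not optimize()'s actual source) of the part of the strategy I understand: a damped Newton-Raphson maximizer that takes the step x - lambda*H^(-1)*g and halves lambda whenever the objective fails to improve. The function name, the halving factor of 2, and the max_halvings cutoff are all my own assumptions for illustration; the lambda-increase/restart behavior I am asking about is deliberately not implemented here.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, lambda0=1.0, max_iter=50,
                  max_halvings=10, tol=1e-8):
    """Generic damped Newton-Raphson maximizer with step halving.

    Candidate update at each iteration:
        x_new = x - lam * solve(H, g)
    If f(x_new) does not improve on f(x), lam is halved (up to
    max_halvings times); if no lam improves f, the search stops.
    (Illustrative sketch only -- not Mata's optimize() source.)
    """
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    fx = f(x)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break                       # gradient ~ 0: converged
        step = np.linalg.solve(hess(x), g)  # full Newton step H^-1 g
        lam = lambda0
        improved = False
        for _ in range(max_halvings):
            x_try = x - lam * step
            f_try = f(x_try)
            if f_try > fx:              # objective improved (maximization)
                x, fx = x_try, f_try
                improved = True
                break
            lam /= 2.0                  # shrink the step and retry
        if not improved:
            break                       # no lambda helped: give up
    return x, fx

# Example: maximize f(x) = -(x - 3)^2, optimum at x = 3
f = lambda x: -(x[0] - 3.0) ** 2
grad = lambda x: np.array([-2.0 * (x[0] - 3.0)])
hess = lambda x: np.array([[-2.0]])
x_opt, f_opt = damped_newton(f, grad, hess, [0.0])
```

What I would like to understand is what replaces the final `break` above: whether optimize() restarts this loop with a larger lambda0, and if so, what triggers that.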
Thank you very much.
Simone