Optimization algorithm (dog-leg trust-region) in Matlab and Python

I'm trying to solve a set of nonlinear equations using the dog-leg trust-region algorithm in Matlab and Python.

In Matlab there is fsolve, where this algorithm is the default, whereas for Python we specify 'dogleg' in scipy.optimize.minimize. I don't need to supply a Jacobian or Hessian to Matlab's fsolve, whereas the Python method needs one of them to solve the problem.

I don't have the Jacobian/Hessian so is there a way around this issue for Python? Or is there another function that performs the equivalent of Matlab's dog-leg method in fsolve?

asked Mar 20 '26 11:03 by Medulla Oblongata

1 Answer

In newer versions of scipy there is the approx_fprime function. It computes a numerical approximation of the Jacobian of a function f at position xk using forward-step finite differences. It returns an ndarray with the partial derivatives of f at xk.
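For example (a minimal sketch with a made-up function, to show the call shape), the forward-difference result can be checked against an analytic gradient:

```python
import numpy as np
from scipy.optimize import approx_fprime

# f(x) = x0**2 + sin(x1); the analytic gradient is [2*x0, cos(x1)]
def f(x):
    return x[0]**2 + np.sin(x[1])

xk = np.array([1.0, 0.5])
eps = np.sqrt(np.finfo(float).eps)  # a common forward-difference step
grad = approx_fprime(xk, f, eps)
print(grad)  # close to [2.0, cos(0.5)] ~ [2.0, 0.8776]
```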

If you can't upgrade your version of scipy, you can always copy the implementation from scipy's source.


Edit:

scipy.optimize.minimize calls approx_fprime internally when jac=False is passed, so in your case it should be enough to do the following:

scipy.optimize.minimize(fun, x0, args, method='dogleg', jac=False)

Edit 2:

scipy does not seem to handle the jac=False condition properly, so it is necessary to build a callable jac using approx_fprime, as follows:

jac = lambda x, *args: scipy.optimize.approx_fprime(x, fun, epsilon, *args)
scipy.optimize.minimize(fun, x0, args, method='dogleg', jac=jac)
answered Mar 22 '26 23:03 by lucianopaz


