Alexandra Shchukina - 1 year ago
Python Question

Scipy.optimize: how to restrict argument values

I'm trying to use scipy.optimize functions to find a global minimum of a complicated function with several arguments. scipy.optimize.minimize seems to do the job best of all, namely the 'Nelder-Mead' method. However, it tends to wander out of the arguments' domain (it assigns negative values to arguments that can only be positive) and thus returns an error in such cases. Is there a way to restrict the arguments' bounds within the minimize function itself? Or maybe within other scipy.optimize functions?

I've found the following advice:

When the parameters fall outside the admissible range, return a wildly huge number (far from the data to be fitted). This will (hopefully) penalize that choice of parameters so much that the optimizer will settle on some other, admissible set of parameters as optimal.

given in a previous answer, but that procedure would take too much computational time in my case.
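For reference, the penalty trick described above can be sketched as follows. This is a minimal example with a hypothetical toy objective (not the asker's actual function), whose square roots are only defined for positive arguments:

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Toy stand-in for a "complicated function": only valid for x > 0,
    # with its minimum at x = (1, 4).
    return (np.sqrt(x[0]) - 1.0) ** 2 + (np.sqrt(x[1]) - 2.0) ** 2

def penalized(x):
    # Outside the admissible region, return a huge value so that
    # Nelder-Mead's simplex is steered back toward positive arguments.
    if np.any(x <= 0):
        return 1e12
    return objective(x)

res = minimize(penalized, x0=[0.5, 0.5], method='Nelder-Mead')
print(res.x)  # stays in the positive orthant
```

The drawback mentioned above is real: every out-of-range trial point still costs a function evaluation, and the discontinuous penalty can slow Nelder-Mead's convergence.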

Answer Source

The Nelder-Mead solver doesn't support constrained optimization, but there are several others that do.

TNC and L-BFGS-B support only bound constraints (e.g. x[0] >= 0), which should be fine for your case. COBYLA and SLSQP are more flexible, supporting any combination of bounds, equality constraints, and inequality constraints.
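For the positivity requirement in the question, bound constraints are enough. A minimal sketch with method='L-BFGS-B' and a hypothetical objective (a tiny strictly positive lower bound keeps the log defined):

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Hypothetical stand-in for the complicated function;
    # log requires x[0] + x[1] > 0.
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2 + np.log(x[0] + x[1])

# One (min, max) pair per argument; None means unbounded on that side.
bounds = [(1e-6, None), (1e-6, None)]

res = minimize(objective, x0=[0.5, 0.5], method='L-BFGS-B', bounds=bounds)
print(res.x)
```

Unlike the penalty trick, L-BFGS-B keeps every iterate inside the bounds, so the objective is never evaluated at inadmissible points.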

You can find more detailed info about the solvers by looking at the docs for the standalone functions, e.g. scipy.optimize.fmin_slsqp for method='SLSQP'.

You can see my previous answer here for an example of constrained optimization using SLSQP.
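If you also need general inequality constraints, SLSQP takes them as a list of dicts alongside the bounds. A sketch using the standard example from the SciPy tutorial (the objective and constraints here are illustrative, not the asker's problem):

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

# 'ineq' constraints are satisfied when fun(x) >= 0.
constraints = [
    {'type': 'ineq', 'fun': lambda x: x[0] - 2 * x[1] + 2},
    {'type': 'ineq', 'fun': lambda x: -x[0] - 2 * x[1] + 6},
    {'type': 'ineq', 'fun': lambda x: -x[0] + 2 * x[1] + 2},
]
bounds = [(0, None), (0, None)]  # both arguments non-negative

res = minimize(objective, x0=[2.0, 0.0], method='SLSQP',
               bounds=bounds, constraints=constraints)
print(res.x)  # -> approximately [1.4, 1.7]
```

Here the unconstrained minimum (1, 2.5) violates the first constraint, so the solver settles on the boundary point (1.4, 1.7).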
