Free Python optimization framework

Tuesday, November 20, 2007

an example of an OO NLP solvers benchmark

Yesterday I got a letter from Nils Wagner saying that having an OO NLP solvers benchmark would be useful.
So I have committed my file, which you can use for your own purposes by modifying it for your problems and solvers. Let me note that some problems with matplotlib still exist.

Below you can see the graphic output from the example (also available from openopt/examples/ after an svn update). Let me note that ALGENCAN usually works much better than cobyla or lincher. Also, tuning the stop criteria is very important (for ALGENCAN you should try setting gradtol to 1e-1...1e-6).

Let me also note that the current OO-ALGENCAN connection suffers from the following drawback: all constraints are always calculated, while ALGENCAN uses only the active ones. Hence, if the user supplies the constraints c as a sequence (c1, c2, ..., cm), calculations could be sped up. However, this is not implemented in the OO-ALGENCAN connection yet.

Please don't forget that cobyla can't handle user-supplied 1st derivatives, so time, cputime, and counters will differ significantly (vs lincher and ALGENCAN) if you provide them.

Also, I intend to provide constraint handling for ralg soon; maybe then I'll publish the example with ralg as well.
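The idea of the benchmark file boils down to looping over several solvers on the same problem. Here is a rough sketch only, not the committed example itself: the import path, the solver names, the constructor keywords and the toy problem are my assumptions and may differ from your installation.

from numpy import asarray
# import path is an assumption: depending on your installation it may be
# "from scikits.openopt import NLP" or "from openopt import NLP"
from scikits.openopt import NLP

def f(x):
    # simple smooth test objective: shifted sum of squares
    return ((asarray(x) - asarray([1.0, 2.0, 3.0]))**2).sum()

def c(x):
    # single nonlinear inequality constraint, OO convention c(x) <= 0
    return x[0]**2 + x[1]**2 - 4.0

solvers = ['lincher', 'scipy_cobyla', 'ALGENCAN']  # assumed solver names
results = {}
for solver in solvers:
    p = NLP(f, x0=[0.0, 0.0, 0.0], c=c)
    p.contol = 1e-6
    r = p.solve(solver)
    results[solver] = r.ff      # r.ff - objective value found (field name assumed)

for solver in solvers:
    print('%s: f*=%s' % (solver, results[solver]))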

Monday, November 19, 2007

another NLSP solver: nssolve for nonsmooth and noisy funcs

A new NLSP solver is available: nssolve, designed primarily for non-smooth and noisy funcs (scipy_fsolve is usually much better than nssolve on smooth funcs).
This one is based on minimizing the max residual via an NSP solver (default: ralg) and can handle a user-supplied gradient/subgradient. If one is not available, splitting the equations into separate functions is recommended (to speed up nssolve calculations):
f = [func1, func2, ...] or f = (func1, func2, ...)
(not
f = lambda x: ...
or
def f(x): ...)


OO native graphical output is available
(w/o using p.connectIterFcn(..), which is required for scipy_fsolve; just set p.plot = True or p.plot = 1):
Pictures like the one below are automatically generated by the py-file.
This example shows the results of nssolve vs scipy.optimize.fsolve with a little numerical noise (1e-8). Here both nssolve and fsolve obtain the same gradients from the OO kernel with p.diffInt = 1e-7 (default). The "ns" prefix can be interpreted as
NonSmooth
or NoiSy
or Naum Shor (Ukrainian academician, my teacher, inventor of the r-algorithm implemented in openopt's ralg)
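For completeness, here is a minimal sketch of how such a run could be set up; the import path, the toy equations, the tolerance and the constructor keywords are assumptions, not the exact example file.

from scikits.openopt import NLSP   # import path is an assumption

# equations split into separate funcs, as recommended above
f1 = lambda x: x[0]**2 + x[1]**2 - 1.0
f2 = lambda x: x[0] - x[1]**3

p = NLSP([f1, f2], x0=[0.5, 0.5])
p.plot = True      # native OO graphical output, no connectIterFcn required
p.contol = 1e-6
r = p.solve('nssolve')
print(r.xf)        # r.xf - solution found (field name assumed)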

Currently nssolve is quite far from recent nonsmooth solver algorithms (even those from our department), but it's better than having nothing at all, and I intend to enhance both ralg and nssolve from time to time.

Sunday, November 11, 2007

LSP (non-lin least squares) graphical output

An example of LSP graphical output (requires matplotlib (pylab) to be installed):
p = LSP(..., plot = True)
p.connectIterFcn('df') # regardless of whether you provide a gradient or not
r = p.solve('scipy_leastsq')
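A somewhat fuller, self-contained sketch of the same recipe; the residual func, the starting point and the import path below are illustrative assumptions only.

from numpy import arange, exp
from scikits.openopt import LSP    # import path is an assumption

t = arange(0, 10, 0.1)
data = 3.0 * exp(-0.5 * t)         # synthetic "measurements"

def residuals(x):
    # residuals to be minimized in the least-squares sense
    return x[0] * exp(-x[1] * t) - data

p = LSP(residuals, x0=[1.0, 1.0], plot=True)
p.connectIterFcn('df')             # regardless of whether df is provided
r = p.solve('scipy_leastsq')
print(r.xf)                        # r.xf - solution found (field name assumed)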

NLSP graphic output

Now you can use NLSP (non-linear solve) graphic output:
p = NLSP(f, x0, plot=True, ...)
p.connectIterFcn('df') # scipy.optimize.fsolve has no native iter func that could be connected to OO
#optional: provide gradient df
#p.df = ...
r = p.solve('scipy_fsolve')
(For other NLSP solvers that cannot handle df you should use p.connectIterFcn('f'); however, this is still buggy, and maybe I will fix the 'f' case in the future.)

Let me also remind you that you can use p.xlabel = 'time' (default), 'cputime', or 'nIter' (case-insensitive). The old-style p.graphics.xlabel works as well.

The dotted line in the picture is log10(p.contol)
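Putting the pieces above together, a minimal sketch could look as follows; the import path, the toy system, its gradient and the constructor keywords are my assumptions.

from scikits.openopt import NLSP   # import path is an assumption

# toy system of 2 equations in 2 unknowns
f  = lambda x: [x[0]**2 - x[1] - 1.0, x[0] - x[1]**2 + 1.0]
df = lambda x: [[2.0*x[0], -1.0], [1.0, -2.0*x[1]]]

p = NLSP(f, x0=[2.0, 2.0], plot=True)
p.df = df                  # optional user-supplied gradient
p.contol = 1e-6            # the dotted line is drawn at log10(p.contol)
p.xlabel = 'cputime'       # or 'time' (default), 'nIter'
p.connectIterFcn('df')     # scipy_fsolve has no native iter func
r = p.solve('scipy_fsolve')
print(r.xf)                # r.xf - solution found (field name assumed)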

bugfix related to NLSP and LSP

Fixed a bug related to NLSP (non-lin solve) and LSP (least squares) that occurred when a user-supplied gradient was absent.

Saturday, November 10, 2007

one more NLP UC solver connected: scipy_powell

one more NLP UC solver: scipy.optimize.fmin_powell has been connected as scipy_powell
(it cannot handle user-supplied gradients).
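A minimal usage sketch; the import path and the toy objective are assumptions.

from scikits.openopt import NLP    # import path is an assumption

f = lambda x: (x[0] - 1.0)**2 + (x[1] + 2.0)**2
p = NLP(f, x0=[0.0, 0.0])
r = p.solve('scipy_powell')        # derivative-free: no p.df is used
print(r.xf)                        # r.xf - solution found (field name assumed)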

Thursday, November 8, 2007

ALGENCAN HPTYPE value

As I have mentioned, if an NLP/NSP solver can handle 1st derivatives, OpenOpt will always pass them (obtained numerically according to the user-defined p.diffInt; default values for NLP and NSP may differ); otherwise Python cycles sometimes slow down the calculations significantly (more counter updates, more checks to be done, etc.).

So some solvers can't know whether the 1st derivatives were obtained via numerical approximation (inexactly) or analytically.
I tried to supply a correct value of HPTYPE right away, but some difficulties were encountered. Then the ALGENCAN developers informed me that in future ALGENCAN versions HPTYPE will be adjusted more automatically, depending on which funcs & derivatives the user provides. Still, I hope they will implement a possibility (via a param) to inform the solver that the supplied derivatives were obtained numerically (it can affect the choice of algorithms involved).

Wednesday, November 7, 2007

graphic output: subplot for constraints

First of all let me remind you that graphic output is currently available for some NLP and NSP solvers via p.plot=True (or p.plot=1).


Also, you can modify p.graphics.xlabel (or, with the latest svn changes, p.xlabel as well): the default is 'time'; 'cputime' and 'nIter' are also possible (case-insensitive).
Some solvers (for example ALGENCAN) still don't have a convenient ability for iter-to-iter data output via a func that could be connected to OO. So you should use p.connectIterFcn('df') regardless of whether the user provides a gradient (if not, the OO finite-difference approximation will be used instead). You can connect another func as well (for example, 'f' is good for scipy_fminbound), but a bad choice can significantly slow down your calculations. BTW, matplotlib output with 2 subplots is significantly slower than a single plot - I hope this will either be fixed in future matplotlib releases or your hardware is (/will be) adequate for your problems.
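A sketch of driving ALGENCAN this way could look as follows; the import path, the toy objective, the constraint and the constructor keywords are assumptions.

from scikits.openopt import NLP    # import path is an assumption

f = lambda x: x[0]**2 + 2.0*x[1]**2
c = lambda x: 1.0 - x[0] - x[1]    # inequality constraint, OO convention c(x) <= 0

p = NLP(f, x0=[3.0, 3.0], c=c, plot=True)
p.xlabel = 'nIter'                 # or 'time' (default), 'cputime'
p.connectIterFcn('df')             # ALGENCAN has no native iter output to connect
r = p.solve('ALGENCAN')            # r.xf holds the solution (field name assumed)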

So, here's an example of ALGENCAN graphic output:


For unconstrained problems (as well as for solvers that keep each iter point feasible, for example some box-bound solvers) the graphic output remains the same. Some minor problems still remain (plotting several solvers in a single window, or no Python control being returned till the figure is closed), but I suspect they are due to native pylab bugs (I consider the MATLAB plotting tool to have been more convenient). I will try to fix (or avoid) those when I have enough time; I'm short of it for now.
See also: NLSP (non-linear solve) graphical output.
See also: LSP (non-linear least squares) graphical output.

Monday, November 5, 2007

scipy fminbound connected

scipy.optimize.fminbound has been connected as scipy_fminbound (single-variable problems only, requires finite lb-ub, ignores x0).
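For example (the import path, the single-variable objective and the constructor keywords below are assumptions):

from scikits.openopt import NLP    # import path is an assumption

def f(x):
    x = float(x)                   # single-variable problem
    return (x - 2.5)**2 + 1.0

p = NLP(f, x0=0.0, lb=-4.0, ub=4.0)   # finite lb-ub required; x0 is ignored
r = p.solve('scipy_fminbound')
print(r.xf)                        # r.xf - solution found (field name assumed)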

Thursday, November 1, 2007

another NLP UC solver

scipy.optimize.fmin_cg (the nonlinear conjugate gradient algorithm of Polak and Ribiere; see Wright and Nocedal, 'Numerical Optimization', 1999, pp. 120-122) has been connected as scipy_cg.

+ some minor changes to the text output of scipy_bfgs and scipy_ncg have been made.
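A minimal usage sketch with a user-supplied gradient; the import path, the toy objective, its gradient and the constructor keywords are assumptions.

from scikits.openopt import NLP    # import path is an assumption

f  = lambda x: x[0]**2 + 10.0*x[1]**2
df = lambda x: [2.0*x[0], 20.0*x[1]]

p = NLP(f, x0=[3.0, -1.0], df=df)  # user-supplied gradient
r = p.solve('scipy_cg')
print(r.xf)                        # r.xf - solution found (field name assumed)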