Free Python optimization framework

Monday, July 30, 2007

ticket 285: ready to close

rev. 3209 contains changes to optimize.brent (I implemented the brent
class in the way proposed by Alan Isaac), plus 4 tests related to brent
were added.

So I guess ticket 285 should be closed.
Also, I have changed some docstrings related to brent and some other
funcs:

If bracket is two numbers *(a,c)* then they are
assumed to be a starting interval for a downhill bracket search
(see bracket); *it doesn't always mean that the obtained solution will
satisfy a<=x<=c*.

(As for me, some weeks ago I was surprised to obtain a solution from
outside of the (a,c) interval. MATLAB has only one func, fminbound, that
yields a solution strictly from the given interval, but some
scipy.optimize line-search routines use the interval strictly, while
others, like brent, use it only as a starting interval for a bracket search,
and this is not properly described in the documentation).
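To illustrate why the solution can leave the initial interval, here is a minimal pure-Python sketch of a downhill bracket search (a simplified stand-in for what scipy's bracket routine does; the function name and golden-ratio step are my assumptions, not scipy's exact code). Starting from (0, 1), the bracket walks downhill until it straddles the minimum of (x-10)**2, which lies far outside (0, 1):

```python
from math import sqrt

def downhill_bracket(f, a, c, grow=(1 + sqrt(5)) / 2, maxiter=50):
    # Minimal sketch: starting from the interval (a, c), step in the
    # downhill direction until a triple xa < xb < xc is found with
    # f(xb) < f(xa) and f(xb) < f(xc). The final bracket may lie far
    # outside the initial (a, c).
    fa, fc = f(a), f(c)
    if fc > fa:                 # make sure we walk downhill from a toward c
        a, c = c, a
        fa, fc = fc, fa
    b = c + grow * (c - a)      # first golden-ratio step beyond c
    fb = f(b)
    for _ in range(maxiter):
        if fb > fc:             # f went back up: (a, c, b) brackets a minimum
            return a, c, b
        a, fa = c, fc
        c, fc = b, fb
        b = c + grow * (c - a)
        fb = f(b)
    return a, c, b

# The minimum of (x - 10)**2 is at x = 10, well outside the starting (0, 1)
xa, xb, xc = downhill_bracket(lambda x: (x - 10) ** 2, 0.0, 1.0)
```

So a routine like brent, fed bracket=(0, 1) for this function, legitimately returns x near 10; only fminbound-style solvers pin the result inside the given interval.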

Thursday, July 26, 2007

more tickets to be closed

So some more tickets are ready:
(optimize.fmin_powell doesn't accept a matrix input for the initial guess)
rev 3199
(MATRIXC2F transposing the wrong way in optimize.leastsq)
I will commit my patch if nothing better is proposed within a few hours.

One more change in tnc: the returned x value (the optimal point) is now a numpy.array, not a Python list.

Also, it now consumes x0 as a numpy.array, not a Python list
(so the ticket should be closed).
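The array-in/array-out convention described above can be sketched as follows (the wrapper name and elided solver body are my own illustration, not the actual tnc code):

```python
import numpy as np

def fmin_wrapper_sketch(func, x0):
    # Hypothetical sketch of the convention: x0 may arrive as a Python
    # list or a numpy array; the solver works on a numpy.array internally
    # and returns the optimal point as a numpy.array, never a Python list.
    x = np.asarray(x0, dtype=float)   # list or ndarray in -> ndarray
    fval = func(x)                    # solver iterations elided in this sketch
    return np.asarray(x), fval        # ndarray out

x_opt, fval = fmin_wrapper_sketch(lambda x: float((x ** 2).sum()), [1.0, 2.0])
```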

About a day was spent on a bug related to fmin_ncg, but it finally turned out to be related to sparse matrices.

Now I'm thinking about (bracket parameters). I intend to discuss my suggestions on the scipy dev mailing list tomorrow.

Tuesday, July 24, 2007

tnc 1.3 connected to scipy

So tnc 1.3 (ticket 296) is available in scipy svn rev. 3185.
Maybe it's worth connecting it to openopt (i.e. it should be available in openopt syntax provided scipy is installed).
Maybe the same should be done for fmin_cobyla and l_bfgs_b.
These fruits are rather low-hanging.

Monday, July 23, 2007

scipy optimize tickets

So, the scipy dev team asked me to close some tickets this week.
For now the tickets ready to be closed are:

Now I'm working on the ticket
(connecting tnc 1.3). It requires some more time.

It seems the same changes will have to be made each time a new tnc version appears; for example, the handling of lower and upper bounds differs between the tnc native interface and its scipy version. I think it would be much better if tnc had a single interface with a single svn home directory.
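As an illustration of the kind of interface mismatch meant here, consider converting between two common bound conventions: a list of (min, max) pairs where None means "unbounded" (the scipy style), and separate low/up arrays using +/-infinity (a style common in native solver interfaces). The function name and exact conventions are my assumptions, not the actual tnc glue code:

```python
import numpy as np

def pairs_to_low_up(bounds):
    # Hypothetical converter: (min, max) pairs with None for 'unbounded'
    # -> separate low/up arrays with -inf/+inf markers.
    low = np.array([-np.inf if l is None else l for l, u in bounds])
    up = np.array([np.inf if u is None else u for l, u in bounds])
    return low, up

low, up = pairs_to_low_up([(0, 1), (None, 5), (-2, None)])
```

Every new tnc release would need this kind of glue re-checked, which is why a single shared interface would save effort.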

Thursday, July 5, 2007

f, c, h patterns: progress is quite limited and slow

I have understood that it will take much more time than I previously planned (I hope one week will be enough).
Also, it will require using sparse matrices in some cases.
The patterns are implemented in OpenOpt for MATLAB, but in another, very limited way that is appropriate only for non-smooth solvers (like UkrOpt's ralg).

If the patterns implementation were easy, all free NLP solvers like fmin_cobyla would use them.
Of course, Python allows spending much less time on (patterns-related) code creation than, for example, Fortran, but the effort is still rather significant.
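For readers unfamiliar with f, c, h patterns: the idea is that if the solver knows which variables each objective/constraint actually depends on, numerical differentiation can skip the columns that are known to be zero. A minimal sketch, under my own assumed names and conventions (a boolean pattern matrix, simple forward differences; a real implementation could also group independent columns and use sparse storage):

```python
import numpy as np

def jac_with_pattern(c, x, pattern, diffInt=1e-7):
    # Sketch: numerical Jacobian of constraints c(x), where
    # pattern[i, j] = True means 'constraint i depends on variable j'.
    # Columns with an all-False pattern are skipped entirely, and computed
    # columns are zeroed where the pattern declares no dependency.
    x = np.asarray(x, dtype=float)
    c0 = np.asarray(c(x))
    J = np.zeros((c0.size, x.size))
    for j in range(x.size):
        if not pattern[:, j].any():      # no constraint touches x[j]
            continue
        xp = x.copy()
        xp[j] += diffInt
        J[:, j] = (np.asarray(c(xp)) - c0) / diffInt
        J[~pattern[:, j], j] = 0.0       # enforce declared sparsity
    return J

# c(x) = [x0**2, x1 + x2]: constraint 0 uses only x0, constraint 1 only x1, x2
x = np.array([1.0, 2.0, 3.0])
pattern = np.array([[True, False, False], [False, True, True]])
J = jac_with_pattern(lambda x: np.array([x[0] ** 2, x[1] + x[2]]), x, pattern)
```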

Tuesday, July 3, 2007

Michigan State Formula Racing: using solver ralg from OpenOpt for MATLAB

The video of the nonsmooth ralg solver from OpenOpt for MATLAB/Octave, linked with Ansys, working on a car design, is available at (~3.3MB).
DMX and SMX are the Maximum Displacement and the Maximum Von Mises stress, respectively.
This file is also available here.

see also: Michigan State Formula Racing website

Unified number of funcs evaluation

r.nFEvals, r.nGradEvals, etc. have been renamed to
r.nEvals.f, r.nEvals.df, r.nEvals.c, r.nEvals.dc, r.nEvals.h, r.nEvals.dh.
The same will be done for 2nd derivatives (they are unimplemented for now).
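The unified layout can be sketched like this (class names are my own illustration; only the r.nEvals.f, r.nEvals.df, ... field names come from the post):

```python
class _Counters:
    # Hypothetical sketch: one nEvals namespace with a field per
    # function/derivative kind, replacing flat names like nFEvals.
    def __init__(self):
        self.f = self.df = self.c = self.dc = self.h = self.dh = 0

class Result:
    def __init__(self):
        self.nEvals = _Counters()

r = Result()
r.nEvals.f += 1      # instead of the old r.nFEvals
r.nEvals.df += 1     # instead of the old r.nGradEvals
```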

Automatic gradient check: ready!

Here's an example of usage:
let's consider the NLP problem that I published 2 days ago (see here), and let's add the following lines:
p.check.df = 1
p.check.dc = 1
p.check.dh = 1
#also you may add/modify these ones:
p.diffInt = 1e-7 # default value is 1e-7
p.check.maxViolation = 1e-5 # default value is 1e-5

(diffInt is used in numerical gradient/subgradient obtaining;
lines where the difference between the user-supplied and OO-obtained (numerical) gradients is less than p.check.maxViolation will not be shown)

Run this example and check your output against mine (which is attached in a comment to the message).
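Under the hood, such a check boils down to comparing the user-supplied gradient against a finite-difference estimate and reporting only the entries whose discrepancy exceeds the tolerance. A minimal sketch, reusing the diffInt and maxViolation names from the post (the function itself is my own illustration, not the OpenOpt code):

```python
import numpy as np

def check_gradient(f, df, x, diffInt=1e-7, maxViolation=1e-5):
    # Compare the user-supplied gradient df(x) with a forward-difference
    # estimate using step diffInt; return only the entries whose
    # discrepancy exceeds maxViolation, as (index, user, numerical) tuples.
    x = np.asarray(x, dtype=float)
    user = np.asarray(df(x), dtype=float)
    num = np.empty_like(x)
    f0 = f(x)
    for j in range(x.size):
        xp = x.copy()
        xp[j] += diffInt
        num[j] = (f(xp) - f0) / diffInt
    bad = np.flatnonzero(np.abs(user - num) > maxViolation)
    return [(j, user[j], num[j]) for j in bad]

f = lambda x: (x ** 2).sum()
# correct gradient -> nothing reported
ok = check_gradient(f, lambda x: 2 * x, np.array([1.0, 2.0]))
# wrong entry 1 -> that entry is reported
bad = check_gradient(f, lambda x: np.array([2 * x[0], 0.0]), np.array([1.0, 2.0]))
```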

Monday, July 2, 2007

Automatic gradient check

I have committed some code related to the automatic gradient check to svn (plus some more minor changes), but it doesn't work properly yet.
It will provide text output in a way similar to OpenOpt for MATLAB/Octave, which is more convenient than what MATLAB 7's fmincon provides.
I hope it will be ready tomorrow.