Free Python optimization framework

Friday, October 10, 2008

Example of 10X speedup for NLP via oofun

I have committed an example of an unconstrained NLP where using oofun yields a 10X speedup over classic style (by reducing the time required to approximate the gradient of a costly function).

Of course,
  • I could construct an example where the gain is 100X or even more, but it would take longer to wait for the output.
  • Similar examples can be constructed for constrained NLP or other classes from the non-linear group (NSP, NLSP, LSP).
  • I used the scipy_ncg solver, but a speedup of roughly 4X...10X is observed with any other non-linear solver as well (provided the solver uses at least 1st derivatives, so scipy_cobyla, goldenSection, scipy_fminbound, scipy_powell and the GLP solvers are inappropriate).
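To illustrate where the speedup comes from, here is a minimal sketch of the idea (this is not the actual OpenOpt/oofun API; the function names `cheap`, `costly`, and both gradient helpers are hypothetical). When the objective is one opaque function, a finite-difference gradient must re-evaluate the whole thing n+1 times. When it is expressed as a graph of sub-functions, the kernel knows which sub-function depends on which variables and can skip re-evaluating a costly piece for coordinates it does not depend on:

```python
import numpy as np

# Hypothetical example, not the oofun API: f(x) = cheap(x) + costly(x[0]),
# where costly depends on a single variable only.
N_CALLS = {"costly": 0}

def cheap(x):
    return float(np.sum(x ** 2))

def costly(x0):
    N_CALLS["costly"] += 1  # stands in for an expensive computation
    return float(x0 ** 4)

def f(x):
    return cheap(x) + costly(x[0])

def fd_grad_monolithic(x, h=1e-6):
    # classic style: numerical gradient of the whole objective,
    # so costly is evaluated n+1 times
    f0 = f(x)
    g = np.zeros_like(x)
    for i in range(len(x)):
        xp = x.copy(); xp[i] += h
        g[i] = (f(xp) - f0) / h
    return g

def fd_grad_structured(x, h=1e-6):
    # oofun-style: the dependency graph records that costly depends
    # only on x[0], so it is re-evaluated for that coordinate only
    c0, s0 = cheap(x), costly(x[0])
    g = np.zeros_like(x)
    for i in range(len(x)):
        xp = x.copy(); xp[i] += h
        si = costly(xp[0]) if i == 0 else s0
        g[i] = ((cheap(xp) + si) - (c0 + s0)) / h
    return g

x = np.ones(50)
fd_grad_monolithic(x)
calls_mono = N_CALLS["costly"]   # 51 evaluations of costly
N_CALLS["costly"] = 0
fd_grad_structured(x)
calls_struct = N_CALLS["costly"] # 2 evaluations of costly
print(calls_mono, calls_struct)  # → 51 2
```

With 50 variables the costly sub-function is evaluated 51 times in classic style but only twice in the structured style; the larger the fraction of the total cost concentrated in such sub-functions, the closer the speedup gets to n+1.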
Also, some changes to oofun- and oovar-related files and some other OO Kernel files have been committed.
