Release Notes
This section discusses changes between versions, especially significant changes to the use and behavior of the library. It is not meant to be a comprehensive list of changes; for a complete record, consult the lmfit GitHub repository.
Version 0.9.5 Release Notes
Support for Python 2.6 and scipy 0.13 has been dropped.
Version 0.9.4 Release Notes
Some support for the new least_squares routine from scipy 0.17 has been added.
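One way to select this solver is by method name. The residual function, data, and parameter names in this sketch are illustrative, not part of the release:

import numpy as np
from lmfit import Parameters, minimize

def residual(params, x, data):
    # simple exponential-decay model
    model = params['amp'].value * np.exp(-x / params['decay'].value)
    return model - data

x = np.linspace(0, 10, 201)
data = 5.0 * np.exp(-x / 2.0) + np.random.normal(scale=0.1, size=x.size)

params = Parameters()
params.add('amp', value=1.0, min=0)
params.add('decay', value=1.0, min=0.01)

# method='least_squares' dispatches to scipy.optimize.least_squares
out = minimize(residual, params, args=(x, data), method='least_squares')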
Parameters can now be used directly in floating-point or array expressions, so that the Parameter value does not need to be extracted explicitly with sigma = params['sigma'].value. The older, explicit usage still works, but the docs, samples, and tests have been updated to use the simpler usage.
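As a short illustration (the sigma parameter and Gaussian-like expression here are hypothetical):

import numpy as np
from lmfit import Parameters

params = Parameters()
params.add('sigma', value=1.5)

x = np.linspace(-5, 5, 101)

# older, explicit usage: extract the float value first
sigma = params['sigma'].value
y_old = np.exp(-x**2 / (2*sigma**2))

# newer usage: the Parameter participates directly in the array expression
y_new = np.exp(-x**2 / (2*params['sigma']**2))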
Support for Python 2.6 and scipy 0.13 is now explicitly deprecated and will be dropped in version 0.9.5.
Version 0.9.3 Release Notes
Models involving complex numbers have been improved.
The emcee module can now be used for uncertainty estimation.
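A minimal sketch of that usage, assuming the emcee package is installed (the straight-line model and parameter names are illustrative):

import numpy as np
from lmfit import Parameters, Minimizer

def residual(params, x, data):
    # straight-line model
    return params['a'].value * x + params['b'].value - data

x = np.linspace(0, 10, 101)
data = 3.0 * x + 1.0 + np.random.normal(scale=0.5, size=x.size)

params = Parameters()
params.add('a', value=1.0)
params.add('b', value=0.0)

# sample the posterior distribution of the parameters with emcee
mini = Minimizer(residual, params, fcn_args=(x, data))
res = mini.emcee(burn=300, steps=1000, thin=20)

The parameters on the returned result then carry uncertainty estimates derived from the posterior samples.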
Many bug fixes, including an important fix for a performance slowdown when getting parameter values.
ASV benchmarking code added.
Version 0.9.0 Release Notes
This upgrade makes an important, non-backward-compatible change to the way many fitting scripts and programs will work. Scripts that work with version 0.8.3 will not work with version 0.9.0 and vice versa. The change was not made lightly or without ample discussion, and is really an improvement. Modifying scripts that did work with 0.8.3 to work with 0.9.0 is easy, but needs to be done.
Summary
The upgrade from 0.8.3 to 0.9.0 introduced the MinimizerResult class (see MinimizerResult – the optimization result), which is now used to hold the return value from minimize() and Minimizer.minimize(). This returned object contains many goodness-of-fit statistics and holds the optimized parameters from the fit. Importantly, the parameters passed into minimize() and Minimizer.minimize() are no longer modified by the fit. Instead, a copy of the passed-in parameters is made; this copy is changed by the fit and returned as the params attribute of the MinimizerResult.
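For example, the fit statistics and best-fit parameter values are reached by attribute name on the returned object. This short sketch assumes an objective function objfunc and a Parameters object my_pars like those in the example below:

result = minimize(objfunc, my_pars)

print(result.nfev)      # number of function evaluations
print(result.chisqr)    # chi-square statistic
print(result.redchi)    # reduced chi-square

# best-fit values live on result.params; my_pars is left unchanged
print(result.params['amp'].value)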
Impact
This upgrade means that a script that does:
my_pars = Parameters()
my_pars.add('amp', value=300.0, min=0)
my_pars.add('center', value=5.0, min=0, max=10)
my_pars.add('decay', value=1.0, vary=False)
result = minimize(objfunc, my_pars)
will still work, but that my_pars will NOT be changed by the fit. Instead, my_pars is copied to an internal set of parameters that is changed in the fit, and this copy is then put in result.params. To look at fit results, use result.params, not my_pars.

This has the effect that my_pars will still hold the starting parameter values, while all of the results from the fit are held in the result object returned by minimize().
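So the starting and best-fit values can be compared side by side after the fit, as in this short sketch:

# my_pars holds the starting values; result.params holds the best-fit values
for name in my_pars:
    print(name, 'start =', my_pars[name].value,
          'best-fit =', result.params[name].value)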
If you want to do an initial fit and then refine that result, for example using one method as a pre-fit and then refining with a different fitting method, that could look like:
result1 = minimize(objfunc, my_pars, method='nelder')
result1.params['decay'].vary = True
result2 = minimize(objfunc, result1.params, method='leastsq')
and have access to all of the starting parameters my_pars, the result of the first fit result1, and the result of the final fit result2.
Discussion
The main goals for making this change were to:

- Give a better return value to minimize() and Minimizer.minimize() that can hold all of the information about a fit. By having the return value be an instance of the MinimizerResult class, it can hold an arbitrary amount of information that is easily accessed by attribute name, and even be given methods. Using objects is good!
- Limit or even eliminate the amount of “state information” a Minimizer holds. By state information, we mean how much of the previous fit is remembered after a fit is done. Keeping (and especially using) such information about a previous fit means that a Minimizer might give different results even for the same problem if run a second time. While it is desirable to be able to adjust a set of Parameters and re-run a fit to get an improved result, doing this by changing an internal attribute (Minimizer.params) has the undesirable side effect of not being able to “go back”, and it makes it somewhat cumbersome to keep track of changes made while adjusting parameters and re-running fits.