dlr::optimization::OptimizerLM< Functor > Class Template Reference

OptimizerLM implements the Levenberg-Marquardt nonlinear least-squares minimization algorithm, as described in [1]. More...

#include <optimizerLM.h>

Inheritance diagram for dlr::optimization::OptimizerLM< Functor >
Collaboration diagram for dlr::optimization::OptimizerLM< Functor >

List of all members.

Public Types

typedef Functor::argument_type argument_type
 This is the Type of the objective function argument.
typedef Functor::result_type result_type
 This is the Type of the objective function return value.

Public Member Functions

 OptimizerLM ()
 The default constructor sets parameters to reasonable values for functions which take values and arguments in the "normal" range of 0 to 100 or so.
 OptimizerLM (const Functor &functor)
 This constructor specifies the specific Functor instance to use.
 OptimizerLM (const OptimizerLM &source)
 Copy constructor.
virtual ~OptimizerLM ()
 The destructor destroys the class instance and deallocates any associated storage.
virtual void setMinimumGradientMagnitude (double minimumGradientMagnitude)
 This method sets one of the termination criteria of the optimization.
virtual void setParameters (double initialLambda=1.0, size_t maxIterations=40, double maxLambda=1.0E7, double minLambda=1.0E-13, double minError=0.0, double minimumGradientMagnitude=1.0E-5, double minDrop=1.0E-4, size_t strikes=3, int maxBackSteps=-1, int verbosity=0)
 This method sets minimization parameters.
virtual void setStartPoint (const typename Functor::argument_type &startPoint)
 This method sets the initial conditions for the minimization.
virtual void setVerbosity (int verbosity)
 This method sets the amount of text printed to the standard output during the optimization.
virtual OptimizerLM & operator= (const OptimizerLM &source)
 The assignment operator deep copies its argument.
Functor objectiveFunction ()
 This method returns a copy of the Functor instance used for optimization.
result_type optimalValue ()
 This method finds the optimum of the current Functor, if necessary, and returns the Functor value at that point.
argument_type optimum ()
 This method finds the optimum of the current Functor, if necessary, and returns the Functor argument which produces that optimum.
void setObjectiveFunction (const Functor &functor)
 This method specifies the Functor instance to use for the optimization.

Protected Member Functions

double gradientConvergenceMetric (const argument_type &theta, const result_type &value, const argument_type &gradient)
 This protected member function is used to assess whether the algorithm has reached convergence.
virtual std::pair< typename Functor::argument_type, typename Functor::result_type > run ()
 Perform the optimization.
virtual void verboseWrite (const char *message, int verbosity)
template<class Type >
void verboseWrite (const char *intro, const Type &subject, int verbosity)
virtual void setOptimum (const typename Functor::argument_type &optimum, const typename Functor::result_type &optimalValue, bool needsFurtherOptimization)
 This protected member function provides a way for subclasses to communicate intermediate optimization results outside of the normal "return value of this->run()" method.

Protected Attributes

double m_initialLambda
int m_maxBackSteps
size_t m_maxIterations
double m_maxLambda
double m_minDrop
double m_minError
double m_minGrad
double m_minLambda
argument_type m_startPoint
size_t m_strikes
int m_verbosity
Functor m_functor
 m_functor.operator()() should compute the objective function.
bool m_needsOptimization
 Set to false if m_optimum contains a valid optimum, true otherwise.
argument_type m_optimum
 Caches the result of the most recent optimization.
result_type m_optimalValue
 Caches the result of the most recent optimization.


Detailed Description

template<class Functor>
class dlr::optimization::OptimizerLM< Functor >

OptimizerLM implements the Levenberg-Marquardt nonlinear least-squares minimization algorithm, as described in [1].

This algorithm seeks the parameter value which minimizes the objective function. The template parameter (Functor) defines the type to use as the objective function of the minimization, and must support the GradientFunction interface.

[1] W. H. Press et al., Numerical Recipes in C: The Art of Scientific Computing, Cambridge University Press, 1988.
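
For illustration, here is a minimal usage sketch. The Parabola functor below is hypothetical: the member names assumed for the GradientFunction interface (operator() and gradient()), and whether a plain double satisfies the library's argument_type requirements (run() relies on helpers such as copyArgumentType() and dotArgumentType()), are assumptions rather than guarantees of this reference.

    #include <iostream>
    #include <optimizerLM.h>

    // Hypothetical objective functor: f(x) = (x - 3)^2, minimized at x == 3.
    struct Parabola {
      typedef double argument_type;
      typedef double result_type;
      result_type operator()(const argument_type& x) const {
        return (x - 3.0) * (x - 3.0);
      }
      argument_type gradient(const argument_type& x) const {
        return 2.0 * (x - 3.0);
      }
    };

    int main() {
      Parabola objective;
      dlr::optimization::OptimizerLM<Parabola> optimizer(objective);
      optimizer.setStartPoint(10.0);            // initial guess in parameter space
      double xStar = optimizer.optimum();       // runs the minimization on demand
      double fStar = optimizer.optimalValue();  // cached value at the optimum
      std::cout << "x* = " << xStar << ", f(x*) = " << fStar << std::endl;
      return 0;
    }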

Definition at line 41 of file optimizerLM.h.


Member Typedef Documentation

template<class Functor>
typedef Functor::argument_type dlr::optimization::OptimizerLM< Functor >::argument_type

This is the Type of the objective function argument.

Reimplemented from dlr::optimization::Optimizer< Functor >.

Definition at line 46 of file optimizerLM.h.

template<class Functor>
typedef Functor::result_type dlr::optimization::OptimizerLM< Functor >::result_type

This is the Type of the objective function return value.

Reimplemented from dlr::optimization::Optimizer< Functor >.

Definition at line 47 of file optimizerLM.h.


Constructor & Destructor Documentation

template<class Functor >
dlr::optimization::OptimizerLM< Functor >::OptimizerLM (  )  [inline]

The default constructor sets parameters to reasonable values for functions which take values and arguments in the "normal" range of 0 to 100 or so.

Definition at line 322 of file optimizerLM.h.

References dlr::optimization::OptimizerLM< Functor >::setParameters().

template<class Functor >
dlr::optimization::OptimizerLM< Functor >::OptimizerLM ( const Functor &  functor  )  [inline, explicit]

This constructor specifies the specific Functor instance to use.

Using this constructor exclusively avoids the danger of calling optimalValue() or optimum() before a Functor instance has been specified.

Parameters:
functor A copy of this argument will be stored internally for use in optimization.

Definition at line 342 of file optimizerLM.h.

References dlr::optimization::OptimizerLM< Functor >::setParameters().

template<class Functor >
dlr::optimization::OptimizerLM< Functor >::OptimizerLM ( const OptimizerLM< Functor > &  source  )  [inline]

Copy constructor.

This constructor deep copies its argument.

Parameters:
source The OptimizerLM instance to be copied.

Definition at line 362 of file optimizerLM.h.

References dlr::optimization::copyArgumentType(), and dlr::optimization::OptimizerLM< Functor >::m_startPoint.

template<class Functor >
dlr::optimization::OptimizerLM< Functor >::~OptimizerLM (  )  [inline, virtual]

The destructor destroys the class instance and deallocates any associated storage.

Definition at line 382 of file optimizerLM.h.


Member Function Documentation

template<class Functor>
double dlr::optimization::OptimizerLM< Functor >::gradientConvergenceMetric ( const argument_type &  theta,
const result_type &  value,
const argument_type &  gradient 
) [protected]

This protected member function is used to assess whether the algorithm has reached convergence.

Parameters:
theta This argument specifies the parameter values (arguments to the objective function) being assessed.
value This argument specifies the function value at the point described by theta.
gradient This argument specifies the function gradient at the point described by theta.
Returns:
The return value gets progressively smaller as we approach a local minimum.
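
As a concrete illustration, the scaled gradient test from [1] behaves this way; whether OptimizerLM uses exactly this scaling is an assumption, and std::vector<double> stands in for the library's argument_type:

    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Largest gradient component, scaled by the parameter and function
    // magnitudes so that the test is roughly unit-free.  Shrinks toward zero
    // as a local minimum is approached.
    double gradientConvergenceMetricSketch(const std::vector<double>& theta,
                                           double value,
                                           const std::vector<double>& gradient) {
      double metric = 0.0;
      double denominator = std::max(std::fabs(value), 1.0);
      for (std::size_t ii = 0; ii < theta.size(); ++ii) {
        double term = std::fabs(gradient[ii])
                      * std::max(std::fabs(theta[ii]), 1.0) / denominator;
        metric = std::max(metric, term);
      }
      return metric;  // Compared against a threshold such as minimumGradientMagnitude.
    }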

Functor dlr::optimization::Optimizer< Functor >::objectiveFunction (  )  [inline, inherited]

This method returns a copy of the Functor instance used for optimization.

Returns:
A Functor instance.

Definition at line 91 of file optimizer.h.

References dlr::optimization::Optimizer< Functor >::m_functor.

template<class Functor >
OptimizerLM< Functor > & dlr::optimization::OptimizerLM< Functor >::operator= ( const OptimizerLM< Functor > &  source  )  [inline, virtual]

The assignment operator deep copies its argument.

Parameters:
source The OptimizerLM instance to be copied.

result_type dlr::optimization::Optimizer< Functor >::optimalValue (  )  [inherited]

This method finds the optimum of the current Functor, if necessary, and returns the Functor value at that point.

Note that you must have specified an objective function (Functor) before calling this method.

Returns:
The Functor value at its optimum.

argument_type dlr::optimization::Optimizer< Functor >::optimum (  )  [inherited]

This method finds the optimum of the current Functor, if necessary, and returns the Functor argument which produces that optimum.

Note that you must have specified an objective function (Functor) before calling this method.

Returns:
The Functor arguments which produce the optimal value.
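
As the notes above indicate, an objective function (and, for OptimizerLM, a start point) should be set before calling optimum() or optimalValue(). Reusing the hypothetical Parabola functor from the sketch in the Detailed Description, a minimal call sequence might be:

    dlr::optimization::OptimizerLM<Parabola> optimizer;
    optimizer.setObjectiveFunction(Parabola());
    optimizer.setStartPoint(10.0);
    double xStar = optimizer.optimum();
    double fStar = optimizer.optimalValue();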

template<class Functor >
std::pair< typename Functor::argument_type, typename Functor::result_type > dlr::optimization::OptimizerLM< Functor >::run (  )  [inline, protected, virtual]

Perform the optimization.

This virtual function overrides the definition in Optimizer.

Returns:
A std::pair of the vector parameter which brings the specified Functor to an optimum, and the corresponding optimal Functor value.

Implements dlr::optimization::Optimizer< Functor >.

Definition at line 484 of file optimizerLM.h.

References dlr::optimization::copyArgumentType(), dlr::optimization::dotArgumentType(), and dlr::optimization::Optimizer< Functor >::m_functor.

template<class Functor >
void dlr::optimization::OptimizerLM< Functor >::setMinimumGradientMagnitude ( double  minimumGradientMagnitude  )  [inline, virtual]

This method sets one of the termination criteria of the optimization.

Iteration will stop if the magnitude of the gradient of the objective function at the current location is less than the specified value.

Parameters:
minimumGradientMagnitude The value at which the magnitude of the objective function gradient should be considered small enough to terminate iteration.

Definition at line 393 of file optimizerLM.h.

void dlr::optimization::Optimizer< Functor >::setObjectiveFunction ( const Functor &  functor  )  [inherited]

This method specifies the Functor instance to use for the optimization.

If this function is overridden in a derived class, the override should normally either call Optimizer::setObjectiveFunction(), or explicitly set the member variable m_needsOptimization to true (see the sketch below).

Parameters:
functor A copy of this argument will be stored internally for use in optimization.
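
A partial sketch of how a hypothetical derived class might honor this contract (the class name MyOptimizer is illustrative only, and unrelated members are omitted):

    #include <optimizer.h>

    template <class Functor>
    class MyOptimizer : public dlr::optimization::Optimizer<Functor> {
    public:
      void setObjectiveFunction(const Functor& functor) {
        // Either forward to the base class, which stores a copy of the
        // functor and marks any cached optimum as stale...
        dlr::optimization::Optimizer<Functor>::setObjectiveFunction(functor);
        // ...or store the functor directly and set
        // this->m_needsOptimization = true by hand.
      }
    };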

virtual void dlr::optimization::Optimizer< Functor >::setOptimum ( const typename Functor ::argument_type &  optimum,
const typename Functor ::result_type &  optimalValue,
bool  needsFurtherOptimization 
) [inline, protected, virtual, inherited]

This protected member function provides a way for subclasses to communicate intermediate optimization results outside of the normal "return value of this->run()" method.

Parameters:
optimum This argument will be saved as the current optimum.
optimalValue This argument will be saved as the function value at the current optimum.
needsFurtherOptimization This argument indicates whether or not further refinement is necessary.

Definition at line 172 of file optimizer.h.

References dlr::optimization::Optimizer< Functor >::m_needsOptimization, dlr::optimization::Optimizer< Functor >::m_optimalValue, and dlr::optimization::Optimizer< Functor >::m_optimum.

Referenced by dlr::optimization::OptimizerBFGS< Functor >::doBfgs().
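
A partial sketch of a hypothetical Optimizer subclass that reports an intermediate result this way (MyIterativeOptimizer and its single-candidate "search" are illustrative only, and any other members the base class may require are omitted):

    #include <utility>
    #include <optimizer.h>

    template <class Functor>
    class MyIterativeOptimizer : public dlr::optimization::Optimizer<Functor> {
    public:
      explicit MyIterativeOptimizer(const typename Functor::argument_type& start)
        : m_candidate(start) {}
    protected:
      virtual std::pair<typename Functor::argument_type,
                        typename Functor::result_type>
      run() {
        // Evaluate the current candidate using the stored objective function.
        typename Functor::result_type candidateValue = this->m_functor(m_candidate);
        // Publish the intermediate result; passing true indicates that
        // further refinement is still needed.
        this->setOptimum(m_candidate, candidateValue, true);
        // ... a real implementation would keep iterating here ...
        return std::make_pair(m_candidate, candidateValue);
      }
    private:
      typename Functor::argument_type m_candidate;
    };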

template<class Functor >
void dlr::optimization::OptimizerLM< Functor >::setParameters ( double  initialLambda = 1.0,
size_t  maxIterations = 40,
double  maxLambda = 1.0E7,
double  minLambda = 1.0E-13,
double  minError = 0.0,
double  minimumGradientMagnitude = 1.0E-5,
double  minDrop = 1.0E-4,
size_t  strikes = 3,
int  maxBackSteps = -1,
int  verbosity = 0 
) [inline, virtual]

This method sets minimization parameters.

Default values are reasonable for functions which take values and arguments in the "normal" range of 0 to 100 or so.

Parameters:
initialLambda This argument sets the starting value of the Levenberg-Marquardt damping parameter lambda.
maxIterations Each minimization will terminate after at most this many iterations.
maxLambda This argument sets an upper limit on the damping parameter lambda.
minLambda This argument sets a lower limit on the damping parameter lambda.
minError Iteration will terminate if the objective function value falls to or below this value.
minimumGradientMagnitude Iteration will terminate if the magnitude of the objective function gradient falls below this value, as described for setMinimumGradientMagnitude().
minDrop This argument sets the smallest drop in the objective function value that is counted as significant progress.
strikes This argument sets how many consecutive iterations without significant progress are allowed before iteration terminates.
maxBackSteps This argument limits the number of back steps taken during the minimization.
verbosity This argument indicates the desired level of standard output, as described for setVerbosity().

Definition at line 402 of file optimizerLM.h.

Referenced by dlr::optimization::OptimizerLM< Functor >::OptimizerLM().
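
As an illustration, a call that tightens the stopping criteria might look like the following (the particular values are arbitrary, and maxBackSteps and verbosity keep their defaults from the signature above):

    optimizer.setParameters(1.0,      // initialLambda
                            100,      // maxIterations
                            1.0E7,    // maxLambda
                            1.0E-13,  // minLambda
                            1.0E-12,  // minError
                            1.0E-8,   // minimumGradientMagnitude
                            1.0E-6,   // minDrop
                            5);       // strikes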

template<class Functor >
void dlr::optimization::OptimizerLM< Functor >::setStartPoint ( const typename Functor::argument_type &  startPoint  )  [inline, virtual]

This method sets the initial conditions for the minimization.

Gradient based search will start at this location in parameter space.

Parameters:
startPoint Indicates a point in the parameter space of the objective function.

Definition at line 440 of file optimizerLM.h.

References dlr::optimization::copyArgumentType().

template<class Functor>
virtual void dlr::optimization::OptimizerLM< Functor >::setVerbosity ( int  verbosity  )  [inline, virtual]

This method sets the amount of text printed to the standard output during the optimization.

For OptimizerLM, the verbosity level controls how much progress information is written through the verboseWrite() members during the optimization.

Parameters:
verbosity This argument indicates the desired output level. Setting verbosity to zero means that no standard output should be generated. Higher numbers indicate increasingly more output.

Definition at line 218 of file optimizerLM.h.


Member Data Documentation

Functor dlr::optimization::Optimizer< Functor >::m_functor [protected, inherited]

bool dlr::optimization::Optimizer< Functor >::m_needsOptimization [protected, inherited]

result_type dlr::optimization::Optimizer< Functor >::m_optimalValue [protected, inherited]

Caches the result of the most recent optimization.

Definition at line 191 of file optimizer.h.

argument_type dlr::optimization::Optimizer< Functor >::m_optimum [protected, inherited]

Caches the result of the most recent optimization.

Definition at line 188 of file optimizer.h.


The documentation for this class was generated from the following file:

optimizerLM.h

Generated on Wed Nov 25 00:57:06 2009 for dlrOptimization Utility Library by doxygen 1.5.8