itk::AdaptiveStepsizeOptimizer Class Reference

#include <itkAdaptiveStepsizeOptimizer.h>

Detailed Description

This class implements a gradient descent optimizer with adaptive gain.

If $C(x)$ is a cost function that has to be minimized, the following iterative algorithm is used to find the optimal parameters $x$:

\[ x(k+1) = x(k) - a(t_k) dC/dx \]

The gain $a(t_k)$ at each iteration $k$ is defined by:

\[ a(t_k) = a / (A + t_k + 1)^\alpha. \]

And the time $t_k$ is updated according to:

\[ t_{k+1} = \left[ t_k + \mathrm{sigmoid}( -g_k^T g_{k-1} ) \right]^+ \]

where $g_k$ equals $dC/dx$ at iteration $k$, and $[\cdot]^+$ denotes the positive part, $\max(0, \cdot)$. For $t_0$ the InitialTime is used, which is defined in the superclass (StandardGradientDescentOptimizer). Whereas in the superclass this parameter is superfluous, in this class it makes sense.
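The gain and time-update rules above can be sketched in C++. This is an illustrative reimplementation, not the actual elastix code; the function names (`ComputeGain`, `Sigmoid`, `UpdateTime`) and the exact parameterization of the sigmoid are assumptions:

```cpp
#include <algorithm>
#include <cmath>
#include <numeric>
#include <vector>

// Gain a(t_k) = a / (A + t_k + 1)^alpha, as in the formula above.
double ComputeGain(double a, double A, double alpha, double time)
{
  return a / std::pow(A + time + 1.0, alpha);
}

// A scaled logistic with values in (sigmoidMin, sigmoidMax);
// the exact shape is an assumption for this sketch.
double Sigmoid(double x, double sigmoidMax, double sigmoidMin, double sigmoidScale)
{
  return sigmoidMin + (sigmoidMax - sigmoidMin) / (1.0 + std::exp(-x / sigmoidScale));
}

// t_{k+1} = [ t_k + sigmoid( -g_k^T g_{k-1} ) ]^+ : successive gradients that
// point in the same direction (positive inner product) shrink the time, which
// increases the gain; oscillating gradients grow the time and reduce the gain.
double UpdateTime(double time, const std::vector<double> & g,
                  const std::vector<double> & gPrev,
                  double sigmoidMax, double sigmoidMin, double sigmoidScale)
{
  const double inner = std::inner_product(g.begin(), g.end(), gPrev.begin(), 0.0);
  return std::max(0.0, time + Sigmoid(-inner, sigmoidMax, sigmoidMin, sigmoidScale));
}
```

With defaults in the spirit of this class (SigmoidMax = 1.0, SigmoidMin = -0.8), two successive gradients pointing the same way drive the sigmoid toward its minimum, so the time barely advances and the step size stays large.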

This method is described in the following references:

[1] P. Cruz, "Almost sure convergence and asymptotical normality of a generalization of Kesten's stochastic approximation algorithm for multidimensional case." Technical Report, 2005. http://hdl.handle.net/2052/74

[2] S. Klein, J.P.W. Pluim, M. Staring, and M.A. Viergever, "Adaptive stochastic gradient descent optimisation for image registration," International Journal of Computer Vision, vol. 81, no. 3, pp. 227-239, 2009. http://dx.doi.org/10.1007/s11263-008-0168-y

This optimizer is very suitable for use in combination with a stochastic estimate of the gradient $dC/dx$. For example, in image registration problems it is often advantageous to compute the metric derivative ($dC/dx$) on a new set of randomly selected image samples in each iteration. You may set the parameter NewSamplesEveryIteration to "true" to achieve this effect. For more information on this strategy, see reference [2] above.
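In an elastix parameter file, enabling this per-iteration resampling is a single setting (a minimal fragment; all other registration settings omitted):

```
(NewSamplesEveryIteration "true")
```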

See also
VoxelWiseASGD, StandardGradientDescentOptimizer

Definition at line 71 of file itkAdaptiveStepsizeOptimizer.h.

Inheritance diagram for itk::AdaptiveStepsizeOptimizer:

Public Types

typedef SmartPointer< const Self > ConstPointer
 
typedef Superclass::CostFunctionType CostFunctionType
 
typedef Superclass::DerivativeType DerivativeType
 
typedef Superclass::MeasureType MeasureType
 
typedef Superclass::ParametersType ParametersType
 
typedef SmartPointer< Self > Pointer
 
typedef Superclass::ScaledCostFunctionPointer ScaledCostFunctionPointer
 
typedef Superclass::ScaledCostFunctionType ScaledCostFunctionType
 
typedef Superclass::ScalesType ScalesType
 
typedef AdaptiveStepsizeOptimizer Self
 
typedef Superclass::StopConditionType StopConditionType
 
typedef StandardGradientDescentOptimizer Superclass
 
- Public Types inherited from itk::StandardGradientDescentOptimizer
typedef SmartPointer< const Self > ConstPointer
 
typedef Superclass::CostFunctionType CostFunctionType
 
typedef Superclass::DerivativeType DerivativeType
 
typedef Superclass::MeasureType MeasureType
 
typedef Superclass::ParametersType ParametersType
 
typedef SmartPointer< Self > Pointer
 
typedef Superclass::ScaledCostFunctionPointer ScaledCostFunctionPointer
 
typedef Superclass::ScaledCostFunctionType ScaledCostFunctionType
 
typedef Superclass::ScalesType ScalesType
 
typedef StandardGradientDescentOptimizer Self
 
typedef Superclass::StopConditionType StopConditionType
 
typedef GradientDescentOptimizer2 Superclass
 
- Public Types inherited from itk::GradientDescentOptimizer2
typedef SmartPointer< const Self > ConstPointer
 
typedef Superclass::CostFunctionType CostFunctionType
 
typedef Superclass::DerivativeType DerivativeType
 
typedef Superclass::MeasureType MeasureType
 
typedef Superclass::ParametersType ParametersType
 
typedef SmartPointer< Self > Pointer
 
typedef Superclass::ScaledCostFunctionPointer ScaledCostFunctionPointer
 
typedef Superclass::ScaledCostFunctionType ScaledCostFunctionType
 
typedef Superclass::ScalesType ScalesType
 
typedef GradientDescentOptimizer2 Self
 
enum  StopConditionType { MaximumNumberOfIterations , MetricError , MinimumStepSize }
 
typedef ScaledSingleValuedNonLinearOptimizer Superclass
 
- Public Types inherited from itk::ScaledSingleValuedNonLinearOptimizer
typedef SmartPointer< const Self > ConstPointer
 
typedef Superclass::CostFunctionType CostFunctionType
 
typedef Superclass::DerivativeType DerivativeType
 
typedef Superclass::MeasureType MeasureType
 
typedef Superclass::ParametersType ParametersType
 
typedef SmartPointer< Self > Pointer
 
typedef ScaledCostFunctionType::Pointer ScaledCostFunctionPointer
 
typedef ScaledSingleValuedCostFunction ScaledCostFunctionType
 
typedef NonLinearOptimizer::ScalesType ScalesType
 
typedef ScaledSingleValuedNonLinearOptimizer Self
 
typedef SingleValuedNonLinearOptimizer Superclass
 

Public Member Functions

virtual const char * GetClassName () const
 
virtual const ParametersType & GetPreconditionVector ()
 
virtual double GetSigmoidMax () const
 
virtual double GetSigmoidMin () const
 
virtual double GetSigmoidScale () const
 
virtual bool GetUseAdaptiveStepSizes () const
 
virtual void SetSigmoidMax (double _arg)
 
virtual void SetSigmoidMin (double _arg)
 
virtual void SetSigmoidScale (double _arg)
 
virtual void SetUseAdaptiveStepSizes (bool _arg)
 
- Public Member Functions inherited from itk::StandardGradientDescentOptimizer
void AdvanceOneStep (void) override
 
virtual const char * GetClassName () const
 
virtual double GetCurrentTime () const
 
virtual double GetInitialTime () const
 
virtual double GetParam_a () const
 
virtual double GetParam_A () const
 
virtual double GetParam_alpha () const
 
virtual void ResetCurrentTimeToInitialTime (void)
 
virtual void SetInitialTime (double _arg)
 
virtual void SetParam_a (double _arg)
 
virtual void SetParam_A (double _arg)
 
virtual void SetParam_alpha (double _arg)
 
void StartOptimization (void) override
 
- Public Member Functions inherited from itk::GradientDescentOptimizer2
virtual void AdvanceOneStep (void)
 
virtual const char * GetClassName () const
 
virtual unsigned int GetCurrentIteration () const
 
virtual const DerivativeType & GetGradient ()
 
virtual const double & GetLearningRate ()
 
virtual const unsigned long & GetNumberOfIterations ()
 
virtual const DerivativeType & GetSearchDirection ()
 
virtual const StopConditionType & GetStopCondition ()
 
virtual const double & GetValue ()
 
virtual void MetricErrorResponse (ExceptionObject &err)
 
virtual void ResumeOptimization (void)
 
virtual void SetLearningRate (double _arg)
 
virtual void SetNumberOfIterations (unsigned long _arg)
 
virtual void SetUseOpenMP (bool _arg)
 
void StartOptimization (void) override
 
virtual void StopOptimization (void)
 
- Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual const char * GetClassName () const
 
const ParametersType & GetCurrentPosition (void) const override
 
virtual bool GetMaximize () const
 
virtual const ScaledCostFunctionType & GetScaledCostFunction ()
 
virtual const ParametersType & GetScaledCurrentPosition ()
 
bool GetUseScales (void) const
 
virtual void InitializeScales (void)
 
virtual void MaximizeOff ()
 
virtual void MaximizeOn ()
 
void SetCostFunction (CostFunctionType *costFunction) override
 
virtual void SetMaximize (bool _arg)
 
virtual void SetUseScales (bool arg)
 

Static Public Member Functions

static Pointer New ()
 
- Static Public Member Functions inherited from itk::StandardGradientDescentOptimizer
static Pointer New ()
 
- Static Public Member Functions inherited from itk::GradientDescentOptimizer2
static Pointer New ()
 
- Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
static Pointer New ()
 

Protected Member Functions

 AdaptiveStepsizeOptimizer ()
 
void UpdateCurrentTime (void) override
 
 ~AdaptiveStepsizeOptimizer () override
 
- Protected Member Functions inherited from itk::StandardGradientDescentOptimizer
virtual double Compute_a (double k) const
 
 StandardGradientDescentOptimizer ()
 
virtual void UpdateCurrentTime (void)
 
 ~StandardGradientDescentOptimizer () override
 
- Protected Member Functions inherited from itk::GradientDescentOptimizer2
 GradientDescentOptimizer2 ()
 
void PrintSelf (std::ostream &os, Indent indent) const override
 
 ~GradientDescentOptimizer2 () override
 
- Protected Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual void GetScaledDerivative (const ParametersType &parameters, DerivativeType &derivative) const
 
virtual MeasureType GetScaledValue (const ParametersType &parameters) const
 
virtual void GetScaledValueAndDerivative (const ParametersType &parameters, MeasureType &value, DerivativeType &derivative) const
 
void PrintSelf (std::ostream &os, Indent indent) const override
 
 ScaledSingleValuedNonLinearOptimizer ()
 
void SetCurrentPosition (const ParametersType &param) override
 
virtual void SetScaledCurrentPosition (const ParametersType &parameters)
 
 ~ScaledSingleValuedNonLinearOptimizer () override
 

Protected Attributes

ParametersType m_PreconditionVector
 
DerivativeType m_PreviousSearchDirection
 
std::string m_StepSizeStrategy
 
- Protected Attributes inherited from itk::StandardGradientDescentOptimizer
double m_CurrentTime
 
bool m_UseConstantStep
 
- Protected Attributes inherited from itk::GradientDescentOptimizer2
unsigned long m_CurrentIteration
 
DerivativeType m_Gradient
 
double m_LearningRate
 
unsigned long m_NumberOfIterations
 
DerivativeType m_SearchDirection
 
bool m_Stop
 
StopConditionType m_StopCondition
 
double m_Value
 
- Protected Attributes inherited from itk::ScaledSingleValuedNonLinearOptimizer
ScaledCostFunctionPointer m_ScaledCostFunction
 
ParametersType m_ScaledCurrentPosition
 

Private Member Functions

 AdaptiveStepsizeOptimizer (const Self &)
 
void operator= (const Self &)
 

Private Attributes

double m_SigmoidMax
 
double m_SigmoidMin
 
double m_SigmoidScale
 
bool m_UseAdaptiveStepSizes
 

Member Typedef Documentation

◆ ConstPointer

typedef SmartPointer< const Self > itk::AdaptiveStepsizeOptimizer::ConstPointer

Definition at line 80 of file itkAdaptiveStepsizeOptimizer.h.

◆ CostFunctionType

typedef Superclass::CostFunctionType itk::AdaptiveStepsizeOptimizer::CostFunctionType

Definition at line 93 of file itkAdaptiveStepsizeOptimizer.h.

◆ DerivativeType

Definition at line 92 of file itkAdaptiveStepsizeOptimizer.h.

◆ MeasureType

typedef Superclass::MeasureType itk::AdaptiveStepsizeOptimizer::MeasureType

Typedefs inherited from the superclass.

Definition at line 90 of file itkAdaptiveStepsizeOptimizer.h.

◆ ParametersType

Definition at line 91 of file itkAdaptiveStepsizeOptimizer.h.

◆ Pointer

Definition at line 79 of file itkAdaptiveStepsizeOptimizer.h.

◆ ScaledCostFunctionPointer

typedef Superclass::ScaledCostFunctionPointer itk::AdaptiveStepsizeOptimizer::ScaledCostFunctionPointer

Definition at line 96 of file itkAdaptiveStepsizeOptimizer.h.

◆ ScaledCostFunctionType

typedef Superclass::ScaledCostFunctionType itk::AdaptiveStepsizeOptimizer::ScaledCostFunctionType

Definition at line 95 of file itkAdaptiveStepsizeOptimizer.h.

◆ ScalesType

typedef Superclass::ScalesType itk::AdaptiveStepsizeOptimizer::ScalesType

Definition at line 94 of file itkAdaptiveStepsizeOptimizer.h.

◆ Self

Standard ITK.

Definition at line 77 of file itkAdaptiveStepsizeOptimizer.h.

◆ StopConditionType

typedef Superclass::StopConditionType itk::AdaptiveStepsizeOptimizer::StopConditionType

Definition at line 97 of file itkAdaptiveStepsizeOptimizer.h.

◆ Superclass

Definition at line 78 of file itkAdaptiveStepsizeOptimizer.h.

Constructor & Destructor Documentation

◆ AdaptiveStepsizeOptimizer() [1/2]

itk::AdaptiveStepsizeOptimizer::AdaptiveStepsizeOptimizer ( )
protected

◆ ~AdaptiveStepsizeOptimizer()

itk::AdaptiveStepsizeOptimizer::~AdaptiveStepsizeOptimizer ( )
inlineoverrideprotected

Definition at line 124 of file itkAdaptiveStepsizeOptimizer.h.

◆ AdaptiveStepsizeOptimizer() [2/2]

itk::AdaptiveStepsizeOptimizer::AdaptiveStepsizeOptimizer ( const Self & )
private

Member Function Documentation

◆ GetClassName()

virtual const char * itk::AdaptiveStepsizeOptimizer::GetClassName ( ) const
virtual

Run-time type information (and related methods).

Reimplemented from itk::StandardGradientDescentOptimizer.

Reimplemented in elastix::AdaGrad< TElastix >.

◆ GetPreconditionVector()

virtual const ParametersType & itk::AdaptiveStepsizeOptimizer::GetPreconditionVector ( )
virtual

Get the precondition vector.

◆ GetSigmoidMax()

virtual double itk::AdaptiveStepsizeOptimizer::GetSigmoidMax ( ) const
virtual

◆ GetSigmoidMin()

virtual double itk::AdaptiveStepsizeOptimizer::GetSigmoidMin ( ) const
virtual

◆ GetSigmoidScale()

virtual double itk::AdaptiveStepsizeOptimizer::GetSigmoidScale ( ) const
virtual

◆ GetUseAdaptiveStepSizes()

virtual bool itk::AdaptiveStepsizeOptimizer::GetUseAdaptiveStepSizes ( ) const
virtual

◆ New()

static Pointer itk::AdaptiveStepsizeOptimizer::New ( )
static

Method for creation through the object factory.

◆ operator=()

void itk::AdaptiveStepsizeOptimizer::operator= ( const Self & )
private

◆ SetSigmoidMax()

virtual void itk::AdaptiveStepsizeOptimizer::SetSigmoidMax ( double  _arg)
virtual

Set/Get the maximum of the sigmoid. Should be >0. Default: 1.0

◆ SetSigmoidMin()

virtual void itk::AdaptiveStepsizeOptimizer::SetSigmoidMin ( double  _arg)
virtual

Set/Get the minimum of the sigmoid. Should be <0. Default: -0.8

◆ SetSigmoidScale()

virtual void itk::AdaptiveStepsizeOptimizer::SetSigmoidScale ( double  _arg)
virtual

Set/Get the scaling of the sigmoid width. Large values cause a wider sigmoid. Should be >0. Default: 1e-8.

◆ SetUseAdaptiveStepSizes()

virtual void itk::AdaptiveStepsizeOptimizer::SetUseAdaptiveStepSizes ( bool  _arg)
virtual

Set/Get whether the adaptive step size mechanism is desired. Default: true

◆ UpdateCurrentTime()

void itk::AdaptiveStepsizeOptimizer::UpdateCurrentTime ( void  )
overrideprotectedvirtual

Function to update the current time. If UseAdaptiveStepSizes is false, this function simply increments the CurrentTime by the constant $E_0 = (\mathrm{sigmoid}_{max} + \mathrm{sigmoid}_{min})/2$. Otherwise, the CurrentTime is updated according to:
\[ t_{k+1} = \left[ t_k + \mathrm{sigmoid}( -g_k^T g_{k-1} ) \right]^+ \]
In that case, the m_PreviousGradient is also updated.

Reimplemented from itk::StandardGradientDescentOptimizer.
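The two branches described above can be sketched as a free function. This is a hedged illustration, not the elastix implementation; the signature and the exact sigmoid parameterization are assumptions:

```cpp
#include <algorithm>
#include <cmath>
#include <numeric>
#include <vector>

// Hypothetical sketch of the two UpdateCurrentTime branches.
double UpdateCurrentTime(double currentTime, bool useAdaptiveStepSizes,
                         const std::vector<double> & gradient,
                         const std::vector<double> & previousGradient,
                         double sigmoidMax, double sigmoidMin, double sigmoidScale)
{
  if (!useAdaptiveStepSizes)
  {
    // Constant increment E_0 = (sigmoidMax + sigmoidMin)/2: the expected
    // sigmoid value when successive gradients are uncorrelated.
    return currentTime + (sigmoidMax + sigmoidMin) / 2.0;
  }

  // time = max[ 0, time + sigmoid( -gradient^T previousGradient ) ]
  const double inner =
    std::inner_product(gradient.begin(), gradient.end(), previousGradient.begin(), 0.0);
  const double sigmoid =
    sigmoidMin + (sigmoidMax - sigmoidMin) / (1.0 + std::exp(inner / sigmoidScale));
  return std::max(0.0, currentTime + sigmoid);
}
```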

Field Documentation

◆ m_PreconditionVector

ParametersType itk::AdaptiveStepsizeOptimizer::m_PreconditionVector
protected

Definition at line 137 of file itkAdaptiveStepsizeOptimizer.h.

◆ m_PreviousSearchDirection

DerivativeType itk::AdaptiveStepsizeOptimizer::m_PreviousSearchDirection
protected

The previous search direction, necessary for the Cruz acceleration.

Definition at line 136 of file itkAdaptiveStepsizeOptimizer.h.

◆ m_SigmoidMax

double itk::AdaptiveStepsizeOptimizer::m_SigmoidMax
private

Definition at line 147 of file itkAdaptiveStepsizeOptimizer.h.

◆ m_SigmoidMin

double itk::AdaptiveStepsizeOptimizer::m_SigmoidMin
private

Definition at line 148 of file itkAdaptiveStepsizeOptimizer.h.

◆ m_SigmoidScale

double itk::AdaptiveStepsizeOptimizer::m_SigmoidScale
private

Definition at line 149 of file itkAdaptiveStepsizeOptimizer.h.

◆ m_StepSizeStrategy

std::string itk::AdaptiveStepsizeOptimizer::m_StepSizeStrategy
protected

Definition at line 138 of file itkAdaptiveStepsizeOptimizer.h.

◆ m_UseAdaptiveStepSizes

bool itk::AdaptiveStepsizeOptimizer::m_UseAdaptiveStepSizes
private

Settings

Definition at line 146 of file itkAdaptiveStepsizeOptimizer.h.



Generated for elastix by doxygen 1.9.4