itk::FiniteDifferenceGradientDescentOptimizer Class Reference

#include <itkFiniteDifferenceGradientDescentOptimizer.h>

Detailed Description

An optimizer based on gradient descent ...

If $C(x)$ is a cost function that has to be minimised, the following iterative algorithm is used to find the optimal parameters $x$:

\[ x(k+1)_j = x(k)_j - a(k) \left[ C\bigl(x(k)_j + c(k)\bigr) - C\bigl(x(k)_j - c(k)\bigr) \right] / \bigl(2 c(k)\bigr), \]

for all parameters $j$.

From this equation it is clear that this is a gradient descent optimizer that uses a finite difference approximation of the gradient.

The gain $a(k)$ at each iteration $k$ is defined by:

\[ a(k) =  a / (A + k + 1)^{\alpha}. \]

The perturbation size $c(k)$ at each iteration $k$ is defined by:

\[ c(k) =  c / (k + 1)^{\gamma}. \]
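As an illustration of these two decay schedules, the following is a hedged sketch only, not the code of the protected Compute_a() and Compute_c() members; the free-function names below are hypothetical:

    #include <cmath>

    // Gain sequence: a(k) = a / (A + k + 1)^alpha (see the formula above).
    double
    compute_a(unsigned long k, double a, double A, double alpha)
    {
      return a / std::pow(A + static_cast<double>(k) + 1.0, alpha);
    }

    // Perturbation sequence: c(k) = c / (k + 1)^gamma (see the formula above).
    double
    compute_c(unsigned long k, double c, double gamma)
    {
      return c / std::pow(static_cast<double>(k) + 1.0, gamma);
    }

Both sequences decay to zero as the iteration number k grows, so the step size and the width of the finite difference perturbation shrink over the course of the optimization.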

Note the similarities to the SimultaneousPerturbation optimizer and the StandardGradientDescent optimizer.
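Putting the update rule and the two decay sequences together, a self-contained sketch of the iterative scheme could look as follows. This is an illustration under assumptions, not the implementation of this class: the cost function is modelled as a std::function and all names are hypothetical.

    #include <cmath>
    #include <cstddef>
    #include <functional>
    #include <vector>

    using Parameters = std::vector<double>;

    // One finite difference gradient descent run: for every parameter j,
    // approximate dC/dx_j with a central difference of width c(k) and take
    // a descent step with gain a(k).
    Parameters
    finiteDifferenceGradientDescent(const std::function<double(const Parameters &)> & C,
                                    Parameters x, unsigned long numberOfIterations,
                                    double a, double A, double alpha,
                                    double c, double gamma)
    {
      for (unsigned long k = 0; k < numberOfIterations; ++k)
      {
        const double ak = a / std::pow(A + static_cast<double>(k) + 1.0, alpha); // a(k)
        const double ck = c / std::pow(static_cast<double>(k) + 1.0, gamma);     // c(k)

        // Finite difference approximation of the gradient.
        Parameters gradient(x.size(), 0.0);
        for (std::size_t j = 0; j < x.size(); ++j)
        {
          Parameters xPlus = x;
          Parameters xMinus = x;
          xPlus[j] += ck;
          xMinus[j] -= ck;
          gradient[j] = (C(xPlus) - C(xMinus)) / (2.0 * ck);
        }

        // Gradient descent step: x(k+1)_j = x(k)_j - a(k) * gradient_j.
        for (std::size_t j = 0; j < x.size(); ++j)
        {
          x[j] -= ak * gradient[j];
        }
      }
      return x;
    }

Note that such a scheme needs two cost function evaluations per parameter per iteration, whereas the SimultaneousPerturbation optimizer needs only two evaluations per iteration in total, and a standard gradient descent uses the analytically computed derivative of the cost function.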

See also
FiniteDifferenceGradientDescent

Definition at line 55 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Inheritance diagram for itk::FiniteDifferenceGradientDescentOptimizer: the class derives from itk::ScaledSingleValuedNonLinearOptimizer and is the base of elastix::FiniteDifferenceGradientDescent< TElastix >.

Public Types

typedef SmartPointer< const Self > ConstPointer
 
typedef SmartPointer< Self > Pointer
 
typedef FiniteDifferenceGradientDescentOptimizer Self
 
enum StopConditionType { MaximumNumberOfIterations, MetricError }
 
typedef ScaledSingleValuedNonLinearOptimizer Superclass
 
Public Types inherited from itk::ScaledSingleValuedNonLinearOptimizer
typedef SmartPointer< const Self > ConstPointer
 
typedef Superclass::CostFunctionType CostFunctionType
 
typedef Superclass::DerivativeType DerivativeType
 
typedef Superclass::MeasureType MeasureType
 
typedef Superclass::ParametersType ParametersType
 
typedef SmartPointer< Self > Pointer
 
typedef ScaledCostFunctionType::Pointer ScaledCostFunctionPointer
 
typedef ScaledSingleValuedCostFunction ScaledCostFunctionType
 
typedef NonLinearOptimizer::ScalesType ScalesType
 
typedef ScaledSingleValuedNonLinearOptimizer Self
 
typedef SingleValuedNonLinearOptimizer Superclass
 

Public Member Functions

virtual void AdvanceOneStep (void)
 
virtual void ComputeCurrentValueOff ()
 
virtual void ComputeCurrentValueOn ()
 
virtual const char * GetClassName () const
 
virtual bool GetComputeCurrentValue () const
 
virtual unsigned long GetCurrentIteration () const
 
virtual double GetGradientMagnitude () const
 
virtual double GetLearningRate () const
 
virtual unsigned long GetNumberOfIterations () const
 
virtual double GetParam_a ()
 
virtual double GetParam_A ()
 
virtual double GetParam_alpha ()
 
virtual double GetParam_c ()
 
virtual double GetParam_gamma ()
 
virtual StopConditionType GetStopCondition () const
 
virtual double GetValue () const
 
void ResumeOptimization (void)
 
virtual void SetComputeCurrentValue (bool _arg)
 
virtual void SetNumberOfIterations (unsigned long _arg)
 
virtual void SetParam_a (double _arg)
 
virtual void SetParam_A (double _arg)
 
virtual void SetParam_alpha (double _arg)
 
virtual void SetParam_c (double _arg)
 
virtual void SetParam_gamma (double _arg)
 
void StartOptimization (void) override
 
void StopOptimization (void)
 
Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual const char * GetClassName () const
 
const ParametersType & GetCurrentPosition (void) const override
 
virtual bool GetMaximize () const
 
virtual const ScaledCostFunctionType * GetScaledCostFunction ()
 
virtual const ParametersType & GetScaledCurrentPosition ()
 
bool GetUseScales (void) const
 
virtual void InitializeScales (void)
 
virtual void MaximizeOff ()
 
virtual void MaximizeOn ()
 
void SetCostFunction (CostFunctionType *costFunction) override
 
virtual void SetMaximize (bool _arg)
 
virtual void SetUseScales (bool arg)
 

Static Public Member Functions

static Pointer New ()
 
Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
static Pointer New ()
 

Protected Member Functions

virtual double Compute_a (unsigned long k) const
 
virtual double Compute_c (unsigned long k) const
 
 FiniteDifferenceGradientDescentOptimizer ()
 
void PrintSelf (std::ostream &os, Indent indent) const override
 
 ~FiniteDifferenceGradientDescentOptimizer () override
 
Protected Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual void GetScaledDerivative (const ParametersType &parameters, DerivativeType &derivative) const
 
virtual MeasureType GetScaledValue (const ParametersType &parameters) const
 
virtual void GetScaledValueAndDerivative (const ParametersType &parameters, MeasureType &value, DerivativeType &derivative) const
 
void PrintSelf (std::ostream &os, Indent indent) const override
 
 ScaledSingleValuedNonLinearOptimizer ()
 
void SetCurrentPosition (const ParametersType &param) override
 
virtual void SetScaledCurrentPosition (const ParametersType &parameters)
 
 ~ScaledSingleValuedNonLinearOptimizer () override
 

Protected Attributes

bool m_ComputeCurrentValue
 
DerivativeType m_Gradient
 
double m_GradientMagnitude
 
double m_LearningRate
 
Protected Attributes inherited from itk::ScaledSingleValuedNonLinearOptimizer
ScaledCostFunctionPointer m_ScaledCostFunction
 
ParametersType m_ScaledCurrentPosition
 

Private Member Functions

 FiniteDifferenceGradientDescentOptimizer (const Self &)
 
void operator= (const Self &)
 

Private Attributes

unsigned long m_CurrentIteration
 
unsigned long m_NumberOfIterations
 
double m_Param_a
 
double m_Param_A
 
double m_Param_alpha
 
double m_Param_c
 
double m_Param_gamma
 
bool m_Stop
 
StopConditionType m_StopCondition
 
double m_Value
 

Member Typedef Documentation

◆ ConstPointer

◆ Pointer

◆ Self

Standard class typedefs.

Definition at line 61 of file itkFiniteDifferenceGradientDescentOptimizer.h.

◆ Superclass

Member Enumeration Documentation

◆ StopConditionType

Codes of stopping conditions

Enumerator
MaximumNumberOfIterations 
MetricError 

Definition at line 73 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Constructor & Destructor Documentation

◆ FiniteDifferenceGradientDescentOptimizer() [1/2]

itk::FiniteDifferenceGradientDescentOptimizer::FiniteDifferenceGradientDescentOptimizer ( )
protected

◆ ~FiniteDifferenceGradientDescentOptimizer()

itk::FiniteDifferenceGradientDescentOptimizer::~FiniteDifferenceGradientDescentOptimizer ( )
inlineoverrideprotected

◆ FiniteDifferenceGradientDescentOptimizer() [2/2]

itk::FiniteDifferenceGradientDescentOptimizer::FiniteDifferenceGradientDescentOptimizer ( const Self & )
private

Member Function Documentation

◆ AdvanceOneStep()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::AdvanceOneStep ( void  )
virtual

Advance one step following the gradient direction.

◆ Compute_a()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::Compute_a ( unsigned long  k) const
protectedvirtual

◆ Compute_c()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::Compute_c ( unsigned long  k) const
protectedvirtual

◆ ComputeCurrentValueOff()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::ComputeCurrentValueOff ( )
virtual

◆ ComputeCurrentValueOn()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::ComputeCurrentValueOn ( )
virtual

◆ GetClassName()

virtual const char * itk::FiniteDifferenceGradientDescentOptimizer::GetClassName ( ) const
virtual

Run-time type information (and related methods).

Reimplemented from itk::ScaledSingleValuedNonLinearOptimizer.

Reimplemented in elastix::FiniteDifferenceGradientDescent< TElastix >.

◆ GetComputeCurrentValue()

virtual bool itk::FiniteDifferenceGradientDescentOptimizer::GetComputeCurrentValue ( ) const
virtual

◆ GetCurrentIteration()

virtual unsigned long itk::FiniteDifferenceGradientDescentOptimizer::GetCurrentIteration ( ) const
virtual

Get the current iteration number.

◆ GetGradientMagnitude()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetGradientMagnitude ( ) const
virtual

Get the GradientMagnitude and the LearningRate (a_k).

◆ GetLearningRate()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetLearningRate ( ) const
virtual

◆ GetNumberOfIterations()

virtual unsigned long itk::FiniteDifferenceGradientDescentOptimizer::GetNumberOfIterations ( ) const
virtual

Get the number of iterations.

◆ GetParam_a()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_a ( )
virtual

◆ GetParam_A()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_A ( )
virtual

◆ GetParam_alpha()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_alpha ( )
virtual

◆ GetParam_c()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_c ( )
virtual

◆ GetParam_gamma()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_gamma ( )
virtual

◆ GetStopCondition()

virtual StopConditionType itk::FiniteDifferenceGradientDescentOptimizer::GetStopCondition ( ) const
virtual

Get Stop condition.

◆ GetValue()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetValue ( ) const
virtual

Get the current value.

◆ New()

static Pointer itk::FiniteDifferenceGradientDescentOptimizer::New ( )
static

Method for creation through the object factory.

◆ operator=()

void itk::FiniteDifferenceGradientDescentOptimizer::operator= ( const Self & )
private

◆ PrintSelf()

void itk::FiniteDifferenceGradientDescentOptimizer::PrintSelf ( std::ostream & os, Indent indent ) const
overrideprotected

PrintSelf method.

◆ ResumeOptimization()

void itk::FiniteDifferenceGradientDescentOptimizer::ResumeOptimization ( void  )

Resume previously stopped optimization with current parameters

See also
StopOptimization.

◆ SetComputeCurrentValue()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetComputeCurrentValue ( bool  _arg)
virtual

◆ SetNumberOfIterations()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetNumberOfIterations ( unsigned long  _arg)
virtual

Set the number of iterations.

◆ SetParam_a()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_a ( double  _arg)
virtual

Set/Get a.

◆ SetParam_A()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_A ( double  _arg)
virtual

Set/Get A.

◆ SetParam_alpha()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_alpha ( double  _arg)
virtual

Set/Get alpha.

◆ SetParam_c()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_c ( double  _arg)
virtual

Set/Get c.

◆ SetParam_gamma()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_gamma ( double  _arg)
virtual

Set/Get gamma.

◆ StartOptimization()

void itk::FiniteDifferenceGradientDescentOptimizer::StartOptimization ( void  )
override

Start optimization.
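A hedged usage sketch (not taken from the elastix sources): MyMetric is a hypothetical itk::SingleValuedCostFunction subclass, SetInitialPosition() is inherited from itk::Optimizer, and all numeric values are placeholders rather than recommended settings.

    #include "itkFiniteDifferenceGradientDescentOptimizer.h"

    using OptimizerType = itk::FiniteDifferenceGradientDescentOptimizer;

    auto optimizer = OptimizerType::New();
    auto metric = MyMetric::New(); // hypothetical cost function

    OptimizerType::ParametersType initialParameters(metric->GetNumberOfParameters());
    initialParameters.Fill(0.0);

    optimizer->SetCostFunction(metric);
    optimizer->SetInitialPosition(initialParameters);
    optimizer->SetNumberOfIterations(500);
    optimizer->SetParam_a(400.0);       // a
    optimizer->SetParam_A(50.0);        // A
    optimizer->SetParam_alpha(0.602);   // alpha
    optimizer->SetParam_c(1.0);         // c
    optimizer->SetParam_gamma(0.101);   // gamma
    optimizer->ComputeCurrentValueOn(); // also evaluate C(x(k)) for progress reporting

    optimizer->StartOptimization();

    if (optimizer->GetStopCondition() == OptimizerType::MaximumNumberOfIterations)
    {
      // Stopped because the maximum number of iterations was reached.
    }

After StartOptimization() returns, GetCurrentPosition() (inherited from the superclass) gives the parameters found and GetValue() gives the last computed metric value.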

◆ StopOptimization()

void itk::FiniteDifferenceGradientDescentOptimizer::StopOptimization ( void  )

Stop optimization.

See also
ResumeOptimization

Field Documentation

◆ m_ComputeCurrentValue

bool itk::FiniteDifferenceGradientDescentOptimizer::m_ComputeCurrentValue
protected

Boolean that says if the current value of the metric has to be computed. This is not necessary for optimisation; just nice for progress information.

Definition at line 153 of file itkFiniteDifferenceGradientDescentOptimizer.h.

◆ m_CurrentIteration

unsigned long itk::FiniteDifferenceGradientDescentOptimizer::m_CurrentIteration
private

◆ m_Gradient

DerivativeType itk::FiniteDifferenceGradientDescentOptimizer::m_Gradient
protected

◆ m_GradientMagnitude

double itk::FiniteDifferenceGradientDescentOptimizer::m_GradientMagnitude
protected

◆ m_LearningRate

double itk::FiniteDifferenceGradientDescentOptimizer::m_LearningRate
protected

◆ m_NumberOfIterations

unsigned long itk::FiniteDifferenceGradientDescentOptimizer::m_NumberOfIterations
private

◆ m_Param_a

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_a
private

Parameters, as described by Spall.

Definition at line 173 of file itkFiniteDifferenceGradientDescentOptimizer.h.

◆ m_Param_A

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_A
private

◆ m_Param_alpha

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_alpha
private

◆ m_Param_c

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_c
private

◆ m_Param_gamma

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_gamma
private

◆ m_Stop

bool itk::FiniteDifferenceGradientDescentOptimizer::m_Stop
private

Private member variables.

Definition at line 166 of file itkFiniteDifferenceGradientDescentOptimizer.h.

◆ m_StopCondition

StopConditionType itk::FiniteDifferenceGradientDescentOptimizer::m_StopCondition
private

◆ m_Value

double itk::FiniteDifferenceGradientDescentOptimizer::m_Value
private

