#include <itkAdaptiveStochasticVarianceReducedGradientOptimizer.h>
This class implements a gradient descent optimizer with an adaptive gain.

If C(x) is a cost function that has to be minimized, the following iterative algorithm is used to find the optimal parameters x:

    x(k+1) = x(k) - a(t_k) dC/dx

The gain a(t_k) at each iteration k is defined by:

    a(t_k) = a / (A + t_k)^alpha

And the time t_k is updated according to:

    t_{k+1} = max[ 0, t_k + sigmoid( -g_k . g_{k-1} ) ]

where g_k equals dC/dx at iteration k. For t_0 the InitialTime is used, which is defined in the superclass (StandardStochasticVarianceReducedGradientOptimizer). Whereas in the superclass this parameter is superfluous, in this class it makes sense.

This method is described in the following references:

[1] P. Cruz, "Almost sure convergence and asymptotical normality of a generalization of Kesten's stochastic approximation algorithm for multidimensional case," Technical Report, 2005. http://hdl.handle.net/2052/74

[2] S. Klein, J.P.W. Pluim, M. Staring, and M.A. Viergever, "Adaptive stochastic gradient descent optimisation for image registration," International Journal of Computer Vision, vol. 81, no. 3, pp. 227-239, 2009. http://dx.doi.org/10.1007/s11263-008-0168-y

This optimizer is very suitable for use in combination with a stochastic estimate of the gradient g. For example, in image registration problems it is often advantageous to compute the metric derivative (dC/dx) on a new set of randomly selected image samples in each iteration. You may set the parameter NewSamplesEveryIteration to "true" to achieve this effect. For more information on this strategy, see reference [2].
Definition at line 70 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
Public Member Functions | |
virtual const char * | GetClassName () const |
virtual double | GetSigmoidMax () const |
virtual double | GetSigmoidMin () const |
virtual double | GetSigmoidScale () const |
virtual bool | GetUseAdaptiveStepSizes () const |
virtual void | SetSigmoidMax (double _arg) |
virtual void | SetSigmoidMin (double _arg) |
virtual void | SetSigmoidScale (double _arg) |
virtual void | SetUseAdaptiveStepSizes (bool _arg) |
Public Member Functions inherited from itk::StandardStochasticVarianceReducedGradientOptimizer | |
void | AdvanceOneStep (void) override |
virtual const char * | GetClassName () const |
virtual double | GetCurrentTime () const |
virtual double | GetInitialTime () const |
virtual double | GetParam_a () const |
virtual double | GetParam_A () const |
virtual double | GetParam_alpha () const |
virtual double | GetParam_beta () const |
virtual void | ResetCurrentTimeToInitialTime (void) |
virtual void | SetInitialTime (double _arg) |
virtual void | SetParam_a (double _arg) |
virtual void | SetParam_A (double _arg) |
virtual void | SetParam_alpha (double _arg) |
virtual void | SetParam_beta (double _arg) |
void | StartOptimization (void) override |
Public Member Functions inherited from itk::StochasticVarianceReducedGradientDescentOptimizer | |
virtual void | AdvanceOneStep (void) |
virtual const char * | GetClassName () const |
virtual unsigned int | GetCurrentInnerIteration () const |
virtual unsigned int | GetCurrentIteration () const |
virtual const DerivativeType & | GetGradient () |
virtual unsigned int | GetLBFGSMemory () const |
virtual const double & | GetLearningRate () |
virtual const unsigned long & | GetNumberOfInnerIterations () |
virtual const unsigned long & | GetNumberOfIterations () |
virtual const DerivativeType & | GetPreviousGradient () |
virtual const ParametersType & | GetPreviousPosition () |
virtual const DerivativeType & | GetSearchDir () |
virtual const StopConditionType & | GetStopCondition () |
virtual const double & | GetValue () |
virtual void | MetricErrorResponse (ExceptionObject &err) |
virtual void | ResumeOptimization (void) |
virtual void | SetLearningRate (double _arg) |
virtual void | SetNumberOfIterations (unsigned long _arg) |
void | SetNumberOfWorkUnits (ThreadIdType numberOfThreads) |
virtual void | SetPreviousGradient (DerivativeType _arg) |
virtual void | SetPreviousPosition (ParametersType _arg) |
virtual void | SetUseEigen (bool _arg) |
virtual void | SetUseMultiThread (bool _arg) |
virtual void | SetUseOpenMP (bool _arg) |
void | StartOptimization (void) override |
virtual void | StopOptimization (void) |
Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
virtual const char * | GetClassName () const |
const ParametersType & | GetCurrentPosition (void) const override |
virtual bool | GetMaximize () const |
virtual const ScaledCostFunctionType * | GetScaledCostFunction () |
virtual const ParametersType & | GetScaledCurrentPosition () |
bool | GetUseScales (void) const |
virtual void | InitializeScales (void) |
virtual void | MaximizeOff () |
virtual void | MaximizeOn () |
void | SetCostFunction (CostFunctionType *costFunction) override |
virtual void | SetMaximize (bool _arg) |
virtual void | SetUseScales (bool arg) |
Static Public Member Functions | |
static Pointer | New () |
Static Public Member Functions inherited from itk::StandardStochasticVarianceReducedGradientOptimizer | |
static Pointer | New () |
Static Public Member Functions inherited from itk::StochasticVarianceReducedGradientDescentOptimizer | |
static Pointer | New () |
Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
static Pointer | New () |
Protected Attributes | |
DerivativeType | m_PreviousGradient |
Protected Attributes inherited from itk::StandardStochasticVarianceReducedGradientOptimizer | |
double | m_CurrentTime |
bool | m_UseConstantStep |
Protected Attributes inherited from itk::StochasticVarianceReducedGradientDescentOptimizer | |
unsigned long | m_CurrentInnerIteration |
unsigned long | m_CurrentIteration |
DerivativeType | m_Gradient |
unsigned long | m_LBFGSMemory |
double | m_LearningRate |
ParametersType | m_MeanSearchDir |
unsigned long | m_NumberOfInnerIterations |
unsigned long | m_NumberOfIterations |
DerivativeType | m_PreviousGradient |
ParametersType | m_PreviousPosition |
ParametersType | m_PreviousSearchDir |
ParametersType | m_SearchDir |
bool | m_Stop |
StopConditionType | m_StopCondition |
ThreaderType::Pointer | m_Threader |
double | m_Value |
Protected Attributes inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
ScaledCostFunctionPointer | m_ScaledCostFunction |
ParametersType | m_ScaledCurrentPosition |
Private Member Functions | |
AdaptiveStochasticVarianceReducedGradientOptimizer (const Self &) | |
void | operator= (const Self &) |
Private Attributes | |
double | m_SigmoidMax |
double | m_SigmoidMin |
double | m_SigmoidScale |
bool | m_UseAdaptiveStepSizes |
Additional Inherited Members | |
Protected Types inherited from itk::StochasticVarianceReducedGradientDescentOptimizer | |
typedef itk::PlatformMultiThreader | ThreaderType |
typedef ThreaderType::WorkUnitInfo | ThreadInfoType |
typedef SmartPointer<const Self> itk::AdaptiveStochasticVarianceReducedGradientOptimizer::ConstPointer |
Definition at line 80 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef Superclass::CostFunctionType itk::AdaptiveStochasticVarianceReducedGradientOptimizer::CostFunctionType |
Definition at line 93 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef Superclass::DerivativeType itk::AdaptiveStochasticVarianceReducedGradientOptimizer::DerivativeType |
Definition at line 92 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef Superclass::MeasureType itk::AdaptiveStochasticVarianceReducedGradientOptimizer::MeasureType |
Typedefs inherited from the superclass.
Definition at line 90 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef Superclass::ParametersType itk::AdaptiveStochasticVarianceReducedGradientOptimizer::ParametersType |
Definition at line 91 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef SmartPointer<Self> itk::AdaptiveStochasticVarianceReducedGradientOptimizer::Pointer |
Definition at line 79 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef Superclass::ScaledCostFunctionPointer itk::AdaptiveStochasticVarianceReducedGradientOptimizer::ScaledCostFunctionPointer |
Definition at line 96 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef Superclass::ScaledCostFunctionType itk::AdaptiveStochasticVarianceReducedGradientOptimizer::ScaledCostFunctionType |
Definition at line 95 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef Superclass::ScalesType itk::AdaptiveStochasticVarianceReducedGradientOptimizer::ScalesType |
Definition at line 94 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef AdaptiveStochasticVarianceReducedGradientOptimizer itk::AdaptiveStochasticVarianceReducedGradientOptimizer::Self |
Standard ITK.
Definition at line 76 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef Superclass::StopConditionType itk::AdaptiveStochasticVarianceReducedGradientOptimizer::StopConditionType |
Definition at line 97 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
typedef StandardStochasticVarianceReducedGradientOptimizer itk::AdaptiveStochasticVarianceReducedGradientOptimizer::Superclass |
Definition at line 77 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
itk::AdaptiveStochasticVarianceReducedGradientOptimizer::AdaptiveStochasticVarianceReducedGradientOptimizer ()  [protected]

itk::AdaptiveStochasticVarianceReducedGradientOptimizer::~AdaptiveStochasticVarianceReducedGradientOptimizer ()  [inline, override, protected]
Definition at line 121 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
itk::AdaptiveStochasticVarianceReducedGradientOptimizer::AdaptiveStochasticVarianceReducedGradientOptimizer (const Self &)  [private]

virtual const char * itk::AdaptiveStochasticVarianceReducedGradientOptimizer::GetClassName () const  [virtual]
Run-time type information (and related methods).
Reimplemented from itk::StandardStochasticVarianceReducedGradientOptimizer.
Reimplemented in elastix::AdaptiveStochasticVarianceReducedGradient< TElastix >.
virtual double itk::AdaptiveStochasticVarianceReducedGradientOptimizer::GetSigmoidMax () const  [virtual]

virtual double itk::AdaptiveStochasticVarianceReducedGradientOptimizer::GetSigmoidMin () const  [virtual]

virtual double itk::AdaptiveStochasticVarianceReducedGradientOptimizer::GetSigmoidScale () const  [virtual]

virtual bool itk::AdaptiveStochasticVarianceReducedGradientOptimizer::GetUseAdaptiveStepSizes () const  [virtual]
static Pointer itk::AdaptiveStochasticVarianceReducedGradientOptimizer::New ()  [static]
Method for creation through the object factory.
void itk::AdaptiveStochasticVarianceReducedGradientOptimizer::operator= (const Self &)  [private]
virtual void itk::AdaptiveStochasticVarianceReducedGradientOptimizer::SetSigmoidMax (double _arg)  [virtual]
Set/Get the maximum of the sigmoid. Should be > 0. Default: 1.0.
virtual void itk::AdaptiveStochasticVarianceReducedGradientOptimizer::SetSigmoidMin (double _arg)  [virtual]
Set/Get the minimum of the sigmoid. Should be < 0. Default: -0.8.
virtual void itk::AdaptiveStochasticVarianceReducedGradientOptimizer::SetSigmoidScale (double _arg)  [virtual]
Set/Get the scaling of the sigmoid width. Large values cause a wider sigmoid. Should be > 0. Default: 1e-8.
virtual void itk::AdaptiveStochasticVarianceReducedGradientOptimizer::SetUseAdaptiveStepSizes (bool _arg)  [virtual]
Set/Get whether the adaptive step size mechanism is desired. Default: true.
virtual void itk::AdaptiveStochasticVarianceReducedGradientOptimizer::UpdateCurrentTime ()  [override, protected, virtual]
Function to update the current time. If UseAdaptiveStepSizes is false, this function just increments the CurrentTime. Otherwise, the CurrentTime is updated according to:

time = max[ 0, time + sigmoid( -gradient * previousgradient ) ]

In that case, the m_PreviousGradient is also updated.
Reimplemented from itk::StandardStochasticVarianceReducedGradientOptimizer.
DerivativeType itk::AdaptiveStochasticVarianceReducedGradientOptimizer::m_PreviousGradient  [protected]
The PreviousGradient, necessary for the CruzAcceleration.
Definition at line 133 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
double itk::AdaptiveStochasticVarianceReducedGradientOptimizer::m_SigmoidMax  [private]
Definition at line 142 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
double itk::AdaptiveStochasticVarianceReducedGradientOptimizer::m_SigmoidMin  [private]
Definition at line 143 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
double itk::AdaptiveStochasticVarianceReducedGradientOptimizer::m_SigmoidScale  [private]
Definition at line 144 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
bool itk::AdaptiveStochasticVarianceReducedGradientOptimizer::m_UseAdaptiveStepSizes  [private]
Settings.
Definition at line 141 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
Generated on Thu Nov 3 2022 for elastix by Doxygen.