## Class StochasticLevenbergMarquardt

• All Implemented Interfaces:
Serializable, Cloneable, StochasticOptimizer
• Direct Known Subclasses:
StochasticLevenbergMarquardtAD

public abstract class StochasticLevenbergMarquardt
extends Object
implements Serializable, Cloneable, StochasticOptimizer
This class implements a stochastic Levenberg-Marquardt non-linear least-squares fit algorithm.

The design avoids the need to define the objective function as a separate class. The objective function is defined by overriding a class method, see the sample code below.

The Levenberg-Marquardt solver is implemented using multi-threading. The calculation of the derivatives (in case a specific implementation of setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives) is not provided) may be performed in parallel by setting the parameter numberOfThreads.

To use the solver, inherit from it and implement the objective function as setValues(RandomVariable[] parameters, RandomVariable[] values), where values has to be set to the values of the objective function for the given parameters.
You may also provide a derivative for your objective function by additionally overriding the function setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives); otherwise the solver will calculate the derivative via finite differences.

To reject a point, it is allowed to set an element of values to Double.NaN in the implementation of setValues(RandomVariable[] parameters, RandomVariable[] values). Put differently: The solver handles NaN values in values as an error larger than the current one (regardless of the current error) and rejects the point.
Note, however, that it is an error if the initial parameter guess results in a NaN value. That is, the solver should be initialized with an initial parameter in an admissible region.
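The rejection rule above can be illustrated with a simplified plain-double sketch (a model of the acceptance test, not the class's actual implementation):

```java
// Simplified sketch of how a NaN objective value forces rejection of a
// trial point: any NaN component makes the candidate error worse than any
// finite error, so the previous best parameters are kept.
public class NaNRejectionSketch {

	// Mean squared error; a NaN in any component marks the point inadmissible.
	static double meanSquaredError(double[] values, double[] targets) {
		double error = 0.0;
		for (int i = 0; i < values.length; i++) {
			if (Double.isNaN(values[i])) {
				return Double.POSITIVE_INFINITY;	// reject: worse than any finite error
			}
			double diff = values[i] - targets[i];
			error += diff * diff;
		}
		return error / values.length;
	}

	// A candidate is accepted only if its error strictly improves on the current one.
	static boolean isAccepted(double candidateError, double currentError) {
		return candidateError < currentError;
	}
}
```

A point whose valuation yields NaN therefore can never replace the current best fit, which is why the initial guess itself must not produce NaN.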

The following simple example finds a solution for the system of equations
 0.0 * x1 + 1.0 * x2 =  5.0
 2.0 * x1 + 1.0 * x2 = 10.0
 
// Set up problem data (Scalar is a deterministic RandomVariable)
RandomVariable[] initialParameters = new RandomVariable[] { new Scalar(0.0), new Scalar(0.0) };
RandomVariable[] targetValues = new RandomVariable[] { new Scalar(5.0), new Scalar(10.0) };
RandomVariable[] parameterSteps = new RandomVariable[] { new Scalar(1E-4), new Scalar(1E-4) };

StochasticLevenbergMarquardt optimizer = new StochasticLevenbergMarquardt(
		initialParameters, targetValues, parameterSteps,
		100 /* maxIteration */, 1E-12 /* errorTolerance */, null /* executorService */) {

	// Override your objective function here
	@Override
	public void setValues(RandomVariable[] parameters, RandomVariable[] values) {
		values[0] = parameters[0].mult(0.0).add(parameters[1]);
		values[1] = parameters[0].mult(2.0).add(parameters[1]);
	}
};

optimizer.run();

RandomVariable[] bestParameters = optimizer.getBestFitParameters();


See the example in the main method below.

The class can be initialized to use multi-threaded valuation. If initialized this way, the implementation of setValues must be thread-safe. The solver will evaluate the gradient of the value vector in parallel, i.e., use as many threads as there are parameters.
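The parallel gradient evaluation described above can be sketched in plain Java (a simplified model with plain doubles, not the class's actual implementation): one finite-difference column per parameter is submitted as a task, which is why the objective function must be thread-safe.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of evaluating the gradient of a vector-valued objective function
// in parallel, one task (thread) per parameter, via finite differences.
public class ParallelGradientSketch {

	interface Objective {
		double[] valueOf(double[] parameters);	// must be thread-safe
	}

	static double[][] gradient(Objective f, double[] parameters, double step)
			throws InterruptedException, ExecutionException {
		final double[] baseValues = f.valueOf(parameters);
		ExecutorService executor = Executors.newFixedThreadPool(parameters.length);
		try {
			List<Future<double[]>> columns = new ArrayList<>();
			for (int i = 0; i < parameters.length; i++) {
				final int parameterIndex = i;
				// Each task shifts one parameter and computes one Jacobian column.
				columns.add(executor.submit(() -> {
					double[] shifted = parameters.clone();
					shifted[parameterIndex] += step;
					double[] shiftedValues = f.valueOf(shifted);
					double[] column = new double[baseValues.length];
					for (int j = 0; j < column.length; j++) {
						column[j] = (shiftedValues[j] - baseValues[j]) / step;
					}
					return column;
				}));
			}
			double[][] derivatives = new double[parameters.length][];
			for (int i = 0; i < parameters.length; i++) {
				derivatives[i] = columns.get(i).get();
			}
			return derivatives;
		} finally {
			executor.shutdown();
		}
	}
}
```

Note that all tasks call `f.valueOf` concurrently; any shared mutable state inside the objective function would race, which is the reason for the thread-safety warning on setValues.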

Note: Iteration steps will be logged (java.util.logging) at level Level.FINE
Version:
1.6
Author:
Christian Fries
Serialized Form
• ### Nested Class Summary

Nested Classes
Modifier and Type Class Description
static class  StochasticLevenbergMarquardt.RegularizationMethod
The regularization method used to invert the approximation of the Hessian matrix.
• ### Nested classes/interfaces inherited from interface net.finmath.optimizer.StochasticOptimizer

StochasticOptimizer.ObjectiveFunction
• ### Constructor Summary

Constructors
Constructor Description
StochasticLevenbergMarquardt​(StochasticLevenbergMarquardt.RegularizationMethod regularizationMethod, RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, int numberOfThreads)
Create a Levenberg-Marquardt solver.
StochasticLevenbergMarquardt​(StochasticLevenbergMarquardt.RegularizationMethod regularizationMethod, RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, ExecutorService executorService)
Create a Levenberg-Marquardt solver.
StochasticLevenbergMarquardt​(RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, ExecutorService executorService)
Create a Levenberg-Marquardt solver.
• ### Method Summary

All Methods
Modifier and Type Method Description
StochasticLevenbergMarquardt clone()
Create a clone of this LevenbergMarquardt optimizer.
RandomVariable[] getBestFitParameters()
StochasticLevenbergMarquardt getCloneWithModifiedTargetValues​(List<RandomVariable> newTargetVaues, List<RandomVariable> newWeights, boolean isUseBestParametersAsInitialParameters)
Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights.
StochasticLevenbergMarquardt getCloneWithModifiedTargetValues​(RandomVariable[] newTargetVaues, RandomVariable[] newWeights, boolean isUseBestParametersAsInitialParameters)
Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights.
int getIterations()
Get the number of iterations.
double getLambda()
Get the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is the $$\lambda$$ in $$H + \lambda \,\mathrm{diag}(H)$$.
double getLambdaDivisor()
Get the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if $$H + \lambda \,\mathrm{diag}(H)$$ is invertible.
double getLambdaMultiplicator()
Get the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if $$H + \lambda \,\mathrm{diag}(H)$$ is not invertible.
double getMeanSquaredError​(RandomVariable[] value)
double getRootMeanSquaredError()
static void main​(String[] args)
protected void prepareAndSetDerivatives​(RandomVariable[] parameters, RandomVariable[] values, RandomVariable[][] derivatives)
protected void prepareAndSetValues​(RandomVariable[] parameters, RandomVariable[] values)
void run()
Runs the optimization.
void setDerivatives​(RandomVariable[] parameters, RandomVariable[][] derivatives)
The derivative of the objective function.
void setErrorMeanSquaredCurrent​(double errorMeanSquaredCurrent)
void setLambda​(double lambda)
Set the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is the $$\lambda$$ in $$H + \lambda \,\mathrm{diag}(H)$$.
void setLambdaDivisor​(double lambdaDivisor)
Set the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if $$H + \lambda \,\mathrm{diag}(H)$$ is invertible.
void setLambdaMultiplicator​(double lambdaMultiplicator)
Set the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if $$H + \lambda \,\mathrm{diag}(H)$$ is not invertible.
abstract void setValues​(RandomVariable[] parameters, RandomVariable[] values)
The objective function.
• ### Methods inherited from class java.lang.Object

equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
• ### Constructor Detail

• #### StochasticLevenbergMarquardt

public StochasticLevenbergMarquardt​(StochasticLevenbergMarquardt.RegularizationMethod regularizationMethod,
RandomVariable[] initialParameters,
RandomVariable[] targetValues,
RandomVariable[] parameterSteps,
int maxIteration,
double errorTolerance,
ExecutorService executorService)
Create a Levenberg-Marquardt solver.
Parameters:
regularizationMethod - The regularization method to use. See StochasticLevenbergMarquardt.RegularizationMethod.
initialParameters - Initial value for the parameters where the solver starts its search.
targetValues - Target values to achieve.
parameterSteps - Step used for finite difference approximation.
maxIteration - Maximum number of iterations.
errorTolerance - Error tolerance / accuracy.
executorService - Executor to be used for concurrent valuation of the derivatives. This is only used if setDerivatives is not overridden. Warning: The implementation of setValues has to be thread-safe!
• #### StochasticLevenbergMarquardt

public StochasticLevenbergMarquardt​(RandomVariable[] initialParameters,
RandomVariable[] targetValues,
RandomVariable[] parameterSteps,
int maxIteration,
double errorTolerance,
ExecutorService executorService)
Create a Levenberg-Marquardt solver.
Parameters:
initialParameters - Initial value for the parameters where the solver starts its search.
targetValues - Target values to achieve.
parameterSteps - Step used for finite difference approximation.
maxIteration - Maximum number of iterations.
errorTolerance - Error tolerance / accuracy.
executorService - Executor to be used for concurrent valuation of the derivatives. This is only used if setDerivatives is not overridden. Warning: The implementation of setValues has to be thread-safe!
• #### StochasticLevenbergMarquardt

public StochasticLevenbergMarquardt​(StochasticLevenbergMarquardt.RegularizationMethod regularizationMethod,
RandomVariable[] initialParameters,
RandomVariable[] targetValues,
RandomVariable[] parameterSteps,
int maxIteration,
double errorTolerance,
int numberOfThreads)
Create a Levenberg-Marquardt solver.
Parameters:
regularizationMethod - The regularization method to use. See StochasticLevenbergMarquardt.RegularizationMethod.
initialParameters - Initial value for the parameters where the solver starts its search.
targetValues - Target values to achieve.
parameterSteps - Step used for finite difference approximation.
maxIteration - Maximum number of iterations.
errorTolerance - Error tolerance / accuracy.
numberOfThreads - Maximum number of threads. Warning: If this number is larger than one, the implementation of setValues has to be thread-safe!
• ### Method Detail

• #### main

public static void main​(String[] args)
throws SolverException
Throws:
SolverException
• #### getLambda

public double getLambda()
Get the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is the $$\lambda$$ in $$H + \lambda \,\mathrm{diag}(H)$$.
Returns:
the parameter $$\lambda$$.
• #### setLambda

public void setLambda​(double lambda)
Set the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is the $$\lambda$$ in $$H + \lambda \,\mathrm{diag}(H)$$.
Parameters:
lambda - the lambda to set
• #### getLambdaMultiplicator

public double getLambdaMultiplicator()
Get the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if $$H + \lambda \,\mathrm{diag}(H)$$ is not invertible.
Returns:
the lambdaMultiplicator
• #### setLambdaMultiplicator

public void setLambdaMultiplicator​(double lambdaMultiplicator)
Set the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if $$H + \lambda \,\mathrm{diag}(H)$$ is not invertible. This will make lambda larger, hence let the stepping move slower.
Parameters:
lambdaMultiplicator - the lambdaMultiplicator to set. Should be > 1.
• #### getLambdaDivisor

public double getLambdaDivisor()
Get the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if $$H + \lambda \,\mathrm{diag}(H)$$ is invertible.
Returns:
the lambdaDivisor
• #### setLambdaDivisor

public void setLambdaDivisor​(double lambdaDivisor)
Set the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if $$H + \lambda \,\mathrm{diag}(H)$$ is invertible. This will make lambda smaller, hence let the stepping move faster.
Parameters:
lambdaDivisor - the lambdaDivisor to set. Should be > 1.
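The interplay of lambdaMultiplicator and lambdaDivisor can be sketched as follows (a simplified plain-double model of the damping schedule, not the class's actual code):

```java
// Sketch of the Levenberg-Marquardt damping schedule: lambda grows when the
// inversion of H + lambda*diag(H) fails (smaller, more gradient-descent-like
// steps) and shrinks when it succeeds (larger, more Gauss-Newton-like steps).
public class DampingScheduleSketch {

	static double nextLambda(double lambda, boolean regularizedHessianIsInvertible,
			double lambdaDivisor, double lambdaMultiplicator) {
		return regularizedHessianIsInvertible
				? lambda / lambdaDivisor		// success: speed up the stepping
				: lambda * lambdaMultiplicator;	// failure: slow down the stepping
	}
}
```

Since both factors should be > 1, lambda strictly decreases on success and strictly increases on failure.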
• #### getBestFitParameters

public RandomVariable[] getBestFitParameters()
Description copied from interface: StochasticOptimizer
Specified by:
getBestFitParameters in interface StochasticOptimizer
Returns:
The best fit parameter.
• #### getRootMeanSquaredError

public double getRootMeanSquaredError()
Specified by:
getRootMeanSquaredError in interface StochasticOptimizer
Returns:
the root mean square error achieved with the best fit parameters
• #### setErrorMeanSquaredCurrent

public void setErrorMeanSquaredCurrent​(double errorMeanSquaredCurrent)
Parameters:
errorMeanSquaredCurrent - the errorMeanSquaredCurrent to set
• #### getIterations

public int getIterations()
Description copied from interface: StochasticOptimizer
Get the number of iterations.
Specified by:
getIterations in interface StochasticOptimizer
Returns:
The number of iterations required
• #### prepareAndSetValues

protected void prepareAndSetValues​(RandomVariable[] parameters,
RandomVariable[] values)
throws SolverException
Throws:
SolverException
• #### prepareAndSetDerivatives

protected void prepareAndSetDerivatives​(RandomVariable[] parameters,
RandomVariable[] values,
RandomVariable[][] derivatives)
throws SolverException
Throws:
SolverException
• #### setValues

public abstract void setValues​(RandomVariable[] parameters,
RandomVariable[] values)
throws SolverException
The objective function. Override this method to implement your custom function.
Parameters:
parameters - Input value. The parameter vector.
values - Output value. The vector of values f(i,parameters), i=1,...,n
Throws:
SolverException - Thrown if the valuation fails, specific cause may be available via the cause() method.
• #### setDerivatives

public void setDerivatives​(RandomVariable[] parameters,
RandomVariable[][] derivatives)
throws SolverException
The derivative of the objective function. You may override this method if you like to implement your own derivative.
Parameters:
parameters - Input value. The parameter vector.
derivatives - Output value, where derivatives[i][j] is d(value(j)) / d(parameters(i)
Throws:
SolverException - Thrown if the valuation fails, specific cause may be available via the cause() method.
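The index convention derivatives[i][j] = d(value(j)) / d(parameters(i)) is easy to get backwards. For the linear example in the class description, the analytic Jacobian can be sketched with plain doubles (for illustration only; the actual override works on RandomVariable arrays):

```java
// Analytic Jacobian for the example system
//   f0 = 0.0 * x0 + 1.0 * x1
//   f1 = 2.0 * x0 + 1.0 * x1
// using the convention derivatives[parameterIndex][valueIndex].
public class AnalyticDerivativesSketch {

	static double[][] derivatives(double[] parameters) {
		double[][] derivatives = new double[2][2];
		derivatives[0][0] = 0.0;	// d f0 / d x0
		derivatives[0][1] = 2.0;	// d f1 / d x0
		derivatives[1][0] = 1.0;	// d f0 / d x1
		derivatives[1][1] = 1.0;	// d f1 / d x1
		return derivatives;
	}
}
```

That is, the first index runs over parameters (rows of the array are Jacobian columns), matching the signature setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives).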
• #### run

public void run()
throws SolverException
Description copied from interface: StochasticOptimizer
Runs the optimization.
Specified by:
run in interface StochasticOptimizer
Throws:
SolverException - Thrown if the valuation fails, specific cause may be available via the cause() method.
• #### getMeanSquaredError

public double getMeanSquaredError​(RandomVariable[] value)
• #### getCloneWithModifiedTargetValues

public StochasticLevenbergMarquardt getCloneWithModifiedTargetValues​(RandomVariable[] newTargetVaues,
RandomVariable[] newWeights,
boolean isUseBestParametersAsInitialParameters)
throws CloneNotSupportedException
Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights. The clone will use the same objective function as this implementation, i.e., the implementations of setValues(RandomVariable[], RandomVariable[]) and setDerivatives(RandomVariable[], RandomVariable[][]) are reused. The initial values of the cloned optimizer will either be the original initial values of this object or the best parameters obtained by this optimizer; the latter is used only if this optimizer signals done().
Parameters:
newTargetVaues - New array of target values.
newWeights - New array of weights.
isUseBestParametersAsInitialParameters - If true and this optimizer is done(), then the clone will use this.getBestFitParameters() as initial parameters.
Returns:
A new LevenbergMarquardt optimizer, cloning this one except modified target values and weights.
Throws:
CloneNotSupportedException - Thrown if this optimizer cannot be cloned.
• #### getCloneWithModifiedTargetValues

public StochasticLevenbergMarquardt getCloneWithModifiedTargetValues​(List<RandomVariable> newTargetVaues,
List<RandomVariable> newWeights,
boolean isUseBestParametersAsInitialParameters)
throws CloneNotSupportedException
Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights. The clone will use the same objective function as this implementation, i.e., the implementations of setValues(RandomVariable[], RandomVariable[]) and setDerivatives(RandomVariable[], RandomVariable[][]) are reused. The initial values of the cloned optimizer will either be the original initial values of this object or the best parameters obtained by this optimizer; the latter is used only if this optimizer signals done().
Parameters:
newTargetVaues - New list of target values.
newWeights - New list of weights.
isUseBestParametersAsInitialParameters - If true and this optimizer is done(), then the clone will use this.getBestFitParameters() as initial parameters.
Returns:
A new LevenbergMarquardt optimizer, cloning this one except modified target values and weights.
Throws:
CloneNotSupportedException - Thrown if this optimizer cannot be cloned.