- All Implemented Interfaces:
Serializable, Cloneable, StochasticOptimizer
- Direct Known Subclasses:
StochasticLevenbergMarquardtAD
The design avoids the need to define the objective function as a separate class. The objective function is defined by overriding a class method; see the sample code below.
The Levenberg-Marquardt solver is implemented using multi-threading. The calculation of the derivatives (in case a specific implementation of setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives) is not provided) may be performed in parallel by setting the parameter numberOfThreads.
To use the solver, inherit from it and implement the objective function as setValues(RandomVariable[] parameters, RandomVariable[] values), where values has to be set to the values of the objective functions for the given parameters.
You may also provide a derivative for your objective function by additionally overriding the function setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives); otherwise the solver will calculate the derivative via finite differences.
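The finite-difference fallback can be illustrated with a small, self-contained sketch in plain doubles (the function, step sizes, and class name below are made up for the illustration; the library itself works on RandomVariable arrays):

```java
import java.util.function.Function;

// Illustrative sketch of a forward finite-difference Jacobian, as the solver
// computes internally when setDerivatives is not overridden. Plain doubles
// are used instead of the library's RandomVariable types.
public class FiniteDifferenceJacobian {

	// jacobian[i][j] approximates d f_j / d x_i, matching the layout
	// derivatives[i][j] = d(value(j)) / d(parameters(i)) used by setDerivatives.
	static double[][] jacobian(Function<double[], double[]> f, double[] x, double[] steps) {
		double[] f0 = f.apply(x);
		double[][] jacobian = new double[x.length][f0.length];
		for (int i = 0; i < x.length; i++) {
			double[] xShifted = x.clone();
			xShifted[i] += steps[i];
			double[] fShifted = f.apply(xShifted);
			for (int j = 0; j < f0.length; j++) {
				jacobian[i][j] = (fShifted[j] - f0[j]) / steps[i];
			}
		}
		return jacobian;
	}

	public static void main(String[] args) {
		// Same linear system as in the class description's example:
		// f0 = 0*x1 + x2, f1 = 2*x1 + x2
		Function<double[], double[]> f = x -> new double[] { 0.0 * x[0] + x[1], 2.0 * x[0] + x[1] };
		double[][] jacobian = jacobian(f, new double[] { 0.0, 0.0 }, new double[] { 1E-7, 1E-7 });
		System.out.println(jacobian[0][1]); // approximately 2.0, since d f_1 / d x_0 = 2
	}
}
```

For this linear system the finite-difference approximation is exact up to rounding; for nonlinear objective functions, the choice of parameterSteps trades off truncation against round-off error.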
To reject a point, you may set an element of values to Double.NaN in the implementation of setValues(RandomVariable[] parameters, RandomVariable[] values). Put differently: the solver treats NaN values in values as an error larger than the current one (regardless of the current error) and rejects the point.
Note, however, that it is an error if the initial parameter guess results in a NaN value. That is, the solver should be initialized with an initial parameter in an admissible region.
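The rejection mechanism exploits the semantics of NaN in comparisons: every numeric comparison involving Double.NaN evaluates to false in Java, so a candidate point whose error is NaN can never pass an "error improved" test. A minimal plain-double sketch of this idea (not the library's internal code; the helper class and values are made up):

```java
// Illustrative sketch: why a NaN in the value vector leads to rejection.
public class NaNRejectionSketch {

	// Mean squared error of values against targets; a single NaN entry
	// makes the whole error NaN.
	static double meanSquaredError(double[] values, double[] targets) {
		double sum = 0.0;
		for (int i = 0; i < values.length; i++) {
			double diff = values[i] - targets[i];
			sum += diff * diff;
		}
		return sum / values.length;
	}

	public static void main(String[] args) {
		double errorCurrent = meanSquaredError(new double[] { 4.9, 10.1 }, new double[] { 5.0, 10.0 });
		double errorNew = meanSquaredError(new double[] { Double.NaN, 10.0 }, new double[] { 5.0, 10.0 });
		// Any comparison with NaN is false, so the NaN point never counts as an improvement.
		System.out.println(errorNew < errorCurrent); // prints "false": the point is rejected
	}
}
```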
The following simple example finds a solution for the equations

	0.0 * x1 + 1.0 * x2 = 5.0
	2.0 * x1 + 1.0 * x2 = 10.0

	LevenbergMarquardt optimizer = new LevenbergMarquardt() {
		// Override your objective function here
		public void setValues(RandomVariable[] parameters, RandomVariable[] values) {
			values[0] = parameters[0].mult(0.0).add(parameters[1]);
			values[1] = parameters[0].mult(2.0).add(parameters[1]);
		}
	};

	// Set solver parameters
	optimizer.setInitialParameters(new RandomVariable[] { new Scalar(0.0), new Scalar(0.0) });
	optimizer.setWeights(new RandomVariable[] { new Scalar(1.0), new Scalar(1.0) });
	optimizer.setMaxIteration(100);
	optimizer.setTargetValues(new RandomVariable[] { new Scalar(5.0), new Scalar(10.0) });
	optimizer.run();

	RandomVariable[] bestParameters = optimizer.getBestFitParameters();
See the example in the main method below.
The class can be initialized to use a multi-threaded valuation. If initialized this way, the implementation of setValues must be thread-safe. The solver will evaluate the gradient of the value vector in parallel, i.e., use as many threads as there are parameters.
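The parallel gradient evaluation can be sketched as follows (plain doubles and a hypothetical helper class, not the library's implementation): each finite-difference column is submitted as one task, and each task works on its own copy of the parameter vector, which is why the objective function must be thread-safe.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Function;

// Illustrative sketch: one finite-difference column per task, as the solver
// does when computing the gradient with numberOfThreads > 1.
public class ParallelGradientSketch {

	static double[][] jacobianParallel(Function<double[], double[]> f, double[] x, double step, int numberOfThreads) throws Exception {
		ExecutorService executor = Executors.newFixedThreadPool(numberOfThreads);
		try {
			double[] f0 = f.apply(x);
			List<Future<double[]>> columns = new ArrayList<>();
			for (int i = 0; i < x.length; i++) {
				final int parameterIndex = i;
				columns.add(executor.submit(() -> {
					double[] xShifted = x.clone(); // local copy: tasks share no mutable state
					xShifted[parameterIndex] += step;
					double[] fShifted = f.apply(xShifted); // f may run concurrently: must be thread-safe
					double[] column = new double[f0.length];
					for (int j = 0; j < f0.length; j++) {
						column[j] = (fShifted[j] - f0[j]) / step;
					}
					return column;
				}));
			}
			double[][] jacobian = new double[x.length][];
			for (int i = 0; i < x.length; i++) {
				jacobian[i] = columns.get(i).get();
			}
			return jacobian;
		} finally {
			executor.shutdown();
		}
	}
}
```

Because each task clones the parameter vector, the only shared state a task touches is whatever the objective function itself uses; that is exactly the part the warning about thread-safety refers to.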
- Version:
- 1.6
- Author:
- Christian Fries
- See Also:
- Serialized Form
-
Nested Class Summary
Nested Classes
- static class StochasticLevenbergMarquardt.RegularizationMethod - The regularization method used to invert the approximation of the Hessian matrix.
- Nested classes/interfaces inherited from interface net.finmath.optimizer.StochasticOptimizer: StochasticOptimizer.ObjectiveFunction
-
Constructor Summary
Constructors
- StochasticLevenbergMarquardt(StochasticLevenbergMarquardt.RegularizationMethod regularizationMethod, RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, int numberOfThreads) - Create a Levenberg-Marquardt solver.
- StochasticLevenbergMarquardt(StochasticLevenbergMarquardt.RegularizationMethod regularizationMethod, RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, ExecutorService executorService) - Create a Levenberg-Marquardt solver.
- StochasticLevenbergMarquardt(RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, ExecutorService executorService) - Create a Levenberg-Marquardt solver.
Method Summary
- clone() - Create a clone of this LevenbergMarquardt optimizer.
- getBestFitParameters() - Get the best fit parameter vector.
- getCloneWithModifiedTargetValues(List<RandomVariable> newTargetVaues, List<RandomVariable> newWeights, boolean isUseBestParametersAsInitialParameters) - Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights.
- getCloneWithModifiedTargetValues(RandomVariable[] newTargetVaues, RandomVariable[] newWeights, boolean isUseBestParametersAsInitialParameters) - Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights.
- int getIterations() - Get the number of iterations.
- double getLambda() - Get the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is, the \( \lambda \) in \( H + \lambda \, \mathrm{diag}(H) \).
- double getLambdaDivisor() - Get the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is invertible.
- double getLambdaMultiplicator() - Get the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is not invertible.
- double getMeanSquaredError(RandomVariable[] value)
- double getRootMeanSquaredError()
- static void main(String[] args)
- protected void prepareAndSetDerivatives(RandomVariable[] parameters, RandomVariable[] values, RandomVariable[][] derivatives)
- protected void prepareAndSetValues(RandomVariable[] parameters, RandomVariable[] values)
- void run() - Runs the optimization.
- void setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives) - The derivative of the objective function.
- void setErrorMeanSquaredCurrent(double errorMeanSquaredCurrent)
- void setLambda(double lambda) - Set the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is, the \( \lambda \) in \( H + \lambda \, \mathrm{diag}(H) \).
- void setLambdaDivisor(double lambdaDivisor) - Set the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds.
- void setLambdaMultiplicator(double lambdaMultiplicator) - Set the multiplicator applied to lambda if the inversion of the regularized Hessian fails.
- abstract void setValues(RandomVariable[] parameters, RandomVariable[] values) - The objective function.
-
Constructor Details
-
StochasticLevenbergMarquardt
public StochasticLevenbergMarquardt(StochasticLevenbergMarquardt.RegularizationMethod regularizationMethod, RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, ExecutorService executorService)
Create a Levenberg-Marquardt solver.
- Parameters:
  regularizationMethod - The regularization method to use. See StochasticLevenbergMarquardt.RegularizationMethod.
  initialParameters - Initial value for the parameters where the solver starts its search.
  targetValues - Target values to achieve.
  parameterSteps - Step used for finite difference approximation.
  maxIteration - Maximum number of iterations.
  errorTolerance - Error tolerance / accuracy.
  executorService - Executor to be used for concurrent valuation of the derivatives. This is only performed if setDerivatives is not overridden. Warning: The implementation of setValues has to be thread-safe!
-
StochasticLevenbergMarquardt
public StochasticLevenbergMarquardt(RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, ExecutorService executorService)
Create a Levenberg-Marquardt solver.
- Parameters:
  initialParameters - Initial value for the parameters where the solver starts its search.
  targetValues - Target values to achieve.
  parameterSteps - Step used for finite difference approximation.
  maxIteration - Maximum number of iterations.
  errorTolerance - Error tolerance / accuracy.
  executorService - Executor to be used for concurrent valuation of the derivatives. This is only performed if setDerivatives is not overridden. Warning: The implementation of setValues has to be thread-safe!
-
StochasticLevenbergMarquardt
public StochasticLevenbergMarquardt(StochasticLevenbergMarquardt.RegularizationMethod regularizationMethod, RandomVariable[] initialParameters, RandomVariable[] targetValues, RandomVariable[] parameterSteps, int maxIteration, double errorTolerance, int numberOfThreads)
Create a Levenberg-Marquardt solver.
- Parameters:
  regularizationMethod - The regularization method to use. See StochasticLevenbergMarquardt.RegularizationMethod.
  initialParameters - Initial value for the parameters where the solver starts its search.
  targetValues - Target values to achieve.
  parameterSteps - Step used for finite difference approximation.
  maxIteration - Maximum number of iterations.
  errorTolerance - Error tolerance / accuracy.
  numberOfThreads - Maximum number of threads. Warning: If this number is larger than one, the implementation of setValues has to be thread-safe!
-
-
Method Details
-
main
- Throws:
SolverException
-
getLambda
public double getLambda()
Get the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is, the \( \lambda \) in \( H + \lambda \, \mathrm{diag}(H) \).
- Returns:
  - the parameter \( \lambda \).
-
setLambda
public void setLambda(double lambda)
Set the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is, the \( \lambda \) in \( H + \lambda \, \mathrm{diag}(H) \).
- Parameters:
lambda
- the lambda to set
-
getLambdaMultiplicator
public double getLambdaMultiplicator()
Get the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is not invertible.
- Returns:
- the lambdaMultiplicator
-
setLambdaMultiplicator
public void setLambdaMultiplicator(double lambdaMultiplicator)
Set the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is not invertible. This makes lambda larger and hence the stepping slower.
- Parameters:
lambdaMultiplicator
- the lambdaMultiplicator to set. Should be > 1.
-
getLambdaDivisor
public double getLambdaDivisor()
Get the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is invertible.
- Returns:
- the lambdaDivisor
-
setLambdaDivisor
public void setLambdaDivisor(double lambdaDivisor)
Set the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is invertible. This makes lambda smaller and hence the stepping faster.
- Parameters:
lambdaDivisor
- the lambdaDivisor to set. Should be > 1.
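The interplay of lambda, lambdaMultiplicator, and lambdaDivisor can be illustrated with a minimal plain-double Levenberg-Marquardt loop on the example system from the class description. This is a sketch under simplifying assumptions (constant Jacobian, every solvable step accepted); the class name and the constants are made up for the illustration and are not the library's defaults.

```java
// Illustrative sketch of the lambda update policy on the system
//   0*x1 + x2 = 5,  2*x1 + x2 = 10   (solution x1 = 2.5, x2 = 5).
public class LambdaUpdateSketch {

	// Residuals f(x) - targetValues.
	static double[] residual(double[] x) {
		return new double[] { x[1] - 5.0, 2.0 * x[0] + x[1] - 10.0 };
	}

	static double[] solve() {
		double[] x = { 0.0, 0.0 };
		double lambda = 0.001;
		double lambdaMultiplicator = 5.0; // applied when the regularized Hessian is not invertible
		double lambdaDivisor = 1.3;       // applied after a successful step
		double[][] J = { { 0.0, 2.0 }, { 1.0, 1.0 } }; // J[i][j] = d f_j / d x_i (constant here)
		for (int iteration = 0; iteration < 100; iteration++) {
			double[] r = residual(x);
			// Hessian approximation H = J J^T and gradient g = J r.
			double[][] H = new double[2][2];
			double[] g = new double[2];
			for (int i = 0; i < 2; i++) {
				for (int k = 0; k < 2; k++) {
					for (int j = 0; j < 2; j++) H[i][k] += J[i][j] * J[k][j];
				}
				for (int j = 0; j < 2; j++) g[i] += J[i][j] * r[j];
			}
			// Regularize: H + lambda * diag(H), then solve the 2x2 system by Cramer's rule.
			double a = H[0][0] * (1.0 + lambda), b = H[0][1];
			double c = H[1][0], d = H[1][1] * (1.0 + lambda);
			double determinant = a * d - b * c;
			if (Math.abs(determinant) < 1E-12) {
				lambda *= lambdaMultiplicator; // inversion failed: damp harder, step slower
				continue;
			}
			x[0] -= (d * g[0] - b * g[1]) / determinant;
			x[1] -= (a * g[1] - c * g[0]) / determinant;
			lambda /= lambdaDivisor; // step succeeded: damp less, step faster
		}
		return x;
	}

	public static void main(String[] args) {
		double[] x = solve();
		System.out.println(x[0] + " " + x[1]); // converges to approximately 2.5 and 5.0
	}
}
```

A full implementation would additionally compare the new error against the best error so far and increase lambda when a step is rejected; the sketch accepts every solvable step for brevity.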
-
getBestFitParameters
Description copied from interface: StochasticOptimizer
Get the best fit parameter vector.
- Specified by:
  getBestFitParameters in interface StochasticOptimizer
- Returns:
- The best fit parameter.
-
getRootMeanSquaredError
public double getRootMeanSquaredError()
- Specified by:
  getRootMeanSquaredError in interface StochasticOptimizer
- Returns:
  - the root mean squared error achieved with the best fit parameter
-
setErrorMeanSquaredCurrent
public void setErrorMeanSquaredCurrent(double errorMeanSquaredCurrent)
- Parameters:
errorMeanSquaredCurrent
- the errorMeanSquaredCurrent to set
-
getIterations
public int getIterations()
Description copied from interface: StochasticOptimizer
Get the number of iterations.
- Specified by:
  getIterations in interface StochasticOptimizer
- Returns:
- The number of iterations required
-
prepareAndSetValues
protected void prepareAndSetValues(RandomVariable[] parameters, RandomVariable[] values) throws SolverException
- Throws:
SolverException
-
prepareAndSetDerivatives
protected void prepareAndSetDerivatives(RandomVariable[] parameters, RandomVariable[] values, RandomVariable[][] derivatives) throws SolverException
- Throws:
SolverException
-
setValues
public abstract void setValues(RandomVariable[] parameters, RandomVariable[] values) throws SolverException
The objective function. Override this method to implement your custom function.
- Parameters:
  parameters - Input value. The parameter vector.
  values - Output value. The vector of values f(i,parameters), i=1,...,n.
- Throws:
  SolverException - Thrown if the valuation fails; the specific cause may be available via the cause() method.
-
setDerivatives
public void setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives) throws SolverException
The derivative of the objective function. You may override this method if you want to implement your own derivative.
- Parameters:
  parameters - Input value. The parameter vector.
  derivatives - Output value, where derivatives[i][j] is d(value(j)) / d(parameters(i)).
- Throws:
  SolverException - Thrown if the valuation fails; the specific cause may be available via the cause() method.
-
run
Description copied from interface: StochasticOptimizer
Runs the optimization.
- Specified by:
  run in interface StochasticOptimizer
- Throws:
  SolverException - Thrown if the valuation fails; the specific cause may be available via the cause() method.
-
getMeanSquaredError
-
clone
Create a clone of this LevenbergMarquardt optimizer. The clone will use the same objective function as this implementation, i.e., the implementations of setValues(RandomVariable[], RandomVariable[]) and setDerivatives(RandomVariable[], RandomVariable[][]) are reused.
- Overrides:
  clone in class Object
- Throws:
CloneNotSupportedException
-
getCloneWithModifiedTargetValues
public StochasticLevenbergMarquardt getCloneWithModifiedTargetValues(RandomVariable[] newTargetVaues, RandomVariable[] newWeights, boolean isUseBestParametersAsInitialParameters) throws CloneNotSupportedException
Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights. The clone will use the same objective function as this implementation, i.e., the implementations of setValues(RandomVariable[], RandomVariable[]) and setDerivatives(RandomVariable[], RandomVariable[][]) are reused. The initial values of the cloned optimizer will either be the original initial values of this object or the best parameters obtained by this optimizer; the latter is used only if this optimizer signals done().
- Parameters:
  newTargetVaues - New array of target values.
  newWeights - New array of weights.
  isUseBestParametersAsInitialParameters - If true and this optimizer is done(), then the clone will use this.getBestFitParameters() as initial parameters.
- Returns:
- A new LevenbergMarquardt optimizer, cloning this one except modified target values and weights.
- Throws:
CloneNotSupportedException
- Thrown if this optimizer cannot be cloned.
-
getCloneWithModifiedTargetValues
public StochasticLevenbergMarquardt getCloneWithModifiedTargetValues(List<RandomVariable> newTargetVaues, List<RandomVariable> newWeights, boolean isUseBestParametersAsInitialParameters) throws CloneNotSupportedException
Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights. The clone will use the same objective function as this implementation, i.e., the implementations of setValues(RandomVariable[], RandomVariable[]) and setDerivatives(RandomVariable[], RandomVariable[][]) are reused. The initial values of the cloned optimizer will either be the original initial values of this object or the best parameters obtained by this optimizer; the latter is used only if this optimizer signals done().
- Parameters:
  newTargetVaues - New list of target values.
  newWeights - New list of weights.
  isUseBestParametersAsInitialParameters - If true and this optimizer is done(), then the clone will use this.getBestFitParameters() as initial parameters.
- Returns:
- A new LevenbergMarquardt optimizer, cloning this one except modified target values and weights.
- Throws:
CloneNotSupportedException
- Thrown if this optimizer cannot be cloned.
-