AugmentedLagrangian Class
Namespace: Accord.Math.Optimization
```csharp
public class AugmentedLagrangian : BaseGradientOptimizationMethod,
    IGradientOptimizationMethod, IOptimizationMethod,
    IOptimizationMethod<double[], double>,
    IGradientOptimizationMethod<double[], double>,
    IFunctionOptimizationMethod<double[], double>,
    IOptimizationMethod<AugmentedLagrangianStatus>,
    IOptimizationMethod<double[], double, AugmentedLagrangianStatus>
```
The AugmentedLagrangian type exposes the following members.
Constructors

Name | Description
---|---
AugmentedLagrangian(Int32, IEnumerable<IConstraint>) | Creates a new instance of the Augmented Lagrangian algorithm.
AugmentedLagrangian(IGradientOptimizationMethod, IEnumerable<IConstraint>) | Creates a new instance of the Augmented Lagrangian algorithm.
AugmentedLagrangian(NonlinearObjectiveFunction, IEnumerable<IConstraint>) | Creates a new instance of the Augmented Lagrangian algorithm.
AugmentedLagrangian(IGradientOptimizationMethod, NonlinearObjectiveFunction, IEnumerable<IConstraint>) | Creates a new instance of the Augmented Lagrangian algorithm.
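The overloads that take an IGradientOptimizationMethod allow the inner dual solver to be chosen explicitly rather than relying on the default. The following is a minimal sketch of the fourth overload; it assumes ConjugateGradient (another gradient-based optimizer from Accord.Math.Optimization) is acceptable as the inner method and reuses the vector-form objective and linear constraints shown in the examples at the end of this page. Any IGradientOptimizationMethod should be usable in its place.

```csharp
using System;
using Accord.Math;
using Accord.Math.Optimization;

// Rosenbrock objective in vector form (the same function used in the examples below).
var f = new NonlinearObjectiveFunction(numberOfVariables: 2,
    function: (x) => 100 * Math.Pow(x[1] - x[0] * x[0], 2) + Math.Pow(1 - x[0], 2),
    gradient: (x) => new[]
    {
        2 * (200 * Math.Pow(x[0], 3) - 200 * x[0] * x[1] + x[0] - 1),
        200 * (x[1] - x[0] * x[0])
    });

// Non-negativity constraints written as linear constraints: I x >= 0.
var constraints = LinearConstraintCollection.Create(
    Matrix.Identity(2), Vector.Zeros(2), 0);

// Fourth constructor overload: pass the inner (dual) optimizer explicitly.
// ConjugateGradient is only an illustration (an assumption of this sketch);
// any IGradientOptimizationMethod should fit here.
var inner = new ConjugateGradient(2);
var solver = new AugmentedLagrangian(inner, f, constraints);

bool success = solver.Minimize();   // the solution should approach { 1, 1 }
```

After construction, the Optimizer property should expose this inner solver.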
Properties

Name | Description
---|---
Evaluations | Gets the number of function evaluations performed in the last call to Minimize or Maximize.
Function | Gets or sets the function to be optimized. (Inherited from BaseOptimizationMethod.)
Gradient | Gets or sets a function returning the gradient vector of the function to be optimized for a given value of its free parameters. (Inherited from BaseGradientOptimizationMethod.)
Iterations | Gets the number of iterations performed in the last call to Minimize or Maximize.
MaxEvaluations | Gets or sets the maximum number of evaluations to be performed during optimization. Default is 0 (evaluate until convergence).
NumberOfVariables | Gets the number of variables (free parameters) in the optimization problem. (Inherited from BaseOptimizationMethod.)
Optimizer | Gets the inner dual problem optimization algorithm.
Solution | Gets the current solution found, the values of the parameters which optimize the function. (Inherited from BaseOptimizationMethod.)
Status | Gets the exit status (AugmentedLagrangianStatus) of the last call to Minimize or Maximize.
Token | Gets or sets a cancellation token that can be used to stop the learning algorithm while it is running. (Inherited from BaseOptimizationMethod.)
Value | Gets the output of the function at the current Solution. (Inherited from BaseOptimizationMethod.)
Methods

Name | Description
---|---
Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.)
Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)
GetHashCode | Serves as the default hash function. (Inherited from Object.)
GetType | Gets the Type of the current instance. (Inherited from Object.)
Maximize() | Finds the maximum value of a function. The solution vector will be made available at the Solution property. (Inherited from BaseGradientOptimizationMethod.)
Maximize(Double[]) | Finds the maximum value of a function. The solution vector will be made available at the Solution property. (Inherited from BaseOptimizationMethod.)
MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.)
Minimize() | Finds the minimum value of a function. The solution vector will be made available at the Solution property. (Inherited from BaseGradientOptimizationMethod.)
Minimize(Double[]) | Finds the minimum value of a function. The solution vector will be made available at the Solution property. (Inherited from BaseOptimizationMethod.)
OnNumberOfVariablesChanged | Called when the NumberOfVariables property has changed. (Inherited from BaseOptimizationMethod.)
Optimize | Implements the actual optimization algorithm. This method should try to minimize the objective function. (Overrides BaseOptimizationMethod.Optimize().)
ToString | Returns a string that represents the current object. (Inherited from Object.)
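The Double[] overloads of Minimize and Maximize accept a vector that, under the assumption made in this sketch, serves as the starting point of the search (based on the BaseOptimizationMethod signatures listed above). A short sketch, continuing with the solver from the earlier sketches:

```csharp
// Assumption: the Double[] overload seeds the search (and Solution) with this vector.
double[] initialGuess = { -1.2, 1.0 };

bool success = solver.Minimize(initialGuess);

double[] solution = solver.Solution;   // expected to approach { 1, 1 }
double minValue = solver.Value;        // expected to approach 0
```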
Extension Methods

Name | Description
---|---
HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To<T>() | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
Examples
In this framework, it is possible to state a non-linear programming problem using either symbolic processing or vector-valued functions. The following example demonstrates the symbolic processing case:
```csharp
// Suppose we would like to minimize the following function:
//
//    f(x,y) = min 100(y-x²)²+(1-x)²
//
// Subject to the constraints
//
//    x >= 0  (x must be positive)
//    y >= 0  (y must be positive)
//

// First, let's declare some symbolic variables
double x = 0, y = 0; // (values do not matter)

// Now, we create an objective function
var f = new NonlinearObjectiveFunction(

    // This is the objective function:  f(x,y) = min 100(y-x²)²+(1-x)²
    function: () => 100 * Math.Pow(y - x * x, 2) + Math.Pow(1 - x, 2),

    // And this is the vector gradient for the same function:
    gradient: () => new[]
    {
        2 * (200 * Math.Pow(x, 3) - 200 * x * y + x - 1), // df/dx = 2(200x³-200xy+x-1)
        200 * (y - x * x)                                 // df/dy = 200(y-x²)
    }
);

// Now we can start stating the constraints
var constraints = new List<NonlinearConstraint>()
{
    // Add the non-negativity constraint for x
    new NonlinearConstraint(f,
        // 1st constraint: x should be greater than or equal to 0
        function: () => x,
        shouldBe: ConstraintType.GreaterThanOrEqualTo,
        value: 0,
        gradient: () => new[] { 1.0, 0.0 }
    ),

    // Add the non-negativity constraint for y
    new NonlinearConstraint(f,
        // 2nd constraint: y should be greater than or equal to 0
        function: () => y,
        shouldBe: ConstraintType.GreaterThanOrEqualTo,
        value: 0,
        gradient: () => new[] { 0.0, 1.0 }
    )
};

// Finally, we create the non-linear programming solver
var solver = new AugmentedLagrangian(f, constraints);

// And attempt to find a minimum
bool success = solver.Minimize();

// The solution found was { 1, 1 }
double[] solution = solver.Solution;

// with the minimum value zero.
double minValue = solver.Value;
```
And this is the same example as before, but using standard vectors instead.
```csharp
// Suppose we would like to minimize the following function:
//
//    f(x,y) = min 100(y-x²)²+(1-x)²
//
// Subject to the constraints
//
//    x >= 0  (x must be positive)
//    y >= 0  (y must be positive)
//

// Now, we can create an objective function using vectors
var f = new NonlinearObjectiveFunction(numberOfVariables: 2,

    // This is the objective function:  f(x,y) = min 100(y-x²)²+(1-x)²
    function: (x) => 100 * Math.Pow(x[1] - x[0] * x[0], 2) + Math.Pow(1 - x[0], 2),

    // And this is the vector gradient for the same function:
    gradient: (x) => new[]
    {
        2 * (200 * Math.Pow(x[0], 3) - 200 * x[0] * x[1] + x[0] - 1), // df/dx = 2(200x³-200xy+x-1)
        200 * (x[1] - x[0] * x[0])                                    // df/dy = 200(y-x²)
    }
);

// As before, we state the constraints. However, to illustrate the flexibility
// of the AugmentedLagrangian, we shall use LinearConstraints to constrain the problem.
double[,] a = Matrix.Identity(2); // Set up the constraint matrix...
double[] b = Vector.Zeros(2);     // ...and the values they must be greater than
int numberOfEqualities = 0;
var linearConstraints = LinearConstraintCollection.Create(a, b, numberOfEqualities);

// Finally, we create the non-linear programming solver
var solver = new AugmentedLagrangian(f, linearConstraints);

// And attempt to find a minimum
bool success = solver.Minimize();

// The solution found was { 1, 1 }
double[] solution = solver.Solution;

// with the minimum value zero.
double minValue = solver.Value;
```