
LevenbergMarquardtLearning Class

Levenberg-Marquardt Learning Algorithm with optional Bayesian Regularization.
Inheritance Hierarchy
System.Object
  Accord.Neuro.Learning.LevenbergMarquardtLearning

Namespace:  Accord.Neuro.Learning
Assembly:  Accord.Neuro (in Accord.Neuro.dll) Version: 3.8.0
Syntax
public class LevenbergMarquardtLearning : ISupervisedLearning

The LevenbergMarquardtLearning type exposes the following members.

Constructors
Name  Description
Public method  LevenbergMarquardtLearning(ActivationNetwork)
Initializes a new instance of the LevenbergMarquardtLearning class.
Public method  LevenbergMarquardtLearning(ActivationNetwork, JacobianMethod)
Initializes a new instance of the LevenbergMarquardtLearning class.
Public method  LevenbergMarquardtLearning(ActivationNetwork, Boolean)
Initializes a new instance of the LevenbergMarquardtLearning class.
Public method  LevenbergMarquardtLearning(ActivationNetwork, Boolean, JacobianMethod)
Initializes a new instance of the LevenbergMarquardtLearning class.
Properties
Name  Description
Public property  Adjustment
Learning rate adjustment. Default value is 10.
Public property  Alpha
Gets or sets the importance of the squared sum of network weights in the cost function. Used by the regularization.
Public property  Beta
Gets or sets the importance of the squared sum of network errors in the cost function. Used by the regularization.
Public property  Blocks
Gets or sets the number of blocks into which the Jacobian matrix is divided during the Hessian calculation, to preserve memory. Default is 1.
Public property  EffectiveParameters
Gets the number of effective parameters being used by the network as determined by the Bayesian regularization.
Public property  Gradient
Gets the gradient vector computed in the last iteration.
Public property  Hessian
Gets the approximate Hessian matrix of second derivatives generated in the last algorithm iteration. The Hessian is stored in the upper triangular part of this matrix. See remarks for details.
Public property  Jacobian
Gets the Jacobian matrix created in the last iteration.
Public property  LearningRate
Levenberg's damping factor (lambda). This value must be positive. Default is 0.1.
Public property  NumberOfParameters
Gets the total number of parameters in the network being trained.
Public property  ParallelOptions
Gets or sets the parallelization options for this algorithm.
Public property  UseRegularization
Gets or sets whether to use Bayesian Regularization.
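
A short configuration sketch based on the property descriptions above; the specific values shown for Blocks are illustrative assumptions, not recommendations:

ActivationNetwork network = new ActivationNetwork(
    new SigmoidFunction(2), 2, 2, 1);
var teacher = new LevenbergMarquardtLearning(network);

teacher.LearningRate = 0.1;        // Levenberg's damping factor (lambda); 0.1 is the default
teacher.Adjustment = 10;           // learning rate adjustment; 10 is the default
teacher.UseRegularization = true;  // turn on Bayesian regularization
teacher.Blocks = 4;                // illustrative value: build the Hessian from the
                                   // Jacobian in 4 blocks to reduce peak memory use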
Methods
Name  Description
Public method  ComputeError
Computes the network error for a given data set.
Public method  Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
Protected method  Finalize
Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection.
(Inherited from Object.)
Public method  GetHashCode
Serves as the default hash function.
(Inherited from Object.)
Public method  GetType
Gets the Type of the current instance.
(Inherited from Object.)
Protected method  MemberwiseClone
Creates a shallow copy of the current Object.
(Inherited from Object.)
Public method  Run
Runs one learning iteration.
Public method  RunEpoch
Runs a single learning epoch.
Public method  ToString
Returns a string that represents the current object.
(Inherited from Object.)
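
A brief usage sketch for the learning methods; the signatures shown follow the ISupervisedLearning interface and are assumptions that may vary between library versions:

// a small network and its teacher
ActivationNetwork network = new ActivationNetwork(
    new SigmoidFunction(2), 2, 2, 1);
var teacher = new LevenbergMarquardtLearning(network);

double[][] inputs  = { new double[] { 0, 0 }, new double[] { 1, 1 } };
double[][] targets = { new double[] { 0 },    new double[] { 0 }    };

// one learning iteration on a single sample; returns that sample's error
// (signature assumed from the ISupervisedLearning interface)
double sampleError = teacher.Run(inputs[0], targets[0]);

// one full pass over the data set; returns the summary learning error
double epochError = teacher.RunEpoch(inputs, targets);

// measure the error on a data set without performing a learning step
double dataError = teacher.ComputeError(inputs, targets);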
Extension Methods
Name  Description
Public Extension Method  HasMethod
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)
Public Extension Method  IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
Public Extension Method  To(Type)
Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Public Extension Method  To<T>
Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Remarks

This class implements the Levenberg-Marquardt learning algorithm, which treats neural network learning as a function optimization problem. Levenberg-Marquardt is one of the fastest and most accurate learning algorithms for small to medium sized networks.

However, the standard LM algorithm generally does not perform as well on pattern recognition problems as it does on function approximation problems. The LM algorithm is designed for least squares problems that are approximately linear. Because the output neurons in pattern recognition problems are generally saturated, the network will not be operating in the linear region.

The advantages of the LM algorithm decrease as the number of network parameters increases.
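
As a point of reference (the standard textbook formulation, not taken verbatim from this implementation), each Levenberg-Marquardt iteration applies a weight update of the form

\Delta w = -(J^\top J + \lambda I)^{-1} J^\top e

where J is the Jacobian of the network errors (exposed here through the Jacobian property), J^\top J is the Hessian approximation (the Hessian property), \lambda is the damping factor (the LearningRate property), and e is the vector of network errors. A large \lambda pushes the step towards gradient descent, while a small \lambda pushes it towards the Gauss-Newton step.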

Examples

Sample usage (training a network to calculate the XOR function):

// initialize input and output values
double[][] input =
{
    new double[] {0, 0}, new double[] {0, 1},
    new double[] {1, 0}, new double[] {1, 1}
};

double[][] output = 
{
    new double[] {0}, new double[] {1},
    new double[] {1}, new double[] {0}
};

// create neural network
ActivationNetwork network = new ActivationNetwork(
    new SigmoidFunction( 2 ),
    2, // two inputs in the network
    2, // two neurons in the first layer
    1 ); // one neuron in the second layer

// create teacher
LevenbergMarquardtLearning teacher = new LevenbergMarquardtLearning( network );

// loop until the user-defined stop flag is set
while ( !needToStop )
{
    // run epoch of learning procedure
    double error = teacher.RunEpoch( input, output );

    // check error value to see if we need to stop
    // ...
}
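
Once the error is low enough, the trained network can be queried directly. A small follow-up sketch; thresholding the sigmoid output at 0.5 is an assumed convention for reading the result as a boolean:

// compute the output of the trained network for one input vector
double[] result = network.Compute( new double[] { 1, 0 } );

// the sigmoid output lies in [0, 1]; thresholding at 0.5 is an
// assumed way to read it as the XOR answer
bool answer = result[0] > 0.5; // expected: true for input {1, 0}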

The following example shows how to create a neural network to learn a classification problem with multiple classes.

// Here we will be creating a neural network to process 3-valued input
// vectors and classify them into 4-possible classes. We will be using
// a single hidden layer with 5 hidden neurons to accomplish this task.
// 
int numberOfInputs = 3;
int numberOfClasses = 4;
int hiddenNeurons = 5;

// Those are the input vectors and their expected class labels
// that we expect our network to learn.
// 
double[][] input = 
{
    new double[] { -1, -1, -1 }, // 0
    new double[] { -1,  1, -1 }, // 1
    new double[] {  1, -1, -1 }, // 1
    new double[] {  1,  1, -1 }, // 0
    new double[] { -1, -1,  1 }, // 2
    new double[] { -1,  1,  1 }, // 3
    new double[] {  1, -1,  1 }, // 3
    new double[] {  1,  1,  1 }  // 2
};

int[] labels =
{
    0,
    1,
    1,
    0,
    2,
    3,
    3,
    2
};

// In order to perform multi-class classification, we have to select a
// decision strategy that lets us interpret the neural network outputs
// as class labels. For this, we will expand our 4 possible class labels
// into 4-dimensional output vectors in which the dimension corresponding
// to the correct label contains +1 and all other dimensions contain -1.

double[][] outputs = Accord.Statistics.Tools
  .Expand(labels, numberOfClasses, -1, 1);

// Next we can proceed to create our network
var function = new BipolarSigmoidFunction(2);
var network = new ActivationNetwork(function,
  numberOfInputs, hiddenNeurons, numberOfClasses);

// Heuristically randomize the network
new NguyenWidrow(network).Randomize();

// Create the learning algorithm
var teacher = new LevenbergMarquardtLearning(network);

// Teach the network for 10 iterations:
double error = Double.PositiveInfinity;
for (int i = 0; i < 10; i++)
   error = teacher.RunEpoch(input, outputs);

// At this point, the network should be able to 
// perfectly classify the training input points.

for (int i = 0; i < input.Length; i++)
{
   int answer;
   double[] output = network.Compute(input[i]);
   double response = output.Max(out answer);

   int expected = labels[i];

   // at this point, the variables 'answer' and
   // 'expected' should contain the same value.
}
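
For small or noisy training sets, the Bayesian regularization variant can be enabled through the constructor or the UseRegularization property. A sketch continuing the example above, under the assumption that the rest of the training loop stays unchanged:

// enable Bayesian regularization when creating the teacher...
var regularized = new LevenbergMarquardtLearning(network, true);

// ...or toggle it on an existing instance
teacher.UseRegularization = true;

// after training, the number of parameters that the regularization
// considers effectively in use can be inspected:
var effective = teacher.EffectiveParameters;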
