
AveragedStochasticGradientDescent&lt;TKernel&gt; Class

Averaged Stochastic Gradient Descent (ASGD) for training linear support vector machines.
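For a quick orientation before the full examples below, the following minimal sketch shows how the class can be used directly to train a binary linear machine. The toy dataset and hyper-parameter values are illustrative only, not recommendations:

```csharp
// Minimal usage sketch (toy data; hyper-parameter values are illustrative only):
double[][] inputs =
{
    new double[] { 0, 0 }, new double[] { 0, 1 },
    new double[] { 1, 0 }, new double[] { 1, 1 }
};
int[] outputs = { 0, 0, 1, 1 }; // binary class labels

// Create the ASGD teacher with a linear kernel:
var teacher = new AveragedStochasticGradientDescent<Linear>()
{
    LearningRate = 1e-2, // step size for the gradient updates
    Lambda = 0.5,        // regularization term (the documented default)
    Tolerance = 1e-3     // relative-change convergence criterion
};

// Learn the support vector machine and classify the training points:
SupportVectorMachine<Linear> svm = teacher.Learn(inputs, outputs);
bool[] predicted = svm.Decide(inputs);
```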
Inheritance Hierarchy
System.Object
  Accord.MachineLearning.BinaryLearningBase&lt;SupportVectorMachine&lt;TKernel&gt;, Double&gt;
    Accord.MachineLearning.VectorMachines.Learning.BaseAveragedStochasticGradientDescent&lt;SupportVectorMachine&lt;TKernel&gt;, TKernel, Double, HingeLoss&gt;
      Accord.MachineLearning.VectorMachines.Learning.AveragedStochasticGradientDescent&lt;TKernel&gt;

Namespace:  Accord.MachineLearning.VectorMachines.Learning
Assembly:  Accord.MachineLearning (in Accord.MachineLearning.dll) Version: 3.8.0
Syntax
public sealed class AveragedStochasticGradientDescent<TKernel> : BaseAveragedStochasticGradientDescent<SupportVectorMachine<TKernel>, TKernel, double[], HingeLoss>
where TKernel : struct, ILinear

Type Parameters

TKernel
The type of the kernel function to use. It should be a linear kernel, i.e. one implementing the ILinear interface.

The AveragedStochasticGradientDescent&lt;TKernel&gt; type exposes the following members.

Constructors
Name | Description
Public method AveragedStochasticGradientDescent&lt;TKernel&gt;
Initializes a new instance of the AveragedStochasticGradientDescent&lt;TKernel&gt; class.
Properties
Name | Description
Public property CurrentEpoch
Gets or sets the current epoch counter.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property Iterations  Obsolete.
Please use MaxIterations instead.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property Kernel
Gets or sets the kernel function used to create a kernel Support Vector Machine.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property Lambda
Gets or sets the lambda regularization term. Default is 0.5.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property LearningRate
Gets or sets the learning rate for the SGD algorithm.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property Loss
Gets or sets the loss function to be used. Default is to use the LogisticLoss.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property MaxIterations
Gets or sets the maximum number of iterations performed by the learning algorithm.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property Model
Gets or sets the classifier being learned.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public property ParallelOptions
Gets or sets the parallelization options for this algorithm.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property Token
Gets or sets a cancellation token that can be used to cancel the algorithm while it is running.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property Tolerance
Gets or sets the maximum relative change in the watched value after an iteration of the algorithm, used to detect convergence. Default is 1e-3. If set to 0, the loss will not be computed during learning and execution will be faster.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
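Since training can run for many epochs, the inherited Token property can be used to stop a session cooperatively. The following is a hedged sketch, assuming `inputs` (double[][]) and `outputs` (int[]) already hold a binary training set:

```csharp
// Sketch: stopping a long-running training session via the Token property.
// Assumes 'inputs' (double[][]) and 'outputs' (int[]) hold a binary training set.
var teacher = new AveragedStochasticGradientDescent<Linear>();

var source = new System.Threading.CancellationTokenSource();
source.CancelAfter(System.TimeSpan.FromSeconds(10)); // request cancellation after 10 s
teacher.Token = source.Token;

// Learn observes the token and can stop early once cancellation is requested:
var svm = teacher.Learn(inputs, outputs);
```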
Methods
Name | Description
Public method Clone
Creates a new object that is a copy of the current instance.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public method Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
Public method GetHashCode
Serves as the default hash function.
(Inherited from Object.)
Public method GetType
Gets the Type of the current instance.
(Inherited from Object.)
Public method Learn(TInput[], Boolean[], Double[])
Learns a model that can map the given inputs to the given outputs.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public method Learn(TInput[], Double[], Double[])
Learns a model that can map the given inputs to the given outputs.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public method Learn(TInput[], Int32[], Double[])
Learns a model that can map the given inputs to the given outputs.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public method Learn(TInput[], Int32[], Double[])
Learns a model that can map the given inputs to the given outputs.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public method Learn(TInput[], Boolean[], Double[])
Learns a model that can map the given inputs to the given outputs.
(Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public method ToString
Returns a string that represents the current object.
(Inherited from Object.)
Extension Methods
Name | Description
Public Extension Method HasMethod
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)
Public Extension Method IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
Public Extension Method To(Type)  Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Public Extension Method To&lt;T&gt;  Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Examples
// In this example, we will learn a multi-class SVM using the one-vs-one (OvO)
// approach. The OvO approach can decompose decision problems involving multiple
// classes into a series of binary ones, which can then be solved using SVMs.

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier
// for the Fisher Iris Flower dataset
var iris = new Iris();
double[][] inputs = iris.Instances; // get the flower characteristics
int[] outputs = iris.ClassLabels;   // get the expected flower classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
   shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// Now, we can create a multi-class teaching algorithm for the SVMs
var teacher = new MulticlassSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new AveragedStochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results. Please remove it to enable full parallelization
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
{
    teacher.Learn(batch.Inputs, batch.Outputs);
}

// Get the final model:
var svm = teacher.Model;

// Now, we should be able to use the model to predict 
// the classes of all flowers in Fisher's Iris dataset:
int[] prediction = svm.Decide(inputs);

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.973

// In this example, we will learn a multi-class SVM using the one-vs-rest (OvR)
// approach. The OvR approach can decompose decision problems involving multiple
// classes into a series of binary ones, which can then be solved using SVMs.

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier
// for the Fisher Iris Flower dataset
var iris = new Iris();
double[][] inputs = iris.Instances; // get the flower characteristics
int[] outputs = iris.ClassLabels;   // get the expected flower classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
   shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// Now, we can create a multi-label teaching algorithm for the SVMs
var teacher = new MultilabelSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new AveragedStochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results. Please remove it to enable full parallelization
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
{
    teacher.Learn(batch.Inputs, batch.Outputs);
}

// Get the final model:
var svm = teacher.Model;

// Now, we should be able to use the model to predict 
// the classes of all flowers in Fisher's Iris dataset:
int[] prediction = svm.ToMulticlass().Decide(inputs);

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.913

// In this example, we will show how it is possible to learn a
// non-linear SVM using a linear algorithm by using an explicit
// expansion of the kernel function:

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier for the
// Wisconsin Diagnostic Breast Cancer dataset
var cancer = new WisconsinDiagnosticBreastCancer();
double[][] inputs = cancer.Features; // get the feature vectors
int[] outputs = cancer.ClassLabels;  // get the expected class labels

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
   shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// We will use an explicit Polynomial kernel expansion
var polynomial = new Polynomial(2);

// Now, we can create a multi-class teaching algorithm for the SVMs
var teacher = new MulticlassSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new AveragedStochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results. Please remove it to enable full parallelization
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
{
    teacher.Learn(polynomial.Transform(batch.Inputs), batch.Outputs);
}

// Get the final model:
var svm = teacher.Model;

// The following line is only needed to ensure reproducible results. Please remove it to enable full parallelization
svm.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Now, we should be able to use the model to predict 
// the classes of all samples in the dataset:
int[] prediction = svm.Decide(polynomial.Transform(inputs));

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.92
See Also