
StochasticGradientDescent&lt;TKernel, TInput, TLoss&gt; Class

Stochastic Gradient Descent (SGD) for training linear support vector machines.
Inheritance Hierarchy
System.Object
  Accord.MachineLearning.BinaryLearningBase&lt;SupportVectorMachine&lt;TKernel, TInput&gt;, TInput&gt;
    Accord.MachineLearning.VectorMachines.Learning.BaseStochasticGradientDescent&lt;SupportVectorMachine&lt;TKernel, TInput&gt;, TKernel, TInput, TLoss&gt;
      Accord.MachineLearning.VectorMachines.Learning.StochasticGradientDescent&lt;TKernel, TInput, TLoss&gt;

Namespace:  Accord.MachineLearning.VectorMachines.Learning
Assembly:  Accord.MachineLearning (in Accord.MachineLearning.dll) Version: 3.8.0
Syntax
public class StochasticGradientDescent<TKernel, TInput, TLoss> : BaseStochasticGradientDescent<SupportVectorMachine<TKernel, TInput>, TKernel, TInput, TLoss>
    where TKernel : struct, ILinear<TInput>
    where TInput : IList, ICloneable
    where TLoss : struct, IDifferentiableLoss<bool, double, double>

Type Parameters

TKernel
The type of the kernel function to be used; it must be a value type implementing ILinear&lt;TInput&gt; (i.e., a linear kernel such as Linear).
TInput
The type of the input data handled by the machine (e.g., double[]); it must implement IList and ICloneable.
TLoss
The type of the differentiable loss function to be used, implementing IDifferentiableLoss&lt;bool, double, double&gt; (e.g., LogisticLoss).
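
For clarity, the sketch below shows one way to close these type parameters with concrete Accord.NET types and train a binary classifier. It is a minimal illustration rather than part of the original page: Linear stands in for TKernel, double[] for TInput, and LogisticLoss for TLoss; the tiny AND dataset and the hyper-parameter values are illustrative assumptions.

// Minimal sketch (assumed data and settings; Linear, double[] and LogisticLoss
// are example type arguments that satisfy the constraints shown above).
double[][] inputs =
{
    new double[] { 0, 0 }, new double[] { 0, 1 },
    new double[] { 1, 0 }, new double[] { 1, 1 },
};
bool[] outputs = { false, false, false, true }; // the AND function

// Create the SGD teacher for a linear support vector machine:
var teacher = new StochasticGradientDescent<Linear, double[], LogisticLoss>()
{
    LearningRate = 1e-3,
    MaxIterations = 100
};

// Learn the machine and use it to classify the training points:
var svm = teacher.Learn(inputs, outputs);
bool[] predicted = svm.Decide(inputs);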

The StochasticGradientDescent&lt;TKernel, TInput, TLoss&gt; type exposes the following members.

Constructors
  Name  Description
Public method  StochasticGradientDescent&lt;TKernel, TInput, TLoss&gt;
Initializes a new instance of the StochasticGradientDescent&lt;TKernel, TInput, TLoss&gt; class.
Properties
  Name  Description
Public property  Iterations  Obsolete.
Please use MaxIterations instead.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property  Kernel
Gets or sets the kernel function used to create a kernel Support Vector Machine.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property  Lambda
Gets or sets the lambda regularization term. Default is 0.5.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property  LearningRate
Gets or sets the learning rate for the SGD algorithm.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property  Loss
Gets or sets the loss function to be used. Default is to use the LogisticLoss.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property  MaxIterations
Gets or sets the maximum number of iterations to be performed by the learning algorithm.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Public property  Model
Gets or sets the classifier being learned.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public property  Token
Gets or sets a cancellation token that can be used to stop the learning algorithm while it is running.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public property  Tolerance
Gets or sets the maximum relative change in the watched value after an iteration of the algorithm, used to detect convergence. Default is 1e-5.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
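The sketch below shows how the properties above, inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt; and BinaryLearningBase&lt;TModel, TInput&gt;, might be configured before training; the concrete type arguments and the numeric values are illustrative assumptions, not recommendations.

// Configuration sketch (illustrative values):
var sgd = new StochasticGradientDescent<Linear, double[], LogisticLoss>()
{
    Lambda = 0.5,          // regularization term (0.5 is the documented default)
    LearningRate = 1e-3,   // step size used by the stochastic gradient updates
    MaxIterations = 1000,  // upper bound on the number of iterations
    Tolerance = 1e-5       // relative-change threshold used to detect convergence
};

// The inherited Token property can be used to stop a long-running Learn call:
var cancellation = new System.Threading.CancellationTokenSource();
sgd.Token = cancellation.Token;
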
Methods
  Name  Description
Public method  Clone
Creates a new object that is a copy of the current instance.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Protected method  Create
Creates an instance of the model to be learned. Inheritors of this abstract class must define this method so new models can be created from the training data.
(Overrides BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.Create(Int32, TKernel).)
Public method  Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
Protected method  Finalize
Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection.
(Inherited from Object.)
Public method  GetHashCode
Serves as the default hash function.
(Inherited from Object.)
Public method  GetType
Gets the Type of the current instance.
(Inherited from Object.)
Protected method  InnerClone
Inheritors should implement this function to produce a new instance with the same characteristics of the current object.
(Overrides BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.InnerClone().)
Public method  Learn(TInput, Boolean, Double)
Learns a model that can map the given inputs to the given outputs.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public method  Learn(TInput, Double, Double)
Learns a model that can map the given inputs to the given outputs.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public method  Learn(TInput, Int32, Double)
Learns a model that can map the given inputs to the given outputs.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public method  Learn(TInput, Int32, Double)
Learns a model that can map the given inputs to the given outputs.
(Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Public method  Learn(TInput, Boolean, Double)
Learns a model that can map the given inputs to the given outputs.
(Inherited from BaseStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Protected method  MemberwiseClone
Creates a shallow copy of the current Object.
(Inherited from Object.)
Public method  ToString
Returns a string that represents the current object.
(Inherited from Object.)
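As a complement to the listing above, the following hedged sketch exercises the Learn overloads (labels given as booleans or as integer class indices) together with Clone; the two-point dataset is illustrative, and the cast after Clone is an assumption based on its object return type.

// Sketch of the members listed above (illustrative two-point dataset):
double[][] x = { new double[] { 0 }, new double[] { 1 } };

var teacher = new StochasticGradientDescent<Linear, double[], LogisticLoss>() { LearningRate = 1e-3 };

// Clone() is assumed to return object, so a cast is used to copy the configured teacher:
var second = (StochasticGradientDescent<Linear, double[], LogisticLoss>)teacher.Clone();

// Labels may be passed as booleans or as 0/1 integer class indices:
var svmFromBools = teacher.Learn(x, new[] { false, true });
var svmFromInts = second.Learn(x, new[] { 0, 1 });
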
Extension Methods
  Name  Description
Public Extension Method  HasMethod
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)
Public Extension Method  IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
Public Extension Method  To(Type)  Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Public Extension Method  To&lt;T&gt;  Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Examples
// Required namespaces (assumed, based on the Accord.NET 3.8 layout):
// using Accord.DataSets;
// using Accord.MachineLearning;
// using Accord.MachineLearning.VectorMachines.Learning;
// using Accord.Math.Optimization.Losses;
// using Accord.Statistics.Analysis;
// using Accord.Statistics.Kernels;

// In this example, we will learn a multi-class SVM using the one-vs-one (OvO)
// approach. The OvO approach can decompose decision problems involving multiple 
// classes into a series of binary ones, which can then be solved using SVMs.

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier
// for the Fisher Iris Flower dataset
var iris = new Iris();
double[][] inputs = iris.Instances; // get the flower characteristics
int[] outputs = iris.ClassLabels;   // get the expected flower classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
   shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// Now, we can create a multi-class teaching algorithm for the SVMs
var teacher = new MulticlassSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new StochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3, 
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results. Please remove it to enable full parallelization
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
{
    teacher.Learn(batch.Inputs, batch.Outputs);
}

// Get the final model:
var svm = teacher.Model;

// Now, we should be able to use the model to predict 
// the classes of all flowers in Fisher's Iris dataset:
int[] prediction = svm.Decide(inputs);

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.973

// In this example, we will learn a multi-class SVM using the one-vs-rest (OvR)
// approach. The OvR approach can decompose decision problems involving multiple 
// classes into a series of binary ones, which can then be solved using SVMs.

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier
// for the Fisher Iris Flower dataset
var iris = new Iris();
double[][] inputs = iris.Instances; // get the flower characteristics
int[] outputs = iris.ClassLabels;   // get the expected flower classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
   shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// Now, we can create a multi-label teaching algorithm for the SVMs
var teacher = new MultilabelSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new StochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results. Please remove it to enable full parallelization
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
{
    teacher.Learn(batch.Inputs, batch.Outputs);
}

// Get the final model:
var svm = teacher.Model;

// Now, we should be able to use the model to predict 
// the classes of all flowers in Fisher's Iris dataset:
int[] prediction = svm.ToMulticlass().Decide(inputs);

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.913
See Also