StochasticGradientDescent<TKernel, TInput, TLoss> Class

Namespace: Accord.MachineLearning.VectorMachines.Learning

```csharp
public class StochasticGradientDescent<TKernel, TInput, TLoss>
    : BaseStochasticGradientDescent<SupportVectorMachine<TKernel, TInput>, TKernel, TInput, TLoss>
    where TKernel : struct, ILinear<TInput>
    where TInput : IList, ICloneable
    where TLoss : struct, IDifferentiableLoss<bool, double, double>
```

The StochasticGradientDescent<TKernel, TInput, TLoss> type exposes the following members.
Name | Description
---|---
StochasticGradientDescent<TKernel, TInput, TLoss> | Initializes a new instance of the StochasticGradientDescent<TKernel, TInput, TLoss> class.
Name | Description
---|---
Iterations | Obsolete. Please use MaxIterations instead. (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
Kernel | Gets or sets the kernel function used to create a kernel Support Vector Machine. (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
Lambda | Gets or sets the lambda regularization term. Default is 0.5. (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
LearningRate | Gets or sets the learning rate for the SGD algorithm. (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
Loss | Gets or sets the loss function to be used. Default is the LogisticLoss. (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
MaxIterations | Gets or sets the number of iterations that should be performed by the algorithm when calling Learn(TInput, Boolean, Double). Default is 0 (iterate until convergence). (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
Model | Gets or sets the classifier being learned. (Inherited from BinaryLearningBase<TModel, TInput>.)
Token | Gets or sets a cancellation token that can be used to stop the learning algorithm while it is running. (Inherited from BinaryLearningBase<TModel, TInput>.)
Tolerance | Gets or sets the maximum relative change in the watched value after an iteration of the algorithm, used to detect convergence. Default is 1e-5. (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
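Taken together, these properties control the optimization schedule. As a minimal configuration sketch (assuming the Accord.NET namespaces, a Linear kernel, and the LogisticLoss named above; exact type availability may vary by framework version):

```csharp
// Hypothetical configuration sketch -- the property names and defaults come
// from the table above; the concrete type arguments are assumed.
var sgd = new StochasticGradientDescent<Linear, double[], LogisticLoss>()
{
    LearningRate = 1e-3,  // step size for each stochastic gradient update
    Lambda = 0.5,         // regularization term (the documented default)
    MaxIterations = 0,    // 0 = iterate until convergence...
    Tolerance = 1e-5      // ...as measured by this relative-change threshold
};
```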
Name | Description
---|---
Clone | Creates a new object that is a copy of the current instance. (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
Create | Creates an instance of the model to be learned. Inheritors of this abstract class must define this method so new models can be created from the training data. (Overrides BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.Create(Int32, TKernel).)
Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.)
Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)
GetHashCode | Serves as the default hash function. (Inherited from Object.)
GetType | Gets the Type of the current instance. (Inherited from Object.)
InnerClone | Inheritors should implement this function to produce a new instance with the same characteristics of the current object. (Overrides BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.InnerClone().)
Learn(TInput, Boolean, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BinaryLearningBase<TModel, TInput>.)
Learn(TInput, Double, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BinaryLearningBase<TModel, TInput>.)
Learn(TInput, Int32, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BinaryLearningBase<TModel, TInput>.)
Learn(TInput, Boolean, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BaseStochasticGradientDescent<TModel, TKernel, TInput, TLoss>.)
MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.)
ToString | Returns a string that represents the current object. (Inherited from Object.)
Name | Description
---|---
HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To<T> | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
```csharp
// In this example, we will learn a multi-class SVM using the one-vs-one (OvO)
// approach. The OvO approach can decompose decision problems involving multiple
// classes into a series of binary ones, which can then be solved using SVMs.

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier
// for the Fisher Iris Flower dataset
var iris = new Iris();
double[][] inputs = iris.Instances; // get the flower characteristics
int[] outputs = iris.ClassLabels;   // get the expected flower classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
    shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// Now, we can create a multi-class teaching algorithm for the SVMs
var teacher = new MulticlassSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new StochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results.
// Remove, comment, or change it to enable full parallelization.
teacher.ParallelOptions.MaxDegreeOfParallelism = 1;

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
    teacher.Learn(batch.Inputs, batch.Outputs);

// Get the final model:
var svm = teacher.Model;

// Now, we should be able to use the model to predict
// the classes of all flowers in Fisher's Iris dataset:
int[] prediction = svm.Decide(inputs);

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.973
```
```csharp
// In this example, we will learn a multi-class SVM using the one-vs-rest (OvR)
// approach. The OvR approach can decompose decision problems involving multiple
// classes into a series of binary ones, which can then be solved using SVMs.

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier
// for the Fisher Iris Flower dataset
var iris = new Iris();
double[][] inputs = iris.Instances; // get the flower characteristics
int[] outputs = iris.ClassLabels;   // get the expected flower classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
    shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// Now, we can create a multi-label teaching algorithm for the SVMs
var teacher = new MultilabelSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new StochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results.
// Remove, comment, or change it to enable full parallelization.
teacher.ParallelOptions.MaxDegreeOfParallelism = 1;

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
    teacher.Learn(batch.Inputs, batch.Outputs);

// Get the final model:
var svm = teacher.Model;

// Now, we should be able to use the model to predict
// the classes of all flowers in Fisher's Iris dataset:
int[] prediction = svm.ToMulticlass().Decide(inputs);

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.913
```
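The two examples above wrap this class inside a multi-class learner, but for a plain two-class problem it can also be used directly. The following is a minimal sketch, not a verified listing: it reuses the Iris setup from the examples above, and the one-class-vs-rest label construction is a hypothetical illustration.

```csharp
// Minimal binary sketch (assumed usage, following the examples above):
// learn a linear SVM that separates one class from the rest.
bool[] binaryOutputs = outputs.Select(c => c == 0).ToArray(); // hypothetical: class 0 vs. rest

var teacher = new StochasticGradientDescent<Linear, double[], LogisticLoss>()
{
    LearningRate = 1e-3
};

// Learn(TInput, Boolean, Double) trains and returns the support vector machine:
var svm = teacher.Learn(inputs, binaryOutputs);

// Decide produces a boolean class assignment for each input:
bool[] predicted = svm.Decide(inputs);
```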