AveragedStochasticGradientDescent&lt;TKernel, TInput, TLoss&gt; Class
Namespace: Accord.MachineLearning.VectorMachines.Learning

```csharp
public sealed class AveragedStochasticGradientDescent<TKernel, TInput, TLoss> :
    BaseAveragedStochasticGradientDescent<SupportVectorMachine<TKernel, TInput>, TKernel, TInput, TLoss>
    where TKernel : struct, new(), ILinear<TInput>
    where TInput : IList, ICloneable
    where TLoss : struct, new(), IDifferentiableLoss<bool, double, double>
```

The AveragedStochasticGradientDescent&lt;TKernel, TInput, TLoss&gt; type exposes the following members.
Name | Description
---|---
AveragedStochasticGradientDescent&lt;TKernel, TInput, TLoss&gt; | Initializes a new instance of the AveragedStochasticGradientDescent&lt;TKernel, TInput, TLoss&gt; class.
Name | Description
---|---
CurrentEpoch | Gets or sets the current epoch counter. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Iterations | Obsolete. Please use MaxIterations instead. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Kernel | Gets or sets the kernel function used to create a kernel Support Vector Machine. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Lambda | Gets or sets the lambda regularization term. Default is 0.5. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
LearningRate | Gets or sets the learning rate for the SGD algorithm. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Loss | Gets or sets the loss function to be used. Default is to use the LogisticLoss. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
MaxIterations | Gets or sets the number of iterations that should be performed by the algorithm when calling Learn(TInput, Boolean, Double). Default is 0 (iterate until convergence). (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Model | Gets or sets the classifier being learned. (Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
ParallelOptions | Gets or sets the parallelization options for this algorithm. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Token | Gets or sets a cancellation token that can be used to cancel the algorithm while it is running. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Tolerance | Gets or sets the maximum relative change in the watched value after an iteration of the algorithm, used to detect convergence. Default is 1e-3. If set to 0, the loss will not be computed during learning and execution will be faster. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
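The interplay of the Lambda, LearningRate, Loss, and MaxIterations properties above can be illustrated with a small, library-independent sketch of the averaged-SGD idea this class implements: after each per-sample gradient step, a running average of the weight iterates is maintained, and the averaged weights define the final classifier. This is only a sketch in Python with a logistic loss; the variable names, constants, and toy data are illustrative and are not part of the Accord.NET API.

```python
import numpy as np

def averaged_sgd_logistic(X, y, lambda_=0.5, learning_rate=1e-3, max_iter=10):
    """Averaged SGD for an L2-regularized logistic loss (sketch).

    y is expected in {-1, +1}. Returns the running average of the
    weight iterates (Polyak-Ruppert averaging), not the last iterate.
    """
    n, d = X.shape
    w = np.zeros(d)       # current iterate
    w_avg = np.zeros(d)   # running average of all iterates seen so far
    seen = 0
    for epoch in range(max_iter):
        for i in np.random.permutation(n):
            margin = y[i] * X[i].dot(w)
            # gradient of log(1 + exp(-margin)) plus the L2 (lambda) term
            g = -y[i] * X[i] / (1.0 + np.exp(margin)) + lambda_ * w
            w -= learning_rate * g
            seen += 1
            w_avg += (w - w_avg) / seen   # incremental mean of iterates
    return w_avg

# usage: a tiny, well-separated two-class problem
np.random.seed(0)
X = np.vstack([np.random.normal(-2, 1, (50, 2)),
               np.random.normal(+2, 1, (50, 2))])
y = np.array([-1] * 50 + [+1] * 50)
w = averaged_sgd_logistic(X, y, lambda_=0.01, learning_rate=0.05, max_iter=20)
accuracy = np.mean(np.sign(X.dot(w)) == y)
```

Averaging the iterates rather than keeping only the last one is what distinguishes this learner from plain SGD: it damps the noise of the per-sample updates without requiring the learning rate to decay as aggressively.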
Name | Description
---|---
Clone | Creates a new object that is a copy of the current instance. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.)
GetHashCode | Serves as the default hash function. (Inherited from Object.)
GetType | Gets the Type of the current instance. (Inherited from Object.)
Learn(TInput, Boolean, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Learn(TInput, Double, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Learn(TInput, Int32, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Learn(TInput, Int32, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BinaryLearningBase&lt;TModel, TInput&gt;.)
Learn(TInput, Boolean, Double) | Learns a model that can map the given inputs to the given outputs. (Inherited from BaseAveragedStochasticGradientDescent&lt;TModel, TKernel, TInput, TLoss&gt;.)
ToString | Returns a string that represents the current object. (Inherited from Object.)
Name | Description
---|---
HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To&lt;T&gt; | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
```csharp
// In this example, we will learn a multi-class SVM using the one-vs-one (OvO)
// approach. The OvO approach can decompose decision problems involving multiple
// classes into a series of binary ones, which can then be solved using SVMs.

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier
// for the Fisher Iris Flower dataset
var iris = new Iris();
double[][] inputs = iris.Instances; // get the flower characteristics
int[] outputs = iris.ClassLabels;   // get the expected flower classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
    shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// Now, we can create a multi-class teaching algorithm for the SVMs
var teacher = new MulticlassSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new AveragedStochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results.
// Remove, comment, or change it to enable full parallelization:
teacher.ParallelOptions.MaxDegreeOfParallelism = 1;

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
{
    teacher.Learn(batch.Inputs, batch.Outputs);
}

// Get the final model:
var svm = teacher.Model;

// Now, we should be able to use the model to predict
// the classes of all flowers in Fisher's Iris dataset:
int[] prediction = svm.Decide(inputs);

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.973
```
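The one-vs-one decomposition used above trains one binary machine per unordered pair of classes, so k classes yield k(k-1)/2 machines; at prediction time, the pairwise decisions vote for a class. The pairing can be sketched independently of the library (Python, illustrative only):

```python
from itertools import combinations

def ovo_pairs(n_classes):
    # one binary sub-problem per unordered pair of classes
    return list(combinations(range(n_classes), 2))

pairs = ovo_pairs(3)  # Iris has 3 classes
# -> [(0, 1), (0, 2), (1, 2)]: 3 binary machines
```

For the 3-class Iris problem this means only 3 binary SVMs, but the count grows quadratically with the number of classes, which is why one-vs-rest (shown next) is sometimes preferred for many-class problems.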
```csharp
// In this example, we will learn a multi-class SVM using the one-vs-rest (OvR)
// approach. The OvR approach can decompose decision problems involving multiple
// classes into a series of binary ones, which can then be solved using SVMs.

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier
// for the Fisher Iris Flower dataset
var iris = new Iris();
double[][] inputs = iris.Instances; // get the flower characteristics
int[] outputs = iris.ClassLabels;   // get the expected flower classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
    shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// Now, we can create a multi-label teaching algorithm for the SVMs
var teacher = new MultilabelSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new AveragedStochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results.
// Remove, comment, or change it to enable full parallelization:
teacher.ParallelOptions.MaxDegreeOfParallelism = 1;

// Now, we can start training the model on mini-batches:
foreach (var batch in batches)
{
    teacher.Learn(batch.Inputs, batch.Outputs);
}

// Get the final model:
var svm = teacher.Model;

// Now, we should be able to use the model to predict
// the classes of all flowers in Fisher's Iris dataset:
int[] prediction = svm.ToMulticlass().Decide(inputs);

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.913
```
```csharp
// In this example, we will show how it is possible to learn a
// non-linear SVM using a linear algorithm by using an explicit
// expansion of the kernel function:

// Ensure we have reproducible results
Accord.Math.Random.Generator.Seed = 0;

// We will try to learn a classifier for the
// Wisconsin Diagnostic Breast Cancer dataset
var cancer = new WisconsinDiagnosticBreastCancer();
double[][] inputs = cancer.Features; // get the sample characteristics
int[] outputs = cancer.ClassLabels;  // get the expected sample classes

// We will use mini-batches of size 32 to learn an SVM using SGD
var batches = MiniBatches.Create(batchSize: 32, maxIterations: 1000,
    shuffle: ShuffleMethod.EveryEpoch, input: inputs, output: outputs);

// We will use an explicit Polynomial kernel expansion
var polynomial = new Polynomial(2);

// Now, we can create a multi-class teaching algorithm for the SVMs
var teacher = new MulticlassSupportVectorLearning<Linear, double[]>
{
    // We will use SGD to learn each of the binary problems in the multi-class problem
    Learner = (p) => new AveragedStochasticGradientDescent<Linear, double[], LogisticLoss>()
    {
        LearningRate = 1e-3,
        MaxIterations = 1 // so the gradient is only updated once after each mini-batch
    }
};

// The following line is only needed to ensure reproducible results.
// Remove, comment, or change it to enable full parallelization:
teacher.ParallelOptions.MaxDegreeOfParallelism = 1;

// Now, we can start training the model on mini-batches
// of inputs transformed through the explicit expansion:
foreach (var batch in batches)
{
    teacher.Learn(polynomial.Transform(batch.Inputs), batch.Outputs);
}

// Get the final model:
var svm = teacher.Model;

// The following line is only needed to ensure reproducible results.
// Remove, comment, or change it to enable full parallelization:
svm.ParallelOptions.MaxDegreeOfParallelism = 1;

// Now, we should be able to use the model to predict
// the classes of all samples in the dataset:
int[] prediction = svm.Decide(polynomial.Transform(inputs));

// And from those predictions, we can compute the model accuracy:
var cm = new GeneralConfusionMatrix(expected: outputs, predicted: prediction);
double accuracy = cm.Accuracy; // should be approximately 0.92
```
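The explicit kernel expansion used in the last example replaces the kernel trick with a feature-space transform: each sample is mapped to the monomials of its features up to degree 2, so a linear machine trained on the expanded features behaves like a polynomial-kernel machine. A minimal, library-independent sketch of that idea follows (Python, illustrative only; the exact expansion computed by Accord's Polynomial(2).Transform may additionally include constant and scaling terms):

```python
import numpy as np
from itertools import combinations_with_replacement

def polynomial_expand(X, degree=2):
    """Map each sample to all monomials of its features up to `degree`.

    For d features and degree 2 this yields the d linear terms plus
    the d*(d+1)/2 quadratic terms (squares and pairwise products).
    """
    terms = []
    d = X.shape[1]
    for deg in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), deg):
            terms.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(terms)

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
Z = polynomial_expand(X, degree=2)
# for 2 features at degree 2: columns are x1, x2, x1^2, x1*x2, x2^2
```

The trade-off is memory for generality: the expanded representation grows polynomially with the number of features, but once it is materialized, any fast linear learner such as this SGD class can be used in place of a slower kernelized solver.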