MulticlassSupportVectorLearning Class
Note: This API is now obsolete.
Namespace: Accord.MachineLearning.VectorMachines.Learning
```csharp
[ObsoleteAttribute("Please specify the desired kernel function as a template parameter.")]
public class MulticlassSupportVectorLearning : BaseMulticlassSupportVectorLearning<double[], SupportVectorMachine<IKernel<double[]>>, IKernel<double[]>, MulticlassSupportVectorMachine>
```
The MulticlassSupportVectorLearning type exposes the following members.
Constructors

Name | Description | |
---|---|---|
MulticlassSupportVectorLearning |
Initializes a new instance of the MulticlassSupportVectorLearning class.
| |
MulticlassSupportVectorLearning(MulticlassSupportVectorMachine, Double, Int32) | Obsolete. |
Properties

Name | Description | |
---|---|---|
AggregateExceptions |
Gets or sets a value indicating whether an exception detected in one of the inner binary learning problems should be aggregated (letting the remaining sub-problems continue training) instead of stopping the entire training algorithm. Default is true (execution will not be stopped).
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
Algorithm | Obsolete. | |
Kernel |
Gets or sets the kernel function to be used to learn the kernel support vector machines.
| |
Learner |
Gets or sets a function that takes a set of parameters and creates
a learning algorithm for learning each of the binary inner classifiers
needed by the one-vs-one classification strategy.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
Model |
Gets or sets the model being learned.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
ParallelOptions |
Gets or sets the parallelization options for this algorithm.
(Inherited from ParallelLearningBase.) | |
Token |
Gets or sets a cancellation token that can be used to cancel the algorithm while it is running; see the configuration sketch after this table.
(Inherited from ParallelLearningBase.) |
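The inherited configuration properties above (ParallelOptions, Token, and AggregateExceptions) can be set on the teacher before training starts. The following is only a minimal sketch, assuming the data and teacher are set up as in the examples later on this page; the CancellationTokenSource is standard .NET and not part of this class.

```csharp
using System.Threading;
using Accord.MachineLearning.VectorMachines.Learning;
using Accord.Statistics.Kernels;

// Assume 'inputs' and 'outputs' are defined as in the examples below.
var teacher = new MulticlassSupportVectorLearning<Gaussian>()
{
    Learner = (p) => new SequentialMinimalOptimization<Gaussian>()
    {
        UseKernelEstimation = true
    }
};

// Limit how many binary sub-problems are trained in parallel
teacher.ParallelOptions.MaxDegreeOfParallelism = 2;

// Keep training the remaining sub-problems even if one of them throws (the default)
teacher.AggregateExceptions = true;

// Allow the (possibly long-running) training to be cancelled from another thread
var cts = new CancellationTokenSource();
teacher.Token = cts.Token;

// var machine = teacher.Learn(inputs, outputs);
```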
Methods

Name | Description | |
---|---|---|
ComputeError | Obsolete.
Computes the error ratio: the number of misclassifications divided by the total number of samples in a dataset. A non-obsolete alternative using ZeroOneLoss is sketched after this table.
| |
Configure<TResult>(Func<TResult>) |
Sets a callback function that takes a set of parameters and creates a learning algorithm for learning each of the binary inner classifiers needed by the one-vs-one classification strategy. Calling this method sets the Learner property.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
Configure<T, TResult>(Func<T, TResult>) |
Sets a callback function that takes a set of parameters and creates a learning algorithm for learning each of the binary inner classifiers needed by the one-vs-one classification strategy. Calling this method sets the Learner property.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
Convert |
Converts a SupportVectorMachineLearningConfigurationFunction into a lambda function that can be passed to the Learner property of a MulticlassSupportVectorLearning learning algorithm.
| |
Create |
Creates an instance of the model to be learned. Inheritors
of this abstract class must define this method so new models
can be created from the training data.
(Overrides OneVsOneLearning<TInput, TBinary, TModel>.Create(Int32, Int32).) | |
Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.) | |
Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.) | |
GetHashCode | Serves as the default hash function. (Inherited from Object.) | |
GetType | Gets the Type of the current instance. (Inherited from Object.) | |
Learn |
Learns a model that can map the given inputs to the given outputs.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.) | |
OnSubproblemFinished |
Raises the SubproblemFinished event.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
OnSubproblemStarted |
Raises the SubproblemStarted event.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
Run | Obsolete. | |
ToString | Returns a string that represents the current object. (Inherited from Object.) |
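The ComputeError method above is obsolete. In the current API, the same error ratio can be obtained with the ZeroOneLoss class that also appears in the examples below; a minimal sketch, assuming a machine has already been learned:

```csharp
using Accord.Math.Optimization.Losses;

// Assume 'machine', 'inputs' and 'outputs' are defined as in the examples below.
int[] predicted = machine.Decide(inputs);

// Error ratio: number of misclassifications divided by the number of samples
double error = new ZeroOneLoss(outputs).Loss(predicted);
```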
Events

Name | Description | |
---|---|---|
SubproblemFinished |
Occurs when the learning of a subproblem has finished.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) | |
SubproblemStarted |
Occurs when the learning of a subproblem has started; see the progress-reporting sketch after this table.
(Inherited from OneVsOneLearning<TInput, TBinary, TModel>.) |
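Both events above can be used to report progress while the one-vs-one binary sub-problems are being trained. A minimal sketch (the handler bodies are purely illustrative, and the teacher is assumed to be configured as in the examples below):

```csharp
// Assume 'teacher' is a MulticlassSupportVectorLearning<Gaussian> as in the examples below.
teacher.SubproblemStarted += (sender, e) =>
    Console.WriteLine("Started training a binary sub-problem...");

teacher.SubproblemFinished += (sender, e) =>
    Console.WriteLine("Finished training a binary sub-problem.");

// var machine = teacher.Learn(inputs, outputs);
```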
Extension Methods

Name | Description | |
---|---|---|
HasMethod |
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.) | |
IsEqual |
Compares two objects for equality, performing an elementwise
comparison if the elements are vectors or matrices.
(Defined by Matrix.) | |
To(Type) | Overloaded.
Converts an object into another type, irrespective of whether
the conversion can be done at compile time or not. This can be
used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.) | |
To<T> | Overloaded.
Converts an object into another type, irrespective of whether
the conversion can be done at compile time or not. This can be
used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.) |
This class can be used to train kernel support vector machines with any algorithm using a one-against-one (one-vs-one) strategy. The underlying training algorithm can be configured through the Learner property (the older Algorithm property is obsolete).
One example of a learning algorithm that can be used with this class is the Sequential Minimal Optimization (SMO) algorithm.
The following example shows how to learn a linear, multi-class support vector machine using the LinearDualCoordinateDescent algorithm.
```csharp
// Let's say we have the following data to be classified
// into three possible classes. Those are the samples:
//
double[][] inputs =
{
    //       input                   output
    new double[] { 0, 1, 1, 0 }, //     0
    new double[] { 0, 1, 0, 0 }, //     0
    new double[] { 0, 0, 1, 0 }, //     0
    new double[] { 0, 1, 1, 0 }, //     0
    new double[] { 0, 1, 0, 0 }, //     0
    new double[] { 1, 0, 0, 0 }, //     1
    new double[] { 1, 0, 0, 0 }, //     1
    new double[] { 1, 0, 0, 1 }, //     1
    new double[] { 0, 0, 0, 1 }, //     1
    new double[] { 0, 0, 0, 1 }, //     1
    new double[] { 1, 1, 1, 1 }, //     2
    new double[] { 1, 0, 1, 1 }, //     2
    new double[] { 1, 1, 0, 1 }, //     2
    new double[] { 0, 1, 1, 1 }, //     2
    new double[] { 1, 1, 1, 1 }, //     2
};

int[] outputs = // those are the class labels
{
    0, 0, 0, 0, 0,
    1, 1, 1, 1, 1,
    2, 2, 2, 2, 2,
};

// Create a one-vs-one multi-class SVM learning algorithm
var teacher = new MulticlassSupportVectorLearning<Linear>()
{
    // using LIBLINEAR's L2-loss SVC dual for each SVM
    Learner = (p) => new LinearDualCoordinateDescent()
    {
        Loss = Loss.L2
    }
};

// The following line is only needed to ensure reproducible results.
// Please remove it to enable full parallelization.
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Learn a machine
var machine = teacher.Learn(inputs, outputs);

// Obtain class predictions for each sample
int[] predicted = machine.Decide(inputs);

// Compute classification error
double error = new ZeroOneLoss(outputs).Loss(predicted);
```
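Once learned, the machine can also classify new, previously unseen samples. The short follow-up below reuses the machine from the example above; the GeneralConfusionMatrix class from Accord.Statistics.Analysis is used here only as a convenient summary, and this is a sketch rather than part of the original example.

```csharp
using Accord.Statistics.Analysis;

// Classify a single, previously unseen sample (pattern resembling class 1 above)
int label = machine.Decide(new double[] { 1, 0, 0, 0 });

// Summarize the training-set predictions in a confusion matrix
var cm = new GeneralConfusionMatrix(classes: 3, expected: outputs, predicted: predicted);
double accuracy = cm.Accuracy; // fraction of correctly classified samples
```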
The following example shows how to learn a non-linear, multi-class support vector machine using the Gaussian kernel and the SequentialMinimalOptimization algorithm.
```csharp
// Let's say we have the following data to be classified
// into three possible classes. Those are the samples:
//
double[][] inputs =
{
    //       input                   output
    new double[] { 0, 1, 1, 0 }, //     0
    new double[] { 0, 1, 0, 0 }, //     0
    new double[] { 0, 0, 1, 0 }, //     0
    new double[] { 0, 1, 1, 0 }, //     0
    new double[] { 0, 1, 0, 0 }, //     0
    new double[] { 1, 0, 0, 0 }, //     1
    new double[] { 1, 0, 0, 0 }, //     1
    new double[] { 1, 0, 0, 1 }, //     1
    new double[] { 0, 0, 0, 1 }, //     1
    new double[] { 0, 0, 0, 1 }, //     1
    new double[] { 1, 1, 1, 1 }, //     2
    new double[] { 1, 0, 1, 1 }, //     2
    new double[] { 1, 1, 0, 1 }, //     2
    new double[] { 0, 1, 1, 1 }, //     2
    new double[] { 1, 1, 1, 1 }, //     2
};

int[] outputs = // those are the class labels
{
    0, 0, 0, 0, 0,
    1, 1, 1, 1, 1,
    2, 2, 2, 2, 2,
};

// Create the multi-class learning algorithm for the machine
var teacher = new MulticlassSupportVectorLearning<Gaussian>()
{
    // Configure the learning algorithm to use SMO to train the
    // underlying SVMs in each of the binary class subproblems.
    Learner = (param) => new SequentialMinimalOptimization<Gaussian>()
    {
        // Estimate a suitable guess for the Gaussian kernel's parameters.
        // This estimate can serve as a starting point for a grid search.
        UseKernelEstimation = true
    }
};

// The following line is only needed to ensure reproducible results.
// Please remove it to enable full parallelization.
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Learn a machine
var machine = teacher.Learn(inputs, outputs);

// Obtain class predictions for each sample
int[] predicted = machine.Decide(inputs);

// Get class scores for each sample
double[] scores = machine.Score(inputs);

// Compute classification error
double error = new ZeroOneLoss(outputs).Loss(predicted);
```
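Instead of relying on UseKernelEstimation, the Gaussian kernel and the SVM complexity (C) can also be fixed explicitly inside the Learner factory. This is only a sketch with placeholder values, assuming the Kernel and Complexity properties of SequentialMinimalOptimization; the numbers are not tuned settings for this dataset.

```csharp
// Variation of the example above: set the kernel and complexity explicitly
var teacher = new MulticlassSupportVectorLearning<Gaussian>()
{
    Learner = (param) => new SequentialMinimalOptimization<Gaussian>()
    {
        Kernel = new Gaussian(1.0), // placeholder sigma; tune e.g. by grid search
        Complexity = 100            // placeholder value for the SVM's C parameter
    }
};
```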
Support vector machines can have their weights calibrated in order to produce probability estimates (instead of simple class separation distances). The following example shows how to use ProbabilisticOutputCalibration within MulticlassSupportVectorLearning to generate a probabilistic SVM:
```csharp
// Let's say we have the following data to be classified
// into three possible classes. Those are the samples:
//
double[][] inputs =
{
    //       input                   output
    new double[] { 0, 1, 1, 0 }, //     0
    new double[] { 0, 1, 0, 0 }, //     0
    new double[] { 0, 0, 1, 0 }, //     0
    new double[] { 0, 1, 1, 0 }, //     0
    new double[] { 0, 1, 0, 0 }, //     0
    new double[] { 1, 0, 0, 1 }, //     1
    new double[] { 0, 0, 0, 1 }, //     1
    new double[] { 0, 0, 0, 1 }, //     1
    new double[] { 1, 0, 1, 1 }, //     2
    new double[] { 1, 1, 0, 1 }, //     2
    new double[] { 0, 1, 1, 1 }, //     2
    new double[] { 1, 1, 1, 1 }, //     2
};

int[] outputs = // those are the class labels
{
    0, 0, 0, 0, 0,
    1, 1, 1,
    2, 2, 2, 2,
};

// Create the multi-class learning algorithm for the machine
var teacher = new MulticlassSupportVectorLearning<Gaussian>()
{
    // Configure the learning algorithm to use SMO to train the
    // underlying SVMs in each of the binary class subproblems.
    Learner = (param) => new SequentialMinimalOptimization<Gaussian>()
    {
        // Estimate a suitable guess for the Gaussian kernel's parameters.
        // This estimate can serve as a starting point for a grid search.
        UseKernelEstimation = true
    }
};

// Learn a machine
var machine = teacher.Learn(inputs, outputs);

// Create the multi-class learning algorithm for the machine
var calibration = new MulticlassSupportVectorLearning<Gaussian>()
{
    Model = machine, // We will start with an existing machine

    // Configure the learning algorithm to use Platt's calibration
    Learner = (param) => new ProbabilisticOutputCalibration<Gaussian>()
    {
        Model = param.Model // Start with an existing machine
    }
};

// Configure parallel execution options
calibration.ParallelOptions.MaxDegreeOfParallelism = 1;

// Learn a machine
calibration.Learn(inputs, outputs);

// Obtain class predictions for each sample
int[] predicted = machine.Decide(inputs);

// Get class scores for each sample
double[] scores = machine.Score(inputs);

// Get log-likelihoods (should be same as scores)
double[][] logl = machine.LogLikelihoods(inputs);

// Get probability for each sample
double[][] prob = machine.Probabilities(inputs);

// Compute classification error
double error = new ZeroOneLoss(outputs).Loss(predicted);
double loss = new CategoryCrossEntropyLoss(outputs).Loss(prob);
```
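After calibration, the machine can be persisted and loaded back like other Accord.NET models. The sketch below assumes the Serializer helper from the Accord.IO namespace is available; the file path is only a placeholder.

```csharp
using Accord.IO;
using Accord.MachineLearning.VectorMachines;
using Accord.Statistics.Kernels;

// Persist the calibrated machine to disk (path is a placeholder)
Serializer.Save(machine, "multiclass-svm.bin");

// Later, load it back and use it exactly as before
var loaded = Serializer.Load<MulticlassSupportVectorMachine<Gaussian>>("multiclass-svm.bin");
double[][] probabilities = loaded.Probabilities(inputs);
```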