
MultilabelSupportVectorLearning<TKernel, TInput> Class

One-against-all Multi-label Support Vector Machine Learning Algorithm
Inheritance Hierarchy
System.Object
  Accord.MachineLearning.ParallelLearningBase
    Accord.MachineLearning.OneVsRestLearning<TInput, SupportVectorMachine<TKernel, TInput>, MultilabelSupportVectorMachine<TKernel, TInput>>
      Accord.MachineLearning.VectorMachines.Learning.BaseMultilabelSupportVectorLearning<TInput, SupportVectorMachine<TKernel, TInput>, TKernel, MultilabelSupportVectorMachine<TKernel, TInput>>
        Accord.MachineLearning.VectorMachines.Learning.MultilabelSupportVectorLearning<TKernel, TInput>

Namespace:  Accord.MachineLearning.VectorMachines.Learning
Assembly:  Accord.MachineLearning (in Accord.MachineLearning.dll) Version: 3.8.0
Syntax
public class MultilabelSupportVectorLearning<TKernel, TInput> :
    BaseMultilabelSupportVectorLearning<TInput, SupportVectorMachine<TKernel, TInput>, TKernel, MultilabelSupportVectorMachine<TKernel, TInput>>
    where TKernel : IKernel<TInput>
    where TInput : ICloneable

Type Parameters

TKernel
The type of the kernel function to be used to learn the support vector machines.
TInput
The type of the input data handled by the machines (for example, double[]).

The MultilabelSupportVectorLearning<TKernel, TInput> type exposes the following members.

Constructors
  Name    Description
Public method  MultilabelSupportVectorLearning<TKernel, TInput>()
Initializes a new instance of the MultilabelSupportVectorLearning<TKernel, TInput> class.
Public method  MultilabelSupportVectorLearning<TKernel, TInput>(MultilabelSupportVectorMachine<TKernel, TInput>)
Initializes a new instance of the MultilabelSupportVectorLearning<TKernel, TInput> class.
Properties
  Name    Description
Public property  AggregateExceptions
Gets or sets a value indicating whether exceptions detected in one of the inner binary learning problems should be aggregated instead of stopping the entire training algorithm. Default is true (execution will not be stopped).
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public property  IsMultilabel (Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public property  Kernel
Gets or sets the kernel function to be used to learn the kernel support vector machines.
(Inherited from BaseMultilabelSupportVectorLearning<TInput, TBinary, TKernel, TModel>.)
Public property  Learner
Gets or sets a function that takes a set of parameters and creates a learning algorithm for learning each of the binary inner classifiers needed by the one-vs-rest classification strategy.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public property  Model
Gets or sets the model being learned.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public property  ParallelOptions
Gets or sets the parallelization options for this algorithm.
(Inherited from ParallelLearningBase.)
Public property  Token
Gets or sets a cancellation token that can be used to cancel the algorithm while it is running (a short usage sketch follows this table).
(Inherited from ParallelLearningBase.)
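The inherited ParallelOptions and Token properties can be combined to bound parallelism and to support cooperative cancellation. The following is a minimal sketch, not part of the original reference; the Linear kernel and LinearDualCoordinateDescent learner below are illustrative assumptions only.

// Sketch: bounding parallelism and enabling cancellation for a teacher.
// The kernel and learner choices here are assumptions for illustration.
var cancellationSource = new System.Threading.CancellationTokenSource();

var teacher = new MultilabelSupportVectorLearning<Linear>()
{
    Learner = (p) => new LinearDualCoordinateDescent()
};

// Train at most two inner binary subproblems at the same time
teacher.ParallelOptions.MaxDegreeOfParallelism = 2;

// Allow training to be cancelled from another thread (via cancellationSource.Cancel())
teacher.Token = cancellationSource.Token;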
Methods
  Name    Description
Public method  Configure<TResult>(Func<TResult>)
Sets a callback function that takes a set of parameters and creates a learning algorithm for learning each of the binary inner classifiers needed by the one-vs-rest classification strategy. Calling this method sets the Learner property.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public method  Configure<T, TResult>(Func<T, TResult>)
Sets a callback function that takes a set of parameters and creates a learning algorithm for learning each of the binary inner classifiers needed by the one-vs-rest classification strategy. Calling this method sets the Learner property.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Protected method  Create
Creates an instance of the model to be learned. Inheritors of this abstract class must define this method so new models can be created from the training data.
(Overrides OneVsRestLearning<TInput, TBinary, TModel>.Create(Int32, Int32, Boolean).)
Public method  Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
Protected method  Finalize
Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection.
(Inherited from Object.)
Public method  GetHashCode
Serves as the default hash function.
(Inherited from Object.)
Public method  GetType
Gets the Type of the current instance.
(Inherited from Object.)
Public method  Learn(TInput[], Boolean[][], Double[])
Learns a model that can map the given inputs to the given outputs.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public method  Learn(TInput[], Int32[][], Double[])
Learns a model that can map the given inputs to the given outputs.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public method  Learn(TInput[], Int32[], Double[])
Learns a model that can map the given inputs to the given outputs.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Protected method  MemberwiseClone
Creates a shallow copy of the current Object.
(Inherited from Object.)
Protected method  OnSubproblemFinished
Raises the SubproblemFinished event.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Protected method  OnSubproblemStarted
Raises the SubproblemStarted event.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public method  ToString
Returns a string that represents the current object.
(Inherited from Object.)
Events
  Name    Description
Public event  SubproblemFinished
Occurs when the learning of one of the binary subproblems has finished.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Public event  SubproblemStarted
Occurs when the learning of one of the binary subproblems has started.
(Inherited from OneVsRestLearning<TInput, TBinary, TModel>.)
Extension Methods
  Name    Description
Public Extension Method  HasMethod
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)
Public Extension Method  IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
Public Extension Method  To(Type)    Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Public Extension Method  To<T>()    Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Remarks

This class can be used to train Kernel Support Vector Machines using a one-against-all (one-vs-rest) strategy. The underlying binary training algorithm can be configured by setting the Learner property.

One example of a learning algorithm that can be used with this class is the Sequential Minimal Optimization (SMO) algorithm.
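For instance, the Learner property could be configured to create one SMO instance per binary subproblem. The following is a minimal sketch (the Gaussian kernel is an illustrative assumption); the complete examples below show full training runs.

// Sketch: configuring the Learner property so that each one-vs-rest
// binary subproblem is trained with Sequential Minimal Optimization.
// The Gaussian kernel here is an assumption made for illustration.
var teacher = new MultilabelSupportVectorLearning<Gaussian>()
{
    Learner = (p) => new SequentialMinimalOptimization<Gaussian>()
};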

Examples

The following example shows how to learn a linear, multi-label (one-vs-rest) support vector machine using the LinearDualCoordinateDescent algorithm.

// Let's say we have the following data to be classified
// into three possible classes. Those are the samples:
// 
double[][] inputs =
{
    //               input         output
    new double[] { 0, 1, 1, 0 }, //  0 
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 0, 0, 1, 0 }, //  0
    new double[] { 0, 1, 1, 0 }, //  0
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 1, 0, 0, 0 }, //  1
    new double[] { 1, 0, 0, 0 }, //  1
    new double[] { 1, 0, 0, 1 }, //  1
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 1, 1, 1, 1 }, //  2
    new double[] { 1, 0, 1, 1 }, //  2
    new double[] { 1, 1, 0, 1 }, //  2
    new double[] { 0, 1, 1, 1 }, //  2
    new double[] { 1, 1, 1, 1 }, //  2
};

int[] outputs = // those are the class labels
{
    0, 0, 0, 0, 0,
    1, 1, 1, 1, 1,
    2, 2, 2, 2, 2,
};

// Create a one-vs-rest multi-label SVM learning algorithm 
var teacher = new MultilabelSupportVectorLearning<Linear>()
{
    // using LIBLINEAR's L2-loss SVC dual for each SVM
    Learner = (p) => new LinearDualCoordinateDescent()
    {
        Loss = Loss.L2
    }
};

// The following line is only needed to ensure reproducible results. Please remove it to enable full parallelization
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Learn a machine
var machine = teacher.Learn(inputs, outputs);

// Obtain class predictions for each sample
bool[][] predicted = machine.Decide(inputs);

// Compute classification error using mean accuracy (mAcc)
double error = new HammingLoss(outputs).Loss(predicted);
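
After learning, the machine can also be queried one sample at a time. The following short sketch is not part of the original example; the feature values are hypothetical.

// Sketch: classify a single, previously unseen sample.
// The feature values below are hypothetical.
double[] sample = { 0, 1, 1, 0 };

// For a multi-label machine, Decide returns one boolean per class,
// telling whether the sample is considered to belong to that class.
bool[] answer = machine.Decide(sample);

// If exactly one class is expected, its index can be recovered with:
int predictedClass = Array.IndexOf(answer, true);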

The following example shows how to learn a non-linear, multi-label (one-vs-rest) support vector machine using the Gaussian kernel and the SequentialMinimalOptimization<TKernel> algorithm.

// Let's say we have the following data to be classified
// into three possible classes. Those are the samples:
// 
double[][] inputs =
{
    //               input         output
    new double[] { 0, 1, 1, 0 }, //  0 
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 0, 0, 1, 0 }, //  0
    new double[] { 0, 1, 1, 0 }, //  0
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 1, 0, 0, 0 }, //  1
    new double[] { 1, 0, 0, 0 }, //  1
    new double[] { 1, 0, 0, 1 }, //  1
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 1, 1, 1, 1 }, //  2
    new double[] { 1, 0, 1, 1 }, //  2
    new double[] { 1, 1, 0, 1 }, //  2
    new double[] { 0, 1, 1, 1 }, //  2
    new double[] { 1, 1, 1, 1 }, //  2
};

int[] outputs = // those are the class labels
{
    0, 0, 0, 0, 0,
    1, 1, 1, 1, 1,
    2, 2, 2, 2, 2,
};

// Create the multi-label learning algorithm for the machine
var teacher = new MultilabelSupportVectorLearning<Gaussian>()
{
    // Configure the learning algorithm to use SMO to train the
    //  underlying SVMs in each of the binary class subproblems.
    Learner = (param) => new SequentialMinimalOptimization<Gaussian>()
    {
        // Estimate a suitable guess for the Gaussian kernel's parameters.
        // This estimate can serve as a starting point for a grid search.
        UseKernelEstimation = true
    }
};

// The following line is only needed to ensure reproducible results. Please remove it to enable full parallelization
teacher.ParallelOptions.MaxDegreeOfParallelism = 1; // (Remove, comment, or change this line to enable full parallelism)

// Learn a machine
var machine = teacher.Learn(inputs, outputs);

// Obtain class predictions for each sample
bool[][] predicted = machine.Decide(inputs);

// Get class scores for each sample
double[][] scores = machine.Scores(inputs);

// Compute classification error using mean accuracy (mAcc)
double error = new HammingLoss(outputs).Loss(predicted);
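
If automatic kernel estimation is not desired, the kernel can also be supplied explicitly to each SMO learner. The following is a minimal alternative sketch, not part of the original example; the sigma value is an arbitrary assumption.

// Sketch: specify the Gaussian kernel explicitly instead of estimating it.
// The sigma value of 0.5 is an arbitrary assumption made for illustration.
var explicitTeacher = new MultilabelSupportVectorLearning<Gaussian>()
{
    Learner = (p) => new SequentialMinimalOptimization<Gaussian>()
    {
        Kernel = new Gaussian(0.5) // use this kernel instead of estimating one
    }
};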

Support vector machines can have their weights calibrated in order to produce probability estimates (instead of simple class separation distances). The following example shows how to use ProbabilisticOutputCalibration within MultilabelSupportVectorLearning<TKernel> to generate a probabilistic SVM:

// Let's say we have the following data to be classified
// into three possible classes. Those are the samples:
// 
double[][] inputs =
{
    //               input         output
    new double[] { 0, 1, 1, 0 }, //  0 
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 0, 0, 1, 0 }, //  0
    new double[] { 0, 1, 1, 0 }, //  0
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 1, 0, 0, 1 }, //  1
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 1, 0, 1, 1 }, //  2
    new double[] { 1, 1, 0, 1 }, //  2
    new double[] { 0, 1, 1, 1 }, //  2
    new double[] { 1, 1, 1, 1 }, //  2
};

int[] outputs = // those are the class labels
{
    0, 0, 0, 0, 0,
    1, 1, 1,
    2, 2, 2, 2,
};

// Create the multi-label learning algorithm for the machine
var teacher = new MultilabelSupportVectorLearning<Gaussian>()
{
    // Configure the learning algorithm to use SMO to train the
    //  underlying SVMs in each of the binary class subproblems.
    Learner = (param) => new SequentialMinimalOptimization<Gaussian>()
    {
        // Estimate a suitable guess for the Gaussian kernel's parameters.
        // This estimate can serve as a starting point for a grid search.
        UseKernelEstimation = true
    }
};

// Learn a machine
var machine = teacher.Learn(inputs, outputs);

// Create a calibration algorithm based on the machine that was just learned
var calibration = new MultilabelSupportVectorLearning<Gaussian>()
{
    Model = machine, // We will start with an existing machine

    // Configure the calibration algorithm to apply probabilistic output
    //  calibration (Platt scaling) to each of the binary subproblems.
    Learner = (param) => new ProbabilisticOutputCalibration<Gaussian>()
    {
        Model = param.Model // Start with an existing machine
    }
};


// Configure parallel execution options
calibration.ParallelOptions.MaxDegreeOfParallelism = 1;

// Learn a machine
calibration.Learn(inputs, outputs);

// Obtain class predictions for each sample
bool[][] predicted = machine.Decide(inputs);

// Get class scores for each sample
double[][] scores = machine.Scores(inputs);

// Get log-likelihoods (should be same as scores)
double[][] logl = machine.LogLikelihoods(inputs);

// Get probability for each sample
double[][] prob = machine.Probabilities(inputs);

// Compute classification error using mean accuracy (mAcc)
double error = new HammingLoss(outputs).Loss(predicted);

// Compute the log-loss (categorical cross-entropy) from the probabilities
double loss = new CategoryCrossEntropyLoss(outputs).Loss(prob);
See Also