
SequentialMinimalOptimizationRegression Class

Sequential Minimal Optimization (SMO) Algorithm for Regression. Warning: this code is contained in a GPL assembly. Thus, if you link against this assembly, you should comply with the GPL license.
Inheritance Hierarchy
System.Object
  Accord.MachineLearning.VectorMachines.Learning.BaseSupportVectorRegression<SupportVectorMachine<IKernel>, IKernel, Double[]>
    Accord.MachineLearning.VectorMachines.Learning.BaseSequentialMinimalOptimizationRegression<SupportVectorMachine<IKernel>, IKernel, Double[]>
      Accord.MachineLearning.VectorMachines.Learning.SequentialMinimalOptimizationRegression

Namespace:  Accord.MachineLearning.VectorMachines.Learning
Assembly:  Accord.MachineLearning.GPL (in Accord.MachineLearning.GPL.dll) Version: 3.8.0
Syntax
public class SequentialMinimalOptimizationRegression : BaseSequentialMinimalOptimizationRegression<SupportVectorMachine<IKernel>, IKernel, double[]>

The SequentialMinimalOptimizationRegression type exposes the following members.

Constructors
Properties
  Name  Description
Protected property  C
Gets or sets the cost values associated with each input vector.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public property  Complexity
Complexity (cost) parameter C. Increasing the value of C forces the creation of a more accurate model that may not generalize well. If this value is not set and UseComplexityHeuristic is set to true, the framework will automatically guess a value for C. If this value is manually set to something else, then UseComplexityHeuristic will be automatically disabled and the given value will be used instead. (See the configuration sketch after this table.)
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public property  Epsilon
Insensitivity zone ε. Increasing the value of ε can result in fewer support vectors in the created model. Default value is 1e-3.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Protected property  Inputs
Gets or sets the input vectors for training.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Protected property  IsLinear
Gets whether the machine to be learned has a Linear kernel.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public property  Kernel
Gets or sets the kernel function used to create a kernel Support Vector Machine. If this property is set, UseKernelEstimation will be set to false.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public property  Model
Gets the machine to be taught.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Protected property  Outputs
Gets or sets the output values for each calibration vector.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public property  Token
Gets or sets a cancellation token that can be used to stop the learning algorithm while it is running.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public property  Tolerance
Convergence tolerance. Default value is 1e-3.
(Inherited from BaseSequentialMinimalOptimizationRegression<TModel, TKernel, TInput>.)
Public property  UseComplexityHeuristic
Gets or sets a value indicating whether the Complexity parameter C should be computed automatically by employing a heuristic rule. Default is false.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public property  UseKernelEstimation
Gets or sets whether initial values for some kernel parameters should be estimated from the data, if possible. Default is true.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public property  Weights
Gets or sets the individual weight of each sample in the training set. If set to null, all samples will be assumed to have equal weight. Default is null.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Top
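
The most commonly adjusted properties above can be set through an object initializer. The sketch below is illustrative only: it uses the generic SequentialMinimalOptimizationRegression<Polynomial> variant from the example at the end of this page (assuming it exposes the same inherited properties listed above), and the numeric values are assumptions rather than recommended defaults.

// Illustrative configuration sketch; values are assumptions, not defaults.
var teacher = new SequentialMinimalOptimizationRegression<Polynomial>()
{
    Kernel = new Polynomial(2),     // assigning Kernel turns UseKernelEstimation off
    UseComplexityHeuristic = true,  // let the framework guess the Complexity (C) value
    Epsilon = 0.01,                 // ε-insensitive zone (default is 1e-3)
    Tolerance = 1e-3                // convergence tolerance (default is 1e-3)
};

// Setting Complexity explicitly would disable the heuristic again:
// teacher.Complexity = 100;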
Methods
  Name  Description
Public method  ComputeError
Obsolete.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Protected method  Create
Obsolete.
(Overrides BaseSupportVectorRegression<TModel, TKernel, TInput>.Create(Int32, TKernel).)
Public method  Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
Protected method  Finalize
Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection.
(Inherited from Object.)
Public method  GetHashCode
Serves as the default hash function.
(Inherited from Object.)
Public method  GetType
Gets the Type of the current instance.
(Inherited from Object.)
Protected method  InnerRun
Runs the learning algorithm.
(Inherited from BaseSequentialMinimalOptimizationRegression<TModel, TKernel, TInput>.)
Public method  Learn
Learns a model that can map the given inputs to the given outputs.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Protected method  MemberwiseClone
Creates a shallow copy of the current Object.
(Inherited from Object.)
Public method  Run
Obsolete.
(Inherited from BaseSupportVectorRegression<TModel, TKernel, TInput>.)
Public method  ToString
Returns a string that represents the current object.
(Inherited from Object.)
Top
Extension Methods
  Name  Description
Public Extension Method  HasMethod
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)
Public Extension Method  IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
Public Extension Method  To(Type)  (Overloaded.)
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Public Extension Method  To<T>  (Overloaded.)
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Top
Remarks

Sequential Minimal Optimization (SMO) is an algorithm for solving the large quadratic programming (QP) optimization problems that arise when training support vector machines. First developed by John C. Platt in 1998, SMO breaks a large QP problem into a series of the smallest possible QP sub-problems, which are then solved analytically.

This class incorporates modifications to the original SMO algorithm for solving regression problems, as suggested by Alex J. Smola and Bernhard Schölkopf, together with further modifications for better performance by Shevade et al.
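
As a reminder of the standard formulation treated by Smola and Schölkopf (a general reference, not a transcription of this class's internal code), ε-insensitive support vector regression solves the following primal problem, where ε corresponds to the Epsilon property and C to the Complexity property:

\min_{w,\,b,\,\xi,\,\xi^*} \;\; \tfrac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \left( \xi_i + \xi_i^* \right)
\quad \text{subject to} \quad
y_i - \langle w, \phi(x_i) \rangle - b \le \varepsilon + \xi_i, \quad
\langle w, \phi(x_i) \rangle + b - y_i \le \varepsilon + \xi_i^*, \quad
\xi_i, \xi_i^* \ge 0.

SMO optimizes the corresponding dual by repeatedly selecting a pair of Lagrange multipliers and solving that two-variable sub-problem in closed form.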

Portions of this implementation have been based on the GPL code by Sylvain Roy in SMOreg.java, part of the Weka software package. It is therefore available under the same GPL license. This file is not linked against the rest of the Accord.NET Framework and can only be used in GPL applications. This class is only available in the special Accord.MachineLearning.GPL assembly, which has to be explicitly selected in the framework installation. Before linking against this assembly, please read the GPL license for more details. A copy of the GNU GPLv3 should have been distributed alongside this assembly.

For a non-GPL'd version, see FanChenLinSupportVectorRegression<TKernel>.

To use this class, add a reference to the Accord.MachineLearning.GPL.dll assembly that resides inside the Release/GPL folder of the framework's installation directory.
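
For instance, a classic .NET project file could reference the assembly with an entry along the following lines; the HintPath below is a placeholder, not an actual path, and must be adjusted to the folder where the framework was installed.

<!-- Sketch only: replace the placeholder with your framework installation directory. -->
<Reference Include="Accord.MachineLearning.GPL">
  <HintPath>[framework installation directory]\Release\GPL\Accord.MachineLearning.GPL.dll</HintPath>
</Reference>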

References:
    Platt, J. C. (1998). Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines. Microsoft Research Technical Report MSR-TR-98-14.
    Smola, A. J., & Schölkopf, B. (2004). A Tutorial on Support Vector Regression. Statistics and Computing, 14(3), 199-222.
    Shevade, S. K., Keerthi, S. S., Bhattacharyya, C., & Murthy, K. R. K. (2000). Improvements to the SMO Algorithm for SVM Regression. IEEE Transactions on Neural Networks, 11(5), 1188-1193.

Examples
Accord.Math.Random.Generator.Seed = 0;

// Example regression problem. Suppose we are trying
// to model the following equation: f(x, y) = 2x + y

double[][] inputs = // (x, y)
{
    new double[] { 0,  1 }, // 2*0 + 1 =  1
    new double[] { 4,  3 }, // 2*4 + 3 = 11
    new double[] { 8, -8 }, // 2*8 - 8 =  8
    new double[] { 2,  2 }, // 2*2 + 2 =  6
    new double[] { 6,  1 }, // 2*6 + 1 = 13
    new double[] { 5,  4 }, // 2*5 + 4 = 14
    new double[] { 9,  1 }, // 2*9 + 1 = 19
    new double[] { 1,  6 }, // 2*1 + 6 =  8
};

double[] outputs = // f(x, y)
{
    1, 11, 8, 6, 13, 14, 19, 8
};

// Create the sequential minimal optimization teacher
var learn = new SequentialMinimalOptimizationRegression<Polynomial>()
{
    Kernel = new Polynomial(2), // Polynomial Kernel of 2nd degree
    Complexity = 100
};

// Run the learning algorithm
SupportVectorMachine<Polynomial> svm = learn.Learn(inputs, outputs);

// Compute the predicted scores
double[] predicted = svm.Score(inputs);

// Compute the error between the expected and predicted
double error = new SquareLoss(outputs).Loss(predicted);

// Compute the answer for one particular example
double fxy = svm.Score(inputs[0]); // 1.0003849827673186
See Also