
Precomputed Structure

Precomputed Gram Matrix Kernel.

Namespace:  Accord.Statistics.Kernels
Assembly:  Accord.Statistics (in Accord.Statistics.dll) Version: 3.8.0
Syntax
[SerializableAttribute]
public struct Precomputed : IKernel, 
	IKernel<double[]>, IKernel<int>, ICloneable
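
Because the structure implements IKernel<int> in addition to IKernel and IKernel<double[]>, learning algorithms can be fed plain integer indices (see the Indices property below) that refer to rows and columns of the stored Gram matrix, instead of the original feature vectors.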

The Precomputed type exposes the following members.

Constructors
  Name  Description
Public method  Precomputed(Double[][])
Initializes a new instance of the Precomputed structure with the given precomputed Gram matrix.
Top
Properties
  Name  Description
Public property  Indices
Gets a vector of indices that can be fed as the inputs of a learning algorithm. The learning algorithm will then use the indices to refer to each element in the precomputed kernel matrix.
Public property  Matrix  Obsolete.
Gets or sets the precomputed Gram matrix for this kernel.
Public property  NumberOfBasisVectors
Gets the dimension of the basis spanned by the initial training vectors.
Public property  NumberOfSamples
Gets the current number of training samples.
Public property  Values
Gets or sets the precomputed Gram matrix for this kernel.
Top
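
As a quick illustration of how these properties fit together, the sketch below (not part of the original reference, and hedged accordingly) builds a Precomputed kernel from a small Gram matrix and reads back NumberOfSamples, Indices and Values. That Indices enumerates 0 through NumberOfSamples - 1 is an assumption drawn from the examples further down this page, where it is passed directly as the learning algorithm's input vector.

// Minimal sketch, assuming Indices simply enumerates 0 .. NumberOfSamples - 1
// so it can replace the original feature vectors as a learner's input.
// Requires: using Accord.Statistics.Kernels;

double[][] gram =
{
    new double[] { 1.0, 0.5, 0.2 },
    new double[] { 0.5, 1.0, 0.3 },
    new double[] { 0.2, 0.3, 1.0 },
};

var pre = new Precomputed(gram);

int n = pre.NumberOfSamples;    // 3 samples in the matrix
int[] idx = pre.Indices;        // expected { 0, 1, 2 } (assumption, see above)
double[][] values = pre.Values; // the same Gram matrix passed to the constructor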
Methods
  Name  Description
Public method  Clone
Creates a new object that is a copy of the current instance.
Public method  Equals
Indicates whether this instance and a specified object are equal.
(Inherited from ValueType.)
Public method  Function(Double[], Double[])
The kernel function.
Public method  Function(Int32, Int32)
The kernel function.
Public method  GetHashCode
Returns the hash code for this instance.
(Inherited from ValueType.)
Public method  GetType
Gets the Type of the current instance.
(Inherited from Object.)
Public method  ToString
Returns the fully qualified type name of this instance.
(Inherited from ValueType.)
Top
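
The two Function overloads evaluate the kernel between samples that are already represented in the precomputed matrix. The sketch below is an assumption about their behavior, namely that Function(i, j) is a plain lookup into Values; a symmetric matrix is used so that the index order does not matter for the illustration.

// Hedged sketch: assuming Function(i, j) returns the stored matrix entry
// rather than evaluating any kernel formula.
// Requires: using Accord.Statistics.Kernels;

double[][] gram =
{
    new double[] { 1.0, 0.8 },
    new double[] { 0.8, 1.0 },
};

var pre = new Precomputed(gram);

double k01 = pre.Function(0, 1); // expected 0.8 (lookup, not a computation)
double k00 = pre.Function(0, 0); // expected 1.0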
Extension Methods
  Name  Description
Public Extension Method  Distance
Computes the kernel distance for a kernel function even if it doesn't implement the IDistance interface. Can be used to check the proper implementation of the distance function.
(Defined by Tools.)
Public Extension Method  HasMethod
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)
Public Extension Method  IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
Public Extension Method  To(Type)  Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Public Extension Method  To<T>  Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Top
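
The Distance extension above is documented as computing the kernel distance. For a kernel function k this distance is conventionally defined as k(x,x) + k(y,y) - 2·k(x,y), i.e. the squared distance in the feature space induced by the kernel. The sketch below computes that quantity by hand next to the extension call, using a Gaussian kernel purely for illustration; the only assumption is that Tools.Distance follows this standard definition.

// Hedged sketch of the kernel distance computed by the Distance extension.
// Requires: using Accord.Statistics;          (Tools extension methods)
// Requires: using Accord.Statistics.Kernels;

var gaussian = new Gaussian(0.1);

double[] x = { 0, 1 };
double[] y = { 1, 0 };

// Standard kernel-induced (squared) distance: k(x,x) + k(y,y) - 2 k(x,y)
double byHand = gaussian.Function(x, x) + gaussian.Function(y, y)
              - 2 * gaussian.Function(x, y);

// Assumption: the Distance extension from the table above returns the same value
double byExtension = gaussian.Distance(x, y);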
Examples

The following example shows how to learn a multi-class SVM using a precomputed kernel matrix obtained from a Polynomial kernel.

// Let's say we have the following data to be classified
// into three possible classes. Those are the samples:
// 
double[][] trainInputs =
{
    //               input         output
    new double[] { 0, 1, 1, 0 }, //  0 
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 0, 0, 1, 0 }, //  0
    new double[] { 0, 1, 1, 0 }, //  0
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 1, 0, 0, 0 }, //  1
    new double[] { 1, 0, 0, 0 }, //  1
    new double[] { 1, 0, 0, 1 }, //  1
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 1, 1, 1, 1 }, //  2
    new double[] { 1, 0, 1, 1 }, //  2
    new double[] { 1, 1, 0, 1 }, //  2
    new double[] { 0, 1, 1, 1 }, //  2
    new double[] { 1, 1, 1, 1 }, //  2
};

int[] trainOutputs = // these are the training set class labels
{
    0, 0, 0, 0, 0,
    1, 1, 1, 1, 1,
    2, 2, 2, 2, 2,
};

// Let's choose a kernel function
Polynomial kernel = new Polynomial(2);

// Get the kernel matrix for the training set
double[][] K = kernel.ToJagged(trainInputs);

// Create a pre-computed kernel
var pre = new Precomputed(K);

// Create a one-vs-one multi-class learning algorithm using SMO
var teacher = new MulticlassSupportVectorLearning<Precomputed, int>()
{
    Learner = (p) => new SequentialMinimalOptimization<Precomputed, int>()
    {
        Kernel = pre
    }
};

#if DEBUG
teacher.ParallelOptions.MaxDegreeOfParallelism = 1;
#endif

// Learn a machine
var machine = teacher.Learn(pre.Indices, trainOutputs);

// Compute the machine's prediction for the training set
int[] trainPrediction = machine.Decide(pre.Indices);

// Evaluate the prediction error on the training set (zero-one loss)
double trainingError = new ZeroOneLoss(trainOutputs).Loss(trainPrediction);

// Now let's compute the machine's prediction for a test set
double[][] testInputs = // test-set inputs
{
    //               input         output
    new double[] { 0, 1, 1, 0 }, //  0 
    new double[] { 0, 1, 0, 0 }, //  0
    new double[] { 0, 0, 0, 1 }, //  1
    new double[] { 1, 1, 1, 1 }, //  2
};

int[] testOutputs = // these are the test set class labels
{
    0, 0,  1,  2,
};

// Compute the kernel matrix between the training and test sets
pre.Values = kernel.ToJagged2(trainInputs, testInputs);

// Update the machine's kernel
machine.Kernel = pre;

// Compute the machine's prediction for the test set
int[] testPrediction = machine.Decide(pre.Indices);

// Evaluate the prediction error on the test set (zero-one loss)
double testError = new ZeroOneLoss(testOutputs).Loss(testPrediction);

The following example shows how to learn a simple binary SVM using a precomputed kernel matrix obtained from a Gaussian kernel.

// As an example, we will try to learn a decision machine 
// that can replicate the "exclusive-or" logical function:

double[][] inputs =
{
    new double[] { 0, 0 }, // the XOR function takes two booleans
    new double[] { 0, 1 }, // and computes their exclusive or: the
    new double[] { 1, 0 }, // output is true only if the two booleans
    new double[] { 1, 1 }  // are different
};

int[] xor = // this is the output of the xor function
{
    0, // 0 xor 0 = 0 (inputs are equal)
    1, // 0 xor 1 = 1 (inputs are different)
    1, // 1 xor 0 = 1 (inputs are different)
    0, // 1 xor 1 = 0 (inputs are equal)
};

// Let's use a Gaussian kernel
var kernel = new Gaussian(0.1);

// Create a pre-computed Gaussian kernel matrix
var precomputed = new Precomputed(kernel.ToJagged(inputs));

// Now, we can create the sequential minimal optimization teacher
var learn = new SequentialMinimalOptimization<Precomputed, int>()
{
    Kernel = precomputed // set the precomputed kernel we created
};

// And then we can obtain the SVM by using Learn
var svm = learn.Learn(precomputed.Indices, xor);

// Finally, we can obtain the decisions predicted by the machine:
bool[] prediction = svm.Decide(precomputed.Indices);

// We can also compute the machine's predictions for new samples
double[][] sample =
{
    new double[] { 0, 1 }
};

// Update the precomputed kernel with the new samples
precomputed = new Precomputed(kernel.ToJagged2(inputs, sample));

// Update the SVM kernel
svm.Kernel = precomputed;

// Compute the predictions for the new samples
bool[] newPrediction = svm.Decide(precomputed.Indices);
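
In both examples, note that predicting on samples that were not part of the original Gram matrix requires rebuilding the precomputed matrix between the training inputs and the new samples (here with ToJagged2) and assigning it back to the machine's Kernel property before calling Decide: the machine only stores indices into that matrix, not the feature vectors themselves.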
See Also