DynamicTimeWarping&lt;TDistance, TInput&gt; Structure
Namespace: Accord.Statistics.Kernels
    [SerializableAttribute]
    public struct DynamicTimeWarping<TDistance, TInput> : IKernel<TInput[]>, ICloneable,
        IDistance<TInput[]>, IDistance<TInput[], TInput[]>
        where TDistance : struct, new(), IDistance<TInput>
The DynamicTimeWarping&lt;TDistance, TInput&gt; type exposes the following members.
Constructors

Name | Description
---|---
DynamicTimeWarping&lt;TDistance, TInput&gt;(Double) | Constructs a new Dynamic Time Warping kernel.
DynamicTimeWarping&lt;TDistance, TInput&gt;(TDistance) | Constructs a new Dynamic Time Warping kernel.
DynamicTimeWarping&lt;TDistance, TInput&gt;(TDistance, Double) | Constructs a new Dynamic Time Warping kernel.
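As a sketch of how the generic parameters fit together, the snippet below instantiates the kernel through the (TDistance) constructor listed above, taking TInput = double[] and TDistance = Euclidean. It assumes the Euclidean struct from Accord.Math.Distances satisfies the TDistance constraint for double[] observations (a struct with a parameterless constructor implementing IDistance&lt;double[]&gt;); substitute whatever distance type matches your observations.

    // Assumed namespaces: Accord.Statistics.Kernels for the kernel,
    // Accord.Math.Distances for the Euclidean distance struct.
    using Accord.Math.Distances;
    using Accord.Statistics.Kernels;

    // TDistance = Euclidean, TInput = double[]: the kernel then compares
    // sequences of fixed-size observations, i.e. double[][] inputs.
    var dtw = new DynamicTimeWarping<Euclidean, double[]>(new Euclidean());

    double[][] x = { new double[] { 1, 1, 1 }, new double[] { 1, 2, 1 } };
    double[][] z = { new double[] { 1, 1, 1 }, new double[] { 2, 7, 1 }, new double[] { 1, 5, 6 } };

    double k = dtw.Function(x, z); // kernel (dot product) value, via IKernel<TInput[]>
    double d = dtw.Distance(x, z); // squared feature-space distance, via IDistance<TInput[]>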
Properties

Name | Description
---|---
Gamma | Gets or sets the gamma value for the kernel. When setting gamma, sigma gets updated accordingly (gamma = 0.5/sigma²).
Sigma | Gets or sets the sigma value for the kernel. When setting sigma, gamma gets updated accordingly (gamma = 0.5/sigma²).
SigmaSquared | Gets or sets the sigma² value for the kernel. When setting sigma², gamma gets updated accordingly (gamma = 0.5/sigma²).
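To make the coupling between these properties concrete, the hedged sketch below (continuing the hypothetical Euclidean-based instance from the previous sketch) sets one property and reads the others back, using only the relation gamma = 0.5/sigma² stated in the table above.

    // Continuing the hypothetical instance from the previous sketch:
    var dtw = new DynamicTimeWarping<Euclidean, double[]>(new Euclidean());

    dtw.Sigma = 4.2;
    // Per the relation gamma = 0.5 / sigma², these should now read:
    // dtw.SigmaSquared == 17.64 and dtw.Gamma ≈ 0.0283

    dtw.Gamma = 0.125;
    // Setting gamma updates sigma so the same relation keeps holding:
    // dtw.Sigma == sqrt(0.5 / 0.125) == 2.0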
Methods

Name | Description
---|---
Clone | Creates a new object that is a copy of the current instance.
Distance | Computes the squared distance in feature space between two points given in input space.
Equals | Indicates whether this instance and a specified object are equal. (Inherited from ValueType.)
Function | Dynamic Time Warping kernel function.
GetHashCode | Returns the hash code for this instance. (Inherited from ValueType.)
GetType | Gets the Type of the current instance. (Inherited from Object.)
ToString | Returns the fully qualified type name of this instance. (Inherited from ValueType.)
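For kernels in general, the squared feature-space distance reported by Distance can be expanded in terms of the kernel function as k(x,x) - 2 k(x,z) + k(z,z). The sketch below, again using the hypothetical Euclidean-based instance, illustrates that standard identity; it shows the usual kernel algebra rather than asserting anything about this type's internal implementation.

    var dtw = new DynamicTimeWarping<Euclidean, double[]>(new Euclidean());

    double[][] x = { new double[] { 1, 1, 1 }, new double[] { 1, 2, 1 } };
    double[][] z = { new double[] { 8, 2, 5 }, new double[] { 1, 5, 4 } };

    // Squared distance in feature space, as exposed by IDistance<TInput[]>:
    double d = dtw.Distance(x, z);

    // The usual kernel expansion of the same quantity:
    // ||phi(x) - phi(z)||² = k(x,x) - 2 k(x,z) + k(z,z)
    double expansion = dtw.Function(x, x) - 2 * dtw.Function(x, z) + dtw.Function(z, z);
    // d and expansion are expected to agree up to floating-point error.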
Extension Methods

Name | Description
---|---
HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To&lt;T&gt; | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
The Dynamic Time Warping Sequence Kernel is a sequence kernel, accepting vector sequences of variable size as input. Although the sequences may vary in length, the vectors they contain must have a fixed size, which must be specified when the kernel is constructed.
The conversion of the DTW global distance to a dot product uses a combination of a technique known as spherical normalization and the polynomial kernel. The degree of the polynomial kernel and the alpha for the spherical normalization should be given at the construction of the kernel. For more information, please see the referenced papers shown below.
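As a rough sketch of that recipe (the precise formulation is in the referenced papers; the notation below is illustrative rather than a statement of this type's exact internals): spherical normalization augments each observation with the constant alpha and projects it onto the unit hypersphere, so that inner products between normalized observations become cosines of angles; the DTW alignment is then computed over the normalized sequences and composed with a polynomial map of the given degree.

    % Spherical normalization of a single observation x_i, with alpha given at construction:
    \tilde{x}_i = \frac{1}{\sqrt{\lVert x_i \rVert^2 + \alpha^2}}
                  \begin{bmatrix} x_i \\ \alpha \end{bmatrix},
    \qquad \lVert \tilde{x}_i \rVert = 1

    % Schematic form of the sequence kernel: a DTW-aligned similarity between the
    % normalized sequences, composed with a degree-p polynomial map.
    k(x, z) \approx \Big( \operatorname{sim}_{\mathrm{DTW}}(\tilde{x}, \tilde{z}) \Big)^{p}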
The use of a cache is highly advisable when using this kernel.
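One way to follow that advice when training with SMO is sketched below. It assumes the SequentialMinimalOptimization learner exposes a CacheSize property controlling how many rows of the kernel matrix are kept in memory, so expensive DTW evaluations are not recomputed for every pair of training sequences; the property name is an assumption here, so check the learner's documentation for your Accord.NET version.

    // Hypothetical configuration: CacheSize is assumed to control the learner's
    // internal kernel-function cache. Larger values trade memory for fewer
    // repeated DTW computations between the same pair of training sequences.
    var smo = new SequentialMinimalOptimization<DynamicTimeWarping, double[][]>()
    {
        Complexity = 1.5,
        Kernel = new DynamicTimeWarping(alpha: 1, degree: 1),
        CacheSize = 1000
    };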
The following example demonstrates how to create and learn a Support Vector Machine (SVM) to recognize sequences of univariate observations using the Dynamic Time Warping kernel.
    // Required namespaces (assuming the standard Accord.NET layout):
    using Accord.MachineLearning.VectorMachines.Learning;
    using Accord.Math.Optimization.Losses;
    using Accord.Statistics.Kernels;

    // Suppose you have sequences of univariate observations,
    // and that those sequences could be of arbitrary length.
    // In this example, we have sequences of binary numbers:
    double[][] inputs =
    {
        // Class -1
        new double[] { 0, 1, 1, 0 },
        new double[] { 0, 0, 1, 0 },
        new double[] { 0, 1, 1, 1, 0 },
        new double[] { 0, 1, 0 },

        // Class +1
        new double[] { 1, 0, 0, 1 },
        new double[] { 1, 1, 0, 1 },
        new double[] { 1, 0, 0, 0, 1 },
        new double[] { 1, 0, 1 },
        new double[] { 1, 0, 0, 0, 1, 1 }
    };

    int[] outputs =
    {
        0, 0, 0, 0,    // First four sequences are of class 0
        1, 1, 1, 1, 1  // Last five sequences are of class 1
    };

    // Create the Sequential Minimal Optimization learning algorithm
    var smo = new SequentialMinimalOptimization<DynamicTimeWarping>()
    {
        Complexity = 1.5,

        // Set the parameters of the kernel
        Kernel = new DynamicTimeWarping(alpha: 1, degree: 1)
    };

    // And use it to learn a machine!
    var svm = smo.Learn(inputs, outputs);

    // Now we can compute predicted values
    bool[] predicted = svm.Decide(inputs);

    // And check how far we are from the expected values
    double error = new ZeroOneLoss(outputs).Loss(predicted); // error will be 0.0
Now, instead of univariate observations, the following example demonstrates how to create and learn a machine that recognizes sequences of multivariate (or n-dimensional) observations.
    // Suppose you have sequences of multivariate observations, and that
    // those sequences could be of arbitrary length. On the other hand,
    // each observation has a fixed, delimited number of dimensions.

    // In this example, we have sequences of 3-dimensional observations.
    // Each sequence can have an arbitrary length, but each observation
    // will always have length 3:
    double[][][] sequences =
    {
        new double[][] // first sequence
        {
            new double[] { 1, 1, 1 }, // first observation of the first sequence
            new double[] { 1, 2, 1 }, // second observation of the first sequence
            new double[] { 1, 4, 2 }, // third observation of the first sequence
            new double[] { 2, 2, 2 }, // fourth observation of the first sequence
        },

        new double[][] // second sequence (note that this sequence has a different length)
        {
            new double[] { 1, 1, 1 }, // first observation of the second sequence
            new double[] { 1, 5, 6 }, // second observation of the second sequence
            new double[] { 2, 7, 1 }, // third observation of the second sequence
        },

        new double[][] // third sequence
        {
            new double[] { 8, 2, 1 }, // first observation of the third sequence
        },

        new double[][] // fourth sequence
        {
            new double[] { 8, 2, 5 }, // first observation of the fourth sequence
            new double[] { 1, 5, 4 }, // second observation of the fourth sequence
        }
    };

    // Now, we will also have different class labels associated with each
    // sequence. We will assign -1 to sequences whose observations start
    // with { 1, 1, 1 } and +1 to those that do not:
    int[] outputs =
    {
        0, 0,  // First two sequences are of class 0 (those start with {1,1,1})
        1, 1,  // Last two sequences are of class 1 (don't start with {1,1,1})
    };

    // Now we can create the Sequential Minimal Optimization learning algorithm
    var smo = new SequentialMinimalOptimization<DynamicTimeWarping, double[][]>()
    {
        Complexity = 1.5,

        // Set the parameters of the kernel
        Kernel = new DynamicTimeWarping(alpha: 1, degree: 1)
    };

    // And use it to learn a machine!
    var svm = smo.Learn(sequences, outputs);

    // Now we can compute predicted values
    bool[] predicted = svm.Decide(sequences);

    // And check how far we are from the expected values
    double error = new ZeroOneLoss(outputs).Loss(predicted); // error will be 0.0

    // At this point, we should have obtained a useful machine. Let's
    // see if it can understand a few examples it hasn't seen before:
    double[][] a =
    {
        new double[] { 1, 1, 1 },
        new double[] { 7, 2, 5 },
        new double[] { 2, 5, 1 },
    };

    double[][] b =
    {
        new double[] { 8, 5, 2 },
        new double[] { 4, 2, 5 },
    };

    // Following the aforementioned logic, sequence (a) should be
    // classified as -1, and sequence (b) should be classified as +1.
    bool resultA = svm.Decide(a); // false
    bool resultB = svm.Decide(b); // true