LogLikelihoodLoss Class
Namespace: Accord.Math.Optimization.Losses
```csharp
[SerializableAttribute]
public class LogLikelihoodLoss : ILoss<double[][]>, ILoss<double[][], double>,
    ILoss<double[]>, ILoss<double[], double>
```
The LogLikelihoodLoss type exposes the following members.
Constructors

| Name | Description |
|---|---|
| LogLikelihoodLoss | Initializes a new instance of the LogLikelihoodLoss class. |
Methods

| Name | Description |
|---|---|
| Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.) |
| Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.) |
| GetHashCode | Serves as the default hash function. (Inherited from Object.) |
| GetType | Gets the Type of the current instance. (Inherited from Object.) |
| Loss(Double[]) | Computes the loss between the expected values (ground truth) and the given actual values that have been predicted (see the sketch after this table). |
| Loss(Double[][]) | Computes the loss between the expected values (ground truth) and the given actual values that have been predicted (see the sketch after this table). |
| MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.) |
| ToString | Returns a string that represents the current object. (Inherited from Object.) |
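As a quick illustration of the two Loss overloads listed above, the sketch below calls both the single-vector and the jagged-array forms on made-up log-likelihood scores. The assumption that the jagged overload aggregates over all inner vectors in the same way as the single-vector one is not verified here.

```csharp
// Minimal sketch of the two Loss overloads; the numeric values are invented,
// and the behaviour of the jagged overload is assumed, not verified.
var loss = new LogLikelihoodLoss();

// Loss(Double[]): a single vector of log-likelihood scores.
double[] scores = { -0.10, -0.25, -0.05 };
double total = loss.Loss(scores);

// Loss(Double[][]): the jagged form accepted through ILoss<double[][]>.
double[][] batches =
{
    new double[] { -0.10, -0.25 },
    new double[] { -0.05, -0.40 }
};
double totalOverBatches = loss.Loss(batches);
```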
Extension Methods

| Name | Description |
|---|---|
| HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.) |
| IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.) |
| To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.) |
| To<T>() | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.) |
The log-likelihood loss can be used to measure the performance of unsupervised model fitting algorithms. It simply computes the sum of all log-likelihood values produced by the model.
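For concreteness, the following sketch (not library code) mirrors that description in plain C#: the model outputs are treated as log-likelihood values and simply added up. The three numbers are invented for illustration.

```csharp
// Hypothetical log-likelihood values produced by some fitted model.
double[] logLikelihoods = { -0.2, -1.5, -0.7 };

// The documented behaviour: the loss is simply the sum of these values.
double sum = 0;
foreach (double value in logLikelihoods)
    sum += value;                                   // -2.4 in this example

// Which is what the class computes directly:
double viaLoss = new LogLikelihoodLoss().Loss(logLikelihoods);
```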
If you would like to measure the performance of a supervised classification model based on its probability predictions, please refer to the BinaryCrossEntropyLoss and CategoryCrossEntropyLoss classes for binary and multi-class decision problems, respectively.
The following example shows how to learn a one-class SVM and measure its performance using the log-likelihood loss class.
```csharp
// Required namespaces:
using Accord.MachineLearning.VectorMachines.Learning;
using Accord.Math.Optimization.Losses;
using Accord.Statistics.Kernels;

// Ensure that results are reproducible
Accord.Math.Random.Generator.Seed = 0;

// Generate some data to be learned
double[][] inputs =
{
    new double[] { +1.0312479734420776 },
    new double[] { +0.99444115161895752 },
    new double[] { +0.21835240721702576 },
    new double[] { +0.47197291254997253 },
    new double[] { +0.68701112270355225 },
    new double[] { -0.58556461334228516 },
    new double[] { -0.64154046773910522 },
    new double[] { -0.66485315561294556 },
    new double[] { +0.37940266728401184 },
    new double[] { -0.61046308279037476 }
};

// Create a new one-class SVM learning algorithm
var teacher = new OneclassSupportVectorLearning<Linear>()
{
    Kernel = new Linear(), // or, for example, 'new Gaussian(0.9)'
    Nu = 0.1
};

// Learn a support vector machine
var svm = teacher.Learn(inputs);

// Test the machine
double[] prediction = svm.Score(inputs);

// Compute the log-likelihood of the answer
double ll = new LogLikelihoodLoss().Loss(prediction);
```
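As a possible follow-up, not part of the original sample, the same loss can be used to compare alternative fits of the same data; the second learner below, trained with a different (hypothetical) Nu value, sketches the idea.

```csharp
// Hypothetical second fit with a different Nu, reusing the same inputs.
var teacher2 = new OneclassSupportVectorLearning<Linear>()
{
    Kernel = new Linear(),
    Nu = 0.5
};

var svm2 = teacher2.Learn(inputs);
double ll2 = new LogLikelihoodLoss().Loss(svm2.Score(inputs));

// Because the loss is just the summed log-likelihood, the fit with the
// higher value explains the training data better under this convention.
bool firstFitIsBetter = ll > ll2;
```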