ConfusionMatrix Class
Namespace: Accord.Statistics.Analysis
The ConfusionMatrix type exposes the following members.
Constructors

Name | Description
---|---
ConfusionMatrix(Int32) | Constructs a new Confusion Matrix.
ConfusionMatrix(Boolean, Boolean) | Constructs a new Confusion Matrix.
ConfusionMatrix(Int32, Int32) | Constructs a new Confusion Matrix.
ConfusionMatrix(Int32, Int32, Int32) | Constructs a new Confusion Matrix.
ConfusionMatrix(Int32, Int32, Int32, Int32) | Constructs a new Confusion Matrix.
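Since the constructors accept either paired prediction/ground-truth vectors or the four raw counts, it may help to see how a pair of boolean vectors reduces to those counts. The following is an illustrative sketch in Python, not Accord.NET code; the helper name `count_confusion` is hypothetical:

```python
# Illustrative sketch (Python, not Accord.NET): how a pair of boolean
# prediction/ground-truth vectors reduces to the four counts that a
# binary confusion matrix stores.

def count_confusion(predicted, expected):
    """Return (tp, fn, fp, tn) for two equal-length boolean sequences."""
    tp = sum(p and e for p, e in zip(predicted, expected))
    tn = sum((not p) and (not e) for p, e in zip(predicted, expected))
    fp = sum(p and (not e) for p, e in zip(predicted, expected))
    fn = sum((not p) and e for p, e in zip(predicted, expected))
    return tp, fn, fp, tn

predicted = [True, True, False, True, False]
expected  = [True, False, False, True, True]
print(count_confusion(predicted, expected))  # (2, 1, 1, 1)
```

The four returned counts correspond to the TruePositives, FalseNegatives, FalsePositives and TrueNegatives properties described below.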
Properties

Name | Description
---|---
Accuracy | Accuracy, or raw performance of the system.
ActualNegatives | Gets the number of actual negatives.
ActualPositives | Gets the number of actual positives.
ChanceAgreement | Chance agreement.
ChiSquare | Gets the Chi-Square statistic for the contingency table.
ColumnTotals | Gets the marginal sums for table columns.
Diagonal | Gets the diagonal of the confusion matrix.
Efficiency | Efficiency, the arithmetic mean of sensitivity and specificity.
Error | Error rate, or 1 - accuracy.
Errors | Gets the number of errors between the expected and predicted values.
ExpectedValues | Expected values, or values that could have been generated just by chance.
FalseDiscoveryRate | False Discovery Rate, or the expected false positive rate.
FalseNegatives | Cases incorrectly identified by the system as negatives.
FalsePositiveRate | False Positive Rate, also known as false alarm rate.
FalsePositives | Cases incorrectly identified by the system as positives.
FScore | F-score, the harmonic mean of precision and recall.
GeometricAgreement | Geometric agreement.
Hits | Gets the number of hits between the expected and predicted values.
Kappa | Kappa coefficient.
Matrix | Gets the confusion matrix in count matrix form.
MatthewsCorrelationCoefficient | Matthews Correlation Coefficient, also known as Phi coefficient.
NegativePredictiveValue | Negative Predictive Value, also known as Negative Precision.
NormalizedMutualInformation | Normalized Mutual Information.
NumberOfClasses | Gets the number of classes in this decision problem.
NumberOfSamples | Gets the number of observations for this matrix.
OddsRatio | Odds-ratio.
OverallAgreement | Overall agreement.
OverallDiagnosticPower | Diagnostic power.
Pearson | Pearson's contingency coefficient C.
PositivePredictiveValue | Positive Predictive Value, also known as Positive Precision.
Precision | Precision, same as the PositivePredictiveValue.
PredictedNegatives | Gets the number of predicted negatives.
PredictedPositives | Gets the number of predicted positives.
Prevalence | Prevalence of outcome occurrence.
Recall | Recall, same as the Sensitivity.
RowTotals | Gets the marginal sums for table rows.
Samples | Obsolete. Gets the number of observations for this matrix.
Sensitivity | Sensitivity, also known as True Positive Rate.
Specificity | Specificity, also known as True Negative Rate.
StandardError | Gets the standard error of the Kappa coefficient of performance.
StandardErrorUnderNull | Gets the standard error of the Kappa under the null hypothesis that the underlying Kappa value is 0.
TrueNegatives | Cases correctly identified by the system as negatives.
TruePositives | Cases correctly identified by the system as positives.
Variance | Gets the variance of the Kappa coefficient of performance.
VarianceUnderNull | Gets the variance of the Kappa under the null hypothesis that the underlying Kappa value is 0.
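Many of the properties above follow directly from the four counts of a binary confusion matrix via standard textbook formulas. The following is an illustrative sketch in Python, not Accord.NET code; the helper name `binary_measures` is hypothetical, but each formula matches the property's standard definition:

```python
# Illustrative sketch (Python, not Accord.NET): standard formulas behind
# several of the properties listed above, computed from the four counts.
import math

def binary_measures(tp, fn, fp, tn):
    n = tp + fn + fp + tn
    measures = {
        "Accuracy":    (tp + tn) / n,
        "Error":       (fp + fn) / n,          # 1 - accuracy
        "Sensitivity": tp / (tp + fn),         # recall, true positive rate
        "Specificity": tn / (tn + fp),         # true negative rate
        "Precision":   tp / (tp + fp),         # positive predictive value
        "NegativePredictiveValue": tn / (tn + fn),
        "FalsePositiveRate":       fp / (fp + tn),
        "FalseDiscoveryRate":      fp / (fp + tp),
        "Prevalence":  (tp + fn) / n,
        "OddsRatio":   (tp * tn) / (fp * fn),
        "MatthewsCorrelationCoefficient":
            (tp * tn - fp * fn) / math.sqrt(
                (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)),
    }
    # Efficiency is the arithmetic mean of sensitivity and specificity:
    measures["Efficiency"] = (measures["Sensitivity"]
                              + measures["Specificity"]) / 2
    return measures

m = binary_measures(tp=40, fn=10, fp=20, tn=30)
print(round(m["Accuracy"], 2))     # 0.7
print(round(m["Sensitivity"], 2))  # 0.8
```

Note that several denominators can be zero for degenerate matrices (for example, when a class never occurs); the library's own properties handle such cases, while this sketch does not.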
Methods

Name | Description
---|---
Combine | Combines several confusion matrices into one single matrix.
Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.)
Estimate&lt;TInput&gt;(IClassifier&lt;TInput, Boolean&gt;, TInput[], Boolean[]) | Estimates a ConfusionMatrix directly from a classifier, a set of inputs and its expected outputs.
Estimate&lt;TInput&gt;(IClassifier&lt;TInput, Int32&gt;, TInput[], Int32[]) | Estimates a ConfusionMatrix directly from a classifier, a set of inputs and its expected outputs.
Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)
GetHashCode | Serves as the default hash function. (Inherited from Object.)
GetType | Gets the Type of the current instance. (Inherited from Object.)
MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.)
ToGeneralMatrix | Converts this matrix into a GeneralConfusionMatrix.
ToString | Returns a String representing this confusion matrix. (Overrides Object.ToString.)
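The Combine method merges several confusion matrices into one, which amounts to summing their count matrices cell by cell. The following is an illustrative sketch in Python, not Accord.NET code; the helper name `combine` is hypothetical:

```python
# Illustrative sketch (Python, not Accord.NET): combining confusion
# matrices is an elementwise sum of their count matrices, e.g. when
# aggregating results over cross-validation folds.

def combine(*matrices):
    """Sum same-shaped count matrices (lists of lists) cell by cell."""
    rows, cols = len(matrices[0]), len(matrices[0][0])
    return [[sum(m[i][j] for m in matrices) for j in range(cols)]
            for i in range(rows)]

fold1 = [[3, 1], [0, 4]]
fold2 = [[2, 0], [1, 5]]
print(combine(fold1, fold2))  # [[5, 1], [1, 9]]
```

Ratio-based measures such as Kappa or Accuracy are then recomputed from the combined counts rather than averaged across the individual matrices.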
Extension Methods

Name | Description
---|---
HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To&lt;T&gt; | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
The first example shows how a confusion matrix can be constructed from a vector of expected (ground-truth) values and their associated predictions (as done by a test, procedure or machine learning classifier):
```csharp
// Let's say we have a decision problem involving 3 classes. In a typical
// machine learning problem, we have a set of expected, ground truth values:
int[] expected = { 0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2 };

// And we have a set of values that have been predicted by a machine model:
int[] predicted = { 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1 };

// We can get different performance measures to assess how good our model was at
// predicting the true, expected, ground-truth labels for the decision problem:
var cm = new GeneralConfusionMatrix(classes: 3, expected: expected, predicted: predicted);

// We can obtain the proper confusion matrix using:
int[,] matrix = cm.Matrix;

// The values of this matrix should be the same as:
int[,] expectedMatrix =
{
    //              expected
    /* predicted */ { 4, 0, 0 },
                    { 0, 4, 4 },
                    { 0, 0, 0 },
};

// We can get more information about our problem as well:
int classes = cm.NumberOfClasses; // should be 3
int samples = cm.NumberOfSamples; // should be 12

// And multiple performance measures:
double accuracy = cm.Accuracy;                     // should be 0.66666666666666663
double error = cm.Error;                           // should be 0.33333333333333337
double chanceAgreement = cm.ChanceAgreement;       // should be 0.33333333333333331
double geometricAgreement = cm.GeometricAgreement; // should be 0 (the classifier completely missed one class)
double pearson = cm.Pearson;                       // should be 0.70710678118654757
double kappa = cm.Kappa;                           // should be 0.49999999999999994
double tau = cm.Tau;                               // should be 0.49999999999999994
double chiSquare = cm.ChiSquare;                   // should be 12

// and some of their standard errors:
double kappaStdErr = cm.StandardError;            // should be 0.15590239111558091
double kappaStdErr0 = cm.StandardErrorUnderNull;  // should be 0.16666666666666663
```
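The Accuracy, ChanceAgreement and Kappa values in the example above can be re-derived by hand from the count matrix. The following is an illustrative cross-check in Python, not Accord.NET code:

```python
# Illustrative cross-check (Python, not Accord.NET): re-deriving the
# Accuracy, ChanceAgreement and Kappa values of the example above
# directly from the count matrix.
matrix = [[4, 0, 0],
          [0, 4, 4],
          [0, 0, 0]]

n = sum(sum(row) for row in matrix)           # 12 samples in total
po = sum(matrix[i][i] for i in range(3)) / n  # observed agreement (Accuracy)

# Marginal totals give the agreement expected just by chance:
row = [sum(matrix[i][j] for j in range(3)) for i in range(3)]
col = [sum(matrix[i][j] for i in range(3)) for j in range(3)]
pe = sum(row[i] * col[i] for i in range(3)) / n**2

# Kappa corrects the observed agreement for chance:
kappa = (po - pe) / (1 - pe)
print(round(po, 4), round(pe, 4), round(kappa, 4))  # 0.6667 0.3333 0.5
```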
The second example shows how to construct a binary confusion matrix directly from a classifier and a dataset:
```csharp
// Let's say we want to classify the following 2-dimensional
// data samples into 2 possible classes, either true or false:
double[][] inputs =
{
    new double[] { 10, 42 },
    new double[] { 162, 96 },
    new double[] { 125, 20 },
    new double[] { 96, 6 },
    new double[] { 2, 73 },
    new double[] { 52, 51 },
    new double[] { 71, 49 },
};

// And those are their associated class labels:
bool[] outputs = { false, false, true, true, false, false, true };

// We can create an AdaBoost algorithm as:
var learner = new AdaBoost<DecisionStump>()
{
    Learner = (p) => new ThresholdLearning(),

    // Train until:
    MaxIterations = 5,
    Tolerance = 1e-3
};

// Now, we can use the Learn method to learn a boosted classifier:
Boost<DecisionStump> classifier = learner.Learn(inputs, outputs);

// And we can test its performance using (error should be 0):
ConfusionMatrix cm = ConfusionMatrix.Estimate(classifier, inputs, outputs);

double error = cm.Error;  // should be 0.0
double acc = cm.Accuracy; // should be 1.0
double kappa = cm.Kappa;  // should be 1.0

// And compute a decision for a single data point using:
bool y = classifier.Decide(inputs[0]); // result should be false
```