LowerBoundNewtonRaphson Class

Lower-Bound Newton-Raphson for Multinomial logistic regression fitting.
Inheritance Hierarchy
System.Object
  Accord.Statistics.Models.Regression.Fitting.LowerBoundNewtonRaphson

Namespace:  Accord.Statistics.Models.Regression.Fitting
Assembly:  Accord.Statistics (in Accord.Statistics.dll) Version: 3.8.0
Syntax
public class LowerBoundNewtonRaphson : ISupervisedLearning<MultinomialLogisticRegression, double[], int>, 
	ISupervisedLearning<MultinomialLogisticRegression, double[], int[]>, ISupervisedLearning<MultinomialLogisticRegression, double[], double[]>, 
	IConvergenceLearning

The LowerBoundNewtonRaphson type exposes the following members.

Constructors
  Name  Description
Public method  LowerBoundNewtonRaphson
Initializes a new instance of the LowerBoundNewtonRaphson class.
Top
Properties
  Name  Description
Public property  ComputeStandardErrors
Gets or sets a value indicating whether standard errors should be computed in the next iteration.
Public property  CurrentIteration
Gets or sets the number of performed iterations.
Public property  Gradient
Gets the gradient vector computed in the last Newton-Raphson iteration.
Public property  HasConverged
Gets or sets whether the algorithm has converged.
Public property  HessianLowerBound
Gets the lower-bound matrix being used in place of the Hessian matrix in the Newton-Raphson iterations.
Public property  Iterations  Obsolete.
Please use MaxIterations instead.
Public property  MaximumChange
Gets the maximum parameter change in the last iteration. If this value is less than Tolerance, the algorithm has converged.
Public property  MaxIterations
Gets or sets the maximum number of iterations performed by the learning algorithm.
Public property  ParameterChange
Gets the vector of parameter updates in the last iteration.
Public property  Parameters
Gets the total number of parameters in the model.
Public property  Previous
Gets the coefficient values that were in place before the last learning iteration was performed.
Public property  Solution
Gets the current values of the coefficients.
Public property  Token
Gets or sets a cancellation token that can be used to stop the learning algorithm while it is running (see the sketch after this table).
Public property  Tolerance
Gets or sets the tolerance value used to determine whether the algorithm has converged.
Public property  UpdateLowerBound
Gets or sets a value indicating whether the lower bound should be updated using new data. Default is true.
Top
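
The convergence-related properties above can be combined with Token to bound or cancel a long-running fit. The following is only a minimal sketch: it assumes Token is a standard System.Threading.CancellationToken and that 'inputs' and 'outputs' are defined as in the example further below.

var cts = new System.Threading.CancellationTokenSource();
cts.CancelAfter(TimeSpan.FromSeconds(10)); // cooperative stop after at most 10 seconds

var lbnr = new LowerBoundNewtonRaphson()
{
    MaxIterations = 1000, // hard cap on the number of iterations
    Tolerance = 1e-8,     // considered converged once MaximumChange drops below this value
    Token = cts.Token     // allows the Learn call to be cancelled while it is running
};

MultinomialLogisticRegression mlr = lbnr.Learn(inputs, outputs);

// Inspect how the run ended:
Console.WriteLine(lbnr.HasConverged);     // whether the tolerance was reached
Console.WriteLine(lbnr.CurrentIteration); // number of iterations performed
Console.WriteLine(lbnr.MaximumChange);    // last maximum parameter change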
Methods
  Name  Description
Public method  Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
Protected method  Finalize
Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection.
(Inherited from Object.)
Public method  GetHashCode
Serves as the default hash function.
(Inherited from Object.)
Public method  GetType
Gets the Type of the current instance.
(Inherited from Object.)
Public method  Learn(Double[][], Double[][], Double[])
Learns a model that can map the given inputs to the given outputs (see the sketch after this table).
Public method  Learn(Double[][], Int32[], Double[])
Learns a model that can map the given inputs to the given outputs.
Public method  Learn(Double[][], Int32[][], Double[])
Learns a model that can map the given inputs to the given outputs.
Protected method  MemberwiseClone
Creates a shallow copy of the current Object.
(Inherited from Object.)
Public method  Run(Double[][], Double[][])  Obsolete.
Runs one iteration of the Lower-Bound Newton-Raphson algorithm.
Public method  Run(Double[][], Int32[])  Obsolete.
Runs one iteration of the Lower-Bound Newton-Raphson algorithm.
Public method  ToString
Returns a string that represents the current object.
(Inherited from Object.)
Top
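
Besides integer class labels (as in the example further below), the Learn overloads listed above also accept the expected outputs as double vectors, one per sample. A minimal sketch of that overload, assuming those vectors are interpreted as one-hot class indicators and reusing the 'inputs' array from the example below:

// One vector per sample; the position of the 1.0 marks the class label
double[][] oneHotOutputs =
{
    new[] { 1.0, 0.0, 0.0 }, // class 0
    new[] { 0.0, 0.0, 1.0 }, // class 2
    new[] { 1.0, 0.0, 0.0 }, // class 0
    new[] { 0.0, 1.0, 0.0 }, // class 1
    new[] { 0.0, 0.0, 1.0 }, // class 2
};

var lbnr = new LowerBoundNewtonRaphson()
{
    MaxIterations = 100,
    Tolerance = 1e-6
};

// Calls the Learn(Double[][], Double[][], Double[]) overload listed above
MultinomialLogisticRegression mlr = lbnr.Learn(inputs, oneHotOutputs);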
Extension Methods
  Name  Description
Public Extension Method  HasMethod
Checks whether an object implements a method with the given name (see the sketch after this table).
(Defined by ExtensionMethods.)
Public Extension Method  IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
Public Extension Method  To(Type)  Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Public Extension Method  To<T>  Overloaded.
Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Top
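
A small sketch of the extension methods listed above. This is only an illustration: the receiving namespaces are assumed to be the usual Accord and Accord.Math ones, and the exact conversion behavior of To<T> is as described in the table rather than verified here.

var lbnr = new LowerBoundNewtonRaphson();

// Reflection helper: does the object expose a method with this name?
bool hasLearn = lbnr.HasMethod("Learn"); // expected to be true

// Runtime conversion between types, e.g. from a boxed integer to a double:
object boxed = 42;
double value = boxed.To<double>();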
Remarks

The Lower Bound principle consists of replacing the second derivative matrix by a global lower bound in the Loewner ordering [Böhning, 1992]. In the case of multinomial logistic regression estimation, the Hessian of the negative log-likelihood function can be replaced by one of these lower bounds, leading to a monotonically converging sequence of iterates. Furthermore, [Krishnapuram, Carin, Figueiredo and Hartemink, 2005] have shown that a lower bound can be achieved which does not depend on the coefficients of the current iteration, as sketched below.
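
As a sketch, following the notation of the references below rather than this class's members (the symbols $B$, $X$ and $\beta$ are notation of this write-up, not API names): for a $K$-class model with design matrix $X$, the log-likelihood Hessian $H(\beta)$ admits a constant lower bound in the Loewner ordering,

$$ H(\beta) \succeq B, \qquad B = -\tfrac{1}{2}\left[\, I_{K-1} - \tfrac{1}{K}\,\mathbf{1}\,\mathbf{1}^{\mathsf T} \right] \otimes X^{\mathsf T} X , $$

and the lower-bound Newton-Raphson step replaces $H(\beta^{(t)})$ by $B$ in the usual Newton update,

$$ \beta^{(t+1)} = \beta^{(t)} - B^{-1}\, \nabla \ell\!\left(\beta^{(t)}\right) . $$

Because $B$ does not depend on $\beta$, it can be computed (and factorized) once and reused in every iteration, while the log-likelihood still increases monotonically.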

References:

  • B. Krishnapuram, L. Carin, M.A.T. Figueiredo, A. Hartemink. Sparse Multinomial Logistic Regression: Fast Algorithms and Generalization Bounds. 2005. Available at: http://www.lx.it.pt/~mtf/Krishnapuram_Carin_Figueiredo_Hartemink_2005.pdf
  • D. Böhning. Multinomial logistic regression algorithm. Annals of the Institute of Statistical Mathematics, 44(9):197–200, 1992.
  • Bishop, Christopher M. Pattern Recognition and Machine Learning. Springer, 1st ed., 2006.

Examples
// Assumed namespaces for this example (standard Accord.NET layout):
using Accord.Math;
using Accord.Math.Optimization.Losses;
using Accord.Statistics.Models.Regression;
using Accord.Statistics.Models.Regression.Fitting;

// Declare a very simple classification/regression
// problem with only 2 input variables (x and y):
double[][] inputs =
{
    new[] { 3.0, 1.0 },
    new[] { 7.0, 1.0 },
    new[] { 3.0, 1.1 },
    new[] { 3.0, 2.0 },
    new[] { 6.0, 1.0 },
};

// Class labels for each of the inputs
int[] outputs =
{
    0, 2, 0, 1, 2
};

// Create an estimation algorithm to estimate the regression
LowerBoundNewtonRaphson lbnr = new LowerBoundNewtonRaphson()
{
    MaxIterations = 100,
    Tolerance = 1e-6
};

// Now, we will iteratively estimate our model:
MultinomialLogisticRegression mlr = lbnr.Learn(inputs, outputs);

// We can compute the model answers
int[] answers = mlr.Decide(inputs);

// And also the probability of each of the answers
double[][] probabilities = mlr.Probabilities(inputs);

// Now we can check how good our model is at predicting
double error = new ZeroOneLoss(outputs).Loss(answers);

// We can also verify the classes with highest 
// probability are the ones being decided for:
int[] argmax = probabilities.ArgMax(dimension: 1); // should be the same as 'answers'
See Also