
OrdinaryLeastSquares Class

Least Squares learning algorithm for linear regression models.
Inheritance Hierarchy
System.Object
  Accord.Statistics.Models.Regression.Linear.OrdinaryLeastSquares

Namespace:  Accord.Statistics.Models.Regression.Linear
Assembly:  Accord.Statistics (in Accord.Statistics.dll) Version: 3.8.0
Syntax
public class OrdinaryLeastSquares : ISupervisedLearning<MultivariateLinearRegression, double[], double[]>, 
	ISupervisedLearning<MultipleLinearRegression, double[], double>, ISupervisedLearning<SimpleLinearRegression, double, double>
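
Implementing these three ISupervisedLearning interfaces means the same learner can fit a SimpleLinearRegression from scalar inputs and outputs, a MultipleLinearRegression from vector inputs with scalar outputs, or a MultivariateLinearRegression from vector inputs and vector outputs; the Examples section below demonstrates each case.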

The OrdinaryLeastSquares type exposes the following members.

Constructors
Name | Description
OrdinaryLeastSquares
Initializes a new instance of the OrdinaryLeastSquares class.
Properties
Name | Description
IsRobust
Gets or sets whether to always use a robust Least-Squares estimate using the SingularValueDecomposition. Default is false.
Token
Gets or sets a cancellation token that can be used to stop the learning algorithm while it is running.
UseIntercept
Gets or sets whether to include an intercept term in the learned models. Default is true.
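
The sketch below shows how these options might be combined when creating the learner; the cancellation token source is only an illustration of how Token can be used, not a requirement.

// A minimal configuration sketch (illustrative values, not the defaults):
var cancellation = new System.Threading.CancellationTokenSource();

var ols = new OrdinaryLeastSquares()
{
    IsRobust = true,             // always solve through a Singular Value Decomposition
    UseIntercept = true,         // learn an intercept term (this is already the default)
    Token = cancellation.Token   // lets a running Learn call be stopped from outside
};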
Methods
Name | Description
Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
Finalize (protected)
Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection.
(Inherited from Object.)
GetHashCode
Serves as the default hash function.
(Inherited from Object.)
GetInformationMatrix
GetType
Gets the Type of the current instance.
(Inherited from Object.)
Learn(Double[], Double[], Double[])
Learns a SimpleLinearRegression model that can map the given inputs to the given outputs.
Learn(Double[][], Double[], Double[])
Learns a MultipleLinearRegression model that can map the given inputs to the given outputs.
Learn(Double[][], Double[][], Double[])
Learns a MultivariateLinearRegression model that can map the given inputs to the given outputs.
MemberwiseClone (protected)
Creates a shallow copy of the current Object.
(Inherited from Object.)
ToString
Returns a string that represents the current object.
(Inherited from Object.)
Extension Methods
Name | Description
HasMethod
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)
IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
To(Type)
Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
To<T>
Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
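
As a rough sketch of how these extension methods can be used (assuming the Accord and Accord.Math namespaces are imported; the values are purely illustrative):

var ols = new OrdinaryLeastSquares();
bool hasLearn = ols.HasMethod("Learn");   // true: the learner exposes Learn methods

double[] a = { 1, 2, 3 };
double[] b = { 1, 2, 3 };
bool same = a.IsEqual(b);                 // true: elementwise comparison of the two vectors

object boxed = 42;
double x = boxed.To<double>();            // runtime conversion to a numeric type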
Examples

Let's say we have a univariate, continuous set of input data and a corresponding univariate, continuous set of output data, such as a set of points in R². A simple linear regression fits a line relating the input variable to the output variable such that the squared error between the line and the actual output points is minimized.

// Declare some sample test data.
double[] inputs = { 80, 60, 10, 20, 30 };
double[] outputs = { 20, 40, 30, 50, 60 };

// Use Ordinary Least Squares to learn the regression
OrdinaryLeastSquares ols = new OrdinaryLeastSquares();

// Use OLS to learn the simple linear regression
SimpleLinearRegression regression = ols.Learn(inputs, outputs);

// Compute the output for a given input:
double y = regression.Transform(85); // The answer will be 28.088

// We can also extract the slope and the intercept term
// for the line. Those will be -0.26 and 50.5, respectively.
double s = regression.Slope;     // -0.264706
double c = regression.Intercept; // 50.588235
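
// Note that the prediction above is simply the line equation applied to x = 85:
// y = s * x + c = (-0.264706 * 85) + 50.588235 ≈ 28.088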

The following example shows how to fit a multiple linear regression model to model a plane as an equation in the form ax + by + c = z.

// We will try to model a plane as an equation in the form
// "ax + by + c = z". We have two input variables (x and y)
// and we will be trying to find two parameters a and b and 
// an intercept term c.

// We will use Ordinary Least Squares to create a
// linear regression model with an intercept term
var ols = new OrdinaryLeastSquares()
{
    UseIntercept = true
};

// Now suppose you have some points
double[][] inputs =
{
    new double[] { 1, 1 },
    new double[] { 0, 1 },
    new double[] { 1, 0 },
    new double[] { 0, 0 },
};

// all located at the same height (z = 1)
double[] outputs = { 1, 1, 1, 1 };

// Use Ordinary Least Squares to estimate a regression model
MultipleLinearRegression regression = ols.Learn(inputs, outputs);

// As a result, we will be given the following:
double a = regression.Weights[0]; // a = 0
double b = regression.Weights[1]; // b = 0
double c = regression.Intercept;  // c = 1

// This is the plane described by the equation
// ax + by + c = z => 0x + 0y + 1 = z => 1 = z.

// We can compute the predicted points using
double[] predicted = regression.Transform(inputs);

// And the squared error loss using 
double error = new SquareLoss(outputs).Loss(predicted);

// We can also compute other measures, such as the coefficient of determination r²
double r2 = new RSquaredLoss(numberOfInputs: 2, expected: outputs).Loss(predicted); // should be 1

// We can also compute the adjusted or weighted versions of r² using
var r2loss = new RSquaredLoss(numberOfInputs: 2, expected: outputs)
{
    Adjust = true,
    // Weights = weights; // (if you have a weighted problem)
};

double ar2 = r2loss.Loss(predicted); // should be 1

// Alternatively, we can also use the less generic, but maybe more user-friendly method directly:
double ur2 = regression.CoefficientOfDetermination(inputs, outputs, adjust: true); // should be 1

The following example shows how to fit a multivariate linear regression model, producing multidimensional outputs for each input.

// The multivariate linear regression is a generalization of
// the multiple linear regression: not only the input variables,
// but also the output (dependent) variables, are multivariate.

// In the following example, we will perform a regression of
// a 2-dimensional output variable over a 3-dimensional input
// variable.

double[][] inputs =
{
    // variables:  x1  x2  x3
    new double[] {  1,  1,  1 }, // input sample 1
    new double[] {  2,  1,  1 }, // input sample 2
    new double[] {  3,  1,  1 }, // input sample 3
};

double[][] outputs =
{
    // variables:  y1  y2
    new double[] {  2,  3 }, // corresponding output to sample 1
    new double[] {  4,  6 }, // corresponding output to sample 2
    new double[] {  6,  9 }, // corresponding output to sample 3
};

// A quick visual inspection shows that the first output variable
// y1 is always twice the first input variable, and the second
// output variable y2 is always three times the first input
// variable. The other input variables are unused. Nevertheless,
// we will fit a multivariate regression model and confirm the
// validity of our impressions:

// Use Ordinary Least Squares to create the regression
OrdinaryLeastSquares ols = new OrdinaryLeastSquares();

// Now, compute the multivariate linear regression:
MultivariateLinearRegression regression = ols.Learn(inputs, outputs);

// We can obtain predictions using
double[][] predictions = regression.Transform(inputs);

// The prediction error is
double error = new SquareLoss(outputs).Loss(predictions); // 0

// At this point, the regression error will be 0 (the fit was
// perfect). The regression coefficients for the first input
// and first output variables will be 2. The coefficient for
// the first input and second output variables will be 3. All
// others will be 0.
// 
// regression.Coefficients should be the matrix given by
// 
// double[,] coefficients = {
//                              { 2, 3 },
//                              { 0, 0 },
//                              { 0, 0 },
//                          };
// 

// We can also check the r-squared coefficients of determination:
double[] r2 = regression.CoefficientOfDetermination(inputs, outputs);
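
// Since the model has two output variables and the fit above was perfect,
// r2 contains one coefficient of determination per output, both equal to 1.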
See Also