NonlinearLeastSquares Class
Namespace: Accord.Statistics.Models.Regression.Fitting
public class NonlinearLeastSquares : ISupervisedLearning<NonlinearRegression, double[], double>
The NonlinearLeastSquares type exposes the following members.
Constructors

Name | Description
---|---
NonlinearLeastSquares | Initializes a new instance of the NonlinearLeastSquares class.
NonlinearLeastSquares(NonlinearRegression) | Initializes a new instance of the NonlinearLeastSquares class.
NonlinearLeastSquares(NonlinearRegression, ILeastSquaresMethod) | Initializes a new instance of the NonlinearLeastSquares class.
Properties

Name | Description
---|---
Algorithm | Gets the least-squares optimization algorithm used to perform the actual learning.
ComputeStandardErrors | Gets or sets a value indicating whether standard errors should be computed in the next iteration.
Function | Gets or sets the model function, mapping inputs to outputs given a suitable parameter vector.
Gradient | Gets or sets a function that computes the gradient of the Function with respect to the current parameters.
NumberOfParameters | Gets or sets the number of variables (free parameters) in the non-linear model specified in Function.
StartValues | Gets or sets the vector of initial values to be used at the beginning of the optimization. Setting a suitable set of initial values can be important to achieve good convergence or to avoid poor local minima.
Token | Gets or sets a cancellation token that can be used to stop the learning algorithm while it is running (see the sketch after this table).
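To illustrate the Token property, the minimal sketch below assumes it accepts a standard System.Threading.CancellationToken, and that `nls`, `inputs` and `outputs` have already been configured as in the examples further down this page; it is an illustration, not a verbatim sample from the library documentation.

```csharp
using System.Threading;
using System.Threading.Tasks;

// Illustrative only: cancel a long-running fit from the calling thread.
// 'nls', 'inputs' and 'outputs' are assumed to be set up as in the
// Levenberg-Marquardt example below.
var cts = new CancellationTokenSource();
nls.Token = cts.Token;

// Run the learning procedure on a worker thread...
var fitting = Task.Run(() => nls.Learn(inputs, outputs));

// ...and request cancellation if it has not finished within 10 seconds.
cts.CancelAfter(10000);
```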
Methods

Name | Description
---|---
Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.)
Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)
GetHashCode | Serves as the default hash function. (Inherited from Object.)
GetType | Gets the Type of the current instance. (Inherited from Object.)
Learn | Learns a model that can map the given inputs to the given outputs (see the sketch after this table).
MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.)
Run | Obsolete. Runs the fitting algorithm.
ToString | Returns a string that represents the current object. (Inherited from Object.)
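As a brief, hedged illustration of Learn: the fragment below assumes the returned NonlinearRegression exposes its fitted parameter vector through a Coefficients property and can predict a single observation via Transform, and that `nls`, `inputs` and `outputs` are configured as in the examples below.

```csharp
// Learn fits the model and returns the trained NonlinearRegression.
var regression = nls.Learn(inputs, outputs);

// Assumption: the fitted parameter vector is exposed as Coefficients.
double[] w = regression.Coefficients;

// Predict the output for a single new input vector.
double y = regression.Transform(new double[] { 25.0 });
```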
Extension Methods

Name | Description
---|---
HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To\<T\>() | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
Examples

The first example shows how to fit a non-linear least squares problem using the LevenbergMarquardt algorithm.
```csharp
// Suppose we would like to map the integer values in the first
// column to the continuous values in the second column.
double[,] data =
{
    { -40, -21142.1111111111 },
    { -30, -21330.1111111111 },
    { -20, -12036.1111111111 },
    { -10,   7255.3888888889 },
    {   0,  32474.8888888889 },
    {  10,  32474.8888888889 },
    {  20,   9060.8888888889 },
    {  30, -11628.1111111111 },
    {  40, -15129.6111111111 },
};

// Extract inputs and outputs
double[][] inputs = data.GetColumn(0).ToJagged();
double[] outputs = data.GetColumn(1);

// Create a non-linear least squares learner
var nls = new NonlinearLeastSquares()
{
    NumberOfParameters = 3,

    // Initialize with some arbitrary starting values
    StartValues = new[] { 4.2, 0.3, 1 },

    // Let's assume a quadratic model function: ax² + bx + c
    Function = (w, x) => w[0] * x[0] * x[0] + w[1] * x[0] + w[2],

    // Derivatives of the model with respect to the weights:
    Gradient = (w, x, r) =>
    {
        r[0] = x[0] * x[0]; // w.r.t. a: x²  https://www.wolframalpha.com/input/?i=diff+ax²+%2B+bx+%2B+c+w.r.t.+a
        r[1] = x[0];        // w.r.t. b: x   https://www.wolframalpha.com/input/?i=diff+ax²+%2B+bx+%2B+c+w.r.t.+b
        r[2] = 1;           // w.r.t. c: 1   https://www.wolframalpha.com/input/?i=diff+ax²+%2B+bx+%2B+c+w.r.t.+c
    },

    Algorithm = new LevenbergMarquardt()
    {
        MaxIterations = 100,
        Tolerance = 0
    }
};

// Learn the regression model
var regression = nls.Learn(inputs, outputs);

// Use the fitted model to compute the predicted output values
double[] predict = regression.Transform(inputs);
```
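For reference, the values assigned to r above are simply the partial derivatives of the quadratic model with respect to its parameters:

$$
f(x; a, b, c) = a x^2 + b x + c,
\qquad
\frac{\partial f}{\partial a} = x^2, \quad
\frac{\partial f}{\partial b} = x, \quad
\frac{\partial f}{\partial c} = 1.
$$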
The second example shows how to fit a non-linear least squares problem using the GaussNewton algorithm.
```csharp
// Suppose we would like to map the values in the first row
// to the continuous values in the second row.
double[,] data =
{
    { 0.03, 0.1947, 0.425, 0.626,  1.253,  2.500,  3.740  },
    { 0.05, 0.127,  0.094, 0.2122, 0.2729, 0.2665, 0.3317 }
};

// Extract inputs and outputs
double[][] inputs = data.GetRow(0).ToJagged();
double[] outputs = data.GetRow(1);

// Create a non-linear least squares learner
var nls = new NonlinearLeastSquares()
{
    // Initialize with some arbitrary starting values
    StartValues = new[] { 0.9, 0.2 },

    // Let's assume a rational model function: ax / (b + x)
    Function = (w, x) => (w[0] * x[0]) / (w[1] + x[0]),

    // Derivatives of the model with respect to the weights:
    Gradient = (w, x, r) =>
    {
        r[0] = x[0] / (w[1] + x[0]);                       // w.r.t. a
        r[1] = -(w[0] * x[0]) / Math.Pow(w[1] + x[0], 2);  // w.r.t. b
    },

    Algorithm = new GaussNewton()
    {
        MaxIterations = 0,
        Tolerance = 1e-5
    }
};

// Learn the regression model
var regression = nls.Learn(inputs, outputs);

// Use the fitted model to compute the predicted output values
double[] predict = regression.Transform(inputs);
```
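Likewise, the Gradient delegate in this example implements the partial derivatives of the rational model:

$$
f(x; a, b) = \frac{a x}{b + x},
\qquad
\frac{\partial f}{\partial a} = \frac{x}{b + x}, \quad
\frac{\partial f}{\partial b} = -\frac{a x}{(b + x)^2}.
$$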