Class OLSMultipleLinearRegression

java.lang.Object
  org.apache.commons.math.stat.regression.AbstractMultipleLinearRegression
    org.apache.commons.math.stat.regression.OLSMultipleLinearRegression

All Implemented Interfaces:
  MultipleLinearRegression

public class OLSMultipleLinearRegression extends AbstractMultipleLinearRegression
Implements ordinary least squares (OLS) to estimate the parameters of a multiple linear regression model.

The regression coefficients, b, satisfy the normal equations:

  X^T X b = X^T y

To solve the normal equations, this implementation uses QR decomposition of the X matrix. (See QRDecompositionImpl for details on the decomposition algorithm.) The X matrix, also known as the design matrix, has rows corresponding to sample observations and columns corresponding to independent variables. When the model is estimated using an intercept term (i.e. when isNoIntercept is false, as it is by default), the X matrix includes an initial column identically equal to 1. We solve the normal equations as follows:

  X^T X b = X^T y
  (QR)^T (QR) b = (QR)^T y
  R^T (Q^T Q) R b = R^T Q^T y
  R^T R b = R^T Q^T y
  (R^T)^-1 R^T R b = (R^T)^-1 R^T Q^T y
  R b = Q^T y

Given Q and R, the last equation is solved by back-substitution.
- Since:
  2.0
- Version:
  $Revision: 1073464 $ $Date: 2011-02-22 20:35:02 +0100 (mar. 22 févr. 2011) $
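For illustration, a minimal usage sketch (the data values are arbitrary and the code assumes only the API described on this page): load a sample with the default intercept handling and read back the estimates.

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class OlsUsageSketch {
        public static void main(String[] args) {
            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();

            // One y value per observation; one x row per observation, one column
            // per independent variable. No column of 1's is supplied: with the
            // default isNoIntercept() == false, the design matrix gets the
            // initial unit column automatically.
            double[] y = {3.1, 4.9, 7.2, 8.8, 11.1, 12.9};
            double[][] x = {
                {1.0, 1.0},
                {2.0, 1.0},
                {3.0, 2.0},
                {4.0, 2.0},
                {5.0, 3.0},
                {6.0, 3.0}
            };
            regression.newSampleData(y, x);

            double[] beta = regression.estimateRegressionParameters();             // beta[0] is the intercept
            double[] se = regression.estimateRegressionParametersStandardErrors(); // one entry per coefficient
            double[] residuals = regression.estimateResiduals();
            double rSquared = regression.calculateRSquared();

            System.out.println("intercept = " + beta[0] + ", R^2 = " + rSquared);
        }
    }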
Field Summary
- Fields inherited from class org.apache.commons.math.stat.regression.AbstractMultipleLinearRegression
  X, Y
Constructor Summary
- OLSMultipleLinearRegression()
-
Method Summary
- double calculateAdjustedRSquared()
    Returns the adjusted R-squared statistic, defined by the formula R^2_adj = 1 - [SSR (n - 1)] / [SSTO (n - p)].
- protected RealVector calculateBeta()
    Calculates the regression coefficients using OLS.
- protected RealMatrix calculateBetaVariance()
    Calculates the variance-covariance matrix of the regression parameters.
- RealMatrix calculateHat()
    Compute the "hat" matrix.
- double calculateResidualSumOfSquares()
    Returns the sum of squared residuals.
- double calculateRSquared()
    Returns the R-Squared statistic, defined by the formula R^2 = 1 - SSR / SSTO.
- double calculateTotalSumOfSquares()
    Returns the sum of squared deviations of Y from its mean.
- void newSampleData(double[] y, double[][] x)
    Loads model x and y sample data, overriding any previous sample.
- void newSampleData(double[] data, int nobs, int nvars)
    Loads model x and y sample data from a flat input array, overriding any previous sample.
- protected void newXSampleData(double[][] x)
    Loads new x sample data, overriding any previous data.
- Methods inherited from class org.apache.commons.math.stat.regression.AbstractMultipleLinearRegression
calculateErrorVariance, calculateResiduals, calculateYVariance, estimateErrorVariance, estimateRegressandVariance, estimateRegressionParameters, estimateRegressionParametersStandardErrors, estimateRegressionParametersVariance, estimateRegressionStandardError, estimateResiduals, isNoIntercept, newYSampleData, setNoIntercept, validateCovarianceData, validateSampleData
Method Detail
-
newSampleData
public void newSampleData(double[] y, double[][] x)
Loads model x and y sample data, overriding any previous sample. Computes and caches the QR decomposition of the X matrix.
- Parameters:
  y - the [n,1] array representing the y sample
  x - the [n,k] array representing the x sample
- Throws:
  java.lang.IllegalArgumentException - if the x and y array data are not compatible for the regression
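As a short sketch of the compatibility requirement (the data values are arbitrary): y and x must describe the same number of observations, otherwise the call fails.

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class SampleShapeSketch {
        public static void main(String[] args) {
            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();

            double[] y = {1.0, 2.0, 3.0, 4.0};            // n = 4 observations
            double[][] x = {{1.0}, {2.0}, {3.0}, {4.0}};  // n = 4 rows, k = 1 variable
            regression.newSampleData(y, x);               // dimensions agree, sample is loaded

            double[] yShort = {1.0, 2.0, 3.0};
            try {
                regression.newSampleData(yShort, x);      // 3 y values vs. 4 x rows
            } catch (IllegalArgumentException e) {
                System.out.println("incompatible sample: " + e.getMessage());
            }
        }
    }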
-
newSampleData
public void newSampleData(double[] data, int nobs, int nvars)
Loads model x and y sample data from a flat input array, overriding any previous sample.
Assumes that rows are concatenated with y values first in each row. For example, an input data array containing the sequence of values (1, 2, 3, 4, 5, 6, 7, 8, 9) with nobs = 3 and nvars = 2 creates a regression dataset with two independent variables, as below:

  y   x[0]  x[1]
  --------------
  1    2     3
  4    5     6
  7    8     9

Note that there is no need to add an initial unitary column (column of 1's) when specifying a model including an intercept term. If AbstractMultipleLinearRegression.isNoIntercept() is true, the X matrix will be created without an initial column of "1"s; otherwise this column will be added.

Throws IllegalArgumentException if any of the following preconditions fail:
- data cannot be null
- data.length = nobs * (nvars + 1)
- nobs > nvars

This implementation computes and caches the QR decomposition of the X matrix.
- Overrides:
  newSampleData in class AbstractMultipleLinearRegression
- Parameters:
  data - input data array
  nobs - number of observations (rows)
  nvars - number of independent variables (columns, not counting y)
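As a hedged sketch of the layout just described, loading the flat array (1, ..., 9) with nobs = 3 and nvars = 2 corresponds to supplying the same three observations in 2-D form (the code only demonstrates the loading step):

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class FlatDataSketch {
        public static void main(String[] args) {
            // Each observation is (y, x[0], x[1]); three observations concatenated.
            double[] data = {1, 2, 3,   4, 5, 6,   7, 8, 9};

            OLSMultipleLinearRegression flat = new OLSMultipleLinearRegression();
            flat.newSampleData(data, 3, 2);     // nobs = 3, nvars = 2

            // The same sample expressed as separate y and x arrays.
            double[] y = {1, 4, 7};
            double[][] x = {{2, 3}, {5, 6}, {8, 9}};
            OLSMultipleLinearRegression twoD = new OLSMultipleLinearRegression();
            twoD.newSampleData(y, x);
        }
    }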
-
calculateHat
public RealMatrix calculateHat()
Compute the "hat" matrix.
The hat matrix is defined in terms of the design matrix X by X (X^T X)^-1 X^T.
The implementation here uses the QR decomposition to compute the hat matrix as Q I_p Q^T, where I_p is the p-dimensional identity matrix augmented by 0's. This computational formula is from "The Hat Matrix in Regression and ANOVA", David C. Hoaglin and Roy E. Welsch, The American Statistician, Vol. 32, No. 1 (Feb., 1978), pp. 17-22.
- Returns:
- the hat matrix
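One common use of the hat matrix, shown here as a hedged sketch (sample values are arbitrary): its diagonal entries are the leverages of the observations, and its trace equals the number of estimated parameters p.

    import org.apache.commons.math.linear.RealMatrix;
    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class HatMatrixSketch {
        public static void main(String[] args) {
            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            double[] y = {1.1, 1.9, 3.2, 3.9, 5.1};
            double[][] x = {{1}, {2}, {3}, {4}, {5}};
            regression.newSampleData(y, x);

            RealMatrix hat = regression.calculateHat();
            double trace = 0;
            for (int i = 0; i < hat.getRowDimension(); i++) {
                // Diagonal entry h_ii is the leverage of observation i.
                System.out.println("leverage[" + i + "] = " + hat.getEntry(i, i));
                trace += hat.getEntry(i, i);
            }
            // With an intercept and one regressor, the trace is p = 2 (up to rounding).
            System.out.println("trace = " + trace);
        }
    }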
-
calculateTotalSumOfSquares
public double calculateTotalSumOfSquares()
Returns the sum of squared deviations of Y from its mean.
If the model has no intercept term, 0 is used for the mean of Y - i.e., what is returned is the sum of the squared Y values.
The value returned by this method is the SSTO value used in the R-squared computation.
- Returns:
- SSTO - the total sum of squares
- Since:
- 2.2
- See Also:
AbstractMultipleLinearRegression.isNoIntercept()
-
calculateResidualSumOfSquares
public double calculateResidualSumOfSquares()
Returns the sum of squared residuals.
- Returns:
- residual sum of squares
- Since:
- 2.2
-
calculateRSquared
public double calculateRSquared()
Returns the R-Squared statistic, defined by the formula

  R^2 = 1 - SSR / SSTO

where SSR is the sum of squared residuals and SSTO is the total sum of squares.
- Returns:
- R-square statistic
- Since:
- 2.2
-
calculateAdjustedRSquared
public double calculateAdjustedRSquared()
Returns the adjusted R-squared statistic, defined by the formula

  R^2_adj = 1 - [SSR (n - 1)] / [SSTO (n - p)]

where SSR is the sum of squared residuals, SSTO is the total sum of squares, n is the number of observations and p is the number of parameters estimated (including the intercept).

If the regression is estimated without an intercept term, what is returned is

  1 - (1 - calculateRSquared()) * (n / (n - p))

- Returns:
- adjusted R-Squared statistic
- Since:
- 2.2
- See Also:
AbstractMultipleLinearRegression.isNoIntercept()
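A hedged sketch tying both statistics back to calculateResidualSumOfSquares() and calculateTotalSumOfSquares() (sample values are arbitrary):

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class RSquaredSketch {
        public static void main(String[] args) {
            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            double[] y = {2.1, 3.9, 6.2, 7.8, 10.1, 12.2};
            double[][] x = {{1, 1}, {2, 1}, {3, 2}, {4, 2}, {5, 3}, {6, 3}};
            regression.newSampleData(y, x);

            int n = y.length;        // observations
            int p = x[0].length + 1; // parameters, including the intercept

            double ssr = regression.calculateResidualSumOfSquares();
            double ssto = regression.calculateTotalSumOfSquares();

            double rSquared = 1 - ssr / ssto;                         // matches calculateRSquared()
            double adjusted = 1 - (ssr * (n - 1)) / (ssto * (n - p)); // matches calculateAdjustedRSquared()

            System.out.println(rSquared + " vs " + regression.calculateRSquared());
            System.out.println(adjusted + " vs " + regression.calculateAdjustedRSquared());
        }
    }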
-
newXSampleData
protected void newXSampleData(double[][] x)
Loads new x sample data, overriding any previous data.
The input x array should have one row for each sample observation, with columns corresponding to independent variables. For example, if x = new double[][] {{1, 2}, {3, 4}, {5, 6}}, then setXSampleData(x) results in a model with two independent variables and 3 observations:

  x[0]  x[1]
  ----------
   1     2
   3     4
   5     6
Note that there is no need to add an initial unitary column (column of 1's) when specifying a model including an intercept term.
This implementation computes and caches the QR decomposition of the X matrix once it is successfully loaded.
- Overrides:
  newXSampleData in class AbstractMultipleLinearRegression
- Parameters:
  x - the rectangular array representing the x sample
-
calculateBeta
protected RealVector calculateBeta()
Calculates the regression coefficients using OLS.
- Specified by:
  calculateBeta in class AbstractMultipleLinearRegression
- Returns:
- beta
-
calculateBetaVariance
protected RealMatrix calculateBetaVariance()
Calculates the variance-covariance matrix of the regression parameters.
  Var(b) = (X^T X)^-1

Uses QR decomposition to reduce (X^T X)^-1 to (R^T R)^-1, with only the top p rows of R included, where p = the length of the beta vector.
- Specified by:
  calculateBetaVariance in class AbstractMultipleLinearRegression
- Returns:
- The beta variance-covariance matrix
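calculateBetaVariance() is protected; from client code the same matrix is available through the inherited estimateRegressionParametersVariance(). A hedged sketch, assuming (as in the 2.2 API) that this matrix omits the error-variance factor, so each standard error is the square root of the error variance times the corresponding diagonal entry:

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class BetaVarianceSketch {
        public static void main(String[] args) {
            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            double[] y = {2.1, 3.9, 6.2, 7.8, 10.1, 12.2};
            double[][] x = {{1, 1}, {2, 1}, {3, 2}, {4, 2}, {5, 3}, {6, 3}};
            regression.newSampleData(y, x);

            double[][] var = regression.estimateRegressionParametersVariance(); // (X^T X)^-1
            double sigma2 = regression.estimateErrorVariance();                 // SSR / (n - p)

            double[] se = regression.estimateRegressionParametersStandardErrors();
            for (int i = 0; i < se.length; i++) {
                // se[i] should equal sqrt(sigma2 * var[i][i]) up to rounding.
                System.out.println(se[i] + " vs " + Math.sqrt(sigma2 * var[i][i]));
            }
        }
    }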