Ordinary Least Squares Regression

Usage

fOLS(y, X, c = 1L)

Arguments

y

Dependent variable matrix (T x 1), not a vector

X

Independent variables matrix (T x N)

c

Integer indicator for intercept (1 if intercept included, 0 otherwise)

Value

A list containing:

  • beta: Coefficient estimates ((N+1) x 1 if c=1, N x 1 if c=0)

  • fitted: Fitted values (T x 1)

  • err: Residuals (T x 1)

  • r2: R-squared statistic (scalar)

  • fitted_partial: Fitted values excluding intercept (T x 1)

Details

This function performs ordinary least squares (OLS) regression. The coefficient estimates are computed using the normal equations: $$\hat{\beta} = (X'X)^{-1}X'y$$
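The normal-equations formula above can be sketched directly in base R. This is an illustrative reconstruction, not `fOLS`'s actual internals (which may, for instance, use a QR decomposition for numerical stability):

```r
# Sketch: coefficient estimates via the normal equations, beta = (X'X)^{-1} X'y.
# With c = 1, an intercept column of ones is prepended to X (assumed behavior).
set.seed(1)
T <- 50; N <- 2
X <- matrix(rnorm(T * N), T, N)
y <- matrix(rnorm(T), T, 1)
Xc <- cbind(1, X)                                    # intercept column (c = 1)
beta_hat <- solve(crossprod(Xc), crossprod(Xc, y))   # (X'X)^{-1} X'y
# Agrees with R's reference implementation:
all.equal(c(beta_hat), unname(coef(lm(y ~ X))))
```

`crossprod(Xc)` computes `t(Xc) %*% Xc` without forming the transpose explicitly, and `solve(A, b)` solves the linear system rather than inverting `A`, which is both faster and better conditioned.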

The R-squared statistic measures the proportion of variance explained: $$R^2 = 1 - \frac{RSS}{TSS}$$ where RSS is the residual sum of squares and TSS is the total sum of squares.
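A short sketch of the R-squared formula, assuming an intercept is included so that TSS is computed about the mean of y (variable names here are illustrative, not `fOLS`'s internals):

```r
# Sketch: R^2 = 1 - RSS/TSS with an intercept in the model.
set.seed(2)
X <- matrix(rnorm(60), 30, 2)
y <- rnorm(30)
fit <- lm(y ~ X)
rss <- sum(residuals(fit)^2)     # residual sum of squares
tss <- sum((y - mean(y))^2)      # total sum of squares, centered
r2  <- 1 - rss / tss
# Matches the R^2 reported by summary.lm:
all.equal(r2, summary(fit)$r.squared)
```

Note that without an intercept the conventional TSS is uncentered (`sum(y^2)`), so the reported R-squared is not comparable across the c = 1 and c = 0 cases.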

If an intercept is included (c=1), fitted_partial contains the fitted values excluding the intercept contribution, useful for assessing the explanatory power of the regressors alone.
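The decomposition behind `fitted_partial` can be sketched as follows; this assumes the intercept coefficient is the first element of `beta` when c = 1, which matches the stated (N+1) x 1 layout:

```r
# Sketch: fitted values split into intercept and regressor contributions.
set.seed(3)
X <- matrix(rnorm(60), 30, 2)
y <- rnorm(30)
Xc <- cbind(1, X)
beta <- solve(crossprod(Xc), crossprod(Xc, y))
fitted_full    <- Xc %*% beta                       # intercept included
fitted_partial <- X %*% beta[-1, , drop = FALSE]    # intercept excluded
# The two differ by the constant intercept term:
all.equal(fitted_full, fitted_partial + beta[1])
```

Plotting `fitted_partial` against `y - beta[1]` is one way to judge how much of the variation the regressors alone capture.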

Examples

if (FALSE) { # \dontrun{
# Generate sample data
set.seed(123)
y <- rnorm(100)
X <- matrix(rnorm(200), 100, 2)

# OLS with intercept
result <- fOLS(y, X, c = 1)
print(result$beta)
print(result$r2)

# OLS without intercept
result_no_int <- fOLS(y, X, c = 0)
} # }