
Linear Regression for Prediction — OLS

OLS (ordinary least squares) fits the line that minimizes the sum of squared residuals. The resulting equation Ŷ = β₀ + β₁X lets you predict Y for any X within the sampled domain.
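The OLS coefficients have a simple closed form; a minimal sketch (the data below is illustrative, not from the page):

```python
import numpy as np

# Hypothetical sample: X = predictor, Y = response
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form OLS: beta1 = cov(X, Y) / var(X), beta0 = mean(Y) - beta1 * mean(X)
beta1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
beta0 = Y.mean() - beta1 * X.mean()

def predict(x):
    """Predicted value: Y-hat = beta0 + beta1 * x."""
    return beta0 + beta1 * x

print(beta0, beta1)   # fitted intercept and slope
print(predict(3.5))   # a prediction inside the sampled domain
```

The same fit can be obtained with `np.polyfit(X, Y, 1)`; the explicit formulas are shown here to mirror the definition above.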

How to interpret OLS coefficients

  • β₁ (slope): each additional unit of X changes the predicted Y by β₁ units.
  • β₀ (intercept): the expected value of Y when X = 0.
  • Standard error of the estimate: the typical spread of the observed points around the fitted line.
  • r²: the proportion of Y's variance explained by the model.
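The fit-quality measures above can be computed directly from the residuals; a sketch with illustrative data:

```python
import numpy as np

# Hypothetical sample and its OLS fit
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
beta1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
beta0 = Y.mean() - beta1 * X.mean()

residuals = Y - (beta0 + beta1 * X)
ss_res = np.sum(residuals ** 2)         # residual sum of squares
ss_tot = np.sum((Y - Y.mean()) ** 2)    # total sum of squares
r2 = 1 - ss_res / ss_tot                # proportion of variance explained
se = np.sqrt(ss_res / (len(X) - 2))     # standard error of the estimate (n - 2 dof)

print(r2, se)
```

Note the `n - 2` divisor in the standard error: two degrees of freedom are spent estimating β₀ and β₁.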

Examples

Sales prediction

Input
X = advertising investment (R$), Y = sales
Expected output
Ŷ = 500 + 3.2 × X

Each additional R$1 in advertising generates R$3.20 in predicted sales.

Height vs weight

Input
X = height (cm), Y = weight (kg)
Expected output
Ŷ = −105 + 1.0 × X

For 175 cm: Ŷ = −105 + 1.0 × 175 = 70 kg expected.

Full tool FAQ

Simple linear regression is a statistical model that describes the linear relationship between an independent variable (X) and a dependent variable (Y) as a line: Ŷ = β₀ + β₁X.

Frequently asked questions

Can I predict values far beyond the observed data?

Not recommended. Extrapolation beyond the sampled range assumes the linear relationship continues, which is often invalid. The model is only reliable within the observed domain.
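One practical safeguard is to refuse (or at least flag) predictions outside the observed X range. A minimal sketch, with assumed coefficients and range:

```python
# Observed X range of the training data (assumed values, for illustration)
X_MIN, X_MAX = 1.0, 5.0

def safe_predict(x, beta0=0.14, beta1=1.96):
    """Predict Y-hat = beta0 + beta1 * x, rejecting extrapolation."""
    if not (X_MIN <= x <= X_MAX):
        raise ValueError(f"x={x} is outside the sampled domain [{X_MIN}, {X_MAX}]")
    return beta0 + beta1 * x

print(safe_predict(3.0))   # fine: inside the observed range
# safe_predict(10.0)       # would raise ValueError: extrapolation
```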

Does this page replace official or professional review?

No. It helps explain the scenario and use the tool more safely, but real decisions should also consider official sources, full context, and qualified guidance when needed.