Elementary Regression

Paul E. Johnson

2015-02-18

This accompanies the Regression lecture.

Draw this, letting \(x\) range from 0 to 10:

\[ y = 2 + 1 x \]

Then add a line with a bigger intercept but the same slope:

\[ y = 4 + 1 x \]

And a line with the same intercept but a slope of zero:

\[ y = 2 + 0 x \]
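
If you would rather let the computer draw these, here is a minimal sketch in Python with matplotlib (my choice of tool; any plotting package would do the same job):

```python
# Draw the three lines from the exercise over x in [0, 10].
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 10, 101)

plt.plot(x, 2 + 1 * x, label="y = 2 + 1x")  # the baseline
plt.plot(x, 4 + 1 * x, label="y = 4 + 1x")  # same slope, bigger intercept
plt.plot(x, 2 + 0 * x, label="y = 2 + 0x")  # same intercept, zero slope
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.show()
```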

Line art of the Day

[Figure: Muenchen1, from the article below]

http://r4stats.com/articles/popularity

Idle question: when would you ever prefer a correlation to a regression?
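
One way to sharpen the question, in a Python sketch on simulated data (my own illustration): the correlation is symmetric and unit-free, while the regression slope is \(\hat{\beta}_1 = r \, s_y / s_x\), so it depends on direction and on units.

```python
# Correlation vs. regression slope, on simulated data.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 3 * x + rng.normal(size=200)

r = np.corrcoef(x, y)[0, 1]
slope_yx = r * y.std() / x.std()  # slope for y regressed on x
slope_xy = r * x.std() / y.std()  # slope for x regressed on y

print(f"r = {r:.3f} (same in either direction)")
print(f"slope of y on x = {slope_yx:.3f}")
print(f"slope of x on y = {slope_xy:.3f}")
print(f"r after rescaling x by 100: {np.corrcoef(100 * x, y)[0, 1]:.3f}")
```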

Here’s why I asked

[Figure: ConwayWhiteScatter1, from the article below]

http://www.dataists.com/2010/12/ranking-the-popularity-of-programming-langauges/

Comments

Why am I so lukewarm about \(R^2\)?

Categorical predictors…
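
One standard illustration in Python (a general aside, not necessarily the complaint intended here): keep the true line and the error variance fixed, and \(R^2\) still rises or falls with the spread of \(x\).

```python
# Same true line, same noise, different spread of x: R^2 changes anyway.
import numpy as np

rng = np.random.default_rng(1)

def r_squared(x_sd):
    x = rng.normal(scale=x_sd, size=500)
    y = 2 + 1 * x + rng.normal(scale=3, size=500)
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    yhat = y.mean() + b * (x - x.mean())
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

for sd in (1, 3, 10):
    print(f"sd(x) = {sd:2d} -> R^2 = {r_squared(sd):.2f}")
```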

Why do they call that a Linear Estimator?

\[ \hat{\beta}_{1}^{OLS} = \frac{\sum (x_i - \bar{x})(y_i - \bar{y})}{\sum (x_i - \bar{x})^2} \]
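
A quick numeric check of that formula, in Python with simulated data (my own example):

```python
# Verify the deviation formula for the OLS slope against a library fit.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=50)
y = 2 + 1 * x + rng.normal(size=50)

beta1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
print(beta1)
print(np.polyfit(x, y, 1)[0])  # same slope from numpy's polynomial fit
```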

Make that more obviously “Linear”

\[ \hat{\beta}_{1}^{OLS} =\sum_{i=1}^{N}\left(\frac{x_{i}-\bar{x}}{\sum(x_{i}-\bar{x})^{2}}\times y_{i}\right) \]

\[ SS_x = \sum (x_i - \bar{x})^2 \]

The terms \(\frac{x_{i}-\bar{x}}{SS_x}\) are weights, applied one-by-one to the \(y_{i}\) in the sum; the estimator is linear in the \(y_{i}\).
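
Those weights can be computed directly; a Python sketch with simulated data (again my own example):

```python
# Compute the OLS weights w_i = (x_i - xbar) / SS_x explicitly.
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=50)
y = 2 + 1 * x + rng.normal(size=50)

w = (x - x.mean()) / np.sum((x - x.mean()) ** 2)

print(np.sum(w * y))  # the slope estimate, as a weighted sum of the y_i
print(np.sum(w))      # the weights sum to 0 ...
print(np.sum(w * x))  # ... and applied to x they sum to 1
```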

Easier to see with centered data

\[ \hat{\beta}_{1}^{OLS}=\frac{\sum x_{i}y_{i}}{\sum x_{i}^{2}} \]

\[ \hat{\beta}_{1}^{OLS} = \sum\left(\frac{x_{i}}{\sum x_{i}^{2}}\times y_{i}\right) \]

Substituting \(SS_x\) for the denominator:

\[ \hat{\beta}_{1}^{OLS} = \sum\left(\frac{x_{i}}{SS_x}\times y_{i}\right) \]

\[ \hat{\beta}_{1}^{OLS} = \sum\left(h_{i} \times y_{i}\right), \quad \textrm{where } h_{i} = \frac{x_{i}}{SS_x} \]
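
The same point in a Python sketch (simulated data, my own example): center \(x\), form the weights \(h_i\), and the weighted sum of the \(y_i\) reproduces the slope.

```python
# Center x, build h_i = x_i / SS_x, and recover the OLS slope.
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, size=50)
y = 2 + 1 * x + rng.normal(size=50)

xc = x - x.mean()          # centered predictor
h = xc / np.sum(xc ** 2)   # h_i = x_i / SS_x, with x centered

print(np.sum(h * y))           # the slope, as a weighted sum of y
print(np.polyfit(x, y, 1)[0])  # matches the uncentered library fit
```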