Linear Regression

Linear regression analysis is a statistical technique for finding the linear function that best fits a given set of data. We can find the equation of the regression line using an algebraic method called the least squares method; most scientific calculators can also compute it directly. The linear regression equation is written \(\hat{y}=a+bx\) (we say "y-hat") or \(y=A+Bx\). Both are variations of the more familiar equation \(y=mx+c\).

The idea behind the least squares method is simple. Suppose we guess a line of best fit; then at every data point, we find the vertical distance between the data point and the line. If the line fitted the data perfectly, this distance would be zero for all the data points. The worse the fit, the larger the distances. We then square each of these distances and add them all together.

The best-fit line is then the line that minimises the sum of the squared distances.

Suppose we have a data set of \(n\) points \(\{({x}_{1};{y}_{1}),({x}_{2};{y}_{2}),\ldots,({x}_{n};{y}_{n})\}\). We also have a line \(f(x)=mx+c\) that we are trying to fit to the data. The vertical distance between the first data point and the line, for example, is

\(\text{distance}={y}_{1}-f({x}_{1})={y}_{1}-(m{x}_{1}+c)\)

We now square each of these distances and add them together. Let's call this sum \(S(m,c)\). Then we have that

\begin{align*} S(m,c) & = {({y}_{1}-f({x}_{1}))}^{2}+{({y}_{2}-f({x}_{2}))}^{2}+\ldots +{({y}_{n}-f({x}_{n}))}^{2} \\ & = \sum_{i=1}^{n}{({y}_{i}-f({x}_{i}))}^{2} \end{align*}
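The sum \(S(m,c)\) is straightforward to evaluate in code. Here is a minimal Python sketch; the data points and the two guessed lines are made up purely for illustration:

```python
def sum_squared_residuals(points, m, c):
    """Return S(m, c): the sum of squared vertical distances between
    each data point (x, y) and the line y = m*x + c."""
    return sum((y - (m * x + c)) ** 2 for x, y in points)

points = [(1, 2), (2, 3), (3, 5)]
print(sum_squared_residuals(points, 1.5, 0.5))  # 0.25 -- a good guess
print(sum_squared_residuals(points, 1.5, 0.0))  # 0.5  -- a worse guess
```

A smaller value of \(S(m,c)\) means the guessed line fits the data more closely.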

Thus our problem is to find the values of \(m\) and \(c\) such that \(S(m,c)\) is minimised. Let us call these minimising values \(b\) and \(a\) respectively. Then the line of best fit is \(f(x)=a+bx\). We can find \(a\) and \(b\) using calculus, but the derivation is tricky, so we simply state the result:

\begin{align*} b & = \cfrac{n{\sum }_{i=1}^{n}{x}_{i}{y}_{i}-{\sum }_{i=1}^{n}{x}_{i}{\sum }_{i=1}^{n}{y}_{i}}{n{\sum }_{i=1}^{n}{({x}_{i})}^{2}-{({\sum }_{i=1}^{n}{x}_{i})}^{2}} \\ a & = \cfrac{1}{n}\sum _{i=1}^{n}{y}_{i}-\cfrac{b}{n}\sum _{i=1}^{n}{x}_{i}=\bar{y}-b\bar{x} \end{align*}
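These formulas translate directly into code. The following Python sketch (the function name is our own, not from the text) computes \(b\) and \(a\) from lists of \(x\)- and \(y\)-values:

```python
def least_squares(xs, ys):
    """Return (a, b) for the least squares regression line
    y-hat = a + b*x, using the closed-form formulas above."""
    n = len(xs)
    sum_x, sum_y = sum(xs), sum(ys)
    sum_xy = sum(x * y for x, y in zip(xs, ys))
    sum_x2 = sum(x * x for x in xs)
    b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    a = sum_y / n - b * sum_x / n  # a = y-bar - b * x-bar
    return a, b

# Points lying exactly on y = 3 + 2x should be recovered exactly.
print(least_squares([0, 1, 2, 3], [3, 5, 7, 9]))  # (3.0, 2.0)
```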

Example

Question

In the table below, we have the records of the maintenance costs in rands compared with the age of the appliance in months. We have data for five appliances. Determine the equation for the least squares regression line by hand.

| Appliance | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| Age (\(x\)) | \(\text{5}\) | \(\text{10}\) | \(\text{15}\) | \(\text{20}\) | \(\text{30}\) |
| Cost (\(y\)) | \(\text{90}\) | \(\text{140}\) | \(\text{250}\) | \(\text{300}\) | \(\text{380}\) |

To apply the formulas, we first tabulate \(xy\) and \({x}^{2}\) for each data point and find the column totals:

| Appliance | \(x\) | \(y\) | \(xy\) | \({x}^{2}\) |
|---|---|---|---|---|
| 1 | \(\text{5}\) | \(\text{90}\) | \(\text{450}\) | \(\text{25}\) |
| 2 | \(\text{10}\) | \(\text{140}\) | \(\text{1 400}\) | \(\text{100}\) |
| 3 | \(\text{15}\) | \(\text{250}\) | \(\text{3 750}\) | \(\text{225}\) |
| 4 | \(\text{20}\) | \(\text{300}\) | \(\text{6 000}\) | \(\text{400}\) |
| 5 | \(\text{30}\) | \(\text{380}\) | \(\text{11 400}\) | \(\text{900}\) |
| Total | \(\text{80}\) | \(\text{1 160}\) | \(\text{23 000}\) | \(\text{1 650}\) |

\begin{align*} b & = \cfrac{n\sum xy-\sum x\sum y}{n\sum {x}^{2}-{(\sum x)}^{2}}=\cfrac{5\times 23000-80\times 1160}{5\times 1650-{80}^{2}}=12 \\ a & = \bar{y}-b\bar{x}=\cfrac{1160}{5}-\cfrac{12\times 80}{5}=40 \\ \therefore \hat{y}&= 40+12x \end{align*}

Example

Question

Using a calculator, find the equation of the least squares regression line for the following data:

| Days (\(x\)) | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| Growth in m (\(y\)) | \(\text{1.00}\) | \(\text{2.50}\) | \(\text{2.75}\) | \(\text{3.00}\) | \(\text{3.50}\) |

NB. If you have a CASIO calculator, do the next worked example first. Come back to this worked example once you are done and see if you get the same answer on your calculator.

Getting your calculator ready

Using your calculator, change the mode from normal to “Stat \(xy\) ”. Do this by pressing [2ndF] and then 2. This mode enables you to type in bivariate data.

Entering the data

Key in the data row by row:

| Enter: | Press: | Enter: | Press: | See: |
|---|---|---|---|---|
| 1 | \((x,y)\) | 1 | DATA | n = \(\text{1}\) |
| 2 | \((x,y)\) | \(\text{2.5}\) | DATA | n = \(\text{2}\) |
| 3 | \((x,y)\) | \(\text{2.75}\) | DATA | n = \(\text{3}\) |
| 4 | \((x,y)\) | \(\text{3.0}\) | DATA | n = \(\text{4}\) |
| 5 | \((x,y)\) | \(\text{3.5}\) | DATA | n = \(\text{5}\) |

Note: The [(\(x,y\))] button is the same as the [STO] button and the [DATA] button is the same as the [M+] button.

Getting regression results from the calculator

Ask for the values of the regression coefficients \(a\) and \(b\).

| Press: | Press: | See: |
|---|---|---|
| RCL | \(a\) | \(a=\text{0.9}\) |
| RCL | \(b\) | \(b=\text{0.55}\) |

\(\therefore \hat{y}=\text{0.9}+\text{0.55}x\)
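The calculator's answer can be double-checked against the formulas given earlier in this lesson; a short Python sketch:

```python
xs = [1, 2, 3, 4, 5]                 # Days (x)
ys = [1.00, 2.50, 2.75, 3.00, 3.50]  # Growth in m (y)
n = len(xs)

# Closed-form least squares formulas for b (slope) and a (intercept).
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) \
    / (n * sum(x * x for x in xs) - sum(xs) ** 2)
a = sum(ys) / n - b * sum(xs) / n  # a = y-bar - b * x-bar

print(round(a, 2), round(b, 2))  # 0.9 0.55, matching the calculator
```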

Example

Question

Using a calculator determine the least squares line of best fit for the following data set.

| Learner | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| Chemistry (\%) | \(\text{52}\) | \(\text{55}\) | \(\text{86}\) | \(\text{71}\) | \(\text{45}\) |
| Accounting (\%) | \(\text{48}\) | \(\text{64}\) | \(\text{95}\) | \(\text{79}\) | \(\text{50}\) |

For a Chemistry mark of \(\text{65}\%\), what mark does the least squares line predict for Accounting?

NB. If you have a SHARP calculator, ensure that you have done the previous worked example first. Once you have completed the previous worked example, attempt this example using your calculator and see if you get the same answer.

Getting your calculator ready

Switch on the calculator. Press [MODE] and then select STAT by pressing [2]. The following screen will appear:

| 1: \(1\text{-VAR}\) | 2: \(A+BX\) |
|---|---|
| 3: \(\_+CX^{2}\) | 4: \(\ln X\) |
| 5: \(e^{X}\) | 6: \(A\cdot B^{X}\) |
| 7: \(A\cdot X^{B}\) | 8: \(1/X\) |

Now press [2] for linear regression. Your screen should look something like this:

|  | \(x\) | \(y\) |
|---|---|---|
| 1 |  |  |
| 2 |  |  |
| 3 |  |  |

Entering the data

Key in 52 and then press [\(=\)] to enter the first mark under \(x\). Then enter the other values, in the same way, for the \(x\)-variable (the Chemistry marks) in the order in which they are given in the data set. Then move the cursor across and up and enter 48 under \(y\) opposite 52 in the \(x\)-column. Continue to enter the other \(y\)-values (the Accounting marks) in order so that they pair off correctly with the corresponding \(x\)-values.

|  | \(x\) | \(y\) |
|---|---|---|
| 1 | 52 |  |
| 2 | 55 |  |
| 3 |  |  |

Then press [AC]. The screen clears but the data remains stored.

Now press [SHIFT][1] to get the stats computations screen shown below.

| 1: Type | 2: Data |
|---|---|
| 3: Edit | 4: Sum |
| 5: Var | 6: MinMax |
| 7: Reg |  |

Choose Regression by pressing [7].

| 1: A | 2: B |
|---|---|
| 3: r | 4: \(\hat{x}\) |
| 5: \(\hat{y}\) |  |

Getting regression results from the calculator

  1. Press [1] and then [\(=\)] to get the value of the \(y\)-intercept: \(a=-\text{5.065} \ldots \approx -\text{5.07}\) (to two decimal places).

    To get the slope, use the key sequence [SHIFT][1][7][2][\(=\)]. The calculator gives \(b=\text{1.169} \ldots \approx \text{1.17}\) (to two decimal places).

    The equation of the regression line is thus:

    \(\hat{y}=-\text{5.07}+\text{1.17}x\)

  2. To predict the Accounting mark for a Chemistry mark of \(\text{65}\%\), press [AC][65][SHIFT][1][7][5][\(=\)].

    This gives a predicted Accounting mark of \(\text{70.94}\approx \text{71}\%\).
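As a check on the calculator output, here is a Python sketch reproducing the same computation from the closed-form formulas given earlier:

```python
chem = [52, 55, 86, 71, 45]  # Chemistry (%), the x-values
acc = [48, 64, 95, 79, 50]   # Accounting (%), the y-values
n = len(chem)

# Slope b and intercept a from the least squares formulas.
b = (n * sum(x * y for x, y in zip(chem, acc)) - sum(chem) * sum(acc)) \
    / (n * sum(x * x for x in chem) - sum(chem) ** 2)
a = sum(acc) / n - b * sum(chem) / n

print(round(a, 2), round(b, 2))  # -5.07 1.17
print(round(a + b * 65, 2))      # 70.94 -- predicted Accounting mark
```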

Now that we have a precise technique for finding the line of best fit, we still do not know how well that line actually fits our data. A least squares regression line can be fitted to any bivariate data set, even when the two variables show no linear relationship at all. If the fit is poor, predictions made from \(\hat{y}=a+bx\) may be unreliable. Next, we will learn about a quantitative measure of how well the line really fits our data.

This lesson is part of:

Statistics and Probability
