EXAMPLE

    • Example of simple linear regression, which has
      one independent variable.
Least Squares Estimation of b0, b1
    • b0: mean response when x = 0 (the y-intercept)
    • b1: change in the mean response when x increases
      by 1 unit (the slope)
    • b0 and b1 are unknown parameters (like μ)
    • b0 + b1x: mean response when the explanatory
      variable takes on the value x
    • Goal: choose values (estimates) that minimize
      the sum of squared errors (SSE) of the observed
      values about the straight line:
                  ŷ = b̂0 + b̂1 x

                  SSE = Σᵢ₌₁ⁿ (yᵢ − ŷᵢ)² = Σᵢ₌₁ⁿ (yᵢ − (b̂0 + b̂1 xᵢ))²
The least squares estimate of the slope
coefficient β1 of the true regression line is

          β̂1 = Σ(Xᵢ − X̄)(Yᵢ − Ȳ) / Σ(Xᵢ − X̄)²

The least squares estimate of the intercept
β0 of the true regression line is

          β̂0 = Ȳ − β̂1 X̄
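
The slope and intercept formulas above translate directly into code. A minimal sketch in plain Python (the function name `least_squares` is illustrative, not from the slides):

```python
def least_squares(xs, ys):
    """Return (b0, b1): the least-squares intercept and slope."""
    n = len(xs)
    x_bar = sum(xs) / n  # X-bar: mean of the explanatory variable
    y_bar = sum(ys) / n  # Y-bar: mean of the response
    # b1 = sum((Xi - X-bar)(Yi - Y-bar)) / sum((Xi - X-bar)^2)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    b1 = sxy / sxx
    b0 = y_bar - b1 * x_bar  # b0 = Y-bar - b1 * X-bar
    return b0, b1
```

For points lying exactly on a line, the estimates recover that line; for noisy data they give the line with the smallest SSE.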
X           Y
Temperature    Sales
          63      1.52
          70      1.68
          73       1.8
          75      2.05
          80      2.36
          82      2.25
          85      2.68
          88       2.9
          90      3.14
          91      3.06
          92      3.24
          75      1.92
          98       3.4
         100      3.28
          92      3.17
          87      2.83
          84      2.58
          88      2.86
          80      2.26
          82      2.14
          76      1.98
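
Applying the least squares formulas to the table above can be sketched as follows (plain Python; the variable names are illustrative):

```python
# Temperature (X) and ice cream sales (Y) from the table above.
temps = [63, 70, 73, 75, 80, 82, 85, 88, 90, 91, 92,
         75, 98, 100, 92, 87, 84, 88, 80, 82, 76]
sales = [1.52, 1.68, 1.80, 2.05, 2.36, 2.25, 2.68, 2.90, 3.14, 3.06, 3.24,
         1.92, 3.40, 3.28, 3.17, 2.83, 2.58, 2.86, 2.26, 2.14, 1.98]

n = len(temps)
x_bar = sum(temps) / n
y_bar = sum(sales) / n
# Slope: b1 = sum((Xi - X-bar)(Yi - Y-bar)) / sum((Xi - X-bar)^2)
b1 = (sum((x - x_bar) * (y - y_bar) for x, y in zip(temps, sales))
      / sum((x - x_bar) ** 2 for x in temps))
# Intercept: b0 = Y-bar - b1 * X-bar
b0 = y_bar - b1 * x_bar
# SSE of the fitted line
sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(temps, sales))
print(f"b0 = {b0:.3f}, b1 = {b1:.3f}, SSE = {sse:.3f}")
```

The slope comes out positive, as the scatter plot suggests: warmer days are associated with higher sales.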
Ice Cream Sales

[Scatter plot: Sales (y-axis, 0 to 4) versus Temperature (x-axis, 0 to 120)]
THANK YOU
