Free Statistics of Irreproducible Research!

Author's title:
Author: The author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 01 Dec 2008 12:44:44 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Dec/01/t1228160757gyd19w5vgcza2dc.htm/, Retrieved Sun, 05 May 2024 19:51:42 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=27263, Retrieved Sun, 05 May 2024 19:51:42 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 289
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
F     [Multiple Regression] [] [2007-11-19 19:55:31] [b731da8b544846036771bbf9bf2f34ce]
-   PD  [Multiple Regression] [Q3 Seatbelt law z...] [2008-11-24 16:42:23] [7d3039e6253bb5fb3b26df1537d500b4]
-    D      [Multiple Regression] [Q3 Multiple regre...] [2008-12-01 19:44:44] [35348cd8592af0baf5f138bd59921307] [Current]
-   PD        [Multiple Regression] [Q3 Multiple Regre...] [2008-12-01 19:51:08] [7d3039e6253bb5fb3b26df1537d500b4]
Dataseries X:
7,8	0
7,6	0
7,5	0
7,6	0
7,5	0
7,3	0
7,6	0
7,5	0
7,6	0
7,9	0
7,9	0
8,1	0
8,2	0
8,0	0
7,5	0
6,8	0
6,5	0
6,6	0
7,6	0
8,0	0
8,0	0
7,7	0
7,5	0
7,6	0
7,7	0
7,9	0
7,8	0
7,5	0
7,5	0
7,1	0
7,5	0
7,5	0
7,6	0
7,7	0
7,7	0
7,9	1
8,1	1
8,2	1
8,2	1
8,1	1
7,9	1
7,3	1
6,9	1
6,6	1
6,7	1
6,9	1
7,0	1
7,1	1
7,2	1
7,1	1
6,9	1
7,0	1
6,8	1
6,4	1
6,7	1
6,7	1
6,4	1
6,3	1
6,2	1
6,5	1
6,8	1
6,8	1
6,5	1
6,3	1
5,9	1
5,9	1
6,4	1
6,4	1
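
The two tab-separated columns above use decimal commas: the first column is the observed series and the second a 0/1 dummy (per the revision history, a seat-belt-law indicator). The lines below are a minimal sketch, not part of the original computation, for loading the block into R locally; the file name dataseries.txt and the column names y and x are assumptions chosen to match the variable names used in the output further down.

# Sketch (assumption): the data block above has been saved verbatim as 'dataseries.txt'
dat <- read.table('dataseries.txt', header = FALSE, sep = '\t',
                  dec = ',', col.names = c('y', 'x'))
str(dat)   # 68 observations: y = observed series, x = 0/1 dummy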




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 2 seconds
R Server: 'George Udny Yule' @ 72.249.76.132

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 2 seconds \tabularnewline
R Server & 'George Udny Yule' @ 72.249.76.132 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=27263&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]2 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'George Udny Yule' @ 72.249.76.132[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=27263&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=27263&T=0


Multiple Linear Regression - Estimated Regression Equation
y[t] = 7.58285714285715 - 0.670735930735931 x[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  7.58285714285715 -0.670735930735931x[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=27263&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] =  +  7.58285714285715 -0.670735930735931x[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=27263&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=27263&T=1
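
For reference, the fitted equation can be reproduced with a plain lm() call; this is a sketch assuming the data frame dat (columns y and x) from the sketch that follows the data series above. Because the only regressor is a 0/1 dummy, the intercept equals the mean of y when x = 0, and intercept plus slope equals the mean of y when x = 1.

mylm <- lm(y ~ x, data = dat)    # same fit as the module's lm(df) below
coef(mylm)                       # (Intercept) 7.5829, x -0.6707
mean(dat$y[dat$x == 0])          # 7.5829 = intercept
mean(dat$y[dat$x == 1])          # 6.9121 = intercept + slope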



Multiple Linear Regression - Ordinary Least Squares
Variable  Parameter  S.D.  T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  7.58285714285715  0.088579  85.6055  0  0
x  -0.670735930735931  0.127154  -5.275  2e-06  1e-06

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 7.58285714285715 & 0.088579 & 85.6055 & 0 & 0 \tabularnewline
x & -0.670735930735931 & 0.127154 & -5.275 & 2e-06 & 1e-06 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=27263&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]7.58285714285715[/C][C]0.088579[/C][C]85.6055[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]-0.670735930735931[/C][C]0.127154[/C][C]-5.275[/C][C]2e-06[/C][C]1e-06[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=27263&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=27263&T=2
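
The columns above follow the usual OLS conventions: S.D. is the standard error of the estimate, T-STAT is the estimate divided by its standard error, the 2-tail p-value comes from a t-distribution with 68 - 2 = 66 residual degrees of freedom, and the 1-tail p-value is half of that. A sketch of the arithmetic, assuming mylm from the sketch above:

est <- coef(summary(mylm))[, 'Estimate']
se  <- coef(summary(mylm))[, 'Std. Error']
est / se                             # t-statistics: 85.6055 and -5.275
2 * pt(-abs(est / se), df = 66)      # 2-tail p-values: ~0 and ~1.6e-06
pt(-abs(est / se), df = 66)          # 1-tail p-values (half the 2-tail values)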



Multiple Linear Regression - Regression Statistics
Multiple R: 0.54458024087934
R-squared: 0.2965676387562
Adjusted R-squared: 0.285909572676749
F-TEST (value): 27.8256520972388
F-TEST (DF numerator): 1
F-TEST (DF denominator): 66
p-value: 1.58036143083073e-06
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.524041199526332
Sum Squared Residuals: 18.1248658008658

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.54458024087934 \tabularnewline
R-squared & 0.2965676387562 \tabularnewline
Adjusted R-squared & 0.285909572676749 \tabularnewline
F-TEST (value) & 27.8256520972388 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 66 \tabularnewline
p-value & 1.58036143083073e-06 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.524041199526332 \tabularnewline
Sum Squared Residuals & 18.1248658008658 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=27263&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.54458024087934[/C][/ROW]
[ROW][C]R-squared[/C][C]0.2965676387562[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.285909572676749[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]27.8256520972388[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]1[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]66[/C][/ROW]
[ROW][C]p-value[/C][C]1.58036143083073e-06[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.524041199526332[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]18.1248658008658[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=27263&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=27263&T=3
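
All of these statistics come from the summary() of the fitted model, using the same expressions that appear further down in the module's R code (Multiple R as the square root of R-squared, the overall F-test p-value via pf(), and so on). A sketch, assuming mylm as above:

mysum <- summary(mylm)
sqrt(mysum$r.squared)                # Multiple R, 0.5446
mysum$r.squared                      # R-squared, 0.2966
mysum$adj.r.squared                  # Adjusted R-squared, 0.2859
mysum$fstatistic                     # F value 27.83 with 1 and 66 degrees of freedom
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])   # p-value, 1.58e-06
mysum$sigma                          # Residual Standard Deviation, 0.5240
sum(residuals(mylm)^2)               # Sum Squared Residuals, 18.125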



Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1  7.8  7.58285714285713  0.217142857142872
2  7.6  7.58285714285714  0.0171428571428566
3  7.5  7.58285714285714  -0.0828571428571432
4  7.6  7.58285714285714  0.0171428571428565
5  7.5  7.58285714285714  -0.0828571428571432
6  7.3  7.58285714285714  -0.282857142857143
7  7.6  7.58285714285714  0.0171428571428565
8  7.5  7.58285714285714  -0.0828571428571432
9  7.6  7.58285714285714  0.0171428571428565
10  7.9  7.58285714285714  0.317142857142857
11  7.9  7.58285714285714  0.317142857142857
12  8.1  7.58285714285714  0.517142857142856
13  8.2  7.58285714285714  0.617142857142856
14  8  7.58285714285714  0.417142857142857
15  7.5  7.58285714285714  -0.0828571428571432
16  6.8  7.58285714285714  -0.782857142857143
17  6.5  7.58285714285714  -1.08285714285714
18  6.6  7.58285714285714  -0.982857142857144
19  7.6  7.58285714285714  0.0171428571428565
20  8  7.58285714285714  0.417142857142857
21  8  7.58285714285714  0.417142857142857
22  7.7  7.58285714285714  0.117142857142857
23  7.5  7.58285714285714  -0.0828571428571432
24  7.6  7.58285714285714  0.0171428571428565
25  7.7  7.58285714285714  0.117142857142857
26  7.9  7.58285714285714  0.317142857142857
27  7.8  7.58285714285714  0.217142857142857
28  7.5  7.58285714285714  -0.0828571428571432
29  7.5  7.58285714285714  -0.0828571428571432
30  7.1  7.58285714285714  -0.482857142857144
31  7.5  7.58285714285714  -0.0828571428571432
32  7.5  7.58285714285714  -0.0828571428571432
33  7.6  7.58285714285714  0.0171428571428565
34  7.7  7.58285714285714  0.117142857142857
35  7.7  7.58285714285714  0.117142857142857
36  7.9  6.91212121212121  0.987878787878788
37  8.1  6.91212121212121  1.18787878787879
38  8.2  6.91212121212121  1.28787878787879
39  8.2  6.91212121212121  1.28787878787879
40  8.1  6.91212121212121  1.18787878787879
41  7.9  6.91212121212121  0.987878787878788
42  7.3  6.91212121212121  0.387878787878788
43  6.9  6.91212121212121  -0.0121212121212118
44  6.6  6.91212121212121  -0.312121212121212
45  6.7  6.91212121212121  -0.212121212121212
46  6.9  6.91212121212121  -0.0121212121212118
47  7  6.91212121212121  0.0878787878787879
48  7.1  6.91212121212121  0.187878787878788
49  7.2  6.91212121212121  0.287878787878788
50  7.1  6.91212121212121  0.187878787878788
51  6.9  6.91212121212121  -0.0121212121212118
52  7  6.91212121212121  0.0878787878787879
53  6.8  6.91212121212121  -0.112121212121212
54  6.4  6.91212121212121  -0.512121212121212
55  6.7  6.91212121212121  -0.212121212121212
56  6.7  6.91212121212121  -0.212121212121212
57  6.4  6.91212121212121  -0.512121212121212
58  6.3  6.91212121212121  -0.612121212121212
59  6.2  6.91212121212121  -0.712121212121212
60  6.5  6.91212121212121  -0.412121212121212
61  6.8  6.91212121212121  -0.112121212121212
62  6.8  6.91212121212121  -0.112121212121212
63  6.5  6.91212121212121  -0.412121212121212
64  6.3  6.91212121212121  -0.612121212121212
65  5.9  6.91212121212121  -1.01212121212121
66  5.9  6.91212121212121  -1.01212121212121
67  6.4  6.91212121212121  -0.512121212121212
68  6.4  6.91212121212121  -0.512121212121212

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 7.8 & 7.58285714285713 & 0.217142857142872 \tabularnewline
2 & 7.6 & 7.58285714285714 & 0.0171428571428566 \tabularnewline
3 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
4 & 7.6 & 7.58285714285714 & 0.0171428571428565 \tabularnewline
5 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
6 & 7.3 & 7.58285714285714 & -0.282857142857143 \tabularnewline
7 & 7.6 & 7.58285714285714 & 0.0171428571428565 \tabularnewline
8 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
9 & 7.6 & 7.58285714285714 & 0.0171428571428565 \tabularnewline
10 & 7.9 & 7.58285714285714 & 0.317142857142857 \tabularnewline
11 & 7.9 & 7.58285714285714 & 0.317142857142857 \tabularnewline
12 & 8.1 & 7.58285714285714 & 0.517142857142856 \tabularnewline
13 & 8.2 & 7.58285714285714 & 0.617142857142856 \tabularnewline
14 & 8 & 7.58285714285714 & 0.417142857142857 \tabularnewline
15 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
16 & 6.8 & 7.58285714285714 & -0.782857142857143 \tabularnewline
17 & 6.5 & 7.58285714285714 & -1.08285714285714 \tabularnewline
18 & 6.6 & 7.58285714285714 & -0.982857142857144 \tabularnewline
19 & 7.6 & 7.58285714285714 & 0.0171428571428565 \tabularnewline
20 & 8 & 7.58285714285714 & 0.417142857142857 \tabularnewline
21 & 8 & 7.58285714285714 & 0.417142857142857 \tabularnewline
22 & 7.7 & 7.58285714285714 & 0.117142857142857 \tabularnewline
23 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
24 & 7.6 & 7.58285714285714 & 0.0171428571428565 \tabularnewline
25 & 7.7 & 7.58285714285714 & 0.117142857142857 \tabularnewline
26 & 7.9 & 7.58285714285714 & 0.317142857142857 \tabularnewline
27 & 7.8 & 7.58285714285714 & 0.217142857142857 \tabularnewline
28 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
29 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
30 & 7.1 & 7.58285714285714 & -0.482857142857144 \tabularnewline
31 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
32 & 7.5 & 7.58285714285714 & -0.0828571428571432 \tabularnewline
33 & 7.6 & 7.58285714285714 & 0.0171428571428565 \tabularnewline
34 & 7.7 & 7.58285714285714 & 0.117142857142857 \tabularnewline
35 & 7.7 & 7.58285714285714 & 0.117142857142857 \tabularnewline
36 & 7.9 & 6.91212121212121 & 0.987878787878788 \tabularnewline
37 & 8.1 & 6.91212121212121 & 1.18787878787879 \tabularnewline
38 & 8.2 & 6.91212121212121 & 1.28787878787879 \tabularnewline
39 & 8.2 & 6.91212121212121 & 1.28787878787879 \tabularnewline
40 & 8.1 & 6.91212121212121 & 1.18787878787879 \tabularnewline
41 & 7.9 & 6.91212121212121 & 0.987878787878788 \tabularnewline
42 & 7.3 & 6.91212121212121 & 0.387878787878788 \tabularnewline
43 & 6.9 & 6.91212121212121 & -0.0121212121212118 \tabularnewline
44 & 6.6 & 6.91212121212121 & -0.312121212121212 \tabularnewline
45 & 6.7 & 6.91212121212121 & -0.212121212121212 \tabularnewline
46 & 6.9 & 6.91212121212121 & -0.0121212121212118 \tabularnewline
47 & 7 & 6.91212121212121 & 0.0878787878787879 \tabularnewline
48 & 7.1 & 6.91212121212121 & 0.187878787878788 \tabularnewline
49 & 7.2 & 6.91212121212121 & 0.287878787878788 \tabularnewline
50 & 7.1 & 6.91212121212121 & 0.187878787878788 \tabularnewline
51 & 6.9 & 6.91212121212121 & -0.0121212121212118 \tabularnewline
52 & 7 & 6.91212121212121 & 0.0878787878787879 \tabularnewline
53 & 6.8 & 6.91212121212121 & -0.112121212121212 \tabularnewline
54 & 6.4 & 6.91212121212121 & -0.512121212121212 \tabularnewline
55 & 6.7 & 6.91212121212121 & -0.212121212121212 \tabularnewline
56 & 6.7 & 6.91212121212121 & -0.212121212121212 \tabularnewline
57 & 6.4 & 6.91212121212121 & -0.512121212121212 \tabularnewline
58 & 6.3 & 6.91212121212121 & -0.612121212121212 \tabularnewline
59 & 6.2 & 6.91212121212121 & -0.712121212121212 \tabularnewline
60 & 6.5 & 6.91212121212121 & -0.412121212121212 \tabularnewline
61 & 6.8 & 6.91212121212121 & -0.112121212121212 \tabularnewline
62 & 6.8 & 6.91212121212121 & -0.112121212121212 \tabularnewline
63 & 6.5 & 6.91212121212121 & -0.412121212121212 \tabularnewline
64 & 6.3 & 6.91212121212121 & -0.612121212121212 \tabularnewline
65 & 5.9 & 6.91212121212121 & -1.01212121212121 \tabularnewline
66 & 5.9 & 6.91212121212121 & -1.01212121212121 \tabularnewline
67 & 6.4 & 6.91212121212121 & -0.512121212121212 \tabularnewline
68 & 6.4 & 6.91212121212121 & -0.512121212121212 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=27263&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]7.8[/C][C]7.58285714285713[/C][C]0.217142857142872[/C][/ROW]
[ROW][C]2[/C][C]7.6[/C][C]7.58285714285714[/C][C]0.0171428571428566[/C][/ROW]
[ROW][C]3[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]4[/C][C]7.6[/C][C]7.58285714285714[/C][C]0.0171428571428565[/C][/ROW]
[ROW][C]5[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]6[/C][C]7.3[/C][C]7.58285714285714[/C][C]-0.282857142857143[/C][/ROW]
[ROW][C]7[/C][C]7.6[/C][C]7.58285714285714[/C][C]0.0171428571428565[/C][/ROW]
[ROW][C]8[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]9[/C][C]7.6[/C][C]7.58285714285714[/C][C]0.0171428571428565[/C][/ROW]
[ROW][C]10[/C][C]7.9[/C][C]7.58285714285714[/C][C]0.317142857142857[/C][/ROW]
[ROW][C]11[/C][C]7.9[/C][C]7.58285714285714[/C][C]0.317142857142857[/C][/ROW]
[ROW][C]12[/C][C]8.1[/C][C]7.58285714285714[/C][C]0.517142857142856[/C][/ROW]
[ROW][C]13[/C][C]8.2[/C][C]7.58285714285714[/C][C]0.617142857142856[/C][/ROW]
[ROW][C]14[/C][C]8[/C][C]7.58285714285714[/C][C]0.417142857142857[/C][/ROW]
[ROW][C]15[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]16[/C][C]6.8[/C][C]7.58285714285714[/C][C]-0.782857142857143[/C][/ROW]
[ROW][C]17[/C][C]6.5[/C][C]7.58285714285714[/C][C]-1.08285714285714[/C][/ROW]
[ROW][C]18[/C][C]6.6[/C][C]7.58285714285714[/C][C]-0.982857142857144[/C][/ROW]
[ROW][C]19[/C][C]7.6[/C][C]7.58285714285714[/C][C]0.0171428571428565[/C][/ROW]
[ROW][C]20[/C][C]8[/C][C]7.58285714285714[/C][C]0.417142857142857[/C][/ROW]
[ROW][C]21[/C][C]8[/C][C]7.58285714285714[/C][C]0.417142857142857[/C][/ROW]
[ROW][C]22[/C][C]7.7[/C][C]7.58285714285714[/C][C]0.117142857142857[/C][/ROW]
[ROW][C]23[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]24[/C][C]7.6[/C][C]7.58285714285714[/C][C]0.0171428571428565[/C][/ROW]
[ROW][C]25[/C][C]7.7[/C][C]7.58285714285714[/C][C]0.117142857142857[/C][/ROW]
[ROW][C]26[/C][C]7.9[/C][C]7.58285714285714[/C][C]0.317142857142857[/C][/ROW]
[ROW][C]27[/C][C]7.8[/C][C]7.58285714285714[/C][C]0.217142857142857[/C][/ROW]
[ROW][C]28[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]29[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]30[/C][C]7.1[/C][C]7.58285714285714[/C][C]-0.482857142857144[/C][/ROW]
[ROW][C]31[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]32[/C][C]7.5[/C][C]7.58285714285714[/C][C]-0.0828571428571432[/C][/ROW]
[ROW][C]33[/C][C]7.6[/C][C]7.58285714285714[/C][C]0.0171428571428565[/C][/ROW]
[ROW][C]34[/C][C]7.7[/C][C]7.58285714285714[/C][C]0.117142857142857[/C][/ROW]
[ROW][C]35[/C][C]7.7[/C][C]7.58285714285714[/C][C]0.117142857142857[/C][/ROW]
[ROW][C]36[/C][C]7.9[/C][C]6.91212121212121[/C][C]0.987878787878788[/C][/ROW]
[ROW][C]37[/C][C]8.1[/C][C]6.91212121212121[/C][C]1.18787878787879[/C][/ROW]
[ROW][C]38[/C][C]8.2[/C][C]6.91212121212121[/C][C]1.28787878787879[/C][/ROW]
[ROW][C]39[/C][C]8.2[/C][C]6.91212121212121[/C][C]1.28787878787879[/C][/ROW]
[ROW][C]40[/C][C]8.1[/C][C]6.91212121212121[/C][C]1.18787878787879[/C][/ROW]
[ROW][C]41[/C][C]7.9[/C][C]6.91212121212121[/C][C]0.987878787878788[/C][/ROW]
[ROW][C]42[/C][C]7.3[/C][C]6.91212121212121[/C][C]0.387878787878788[/C][/ROW]
[ROW][C]43[/C][C]6.9[/C][C]6.91212121212121[/C][C]-0.0121212121212118[/C][/ROW]
[ROW][C]44[/C][C]6.6[/C][C]6.91212121212121[/C][C]-0.312121212121212[/C][/ROW]
[ROW][C]45[/C][C]6.7[/C][C]6.91212121212121[/C][C]-0.212121212121212[/C][/ROW]
[ROW][C]46[/C][C]6.9[/C][C]6.91212121212121[/C][C]-0.0121212121212118[/C][/ROW]
[ROW][C]47[/C][C]7[/C][C]6.91212121212121[/C][C]0.0878787878787879[/C][/ROW]
[ROW][C]48[/C][C]7.1[/C][C]6.91212121212121[/C][C]0.187878787878788[/C][/ROW]
[ROW][C]49[/C][C]7.2[/C][C]6.91212121212121[/C][C]0.287878787878788[/C][/ROW]
[ROW][C]50[/C][C]7.1[/C][C]6.91212121212121[/C][C]0.187878787878788[/C][/ROW]
[ROW][C]51[/C][C]6.9[/C][C]6.91212121212121[/C][C]-0.0121212121212118[/C][/ROW]
[ROW][C]52[/C][C]7[/C][C]6.91212121212121[/C][C]0.0878787878787879[/C][/ROW]
[ROW][C]53[/C][C]6.8[/C][C]6.91212121212121[/C][C]-0.112121212121212[/C][/ROW]
[ROW][C]54[/C][C]6.4[/C][C]6.91212121212121[/C][C]-0.512121212121212[/C][/ROW]
[ROW][C]55[/C][C]6.7[/C][C]6.91212121212121[/C][C]-0.212121212121212[/C][/ROW]
[ROW][C]56[/C][C]6.7[/C][C]6.91212121212121[/C][C]-0.212121212121212[/C][/ROW]
[ROW][C]57[/C][C]6.4[/C][C]6.91212121212121[/C][C]-0.512121212121212[/C][/ROW]
[ROW][C]58[/C][C]6.3[/C][C]6.91212121212121[/C][C]-0.612121212121212[/C][/ROW]
[ROW][C]59[/C][C]6.2[/C][C]6.91212121212121[/C][C]-0.712121212121212[/C][/ROW]
[ROW][C]60[/C][C]6.5[/C][C]6.91212121212121[/C][C]-0.412121212121212[/C][/ROW]
[ROW][C]61[/C][C]6.8[/C][C]6.91212121212121[/C][C]-0.112121212121212[/C][/ROW]
[ROW][C]62[/C][C]6.8[/C][C]6.91212121212121[/C][C]-0.112121212121212[/C][/ROW]
[ROW][C]63[/C][C]6.5[/C][C]6.91212121212121[/C][C]-0.412121212121212[/C][/ROW]
[ROW][C]64[/C][C]6.3[/C][C]6.91212121212121[/C][C]-0.612121212121212[/C][/ROW]
[ROW][C]65[/C][C]5.9[/C][C]6.91212121212121[/C][C]-1.01212121212121[/C][/ROW]
[ROW][C]66[/C][C]5.9[/C][C]6.91212121212121[/C][C]-1.01212121212121[/C][/ROW]
[ROW][C]67[/C][C]6.4[/C][C]6.91212121212121[/C][C]-0.512121212121212[/C][/ROW]
[ROW][C]68[/C][C]6.4[/C][C]6.91212121212121[/C][C]-0.512121212121212[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=27263&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=27263&T=4
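
In this table the Interpolation column is the fitted value for each observation and the Residuals column is the actual minus the fitted value; with a single 0/1 regressor the interpolation is piecewise constant (7.5829 while the dummy is 0, 6.9121 once it is 1). A short sketch, assuming mylm and dat as above:

interp <- fitted(mylm)               # 7.5829 for x = 0, 6.9121 for x = 1
resids <- residuals(mylm)            # prediction errors
all.equal(dat$y, as.numeric(interp + resids))   # TRUE: actual = interpolation + residual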




Parameters (Session):
par1 = 0 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 0 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)                 # needed for densityplot() below
par1 <- as.numeric(par1)         # column number of the endogenous series (0 = keep the pasted column order)
x <- t(y)                        # 'y' holds the pasted data; transpose so that rows = observations
k <- length(x[1,])               # number of variables
n <- length(x[,1])               # number of observations
x1 <- cbind(x[,par1], x[,1:k!=par1])                         # move the endogenous column to the front
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){        # optionally replace every series by its first differences (1-B)x
  x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
  for (i in 1:(n-1)) {                   # i runs over 1..(n-1); '1:n-1' would start the loop at 0
    for (j in 1:k) {
      x2[i,j] <- x[i+1,j] - x[i,j]
    }
  }
  x <- x2
}
if (par2 == 'Include Monthly Dummies'){  # add 11 monthly dummy variables M1..M11
  x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
  for (i in 1:11){
    x2[seq(i,n,12),i] <- 1
  }
  x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){  # add 3 quarterly dummy variables Q1..Q3
  x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
  for (i in 1:3){
    x2[seq(i,n,4),i] <- 1
  }
  x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){               # optionally append a linear trend variable t = 1..n
  x <- cbind(x, c(1:n))
  colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))                           # OLS: regress the first column on all remaining columns
(mysum <- summary(mylm))
bitmap(file='test0.png')                   # actuals (line) and interpolation (dots)
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)                  # fitted values = actuals minus residuals
grid()
dev.off()
bitmap(file='test1.png')                   # residuals against time
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')                   # residual histogram
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')                   # residual kernel density (lattice)
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')                   # residual normal Q-Q plot
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))            # residuals as a time series
bitmap(file='test5.png')                   # lag-1 scatter plot of the residuals
dum <- cbind(lag(myerror,k=1),myerror)     # pair each residual with its lagged value
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))                           # lowess smoother
abline(lm(z))                              # least-squares line through the lag plot
grid()
dev.off()
bitmap(file='test6.png')                   # residual autocorrelation function
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')                   # residual partial autocorrelation function
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')                   # standard lm() diagnostic plots in a 2 x 2 panel
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')                   # provides the table.start/table.element/table.save helpers
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]                     # build the estimated equation as a text string, term by term
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
  if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
  myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
  if (rownames(mysum$coefficients)[i] != '(Intercept)') {
    myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
    if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
  }
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
  a<-table.row.start(a)
  a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
  a<-table.element(a,mysum$coefficients[i,1])                  # parameter estimate
  a<-table.element(a, round(mysum$coefficients[i,2],6))        # standard error
  a<-table.element(a, round(mysum$coefficients[i,3],4))        # t-statistic
  a<-table.element(a, round(mysum$coefficients[i,4],6))        # 2-tail p-value
  a<-table.element(a, round(mysum$coefficients[i,4]/2,6))      # 1-tail p-value = half the 2-tail value
  a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))        # multiple R = square root of R-squared
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))   # p-value of the overall F-test
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))         # sum of squared residuals
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
  a<-table.row.start(a)
  a<-table.element(a,i, 1, TRUE)
  a<-table.element(a,x[i])                 # actual value (x[i] walks down the first column)
  a<-table.element(a,x[i]-mysum$resid[i])  # interpolation (fitted value)
  a<-table.element(a,mysum$resid[i])       # residual = actual - fitted
  a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
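
Note that the script expects several objects that the FreeStatistics server provides before it runs: y (the pasted data), par1, par2 and par3 (the session parameters listed above), and the table.start/table.element/... helpers loaded from the file 'createtable'. The lines below are a rough sketch of local stand-ins for those inputs; the file name is an assumption, and the table-building part will still require the server-side helpers.

# assumed stand-ins for the server-provided inputs (illustrative only)
par1 <- '0'
par2 <- 'Do not include Seasonal Dummies'
par3 <- 'No Linear Trend'
dat  <- read.table('dataseries.txt', header = FALSE, sep = '\t', dec = ',',
                   col.names = c('y', 'x'))
y    <- t(as.matrix(dat))            # the module starts with x <- t(y), so pass variables in rows
# with these definitions, the code from library(lattice) through summary(mylm) and the
# diagnostic plots runs as-is (the bitmap() device needs Ghostscript); the table.* calls
# still depend on the 'createtable' file from the server.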