Free Statistics


Author's title:
Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 24 Nov 2008 11:56:50 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/24/t1227553564bpe9c0aoy25skhk.htm/, Retrieved Tue, 14 May 2024 01:36:33 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25490, Retrieved Tue, 14 May 2024 01:36:33 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 149
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
F    D  [Multiple Regression] [Q1] [2008-11-24 18:32:39] [b47fceb71c9525e79a89b5fc6d023d0e]
F    D      [Multiple Regression] [Q3] [2008-11-24 18:56:50] [541f63fa3157af9df10fc4d202b2a90b] [Current]
Feedback Forum
2008-11-28 13:10:40 [Wim Golsteyn]
The p-values of M1/2/4/5 are too high, so these parameters are not reliable. M10/11 are borderline as well. The histogram and the density plot do show a reasonably normal distribution. It is not entirely clear to me what exactly this model is supposed to explain, but I assume it concerns the increase or decrease in the production of means of transport. If so, it appears that the least is produced in M7 and the most in M6. The model would explain 80% of the increase or decrease in production, but, as said before, some of the parameters are not reliable.

Dataseries X:
91,2	0
99,2	0
108,2	0
101,5	0
106,9	0
104,4	0
77,9	0
60	0
99,5	0
95	0
105,6	0
102,5	0
93,3	0
97,3	0
127	0
111,7	0
96,4	0
133	0
72,2	0
95,8	0
124,1	0
127,6	0
110,7	0
104,6	0
112,7	0
115,3	0
139,4	0
119	0
97,4	0
154	0
81,5	0
88,8	0
127,7	0
105,1	0
114,9	0
106,4	0
104,5	1
121,6	1
141,4	1
99	1
126,7	1
134,1	1
81,3	1
88,6	1
132,7	1
132,9	1
134,4	1
103,7	1
119,7	1
115	1
132,9	1
108,5	1
113,9	1
142	1
97,7	1
92,2	1
128,8	1
134,9	1
128,2	1
114,8	1
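Note that the series is entered with decimal commas and a tab-separated 0/1 regressor (the seatbelt-law dummy). A minimal base-R sketch for reading data in this format (the file name data.txt is only a placeholder, not part of the original computation):

X <- read.table("data.txt", sep = "\t", dec = ",",
                col.names = c("y", "x"))   # decimal commas, tab-separated, no header row
str(X)                                     # y = production series, x = 0/1 law dummy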




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'George Udny Yule' @ 72.249.76.132

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'George Udny Yule' @ 72.249.76.132 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25490&T=0









Multiple Linear Regression - Estimated Regression Equation
y[t] = + 86.6072222222222 -6.55555555555556x[t] + 4.72902777777782M1[t] + 9.5063888888889M2[t] + 28.9837500000000M3[t] + 6.52111111111113M4[t] + 6.21847222222221M5[t] + 30.8358333333334M6[t] -21.1668055555555M7[t] -18.8294444444445M8[t] + 18.0279166666667M9[t] + 13.9452777777778M10[t] + 12.9826388888889M11[t] + 0.622638888888888t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  86.6072222222222 -6.55555555555556x[t] +  4.72902777777782M1[t] +  9.5063888888889M2[t] +  28.9837500000000M3[t] +  6.52111111111113M4[t] +  6.21847222222221M5[t] +  30.8358333333334M6[t] -21.1668055555555M7[t] -18.8294444444445M8[t] +  18.0279166666667M9[t] +  13.9452777777778M10[t] +  12.9826388888889M11[t] +  0.622638888888888t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25490&T=1


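Outside the wrapper module, a fit of this form can be sketched directly in base R. This is only an illustrative reconstruction: y and x below are placeholder names for the observed series and the 0/1 regressor, and it assumes the series starts in January so that M12 (December) is the reference month, matching the dummy coding above.

n     <- length(y)
month <- ((seq_len(n) - 1) %% 12) + 1          # calendar month of each observation
M     <- outer(month, 1:11, "==") * 1          # 0/1 dummies M1..M11 (M12 is the reference)
colnames(M) <- paste0("M", 1:11)
trend <- seq_len(n)                            # linear trend ('t' in the equation above)
fit   <- lm(y ~ x + M + trend)
summary(fit)                                   # coefficients comparable to the OLS table below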







Multiple Linear Regression - Ordinary Least Squares
Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 86.6072222222222 | 5.52091 | 15.6871 | 0 | 0
x | -6.55555555555556 | 4.957923 | -1.3222 | 0.192626 | 0.096313
M1 | 4.72902777777782 | 6.154285 | 0.7684 | 0.446169 | 0.223085
M2 | 9.5063888888889 | 6.119237 | 1.5535 | 0.127152 | 0.063576
M3 | 28.9837500000000 | 6.087352 | 4.7613 | 1.9e-05 | 1e-05
M4 | 6.52111111111113 | 6.058682 | 1.0763 | 0.287395 | 0.143697
M5 | 6.21847222222221 | 6.033272 | 1.0307 | 0.308072 | 0.154036
M6 | 30.8358333333334 | 6.011162 | 5.1298 | 6e-06 | 3e-06
M7 | -21.1668055555555 | 5.992391 | -3.5323 | 0.00095 | 0.000475
M8 | -18.8294444444445 | 5.976988 | -3.1503 | 0.002866 | 0.001433
M9 | 18.0279166666667 | 5.964981 | 3.0223 | 0.00409 | 0.002045
M10 | 13.9452777777778 | 5.95639 | 2.3412 | 0.023609 | 0.011804
M11 | 12.9826388888889 | 5.951229 | 2.1815 | 0.034294 | 0.017147
t | 0.622638888888888 | 0.143123 | 4.3504 | 7.5e-05 | 3.7e-05

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 86.6072222222222 & 5.52091 & 15.6871 & 0 & 0 \tabularnewline
x & -6.55555555555556 & 4.957923 & -1.3222 & 0.192626 & 0.096313 \tabularnewline
M1 & 4.72902777777782 & 6.154285 & 0.7684 & 0.446169 & 0.223085 \tabularnewline
M2 & 9.5063888888889 & 6.119237 & 1.5535 & 0.127152 & 0.063576 \tabularnewline
M3 & 28.9837500000000 & 6.087352 & 4.7613 & 1.9e-05 & 1e-05 \tabularnewline
M4 & 6.52111111111113 & 6.058682 & 1.0763 & 0.287395 & 0.143697 \tabularnewline
M5 & 6.21847222222221 & 6.033272 & 1.0307 & 0.308072 & 0.154036 \tabularnewline
M6 & 30.8358333333334 & 6.011162 & 5.1298 & 6e-06 & 3e-06 \tabularnewline
M7 & -21.1668055555555 & 5.992391 & -3.5323 & 0.00095 & 0.000475 \tabularnewline
M8 & -18.8294444444445 & 5.976988 & -3.1503 & 0.002866 & 0.001433 \tabularnewline
M9 & 18.0279166666667 & 5.964981 & 3.0223 & 0.00409 & 0.002045 \tabularnewline
M10 & 13.9452777777778 & 5.95639 & 2.3412 & 0.023609 & 0.011804 \tabularnewline
M11 & 12.9826388888889 & 5.951229 & 2.1815 & 0.034294 & 0.017147 \tabularnewline
t & 0.622638888888888 & 0.143123 & 4.3504 & 7.5e-05 & 3.7e-05 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25490&T=2


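The t-statistics and tail probabilities in this table follow from the standard OLS formulas. As an illustrative check in R, using the reported estimate and standard deviation of x and the residual degrees of freedom 60 - 14 = 46:

est   <- -6.55555555555556
se    <- 4.957923
tstat <- est / se                  # -1.3222
df    <- 60 - 14                   # observations minus estimated parameters
p2    <- 2 * pt(-abs(tstat), df)   # 2-tail p-value, approx. 0.1926
p1    <- p2 / 2                    # 1-tail p-value, approx. 0.0963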







Multiple Linear Regression - Regression Statistics
Multiple R: 0.899877752596622
R-squared: 0.809779969618348
Adjusted R-squared: 0.756022134945272
F-TEST (value): 15.0634781803055
F-TEST (DF numerator): 13
F-TEST (DF denominator): 46
p-value: 1.92490468009510e-12
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 9.40699772936048
Sum Squared Residuals: 4070.61388888889

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.899877752596622 \tabularnewline
R-squared & 0.809779969618348 \tabularnewline
Adjusted R-squared & 0.756022134945272 \tabularnewline
F-TEST (value) & 15.0634781803055 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 46 \tabularnewline
p-value & 1.92490468009510e-12 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 9.40699772936048 \tabularnewline
Sum Squared Residuals & 4070.61388888889 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25490&T=3


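These statistics are mutually consistent; a short R check using only the values reported above:

ssr <- 4070.61388888889                  # Sum Squared Residuals
df  <- 46                                # F-TEST (DF denominator)
sqrt(ssr / df)                           # Residual Standard Deviation, approx. 9.407
r2  <- 0.809779969618348
sqrt(r2)                                 # Multiple R, approx. 0.8999
1 - (1 - r2) * (60 - 1) / df             # Adjusted R-squared, approx. 0.756
1 - pf(15.0634781803055, 13, 46)         # p-value of the overall F-test, approx. 1.9e-12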







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 | 91.2 | 91.9588888888887 | -0.758888888888741
2 | 99.2 | 97.3588888888889 | 1.84111111111113
3 | 108.2 | 117.458888888889 | -9.25888888888888
4 | 101.5 | 95.6188888888889 | 5.88111111111111
5 | 106.9 | 95.9388888888889 | 10.9611111111111
6 | 104.4 | 121.178888888889 | -16.7788888888889
7 | 77.9 | 69.7988888888889 | 8.10111111111114
8 | 60 | 72.758888888889 | -12.7588888888890
9 | 99.5 | 110.238888888889 | -10.7388888888889
10 | 95 | 106.778888888889 | -11.7788888888889
11 | 105.6 | 106.438888888889 | -0.838888888888912
12 | 102.5 | 94.0788888888889 | 8.42111111111111
13 | 93.3 | 99.4305555555556 | -6.1305555555556
14 | 97.3 | 104.830555555556 | -7.53055555555555
15 | 127 | 124.930555555556 | 2.06944444444444
16 | 111.7 | 103.090555555556 | 8.60944444444445
17 | 96.4 | 103.410555555556 | -7.01055555555554
18 | 133 | 128.650555555556 | 4.34944444444446
19 | 72.2 | 77.2705555555556 | -5.07055555555556
20 | 95.8 | 80.2305555555555 | 15.5694444444445
21 | 124.1 | 117.710555555556 | 6.38944444444444
22 | 127.6 | 114.250555555556 | 13.3494444444444
23 | 110.7 | 113.910555555556 | -3.21055555555555
24 | 104.6 | 101.550555555556 | 3.04944444444443
25 | 112.7 | 106.902222222222 | 5.79777777777773
26 | 115.3 | 112.302222222222 | 2.99777777777776
27 | 139.4 | 132.402222222222 | 6.99777777777778
28 | 119 | 110.562222222222 | 8.43777777777778
29 | 97.4 | 110.882222222222 | -13.4822222222222
30 | 154 | 136.122222222222 | 17.8777777777778
31 | 81.5 | 84.7422222222222 | -3.24222222222223
32 | 88.8 | 87.7022222222222 | 1.09777777777779
33 | 127.7 | 125.182222222222 | 2.51777777777779
34 | 105.1 | 121.722222222222 | -16.6222222222222
35 | 114.9 | 121.382222222222 | -6.48222222222221
36 | 106.4 | 109.022222222222 | -2.62222222222222
37 | 104.5 | 107.818333333333 | -3.31833333333338
38 | 121.6 | 113.218333333333 | 8.38166666666665
39 | 141.4 | 133.318333333333 | 8.08166666666666
40 | 99 | 111.478333333333 | -12.4783333333333
41 | 126.7 | 111.798333333333 | 14.9016666666667
42 | 134.1 | 137.038333333333 | -2.93833333333332
43 | 81.3 | 85.6583333333333 | -4.35833333333335
44 | 88.6 | 88.6183333333333 | -0.0183333333333177
45 | 132.7 | 126.098333333333 | 6.60166666666667
46 | 132.9 | 122.638333333333 | 10.2616666666667
47 | 134.4 | 122.298333333333 | 12.1016666666667
48 | 103.7 | 109.938333333333 | -6.23833333333333
49 | 119.7 | 115.29 | 4.40999999999997
50 | 115 | 120.69 | -5.69
51 | 132.9 | 140.79 | -7.89
52 | 108.5 | 118.95 | -10.45
53 | 113.9 | 119.27 | -5.36999999999999
54 | 142 | 144.51 | -2.50999999999998
55 | 97.7 | 93.13 | 4.56999999999999
56 | 92.2 | 96.09 | -3.88999999999997
57 | 128.8 | 133.57 | -4.76999999999998
58 | 134.9 | 130.11 | 4.79
59 | 128.2 | 129.77 | -1.57000000000001
60 | 114.8 | 117.41 | -2.61000000000000

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 91.2 & 91.9588888888887 & -0.758888888888741 \tabularnewline
2 & 99.2 & 97.3588888888889 & 1.84111111111113 \tabularnewline
3 & 108.2 & 117.458888888889 & -9.25888888888888 \tabularnewline
4 & 101.5 & 95.6188888888889 & 5.88111111111111 \tabularnewline
5 & 106.9 & 95.9388888888889 & 10.9611111111111 \tabularnewline
6 & 104.4 & 121.178888888889 & -16.7788888888889 \tabularnewline
7 & 77.9 & 69.7988888888889 & 8.10111111111114 \tabularnewline
8 & 60 & 72.758888888889 & -12.7588888888890 \tabularnewline
9 & 99.5 & 110.238888888889 & -10.7388888888889 \tabularnewline
10 & 95 & 106.778888888889 & -11.7788888888889 \tabularnewline
11 & 105.6 & 106.438888888889 & -0.838888888888912 \tabularnewline
12 & 102.5 & 94.0788888888889 & 8.42111111111111 \tabularnewline
13 & 93.3 & 99.4305555555556 & -6.1305555555556 \tabularnewline
14 & 97.3 & 104.830555555556 & -7.53055555555555 \tabularnewline
15 & 127 & 124.930555555556 & 2.06944444444444 \tabularnewline
16 & 111.7 & 103.090555555556 & 8.60944444444445 \tabularnewline
17 & 96.4 & 103.410555555556 & -7.01055555555554 \tabularnewline
18 & 133 & 128.650555555556 & 4.34944444444446 \tabularnewline
19 & 72.2 & 77.2705555555556 & -5.07055555555556 \tabularnewline
20 & 95.8 & 80.2305555555555 & 15.5694444444445 \tabularnewline
21 & 124.1 & 117.710555555556 & 6.38944444444444 \tabularnewline
22 & 127.6 & 114.250555555556 & 13.3494444444444 \tabularnewline
23 & 110.7 & 113.910555555556 & -3.21055555555555 \tabularnewline
24 & 104.6 & 101.550555555556 & 3.04944444444443 \tabularnewline
25 & 112.7 & 106.902222222222 & 5.79777777777773 \tabularnewline
26 & 115.3 & 112.302222222222 & 2.99777777777776 \tabularnewline
27 & 139.4 & 132.402222222222 & 6.99777777777778 \tabularnewline
28 & 119 & 110.562222222222 & 8.43777777777778 \tabularnewline
29 & 97.4 & 110.882222222222 & -13.4822222222222 \tabularnewline
30 & 154 & 136.122222222222 & 17.8777777777778 \tabularnewline
31 & 81.5 & 84.7422222222222 & -3.24222222222223 \tabularnewline
32 & 88.8 & 87.7022222222222 & 1.09777777777779 \tabularnewline
33 & 127.7 & 125.182222222222 & 2.51777777777779 \tabularnewline
34 & 105.1 & 121.722222222222 & -16.6222222222222 \tabularnewline
35 & 114.9 & 121.382222222222 & -6.48222222222221 \tabularnewline
36 & 106.4 & 109.022222222222 & -2.62222222222222 \tabularnewline
37 & 104.5 & 107.818333333333 & -3.31833333333338 \tabularnewline
38 & 121.6 & 113.218333333333 & 8.38166666666665 \tabularnewline
39 & 141.4 & 133.318333333333 & 8.08166666666666 \tabularnewline
40 & 99 & 111.478333333333 & -12.4783333333333 \tabularnewline
41 & 126.7 & 111.798333333333 & 14.9016666666667 \tabularnewline
42 & 134.1 & 137.038333333333 & -2.93833333333332 \tabularnewline
43 & 81.3 & 85.6583333333333 & -4.35833333333335 \tabularnewline
44 & 88.6 & 88.6183333333333 & -0.0183333333333177 \tabularnewline
45 & 132.7 & 126.098333333333 & 6.60166666666667 \tabularnewline
46 & 132.9 & 122.638333333333 & 10.2616666666667 \tabularnewline
47 & 134.4 & 122.298333333333 & 12.1016666666667 \tabularnewline
48 & 103.7 & 109.938333333333 & -6.23833333333333 \tabularnewline
49 & 119.7 & 115.29 & 4.40999999999997 \tabularnewline
50 & 115 & 120.69 & -5.69 \tabularnewline
51 & 132.9 & 140.79 & -7.89 \tabularnewline
52 & 108.5 & 118.95 & -10.45 \tabularnewline
53 & 113.9 & 119.27 & -5.36999999999999 \tabularnewline
54 & 142 & 144.51 & -2.50999999999998 \tabularnewline
55 & 97.7 & 93.13 & 4.56999999999999 \tabularnewline
56 & 92.2 & 96.09 & -3.88999999999997 \tabularnewline
57 & 128.8 & 133.57 & -4.76999999999998 \tabularnewline
58 & 134.9 & 130.11 & 4.79 \tabularnewline
59 & 128.2 & 129.77 & -1.57000000000001 \tabularnewline
60 & 114.8 & 117.41 & -2.61000000000000 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25490&T=4


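The Interpolation (Forecast) column is the in-sample fitted value and the Residuals (Prediction Error) column is actual minus fitted. Continuing the hypothetical fit object from the earlier sketch, both columns can be reproduced with:

interp <- fitted(fit)                                   # interpolation / in-sample forecast
err    <- y - interp                                    # identical to residuals(fit)
cbind(index = seq_along(y), actual = y, interp, err)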



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
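# arrange the input so that each series is a column and the dependent variable (column par1) comes first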
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
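# optional transformation: replace every series by its first differences (not selected in this run)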
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {  # parentheses needed: 1:n-1 would evaluate as (1:n)-1
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
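# seasonal dummies: add 0/1 columns M1..M11 for monthly data (used here) or Q1..Q3 for quarterly data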
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
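# append a linear trend column 't' when par3 = 'Linear Trend' (as in this run)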
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
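# diagnostic plots: actuals vs. interpolation, residuals, histogram, density, normal Q-Q, lag plot, ACF/PACF, and lm diagnostics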
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
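# build the output tables shown above: estimated regression equation, OLS estimates, regression/residual statistics, and actuals/interpolation/residuals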
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')