Free Statistics

Author: verified (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 24 Nov 2008 23:52:48 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/25/t1227596025hxy4dyq0nswztyp.htm/, Retrieved Thu, 09 May 2024 02:51:51 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25563, Retrieved Thu, 09 May 2024 02:51:51 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 200
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
F     [Multiple Regression] [] [2007-11-19 19:55:31] [b731da8b544846036771bbf9bf2f34ce]
F   PD    [Multiple Regression] [Multiple regressi...] [2008-11-25 06:52:48] [d592f629d96b926609f311957d74fcca] [Current]
Feedback Forum
2008-11-25 17:24:58 [Romina Machiels]
This question was answered correctly.
2008-11-30 15:09:23 [Joris Deboel]
This question has been answered correctly.
2008-11-30 20:18:08 [Yara Van Overstraeten]
The student made a correct analysis of his own data series here. Based on the multiple regression model, 91% of the results can be explained, which is a very high percentage.
2008-11-30 21:31:24 [Inge Meelberghs]
The student correctly applied the technique of Q1 and Q2 to his time series.

Dataseries X:
10413.00	0.00
10709.00	0.00
10662.00	0.00
10570.00	0.00
10297.00	0.00
10635.00	0.00
10872.00	0.00
10296.00	0.00
10383.00	0.00
10431.00	0.00
10574.00	0.00
10653.00	0.00
10805.00	0.00
10872.00	0.00
10625.00	0.00
10407.00	0.00
10463.00	0.00
10556.00	0.00
10646.00	0.00
10702.00	0.00
11353.00	0.00
11346.00	0.00
11451.00	0.00
11964.00	0.00
12574.00	0.00
13031.00	0.00
13812.00	0.00
14544.00	0.00
14931.00	0.00
14886.00	0.00
16005.00	1.00
17064.00	1.00
15168.00	1.00
16050.00	1.00
15839.00	1.00
15137.00	1.00
14954.00	0.00
15648.00	1.00
15305.00	1.00
15579.00	1.00
16348.00	1.00
15928.00	1.00
16171.00	1.00
15937.00	1.00
15713.00	1.00
15594.00	1.00
15683.00	1.00
16438.00	1.00
17032.00	1.00
17696.00	1.00
17745.00	1.00
19394.00	1.00
20148.00	1.00
20108.00	1.00
18584.00	1.00
18441.00	1.00
18391.00	1.00
19178.00	1.00
18079.00	1.00
18483.00	1.00
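
The series contains 60 monthly observations of the dependent variable together with a 0/1 indicator. A minimal sketch of how this block could be loaded into R for the analyses below; the file name and the column names y and x are assumptions (the names are taken from the estimated regression equation further down the page):

# Sketch (not part of the original computation): read the tab-separated Dataseries X block.
dat <- read.table("dataseries_x.txt", header = FALSE, col.names = c("y", "x"))
str(dat)   # 60 observations of 2 variables: y (level series) and x (0/1 indicator)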




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Herman Ole Andreas Wold' @ 193.190.124.10:1001

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Herman Ole Andreas Wold' @ 193.190.124.10:1001 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25563&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]6 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Herman Ole Andreas Wold' @ 193.190.124.10:1001[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25563&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25563&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = + 8706.52879120879 + 1685.19560439560x[t] + 766.647472527471M1[t] + 731.392967032967M2[t] + 636.177582417582M3[t] + 971.362197802197M4[t] + 1176.14681318681M5[t] + 1027.53142857143M6[t] + 589.676923076923M7[t] + 488.261538461538M8[t] + 68.0461538461537M9[t] + 252.430769230769M10[t] -75.9846153846156M11[t] + 133.815384615385t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  8706.52879120879 +  1685.19560439560x[t] +  766.647472527471M1[t] +  731.392967032967M2[t] +  636.177582417582M3[t] +  971.362197802197M4[t] +  1176.14681318681M5[t] +  1027.53142857143M6[t] +  589.676923076923M7[t] +  488.261538461538M8[t] +  68.0461538461537M9[t] +  252.430769230769M10[t] -75.9846153846156M11[t] +  133.815384615385t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25563&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] =  +  8706.52879120879 +  1685.19560439560x[t] +  766.647472527471M1[t] +  731.392967032967M2[t] +  636.177582417582M3[t] +  971.362197802197M4[t] +  1176.14681318681M5[t] +  1027.53142857143M6[t] +  589.676923076923M7[t] +  488.261538461538M8[t] +  68.0461538461537M9[t] +  252.430769230769M10[t] -75.9846153846156M11[t] +  133.815384615385t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25563&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25563&T=1
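
The estimated equation regresses y on the binary indicator x, eleven monthly dummies M1 through M11 (December acts as the reference month), and a linear trend t. A minimal sketch of how the same coefficients could be reproduced directly with lm(), assuming dat from the data-loading sketch near the top of the page (variable names are assumptions):

# Sketch reproducing the fit; December (month 12) is the reference level, matching M1..M11.
dat$month <- factor(((seq_len(nrow(dat)) - 1) %% 12) + 1, levels = c(12, 1:11))
dat$trend <- seq_len(nrow(dat))
fit <- lm(y ~ x + month + trend, data = dat)
coef(fit)   # intercept, x, month1..month11 (= M1..M11), trend (= t), as in the equation above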








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	8706.52879120879	513.160585	16.9665	0	0
x	1685.19560439560	485.25236	3.4728	0.001133	0.000566
M1	766.647472527471	606.237367	1.2646	0.212385	0.106192
M2	731.392967032967	602.272851	1.2144	0.230797	0.115399
M3	636.177582417582	601.163179	1.0582	0.29547	0.147735
M4	971.362197802197	600.369301	1.6179	0.112513	0.056256
M5	1176.14681318681	599.892469	1.9606	0.055998	0.027999
M6	1027.53142857143	599.733441	1.7133	0.093388	0.046694
M7	589.676923076923	601.580112	0.9802	0.332108	0.166054
M8	488.261538461538	600.151367	0.8136	0.420086	0.210043
M9	68.0461538461537	599.037766	0.1136	0.910055	0.455028
M10	252.430769230769	598.241067	0.422	0.675023	0.337512
M11	-75.9846153846156	597.762537	-0.1271	0.899403	0.449702
t	133.815384615385	13.812103	9.6883	0	0

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 8706.52879120879 & 513.160585 & 16.9665 & 0 & 0 \tabularnewline
x & 1685.19560439560 & 485.25236 & 3.4728 & 0.001133 & 0.000566 \tabularnewline
M1 & 766.647472527471 & 606.237367 & 1.2646 & 0.212385 & 0.106192 \tabularnewline
M2 & 731.392967032967 & 602.272851 & 1.2144 & 0.230797 & 0.115399 \tabularnewline
M3 & 636.177582417582 & 601.163179 & 1.0582 & 0.29547 & 0.147735 \tabularnewline
M4 & 971.362197802197 & 600.369301 & 1.6179 & 0.112513 & 0.056256 \tabularnewline
M5 & 1176.14681318681 & 599.892469 & 1.9606 & 0.055998 & 0.027999 \tabularnewline
M6 & 1027.53142857143 & 599.733441 & 1.7133 & 0.093388 & 0.046694 \tabularnewline
M7 & 589.676923076923 & 601.580112 & 0.9802 & 0.332108 & 0.166054 \tabularnewline
M8 & 488.261538461538 & 600.151367 & 0.8136 & 0.420086 & 0.210043 \tabularnewline
M9 & 68.0461538461537 & 599.037766 & 0.1136 & 0.910055 & 0.455028 \tabularnewline
M10 & 252.430769230769 & 598.241067 & 0.422 & 0.675023 & 0.337512 \tabularnewline
M11 & -75.9846153846156 & 597.762537 & -0.1271 & 0.899403 & 0.449702 \tabularnewline
t & 133.815384615385 & 13.812103 & 9.6883 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25563&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]8706.52879120879[/C][C]513.160585[/C][C]16.9665[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]1685.19560439560[/C][C]485.25236[/C][C]3.4728[/C][C]0.001133[/C][C]0.000566[/C][/ROW]
[ROW][C]M1[/C][C]766.647472527471[/C][C]606.237367[/C][C]1.2646[/C][C]0.212385[/C][C]0.106192[/C][/ROW]
[ROW][C]M2[/C][C]731.392967032967[/C][C]602.272851[/C][C]1.2144[/C][C]0.230797[/C][C]0.115399[/C][/ROW]
[ROW][C]M3[/C][C]636.177582417582[/C][C]601.163179[/C][C]1.0582[/C][C]0.29547[/C][C]0.147735[/C][/ROW]
[ROW][C]M4[/C][C]971.362197802197[/C][C]600.369301[/C][C]1.6179[/C][C]0.112513[/C][C]0.056256[/C][/ROW]
[ROW][C]M5[/C][C]1176.14681318681[/C][C]599.892469[/C][C]1.9606[/C][C]0.055998[/C][C]0.027999[/C][/ROW]
[ROW][C]M6[/C][C]1027.53142857143[/C][C]599.733441[/C][C]1.7133[/C][C]0.093388[/C][C]0.046694[/C][/ROW]
[ROW][C]M7[/C][C]589.676923076923[/C][C]601.580112[/C][C]0.9802[/C][C]0.332108[/C][C]0.166054[/C][/ROW]
[ROW][C]M8[/C][C]488.261538461538[/C][C]600.151367[/C][C]0.8136[/C][C]0.420086[/C][C]0.210043[/C][/ROW]
[ROW][C]M9[/C][C]68.0461538461537[/C][C]599.037766[/C][C]0.1136[/C][C]0.910055[/C][C]0.455028[/C][/ROW]
[ROW][C]M10[/C][C]252.430769230769[/C][C]598.241067[/C][C]0.422[/C][C]0.675023[/C][C]0.337512[/C][/ROW]
[ROW][C]M11[/C][C]-75.9846153846156[/C][C]597.762537[/C][C]-0.1271[/C][C]0.899403[/C][C]0.449702[/C][/ROW]
[ROW][C]t[/C][C]133.815384615385[/C][C]13.812103[/C][C]9.6883[/C][C]0[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25563&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25563&T=2
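
Each row reports the OLS estimate, its standard deviation (standard error), the t-statistic for H0: parameter = 0, and the two- and one-tail p-values; the one-tail value is simply half the two-tail value, which is how the module's own R code fills that column. A small sketch, assuming fit from the earlier sketch:

# Sketch: coefficient table plus the 1-tail p-values as reported above.
cf <- summary(fit)$coefficients                 # Estimate, Std. Error, t value, Pr(>|t|)
round(cbind(cf, "1-tail p" = cf[, 4] / 2), 6)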








Multiple Linear Regression - Regression Statistics
Multiple R: 0.96462604449019
R-squared: 0.93050340570879
Adjusted R-squared: 0.910863063843882
F-TEST (value): 47.3771491407375
F-TEST (DF numerator): 13
F-TEST (DF denominator): 46
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 944.8932174152
Sum Squared Residuals: 41069866.8465934

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.96462604449019 \tabularnewline
R-squared & 0.93050340570879 \tabularnewline
Adjusted R-squared & 0.910863063843882 \tabularnewline
F-TEST (value) & 47.3771491407375 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 46 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 944.8932174152 \tabularnewline
Sum Squared Residuals & 41069866.8465934 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25563&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.96462604449019[/C][/ROW]
[ROW][C]R-squared[/C][C]0.93050340570879[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.910863063843882[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]47.3771491407375[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]13[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]46[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]944.8932174152[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]41069866.8465934[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25563&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25563&T=3
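
With 13 regressors (x, eleven monthly dummies, and the trend) the model explains about 93% of the variance, roughly 91% after adjustment, and the F-test p-value is displayed as 0 because it falls below the display precision. These statistics can be read off summary() of the fitted model, mirroring the module's own code; a sketch, assuming fit as before:

# Sketch: regression and residual statistics from the fitted model.
s <- summary(fit)
sqrt(s$r.squared)                                # Multiple R
c(s$r.squared, s$adj.r.squared)                  # R-squared, adjusted R-squared
s$fstatistic                                     # F value and its numerator/denominator DF
pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3], lower.tail = FALSE)   # F-test p-value
s$sigma                                          # residual standard deviation
sum(resid(fit)^2)                                # sum of squared residuals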








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	10413	9606.99164835165	806.008351648347
2	10709	9705.55252747253	1003.44747252747
3	10662	9744.15252747253	917.847472527473
4	10570	10213.1525274725	356.847472527473
5	10297	10551.7525274725	-254.752527472527
6	10635	10536.9525274725	98.0474725274727
7	10872	10232.9134065934	639.086593406593
8	10296	10265.3134065934	30.6865934065935
9	10383	9978.9134065934	404.086593406593
10	10431	10297.1134065934	133.886593406593
11	10574	10102.5134065934	471.486593406593
12	10653	10312.3134065934	340.686593406593
13	10805	11212.7762637363	-407.776263736262
14	10872	11311.3371428571	-439.337142857142
15	10625	11349.9371428571	-724.937142857143
16	10407	11818.9371428571	-1411.93714285714
17	10463	12157.5371428571	-1694.53714285714
18	10556	12142.7371428571	-1586.73714285714
19	10646	11838.6980219780	-1192.69802197802
20	10702	11871.0980219780	-1169.09802197802
21	11353	11584.6980219780	-231.698021978022
22	11346	11902.8980219780	-556.898021978022
23	11451	11708.2980219780	-257.298021978022
24	11964	11918.0980219780	45.9019780219782
25	12574	12818.5608791209	-244.560879120878
26	13031	12917.1217582418	113.878241758241
27	13812	12955.7217582418	856.278241758242
28	14544	13424.7217582418	1119.27824175824
29	14931	13763.3217582418	1167.67824175824
30	14886	13748.5217582418	1137.47824175824
31	16005	15129.6782417582	875.321758241759
32	17064	15162.0782417582	1901.92175824176
33	15168	14875.6782417582	292.321758241759
34	16050	15193.8782417582	856.121758241758
35	15839	14999.2782417582	839.721758241758
36	15137	15209.0782417582	-72.0782417582415
37	14954	14424.3454945055	529.654505494506
38	15648	16208.1019780220	-560.101978021978
39	15305	16246.7019780220	-941.701978021979
40	15579	16715.7019780220	-1136.70197802198
41	16348	17054.3019780220	-706.301978021978
42	15928	17039.5019780220	-1111.50197802198
43	16171	16735.4628571429	-564.462857142857
44	15937	16767.8628571429	-830.862857142857
45	15713	16481.4628571429	-768.462857142857
46	15594	16799.6628571429	-1205.66285714286
47	15683	16605.0628571429	-922.062857142857
48	16438	16814.8628571429	-376.862857142857
49	17032	17715.3257142857	-683.325714285712
50	17696	17813.8865934066	-117.886593406593
51	17745	17852.4865934066	-107.486593406593
52	19394	18321.4865934066	1072.51340659341
53	20148	18660.0865934066	1487.91340659341
54	20108	18645.2865934066	1462.71340659341
55	18584	18341.2474725275	242.752527472527
56	18441	18373.6474725275	67.3525274725269
57	18391	18087.2474725275	303.752527472527
58	19178	18405.4474725275	772.552527472527
59	18079	18210.8474725275	-131.847472527473
60	18483	18420.6474725275	62.3525274725268

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 10413 & 9606.99164835165 & 806.008351648347 \tabularnewline
2 & 10709 & 9705.55252747253 & 1003.44747252747 \tabularnewline
3 & 10662 & 9744.15252747253 & 917.847472527473 \tabularnewline
4 & 10570 & 10213.1525274725 & 356.847472527473 \tabularnewline
5 & 10297 & 10551.7525274725 & -254.752527472527 \tabularnewline
6 & 10635 & 10536.9525274725 & 98.0474725274727 \tabularnewline
7 & 10872 & 10232.9134065934 & 639.086593406593 \tabularnewline
8 & 10296 & 10265.3134065934 & 30.6865934065935 \tabularnewline
9 & 10383 & 9978.9134065934 & 404.086593406593 \tabularnewline
10 & 10431 & 10297.1134065934 & 133.886593406593 \tabularnewline
11 & 10574 & 10102.5134065934 & 471.486593406593 \tabularnewline
12 & 10653 & 10312.3134065934 & 340.686593406593 \tabularnewline
13 & 10805 & 11212.7762637363 & -407.776263736262 \tabularnewline
14 & 10872 & 11311.3371428571 & -439.337142857142 \tabularnewline
15 & 10625 & 11349.9371428571 & -724.937142857143 \tabularnewline
16 & 10407 & 11818.9371428571 & -1411.93714285714 \tabularnewline
17 & 10463 & 12157.5371428571 & -1694.53714285714 \tabularnewline
18 & 10556 & 12142.7371428571 & -1586.73714285714 \tabularnewline
19 & 10646 & 11838.6980219780 & -1192.69802197802 \tabularnewline
20 & 10702 & 11871.0980219780 & -1169.09802197802 \tabularnewline
21 & 11353 & 11584.6980219780 & -231.698021978022 \tabularnewline
22 & 11346 & 11902.8980219780 & -556.898021978022 \tabularnewline
23 & 11451 & 11708.2980219780 & -257.298021978022 \tabularnewline
24 & 11964 & 11918.0980219780 & 45.9019780219782 \tabularnewline
25 & 12574 & 12818.5608791209 & -244.560879120878 \tabularnewline
26 & 13031 & 12917.1217582418 & 113.878241758241 \tabularnewline
27 & 13812 & 12955.7217582418 & 856.278241758242 \tabularnewline
28 & 14544 & 13424.7217582418 & 1119.27824175824 \tabularnewline
29 & 14931 & 13763.3217582418 & 1167.67824175824 \tabularnewline
30 & 14886 & 13748.5217582418 & 1137.47824175824 \tabularnewline
31 & 16005 & 15129.6782417582 & 875.321758241759 \tabularnewline
32 & 17064 & 15162.0782417582 & 1901.92175824176 \tabularnewline
33 & 15168 & 14875.6782417582 & 292.321758241759 \tabularnewline
34 & 16050 & 15193.8782417582 & 856.121758241758 \tabularnewline
35 & 15839 & 14999.2782417582 & 839.721758241758 \tabularnewline
36 & 15137 & 15209.0782417582 & -72.0782417582415 \tabularnewline
37 & 14954 & 14424.3454945055 & 529.654505494506 \tabularnewline
38 & 15648 & 16208.1019780220 & -560.101978021978 \tabularnewline
39 & 15305 & 16246.7019780220 & -941.701978021979 \tabularnewline
40 & 15579 & 16715.7019780220 & -1136.70197802198 \tabularnewline
41 & 16348 & 17054.3019780220 & -706.301978021978 \tabularnewline
42 & 15928 & 17039.5019780220 & -1111.50197802198 \tabularnewline
43 & 16171 & 16735.4628571429 & -564.462857142857 \tabularnewline
44 & 15937 & 16767.8628571429 & -830.862857142857 \tabularnewline
45 & 15713 & 16481.4628571429 & -768.462857142857 \tabularnewline
46 & 15594 & 16799.6628571429 & -1205.66285714286 \tabularnewline
47 & 15683 & 16605.0628571429 & -922.062857142857 \tabularnewline
48 & 16438 & 16814.8628571429 & -376.862857142857 \tabularnewline
49 & 17032 & 17715.3257142857 & -683.325714285712 \tabularnewline
50 & 17696 & 17813.8865934066 & -117.886593406593 \tabularnewline
51 & 17745 & 17852.4865934066 & -107.486593406593 \tabularnewline
52 & 19394 & 18321.4865934066 & 1072.51340659341 \tabularnewline
53 & 20148 & 18660.0865934066 & 1487.91340659341 \tabularnewline
54 & 20108 & 18645.2865934066 & 1462.71340659341 \tabularnewline
55 & 18584 & 18341.2474725275 & 242.752527472527 \tabularnewline
56 & 18441 & 18373.6474725275 & 67.3525274725269 \tabularnewline
57 & 18391 & 18087.2474725275 & 303.752527472527 \tabularnewline
58 & 19178 & 18405.4474725275 & 772.552527472527 \tabularnewline
59 & 18079 & 18210.8474725275 & -131.847472527473 \tabularnewline
60 & 18483 & 18420.6474725275 & 62.3525274725268 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25563&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]10413[/C][C]9606.99164835165[/C][C]806.008351648347[/C][/ROW]
[ROW][C]2[/C][C]10709[/C][C]9705.55252747253[/C][C]1003.44747252747[/C][/ROW]
[ROW][C]3[/C][C]10662[/C][C]9744.15252747253[/C][C]917.847472527473[/C][/ROW]
[ROW][C]4[/C][C]10570[/C][C]10213.1525274725[/C][C]356.847472527473[/C][/ROW]
[ROW][C]5[/C][C]10297[/C][C]10551.7525274725[/C][C]-254.752527472527[/C][/ROW]
[ROW][C]6[/C][C]10635[/C][C]10536.9525274725[/C][C]98.0474725274727[/C][/ROW]
[ROW][C]7[/C][C]10872[/C][C]10232.9134065934[/C][C]639.086593406593[/C][/ROW]
[ROW][C]8[/C][C]10296[/C][C]10265.3134065934[/C][C]30.6865934065935[/C][/ROW]
[ROW][C]9[/C][C]10383[/C][C]9978.9134065934[/C][C]404.086593406593[/C][/ROW]
[ROW][C]10[/C][C]10431[/C][C]10297.1134065934[/C][C]133.886593406593[/C][/ROW]
[ROW][C]11[/C][C]10574[/C][C]10102.5134065934[/C][C]471.486593406593[/C][/ROW]
[ROW][C]12[/C][C]10653[/C][C]10312.3134065934[/C][C]340.686593406593[/C][/ROW]
[ROW][C]13[/C][C]10805[/C][C]11212.7762637363[/C][C]-407.776263736262[/C][/ROW]
[ROW][C]14[/C][C]10872[/C][C]11311.3371428571[/C][C]-439.337142857142[/C][/ROW]
[ROW][C]15[/C][C]10625[/C][C]11349.9371428571[/C][C]-724.937142857143[/C][/ROW]
[ROW][C]16[/C][C]10407[/C][C]11818.9371428571[/C][C]-1411.93714285714[/C][/ROW]
[ROW][C]17[/C][C]10463[/C][C]12157.5371428571[/C][C]-1694.53714285714[/C][/ROW]
[ROW][C]18[/C][C]10556[/C][C]12142.7371428571[/C][C]-1586.73714285714[/C][/ROW]
[ROW][C]19[/C][C]10646[/C][C]11838.6980219780[/C][C]-1192.69802197802[/C][/ROW]
[ROW][C]20[/C][C]10702[/C][C]11871.0980219780[/C][C]-1169.09802197802[/C][/ROW]
[ROW][C]21[/C][C]11353[/C][C]11584.6980219780[/C][C]-231.698021978022[/C][/ROW]
[ROW][C]22[/C][C]11346[/C][C]11902.8980219780[/C][C]-556.898021978022[/C][/ROW]
[ROW][C]23[/C][C]11451[/C][C]11708.2980219780[/C][C]-257.298021978022[/C][/ROW]
[ROW][C]24[/C][C]11964[/C][C]11918.0980219780[/C][C]45.9019780219782[/C][/ROW]
[ROW][C]25[/C][C]12574[/C][C]12818.5608791209[/C][C]-244.560879120878[/C][/ROW]
[ROW][C]26[/C][C]13031[/C][C]12917.1217582418[/C][C]113.878241758241[/C][/ROW]
[ROW][C]27[/C][C]13812[/C][C]12955.7217582418[/C][C]856.278241758242[/C][/ROW]
[ROW][C]28[/C][C]14544[/C][C]13424.7217582418[/C][C]1119.27824175824[/C][/ROW]
[ROW][C]29[/C][C]14931[/C][C]13763.3217582418[/C][C]1167.67824175824[/C][/ROW]
[ROW][C]30[/C][C]14886[/C][C]13748.5217582418[/C][C]1137.47824175824[/C][/ROW]
[ROW][C]31[/C][C]16005[/C][C]15129.6782417582[/C][C]875.321758241759[/C][/ROW]
[ROW][C]32[/C][C]17064[/C][C]15162.0782417582[/C][C]1901.92175824176[/C][/ROW]
[ROW][C]33[/C][C]15168[/C][C]14875.6782417582[/C][C]292.321758241759[/C][/ROW]
[ROW][C]34[/C][C]16050[/C][C]15193.8782417582[/C][C]856.121758241758[/C][/ROW]
[ROW][C]35[/C][C]15839[/C][C]14999.2782417582[/C][C]839.721758241758[/C][/ROW]
[ROW][C]36[/C][C]15137[/C][C]15209.0782417582[/C][C]-72.0782417582415[/C][/ROW]
[ROW][C]37[/C][C]14954[/C][C]14424.3454945055[/C][C]529.654505494506[/C][/ROW]
[ROW][C]38[/C][C]15648[/C][C]16208.1019780220[/C][C]-560.101978021978[/C][/ROW]
[ROW][C]39[/C][C]15305[/C][C]16246.7019780220[/C][C]-941.701978021979[/C][/ROW]
[ROW][C]40[/C][C]15579[/C][C]16715.7019780220[/C][C]-1136.70197802198[/C][/ROW]
[ROW][C]41[/C][C]16348[/C][C]17054.3019780220[/C][C]-706.301978021978[/C][/ROW]
[ROW][C]42[/C][C]15928[/C][C]17039.5019780220[/C][C]-1111.50197802198[/C][/ROW]
[ROW][C]43[/C][C]16171[/C][C]16735.4628571429[/C][C]-564.462857142857[/C][/ROW]
[ROW][C]44[/C][C]15937[/C][C]16767.8628571429[/C][C]-830.862857142857[/C][/ROW]
[ROW][C]45[/C][C]15713[/C][C]16481.4628571429[/C][C]-768.462857142857[/C][/ROW]
[ROW][C]46[/C][C]15594[/C][C]16799.6628571429[/C][C]-1205.66285714286[/C][/ROW]
[ROW][C]47[/C][C]15683[/C][C]16605.0628571429[/C][C]-922.062857142857[/C][/ROW]
[ROW][C]48[/C][C]16438[/C][C]16814.8628571429[/C][C]-376.862857142857[/C][/ROW]
[ROW][C]49[/C][C]17032[/C][C]17715.3257142857[/C][C]-683.325714285712[/C][/ROW]
[ROW][C]50[/C][C]17696[/C][C]17813.8865934066[/C][C]-117.886593406593[/C][/ROW]
[ROW][C]51[/C][C]17745[/C][C]17852.4865934066[/C][C]-107.486593406593[/C][/ROW]
[ROW][C]52[/C][C]19394[/C][C]18321.4865934066[/C][C]1072.51340659341[/C][/ROW]
[ROW][C]53[/C][C]20148[/C][C]18660.0865934066[/C][C]1487.91340659341[/C][/ROW]
[ROW][C]54[/C][C]20108[/C][C]18645.2865934066[/C][C]1462.71340659341[/C][/ROW]
[ROW][C]55[/C][C]18584[/C][C]18341.2474725275[/C][C]242.752527472527[/C][/ROW]
[ROW][C]56[/C][C]18441[/C][C]18373.6474725275[/C][C]67.3525274725269[/C][/ROW]
[ROW][C]57[/C][C]18391[/C][C]18087.2474725275[/C][C]303.752527472527[/C][/ROW]
[ROW][C]58[/C][C]19178[/C][C]18405.4474725275[/C][C]772.552527472527[/C][/ROW]
[ROW][C]59[/C][C]18079[/C][C]18210.8474725275[/C][C]-131.847472527473[/C][/ROW]
[ROW][C]60[/C][C]18483[/C][C]18420.6474725275[/C][C]62.3525274725268[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25563&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25563&T=4
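
The Interpolation column holds the in-sample fitted values and the Residuals column the corresponding prediction errors (actual minus fitted). A minimal sketch reproducing the table, assuming fit and dat from the earlier sketches:

# Sketch: actuals, fitted values, and residuals side by side.
act_tab <- data.frame(Index         = seq_len(nrow(dat)),
                      Actuals       = dat$y,
                      Interpolation = fitted(fit),
                      Residuals     = resid(fit))
head(act_tab)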




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
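# Optionally replace the series by first differences (when par3 = 'First Differences')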
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
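# Add seasonal dummy variables according to par2 (monthly M1..M11 or quarterly Q1..Q3)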
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
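# Append a linear trend column 't' when par3 = 'Linear Trend'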
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
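# Fit OLS of the first column on all remaining columns and summarize the fit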
(mylm <- lm(df))
(mysum <- summary(mylm))
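# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density, Q-Q plot,
# residual lag plot, ACF/PACF, and the standard lm() diagnostics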
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
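# Build the HTML output tables using the server-side 'createtable' helpers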
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
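
The module code above is not self-contained: it expects par1, par2, par3 and the data matrix y to be provided by the server, loads its table helpers from the server-side 'createtable' file, and writes plots through bitmap() devices configured on the R server. A hedged sketch of the local setup that would be needed before running it; the file name, the reuse of dat from the data-loading sketch, and the suggestion to substitute png() for bitmap() are assumptions:

# Parameters exactly as recorded in the session above
par1 <- '1'
par2 <- 'Include Monthly Dummies'
par3 <- 'Linear Trend'
# Data in the layout the module expects: it immediately takes x <- t(y), so pass the
# variables as rows of 'y' (the column names y and x become the variable names).
dat <- read.table('dataseries_x.txt', header = FALSE, col.names = c('y', 'x'))
y <- t(as.matrix(dat))
# The table.start()/table.element()/table.save() helpers come from the server-side
# 'createtable' file and are not on CRAN; to reproduce only the statistical part locally,
# stop after (mysum <- summary(mylm)) and replace bitmap(...) with png(...).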