Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 29 Nov 2010 20:46:24 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2010/Nov/29/t1291063562rjmmt5j810lergd.htm/, Retrieved Sat, 20 Apr 2024 01:51:17 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=103109, Retrieved Sat, 20 Apr 2024 01:51:17 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 260
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
-  M D  [Multiple Regression] [Workshop 8, Multi...] [2010-11-29 10:32:18] [d946de7cca328fbcf207448a112523ab]
-         [Multiple Regression] [Workshop 8, Multi...] [2010-11-29 20:25:47] [3635fb7041b1998c5a1332cf9de22bce]
-    D        [Multiple Regression] [Workshop 8, Multi...] [2010-11-29 20:46:24] [23a9b79f355c69a75648521a893cf584] [Current]
F               [Multiple Regression] [WS 8, Multiple Re...] [2010-11-29 23:13:27] [8081b8996d5947580de3eb171e82db4f]
-               [Multiple Regression] [WS 8, Multiple Re...] [2010-11-29 23:45:10] [8081b8996d5947580de3eb171e82db4f]
-               [Multiple Regression] [Workshop 8 Multip...] [2010-11-30 14:40:24] [a9e130f95bad0a0597234e75c6380c5a]
- R  D            [Multiple Regression] [WS8 Multiple Line...] [2011-11-29 11:55:04] [d0cddc92c01af61bef0226b9e5ade9b3]
- R  D            [Multiple Regression] [WS8 - Mini-tutori...] [2011-11-29 13:55:48] [74be16979710d4c4e7c6647856088456]
-                   [Multiple Regression] [Paper - Deel 2 - ...] [2011-12-20 11:10:35] [95a4a8598e82ac3272c4dca488d0ba38]
-                     [Multiple Regression] [Paper Deel 4 Mult...] [2012-12-19 19:15:09] [d5c5f9d2d41487720068c665b8e94d36]
-                   [Multiple Regression] [Paper - Deel 2 - ...] [2011-12-20 11:27:34] [95a4a8598e82ac3272c4dca488d0ba38]
-  M                [Multiple Regression] [WS8 02] [2012-11-26 12:32:24] [527264e3173c1bca10b2a11a99a7175d]
-  M                [Multiple Regression] [WS 8 03] [2012-11-26 12:48:33] [527264e3173c1bca10b2a11a99a7175d]
-  M                [Multiple Regression] [WS 8 04] [2012-11-26 12:54:59] [527264e3173c1bca10b2a11a99a7175d]
- R               [Multiple Regression] [] [2011-11-29 15:37:27] [06f5daa9a1979410bf169cb7a41fb3eb]
- R PD          [Multiple Regression] [Paper Multiple Re...] [2010-12-18 18:58:17] [3635fb7041b1998c5a1332cf9de22bce]
- R PD          [Multiple Regression] [Paper Multiple Li...] [2010-12-19 12:08:20] [8081b8996d5947580de3eb171e82db4f]
- R PD          [Multiple Regression] [Paper, MR poging 2] [2010-12-19 20:57:32] [3635fb7041b1998c5a1332cf9de22bce]
-   P             [Multiple Regression] [Multiple Regression] [2010-12-22 08:41:16] [8081b8996d5947580de3eb171e82db4f]
-                 [Multiple Regression] [Paper Multiple Li...] [2010-12-22 08:59:09] [d946de7cca328fbcf207448a112523ab]
-   PD            [Multiple Regression] [paper] [2011-12-17 16:04:09] [43239ed98a62e091c70785d80176537f]
- R  D            [Multiple Regression] [] [2011-12-23 10:56:39] [74be16979710d4c4e7c6647856088456]
-   P               [Multiple Regression] [] [2011-12-23 11:07:47] [74be16979710d4c4e7c6647856088456]
- R  D          [Multiple Regression] [] [2011-11-25 08:48:38] [46896e8a404bb9354f2d070359621409]
- R  D          [Multiple Regression] [] [2011-11-25 16:05:07] [b1eb71d4db1ceb5d347df987feb4a25e]
- RM D          [Exponential Smoothing] [] [2011-11-25 17:03:34] [b1eb71d4db1ceb5d347df987feb4a25e]
- R PD            [Exponential Smoothing] [] [2011-12-09 17:07:28] [b1eb71d4db1ceb5d347df987feb4a25e]
- RM            [Multiple Regression] [] [2011-11-29 11:57:11] [74be16979710d4c4e7c6647856088456]
- R             [Multiple Regression] [Multiple regression] [2011-11-29 14:51:37] [c505444e07acba7694d29053ca5d114e]
- RMPD          [Decomposition by Loess] [ws8by loes Monthl...] [2011-11-29 21:04:41] [43a0606d8103c0ba382f0586f4417c48]
- RM            [Multiple Regression] [] [2012-11-27 21:43:17] [74be16979710d4c4e7c6647856088456]
Dataseries X (column 1 = dependent variable y, column 2 = regressor V2):
9 911
8 915
9 452
9 112
8 472
8 230
8 384
8 625
8 221
8 649
8 625
10 443
10 357
8 586
8 892
8 329
8 101
7 922
8 120
7 838
7 735
8 406
8 209
9 451
10 041
9 411
10 405
8 467
8 464
8 102
7 627
7 513
7 510
8 291
8 064
9 383
9 706
8 579
9 474
8 318
8 213
8 059
9 111
7 708
7 680
8 014
8 007
8 718
9 486
9 113
9 025
8 476
7 952
7 759
7 835
7 600
7 651
8 319
8 812
8 630




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=103109&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = + 9.81564665770551 -0.00130798098140336V2[t] + 0.467047542017969M1[t] -0.496870365252102M2[t] + 0.0191390733849997M3[t] -0.914554765271227M4[t] -1.17461898261787M5[t] -1.39948880362134M6[t] -1.18904313812691M7[t] -1.46415884470312M8[t] -1.58241850777879M9[t] -1.06574537070756M10[t] -1.04666703073587M11[t] -0.0091376845130207t + e[t]
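
As a quick plausibility check (computed by hand, not part of the generated output), the first fitted value can be reproduced from this equation using the first data row: y = 9, V2 = 911, M1 = 1 (observation 1 falls in the first month of the seasonal cycle), all other dummies 0, and t = 1. In R:

9.81564665770551 - 0.00130798098140336*911 + 0.467047542017969 - 0.0091376845130207
# [1] 9.081986   (matches the interpolation 9.081985841152 reported for observation 1 below)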

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  9.81564665770551 -0.00130798098140336V2[t] +  0.467047542017969M1[t] -0.496870365252102M2[t] +  0.0191390733849997M3[t] -0.914554765271227M4[t] -1.17461898261787M5[t] -1.39948880362134M6[t] -1.18904313812691M7[t] -1.46415884470312M8[t] -1.58241850777879M9[t] -1.06574537070756M10[t] -1.04666703073587M11[t] -0.0091376845130207t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=103109&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable    | Parameter            | S.D.     | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 9.81564665770551     | 0.223663 | 43.8858 | 0        | 0
V2          | -0.00130798098140336 | 0.000196 | -6.6898 | 0        | 0
M1          | 0.467047542017969    | 0.241431 | 1.9345  | 0.059216 | 0.029608
M2          | -0.496870365252102   | 0.24102  | -2.0615 | 0.044929 | 0.022465
M3          | 0.0191390733849997   | 0.241151 | 0.0794  | 0.937086 | 0.468543
M4          | -0.914554765271227   | 0.243112 | -3.7619 | 0.000476 | 0.000238
M5          | -1.17461898261787    | 0.240717 | -4.8797 | 1.3e-05  | 7e-06
M6          | -1.39948880362134    | 0.240897 | -5.8095 | 1e-06    | 0
M7          | -1.18904313812691    | 0.240689 | -4.9402 | 1.1e-05  | 5e-06
M8          | -1.46415884470312    | 0.240945 | -6.0767 | 0        | 0
M9          | -1.58241850777879    | 0.239542 | -6.606  | 0        | 0
M10         | -1.06574537070756    | 0.242209 | -4.4001 | 6.4e-05  | 3.2e-05
M11         | -1.04666703073587    | 0.241932 | -4.3263 | 8.1e-05  | 4e-05
t           | -0.0091376845130207  | 0.002878 | -3.1747 | 0.002676 | 0.001338
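
Two consistency checks (computed by hand from the rounded values shown above, not additional output): each T-STAT is the parameter estimate divided by its S.D., and each 1-tail p-value is half of the corresponding 2-tail p-value (see the R code at the end of this page). For example, in R:

-1.17461898261787 / 0.240717   # -4.8797  (T-STAT of M5)
0.059216 / 2                   # 0.029608 (1-tail p-value of M1)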

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 9.81564665770551 & 0.223663 & 43.8858 & 0 & 0 \tabularnewline
V2 & -0.00130798098140336 & 0.000196 & -6.6898 & 0 & 0 \tabularnewline
M1 & 0.467047542017969 & 0.241431 & 1.9345 & 0.059216 & 0.029608 \tabularnewline
M2 & -0.496870365252102 & 0.24102 & -2.0615 & 0.044929 & 0.022465 \tabularnewline
M3 & 0.0191390733849997 & 0.241151 & 0.0794 & 0.937086 & 0.468543 \tabularnewline
M4 & -0.914554765271227 & 0.243112 & -3.7619 & 0.000476 & 0.000238 \tabularnewline
M5 & -1.17461898261787 & 0.240717 & -4.8797 & 1.3e-05 & 7e-06 \tabularnewline
M6 & -1.39948880362134 & 0.240897 & -5.8095 & 1e-06 & 0 \tabularnewline
M7 & -1.18904313812691 & 0.240689 & -4.9402 & 1.1e-05 & 5e-06 \tabularnewline
M8 & -1.46415884470312 & 0.240945 & -6.0767 & 0 & 0 \tabularnewline
M9 & -1.58241850777879 & 0.239542 & -6.606 & 0 & 0 \tabularnewline
M10 & -1.06574537070756 & 0.242209 & -4.4001 & 6.4e-05 & 3.2e-05 \tabularnewline
M11 & -1.04666703073587 & 0.241932 & -4.3263 & 8.1e-05 & 4e-05 \tabularnewline
t & -0.0091376845130207 & 0.002878 & -3.1747 & 0.002676 & 0.001338 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=103109&T=2








Multiple Linear Regression - Regression Statistics
Multiple R: 0.914398840553984
R-squared: 0.83612523960647
Adjusted R-squared: 0.789812807321342
F-TEST (value): 18.0540126776060
F-TEST (DF numerator): 13
F-TEST (DF denominator): 46
p-value: 7.37188088351104e-14
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.378355541378205
Sum Squared Residuals: 6.58503412181335
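
These summary statistics are internally consistent (a quick sanity check in R, not additional output): Multiple R is the square root of R-squared, the p-value is the upper-tail probability of an F(13, 46) distribution at the F-TEST value, and the Sum Squared Residuals equals the squared Residual Standard Deviation times the residual degrees of freedom (46):

sqrt(0.83612523960647)             # 0.9143988  (Multiple R)
1 - pf(18.0540126776060, 13, 46)   # about 7.37e-14 (p-value)
0.378355541378205^2 * 46           # 6.585034   (Sum Squared Residuals)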

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.914398840553984 \tabularnewline
R-squared & 0.83612523960647 \tabularnewline
Adjusted R-squared & 0.789812807321342 \tabularnewline
F-TEST (value) & 18.0540126776060 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 46 \tabularnewline
p-value & 7.37188088351104e-14 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.378355541378205 \tabularnewline
Sum Squared Residuals & 6.58503412181335 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=103109&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 | 9 | 9.081985841152 | -0.0819858411519943
2 | 8 | 8.1036983254433 | -0.103698325443292
3 | 9 | 9.21616527395713 | -0.216165273957128
4 | 9 | 8.71804728446502 | 0.281952715534975
5 | 8 | 7.97797222930015 | 0.0220277706998493
6 | 8 | 8.06049612128328 | -0.0604961212832768
7 | 8 | 8.06037503112856 | -0.0603750311285617
8 | 8 | 7.46089822352112 | 0.539101776478877
9 | 8 | 7.8619251924194 | 0.138074807580605
10 | 8 | 7.80964478493696 | 0.190355215063036
11 | 8 | 7.85097698394931 | 0.149023016050689
12 | 10 | 9.12655886878757 | 0.873441131212428
13 | 10 | 9.69695509069321 | 0.30304490930679
14 | 8 | 8.42437185416875 | -0.424371854168749
15 | 8 | 8.5310014279834 | -0.531001427983402
16 | 8 | 8.32456319734425 | -0.324563197344247
17 | 8 | 8.35358095924455 | -0.353580959244549
18 | 7 | 7.0457210679959 | -0.0457210679959024
19 | 8 | 8.2960297960628 | -0.296029796062801
20 | 7 | 7.07264606032596 | -0.07264606032596
21 | 7 | 7.07997075382182 | -0.0799707538218178
22 | 8 | 8.01783194926173 | -0.0178319492617327
23 | 8 | 8.28544485805686 | -0.28544485805686
24 | 9 | 9.0064428067801 | -0.00644280678009763
25 | 10 | 10.0006248666604 | -0.000624866660424087
26 | 9 | 8.54361631175809 | 0.456383688241911
27 | 10 | 9.05833595177059 | 0.94166404822941
28 | 8 | 8.03440960775433 | -0.0344096077543344
29 | 8 | 7.76913164883888 | 0.230868351161119
30 | 8 | 8.00861325859041 | -0.00861325859040978
31 | 7 | 7.52323122433505 | -0.523231224335049
32 | 7 | 7.3880876651258 | -0.388087665125803
33 | 7 | 7.26461426048133 | -0.264614260481326
34 | 8 | 8.05859754796687 | -0.0585975479668706
35 | 8 | 8.3654498862041 | -0.365449886204099
36 | 9 | 8.98573329935928 | 0.0142667006407225
37 | 9 | 9.02116529987094 | -0.0211652998709404
38 | 8 | 8.21422329272608 | -0.214223292726076
39 | 9 | 8.85843304989751 | 0.141566950102490
40 | 8 | 8.11964655982719 | -0.119646559827187
41 | 8 | 7.98778266101488 | 0.0122173389851243
42 | 8 | 7.9552042266345 | 0.044795773365494
43 | 9 | 8.08849719658294 | 0.911502803417065
44 | 7 | 7.0233791595959 | -0.0233791595958998
45 | 7 | 6.93260527948651 | 0.0673947205134941
46 | 8 | 8.31125606565935 | -0.311256065659353
47 | 8 | 8.33035258798784 | -0.330352587987842
48 | 8 | 8.4379074564329 | -0.437907456432903
49 | 9 | 9.19926890162343 | -0.199268901623431
50 | 9 | 8.7140902159038 | 0.285909784096206
51 | 9 | 9.33606429639137 | -0.33606429639137
52 | 8 | 7.8033333506092 | 0.196666649390793
53 | 7 | 6.91153250160154 | 0.0884674983984564
54 | 7 | 6.9299653254959 | 0.0700346745040952
55 | 7 | 7.03186675189065 | -0.0318667518906529
56 | 7 | 7.05498889143121 | -0.054988891431214
57 | 7 | 6.86088451379096 | 0.139115486209045
58 | 8 | 7.80266965217508 | 0.197330347824921
59 | 8 | 7.16777568380189 | 0.832224316198112
60 | 8 | 8.44335756864015 | -0.44335756864015

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 9 & 9.081985841152 & -0.0819858411519943 \tabularnewline
2 & 8 & 8.1036983254433 & -0.103698325443292 \tabularnewline
3 & 9 & 9.21616527395713 & -0.216165273957128 \tabularnewline
4 & 9 & 8.71804728446502 & 0.281952715534975 \tabularnewline
5 & 8 & 7.97797222930015 & 0.0220277706998493 \tabularnewline
6 & 8 & 8.06049612128328 & -0.0604961212832768 \tabularnewline
7 & 8 & 8.06037503112856 & -0.0603750311285617 \tabularnewline
8 & 8 & 7.46089822352112 & 0.539101776478877 \tabularnewline
9 & 8 & 7.8619251924194 & 0.138074807580605 \tabularnewline
10 & 8 & 7.80964478493696 & 0.190355215063036 \tabularnewline
11 & 8 & 7.85097698394931 & 0.149023016050689 \tabularnewline
12 & 10 & 9.12655886878757 & 0.873441131212428 \tabularnewline
13 & 10 & 9.69695509069321 & 0.30304490930679 \tabularnewline
14 & 8 & 8.42437185416875 & -0.424371854168749 \tabularnewline
15 & 8 & 8.5310014279834 & -0.531001427983402 \tabularnewline
16 & 8 & 8.32456319734425 & -0.324563197344247 \tabularnewline
17 & 8 & 8.35358095924455 & -0.353580959244549 \tabularnewline
18 & 7 & 7.0457210679959 & -0.0457210679959024 \tabularnewline
19 & 8 & 8.2960297960628 & -0.296029796062801 \tabularnewline
20 & 7 & 7.07264606032596 & -0.07264606032596 \tabularnewline
21 & 7 & 7.07997075382182 & -0.0799707538218178 \tabularnewline
22 & 8 & 8.01783194926173 & -0.0178319492617327 \tabularnewline
23 & 8 & 8.28544485805686 & -0.28544485805686 \tabularnewline
24 & 9 & 9.0064428067801 & -0.00644280678009763 \tabularnewline
25 & 10 & 10.0006248666604 & -0.000624866660424087 \tabularnewline
26 & 9 & 8.54361631175809 & 0.456383688241911 \tabularnewline
27 & 10 & 9.05833595177059 & 0.94166404822941 \tabularnewline
28 & 8 & 8.03440960775433 & -0.0344096077543344 \tabularnewline
29 & 8 & 7.76913164883888 & 0.230868351161119 \tabularnewline
30 & 8 & 8.00861325859041 & -0.00861325859040978 \tabularnewline
31 & 7 & 7.52323122433505 & -0.523231224335049 \tabularnewline
32 & 7 & 7.3880876651258 & -0.388087665125803 \tabularnewline
33 & 7 & 7.26461426048133 & -0.264614260481326 \tabularnewline
34 & 8 & 8.05859754796687 & -0.0585975479668706 \tabularnewline
35 & 8 & 8.3654498862041 & -0.365449886204099 \tabularnewline
36 & 9 & 8.98573329935928 & 0.0142667006407225 \tabularnewline
37 & 9 & 9.02116529987094 & -0.0211652998709404 \tabularnewline
38 & 8 & 8.21422329272608 & -0.214223292726076 \tabularnewline
39 & 9 & 8.85843304989751 & 0.141566950102490 \tabularnewline
40 & 8 & 8.11964655982719 & -0.119646559827187 \tabularnewline
41 & 8 & 7.98778266101488 & 0.0122173389851243 \tabularnewline
42 & 8 & 7.9552042266345 & 0.044795773365494 \tabularnewline
43 & 9 & 8.08849719658294 & 0.911502803417065 \tabularnewline
44 & 7 & 7.0233791595959 & -0.0233791595958998 \tabularnewline
45 & 7 & 6.93260527948651 & 0.0673947205134941 \tabularnewline
46 & 8 & 8.31125606565935 & -0.311256065659353 \tabularnewline
47 & 8 & 8.33035258798784 & -0.330352587987842 \tabularnewline
48 & 8 & 8.4379074564329 & -0.437907456432903 \tabularnewline
49 & 9 & 9.19926890162343 & -0.199268901623431 \tabularnewline
50 & 9 & 8.7140902159038 & 0.285909784096206 \tabularnewline
51 & 9 & 9.33606429639137 & -0.33606429639137 \tabularnewline
52 & 8 & 7.8033333506092 & 0.196666649390793 \tabularnewline
53 & 7 & 6.91153250160154 & 0.0884674983984564 \tabularnewline
54 & 7 & 6.9299653254959 & 0.0700346745040952 \tabularnewline
55 & 7 & 7.03186675189065 & -0.0318667518906529 \tabularnewline
56 & 7 & 7.05498889143121 & -0.054988891431214 \tabularnewline
57 & 7 & 6.86088451379096 & 0.139115486209045 \tabularnewline
58 & 8 & 7.80266965217508 & 0.197330347824921 \tabularnewline
59 & 8 & 7.16777568380189 & 0.832224316198112 \tabularnewline
60 & 8 & 8.44335756864015 & -0.44335756864015 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=103109&T=4




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
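# Parameters supplied by the R engine: y holds the uploaded data series,
# par1 = index of the dependent variable column, par2 = seasonal dummy option,
# par3 = trend/differencing option (this run: par1 = 1, 'Include Monthly Dummies', 'Linear Trend')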
library(lattice)
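# transpose the uploaded data so observations are in rows, then move the
# column selected by par1 (the dependent variable) to the front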
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
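# optional transformation: replace every series by its first differences
# (only when par3 == 'First Differences'; not used in this run)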
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
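# seasonal dummies: add monthly indicator columns M1..M11 (month 12 is the reference category)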
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
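# alternative seasonal option: quarterly dummies Q1..Q3 (not used in this run)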
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
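# append a linear trend regressor t = 1, 2, ..., n when par3 == 'Linear Trend'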
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
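# fit the OLS regression; lm() applied to a data frame regresses the first column on all the others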
(mylm <- lm(df))
(mysum <- summary(mylm))
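# diagnostic plots (PNG bitmaps): actuals with interpolation, residuals, histogram, density, and normal Q-Q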
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
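# residual lag plot with lowess smoother and regression line, then residual ACF/PACF and lm() diagnostics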
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
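# build the output tables with the site's table.* helper functions (loaded from 'createtable')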
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
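# assemble the estimated regression equation as a text string from the fitted coefficients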
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
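# one row per coefficient: estimate, standard error, t statistic, 2-tail p-value, and 1-tail p-value (= 2-tail / 2)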
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
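# one row per observation: index, actual value, fitted value (actual minus residual), and residual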
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')