Free Statistics: Statistical Computations at FreeStatistics.org

Author: verified (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 21 Dec 2010 17:22:06 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2010/Dec/21/t1292952039ox63qwqi3pw7xkt.htm/, Retrieved Wed, 15 May 2024 02:32:29 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=113766, Retrieved Wed, 15 May 2024 02:32:29 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 130
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [HPC Retail Sales] [2008-03-08 13:40:54] [1c0f2c85e8a48e42648374b3bcceca26]
-  MPD  [Multiple Regression] [ws8 - Regressie a...] [2010-11-27 11:23:58] [4a7069087cf9e0eda253aeed7d8c30d6]
-    D    [Multiple Regression] [Paper - Regressie...] [2010-11-28 20:30:20] [4a7069087cf9e0eda253aeed7d8c30d6]
-    D      [Multiple Regression] [Paper - Regressie...] [2010-11-29 18:29:27] [4a7069087cf9e0eda253aeed7d8c30d6]
-    D          [Multiple Regression] [Multiple regressi...] [2010-12-21 17:22:06] [039869833c16fe697975601e6b065e0f] [Current]

Dataseries X:
1038,00	0	934,00	988,00	870,00	854,00
934,00	0	988,00	870,00	854,00	834,00
988,00	0	870,00	854,00	834,00	872,00
870,00	0	854,00	834,00	872,00	954,00
854,00	0	834,00	872,00	954,00	870,00
834,00	0	872,00	954,00	870,00	1238,00
872,00	0	954,00	870,00	1238,00	1082,00
954,00	0	870,00	1238,00	1082,00	1053,00
870,00	0	1238,00	1082,00	1053,00	934,00
1238,00	0	1082,00	1053,00	934,00	787,00
1082,00	0	1053,00	934,00	787,00	1081,00
1053,00	0	934,00	787,00	1081,00	908,00
934,00	0	787,00	1081,00	908,00	995,00
787,00	0	1081,00	908,00	995,00	825,00
1081,00	0	908,00	995,00	825,00	822,00
908,00	0	995,00	825,00	822,00	856,00
995,00	0	825,00	822,00	856,00	887,00
825,00	0	822,00	856,00	887,00	1094,00
822,00	0	856,00	887,00	1094,00	990,00
856,00	0	887,00	1094,00	990,00	936,00
887,00	0	1094,00	990,00	936,00	1097,00
1094,00	0	990,00	936,00	1097,00	918,00
990,00	0	936,00	1097,00	918,00	926,00
936,00	0	1097,00	918,00	926,00	907,00
1097,00	0	918,00	926,00	907,00	899,00
918,00	0	926,00	907,00	899,00	971,00
926,00	0	907,00	899,00	971,00	1087,00
907,00	0	899,00	971,00	1087,00	1000,00
899,00	0	971,00	1087,00	1000,00	1071,00
971,00	0	1087,00	1000,00	1071,00	1190,00
1087,00	0	1000,00	1071,00	1190,00	1116,00
1000,00	0	1071,00	1190,00	1116,00	1070,00
1071,00	0	1190,00	1116,00	1070,00	1314,00
1190,00	0	1116,00	1070,00	1314,00	1068,00
1116,00	0	1070,00	1314,00	1068,00	1185,00
1070,00	0	1314,00	1068,00	1185,00	1215,00
1314,00	0	1068,00	1185,00	1215,00	1145,00
1068,00	0	1185,00	1215,00	1145,00	1251,00
1185,00	0	1215,00	1145,00	1251,00	1363,00
1215,00	0	1145,00	1251,00	1363,00	1368,00
1145,00	0	1251,00	1363,00	1368,00	1535,00
1251,00	1	1363,00	1368,00	1535,00	1853,00
1363,00	1	1368,00	1535,00	1853,00	1866,00
1368,00	1	1535,00	1853,00	1866,00	2023,00
1535,00	1	1853,00	1866,00	2023,00	1373,00
1853,00	1	1866,00	2023,00	1373,00	1968,00
1866,00	1	2023,00	1373,00	1968,00	1424,00
2023,00	1	1373,00	1968,00	1424,00	1160,00
1373,00	1	1968,00	1424,00	1160,00	1243,00
1968,00	1	1424,00	1160,00	1243,00	1375,00
1424,00	1	1160,00	1243,00	1375,00	1539,00
1160,00	1	1243,00	1375,00	1539,00	1773,00
1243,00	1	1375,00	1539,00	1773,00	1906,00
1375,00	1	1539,00	1773,00	1906,00	2076,00
1539,00	1	1773,00	1906,00	2076,00	2004,00
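
The series above is tab-separated and uses a comma as decimal separator. A minimal sketch of how such a block could be read into R is shown below; the file name 'dataseries.txt' is an assumption for illustration only, since the archived computation receives the data through the web form instead.

# Hypothetical sketch: read a tab-separated block with comma decimals into R.
dat <- read.table('dataseries.txt', sep = '\t', dec = ',', header = FALSE)
str(dat)   # 6 numeric columns, 55 rows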




Summary of computational transaction

Raw Input        view raw input (R code)
Raw Output       view raw output of R engine
Computing time   6 seconds
R Server         'George Udny Yule' @ 72.249.76.132

Source: https://freestatistics.org/blog/index.php?pk=113766&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=113766&T=0

Multiple Linear Regression - Estimated Regression Equation

Maandelijksewerkloosheid[t] = 785.581646496084
                              + 316.837588957848 x[t]
                              + 0.113241666763825 `y-1`[t]
                              + 0.331362488209895 `y-2`[t]
                              + 0.0193980967903263 `y-3`[t]
                              - 0.253615790411075 `y-4`[t]
                              - 59.0113984962233 M1[t]
                              - 35.8482054764646 M2[t]
                              - 25.6599930393899 M3[t]
                              - 136.285762455275 M4[t]
                              - 141.100697932726 M5[t]
                              - 153.268034800840 M6[t]
                              - 123.621358861251 M7[t]
                              - 197.682505736142 M8[t]
                              - 180.794636617663 M9[t]
                              + 78.107982061271 M10[t]
                              + 15.0932390499574 M11[t]
                              + 4.08900526417693 t
                              + e[t]

(Maandelijksewerkloosheid is Dutch for "monthly unemployment".)
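
A hypothetical, equivalent way to fit the same design directly with lm() is sketched below; it assumes a data frame `df` whose columns carry the names shown in the equation (the archived script instead builds the design matrix by hand, see the R code at the end of this page).

# Sketch only: an equivalent lm() call on an assumed data frame `df`.
fit <- lm(Maandelijksewerkloosheid ~ x + `y-1` + `y-2` + `y-3` + `y-4` +
          M1 + M2 + M3 + M4 + M5 + M6 + M7 + M8 + M9 + M10 + M11 + t, data = df)
coef(fit)   # should reproduce the parameter column of the OLS table below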

Source: https://freestatistics.org/blog/index.php?pk=113766&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=113766&T=1

Multiple Linear Regression - Ordinary Least Squares

Variable      Parameter             S.D.         T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   785.581646496084      187.243562    4.1955                      0.000163         8.2e-05
x             316.837588957848      102.677091    3.0858                      0.003832         0.001916
`y-1`         0.113241666763825     0.154518      0.7329                      0.468255         0.234127
`y-2`         0.331362488209895     0.152241      2.1766                      0.035969         0.017985
`y-3`         0.0193980967903263    0.151166      0.1283                      0.898588         0.449294
`y-4`         -0.253615790411075    0.146221     -1.7345                      0.091161         0.04558
M1            -59.0113984962233     100.972532   -0.5844                      0.562478         0.281239
M2            -35.8482054764646     102.743999   -0.3489                      0.729136         0.364568
M3            -25.6599930393899     105.464578   -0.2433                      0.809114         0.404557
M4            -136.285762455275     106.342184   -1.2816                      0.207964         0.103982
M5            -141.100697932726     106.655141   -1.323                       0.193966         0.096983
M6            -153.268034800840     117.242164   -1.3073                      0.199184         0.099592
M7            -123.621358861251     113.945213   -1.0849                      0.284976         0.142488
M8            -197.682505736142     113.423447   -1.7429                      0.089663         0.044831
M9            -180.794636617663     109.898744   -1.6451                      0.108419         0.05421
M10           78.107982061271       107.728175    0.725                       0.472984         0.236492
M11           15.0932390499574      107.050948    0.141                       0.888642         0.444321
t             4.08900526417693      2.565665      1.5937                      0.119502         0.059751
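
The two p-value columns follow from the t statistics evaluated at the 37 residual degrees of freedom reported below (55 observations minus 18 parameters); for example, for the coefficient of x:

# Check of the reported p-values for the x coefficient (t = 3.0858, 37 d.f.).
2 * pt(-abs(3.0858), df = 37)   # two-tail p-value, approximately 0.003832
pt(-abs(3.0858), df = 37)       # one-tail p-value, approximately 0.001916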

Source: https://freestatistics.org/blog/index.php?pk=113766&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=113766&T=2

Multiple Linear Regression - Regression Statistics

Multiple R                    0.907790108097507
R-squared                     0.824082880359683
Adjusted R-squared            0.743256095660078
F-TEST (value)                10.195665748952
F-TEST (DF numerator)         17
F-TEST (DF denominator)       37
p-value                       2.88513257729051e-09

Multiple Linear Regression - Residual Statistics

Residual Standard Deviation   148.271370356329
Sum Squared Residuals         813422.77289172
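
The residual standard deviation and the F-test p-value can be recovered from the other entries of the table (37 residual degrees of freedom, 17 numerator degrees of freedom):

# Consistency check of the reported statistics.
sqrt(813422.77289172 / 37)                    # residual standard deviation, ~148.271
1 - pf(10.195665748952, df1 = 17, df2 = 37)   # F-test p-value, ~2.885e-09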

Source: https://freestatistics.org/blog/index.php?pk=113766&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=113766&T=3

Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
 1              1038      964.101567569352            73.8984324306477
 2               934      963.129988509344           -29.1299885093441
 3               988      948.717527749679            39.2824722503214
 4               870      813.682280029876            56.3177199701244
 5               854      846.177661364638             7.82233863536176
 6               834      774.614186129275            59.3858138707251
 7               872      836.503797921011            35.4962020789889
 8               954      883.289506786007            70.7104932139932
 9               870      923.864500629006           -53.8645006290055
10              1238      1194.55406007125            43.445939928748
11              1082      1015.49761528195            66.502384718047
12              1053      991.885909581895            61.1140904181054
13               934      992.317118358785           -58.3171183587849
14               787      1040.33897500161           -253.338975001614
15              1081      1061.31709174386            19.6829082561369
16               908      899.619598440578             8.38040155942176
17               995      871.445943200952           123.554056799048
18               825      822.397083581267             2.60291641873282
19               822      900.64666682786            -78.6466668278594
20               856      914.454902562276           -58.4549025622758
21               887      882.531463708354             4.46853629164575
22              1094      1164.37270001152           -70.372700011518
23               990      1147.18008721217           -157.180087212171
24               936      1100.06776117793           -164.067761177928
25              1097      1029.18637198511            67.8136280148916
26               918      1032.63309464325           -114.633094643247
27               926      1014.08505205153           -88.0850520515253
28               907      954.81520671026            -47.8152067102606
29               899      980.986369596385           -81.9863695963852
30               971      928.412520675986            42.587479324014
31              1087      996.89885554267             90.10114445733
32              1000      984.62987556559             15.3701244344101
33              1071      931.787118852952           139.212881147048
34              1190      1238.27880505585           -48.2788050558488
35              1116      1220.55141847227           -104.551418472275
36              1070      1150.32408288937           -80.3240828893692
37              1314      1124.64869898646           189.351301013535
38              1068      1146.84990636917           -78.8499063691678
39              1185      1114.98022963238            70.0197703676242
40              1215      1036.54548044591           178.454519554090
41              1145      1042.67891907441           102.321080925588
42              1251      1288.36771636018           -37.3677163601832
43              1363      1380.8787309328            -17.8787309328007
44              1368      1395.62571508613           -27.6257150861273
45              1535      1624.81691680969           -89.8169168096881
46              1853      1777.79443486138            75.2055651386188
47              1866      1670.7708790336            195.229120966398
48              2023      1839.72224635081           183.277753649192
49              1373      1645.74624310029           -272.74624310029
50              1968      1492.04803547663           475.951964523373
51              1424      1464.90009882256           -40.9000988225573
52              1160      1355.33743437338           -195.337434373375
53              1243      1394.71110676361           -151.711106763612
54              1375      1442.20849325329           -67.2084932532887
55              1539      1568.07194877566           -29.0719487756588
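
Each interpolation value is simply the actual minus the residual reported in the same row; for instance, at index 1:

# The interpolation (fitted value) at index 1 equals the actual minus the residual.
1038 - 73.8984324306477   # 964.1015675693523, matching the table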

Source: https://freestatistics.org/blog/index.php?pk=113766&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=113766&T=4

Goldfeld-Quandt test for Heteroskedasticity
p-values per alternative hypothesis

breakpoint index   greater                2-sided                less
21                 0.113092285054002      0.226184570108004      0.886907714945998
22                 0.0406324150817447     0.0812648301634895     0.959367584918255
23                 0.0138374800061041     0.0276749600122081     0.986162519993896
24                 0.0153647089178795     0.0307294178357590     0.98463529108212
25                 0.0139018442199219     0.0278036884398438     0.986098155780078
26                 0.00843910902341185    0.0168782180468237     0.991560890976588
27                 0.00325489615046637    0.00650979230093274    0.996745103849534
28                 0.00176641421060536    0.00353282842121073    0.998233585789395
29                 0.000575107586278198   0.00115021517255640    0.999424892413722
30                 0.000366114533177452   0.000732229066354905   0.999633885466823
31                 0.000449908718232617   0.000899817436465234   0.999550091281767
32                 0.000124382381039456   0.000248764762078912   0.99987561761896
33                 0.000193358145939456   0.000386716291878912   0.99980664185406
34                 0.0284704132159483     0.0569408264318965     0.971529586784052
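
Each row reports the p-values of a Goldfeld-Quandt test with the breakpoint fixed at the given index, as computed by lmtest::gqtest in the script at the end of this page. A single such test, sketched for breakpoint 27 and the fitted model `mylm` defined in that script:

# Sketch: one Goldfeld-Quandt test at breakpoint 27 on the fitted model `mylm`.
library(lmtest)
gqtest(mylm, point = 27, alternative = 'two.sided')$p.value   # ~0.0065 per the table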

Source: https://freestatistics.org/blog/index.php?pk=113766&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=113766&T=5

Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity

Description              # significant tests   % significant tests   OK/NOK
1% type I error level    7                     0.5                   NOK
5% type I error level    11                    0.785714285714286     NOK
10% type I error level   13                    0.928571428571429     NOK
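
The reported fractions are counts over the 14 breakpoints tested (indices 21 through 34) whose 2-sided p-value falls below the stated level; for example, at the 5% level:

# 11 of the 14 Goldfeld-Quandt tests are significant at the 5% level.
11 / 14   # 0.7857142857..., reported above as 0.785714285714286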

Source: https://freestatistics.org/blog/index.php?pk=113766&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=113766&T=6


Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
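# Add seasonal dummy variables (M1..M11 for monthly data, Q1..Q3 for quarterly data) when requested.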
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
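# Fit the OLS regression: lm() on a data frame regresses its first column on all remaining columns.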
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
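# Goldfeld-Quandt test at every admissible breakpoint, storing p-values for the three alternatives.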
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
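# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density, normal Q-Q,
# residual lag plot, ACF/PACF, standard lm() diagnostics, and the Goldfeld-Quandt p-values.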
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
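# Build the HTML result tables using the table.* helpers loaded from 'createtable'.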
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}