
Author: The author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 18 Nov 2009 09:26:23 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2009/Nov/18/t1258561638rg49k5jue0r6599.htm/, Retrieved Wed, 01 May 2024 17:19:09 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=57514, Retrieved Wed, 01 May 2024 17:19:09 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 243
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
- RM D  [Multiple Regression] [Seatbelt] [2009-11-12 14:10:54] [b98453cac15ba1066b407e146608df68]
-    D      [Multiple Regression] [WS7 Y(t-4)] [2009-11-18 16:26:23] [82f421ff86a0429b20e3ed68bd89f1bd] [Current]
Dataseries X:
7,59	43,14	7,59	7,59	7,55	7,55
7,57	43,39	7,59	7,59	7,59	7,55
7,57	43,46	7,57	7,59	7,59	7,59
7,59	43,54	7,57	7,57	7,59	7,59
7,6	43,62	7,59	7,57	7,57	7,59
7,64	44,01	7,6	7,59	7,57	7,57
7,64	44,5	7,64	7,6	7,59	7,57
7,76	44,73	7,64	7,64	7,6	7,59
7,76	44,89	7,76	7,64	7,64	7,6
7,76	45,09	7,76	7,76	7,64	7,64
7,77	45,17	7,76	7,76	7,76	7,64
7,83	45,24	7,77	7,76	7,76	7,76
7,94	45,42	7,83	7,77	7,76	7,76
7,94	45,67	7,94	7,83	7,77	7,76
7,94	45,68	7,94	7,94	7,83	7,77
8,09	46,56	7,94	7,94	7,94	7,83
8,18	46,72	8,09	7,94	7,94	7,94
8,26	47,01	8,18	8,09	7,94	7,94
8,28	47,26	8,26	8,18	8,09	7,94
8,28	47,49	8,28	8,26	8,18	8,09
8,28	47,51	8,28	8,28	8,26	8,18
8,29	47,52	8,28	8,28	8,28	8,26
8,3	47,66	8,29	8,28	8,28	8,28
8,3	47,71	8,3	8,29	8,28	8,28
8,31	47,87	8,3	8,3	8,29	8,28
8,33	48	8,31	8,3	8,3	8,29
8,33	48	8,33	8,31	8,3	8,3
8,34	48,05	8,33	8,33	8,31	8,3
8,48	48,25	8,34	8,33	8,33	8,31
8,59	48,72	8,48	8,34	8,33	8,33
8,67	48,94	8,59	8,48	8,34	8,33
8,67	49,16	8,67	8,59	8,48	8,34
8,67	49,18	8,67	8,67	8,59	8,48
8,71	49,25	8,67	8,67	8,67	8,59
8,72	49,34	8,71	8,67	8,67	8,67
8,72	49,49	8,72	8,71	8,67	8,67
8,72	49,57	8,72	8,72	8,71	8,67
8,74	49,63	8,72	8,72	8,72	8,71
8,74	49,67	8,74	8,72	8,72	8,72
8,74	49,7	8,74	8,74	8,72	8,72
8,74	49,8	8,74	8,74	8,74	8,72
8,79	50,09	8,74	8,74	8,74	8,74
8,85	50,49	8,79	8,74	8,74	8,74
8,86	50,73	8,85	8,79	8,74	8,74
8,87	51,12	8,86	8,85	8,79	8,74
8,92	51,15	8,87	8,86	8,85	8,79
8,96	51,41	8,92	8,87	8,86	8,85
8,97	51,61	8,96	8,92	8,87	8,86
8,99	52,06	8,97	8,96	8,92	8,87
8,98	52,17	8,99	8,97	8,96	8,92
8,98	52,18	8,98	8,99	8,97	8,96
9,01	52,19	8,98	8,98	8,99	8,97
9,01	52,74	9,01	8,98	8,98	8,99
9,03	53,05	9,01	9,01	8,98	8,98
9,05	53,38	9,03	9,01	9,01	8,98
9,05	53,78	9,05	9,03	9,01	9,01
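The pasted data block is tab-separated with comma decimal marks. Below is a minimal sketch for loading it into R; the file name and the column labels Y, X, Y1..Y4 are assumptions based on the variable names that appear in the regression output further down.

x <- read.table('dataseries_x.txt', sep = '\t', dec = ',',
                col.names = c('Y', 'X', 'Y1', 'Y2', 'Y3', 'Y4'))
str(x)   # 56 observations of 6 numeric variables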




Summary of computational transaction

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57514&T=0








Multiple Linear Regression - Estimated Regression Equation
Y[t] = + 0.0140552377191030 + 0.0139606102770509X[t] + 1.23654257042361Y1[t] -0.419623941627338Y2[t] + 0.222584543954599Y3[t] -0.118384183953442Y4[t] + 0.00761324456546221M1[t] -0.0235156969591707M2[t] -0.0147316299733394M3[t] + 0.0209832902930584M4[t] + 0.0172570665543621M5[t] + 0.0312601726900203M6[t] -0.000475280260314817M7[t] -0.00282851059813795M8[t] -0.0261443878669578M9[t] + 0.00808598682866219M10[t] -0.00838414566602145M11[t] -0.000382389196704658t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=57514&T=1
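In compact notation, the estimated specification regresses Y on the concurrent X, the regressors Y1..Y4 (which, judging from the data columns, are lags 1 to 4 of Y), eleven monthly dummies, and a linear trend. The following display is a restatement of the equation above, with coefficient symbols standing in for the estimated values:

\begin{displaymath}
Y_t = \beta_0 + \beta_1 X_t + \sum_{j=1}^{4} \phi_j\, Y_{t-j} + \sum_{m=1}^{11} \delta_m\, M_{m,t} + \gamma\, t + \varepsilon_t
\end{displaymath}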








Multiple Linear Regression - Ordinary Least Squares

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 0.0140552377191030 & 0.984515 & 0.0143 & 0.988684 & 0.494342 \tabularnewline
X & 0.0139606102770509 & 0.019085 & 0.7315 & 0.468975 & 0.234487 \tabularnewline
Y1 & 1.23654257042361 & 0.163712 & 7.5532 & 0 & 0 \tabularnewline
Y2 & -0.419623941627338 & 0.254319 & -1.65 & 0.107188 & 0.053594 \tabularnewline
Y3 & 0.222584543954599 & 0.255116 & 0.8725 & 0.388425 & 0.194212 \tabularnewline
Y4 & -0.118384183953442 & 0.172133 & -0.6877 & 0.495788 & 0.247894 \tabularnewline
M1 & 0.00761324456546221 & 0.026193 & 0.2907 & 0.772893 & 0.386447 \tabularnewline
M2 & -0.0235156969591707 & 0.02647 & -0.8884 & 0.379926 & 0.189963 \tabularnewline
M3 & -0.0147316299733394 & 0.026501 & -0.5559 & 0.581545 & 0.290773 \tabularnewline
M4 & 0.0209832902930584 & 0.027171 & 0.7723 & 0.444726 & 0.222363 \tabularnewline
M5 & 0.0172570665543621 & 0.027082 & 0.6372 & 0.527802 & 0.263901 \tabularnewline
M6 & 0.0312601726900203 & 0.026822 & 1.1655 & 0.251096 & 0.125548 \tabularnewline
M7 & -0.000475280260314817 & 0.028046 & -0.0169 & 0.986568 & 0.493284 \tabularnewline
M8 & -0.00282851059813795 & 0.027802 & -0.1017 & 0.919499 & 0.459749 \tabularnewline
M9 & -0.0261443878669578 & 0.029397 & -0.8894 & 0.379406 & 0.189703 \tabularnewline
M10 & 0.00808598682866219 & 0.028024 & 0.2885 & 0.774503 & 0.387252 \tabularnewline
M11 & -0.00838414566602145 & 0.028244 & -0.2968 & 0.768203 & 0.384102 \tabularnewline
t & -0.000382389196704658 & 0.004024 & -0.095 & 0.924798 & 0.462399 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57514&T=2
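In this table, Parameter is the OLS estimate, S.D. is the coefficient standard error, and the 1-tail p-value is simply half of the 2-tail p-value, exactly as the table-building loop in the R code below computes it. A minimal sketch, assuming mysum is the summary(mylm) object defined in that code:

coefs <- mysum$coefficients   # columns: Estimate, Std. Error, t value, Pr(>|t|)
round(coefs[, 2], 6)          # S.D. column
round(coefs[, 4], 6)          # 2-tail p-values
round(coefs[, 4] / 2, 6)      # 1-tail p-values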








Multiple Linear Regression - Regression Statistics

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.997836985175587 \tabularnewline
R-squared & 0.995678648984306 \tabularnewline
Adjusted R-squared & 0.9937454130036 \tabularnewline
F-TEST (value) & 515.032132094405 \tabularnewline
F-TEST (DF numerator) & 17 \tabularnewline
F-TEST (DF denominator) & 38 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.0386849910476448 \tabularnewline
Sum Squared Residuals & 0.0568680842295417 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57514&T=3
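These statistics are direct transformations of the summary(lm) output, as in the R code below; a minimal sketch, assuming mysum <- summary(mylm):

sqrt(mysum$r.squared)        # Multiple R
mysum$adj.r.squared          # Adjusted R-squared
mysum$fstatistic             # F value, DF numerator, DF denominator
1 - pf(mysum$fstatistic[1],
       mysum$fstatistic[2],
       mysum$fstatistic[3])  # p-value of the overall F-test
mysum$sigma                  # Residual Standard Deviation
sum(mysum$resid^2)           # Sum Squared Residuals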








Multiple Linear Regression - Actuals, Interpolation, and Residuals

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 7.59 & 7.61067193101227 & -0.0206719310122711 \tabularnewline
2 & 7.57 & 7.59155413461837 & -0.0215541346183736 \tabularnewline
3 & 7.57 & 7.57146683636028 & -0.00146683636028320 \tabularnewline
4 & 7.59 & 7.61630869508469 & -0.0263086950846871 \tabularnewline
5 & 7.6 & 7.63359609150083 & -0.0335960915008302 \tabularnewline
6 & 7.64 & 7.65900207699859 & -0.0190020769985917 \tabularnewline
7 & 7.64 & 7.68344208816707 & -0.0434420881670698 \tabularnewline
8 & 7.76 & 7.66699061309165 & 0.0930093869083528 \tabularnewline
9 & 7.76 & 7.80163069263993 & -0.0416306926399335 \tabularnewline
10 & 7.76 & 7.78318055984084 & -0.0231805598408407 \tabularnewline
11 & 7.77 & 7.79415503224617 & -0.0241550322461685 \tabularnewline
12 & 7.83 & 7.8012933550647 & 0.0287066449352988 \tabularnewline
13 & 7.94 & 7.88103343509247 & 0.0589665649075286 \tabularnewline
14 & 7.94 & 7.9660803486289 & -0.0260803486288992 \tabularnewline
15 & 7.94 & 7.94063422973953 & -0.000634229739530575 \tabularnewline
16 & 8.09 & 8.00563334665083 & 0.0843666533491716 \tabularnewline
17 & 8.18 & 8.17621755668842 & 0.00378244331158241 \tabularnewline
18 & 8.26 & 8.24223209070174 & 0.0177679092982602 \tabularnewline
19 & 8.28 & 8.30814933360458 & -0.0281493336045811 \tabularnewline
20 & 8.28 & 8.30206057187496 & -0.0220605718749573 \tabularnewline
21 & 8.28 & 8.27740122574298 & 0.00259877425701461 \tabularnewline
22 & 8.29 & 8.30636977350749 & -0.016369773507488 \tabularnewline
23 & 8.3 & 8.30146947928005 & -0.00146947928005223 \tabularnewline
24 & 8.3 & 8.31833845255119 & -0.0183384525511864 \tabularnewline
25 & 8.31 & 8.32583261158755 & -0.0158326115875441 \tabularnewline
26 & 8.33 & 8.30954358950647 & 0.0204564104935287 \tabularnewline
27 & 8.33 & 8.33729603744826 & -0.00729603744826175 \tabularnewline
28 & 8.34 & 8.3671599656388 & -0.0271599656388069 \tabularnewline
29 & 8.48 & 8.38147674950261 & 0.098523250497391 \tabularnewline
30 & 8.59 & 8.56821099003574 & 0.0217890099642594 \tabularnewline
31 & 8.67 & 8.61866265850797 & 0.0513373414920335 \tabularnewline
32 & 8.67 & 8.70174113960338 & -0.0317411396033811 \tabularnewline
33 & 8.67 & 8.65266268409473 & 0.0173373159052656 \tabularnewline
34 & 8.71 & 8.69227241559453 & 0.0177275844054682 \tabularnewline
35 & 8.72 & 8.71666731692875 & 0.00333268307125159 \tabularnewline
36 & 8.72 & 8.72234363297876 & -0.00234363297876473 \tabularnewline
37 & 8.72 & 8.7353984795116 & -0.0153984795115971 \tabularnewline
38 & 8.74 & 8.7022152634883 & 0.0377847365117088 \tabularnewline
39 & 8.74 & 8.73472237525744 & 0.00527762474256289 \tabularnewline
40 & 8.74 & 8.7620812458029 & -0.0220812458028952 \tabularnewline
41 & 8.74 & 8.7638203847743 & -0.0238203847742912 \tabularnewline
42 & 8.79 & 8.77912199501452 & 0.0108780049854782 \tabularnewline
43 & 8.85 & 8.81441552549948 & 0.035584474500519 \tabularnewline
44 & 8.86 & 8.8682418095755 & -0.00824180957549614 \tabularnewline
45 & 8.87 & 8.84830539752235 & 0.0216946024776533 \tabularnewline
46 & 8.92 & 8.89817725105714 & 0.0218227489428605 \tabularnewline
47 & 8.96 & 8.93770817154503 & 0.0222918284549692 \tabularnewline
48 & 8.97 & 8.97802455940535 & -0.00802455940534777 \tabularnewline
49 & 8.99 & 8.99706354279612 & -0.00706354279611632 \tabularnewline
50 & 8.98 & 8.99060666375797 & -0.0106066637579645 \tabularnewline
51 & 8.98 & 8.97588052119449 & 0.00411947880551261 \tabularnewline
52 & 9.01 & 9.01881674682278 & -0.0088167468227824 \tabularnewline
53 & 9.01 & 9.05488921753385 & -0.0448892175338521 \tabularnewline
54 & 9.03 & 9.0614328472494 & -0.0314328472494062 \tabularnewline
55 & 9.05 & 9.0653303942209 & -0.0153303942209016 \tabularnewline
56 & 9.05 & 9.08096586585452 & -0.0309658658545183 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57514&T=4
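The Interpolation (Forecast) column holds the in-sample fitted values, computed in the R code below as actuals minus residuals. A minimal sketch, assuming x is the regressor matrix with the dependent variable in its first column and mysum <- summary(mylm):

fitted.vals <- x[, 1] - mysum$resid    # Interpolation (Forecast)
resid.vals  <- mysum$resid             # Residuals (Prediction Error)
head(cbind(actual = x[, 1], fitted = fitted.vals, residual = resid.vals), 3)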








Goldfeld-Quandt test for Heteroskedasticity

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
21 & 0.830045822109994 & 0.339908355780012 & 0.169954177890006 \tabularnewline
22 & 0.791264967949003 & 0.417470064101993 & 0.208735032050997 \tabularnewline
23 & 0.680816251234396 & 0.638367497531208 & 0.319183748765604 \tabularnewline
24 & 0.614677763614583 & 0.770644472770834 & 0.385322236385417 \tabularnewline
25 & 0.759504530068582 & 0.480990939862836 & 0.240495469931418 \tabularnewline
26 & 0.674896489328457 & 0.650207021343086 & 0.325103510671543 \tabularnewline
27 & 0.696827112009139 & 0.606345775981721 & 0.303172887990861 \tabularnewline
28 & 0.98713153176488 & 0.0257369364702386 & 0.0128684682351193 \tabularnewline
29 & 0.996898973257942 & 0.00620205348411592 & 0.00310102674205796 \tabularnewline
30 & 0.993424494642057 & 0.013151010715886 & 0.006575505357943 \tabularnewline
31 & 0.995039597157284 & 0.00992080568543284 & 0.00496040284271642 \tabularnewline
32 & 0.985518182266335 & 0.0289636354673294 & 0.0144818177336647 \tabularnewline
33 & 0.975152698173664 & 0.0496946036526722 & 0.0248473018263361 \tabularnewline
34 & 0.98151796829719 & 0.0369640634056189 & 0.0184820317028095 \tabularnewline
35 & 0.937407581104905 & 0.125184837790189 & 0.0625924188950947 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57514&T=5
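Each row of this table comes from lmtest's gqtest() applied to the fitted model at a single breakpoint, once for each alternative hypothesis, as in the loop of the R code below. A minimal sketch for one cell of the table (breakpoint 28, two-sided alternative, assuming mylm is the fitted lm object):

library(lmtest)
gqtest(mylm, point = 28, alternative = 'two.sided')$p.value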








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 2 & 0.133333333333333 & NOK \tabularnewline
5% type I error level & 7 & 0.466666666666667 & NOK \tabularnewline
10% type I error level & 7 & 0.466666666666667 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57514&T=6
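The meta analysis counts how many of the 15 breakpoints have a 2-sided p-value below each type I error level and reports that count together with its fraction of all tests (the '% significant tests' column is this fraction, not a percentage). A minimal sketch, assuming gqarr is the p-value array built in the R code below:

sum(gqarr[, 2] < 0.05)    # number of significant Goldfeld-Quandt tests at the 5% level
mean(gqarr[, 2] < 0.05)   # the same count as a fraction of all breakpoints tested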




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
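# Note: the objects 'y' (the data matrix built from the pasted Dataseries X),
# the parameters 'par1'..'par3', and the table.* helper functions loaded later
# from the file 'createtable' are expected to be provided by the hosting R
# framework; this script is not standalone outside that environment.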
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
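# Seasonal dummies: add 11 monthly indicator columns M1..M11 (every 12th
# observation starting at position i), leaving the 12th month as the reference.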
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
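# Goldfeld-Quandt heteroskedasticity tests: for every admissible breakpoint
# (from k+3 to n-k-3) store the p-values of the 'greater', 'two.sided' and
# 'less' alternatives, and count how often the 2-sided test is significant.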
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
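# Diagnostic plots: actuals vs. interpolation, residual sequence, histogram,
# density, normal Q-Q plot, lag plot, ACF/PACF, standard lm diagnostics, and
# the Goldfeld-Quandt 2-sided p-values per breakpoint.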
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}