Free Statistics


Author: The author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 15 Dec 2015 10:25:52 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/15/t14501751754smaufvakavjjk2.htm/, Retrieved Sat, 18 May 2024 10:03:27 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=286451, Retrieved Sat, 18 May 2024 10:03:27 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 130
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [Multiple regressi...] [2015-12-15 10:25:52] [07325d4e03e5d5deea478d79524d9715] [Current]

Dataseries X (column 1: Homicide, column 2: War dummy):
0.5215052 1
0.4248284 1
0.4250311 1
0.4771938 1
0.8280212 1
0.6156186 1
0.366627 0
0.4308883 0
0.2810287 0
0.4646245 0
0.2693951 0
0.5779049 0
0.5661151 0
0.5077584 0
0.7507175 0
0.6808395 0
0.7661091 0
0.4561473 0
0.4977496 0
0.4193273 0
0.6095514 0
0.457337 0
0.5705478 0
0.3478996 0
0.3874993 1
0.5824285 1
0.2391033 1
0.2367445 1
0.2626158 1
0.4240934 1
0.365275 1
0.3750758 0
0.4090056 0
0.3891676 0
0.240261 0
0.1589496 1
0.4393373 1
0.5094681 1
0.3743465 1
0.4339828 1
0.4130557 1
0.3288928 1
0.5186648 1
0.5486504 1
0.5469111 1
0.4963494 1
0.5308929 0
0.5957761 0
0.5570584 1
0.5731325 1
0.5005416 1
0.5431269 1
0.5593657 0
0.6911693 0
0.4403485 0
0.5676662 0
0.5969114 0
0.4735537 0
0.5923935 0
0.5975556 0
0.6334127 0
0.6057115 0
0.7046107 0
0.4805263 0
0.702686 0
0.7009017 0
0.6030854 0
0.6980919 0
0.597656 0
0.8023421 0
0.6017109 0
0.5993127 0
0.6025625 0
0.7016625 0
0.4995714 0
0.4980918 1
0.497569 1
0.600183 0
0.3339542 0
0.274437 0
0.3209428 0
0.5406671 0
0.4050209 0
0.2885961 0
0.3275942 0
0.3132606 0
0.2575562 0
0.2138386 0
0.1861856 1
0.1592713 1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net


Multiple Linear Regression - Estimated Regression Equation
Homicide[t] = + 0.504311 -0.107702War[t] + 0.0867491M1[t] + 0.110408M2[t] + 0.052825M3[t] + 0.0436594M4[t] + 0.125831M5[t] + 0.0832149M6[t] + 0.00581369M7[t] + 0.0181099M8[t] + 0.0257118M9[t] + 0.0639086M10[t] -0.00983117M11[t] -0.000763328t + e[t]
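
A minimal sketch (not the archive's own script, which is reproduced in full under "R code" below) of how a model of this form can be refitted in base R. Everything here is a placeholder: 'Homicide' and 'War' stand in for the two columns of Dataseries X, and the generated values are not the user's data.

set.seed(42)                                   # placeholder data only
n        <- 90
Homicide <- runif(n, 0.15, 0.85)               # stand-in for column 1 of Dataseries X
War      <- rbinom(n, 1, 0.3)                  # stand-in for the 0/1 dummy in column 2
M <- sapply(1:11, function(i) as.numeric((seq_len(n) - i) %% 12 == 0))  # monthly dummies M1..M11
colnames(M) <- paste0("M", 1:11)
dat <- data.frame(Homicide, War, M, t = seq_len(n))
fit <- lm(Homicide ~ ., data = dat)            # Homicide on War, M1..M11 and a linear trend t
summary(fit)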


Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter    S.D.        T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    +0.5043      0.0666      +7.5730e+00                  7.209e-11        3.605e-11
War            -0.1077      0.03889     -2.7700e+00                  0.007051         0.003525
M1             +0.08675     0.08051     +1.0770e+00                  0.2847           0.1423
M2             +0.1104      0.08052     +1.3710e+00                  0.1743           0.08717
M3             +0.05282     0.08053     +6.5600e-01                  0.5138           0.2569
M4             +0.04366     0.0815      +5.3570e-01                  0.5937           0.2969
M5             +0.1258      0.08153     +1.5430e+00                  0.1269           0.06346
M6             +0.08321     0.08059     +1.0330e+00                  0.3051           0.1525
M7             +0.005814    0.08218     +7.0740e-02                  0.9438           0.4719
M8             +0.01811     0.08203     +2.2080e-01                  0.8259           0.4129
M9             +0.02571     0.08201     +3.1350e-01                  0.7547           0.3774
M10            +0.06391     0.082       +7.7940e-01                  0.4382           0.2191
M11            -0.009831    0.08219     -1.1960e-01                  0.9051           0.4526
t              -0.0007633   0.0006444   -1.1850e+00                  0.2399           0.1199
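
A sketch of how the columns above map onto R's coefficient matrix, assuming the hypothetical 'fit' object from the sketch further up; as in the module's code, the 1-tail p-value is simply half of the 2-tail p-value.

cf <- summary(fit)$coefficients     # Estimate, Std. Error, t value, Pr(>|t|)
cbind(Parameter = cf[, 1],
      S.D.      = cf[, 2],
      T.STAT    = cf[, 3],
      p.2tail   = cf[, 4],
      p.1tail   = cf[, 4] / 2)      # the module reports Pr(>|t|)/2 as the 1-tail p-value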


Multiple Linear Regression - Regression Statistics
Multiple R: 0.3591
R-squared: 0.129
Adjusted R-squared: -0.02003
F-TEST (value): 0.8656
F-TEST (DF numerator): 13
F-TEST (DF denominator): 76
p-value: 0.5913

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.1534
Sum Squared Residuals: 1.788
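
The figures in this table are direct functions of summary(fit); the calls below mirror the ones in the module's R code, again using the hypothetical 'fit' from the earlier sketch.

s <- summary(fit)
sqrt(s$r.squared)                                          # Multiple R
s$r.squared                                                # R-squared
s$adj.r.squared                                            # Adjusted R-squared
s$fstatistic                                               # F value with its degrees of freedom
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])  # p-value of the overall F-test
s$sigma                                                    # Residual Standard Deviation
sum(residuals(fit)^2)                                      # Sum Squared Residuals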


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
 1   0.5215   0.4826    0.03891
 2   0.4248   0.5055   -0.08066
 3   0.425    0.4471   -0.02211
 4   0.4772   0.4372    0.03998
 5   0.828    0.5186    0.3094
 6   0.6156   0.4752    0.1404
 7   0.3666   0.5048   -0.1382
 8   0.4309   0.5163   -0.08543
 9   0.281    0.5232   -0.2421
10   0.4646   0.5606   -0.09596
11   0.2694   0.4861   -0.2167
12   0.5779   0.4952    0.08275
13   0.5661   0.5811   -0.01502
14   0.5078   0.604    -0.09627
15   0.7507   0.5457    0.205
16   0.6808   0.5358    0.1451
17   0.7661   0.6172    0.1489
18   0.4561   0.5738   -0.1176
19   0.4978   0.4956    0.002128
20   0.4193   0.5072   -0.08783
21   0.6096   0.514     0.09556
22   0.4573   0.5514   -0.09409
23   0.5705   0.4769    0.09362
24   0.3479   0.486    -0.1381
25   0.3875   0.4643   -0.07677
26   0.5824   0.4872    0.09526
27   0.2391   0.4288   -0.1897
28   0.2367   0.4189   -0.1822
29   0.2626   0.5003   -0.2377
30   0.4241   0.4569   -0.03283
31   0.3653   0.3788   -0.01348
32   0.3751   0.498    -0.1229
33   0.409    0.5048   -0.09583
34   0.3892   0.5423   -0.1531
35   0.2403   0.4678   -0.2275
36   0.159    0.3691   -0.2102
37   0.4393   0.4551   -0.01578
38   0.5095   0.478     0.03146
39   0.3743   0.4197   -0.04532
40   0.434    0.4097    0.02425
41   0.4131   0.4911   -0.07809
42   0.3289   0.4478   -0.1189
43   0.5187   0.3696    0.1491
44   0.5486   0.3811    0.1675
45   0.5469   0.388     0.1589
46   0.4963   0.4254    0.07095
47   0.5309   0.4586    0.07229
48   0.5958   0.4677    0.1281
49   0.5571   0.446     0.1111
50   0.5731   0.4688    0.1043
51   0.5005   0.4105    0.09004
52   0.5431   0.4006    0.1426
53   0.5594   0.5897   -0.03032
54   0.6912   0.5463    0.1449
55   0.4403   0.4681   -0.02779
56   0.5677   0.4797    0.08799
57   0.5969   0.4865    0.1104
58   0.4736   0.5239   -0.05039
59   0.5924   0.4494    0.1429
60   0.5976   0.4585    0.139
61   0.6334   0.5445    0.08892
62   0.6057   0.5674    0.03832
63   0.7046   0.509     0.1956
64   0.4805   0.4991   -0.01859
65   0.7027   0.5805    0.1222
66   0.7009   0.5371    0.1638
67   0.6031   0.459     0.1441
68   0.6981   0.4705    0.2276
69   0.5977   0.4774    0.1203
70   0.8023   0.5148    0.2876
71   0.6017   0.4403    0.1614
72   0.5993   0.4494    0.15
73   0.6026   0.5353    0.06723
74   0.7017   0.5582    0.1434
75   0.4996   0.4999   -0.0003148
76   0.4981   0.3823    0.1158
77   0.4976   0.4637    0.03391
78   0.6002   0.528     0.0722
79   0.334    0.4498   -0.1159
80   0.2744   0.4614   -0.1869
81   0.3209   0.4682   -0.1472
82   0.5407   0.5056    0.03504
83   0.405    0.4311   -0.0261
84   0.2886   0.4402   -0.1516
85   0.3276   0.5262   -0.1986
86   0.3133   0.5491   -0.2358
87   0.2576   0.4907   -0.2332
88   0.2138   0.4808   -0.267
89   0.1862   0.4545   -0.2683
90   0.1593   0.4111   -0.2519
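
A short sketch of how the columns relate, assuming the hypothetical 'fit' and 'dat' from above: the Interpolation (Forecast) column is the in-sample fitted value, and the Residual is the actual minus the fitted value (the module computes it as x - mysum$resid).

head(data.frame(Actuals       = dat$Homicide,
                Interpolation = fitted(fit),
                Residuals     = residuals(fit)))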


Goldfeld-Quandt test for Heteroskedasticity
p-values for each Alternative Hypothesis
breakpoint index   greater   2-sided   less
17 0.4185 0.8371 0.5815
18 0.495 0.99 0.505
19 0.3457 0.6914 0.6543
20 0.2574 0.5147 0.7426
21 0.2758 0.5516 0.7242
22 0.227 0.454 0.773
23 0.2022 0.4044 0.7978
24 0.3487 0.6974 0.6513
25 0.3623 0.7246 0.6377
26 0.2992 0.5983 0.7008
27 0.4586 0.9172 0.5414
28 0.5146 0.9708 0.4854
29 0.6928 0.6144 0.3072
30 0.6263 0.7474 0.3737
31 0.5932 0.8137 0.4068
32 0.5622 0.8756 0.4378
33 0.5284 0.9432 0.4716
34 0.5509 0.8982 0.4491
35 0.6737 0.6526 0.3263
36 0.7323 0.5355 0.2677
37 0.7028 0.5943 0.2972
38 0.6768 0.6463 0.3232
39 0.6494 0.7013 0.3506
40 0.6101 0.7797 0.3899
41 0.6043 0.7914 0.3957
42 0.6865 0.6271 0.3135
43 0.7243 0.5514 0.2757
44 0.7751 0.4499 0.2249
45 0.7803 0.4394 0.2197
46 0.7847 0.4305 0.2153
47 0.815 0.3699 0.185
48 0.8159 0.3681 0.1841
49 0.7767 0.4465 0.2233
50 0.7331 0.5339 0.2669
51 0.6995 0.601 0.3005
52 0.646 0.708 0.354
53 0.6576 0.6848 0.3424
54 0.6236 0.7528 0.3764
55 0.654 0.6921 0.346
56 0.6249 0.7502 0.3751
57 0.5739 0.8521 0.4261
58 0.8968 0.2064 0.1032
59 0.9247 0.1506 0.07528
60 0.9283 0.1435 0.07173
61 0.9311 0.1378 0.06889
62 0.9781 0.04389 0.02194
63 0.9664 0.06714 0.03357
64 0.9979 0.004173 0.002087
65 0.9969 0.006138 0.003069
66 0.9991 0.001881 0.0009405
67 0.9977 0.004588 0.002294
68 0.9984 0.00326 0.00163
69 0.9953 0.009468 0.004734
70 0.9899 0.02018 0.01009
71 0.9928 0.01442 0.007209
72 0.9754 0.04923 0.02462
73 0.9405 0.1191 0.05953
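
A sketch of the breakpoint-by-breakpoint Goldfeld-Quandt scan behind this table, using lmtest::gqtest on the hypothetical 'fit' and 'dat' objects from the earlier sketches; the breakpoint range k+3 .. n-k-3 follows the module's code.

library(lmtest)
k  <- length(coef(fit))                      # number of estimated parameters (14 in this model)
n  <- nrow(dat)
bp <- (k + 3):(n - k - 3)                    # breakpoints 17 .. 73 for n = 90
gq <- t(sapply(bp, function(p)
         sapply(c("greater", "two.sided", "less"),
                function(alt) gqtest(fit, point = p, alternative = alt)$p.value)))
rownames(gq) <- bp
head(gq)                                     # one row of p-values per breakpoint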


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     6                     0.1053                NOK
5% type I error level     10                    0.175439              NOK
10% type I error level    11                    0.192982              NOK
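
The meta-analysis simply counts how many of the 2-sided Goldfeld-Quandt p-values fall below each nominal type I error level; a sketch using the 'gq' matrix from the previous example (the module labels the proportion column "% significant tests"):

p2 <- gq[, "two.sided"]
data.frame(level       = c("1%", "5%", "10%"),
           significant = c(sum(p2 < 0.01), sum(p2 < 0.05), sum(p2 < 0.10)),
           proportion  = c(mean(p2 < 0.01), mean(p2 < 0.05), mean(p2 < 0.10)))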


Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
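# y holds the data pasted by the user (defined in the raw input, not shown here);
# t(y) puts the series in columns and na.omit() drops incomplete rows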
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
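# optional transformations of all columns, depending on par3:
# first differences, seasonal differences (s=12), or both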
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
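# optional lagged copies of the endogenous series:
# par4 non-seasonal lags and par5 seasonal lags (s=12)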
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
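# seasonal dummies: M1..M11 (or Q1..Q3) flag positions 1..11 within each block of 12
# observations; the last period of the cycle is the reference category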
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
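# append a linear trend column 't' when a Linear Trend is requested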
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
(k <- length(x[n,]))
head(x)
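# ordinary least squares fit; with a data frame as the only argument, lm() takes the
# first column (the endogenous series) as the response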
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
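# Goldfeld-Quandt test at every admissible breakpoint (k+3 through n-k-3); the 2-sided
# p-values are later counted against the 1%, 5% and 10% type I error levels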
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
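# diagnostic plots: actuals with interpolation, residuals, histogram, density,
# normal Q-Q, lag plot, ACF, PACF, and the standard lm diagnostics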
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
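# build the HTML tables shown above: regression equation, OLS estimates,
# regression/residual statistics, actuals and residuals, and the Goldfeld-Quandt results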
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}