Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 23 Jan 2017 10:41:53 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2017/Jan/23/t14851645243saxzjybegww8j5.htm/, Retrieved Wed, 15 May 2024 23:51:04 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=304574, Retrieved Wed, 15 May 2024 23:51:04 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 49
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [11] [2017-01-23 09:41:53] [94c1b173d9287822f5e2740a4a602bdd] [Current]
Dataseries X:
14 22 13 22
19 24 16 24
17 21 17 26
17 21 NA 21
15 24 NA 26
20 20 16 25
15 22 NA 21
19 20 NA 24
15 19 NA 27
15 23 17 28
19 21 17 23
NA 19 15 25
20 19 16 24
18 21 14 24
15 21 16 24
14 22 17 25
20 22 NA 25
NA 19 NA NA
16 21 NA 25
16 21 NA 25
16 21 16 24
10 20 NA 26
19 22 16 26
19 22 NA 25
16 24 NA 26
15 21 NA 23
18 19 16 24
17 19 15 24
19 23 16 25
17 21 16 25
NA 21 13 24
19 19 15 28
20 21 17 27
5 19 NA NA
19 21 13 23
16 21 17 23
15 23 NA 24
16 19 14 24
18 19 14 22
16 19 18 25
15 18 NA 25
17 22 17 28
NA 18 13 22
20 22 16 28
19 18 15 25
7 22 15 24
13 22 NA 24
16 19 15 23
16 22 13 25
NA 25 NA NA
18 19 17 26
18 19 NA 25
16 19 NA 27
17 19 11 26
19 21 14 23
16 21 13 25
19 20 NA 21
13 19 17 22
16 19 16 24
13 22 NA 25
12 26 17 27
17 19 16 24
17 21 16 26
17 21 16 21
16 20 15 27
16 23 12 22
14 22 17 23
16 22 14 24
13 22 14 25
16 21 16 24
14 21 NA 23
20 22 NA 28
12 23 NA NA
13 18 NA 24
18 24 NA 26
14 22 15 22
19 21 16 25
18 21 14 25
14 21 15 24
18 23 17 24
19 21 NA 26
15 23 10 21
14 21 NA 25
17 19 17 25
19 21 NA 26
13 21 20 25
19 21 17 26
18 23 18 27
20 23 NA 25
15 20 17 NA
15 20 14 20
15 19 NA 24
20 23 17 26
15 22 NA 25
19 19 17 25
18 23 NA 24
18 22 16 26
15 22 18 25
20 21 18 28
17 21 16 27
12 21 NA 25
18 21 NA 26
19 22 15 26
20 25 13 26
NA 21 NA NA
17 23 NA 28
15 19 NA NA
16 22 NA 21
18 20 NA 25
18 21 16 25
14 25 NA 24
15 21 NA 24
12 19 NA 24
17 23 12 23
14 22 NA 23
18 21 16 24
17 24 16 24
17 21 NA 25
20 19 16 28
16 18 14 23
14 19 15 24
15 20 14 23
18 19 NA 24
20 22 15 25
17 21 NA 24
17 22 15 23
17 24 16 23
17 28 NA 25
15 19 NA 21
17 18 NA 22
18 23 11 19
17 19 NA 24
20 23 18 25
15 19 NA 21
16 22 11 22
15 21 NA 23
18 19 18 27
11 22 NA NA
15 21 15 26
18 23 19 29
20 22 17 28
19 19 NA 24
14 19 14 25
16 21 NA 25
15 22 13 22
17 21 17 25
18 20 14 26
20 23 19 26
17 22 14 24
18 23 NA 25
15 22 NA 19
16 21 16 25
11 20 16 23
15 18 15 25
18 18 12 25
17 20 NA 26
16 19 17 27
12 21 NA 24
19 24 NA 22
18 19 18 25
15 20 15 24
17 19 18 23
19 23 15 27
18 22 NA 24
19 21 NA 24
16 24 NA 21
16 21 16 25
16 21 NA 25
14 22 16 23
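
The block below is a minimal R sketch (not part of the original submission) showing how the four-column series above can be loaded and how rows with missing values are dropped before estimation; the module's own code further down does the same thing with na.omit(). The mapping of the columns to the names ITHSUM, Bevr_Leeftijd, TVDC and SKEOUSUM is an assumption based on the variable names that appear in the output below.

# Hypothetical data ingest: paste the full series above into 'raw'
raw <- "14 22 13 22
19 24 16 24
17 21 17 26"                       # ... continue with the remaining rows
df <- read.table(text = raw, na.strings = "NA",
                 col.names = c("ITHSUM", "Bevr_Leeftijd", "TVDC", "SKEOUSUM"))
df <- na.omit(df)                   # listwise deletion of incomplete rows
nrow(df)                            # the full series leaves 99 complete cases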




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 9 seconds
R Server: Big Analytics Cloud Computing Center


Multiple Linear Regression - Estimated Regression Equation
ITHSUM[t] = 7.13508 - 0.00181701 Bevr_Leeftijd[t] - 0.0610084 TVDC[t] + 0.433477 SKEOUSUM[t] + e[t]
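
As a sketch (assuming the data frame df constructed above), the same equation is obtained from an ordinary least squares fit with lm(), with ITHSUM as the endogenous variable:

fit <- lm(ITHSUM ~ Bevr_Leeftijd + TVDC + SKEOUSUM, data = df)
coef(fit)   # should reproduce 7.13508, -0.00181701, -0.0610084, 0.433477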



Multiple Linear Regression - Ordinary Least Squares
Variable        Parameter    S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)     +7.135       4.011    +1.7790e+00                  0.07849          0.03924
Bevr_Leeftijd   -0.001817    0.1327   -1.3690e-02                  0.9891           0.4946
TVDC            -0.06101     0.1334   -4.5740e-01                  0.6484           0.3242
SKEOUSUM        +0.4335      0.1331   +3.2560e+00                  0.001568         0.0007838
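
The table above can be reproduced from the summary of the fitted model; as in the module's R code below, the 1-tail p-value is simply half the 2-tail p-value (a sketch, assuming the fit object from above):

s <- summary(fit)
cbind(s$coefficients, "1-tail p" = s$coefficients[, 4] / 2)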



Multiple Linear Regression - Regression Statistics
Multiple R: 0.3354
R-squared: 0.1125
Adjusted R-squared: 0.08445
F-TEST (value): 4.013
F-TEST (DF numerator): 3
F-TEST (DF denominator): 95
p-value: 0.009775
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.186
Sum Squared Residuals: 453.9
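
These statistics come from the same summary object; the p-value of the F-test is computed with pf(), exactly as in the module code below (a sketch, assuming s = summary(fit) from above):

c(Multiple.R = sqrt(s$r.squared), R.squared = s$r.squared,
  Adj.R.squared = s$adj.r.squared, Residual.SD = s$sigma,
  SSR = sum(resid(fit)^2))
s$fstatistic                                               # value, DF numerator, DF denominator
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])  # F-test p-value, about 0.0098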



Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1    14    15.84    -1.838
2    19    16.52    2.481
3    17    17.33    -0.3302
4    20    16.96    3.04
5    15    18.19    -3.193
6    19    16.03    2.97
7    20    16.53    3.472
8    18    16.65    1.354
9    15    16.52    -1.524
10   14    16.89    -2.895
11   16    16.52    -0.5242
12   19    17.39    1.611
13   18    16.53    1.472
14   17    16.59    0.4111
15   19    16.95    2.046
16   17    16.96    0.04229
17   19    18.32    0.6772
18   20    17.76    2.236
19   19    16.27    2.726
20   16    16.03    -0.02975
21   16    16.65    -0.6499
22   18    15.78    2.217
23   16    16.84    -0.8393
24   17    18.2     -1.195
25   20    18.26    1.744
26   19    17.02    1.976
27   7     16.58    -9.583
28   16    16.16    -0.1554
29   16    17.14    -1.139
30   18    17.33    0.6662
31   17    17.7     -0.6999
32   19    16.21    2.787
33   16    17.14    -1.141
34   13    15.6     -2.6
35   16    16.53    -0.5279
36   12    17.75    -5.755
37   17    16.53    0.4721
38   17    17.39    -0.3912
39   17    15.22    1.776
40   16    17.89    -1.887
41   16    15.9     0.1023
42   14    16.03    -2.028
43   16    16.64    -0.6444
44   13    17.08    -4.078
45   16    16.52    -0.5242
46   14    15.72    -1.716
47   19    16.96    2.042
48   18    17.08    0.9203
49   14    16.59    -2.585
50   18    16.46    1.54
51   15    15.59    -0.5862
52   17    16.9     0.09966
53   13    16.71    -3.714
54   19    17.33    1.67
55   18    17.7     0.301
56   15    14.91    0.08584
57   20    17.33    2.673
58   19    16.9     2.1
59   18    17.39    0.6106
60   15    16.83    -1.834
61   20    18.14    1.864
62   17    17.82    -0.8247
63   19    17.45    1.55
64   20    17.57    2.433
65   18    16.96    1.042
66   17    16.33    0.6688
67   18    16.52    1.476
68   17    16.52    0.4812
69   20    18.26    1.738
70   16    16.22    -0.2182
71   14    16.59    -2.589
72   15    16.21    -1.215
73   20    17.02    2.983
74   17    16.15    0.8501
75   17    16.09    0.9147
76   18    14.66    3.342
77   20    16.83    3.168
78   16    15.96    0.03949
79   18    17.71    0.2937
80   15    17.45    -2.452
81   18    18.5     -0.505
82   20    18.2     1.805
83   14    17.08    -3.083
84   15    15.84    -0.8385
85   17    16.9     0.1033
86   18    17.52    0.485
87   20    17.2     2.795
88   17    16.64    0.3556
89   16    16.96    -0.9577
90   11    16.09    -5.093
91   15    17.02    -2.024
92   18    17.21    0.7928
93   16    17.77    -1.767
94   18    16.84    1.161
95   15    16.59    -1.587
96   17    15.97    1.028
97   19    17.88    1.118
98   16    16.96    -0.9577
99   14    16.09    -2.089
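
The interpolation (forecast) column is just the fitted value for each observation and the residual is the prediction error, so the table can be regenerated directly from the fit (a sketch, assuming fit and df from above):

head(data.frame(Index = seq_len(nrow(df)), Actuals = df$ITHSUM,
                Interpolation = fitted(fit), Residuals = resid(fit)))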



Goldfeld-Quandt test for Heteroskedasticity
p-values (by Alternative Hypothesis)
breakpoint index   greater   2-sided   less
7 0.624 0.7519 0.376
8 0.5756 0.8488 0.4244
9 0.698 0.604 0.302
10 0.8014 0.3971 0.1986
11 0.746 0.5081 0.254
12 0.7518 0.4963 0.2482
13 0.6755 0.649 0.3245
14 0.6009 0.7981 0.3991
15 0.6119 0.7762 0.3881
16 0.5248 0.9505 0.4752
17 0.4583 0.9166 0.5417
18 0.4446 0.8892 0.5554
19 0.4489 0.8979 0.5511
20 0.3917 0.7833 0.6083
21 0.372 0.7441 0.628
22 0.3225 0.645 0.6775
23 0.2991 0.5982 0.7009
24 0.2471 0.4942 0.7529
25 0.2372 0.4744 0.7628
26 0.2017 0.4033 0.7983
27 0.9863 0.02747 0.01374
28 0.9808 0.03835 0.01917
29 0.9735 0.05295 0.02648
30 0.9634 0.07311 0.03655
31 0.9508 0.09847 0.04924
32 0.9588 0.08237 0.04118
33 0.9463 0.1074 0.05372
34 0.9597 0.08059 0.0403
35 0.9469 0.1061 0.05307
36 0.9923 0.01535 0.007675
37 0.9893 0.02134 0.01067
38 0.9844 0.03121 0.01561
39 0.9839 0.03218 0.01609
40 0.9827 0.03465 0.01733
41 0.9757 0.04868 0.02434
42 0.9736 0.05274 0.02637
43 0.9643 0.07138 0.03569
44 0.9876 0.0247 0.01235
45 0.9822 0.03555 0.01778
46 0.9794 0.04117 0.02058
47 0.9797 0.04058 0.02029
48 0.9731 0.05375 0.02687
49 0.9778 0.04441 0.0222
50 0.9752 0.04967 0.02483
51 0.9685 0.06303 0.03152
52 0.9583 0.08338 0.04169
53 0.98 0.0399 0.01995
54 0.9775 0.045 0.0225
55 0.9717 0.05667 0.02833
56 0.9619 0.07617 0.03808
57 0.9668 0.0663 0.03315
58 0.9732 0.0535 0.02675
59 0.9631 0.07378 0.03689
60 0.965 0.07005 0.03503
61 0.9604 0.07925 0.03963
62 0.9494 0.1012 0.05061
63 0.9383 0.1235 0.06174
64 0.9348 0.1304 0.06522
65 0.9172 0.1657 0.08283
66 0.8909 0.2182 0.1091
67 0.8757 0.2485 0.1243
68 0.8469 0.3061 0.1531
69 0.8577 0.2845 0.1423
70 0.8381 0.3238 0.1619
71 0.8308 0.3384 0.1692
72 0.7907 0.4186 0.2093
73 0.8242 0.3515 0.1758
74 0.7783 0.4434 0.2217
75 0.7283 0.5434 0.2717
76 0.8236 0.3529 0.1764
77 0.8638 0.2724 0.1362
78 0.8376 0.3249 0.1624
79 0.7852 0.4296 0.2148
80 0.7948 0.4104 0.2052
81 0.8175 0.365 0.1825
82 0.7598 0.4804 0.2402
83 0.7729 0.4543 0.2271
84 0.7139 0.5722 0.2861
85 0.6255 0.749 0.3745
86 0.5351 0.9299 0.4649
87 0.5456 0.9088 0.4544
88 0.4885 0.9771 0.5115
89 0.3731 0.7461 0.6269
90 0.647 0.7059 0.353
91 0.612 0.776 0.388
92 0.6172 0.7656 0.3828
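
The table above is a rolling Goldfeld-Quandt test: for every admissible breakpoint the sample is split and the residual variances of the two sub-regressions are compared. A sketch of the same computation with lmtest::gqtest(), mirroring the loop in the module code below (k is the number of estimated parameters, here 4):

library(lmtest)
k <- length(coef(fit)); n <- nrow(df)
gq <- t(sapply((k + 3):(n - k - 3), function(bp)
  sapply(c("greater", "two.sided", "less"), function(alt)
    gqtest(fit, point = bp, alternative = alt)$p.value)))
head(gq)   # one row of p-values per breakpoint, as listed above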



Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    16                    0.186047              NOK
10% type I error level   33                    0.383721              NOK
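
The meta analysis simply counts how many of the 2-sided Goldfeld-Quandt p-values fall below each type I error level and reports the corresponding fraction (a sketch, assuming the gq matrix from the previous sketch):

sapply(c("1%" = 0.01, "5%" = 0.05, "10%" = 0.10),
       function(a) c(count = sum(gq[, "two.sided"] < a),
                     fraction = mean(gq[, "two.sided"] < a)))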



Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.0868, df1 = 2, df2 = 93, p-value = 0.3415
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3994, df1 = 6, df2 = 89, p-value = 0.2237
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.2832, df1 = 2, df2 = 93, p-value = 0.282
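
The three variants above are produced by lmtest::resettest() with powers 2 and 3 of, respectively, the fitted values, the regressors, and the first principal component of the regressors; the calls below mirror the ones in the module code (a sketch, assuming fit from above):

library(lmtest)
resettest(fit, power = 2:3, type = "fitted")
resettest(fit, power = 2:3, type = "regressor")
resettest(fit, power = 2:3, type = "princomp")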



Variance Inflation Factors (Multicollinearity)
> vif
Bevr_Leeftijd          TVDC      SKEOUSUM 
     1.002095      1.276005      1.277438 
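
The variance inflation factors are obtained with car::vif() on the same fit; values this close to 1 indicate that multicollinearity among the regressors is negligible (a sketch, assuming fit from above):

library(car)
vif(fit)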




Parameters (Session):
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
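# Note: y (the submitted data) and par1..par5 (the user parameters) are supplied by the
# calling environment and are not defined in this listing. t(y) puts the series in columns;
# na.omit() then drops every row that contains a missing value (listwise deletion).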
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
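# Fit the multiple regression: calling lm() on a data frame regresses its first column
# (the endogenous series selected by par1) on all remaining columns.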
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')