Free Statistics

Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 25 Jan 2017 09:26:00 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2017/Jan/25/t1485332785ltqmg6ef675ggtp.htm/, Retrieved Tue, 14 May 2024 00:23:41 +0200

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 0
Dataseries X:
4 2
5 3
4 4
3 4
4 4
3 4
3 4
3 4
4 5
4 5
4 4
4 4
4 4
3 3
4 4
3 4
3 4
NA NA
5 5
4 4
3 4
4 4
4 4
4 4
4 4
3 4
3 4
4 4
2 4
5 4
4 3
4 5
5 4
4 3
2 3
4 5
3 4
4 3
4 3
4 4
5 4
4 5
3 3
5 5
5 4
4 4
4 4
3 5
4 4
2 3
4 5
5 5
5 5
4 3
4 3
4 4
3 4
3 4
4 4
4 4
5 5
2 4
4 4
3 4
4 4
4 2
4 4
4 4
5 4
3 4
3 4
4 5
4 4
4 4
4 4
3 4
4 4
3 4
3 3
4 3
4 4
3 3
4 4
4 4
4 4
5 4
5 4
4 4
3 4
3 NA
4 2
4 4
4 4
4 4
4 5
3 4
4 4
5 4
5 4
4 5
3 4
5 3
4 4
5 4
3 4
5 4
4 4
4 4
4 4
4 4
3 4
4 4
4 4
3 3
4 4
3 4
4 4
5 4
5 4
4 4
4 4
3 4
4 4
4 4
4 5
3 4
4 4
4 4
3 4
4 4
3 2
4 4
5 4
2 4
3 3
4 4
5 5
NA NA
4 5
5 5
4 5
4 4
3 4
4 4
4 4
4 4
4 4
5 4
4 3
4 4
3 3
4 5
4 4
4 4
3 4
4 4
5 4
4 4
2 3
4 4
4 3
4 4
4 5
5 4
5 4
3 3
4 4
4 4
2 3




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: Big Analytics Cloud Computing Center
R Engine error message:
Error in vif.default(mylm) : model contains fewer than 2 terms
Calls: vif -> vif.default
Execution halted
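The run aborted because the module calls R's vif() (presumably from the car package) on a model with a single regressor. The variance inflation factor VIF_j = 1 / (1 - R_j^2) requires regressing each predictor on the remaining predictors, so it is undefined when only one predictor is present, which is exactly what the error reports. A minimal sketch of the two-predictor case in pure Python (hypothetical data, not the module's own code):

```python
# VIF for the two-predictor case: VIF = 1 / (1 - r^2), where r is the
# correlation between the two predictors. With a single regressor there
# is nothing to regress it on, hence the "fewer than 2 terms" error.

def correlation(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def vif_two_predictors(x1, x2):
    r = correlation(x1, x2)
    return 1.0 / (1.0 - r ** 2)

# Hypothetical illustration: two weakly correlated predictors.
x1 = [4, 5, 4, 3, 4, 3, 3, 3, 4, 4]
x2 = [2, 3, 4, 4, 4, 4, 4, 4, 5, 5]
print(vif_two_predictors(x1, x2))  # close to 1 when correlation is weak
```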








Multiple Linear Regression - Estimated Regression Equation
SKEOU1[t] = 2.50324 + 0.341165 SKEOU2[t] + e[t]
Warning: you did not specify the column number of the endogenous series! The first column was selected by default.
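Since SKEOU2 only takes the values 2 through 5, the fitted values in the interpolation table below reduce to four distinct numbers. A quick sketch reproducing them from the coefficients in the equation above:

```python
# Fitted values from the reported OLS equation:
#   SKEOU1[t] = 2.50324 + 0.341165 * SKEOU2[t] + e[t]
b0, b1 = 2.50324, 0.341165

def fitted(x):
    """Predicted SKEOU1 for a given SKEOU2 score."""
    return b0 + b1 * x

for x in (2, 3, 4, 5):
    print(x, round(fitted(x), 3))
# x = 2 -> 3.186, x = 3 -> 3.527, x = 4 -> 3.868, x = 5 -> 4.209
```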








Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter  S.D.     T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  +2.503     0.3633   +6.891                      1.133e-10       5.664e-11
SKEOU2       +0.3412    0.09101  +3.749                      0.0002459       0.000123
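The T-STAT column is simply the parameter estimate divided by its standard deviation. A quick check against the displayed values (small discrepancies come from rounding in the displayed figures):

```python
# t-statistic = estimate / standard error, checked against the OLS table.
rows = {
    "(Intercept)": (2.503, 0.3633, 6.891),
    "SKEOU2": (0.3412, 0.09101, 3.749),
}
for name, (est, se, t_reported) in rows.items():
    t = est / se
    print(f"{name}: t = {t:.3f} (reported {t_reported})")
    assert abs(t - t_reported) < 0.01
```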








Multiple Linear Regression - Regression Statistics
Multiple R: 0.2809
R-squared: 0.07893
Adjusted R-squared: 0.07331
F-TEST (value): 14.05
F-TEST (DF numerator): 1
F-TEST (DF denominator): 164
p-value: 0.0002459
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.7079
Sum Squared Residuals: 82.19
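With a single regressor these statistics are tightly linked: R-squared is the square of Multiple R, the F statistic equals the squared t statistic of the slope, the adjusted R-squared rescales by the degrees of freedom (n = 166 usable observations, i.e. 164 residual degrees of freedom plus two estimated parameters), and the residual standard deviation is sqrt(RSS / 164). A sketch verifying the reported values against each other:

```python
from math import sqrt

multiple_r = 0.2809
r_squared = 0.07893
n, k = 166, 1            # observations, regressors (166 = 164 df + 2 params)
rss = 82.19

# R^2 = (Multiple R)^2
assert abs(multiple_r**2 - r_squared) < 1e-3

# F = t^2 for a single regressor (t = 3.749 from the OLS table)
assert abs(3.749**2 - 14.05) < 0.02

# Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)
adj = 1 - (1 - r_squared) * (n - 1) / (n - k - 1)
assert abs(adj - 0.07331) < 1e-4

# Residual SD = sqrt(RSS / (n - k - 1))
assert abs(sqrt(rss / (n - k - 1)) - 0.7079) < 1e-3
```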








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1 4 3.186 0.8144
2 5 3.527 1.473
3 4 3.868 0.1321
4 3 3.868-0.8679
5 4 3.868 0.1321
6 3 3.868-0.8679
7 3 3.868-0.8679
8 3 3.868-0.8679
9 4 4.209-0.2091
10 4 4.209-0.2091
11 4 3.868 0.1321
12 4 3.868 0.1321
13 4 3.868 0.1321
14 3 3.527-0.5267
15 4 3.868 0.1321
16 3 3.868-0.8679
17 3 3.868-0.8679
18 5 4.209 0.7909
19 4 3.868 0.1321
20 3 3.868-0.8679
21 4 3.868 0.1321
22 4 3.868 0.1321
23 4 3.868 0.1321
24 4 3.868 0.1321
25 3 3.868-0.8679
26 3 3.868-0.8679
27 4 3.868 0.1321
28 2 3.868-1.868
29 5 3.868 1.132
30 4 3.527 0.4733
31 4 4.209-0.2091
32 5 3.868 1.132
33 4 3.527 0.4733
34 2 3.527-1.527
35 4 4.209-0.2091
36 3 3.868-0.8679
37 4 3.527 0.4733
38 4 3.527 0.4733
39 4 3.868 0.1321
40 5 3.868 1.132
41 4 4.209-0.2091
42 3 3.527-0.5267
43 5 4.209 0.7909
44 5 3.868 1.132
45 4 3.868 0.1321
46 4 3.868 0.1321
47 3 4.209-1.209
48 4 3.868 0.1321
49 2 3.527-1.527
50 4 4.209-0.2091
51 5 4.209 0.7909
52 5 4.209 0.7909
53 4 3.527 0.4733
54 4 3.527 0.4733
55 4 3.868 0.1321
56 3 3.868-0.8679
57 3 3.868-0.8679
58 4 3.868 0.1321
59 4 3.868 0.1321
60 5 4.209 0.7909
61 2 3.868-1.868
62 4 3.868 0.1321
63 3 3.868-0.8679
64 4 3.868 0.1321
65 4 3.186 0.8144
66 4 3.868 0.1321
67 4 3.868 0.1321
68 5 3.868 1.132
69 3 3.868-0.8679
70 3 3.868-0.8679
71 4 4.209-0.2091
72 4 3.868 0.1321
73 4 3.868 0.1321
74 4 3.868 0.1321
75 3 3.868-0.8679
76 4 3.868 0.1321
77 3 3.868-0.8679
78 3 3.527-0.5267
79 4 3.527 0.4733
80 4 3.868 0.1321
81 3 3.527-0.5267
82 4 3.868 0.1321
83 4 3.868 0.1321
84 4 3.868 0.1321
85 5 3.868 1.132
86 5 3.868 1.132
87 4 3.868 0.1321
88 3 3.868-0.8679
89 4 3.186 0.8144
90 4 3.868 0.1321
91 4 3.868 0.1321
92 4 3.868 0.1321
93 4 4.209-0.2091
94 3 3.868-0.8679
95 4 3.868 0.1321
96 5 3.868 1.132
97 5 3.868 1.132
98 4 4.209-0.2091
99 3 3.868-0.8679
100 5 3.527 1.473
101 4 3.868 0.1321
102 5 3.868 1.132
103 3 3.868-0.8679
104 5 3.868 1.132
105 4 3.868 0.1321
106 4 3.868 0.1321
107 4 3.868 0.1321
108 4 3.868 0.1321
109 3 3.868-0.8679
110 4 3.868 0.1321
111 4 3.868 0.1321
112 3 3.527-0.5267
113 4 3.868 0.1321
114 3 3.868-0.8679
115 4 3.868 0.1321
116 5 3.868 1.132
117 5 3.868 1.132
118 4 3.868 0.1321
119 4 3.868 0.1321
120 3 3.868-0.8679
121 4 3.868 0.1321
122 4 3.868 0.1321
123 4 4.209-0.2091
124 3 3.868-0.8679
125 4 3.868 0.1321
126 4 3.868 0.1321
127 3 3.868-0.8679
128 4 3.868 0.1321
129 3 3.186-0.1856
130 4 3.868 0.1321
131 5 3.868 1.132
132 2 3.868-1.868
133 3 3.527-0.5267
134 4 3.868 0.1321
135 5 4.209 0.7909
136 4 4.209-0.2091
137 5 4.209 0.7909
138 4 4.209-0.2091
139 4 3.868 0.1321
140 3 3.868-0.8679
141 4 3.868 0.1321
142 4 3.868 0.1321
143 4 3.868 0.1321
144 4 3.868 0.1321
145 5 3.868 1.132
146 4 3.527 0.4733
147 4 3.868 0.1321
148 3 3.527-0.5267
149 4 4.209-0.2091
150 4 3.868 0.1321
151 4 3.868 0.1321
152 3 3.868-0.8679
153 4 3.868 0.1321
154 5 3.868 1.132
155 4 3.868 0.1321
156 2 3.527-1.527
157 4 3.868 0.1321
158 4 3.527 0.4733
159 4 3.868 0.1321
160 4 4.209-0.2091
161 5 3.868 1.132
162 5 3.868 1.132
163 3 3.527-0.5267
164 4 3.868 0.1321
165 4 3.868 0.1321
166 2 3.527-1.527
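Each residual in the table above is the actual score minus the fitted value for that row's SKEOU2 score; summing the squared residuals over all 166 rows gives the Sum Squared Residuals (82.19) reported earlier. A sketch of the bookkeeping for the first few rows, using the coefficients from the estimated equation:

```python
b0, b1 = 2.50324, 0.341165  # coefficients from the estimated equation

def residual(actual, x):
    """Prediction error: actual SKEOU1 minus fitted value."""
    return actual - (b0 + b1 * x)

# First three rows: (actual SKEOU1, SKEOU2), copied from the data series.
rows = [(4, 2), (5, 3), (4, 4)]
for actual, x in rows:
    print(f"{residual(actual, x):.4f}")
# matches the Residuals column above: 0.8144, 1.473, 0.1321
```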

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & InterpolationForecast & ResidualsPrediction Error \tabularnewline
1 &  4 &  3.186 &  0.8144 \tabularnewline
2 &  5 &  3.527 &  1.473 \tabularnewline
3 &  4 &  3.868 &  0.1321 \tabularnewline
4 &  3 &  3.868 & -0.8679 \tabularnewline
5 &  4 &  3.868 &  0.1321 \tabularnewline
6 &  3 &  3.868 & -0.8679 \tabularnewline
7 &  3 &  3.868 & -0.8679 \tabularnewline
8 &  3 &  3.868 & -0.8679 \tabularnewline
9 &  4 &  4.209 & -0.2091 \tabularnewline
10 &  4 &  4.209 & -0.2091 \tabularnewline
11 &  4 &  3.868 &  0.1321 \tabularnewline
12 &  4 &  3.868 &  0.1321 \tabularnewline
13 &  4 &  3.868 &  0.1321 \tabularnewline
14 &  3 &  3.527 & -0.5267 \tabularnewline
15 &  4 &  3.868 &  0.1321 \tabularnewline
16 &  3 &  3.868 & -0.8679 \tabularnewline
17 &  3 &  3.868 & -0.8679 \tabularnewline
18 &  5 &  4.209 &  0.7909 \tabularnewline
19 &  4 &  3.868 &  0.1321 \tabularnewline
20 &  3 &  3.868 & -0.8679 \tabularnewline
21 &  4 &  3.868 &  0.1321 \tabularnewline
22 &  4 &  3.868 &  0.1321 \tabularnewline
23 &  4 &  3.868 &  0.1321 \tabularnewline
24 &  4 &  3.868 &  0.1321 \tabularnewline
25 &  3 &  3.868 & -0.8679 \tabularnewline
26 &  3 &  3.868 & -0.8679 \tabularnewline
27 &  4 &  3.868 &  0.1321 \tabularnewline
28 &  2 &  3.868 & -1.868 \tabularnewline
29 &  5 &  3.868 &  1.132 \tabularnewline
30 &  4 &  3.527 &  0.4733 \tabularnewline
31 &  4 &  4.209 & -0.2091 \tabularnewline
32 &  5 &  3.868 &  1.132 \tabularnewline
33 &  4 &  3.527 &  0.4733 \tabularnewline
34 &  2 &  3.527 & -1.527 \tabularnewline
35 &  4 &  4.209 & -0.2091 \tabularnewline
36 &  3 &  3.868 & -0.8679 \tabularnewline
37 &  4 &  3.527 &  0.4733 \tabularnewline
38 &  4 &  3.527 &  0.4733 \tabularnewline
39 &  4 &  3.868 &  0.1321 \tabularnewline
40 &  5 &  3.868 &  1.132 \tabularnewline
41 &  4 &  4.209 & -0.2091 \tabularnewline
42 &  3 &  3.527 & -0.5267 \tabularnewline
43 &  5 &  4.209 &  0.7909 \tabularnewline
44 &  5 &  3.868 &  1.132 \tabularnewline
45 &  4 &  3.868 &  0.1321 \tabularnewline
46 &  4 &  3.868 &  0.1321 \tabularnewline
47 &  3 &  4.209 & -1.209 \tabularnewline
48 &  4 &  3.868 &  0.1321 \tabularnewline
49 &  2 &  3.527 & -1.527 \tabularnewline
50 &  4 &  4.209 & -0.2091 \tabularnewline
51 &  5 &  4.209 &  0.7909 \tabularnewline
52 &  5 &  4.209 &  0.7909 \tabularnewline
53 &  4 &  3.527 &  0.4733 \tabularnewline
54 &  4 &  3.527 &  0.4733 \tabularnewline
55 &  4 &  3.868 &  0.1321 \tabularnewline
56 &  3 &  3.868 & -0.8679 \tabularnewline
57 &  3 &  3.868 & -0.8679 \tabularnewline
58 &  4 &  3.868 &  0.1321 \tabularnewline
59 &  4 &  3.868 &  0.1321 \tabularnewline
60 &  5 &  4.209 &  0.7909 \tabularnewline
61 &  2 &  3.868 & -1.868 \tabularnewline
62 &  4 &  3.868 &  0.1321 \tabularnewline
63 &  3 &  3.868 & -0.8679 \tabularnewline
64 &  4 &  3.868 &  0.1321 \tabularnewline
65 &  4 &  3.186 &  0.8144 \tabularnewline
66 &  4 &  3.868 &  0.1321 \tabularnewline
67 &  4 &  3.868 &  0.1321 \tabularnewline
68 &  5 &  3.868 &  1.132 \tabularnewline
69 &  3 &  3.868 & -0.8679 \tabularnewline
70 &  3 &  3.868 & -0.8679 \tabularnewline
71 &  4 &  4.209 & -0.2091 \tabularnewline
72 &  4 &  3.868 &  0.1321 \tabularnewline
73 &  4 &  3.868 &  0.1321 \tabularnewline
74 &  4 &  3.868 &  0.1321 \tabularnewline
75 &  3 &  3.868 & -0.8679 \tabularnewline
76 &  4 &  3.868 &  0.1321 \tabularnewline
77 &  3 &  3.868 & -0.8679 \tabularnewline
78 &  3 &  3.527 & -0.5267 \tabularnewline
79 &  4 &  3.527 &  0.4733 \tabularnewline
80 &  4 &  3.868 &  0.1321 \tabularnewline
81 &  3 &  3.527 & -0.5267 \tabularnewline
82 &  4 &  3.868 &  0.1321 \tabularnewline
83 &  4 &  3.868 &  0.1321 \tabularnewline
84 &  4 &  3.868 &  0.1321 \tabularnewline
85 &  5 &  3.868 &  1.132 \tabularnewline
86 &  5 &  3.868 &  1.132 \tabularnewline
87 &  4 &  3.868 &  0.1321 \tabularnewline
88 &  3 &  3.868 & -0.8679 \tabularnewline
89 &  4 &  3.186 &  0.8144 \tabularnewline
90 &  4 &  3.868 &  0.1321 \tabularnewline
91 &  4 &  3.868 &  0.1321 \tabularnewline
92 &  4 &  3.868 &  0.1321 \tabularnewline
93 &  4 &  4.209 & -0.2091 \tabularnewline
94 &  3 &  3.868 & -0.8679 \tabularnewline
95 &  4 &  3.868 &  0.1321 \tabularnewline
96 &  5 &  3.868 &  1.132 \tabularnewline
97 &  5 &  3.868 &  1.132 \tabularnewline
98 &  4 &  4.209 & -0.2091 \tabularnewline
99 &  3 &  3.868 & -0.8679 \tabularnewline
100 &  5 &  3.527 &  1.473 \tabularnewline
101 &  4 &  3.868 &  0.1321 \tabularnewline
102 &  5 &  3.868 &  1.132 \tabularnewline
103 &  3 &  3.868 & -0.8679 \tabularnewline
104 &  5 &  3.868 &  1.132 \tabularnewline
105 &  4 &  3.868 &  0.1321 \tabularnewline
106 &  4 &  3.868 &  0.1321 \tabularnewline
107 &  4 &  3.868 &  0.1321 \tabularnewline
108 &  4 &  3.868 &  0.1321 \tabularnewline
109 &  3 &  3.868 & -0.8679 \tabularnewline
110 &  4 &  3.868 &  0.1321 \tabularnewline
111 &  4 &  3.868 &  0.1321 \tabularnewline
112 &  3 &  3.527 & -0.5267 \tabularnewline
113 &  4 &  3.868 &  0.1321 \tabularnewline
114 &  3 &  3.868 & -0.8679 \tabularnewline
115 &  4 &  3.868 &  0.1321 \tabularnewline
116 &  5 &  3.868 &  1.132 \tabularnewline
117 &  5 &  3.868 &  1.132 \tabularnewline
118 &  4 &  3.868 &  0.1321 \tabularnewline
119 &  4 &  3.868 &  0.1321 \tabularnewline
120 &  3 &  3.868 & -0.8679 \tabularnewline
121 &  4 &  3.868 &  0.1321 \tabularnewline
122 &  4 &  3.868 &  0.1321 \tabularnewline
123 &  4 &  4.209 & -0.2091 \tabularnewline
124 &  3 &  3.868 & -0.8679 \tabularnewline
125 &  4 &  3.868 &  0.1321 \tabularnewline
126 &  4 &  3.868 &  0.1321 \tabularnewline
127 &  3 &  3.868 & -0.8679 \tabularnewline
128 &  4 &  3.868 &  0.1321 \tabularnewline
129 &  3 &  3.186 & -0.1856 \tabularnewline
130 &  4 &  3.868 &  0.1321 \tabularnewline
131 &  5 &  3.868 &  1.132 \tabularnewline
132 &  2 &  3.868 & -1.868 \tabularnewline
133 &  3 &  3.527 & -0.5267 \tabularnewline
134 &  4 &  3.868 &  0.1321 \tabularnewline
135 &  5 &  4.209 &  0.7909 \tabularnewline
136 &  4 &  4.209 & -0.2091 \tabularnewline
137 &  5 &  4.209 &  0.7909 \tabularnewline
138 &  4 &  4.209 & -0.2091 \tabularnewline
139 &  4 &  3.868 &  0.1321 \tabularnewline
140 &  3 &  3.868 & -0.8679 \tabularnewline
141 &  4 &  3.868 &  0.1321 \tabularnewline
142 &  4 &  3.868 &  0.1321 \tabularnewline
143 &  4 &  3.868 &  0.1321 \tabularnewline
144 &  4 &  3.868 &  0.1321 \tabularnewline
145 &  5 &  3.868 &  1.132 \tabularnewline
146 &  4 &  3.527 &  0.4733 \tabularnewline
147 &  4 &  3.868 &  0.1321 \tabularnewline
148 &  3 &  3.527 & -0.5267 \tabularnewline
149 &  4 &  4.209 & -0.2091 \tabularnewline
150 &  4 &  3.868 &  0.1321 \tabularnewline
151 &  4 &  3.868 &  0.1321 \tabularnewline
152 &  3 &  3.868 & -0.8679 \tabularnewline
153 &  4 &  3.868 &  0.1321 \tabularnewline
154 &  5 &  3.868 &  1.132 \tabularnewline
155 &  4 &  3.868 &  0.1321 \tabularnewline
156 &  2 &  3.527 & -1.527 \tabularnewline
157 &  4 &  3.868 &  0.1321 \tabularnewline
158 &  4 &  3.527 &  0.4733 \tabularnewline
159 &  4 &  3.868 &  0.1321 \tabularnewline
160 &  4 &  4.209 & -0.2091 \tabularnewline
161 &  5 &  3.868 &  1.132 \tabularnewline
162 &  5 &  3.868 &  1.132 \tabularnewline
163 &  3 &  3.527 & -0.5267 \tabularnewline
164 &  4 &  3.868 &  0.1321 \tabularnewline
165 &  4 &  3.868 &  0.1321 \tabularnewline
166 &  2 &  3.527 & -1.527 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]InterpolationForecast[/C][C]ResidualsPrediction Error[/C][/ROW]
[ROW][C]1[/C][C] 4[/C][C] 3.186[/C][C] 0.8144[/C][/ROW]
[ROW][C]2[/C][C] 5[/C][C] 3.527[/C][C] 1.473[/C][/ROW]
[ROW][C]3[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]4[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]5[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]6[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]7[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]8[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]9[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]10[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]11[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]12[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]13[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]14[/C][C] 3[/C][C] 3.527[/C][C]-0.5267[/C][/ROW]
[ROW][C]15[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]16[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]17[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]18[/C][C] 5[/C][C] 4.209[/C][C] 0.7909[/C][/ROW]
[ROW][C]19[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]20[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]21[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]22[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]23[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]24[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]25[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]26[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]27[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]28[/C][C] 2[/C][C] 3.868[/C][C]-1.868[/C][/ROW]
[ROW][C]29[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]30[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]31[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]32[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]33[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]34[/C][C] 2[/C][C] 3.527[/C][C]-1.527[/C][/ROW]
[ROW][C]35[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]36[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]37[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]38[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]39[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]40[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]41[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]42[/C][C] 3[/C][C] 3.527[/C][C]-0.5267[/C][/ROW]
[ROW][C]43[/C][C] 5[/C][C] 4.209[/C][C] 0.7909[/C][/ROW]
[ROW][C]44[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]45[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]46[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]47[/C][C] 3[/C][C] 4.209[/C][C]-1.209[/C][/ROW]
[ROW][C]48[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]49[/C][C] 2[/C][C] 3.527[/C][C]-1.527[/C][/ROW]
[ROW][C]50[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]51[/C][C] 5[/C][C] 4.209[/C][C] 0.7909[/C][/ROW]
[ROW][C]52[/C][C] 5[/C][C] 4.209[/C][C] 0.7909[/C][/ROW]
[ROW][C]53[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]54[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]55[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]56[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]57[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]58[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]59[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]60[/C][C] 5[/C][C] 4.209[/C][C] 0.7909[/C][/ROW]
[ROW][C]61[/C][C] 2[/C][C] 3.868[/C][C]-1.868[/C][/ROW]
[ROW][C]62[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]63[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]64[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]65[/C][C] 4[/C][C] 3.186[/C][C] 0.8144[/C][/ROW]
[ROW][C]66[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]67[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]68[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]69[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]70[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]71[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]72[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]73[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]74[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]75[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]76[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]77[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]78[/C][C] 3[/C][C] 3.527[/C][C]-0.5267[/C][/ROW]
[ROW][C]79[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]80[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]81[/C][C] 3[/C][C] 3.527[/C][C]-0.5267[/C][/ROW]
[ROW][C]82[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]83[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]84[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]85[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]86[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]87[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]88[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]89[/C][C] 4[/C][C] 3.186[/C][C] 0.8144[/C][/ROW]
[ROW][C]90[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]91[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]92[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]93[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]94[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]95[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]96[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]97[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]98[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]99[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]100[/C][C] 5[/C][C] 3.527[/C][C] 1.473[/C][/ROW]
[ROW][C]101[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]102[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]103[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]104[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]105[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]106[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]107[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]108[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]109[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]110[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]111[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]112[/C][C] 3[/C][C] 3.527[/C][C]-0.5267[/C][/ROW]
[ROW][C]113[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]114[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]115[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]116[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]117[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]118[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]119[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]120[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]121[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]122[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]123[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]124[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]125[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]126[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]127[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]128[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]129[/C][C] 3[/C][C] 3.186[/C][C]-0.1856[/C][/ROW]
[ROW][C]130[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]131[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]132[/C][C] 2[/C][C] 3.868[/C][C]-1.868[/C][/ROW]
[ROW][C]133[/C][C] 3[/C][C] 3.527[/C][C]-0.5267[/C][/ROW]
[ROW][C]134[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]135[/C][C] 5[/C][C] 4.209[/C][C] 0.7909[/C][/ROW]
[ROW][C]136[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]137[/C][C] 5[/C][C] 4.209[/C][C] 0.7909[/C][/ROW]
[ROW][C]138[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]139[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]140[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]141[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]142[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]143[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]144[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]145[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]146[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]147[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]148[/C][C] 3[/C][C] 3.527[/C][C]-0.5267[/C][/ROW]
[ROW][C]149[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]150[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]151[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]152[/C][C] 3[/C][C] 3.868[/C][C]-0.8679[/C][/ROW]
[ROW][C]153[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]154[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]155[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]156[/C][C] 2[/C][C] 3.527[/C][C]-1.527[/C][/ROW]
[ROW][C]157[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]158[/C][C] 4[/C][C] 3.527[/C][C] 0.4733[/C][/ROW]
[ROW][C]159[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]160[/C][C] 4[/C][C] 4.209[/C][C]-0.2091[/C][/ROW]
[ROW][C]161[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]162[/C][C] 5[/C][C] 3.868[/C][C] 1.132[/C][/ROW]
[ROW][C]163[/C][C] 3[/C][C] 3.527[/C][C]-0.5267[/C][/ROW]
[ROW][C]164[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]165[/C][C] 4[/C][C] 3.868[/C][C] 0.1321[/C][/ROW]
[ROW][C]166[/C][C] 2[/C][C] 3.527[/C][C]-1.527[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=&T=4

As an alternative you can also use a QR Code:  

The GUIDs for individual cells are displayed in the table below:

Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or IndexActualsInterpolationForecastResidualsPrediction Error
1 4 3.186 0.8144
2 5 3.527 1.473
3 4 3.868 0.1321
4 3 3.868-0.8679
5 4 3.868 0.1321
6 3 3.868-0.8679
7 3 3.868-0.8679
8 3 3.868-0.8679
9 4 4.209-0.2091
10 4 4.209-0.2091
11 4 3.868 0.1321
12 4 3.868 0.1321
13 4 3.868 0.1321
14 3 3.527-0.5267
15 4 3.868 0.1321
16 3 3.868-0.8679
17 3 3.868-0.8679
18 5 4.209 0.7909
19 4 3.868 0.1321
20 3 3.868-0.8679
21 4 3.868 0.1321
22 4 3.868 0.1321
23 4 3.868 0.1321
24 4 3.868 0.1321
25 3 3.868-0.8679
26 3 3.868-0.8679
27 4 3.868 0.1321
28 2 3.868-1.868
29 5 3.868 1.132
30 4 3.527 0.4733
31 4 4.209-0.2091
32 5 3.868 1.132
33 4 3.527 0.4733
34 2 3.527-1.527
35 4 4.209-0.2091
36 3 3.868-0.8679
37 4 3.527 0.4733
38 4 3.527 0.4733
39 4 3.868 0.1321
40 5 3.868 1.132
41 4 4.209-0.2091
42 3 3.527-0.5267
43 5 4.209 0.7909
44 5 3.868 1.132
45 4 3.868 0.1321
46 4 3.868 0.1321
47 3 4.209 -1.209
48 4 3.868 0.1321
49 2 3.527 -1.527
50 4 4.209 -0.2091
51 5 4.209 0.7909
52 5 4.209 0.7909
53 4 3.527 0.4733
54 4 3.527 0.4733
55 4 3.868 0.1321
56 3 3.868 -0.8679
57 3 3.868 -0.8679
58 4 3.868 0.1321
59 4 3.868 0.1321
60 5 4.209 0.7909
61 2 3.868 -1.868
62 4 3.868 0.1321
63 3 3.868 -0.8679
64 4 3.868 0.1321
65 4 3.186 0.8144
66 4 3.868 0.1321
67 4 3.868 0.1321
68 5 3.868 1.132
69 3 3.868 -0.8679
70 3 3.868 -0.8679
71 4 4.209 -0.2091
72 4 3.868 0.1321
73 4 3.868 0.1321
74 4 3.868 0.1321
75 3 3.868 -0.8679
76 4 3.868 0.1321
77 3 3.868 -0.8679
78 3 3.527 -0.5267
79 4 3.527 0.4733
80 4 3.868 0.1321
81 3 3.527 -0.5267
82 4 3.868 0.1321
83 4 3.868 0.1321
84 4 3.868 0.1321
85 5 3.868 1.132
86 5 3.868 1.132
87 4 3.868 0.1321
88 3 3.868 -0.8679
89 4 3.186 0.8144
90 4 3.868 0.1321
91 4 3.868 0.1321
92 4 3.868 0.1321
93 4 4.209 -0.2091
94 3 3.868 -0.8679
95 4 3.868 0.1321
96 5 3.868 1.132
97 5 3.868 1.132
98 4 4.209 -0.2091
99 3 3.868 -0.8679
100 5 3.527 1.473
101 4 3.868 0.1321
102 5 3.868 1.132
103 3 3.868 -0.8679
104 5 3.868 1.132
105 4 3.868 0.1321
106 4 3.868 0.1321
107 4 3.868 0.1321
108 4 3.868 0.1321
109 3 3.868 -0.8679
110 4 3.868 0.1321
111 4 3.868 0.1321
112 3 3.527 -0.5267
113 4 3.868 0.1321
114 3 3.868 -0.8679
115 4 3.868 0.1321
116 5 3.868 1.132
117 5 3.868 1.132
118 4 3.868 0.1321
119 4 3.868 0.1321
120 3 3.868 -0.8679
121 4 3.868 0.1321
122 4 3.868 0.1321
123 4 4.209 -0.2091
124 3 3.868 -0.8679
125 4 3.868 0.1321
126 4 3.868 0.1321
127 3 3.868 -0.8679
128 4 3.868 0.1321
129 3 3.186 -0.1856
130 4 3.868 0.1321
131 5 3.868 1.132
132 2 3.868 -1.868
133 3 3.527 -0.5267
134 4 3.868 0.1321
135 5 4.209 0.7909
136 4 4.209 -0.2091
137 5 4.209 0.7909
138 4 4.209 -0.2091
139 4 3.868 0.1321
140 3 3.868 -0.8679
141 4 3.868 0.1321
142 4 3.868 0.1321
143 4 3.868 0.1321
144 4 3.868 0.1321
145 5 3.868 1.132
146 4 3.527 0.4733
147 4 3.868 0.1321
148 3 3.527 -0.5267
149 4 4.209 -0.2091
150 4 3.868 0.1321
151 4 3.868 0.1321
152 3 3.868 -0.8679
153 4 3.868 0.1321
154 5 3.868 1.132
155 4 3.868 0.1321
156 2 3.527 -1.527
157 4 3.868 0.1321
158 4 3.527 0.4733
159 4 3.868 0.1321
160 4 4.209 -0.2091
161 5 3.868 1.132
162 5 3.868 1.132
163 3 3.527 -0.5267
164 4 3.868 0.1321
165 4 3.868 0.1321
166 2 3.527 -1.527
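The table above lists, for each observation, the actual value, the value fitted by the regression (the "interpolation"), and the residual. A minimal base-R sketch of how such a table is assembled, using simulated toy data rather than the original series:

```r
# Minimal sketch with simulated data (the original survey series is not reproduced here).
set.seed(3)
x <- sample(2:5, 30, replace = TRUE)     # toy exogenous variable
y <- x + rnorm(30)                       # toy endogenous variable
m <- lm(y ~ x)                           # fit the regression
tab <- data.frame(index    = seq_along(y),
                  actual   = y,
                  fitted   = fitted(m),  # the "interpolation" column
                  residual = resid(m))   # actual minus fitted
head(tab)
```

By construction each residual equals the actual value minus the fitted value, which is the identity the three numeric columns above satisfy row by row.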







Goldfeld-Quandt test for Heteroskedasticity
p-values per Alternative Hypothesis
breakpoint index   greater   2-sided   less
5 0.6657 0.6687 0.3343
6 0.6483 0.7034 0.3517
7 0.5927 0.8145 0.4073
8 0.5216 0.9568 0.4784
9 0.58 0.8401 0.42
10 0.5437 0.9125 0.4563
11 0.4585 0.917 0.5415
12 0.3763 0.7526 0.6237
13 0.3005 0.6011 0.6995
14 0.345 0.6899 0.655
15 0.2802 0.5604 0.7198
16 0.2897 0.5794 0.7103
17 0.2904 0.5808 0.7096
18 0.4705 0.941 0.5295
19 0.4059 0.8117 0.5941
20 0.4147 0.8294 0.5853
21 0.3566 0.7132 0.6434
22 0.3019 0.6038 0.6981
23 0.2515 0.5031 0.7485
24 0.2062 0.4124 0.7938
25 0.2189 0.4378 0.7811
26 0.2276 0.4552 0.7724
27 0.1893 0.3786 0.8107
28 0.4734 0.9467 0.5266
29 0.6122 0.7757 0.3878
30 0.5713 0.8574 0.4287
31 0.5204 0.9593 0.4796
32 0.6335 0.733 0.3665
33 0.5921 0.8158 0.4079
34 0.7829 0.4343 0.2171
35 0.7427 0.5146 0.2573
36 0.7474 0.5051 0.2526
37 0.7208 0.5584 0.2792
38 0.6914 0.6172 0.3086
39 0.6488 0.7024 0.3512
40 0.7339 0.5322 0.2661
41 0.6925 0.6149 0.3075
42 0.6763 0.6474 0.3237
43 0.7097 0.5805 0.2903
44 0.7758 0.4483 0.2242
45 0.7387 0.5226 0.2613
46 0.6988 0.6025 0.3012
47 0.7542 0.4916 0.2458
48 0.7164 0.5673 0.2836
49 0.8429 0.3142 0.1571
50 0.814 0.3721 0.186
51 0.8278 0.3444 0.1722
52 0.8375 0.325 0.1625
53 0.8213 0.3575 0.1787
54 0.8034 0.3933 0.1966
55 0.7706 0.4588 0.2294
56 0.7838 0.4324 0.2162
57 0.7961 0.4079 0.2039
58 0.7633 0.4733 0.2367
59 0.728 0.544 0.272
60 0.7396 0.5208 0.2604
61 0.9001 0.1997 0.09986
62 0.8797 0.2405 0.1203
63 0.8883 0.2234 0.1117
64 0.8665 0.2671 0.1336
65 0.8744 0.2513 0.1256
66 0.8507 0.2987 0.1493
67 0.8242 0.3516 0.1758
68 0.8657 0.2686 0.1343
69 0.8761 0.2478 0.1239
70 0.886 0.2279 0.114
71 0.8658 0.2684 0.1342
72 0.8413 0.3175 0.1587
73 0.8139 0.3721 0.1861
74 0.7839 0.4323 0.2161
75 0.7992 0.4016 0.2008
76 0.7679 0.4642 0.2321
77 0.7844 0.4312 0.2156
78 0.7685 0.4629 0.2315
79 0.7501 0.4999 0.2499
80 0.7151 0.5699 0.2849
81 0.6969 0.6062 0.3031
82 0.6588 0.6824 0.3412
83 0.6189 0.7622 0.3811
84 0.5777 0.8446 0.4223
85 0.6431 0.7138 0.3569
86 0.704 0.5919 0.296
87 0.6657 0.6686 0.3343
88 0.687 0.6261 0.313
89 0.7072 0.5856 0.2928
90 0.6689 0.6621 0.3311
91 0.6288 0.7425 0.3712
92 0.5871 0.8259 0.4129
93 0.5521 0.8959 0.4479
94 0.577 0.846 0.423
95 0.5339 0.9322 0.4661
96 0.5998 0.8003 0.4002
97 0.6638 0.6723 0.3362
98 0.6315 0.7371 0.3685
99 0.6563 0.6875 0.3437
100 0.8055 0.3891 0.1945
101 0.7732 0.4536 0.2268
102 0.8246 0.3509 0.1754
103 0.8424 0.3153 0.1576
104 0.8838 0.2325 0.1162
105 0.86 0.28 0.14
106 0.8331 0.3338 0.1669
107 0.803 0.3941 0.197
108 0.7697 0.4607 0.2303
109 0.7906 0.4187 0.2094
110 0.7557 0.4885 0.2443
111 0.7178 0.5644 0.2822
112 0.6911 0.6178 0.3089
113 0.6486 0.7027 0.3514
114 0.6753 0.6495 0.3247
115 0.6313 0.7373 0.3687
116 0.7005 0.599 0.2995
117 0.7672 0.4655 0.2328
118 0.7289 0.5422 0.2711
119 0.6875 0.625 0.3125
120 0.7114 0.5772 0.2886
121 0.668 0.664 0.332
122 0.6219 0.7561 0.3781
123 0.5906 0.8188 0.4094
124 0.6204 0.7593 0.3796
125 0.5708 0.8584 0.4292
126 0.5198 0.9603 0.4802
127 0.5528 0.8943 0.4472
128 0.5004 0.9992 0.4996
129 0.4632 0.9265 0.5368
130 0.4107 0.8214 0.5893
131 0.4907 0.9813 0.5093
132 0.8155 0.3689 0.1845
133 0.7848 0.4305 0.2152
134 0.7401 0.5198 0.2599
135 0.7191 0.5619 0.2809
136 0.7002 0.5997 0.2998
137 0.6736 0.6528 0.3264
138 0.6573 0.6854 0.3427
139 0.5987 0.8026 0.4013
140 0.654 0.692 0.346
141 0.5929 0.8142 0.4071
142 0.5286 0.9428 0.4714
143 0.4627 0.9253 0.5373
144 0.3968 0.7936 0.6032
145 0.4718 0.9436 0.5282
146 0.4907 0.9813 0.5093
147 0.4198 0.8395 0.5802
148 0.3552 0.7105 0.6448
149 0.3728 0.7456 0.6272
150 0.3019 0.6038 0.6981
151 0.2364 0.4728 0.7636
152 0.2975 0.595 0.7025
153 0.2287 0.4574 0.7713
154 0.2765 0.553 0.7235
155 0.2041 0.4083 0.7959
156 0.3108 0.6217 0.6892
157 0.2248 0.4495 0.7752
158 0.237 0.474 0.763
159 0.1541 0.3082 0.8459
160 0.546 0.908 0.454
161 0.5007 0.9986 0.4993

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 &  0.6657 &  0.6687 &  0.3343 \tabularnewline
6 &  0.6483 &  0.7034 &  0.3517 \tabularnewline
7 &  0.5927 &  0.8145 &  0.4073 \tabularnewline
8 &  0.5216 &  0.9568 &  0.4784 \tabularnewline
9 &  0.58 &  0.8401 &  0.42 \tabularnewline
10 &  0.5437 &  0.9125 &  0.4563 \tabularnewline
11 &  0.4585 &  0.917 &  0.5415 \tabularnewline
12 &  0.3763 &  0.7526 &  0.6237 \tabularnewline
13 &  0.3005 &  0.6011 &  0.6995 \tabularnewline
14 &  0.345 &  0.6899 &  0.655 \tabularnewline
15 &  0.2802 &  0.5604 &  0.7198 \tabularnewline
16 &  0.2897 &  0.5794 &  0.7103 \tabularnewline
17 &  0.2904 &  0.5808 &  0.7096 \tabularnewline
18 &  0.4705 &  0.941 &  0.5295 \tabularnewline
19 &  0.4059 &  0.8117 &  0.5941 \tabularnewline
20 &  0.4147 &  0.8294 &  0.5853 \tabularnewline
21 &  0.3566 &  0.7132 &  0.6434 \tabularnewline
22 &  0.3019 &  0.6038 &  0.6981 \tabularnewline
23 &  0.2515 &  0.5031 &  0.7485 \tabularnewline
24 &  0.2062 &  0.4124 &  0.7938 \tabularnewline
25 &  0.2189 &  0.4378 &  0.7811 \tabularnewline
26 &  0.2276 &  0.4552 &  0.7724 \tabularnewline
27 &  0.1893 &  0.3786 &  0.8107 \tabularnewline
28 &  0.4734 &  0.9467 &  0.5266 \tabularnewline
29 &  0.6122 &  0.7757 &  0.3878 \tabularnewline
30 &  0.5713 &  0.8574 &  0.4287 \tabularnewline
31 &  0.5204 &  0.9593 &  0.4796 \tabularnewline
32 &  0.6335 &  0.733 &  0.3665 \tabularnewline
33 &  0.5921 &  0.8158 &  0.4079 \tabularnewline
34 &  0.7829 &  0.4343 &  0.2171 \tabularnewline
35 &  0.7427 &  0.5146 &  0.2573 \tabularnewline
36 &  0.7474 &  0.5051 &  0.2526 \tabularnewline
37 &  0.7208 &  0.5584 &  0.2792 \tabularnewline
38 &  0.6914 &  0.6172 &  0.3086 \tabularnewline
39 &  0.6488 &  0.7024 &  0.3512 \tabularnewline
40 &  0.7339 &  0.5322 &  0.2661 \tabularnewline
41 &  0.6925 &  0.6149 &  0.3075 \tabularnewline
42 &  0.6763 &  0.6474 &  0.3237 \tabularnewline
43 &  0.7097 &  0.5805 &  0.2903 \tabularnewline
44 &  0.7758 &  0.4483 &  0.2242 \tabularnewline
45 &  0.7387 &  0.5226 &  0.2613 \tabularnewline
46 &  0.6988 &  0.6025 &  0.3012 \tabularnewline
47 &  0.7542 &  0.4916 &  0.2458 \tabularnewline
48 &  0.7164 &  0.5673 &  0.2836 \tabularnewline
49 &  0.8429 &  0.3142 &  0.1571 \tabularnewline
50 &  0.814 &  0.3721 &  0.186 \tabularnewline
51 &  0.8278 &  0.3444 &  0.1722 \tabularnewline
52 &  0.8375 &  0.325 &  0.1625 \tabularnewline
53 &  0.8213 &  0.3575 &  0.1787 \tabularnewline
54 &  0.8034 &  0.3933 &  0.1966 \tabularnewline
55 &  0.7706 &  0.4588 &  0.2294 \tabularnewline
56 &  0.7838 &  0.4324 &  0.2162 \tabularnewline
57 &  0.7961 &  0.4079 &  0.2039 \tabularnewline
58 &  0.7633 &  0.4733 &  0.2367 \tabularnewline
59 &  0.728 &  0.544 &  0.272 \tabularnewline
60 &  0.7396 &  0.5208 &  0.2604 \tabularnewline
61 &  0.9001 &  0.1997 &  0.09986 \tabularnewline
62 &  0.8797 &  0.2405 &  0.1203 \tabularnewline
63 &  0.8883 &  0.2234 &  0.1117 \tabularnewline
64 &  0.8665 &  0.2671 &  0.1336 \tabularnewline
65 &  0.8744 &  0.2513 &  0.1256 \tabularnewline
66 &  0.8507 &  0.2987 &  0.1493 \tabularnewline
67 &  0.8242 &  0.3516 &  0.1758 \tabularnewline
68 &  0.8657 &  0.2686 &  0.1343 \tabularnewline
69 &  0.8761 &  0.2478 &  0.1239 \tabularnewline
70 &  0.886 &  0.2279 &  0.114 \tabularnewline
71 &  0.8658 &  0.2684 &  0.1342 \tabularnewline
72 &  0.8413 &  0.3175 &  0.1587 \tabularnewline
73 &  0.8139 &  0.3721 &  0.1861 \tabularnewline
74 &  0.7839 &  0.4323 &  0.2161 \tabularnewline
75 &  0.7992 &  0.4016 &  0.2008 \tabularnewline
76 &  0.7679 &  0.4642 &  0.2321 \tabularnewline
77 &  0.7844 &  0.4312 &  0.2156 \tabularnewline
78 &  0.7685 &  0.4629 &  0.2315 \tabularnewline
79 &  0.7501 &  0.4999 &  0.2499 \tabularnewline
80 &  0.7151 &  0.5699 &  0.2849 \tabularnewline
81 &  0.6969 &  0.6062 &  0.3031 \tabularnewline
82 &  0.6588 &  0.6824 &  0.3412 \tabularnewline
83 &  0.6189 &  0.7622 &  0.3811 \tabularnewline
84 &  0.5777 &  0.8446 &  0.4223 \tabularnewline
85 &  0.6431 &  0.7138 &  0.3569 \tabularnewline
86 &  0.704 &  0.5919 &  0.296 \tabularnewline
87 &  0.6657 &  0.6686 &  0.3343 \tabularnewline
88 &  0.687 &  0.6261 &  0.313 \tabularnewline
89 &  0.7072 &  0.5856 &  0.2928 \tabularnewline
90 &  0.6689 &  0.6621 &  0.3311 \tabularnewline
91 &  0.6288 &  0.7425 &  0.3712 \tabularnewline
92 &  0.5871 &  0.8259 &  0.4129 \tabularnewline
93 &  0.5521 &  0.8959 &  0.4479 \tabularnewline
94 &  0.577 &  0.846 &  0.423 \tabularnewline
95 &  0.5339 &  0.9322 &  0.4661 \tabularnewline
96 &  0.5998 &  0.8003 &  0.4002 \tabularnewline
97 &  0.6638 &  0.6723 &  0.3362 \tabularnewline
98 &  0.6315 &  0.7371 &  0.3685 \tabularnewline
99 &  0.6563 &  0.6875 &  0.3437 \tabularnewline
100 &  0.8055 &  0.3891 &  0.1945 \tabularnewline
101 &  0.7732 &  0.4536 &  0.2268 \tabularnewline
102 &  0.8246 &  0.3509 &  0.1754 \tabularnewline
103 &  0.8424 &  0.3153 &  0.1576 \tabularnewline
104 &  0.8838 &  0.2325 &  0.1162 \tabularnewline
105 &  0.86 &  0.28 &  0.14 \tabularnewline
106 &  0.8331 &  0.3338 &  0.1669 \tabularnewline
107 &  0.803 &  0.3941 &  0.197 \tabularnewline
108 &  0.7697 &  0.4607 &  0.2303 \tabularnewline
109 &  0.7906 &  0.4187 &  0.2094 \tabularnewline
110 &  0.7557 &  0.4885 &  0.2443 \tabularnewline
111 &  0.7178 &  0.5644 &  0.2822 \tabularnewline
112 &  0.6911 &  0.6178 &  0.3089 \tabularnewline
113 &  0.6486 &  0.7027 &  0.3514 \tabularnewline
114 &  0.6753 &  0.6495 &  0.3247 \tabularnewline
115 &  0.6313 &  0.7373 &  0.3687 \tabularnewline
116 &  0.7005 &  0.599 &  0.2995 \tabularnewline
117 &  0.7672 &  0.4655 &  0.2328 \tabularnewline
118 &  0.7289 &  0.5422 &  0.2711 \tabularnewline
119 &  0.6875 &  0.625 &  0.3125 \tabularnewline
120 &  0.7114 &  0.5772 &  0.2886 \tabularnewline
121 &  0.668 &  0.664 &  0.332 \tabularnewline
122 &  0.6219 &  0.7561 &  0.3781 \tabularnewline
123 &  0.5906 &  0.8188 &  0.4094 \tabularnewline
124 &  0.6204 &  0.7593 &  0.3796 \tabularnewline
125 &  0.5708 &  0.8584 &  0.4292 \tabularnewline
126 &  0.5198 &  0.9603 &  0.4802 \tabularnewline
127 &  0.5528 &  0.8943 &  0.4472 \tabularnewline
128 &  0.5004 &  0.9992 &  0.4996 \tabularnewline
129 &  0.4632 &  0.9265 &  0.5368 \tabularnewline
130 &  0.4107 &  0.8214 &  0.5893 \tabularnewline
131 &  0.4907 &  0.9813 &  0.5093 \tabularnewline
132 &  0.8155 &  0.3689 &  0.1845 \tabularnewline
133 &  0.7848 &  0.4305 &  0.2152 \tabularnewline
134 &  0.7401 &  0.5198 &  0.2599 \tabularnewline
135 &  0.7191 &  0.5619 &  0.2809 \tabularnewline
136 &  0.7002 &  0.5997 &  0.2998 \tabularnewline
137 &  0.6736 &  0.6528 &  0.3264 \tabularnewline
138 &  0.6573 &  0.6854 &  0.3427 \tabularnewline
139 &  0.5987 &  0.8026 &  0.4013 \tabularnewline
140 &  0.654 &  0.692 &  0.346 \tabularnewline
141 &  0.5929 &  0.8142 &  0.4071 \tabularnewline
142 &  0.5286 &  0.9428 &  0.4714 \tabularnewline
143 &  0.4627 &  0.9253 &  0.5373 \tabularnewline
144 &  0.3968 &  0.7936 &  0.6032 \tabularnewline
145 &  0.4718 &  0.9436 &  0.5282 \tabularnewline
146 &  0.4907 &  0.9813 &  0.5093 \tabularnewline
147 &  0.4198 &  0.8395 &  0.5802 \tabularnewline
148 &  0.3552 &  0.7105 &  0.6448 \tabularnewline
149 &  0.3728 &  0.7456 &  0.6272 \tabularnewline
150 &  0.3019 &  0.6038 &  0.6981 \tabularnewline
151 &  0.2364 &  0.4728 &  0.7636 \tabularnewline
152 &  0.2975 &  0.595 &  0.7025 \tabularnewline
153 &  0.2287 &  0.4574 &  0.7713 \tabularnewline
154 &  0.2765 &  0.553 &  0.7235 \tabularnewline
155 &  0.2041 &  0.4083 &  0.7959 \tabularnewline
156 &  0.3108 &  0.6217 &  0.6892 \tabularnewline
157 &  0.2248 &  0.4495 &  0.7752 \tabularnewline
158 &  0.237 &  0.474 &  0.763 \tabularnewline
159 &  0.1541 &  0.3082 &  0.8459 \tabularnewline
160 &  0.546 &  0.908 &  0.454 \tabularnewline
161 &  0.5007 &  0.9986 &  0.4993 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]5[/C][C] 0.6657[/C][C] 0.6687[/C][C] 0.3343[/C][/ROW]
[ROW][C]6[/C][C] 0.6483[/C][C] 0.7034[/C][C] 0.3517[/C][/ROW]
[ROW][C]7[/C][C] 0.5927[/C][C] 0.8145[/C][C] 0.4073[/C][/ROW]
[ROW][C]8[/C][C] 0.5216[/C][C] 0.9568[/C][C] 0.4784[/C][/ROW]
[ROW][C]9[/C][C] 0.58[/C][C] 0.8401[/C][C] 0.42[/C][/ROW]
[ROW][C]10[/C][C] 0.5437[/C][C] 0.9125[/C][C] 0.4563[/C][/ROW]
[ROW][C]11[/C][C] 0.4585[/C][C] 0.917[/C][C] 0.5415[/C][/ROW]
[ROW][C]12[/C][C] 0.3763[/C][C] 0.7526[/C][C] 0.6237[/C][/ROW]
[ROW][C]13[/C][C] 0.3005[/C][C] 0.6011[/C][C] 0.6995[/C][/ROW]
[ROW][C]14[/C][C] 0.345[/C][C] 0.6899[/C][C] 0.655[/C][/ROW]
[ROW][C]15[/C][C] 0.2802[/C][C] 0.5604[/C][C] 0.7198[/C][/ROW]
[ROW][C]16[/C][C] 0.2897[/C][C] 0.5794[/C][C] 0.7103[/C][/ROW]
[ROW][C]17[/C][C] 0.2904[/C][C] 0.5808[/C][C] 0.7096[/C][/ROW]
[ROW][C]18[/C][C] 0.4705[/C][C] 0.941[/C][C] 0.5295[/C][/ROW]
[ROW][C]19[/C][C] 0.4059[/C][C] 0.8117[/C][C] 0.5941[/C][/ROW]
[ROW][C]20[/C][C] 0.4147[/C][C] 0.8294[/C][C] 0.5853[/C][/ROW]
[ROW][C]21[/C][C] 0.3566[/C][C] 0.7132[/C][C] 0.6434[/C][/ROW]
[ROW][C]22[/C][C] 0.3019[/C][C] 0.6038[/C][C] 0.6981[/C][/ROW]
[ROW][C]23[/C][C] 0.2515[/C][C] 0.5031[/C][C] 0.7485[/C][/ROW]
[ROW][C]24[/C][C] 0.2062[/C][C] 0.4124[/C][C] 0.7938[/C][/ROW]
[ROW][C]25[/C][C] 0.2189[/C][C] 0.4378[/C][C] 0.7811[/C][/ROW]
[ROW][C]26[/C][C] 0.2276[/C][C] 0.4552[/C][C] 0.7724[/C][/ROW]
[ROW][C]27[/C][C] 0.1893[/C][C] 0.3786[/C][C] 0.8107[/C][/ROW]
[ROW][C]28[/C][C] 0.4734[/C][C] 0.9467[/C][C] 0.5266[/C][/ROW]
[ROW][C]29[/C][C] 0.6122[/C][C] 0.7757[/C][C] 0.3878[/C][/ROW]
[ROW][C]30[/C][C] 0.5713[/C][C] 0.8574[/C][C] 0.4287[/C][/ROW]
[ROW][C]31[/C][C] 0.5204[/C][C] 0.9593[/C][C] 0.4796[/C][/ROW]
[ROW][C]32[/C][C] 0.6335[/C][C] 0.733[/C][C] 0.3665[/C][/ROW]
[ROW][C]33[/C][C] 0.5921[/C][C] 0.8158[/C][C] 0.4079[/C][/ROW]
[ROW][C]34[/C][C] 0.7829[/C][C] 0.4343[/C][C] 0.2171[/C][/ROW]
[ROW][C]35[/C][C] 0.7427[/C][C] 0.5146[/C][C] 0.2573[/C][/ROW]
[ROW][C]36[/C][C] 0.7474[/C][C] 0.5051[/C][C] 0.2526[/C][/ROW]
[ROW][C]37[/C][C] 0.7208[/C][C] 0.5584[/C][C] 0.2792[/C][/ROW]
[ROW][C]38[/C][C] 0.6914[/C][C] 0.6172[/C][C] 0.3086[/C][/ROW]
[ROW][C]39[/C][C] 0.6488[/C][C] 0.7024[/C][C] 0.3512[/C][/ROW]
[ROW][C]40[/C][C] 0.7339[/C][C] 0.5322[/C][C] 0.2661[/C][/ROW]
[ROW][C]41[/C][C] 0.6925[/C][C] 0.6149[/C][C] 0.3075[/C][/ROW]
[ROW][C]42[/C][C] 0.6763[/C][C] 0.6474[/C][C] 0.3237[/C][/ROW]
[ROW][C]43[/C][C] 0.7097[/C][C] 0.5805[/C][C] 0.2903[/C][/ROW]
[ROW][C]44[/C][C] 0.7758[/C][C] 0.4483[/C][C] 0.2242[/C][/ROW]
[ROW][C]45[/C][C] 0.7387[/C][C] 0.5226[/C][C] 0.2613[/C][/ROW]
[ROW][C]46[/C][C] 0.6988[/C][C] 0.6025[/C][C] 0.3012[/C][/ROW]
[ROW][C]47[/C][C] 0.7542[/C][C] 0.4916[/C][C] 0.2458[/C][/ROW]
[ROW][C]48[/C][C] 0.7164[/C][C] 0.5673[/C][C] 0.2836[/C][/ROW]
[ROW][C]49[/C][C] 0.8429[/C][C] 0.3142[/C][C] 0.1571[/C][/ROW]
[ROW][C]50[/C][C] 0.814[/C][C] 0.3721[/C][C] 0.186[/C][/ROW]
[ROW][C]51[/C][C] 0.8278[/C][C] 0.3444[/C][C] 0.1722[/C][/ROW]
[ROW][C]52[/C][C] 0.8375[/C][C] 0.325[/C][C] 0.1625[/C][/ROW]
[ROW][C]53[/C][C] 0.8213[/C][C] 0.3575[/C][C] 0.1787[/C][/ROW]
[ROW][C]54[/C][C] 0.8034[/C][C] 0.3933[/C][C] 0.1966[/C][/ROW]
[ROW][C]55[/C][C] 0.7706[/C][C] 0.4588[/C][C] 0.2294[/C][/ROW]
[ROW][C]56[/C][C] 0.7838[/C][C] 0.4324[/C][C] 0.2162[/C][/ROW]
[ROW][C]57[/C][C] 0.7961[/C][C] 0.4079[/C][C] 0.2039[/C][/ROW]
[ROW][C]58[/C][C] 0.7633[/C][C] 0.4733[/C][C] 0.2367[/C][/ROW]
[ROW][C]59[/C][C] 0.728[/C][C] 0.544[/C][C] 0.272[/C][/ROW]
[ROW][C]60[/C][C] 0.7396[/C][C] 0.5208[/C][C] 0.2604[/C][/ROW]
[ROW][C]61[/C][C] 0.9001[/C][C] 0.1997[/C][C] 0.09986[/C][/ROW]
[ROW][C]62[/C][C] 0.8797[/C][C] 0.2405[/C][C] 0.1203[/C][/ROW]
[ROW][C]63[/C][C] 0.8883[/C][C] 0.2234[/C][C] 0.1117[/C][/ROW]
[ROW][C]64[/C][C] 0.8665[/C][C] 0.2671[/C][C] 0.1336[/C][/ROW]
[ROW][C]65[/C][C] 0.8744[/C][C] 0.2513[/C][C] 0.1256[/C][/ROW]
[ROW][C]66[/C][C] 0.8507[/C][C] 0.2987[/C][C] 0.1493[/C][/ROW]
[ROW][C]67[/C][C] 0.8242[/C][C] 0.3516[/C][C] 0.1758[/C][/ROW]
[ROW][C]68[/C][C] 0.8657[/C][C] 0.2686[/C][C] 0.1343[/C][/ROW]
[ROW][C]69[/C][C] 0.8761[/C][C] 0.2478[/C][C] 0.1239[/C][/ROW]
[ROW][C]70[/C][C] 0.886[/C][C] 0.2279[/C][C] 0.114[/C][/ROW]
[ROW][C]71[/C][C] 0.8658[/C][C] 0.2684[/C][C] 0.1342[/C][/ROW]
[ROW][C]72[/C][C] 0.8413[/C][C] 0.3175[/C][C] 0.1587[/C][/ROW]
[ROW][C]73[/C][C] 0.8139[/C][C] 0.3721[/C][C] 0.1861[/C][/ROW]
[ROW][C]74[/C][C] 0.7839[/C][C] 0.4323[/C][C] 0.2161[/C][/ROW]
[ROW][C]75[/C][C] 0.7992[/C][C] 0.4016[/C][C] 0.2008[/C][/ROW]
[ROW][C]76[/C][C] 0.7679[/C][C] 0.4642[/C][C] 0.2321[/C][/ROW]
[ROW][C]77[/C][C] 0.7844[/C][C] 0.4312[/C][C] 0.2156[/C][/ROW]
[ROW][C]78[/C][C] 0.7685[/C][C] 0.4629[/C][C] 0.2315[/C][/ROW]
[ROW][C]79[/C][C] 0.7501[/C][C] 0.4999[/C][C] 0.2499[/C][/ROW]
[ROW][C]80[/C][C] 0.7151[/C][C] 0.5699[/C][C] 0.2849[/C][/ROW]
[ROW][C]81[/C][C] 0.6969[/C][C] 0.6062[/C][C] 0.3031[/C][/ROW]
[ROW][C]82[/C][C] 0.6588[/C][C] 0.6824[/C][C] 0.3412[/C][/ROW]
[ROW][C]83[/C][C] 0.6189[/C][C] 0.7622[/C][C] 0.3811[/C][/ROW]
[ROW][C]84[/C][C] 0.5777[/C][C] 0.8446[/C][C] 0.4223[/C][/ROW]
[ROW][C]85[/C][C] 0.6431[/C][C] 0.7138[/C][C] 0.3569[/C][/ROW]
[ROW][C]86[/C][C] 0.704[/C][C] 0.5919[/C][C] 0.296[/C][/ROW]
[ROW][C]87[/C][C] 0.6657[/C][C] 0.6686[/C][C] 0.3343[/C][/ROW]
[ROW][C]88[/C][C] 0.687[/C][C] 0.6261[/C][C] 0.313[/C][/ROW]
[ROW][C]89[/C][C] 0.7072[/C][C] 0.5856[/C][C] 0.2928[/C][/ROW]
[ROW][C]90[/C][C] 0.6689[/C][C] 0.6621[/C][C] 0.3311[/C][/ROW]
[ROW][C]91[/C][C] 0.6288[/C][C] 0.7425[/C][C] 0.3712[/C][/ROW]
[ROW][C]92[/C][C] 0.5871[/C][C] 0.8259[/C][C] 0.4129[/C][/ROW]
[ROW][C]93[/C][C] 0.5521[/C][C] 0.8959[/C][C] 0.4479[/C][/ROW]
[ROW][C]94[/C][C] 0.577[/C][C] 0.846[/C][C] 0.423[/C][/ROW]
[ROW][C]95[/C][C] 0.5339[/C][C] 0.9322[/C][C] 0.4661[/C][/ROW]
[ROW][C]96[/C][C] 0.5998[/C][C] 0.8003[/C][C] 0.4002[/C][/ROW]
[ROW][C]97[/C][C] 0.6638[/C][C] 0.6723[/C][C] 0.3362[/C][/ROW]
[ROW][C]98[/C][C] 0.6315[/C][C] 0.7371[/C][C] 0.3685[/C][/ROW]
[ROW][C]99[/C][C] 0.6563[/C][C] 0.6875[/C][C] 0.3437[/C][/ROW]
[ROW][C]100[/C][C] 0.8055[/C][C] 0.3891[/C][C] 0.1945[/C][/ROW]
[ROW][C]101[/C][C] 0.7732[/C][C] 0.4536[/C][C] 0.2268[/C][/ROW]
[ROW][C]102[/C][C] 0.8246[/C][C] 0.3509[/C][C] 0.1754[/C][/ROW]
[ROW][C]103[/C][C] 0.8424[/C][C] 0.3153[/C][C] 0.1576[/C][/ROW]
[ROW][C]104[/C][C] 0.8838[/C][C] 0.2325[/C][C] 0.1162[/C][/ROW]
[ROW][C]105[/C][C] 0.86[/C][C] 0.28[/C][C] 0.14[/C][/ROW]
[ROW][C]106[/C][C] 0.8331[/C][C] 0.3338[/C][C] 0.1669[/C][/ROW]
[ROW][C]107[/C][C] 0.803[/C][C] 0.3941[/C][C] 0.197[/C][/ROW]
[ROW][C]108[/C][C] 0.7697[/C][C] 0.4607[/C][C] 0.2303[/C][/ROW]
[ROW][C]109[/C][C] 0.7906[/C][C] 0.4187[/C][C] 0.2094[/C][/ROW]
[ROW][C]110[/C][C] 0.7557[/C][C] 0.4885[/C][C] 0.2443[/C][/ROW]
[ROW][C]111[/C][C] 0.7178[/C][C] 0.5644[/C][C] 0.2822[/C][/ROW]
[ROW][C]112[/C][C] 0.6911[/C][C] 0.6178[/C][C] 0.3089[/C][/ROW]
[ROW][C]113[/C][C] 0.6486[/C][C] 0.7027[/C][C] 0.3514[/C][/ROW]
[ROW][C]114[/C][C] 0.6753[/C][C] 0.6495[/C][C] 0.3247[/C][/ROW]
[ROW][C]115[/C][C] 0.6313[/C][C] 0.7373[/C][C] 0.3687[/C][/ROW]
[ROW][C]116[/C][C] 0.7005[/C][C] 0.599[/C][C] 0.2995[/C][/ROW]
[ROW][C]117[/C][C] 0.7672[/C][C] 0.4655[/C][C] 0.2328[/C][/ROW]
[ROW][C]118[/C][C] 0.7289[/C][C] 0.5422[/C][C] 0.2711[/C][/ROW]
[ROW][C]119[/C][C] 0.6875[/C][C] 0.625[/C][C] 0.3125[/C][/ROW]
[ROW][C]120[/C][C] 0.7114[/C][C] 0.5772[/C][C] 0.2886[/C][/ROW]
[ROW][C]121[/C][C] 0.668[/C][C] 0.664[/C][C] 0.332[/C][/ROW]
[ROW][C]122[/C][C] 0.6219[/C][C] 0.7561[/C][C] 0.3781[/C][/ROW]
[ROW][C]123[/C][C] 0.5906[/C][C] 0.8188[/C][C] 0.4094[/C][/ROW]
[ROW][C]124[/C][C] 0.6204[/C][C] 0.7593[/C][C] 0.3796[/C][/ROW]
[ROW][C]125[/C][C] 0.5708[/C][C] 0.8584[/C][C] 0.4292[/C][/ROW]
[ROW][C]126[/C][C] 0.5198[/C][C] 0.9603[/C][C] 0.4802[/C][/ROW]
[ROW][C]127[/C][C] 0.5528[/C][C] 0.8943[/C][C] 0.4472[/C][/ROW]
[ROW][C]128[/C][C] 0.5004[/C][C] 0.9992[/C][C] 0.4996[/C][/ROW]
[ROW][C]129[/C][C] 0.4632[/C][C] 0.9265[/C][C] 0.5368[/C][/ROW]
[ROW][C]130[/C][C] 0.4107[/C][C] 0.8214[/C][C] 0.5893[/C][/ROW]
[ROW][C]131[/C][C] 0.4907[/C][C] 0.9813[/C][C] 0.5093[/C][/ROW]
[ROW][C]132[/C][C] 0.8155[/C][C] 0.3689[/C][C] 0.1845[/C][/ROW]
[ROW][C]133[/C][C] 0.7848[/C][C] 0.4305[/C][C] 0.2152[/C][/ROW]
[ROW][C]134[/C][C] 0.7401[/C][C] 0.5198[/C][C] 0.2599[/C][/ROW]
[ROW][C]135[/C][C] 0.7191[/C][C] 0.5619[/C][C] 0.2809[/C][/ROW]
[ROW][C]136[/C][C] 0.7002[/C][C] 0.5997[/C][C] 0.2998[/C][/ROW]
[ROW][C]137[/C][C] 0.6736[/C][C] 0.6528[/C][C] 0.3264[/C][/ROW]
[ROW][C]138[/C][C] 0.6573[/C][C] 0.6854[/C][C] 0.3427[/C][/ROW]
[ROW][C]139[/C][C] 0.5987[/C][C] 0.8026[/C][C] 0.4013[/C][/ROW]
[ROW][C]140[/C][C] 0.654[/C][C] 0.692[/C][C] 0.346[/C][/ROW]
[ROW][C]141[/C][C] 0.5929[/C][C] 0.8142[/C][C] 0.4071[/C][/ROW]
[ROW][C]142[/C][C] 0.5286[/C][C] 0.9428[/C][C] 0.4714[/C][/ROW]
[ROW][C]143[/C][C] 0.4627[/C][C] 0.9253[/C][C] 0.5373[/C][/ROW]
[ROW][C]144[/C][C] 0.3968[/C][C] 0.7936[/C][C] 0.6032[/C][/ROW]
[ROW][C]145[/C][C] 0.4718[/C][C] 0.9436[/C][C] 0.5282[/C][/ROW]
[ROW][C]146[/C][C] 0.4907[/C][C] 0.9813[/C][C] 0.5093[/C][/ROW]
[ROW][C]147[/C][C] 0.4198[/C][C] 0.8395[/C][C] 0.5802[/C][/ROW]
[ROW][C]148[/C][C] 0.3552[/C][C] 0.7105[/C][C] 0.6448[/C][/ROW]
[ROW][C]149[/C][C] 0.3728[/C][C] 0.7456[/C][C] 0.6272[/C][/ROW]
[ROW][C]150[/C][C] 0.3019[/C][C] 0.6038[/C][C] 0.6981[/C][/ROW]
[ROW][C]151[/C][C] 0.2364[/C][C] 0.4728[/C][C] 0.7636[/C][/ROW]
[ROW][C]152[/C][C] 0.2975[/C][C] 0.595[/C][C] 0.7025[/C][/ROW]
[ROW][C]153[/C][C] 0.2287[/C][C] 0.4574[/C][C] 0.7713[/C][/ROW]
[ROW][C]154[/C][C] 0.2765[/C][C] 0.553[/C][C] 0.7235[/C][/ROW]
[ROW][C]155[/C][C] 0.2041[/C][C] 0.4083[/C][C] 0.7959[/C][/ROW]
[ROW][C]156[/C][C] 0.3108[/C][C] 0.6217[/C][C] 0.6892[/C][/ROW]
[ROW][C]157[/C][C] 0.2248[/C][C] 0.4495[/C][C] 0.7752[/C][/ROW]
[ROW][C]158[/C][C] 0.237[/C][C] 0.474[/C][C] 0.763[/C][/ROW]
[ROW][C]159[/C][C] 0.1541[/C][C] 0.3082[/C][C] 0.8459[/C][/ROW]
[ROW][C]160[/C][C] 0.546[/C][C] 0.908[/C][C] 0.454[/C][/ROW]
[ROW][C]161[/C][C] 0.5007[/C][C] 0.9986[/C][C] 0.4993[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=&T=5








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description                 # significant tests   % significant tests   OK/NOK
1% type I error level       0                     0                     OK
5% type I error level       0                     0                     OK
10% type I error level      0                     0                     OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 &  0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=&T=6

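The Goldfeld-Quandt p-values tabulated above are produced at every admissible breakpoint by `gqtest()` from the `lmtest` package (see the R code at the bottom of this page). A base-R sketch of the underlying computation for a single breakpoint, on toy data:

```r
# Base-R sketch of one Goldfeld-Quandt test; lmtest::gqtest() performs the same
# computation, and the module's loop repeats it for every breakpoint.
set.seed(1)
x <- rnorm(40); y <- 2 + 0.5 * x + rnorm(40)   # toy data, not the original series
gq_p <- function(y, x, point, alternative = "greater") {
  f1 <- lm(y[1:point] ~ x[1:point])                              # first segment
  f2 <- lm(y[(point + 1):length(y)] ~ x[(point + 1):length(x)])  # second segment
  df1 <- f1$df.residual; df2 <- f2$df.residual
  gq  <- (sum(resid(f2)^2) / df2) / (sum(resid(f1)^2) / df1)     # variance ratio
  switch(alternative,
         greater   = pf(gq, df2, df1, lower.tail = FALSE),
         less      = pf(gq, df2, df1),
         two.sided = 2 * min(pf(gq, df2, df1),
                             pf(gq, df2, df1, lower.tail = FALSE)))
}
gq_p(y, x, point = 20)                   # p-value against "variance increases"
```

As in the table, the "greater" and "less" p-values at any breakpoint sum to one. The meta analysis then simply counts how many of the two-sided p-values fall below the 1%, 5% and 10% levels; here none do, hence "OK" throughout.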







Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.8011, df1 = 2, df2 = 162, p-value = 0.1684
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=&T=7

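The three RESET variants above are reported by `resettest()` from the `lmtest` package with types "fitted", "regressor" and "princomp"; with a single regressor the three augmentations span the same space, which is presumably why the reported statistics coincide. The test itself is an F comparison of the model with and without powers 2 and 3 of the fitted values, sketched here in base R on toy data:

```r
# Base-R sketch of Ramsey's RESET (powers 2 and 3 of the fitted values),
# equivalent to lmtest::resettest(m0, power = 2:3, type = "fitted").
set.seed(2)
x <- rnorm(50); y <- 1 + x + rnorm(50)   # toy stand-in for the original series
m0 <- lm(y ~ x)                          # restricted model
f  <- fitted(m0)
m1 <- lm(y ~ x + I(f^2) + I(f^3))        # augmented with powers of fitted values
anova(m0, m1)                            # F statistic and p-value of the RESET test
```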



Parameters (Session):
Parameters (R input):
par1 = ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y)) #transpose the uploaded series and drop rows with missing values
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1]) #move the endogenous column to the front
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', 1:11, sep='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', 1:3, sep='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
if (length(attr(terms(mylm), "term.labels")) >= 2) {
vif <- vif(mylm)
} else {
vif <- 'VIF not computed: car::vif() requires a model with at least 2 terms'
}
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')
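The engine error reported for this run ("Error in vif.default(mylm) : model contains fewer than 2 terms") comes from the final vif(mylm) call: car::vif() stops when the fitted model has fewer than two non-intercept terms, which is the case here with a single regressor. A hedged sketch of a guard (safe_vif is an illustrative helper, not part of the module):

```r
# Sketch of guarding the VIF computation against the error that
# halted this run; car::vif() needs at least two regressors.
library(car)

set.seed(1)
d <- data.frame(y = rnorm(50), x1 = rnorm(50), x2 = rnorm(50))

mylm1 <- lm(y ~ x1, data = d)        # one regressor: vif() would error
mylm2 <- lm(y ~ x1 + x2, data = d)   # two regressors: vif() works

safe_vif <- function(model) {
  # Only call vif() when there are at least two non-intercept terms.
  if (length(attr(terms(model), "term.labels")) >= 2) vif(model) else NA
}

safe_vif(mylm1)  # NA instead of an error
safe_vif(mylm2)  # named vector of variance inflation factors
```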