Statistical Computations at FreeStatistics.org - Multiple Regression

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sun, 22 Jan 2017 10:49:22 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2017/Jan/22/t1485078615vu7bw46liclira6.htm/, Retrieved Tue, 14 May 2024 19:40:01 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=303331, Retrieved Tue, 14 May 2024 19:40:01 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 110
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2017-01-22 09:49:22] [2e11ca31a00cf8de75c33c1af2d59434] [Current]
Dataseries X:
149 0.5 2011 1 0
139 0.5 2011 1 1
148 0.4 2011 1 0
158 0.5 2011 1 1
128 0.7 2011 1 1
224 0.3 2011 1 1
159 0.4 2011 1 0
105 0.4 2011 1 1
159 0.7 2011 1 1
167 0.6 2011 1 1
165 0.6 2011 1 1
159 0.2 2011 1 1
119 0.4 2011 1 1
176 0.4 2011 1 0
54 0.5 2011 1 0
91 0.3 2011 0 0
163 0.4 2011 1 1
124 0.7 2011 1 0
137 0.5 2011 0 1
121 0.2 2011 1 0
153 0.3 2011 1 1
148 0.6 2011 1 1
221 0.6 2011 1 0
188 0.2 2011 1 1
149 0.7 2011 1 1
244 0.2 2011 1 1
148 1 2011 0 1
92 0.4 2011 0 0
150 0.4 2011 1 1
153 0.2 2011 1 0
94 0.4 2011 1 0
156 0.4 2011 1 0
132 0.7 2011 1 1
161 0.2 2011 1 1
105 0.6 2011 1 1
97 0.3 2011 1 1
151 0.3 2011 1 0
131 0.2 2011 0 1
166 0.5 2011 1 1
157 0.7 2011 1 0
111 0.6 2011 1 1
145 0.4 2011 1 1
162 0.6 2011 1 1
163 0.4 2011 1 1
59 0.3 2011 0 1
187 0.5 2011 1 0
109 0.2 2011 1 1
90 0.3 2011 0 1
105 0.5 2011 1 0
83 0.7 2011 0 1
116 0.4 2011 0 1
42 0.3 2011 0 1
148 0.2 2011 1 1
155 0.5 2011 0 1
125 0.4 2011 1 1
116 0.6 2011 1 1
128 0.4 2011 0 0
138 0.4 2011 1 1
49 0.2 2011 0 0
96 0.9 2011 0 1
164 0.8 2011 1 1
162 0.8 2011 1 0
99 0.3 2011 1 0
202 0.2 2011 1 1
186 0.4 2011 1 0
66 0.2 2011 0 1
183 0.2 2011 1 0
214 0.1 2011 1 1
188 0.4 2011 1 1
104 0.5 2011 0 0
177 0.8 2011 1 0
126 0.4 2011 1 0
76 0.6 2011 0 0
99 0.5 2011 0 1
139 0.3 2011 1 0
162 0.4 2011 1 0
108 0.6 2011 0 1
159 0.4 2011 1 0
74 0.3 2011 0 0
110 0.8 2011 1 1
96 0.6 2011 0 0
116 0.3 2011 0 0
87 0.5 2011 0 0
97 0.4 2011 0 1
127 0.3 2011 0 0
106 0.7 2011 0 1
80 0.2 2011 0 1
74 0.4 2011 0 0
91 0.6 2011 0 0
133 0.6 2011 0 0
74 0.6 2011 0 1
114 0.4 2011 0 1
140 0.6 2011 0 1
95 0.5 2011 0 0
98 0.5 2011 0 1
121 0.6 2011 0 0
126 0.8 2011 0 1
98 0.5 2011 0 1
95 0.6 2011 0 1
110 0.4 2011 0 1
70 0.3 2011 0 1
102 0.3 2011 0 0
86 0.2 2011 0 1
130 0.4 2011 0 1
96 0.5 2011 0 1
102 0.3 2011 0 0
100 0.4 2011 0 0
94 0.5 2011 0 0
52 0.3 2011 0 0
98 0.5 2011 0 0
118 0.4 2011 0 0
99 0.4 2011 0 1
48 0.6 2012 1 1
50 0.3 2012 1 1
150 0.4 2012 1 1
154 0.3 2012 1 1
109 1 2012 0 0
68 0.4 2012 0 1
194 0.8 2012 1 1
158 0.3 2012 1 0
159 0.5 2012 1 1
67 0.4 2012 1 0
147 0.3 2012 1 0
39 0.5 2012 1 1
100 0.3 2012 1 1
111 0.3 2012 1 1
138 0.4 2012 1 1
101 0.3 2012 1 1
131 0.6 2012 0 1
101 0.6 2012 1 1
114 0.4 2012 1 1
165 0.4 2012 1 0
114 0.4 2012 1 1
111 0.3 2012 1 1
75 0.2 2012 1 1
82 0.5 2012 1 1
121 0.4 2012 1 1
32 0.4 2012 1 1
150 0.4 2012 1 0
117 0.3 2012 1 1
71 0.4 2012 0 1
165 0.2 2012 1 1
154 0 2012 1 1
126 0.4 2012 1 1
149 0.6 2012 1 0
145 0.4 2012 1 0
120 0.4 2012 1 1
109 0.4 2012 1 0
132 0.2 2012 1 0
172 0.4 2012 1 1
169 0.3 2012 1 0
114 0.6 2012 1 1
156 0.6 2012 1 1
172 0.4 2012 1 0
68 0.5 2012 0 1
89 0.4 2012 0 1
167 0.6 2012 1 1
113 0.6 2012 1 0
115 0.9 2012 0 0
78 0.4 2012 0 0
118 0.8 2012 0 0
87 0.5 2012 0 1
173 0.4 2012 1 0
2 0.4 2012 1 1
162 0.7 2012 0 0
49 0.4 2012 0 1
122 0.8 2012 0 0
96 0.4 2012 0 1
100 0.3 2012 0 0
82 0.5 2012 0 0
100 0.8 2012 0 1
115 0.4 2012 0 0
141 1 2012 0 1
165 0.5 2012 1 1
165 0.5 2012 1 1
110 0.3 2012 0 1
118 0.3 2012 1 1
158 0.3 2012 1 0
146 0.4 2012 0 1
49 0.5 2012 1 0
90 0.5 2012 0 0
121 0.4 2012 0 0
155 0.7 2012 1 1
104 0.5 2012 0 0
147 0.4 2012 0 1
110 0.7 2012 0 0
108 0.7 2012 0 0
113 0.7 2012 0 0
115 0.7 2012 0 0
61 0.7 2012 0 1
60 0.7 2012 0 1
109 0.1 2012 0 1
68 0.2 2012 0 1
111 0.3 2012 0 0
77 0.6 2012 0 0
73 0.8 2012 0 1
151 0.8 2012 1 0
89 0 2012 0 0
78 0.3 2012 0 0
110 0.6 2012 0 0
220 0.5 2012 1 1
65 0.7 2012 0 1
141 0.3 2012 1 0
117 0.3 2012 0 0
122 0.4 2012 1 1
63 0.4 2012 0 0
44 0.1 2012 1 1
52 0.5 2012 0 1
131 0 2012 0 0
101 0.4 2012 0 1
42 0.6 2012 0 1
152 0.4 2012 1 1
107 0.1 2012 1 0
77 0.3 2012 0 0
154 0.7 2012 1 0
103 0.3 2012 1 1
96 0.5 2012 0 1
175 0.3 2012 1 1
57 0.6 2012 0 1
112 0.9 2012 0 0
143 0.4 2012 1 0
49 0.3 2012 0 0
110 0.9 2012 1 1
131 0.5 2012 1 1
167 0.3 2012 1 0
56 0.6 2012 0 0
137 0.2 2012 1 0
86 0.4 2012 0 1
121 0.5 2012 1 1
149 0.4 2012 1 0
168 0 2012 1 0
140 0.2 2012 1 0
88 0.5 2012 0 1
168 0.3 2012 1 1
94 0 2012 1 1
51 0.5 2012 1 1
48 0.6 2012 0 0
145 0.3 2012 1 1
66 0 2012 1 1
85 0.3 2012 0 1
109 0.5 2012 1 0
63 0.4 2012 0 0
102 0.5 2012 0 1
162 0.7 2012 0 0
86 0.8 2012 0 1
114 0.6 2012 0 1
164 0.4 2012 1 0
119 0.5 2012 1 1
126 0.5 2012 1 0
132 0.3 2012 1 1
142 0.6 2012 1 1
83 0.3 2012 1 0
94 0.6 2012 0 1
81 0.3 2012 0 0
166 0.7 2012 1 1
110 0.7 2012 0 0
64 0.6 2012 0 1
93 0.5 2012 1 0
104 0.5 2012 0 0
105 0.4 2012 0 1
49 0.4 2012 0 1
88 0.7 2012 0 0
95 0.2 2012 0 1
102 0.5 2012 0 1
99 0.4 2012 0 0
63 0.2 2012 0 1
76 0.5 2012 0 0
109 0.4 2012 0 0
117 0.7 2012 0 1
57 0.6 2012 0 1
120 0.4 2012 0 0
73 0.5 2012 0 1
91 0 2012 0 0
108 0.7 2012 0 0
105 0.4 2012 0 1
117 0.5 2012 1 0
119 0.6 2012 0 0
31 0.8 2012 0 1
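
A minimal sketch of how this data series could be read into R, assuming the five columns are LFM, Algebraic_Reasoning, year, group and gender (the names used in the regression output below) and that the values have been saved to a whitespace-separated file with a hypothetical name such as dataseries_x.txt:

# read the whitespace-separated series; the pasted data has no header row
df <- read.table('dataseries_x.txt', header = FALSE,
                 col.names = c('LFM', 'Algebraic_Reasoning', 'year', 'group', 'gender'))
str(df)   # should report 278 observations of 5 variables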









Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center
R Framework error message:
Warning: there are blank lines in the 'Data X' field.
Please, use NA for missing data - blank lines are simply
deleted and are NOT treated as missing values.


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=303331&T=0








Multiple Linear Regression - Estimated Regression Equation
LFM[t] = +31733 + 17.2873 Algebraic_Reasoning[t] - 15.7297 year[t] + 42.2394 group[t] - 7.47667 gender[t] + e[t]


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=303331&T=1
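
A minimal sketch of the corresponding fit in plain R, assuming the data frame df from the sketch above; this reproduces the estimated equation with lm(), whereas the module itself (see the R code below) builds the design matrix by hand and calls lm() on the whole data frame:

# ordinary least squares with LFM as the dependent variable
mylm <- lm(LFM ~ Algebraic_Reasoning + year + group + gender, data = df)
coef(mylm)   # approximately 31733, 17.2873, -15.7297, 42.2394, -7.47667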








Multiple Linear Regression - Ordinary Least Squares
Variable             Parameter    S.D.    T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)          +3.173e+04   8162    +3.8880e+00                  0.0001271        6.356e-05
Algebraic_Reasoning  +17.29       10.5    +1.6470e+00                  0.1008           0.05038
year                 -15.73       4.058   -3.8770e+00                  0.0001328        6.638e-05
group                +42.24       4.064   +1.0390e+01                  1.531e-21        7.654e-22
gender               -7.477       4.034   -1.8530e+00                  0.0649           0.03245


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=303331&T=2
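
A short sketch of how the columns of this table can be extracted from the fitted model, assuming mylm from the sketch above; note that the module reports the 1-tail p-value simply as half of the 2-tail p-value:

mysum <- summary(mylm)
ctab <- mysum$coefficients                      # Estimate, Std. Error, t value, Pr(>|t|)
cbind(ctab, '1-tail p-value' = ctab[, 4] / 2)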








Multiple Linear Regression - Regression Statistics
Multiple R: 0.5634
R-squared: 0.3175
Adjusted R-squared: 0.3075
F-TEST (value): 31.74
F-TEST (DF numerator): 4
F-TEST (DF denominator): 273
p-value: 0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 33.15
Sum Squared Residuals: 3e+05


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=303331&T=3
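
A short sketch of how these statistics can be recovered from the same summary object (mysum from the sketch above); the overall F-test p-value is obtained with pf(), exactly as in the module's R code below:

sqrt(mysum$r.squared)      # Multiple R
mysum$r.squared            # R-squared
mysum$adj.r.squared        # Adjusted R-squared
f <- mysum$fstatistic      # F value, DF numerator, DF denominator
1 - pf(f[1], f[2], f[3])   # p-value of the F-test
mysum$sigma                # residual standard deviation
sum(residuals(mylm)^2)     # sum of squared residuals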








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.5246, df1 = 2, df2 = 271, p-value = 0.2196
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.51069, df1 = 8, df2 = 265, p-value = 0.848
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.4445, df1 = 2, df2 = 271, p-value = 0.2377


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=303331&T=4
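
A short sketch of the three RESET variants as computed by the module (lmtest::resettest applied to the fitted model, assuming mylm from the sketch above):

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')     # powers of the fitted values
resettest(mylm, power = 2:3, type = 'regressor')  # powers of the regressors
resettest(mylm, power = 2:3, type = 'princomp')   # powers of the first principal component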








Variance Inflation Factors (Multicollinearity)
> vif
Algebraic_Reasoning                year               group              gender 
           1.033994            1.001986            1.043805            1.009937 


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=303331&T=5
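
A short sketch of the variance inflation factors as computed by the module (car::vif applied to the fitted model, assuming mylm from the sketch above); values this close to 1 indicate negligible multicollinearity among the regressors:

library(car)
vif(mylm)   # one VIF per regressor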




Parameters (Session):
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))     # transpose the pasted data so that rows are observations; drop incomplete rows
k <- length(x[1,])     # number of variables (columns)
n <- length(x[,1])     # number of observations (rows)
# move the endogenous series (column par1) to the first column
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
# optional transformations of all series, depending on par3
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
# append par4 non-seasonal lags of the endogenous series as extra regressors
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
# append par5 seasonal (s=12) lags of the endogenous series as extra regressors
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))       # lm() on a data frame regresses its first column on all remaining columns
(mysum <- summary(mylm))
if (n > n25) {         # Goldfeld-Quandt test at every admissible breakpoint (only when n is large enough)
kp3 <- k + 3           # first admissible breakpoint
nmkm3 <- n - k - 3     # last admissible breakpoint
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))   # p-values for the three alternative hypotheses
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
# diagnostic plots, written as PNG files via the bitmap device
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')   # load the framework's table-building helper functions
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')
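
Note that the table.* helpers and RC.texteval used above are loaded from the framework file 'createtable' and are not part of base R, so the script cannot be run as-is in a plain R session. A minimal, purely hypothetical set of stand-ins such as the following - together with skipping the load(file='createtable') line and, if Ghostscript is unavailable, replacing bitmap() with png() - would let the statistical part of the script run locally; the stubs only write plain-text rows and make no attempt to reproduce the site's HTML tables:

# hypothetical stubs for the framework's table-building helpers
table.start <- function() character(0)                  # a table is just a character vector of rows
table.row.start <- function(a) c(a, '')                 # begin a new, empty row
table.element <- function(a, x, span = 1, header = FALSE) {
  a[length(a)] <- paste(a[length(a)], x, sep = ' | ')   # append a cell to the current row
  a
}
table.row.end <- function(a) a
table.end <- function(a) a
table.save <- function(a, file) writeLines(a, con = file)
RC.texteval <- function(name) {
  # capture the printed representation of a top-level object given by its name
  paste(capture.output(print(get(name, envir = globalenv()))), collapse = '\n')
}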