Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sun, 21 Dec 2008 23:08:27 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Dec/22/t122992622252vv7q0xzdfujsb.htm/, Retrieved Mon, 13 May 2024 20:24:57 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=35935, Retrieved Mon, 13 May 2024 20:24:57 +0000
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 174
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [Voedingsmiddelen 2] [2008-12-22 06:08:27] [8e1dd6a8d7300d49f515697199ea9e73] [Current]
Dataseries X:
100,29	0
101,12	0
102,65	0
102,71	0
103,39	0
102,8	0
102,07	0
102,15	0
101,21	0
101,27	0
101,86	0
101,65	0
101,94	0
102,62	0
102,71	0
103,39	0
104,51	0
104,09	0
104,29	0
104,57	0
105,39	0
105,15	0
106,13	0
105,46	0
106,47	0
106,62	0
106,52	0
108,04	0
107,15	0
107,32	0
107,76	0
107,26	0
107,89	0
109,08	0
110,4	0
111,03	0
112,05	0
112,28	0
112,8	0
114,17	0
114,92	0
114,65	0
115,49	0
114,67	1
114,71	1
115,15	1
115,03	1
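
The series is supplied with decimal commas (e.g. 100,29) next to a 0/1 level-shift column. Outside the module this text could be parsed in R with dec = ','; a minimal sketch with illustrative column names and only three of the 47 rows:

raw <- "100,29\t0\n101,12\t0\n115,03\t1"
dat <- read.table(text = raw, sep = "\t", dec = ",",
                  col.names = c("voedingsmiddelen", "dummy"))
str(dat)   # both columns are parsed as numeric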




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135
Source: https://freestatistics.org/blog/index.php?pk=35935&T=0


Multiple Linear Regression - Estimated Regression Equation
voedingsmiddelen[t] = 98.372 + 2.01866666666667 dummy[t] + 0.739722222222227 M1[t] + 0.892444444444448 M2[t] + 1.08266666666667 M3[t] + 1.67038888888889 M4[t] + 1.76561111111111 M5[t] + 1.16833333333333 M6[t] + 1.03605555555556 M7[t] - 0.0283888888888880 M8[t] - 0.210666666666671 M9[t] - 0.167944444444444 M10[t] + 0.204777777777778 M11[t] + 0.319777777777778 t + e[t]
Source: https://freestatistics.org/blog/index.php?pk=35935&T=1
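
As a reading check, the first interpolated value in the Actuals table further down follows directly from this equation: at t = 1 the month is January (M1 = 1, all other dummies 0) and the level-shift dummy equals 0, so:

98.372 + 0.739722222222227 + 0.319777777777778 * 1   # = 99.4315, the interpolation at index 1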


Multiple Linear Regression - Ordinary Least Squares
Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 98.372 | 0.910261 | 108.0701 | 0 | 0
dummy | 2.01866666666667 | 0.910261 | 2.2177 | 0.033574 | 0.016787
M1 | 0.739722222222227 | 1.073458 | 0.6891 | 0.495576 | 0.247788
M2 | 0.892444444444448 | 1.072188 | 0.8324 | 0.41119 | 0.205595
M3 | 1.08266666666667 | 1.071199 | 1.0107 | 0.319511 | 0.159756
M4 | 1.67038888888889 | 1.070492 | 1.5604 | 0.128207 | 0.064103
M5 | 1.76561111111111 | 1.070067 | 1.65 | 0.108431 | 0.054215
M6 | 1.16833333333333 | 1.069926 | 1.092 | 0.282752 | 0.141376
M7 | 1.03605555555556 | 1.070067 | 0.9682 | 0.33998 | 0.16999
M8 | -0.0283888888888880 | 1.091087 | -0.026 | 0.979399 | 0.489699
M9 | -0.210666666666671 | 1.090115 | -0.1933 | 0.847946 | 0.423973
M10 | -0.167944444444444 | 1.08942 | -0.1542 | 0.878423 | 0.439211
M11 | 0.204777777777778 | 1.089003 | 0.188 | 0.851996 | 0.425998
t | 0.319777777777778 | 0.017402 | 18.3756 | 0 | 0
Source: https://freestatistics.org/blog/index.php?pk=35935&T=2
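
The T-STAT column is the parameter estimate divided by its standard deviation, and the 1-tail p-value is half the 2-tail p-value; for the dummy variable, for instance:

2.01866666666667 / 0.910261   # = 2.2177   (T-STAT)
0.033574 / 2                  # = 0.016787 (1-tail p-value)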


Multiple Linear Regression - Regression Statistics
Multiple R: 0.968552904813162
R-squared: 0.938094729422014
Adjusted R-squared: 0.913707804648868
F-TEST (value): 38.4671186772597
F-TEST (DF numerator): 13
F-TEST (DF denominator): 33
p-value: 4.44089209850063e-16
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.40086189198081
Sum Squared Residuals: 64.7596633333334
Source: https://freestatistics.org/blog/index.php?pk=35935&T=3
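
The Multiple R is the square root of R-squared, and the residual standard deviation is the square root of the sum of squared residuals divided by the residual degrees of freedom (47 observations minus 14 estimated parameters = 33). The reported F-test p-value sits at the limit of double precision and should be read as effectively zero:

sqrt(0.938094729422014)            # = 0.968553  (Multiple R)
sqrt(64.7596633333334 / 33)        # = 1.400862  (Residual Standard Deviation)
1 - pf(38.4671186772597, 13, 33)   # ~ 4.44e-16  (p-value, bounded by machine precision)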


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 | 100.29 | 99.4315 | 0.858500000000033
2 | 101.12 | 99.904 | 1.21599999999999
3 | 102.65 | 100.414 | 2.23600000000001
4 | 102.71 | 101.3215 | 1.38849999999999
5 | 103.39 | 101.7365 | 1.65350000000000
6 | 102.8 | 101.459 | 1.34100000000000
7 | 102.07 | 101.6465 | 0.423499999999990
8 | 102.15 | 100.901833333333 | 1.24816666666667
9 | 101.21 | 101.039333333333 | 0.170666666666661
10 | 101.27 | 101.401833333333 | -0.131833333333340
11 | 101.86 | 102.094333333333 | -0.234333333333337
12 | 101.65 | 102.209333333333 | -0.55933333333333
13 | 101.94 | 103.268833333333 | -1.32883333333334
14 | 102.62 | 103.741333333333 | -1.12133333333332
15 | 102.71 | 104.251333333333 | -1.54133333333334
16 | 103.39 | 105.158833333333 | -1.76883333333334
17 | 104.51 | 105.573833333333 | -1.06383333333333
18 | 104.09 | 105.296333333333 | -1.20633333333333
19 | 104.29 | 105.483833333333 | -1.19383333333333
20 | 104.57 | 104.739166666667 | -0.169166666666675
21 | 105.39 | 104.876666666667 | 0.513333333333336
22 | 105.15 | 105.239166666667 | -0.0891666666666628
23 | 106.13 | 105.931666666667 | 0.198333333333328
24 | 105.46 | 106.046666666667 | -0.586666666666675
25 | 106.47 | 107.106166666667 | -0.636166666666675
26 | 106.62 | 107.578666666667 | -0.958666666666668
27 | 106.52 | 108.088666666667 | -1.56866666666667
28 | 108.04 | 108.996166666667 | -0.956166666666662
29 | 107.15 | 109.411166666667 | -2.26116666666666
30 | 107.32 | 109.133666666667 | -1.81366666666667
31 | 107.76 | 109.321166666667 | -1.56116666666666
32 | 107.26 | 108.5765 | -1.31650000000000
33 | 107.89 | 108.714 | -0.823999999999995
34 | 109.08 | 109.0765 | 0.00349999999999862
35 | 110.4 | 109.769 | 0.631000000000009
36 | 111.03 | 109.884 | 1.14600000000000
37 | 112.05 | 110.9435 | 1.10649999999999
38 | 112.28 | 111.416 | 0.863999999999999
39 | 112.8 | 111.926 | 0.874000000000003
40 | 114.17 | 112.8335 | 1.3365
41 | 114.92 | 113.2485 | 1.6715
42 | 114.65 | 112.971 | 1.67900000000001
43 | 115.49 | 113.1585 | 2.3315
44 | 114.67 | 114.4325 | 0.237499999999999
45 | 114.71 | 114.57 | 0.139999999999997
46 | 115.15 | 114.9325 | 0.217500000000004
47 | 115.03 | 115.625 | -0.595
Source: https://freestatistics.org/blog/index.php?pk=35935&T=4
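
Each row satisfies Actuals = Interpolation + Residual (in the R code below the interpolation is computed as x[i] - mysum$resid[i]); for the first observation, for example:

99.4315 + 0.858500000000033   # = 100.29, the actual value at index 1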


Goldfeld-Quandt test for Heteroskedasticity
(p-values under the Alternative Hypothesis)
breakpoint index | greater | 2-sided | less
17 | 0.120015601660755 | 0.240031203321509 | 0.879984398339245
18 | 0.048637052519212 | 0.097274105038424 | 0.951362947480788
19 | 0.0492622795460677 | 0.0985245590921354 | 0.950737720453932
20 | 0.0570705209720304 | 0.114141041944061 | 0.94292947902797
21 | 0.370702473713990 | 0.741404947427981 | 0.62929752628601
22 | 0.537124632087676 | 0.925750735824648 | 0.462875367912324
23 | 0.873944811581093 | 0.252110376837814 | 0.126055188418907
24 | 0.868173743185215 | 0.263652513629571 | 0.131826256814785
25 | 0.884504905459659 | 0.230990189080683 | 0.115495094540341
26 | 0.877911517854512 | 0.244176964290976 | 0.122088482145488
27 | 0.823794841957972 | 0.352410316084055 | 0.176205158042028
28 | 0.829059840705331 | 0.341880318589337 | 0.170940159294669
29 | 0.710021473996684 | 0.579957052006633 | 0.289978526003316
30 | 0.537460413802705 | 0.92507917239459 | 0.462539586197295
Source: https://freestatistics.org/blog/index.php?pk=35935&T=5
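
Each row of this table is a separate call to gqtest() from the lmtest package, with the candidate breakpoint moved from index 17 to 30 (k + 3 to n - k - 3 in the R code below). A minimal sketch for a single breakpoint, assuming mylm is the fitted model from that code:

library(lmtest)
gqtest(mylm, point = 18, alternative = "two.sided")$p.value   # ~ 0.0973, the 2-sided value for breakpoint 18
gqtest(mylm, point = 18, alternative = "greater")$p.value     # ~ 0.0486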


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description | # significant tests | % significant tests | OK/NOK
1% type I error level | 0 | 0 | OK
5% type I error level | 0 | 0 | OK
10% type I error level | 2 | 0.142857142857143 | NOK
Source: https://freestatistics.org/blog/index.php?pk=35935&T=6
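
The '% significant tests' column is reported as a proportion rather than a percentage: 14 breakpoints were tested (indices 17 through 30), and two of them (18 and 19) have a 2-sided p-value below 0.10:

2 / 14   # = 0.142857142857143, the value in the 10% row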


Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
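# Put the column selected by par1 (the dependent variable) in the first position of x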
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
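# Optionally replace the series by first differences (par3 = 'First Differences'; not used in this run)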
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
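# par2 = 'Include Monthly Dummies': add 11 seasonal dummy variables M1..M11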
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
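# par3 = 'Linear Trend': append a deterministic trend column named 't'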
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
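# Fit the OLS regression of the first column of df on all remaining columns and keep its summary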
(mylm <- lm(df))
(mysum <- summary(mylm))
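# Goldfeld-Quandt test at every admissible breakpoint (only run when n > 25)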
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
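# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density, normal Q-Q, lag plot, ACF, PACF, lm diagnostics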
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
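# Assemble the output tables shown above; table.start()/table.element()/table.save() come from the 'createtable' file loaded on the next line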
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}