Free Statistics


Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 24 Nov 2008 17:12:13 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/25/t12275719677ahdbkdxghil4rd.htm/, Retrieved Thu, 09 May 2024 17:28:34 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25559, Retrieved Thu, 09 May 2024 17:28:34 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 190
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
F    D  [Multiple Regression] [Q1 - People saved...] [2008-11-24 10:45:06] [af90f76a5211a482a7c35f2c76d2fd61]
F    D    [Multiple Regression] [] [2008-11-24 17:19:50] [86f0e8d954d872d9ff7f670e7130f4d5]
-    D        [Multiple Regression] [] [2008-11-25 00:12:13] [c0a347e3519123f7eef62b705326dad9] [Current]
Dataseries X:
89.6	0
92.8	0
107.6	0
104.6	0
103.0	0
106.9	0
56.3	0
93.4	0
109.1	0
113.8	0
97.4	0
72.5	0
82.7	0
88.9	0
105.9	0
100.8	0
94.0	0
105.0	0
58.5	0
87.6	0
113.1	0
112.5	0
89.6	0
74.5	0
82.7	0
90.1	0
109.4	0
96.0	0
89.2	0
109.1	0
49.1	1
92.9	1
107.7	1
103.5	1
91.1	1
79.8	1
71.9	1
82.9	1
90.1	1
100.7	1
90.7	1
108.8	1
44.1	1
93.6	1
107.4	1
96.5	1
93.6	1
76.5	1
76.7	1
84.0	1
103.3	1
88.5	1
99.0	1
105.9	1
44.7	1
94.0	0
107.1	0
104.8	0
102.5	0
77.7	0
85.2	0
91.3	0
106.5	0
92.4	0
97.5	0
107.0	0
51.1	0
98.6	0
102.2	0
114.3	0
99.4	0
72.5	0
92.3	0
99.4	0
85.9	0
109.4	0
97.6	0
104.7	0
56.5	0
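
The series above has 79 rows and two tab-separated columns: the dependent variable y and a 0/1 indicator x. As a rough, hypothetical illustration (the file name series.txt and the object names are not part of the original computation), the two columns could be read into R like this:

dat <- read.table('series.txt', header=FALSE, col.names=c('y','x'))  # the pasted data saved locally
y <- dat$y   # dependent variable (monthly series)
x <- dat$x   # 0/1 indicator variable
length(y)    # 79 observations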




Summary of computational transaction
Raw Input	view raw input (R code)
Raw Output	view raw output of R engine
Computing time	3 seconds
R Server	'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25559&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25559&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25559&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = 77.6562368972747 - 4.82405660377359 x[t] + 7.14589198362787 M1[t] + 14.0569606668664 M2[t] + 25.3966007786762 M3[t] + 23.0790980333433 M4[t] + 20.0330238594390 M5[t] + 30.9583782569632 M6[t] - 23.6414021164021 M7[t] + 17.7223919337127 M8[t] + 32.1501272836178 M9[t] + 31.9611959668563 M10[t] + 20.0055979834282 M11[t] - 0.0110686832384945 t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  77.6562368972747 -4.82405660377359x[t] +  7.14589198362787M1[t] +  14.0569606668664M2[t] +  25.3966007786762M3[t] +  23.0790980333433M4[t] +  20.0330238594390M5[t] +  30.9583782569632M6[t] -23.6414021164021M7[t] +  17.7223919337127M8[t] +  32.1501272836178M9[t] +  31.9611959668563M10[t] +  20.0055979834282M11[t] -0.0110686832384945t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25559&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]y[t] =  +  77.6562368972747 -4.82405660377359x[t] +  7.14589198362787M1[t] +  14.0569606668664M2[t] +  25.3966007786762M3[t] +  23.0790980333433M4[t] +  20.0330238594390M5[t] +  30.9583782569632M6[t] -23.6414021164021M7[t] +  17.7223919337127M8[t] +  32.1501272836178M9[t] +  31.9611959668563M10[t] +  20.0055979834282M11[t] -0.0110686832384945t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25559&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25559&T=1

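To make the equation concrete: for the first observation (M1 = 1, all other dummies 0, x = 0, t = 1) the coefficients above give a fitted value of about 84.791, which matches the first interpolation value in the Actuals/Interpolation table further down; the corresponding residual is 89.6 - 84.791, about 4.809. A minimal check in R, with the coefficients copied from the equation:

b0 <- 77.6562368972747; bx <- -4.82405660377359
bM1 <- 7.14589198362787; bt <- -0.0110686832384945
fit1 <- b0 + bx*0 + bM1*1 + bt*1  # fitted value for observation 1
fit1         # approx. 84.791
89.6 - fit1  # approx. 4.809, the first residual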







Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	77.6562368972747	2.443488	31.7809	0	0
x	-4.82405660377359	1.288105	-3.7451	0.000385	0.000192
M1	7.14589198362787	2.942881	2.4282	0.017951	0.008975
M2	14.0569606668664	2.941874	4.7782	1e-05	5e-06
M3	25.3966007786762	2.941102	8.6351	0	0
M4	23.0790980333433	2.940564	7.8485	0	0
M5	20.0330238594390	2.94026	6.8134	0	0
M6	30.9583782569632	2.94019	10.5294	0	0
M7	-23.6414021164021	2.942131	-8.0355	0	0
M8	17.7223919337127	3.052321	5.8062	0	0
M9	32.1501272836178	3.051531	10.5357	0	0
M10	31.9611959668563	3.050967	10.4758	0	0
M11	20.0055979834282	3.050628	6.5579	0	0
t	-0.0110686832384945	0.026249	-0.4217	0.674647	0.337323

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 77.6562368972747 & 2.443488 & 31.7809 & 0 & 0 \tabularnewline
x & -4.82405660377359 & 1.288105 & -3.7451 & 0.000385 & 0.000192 \tabularnewline
M1 & 7.14589198362787 & 2.942881 & 2.4282 & 0.017951 & 0.008975 \tabularnewline
M2 & 14.0569606668664 & 2.941874 & 4.7782 & 1e-05 & 5e-06 \tabularnewline
M3 & 25.3966007786762 & 2.941102 & 8.6351 & 0 & 0 \tabularnewline
M4 & 23.0790980333433 & 2.940564 & 7.8485 & 0 & 0 \tabularnewline
M5 & 20.0330238594390 & 2.94026 & 6.8134 & 0 & 0 \tabularnewline
M6 & 30.9583782569632 & 2.94019 & 10.5294 & 0 & 0 \tabularnewline
M7 & -23.6414021164021 & 2.942131 & -8.0355 & 0 & 0 \tabularnewline
M8 & 17.7223919337127 & 3.052321 & 5.8062 & 0 & 0 \tabularnewline
M9 & 32.1501272836178 & 3.051531 & 10.5357 & 0 & 0 \tabularnewline
M10 & 31.9611959668563 & 3.050967 & 10.4758 & 0 & 0 \tabularnewline
M11 & 20.0055979834282 & 3.050628 & 6.5579 & 0 & 0 \tabularnewline
t & -0.0110686832384945 & 0.026249 & -0.4217 & 0.674647 & 0.337323 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25559&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]77.6562368972747[/C][C]2.443488[/C][C]31.7809[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]x[/C][C]-4.82405660377359[/C][C]1.288105[/C][C]-3.7451[/C][C]0.000385[/C][C]0.000192[/C][/ROW]
[ROW][C]M1[/C][C]7.14589198362787[/C][C]2.942881[/C][C]2.4282[/C][C]0.017951[/C][C]0.008975[/C][/ROW]
[ROW][C]M2[/C][C]14.0569606668664[/C][C]2.941874[/C][C]4.7782[/C][C]1e-05[/C][C]5e-06[/C][/ROW]
[ROW][C]M3[/C][C]25.3966007786762[/C][C]2.941102[/C][C]8.6351[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M4[/C][C]23.0790980333433[/C][C]2.940564[/C][C]7.8485[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M5[/C][C]20.0330238594390[/C][C]2.94026[/C][C]6.8134[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M6[/C][C]30.9583782569632[/C][C]2.94019[/C][C]10.5294[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M7[/C][C]-23.6414021164021[/C][C]2.942131[/C][C]-8.0355[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M8[/C][C]17.7223919337127[/C][C]3.052321[/C][C]5.8062[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M9[/C][C]32.1501272836178[/C][C]3.051531[/C][C]10.5357[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M10[/C][C]31.9611959668563[/C][C]3.050967[/C][C]10.4758[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M11[/C][C]20.0055979834282[/C][C]3.050628[/C][C]6.5579[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]t[/C][C]-0.0110686832384945[/C][C]0.026249[/C][C]-0.4217[/C][C]0.674647[/C][C]0.337323[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25559&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25559&T=2

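The columns of this table are related in the usual way: T-STAT = Parameter / S.D., the 2-tail p-value comes from a t distribution with 65 degrees of freedom (79 observations minus 14 estimated parameters, the same 65 as the F-test denominator below), and the 1-tail p-value is half of it. A minimal sketch using the row for x; it should reproduce the reported -3.7451 and 0.000385 up to rounding:

est <- -4.82405660377359; se <- 1.288105; df <- 65
tstat <- est/se              # approx. -3.7451
p2 <- 2*pt(-abs(tstat), df)  # 2-tail p-value, approx. 0.000385
p1 <- p2/2                   # 1-tail p-value, approx. 0.000192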







Multiple Linear Regression - Regression Statistics
Multiple R	0.9569328575323
R-squared	0.915720493824934
Adjusted R-squared	0.898864592589921
F-TEST (value)	54.3264036171971
F-TEST (DF numerator)	13
F-TEST (DF denominator)	65
p-value	0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	5.28364714189449
Sum Squared Residuals	1814.60026280323

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.9569328575323 \tabularnewline
R-squared & 0.915720493824934 \tabularnewline
Adjusted R-squared & 0.898864592589921 \tabularnewline
F-TEST (value) & 54.3264036171971 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 65 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 5.28364714189449 \tabularnewline
Sum Squared Residuals & 1814.60026280323 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25559&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.9569328575323[/C][/ROW]
[ROW][C]R-squared[/C][C]0.915720493824934[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.898864592589921[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]54.3264036171971[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]13[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]65[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]5.28364714189449[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]1814.60026280323[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25559&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25559&T=3

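These statistics are mutually consistent: Multiple R is the square root of R-squared, the Adjusted R-squared equals 1 - (1 - R²)(n - 1)/(n - p - 1) with n = 79 observations and p = 13 regressors, the F-test p-value follows from pf(), and the Residual Standard Deviation is sqrt(Sum Squared Residuals / 65). A short check in R:

R2 <- 0.915720493824934
sqrt(R2)                             # Multiple R, approx. 0.9569
1 - (1 - R2)*(79 - 1)/(79 - 13 - 1)  # Adjusted R-squared, approx. 0.89886
1 - pf(54.3264036171971, 13, 65)     # F-test p-value, effectively 0
sqrt(1814.60026280323/65)            # Residual Standard Deviation, approx. 5.2836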







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	89.6	84.7910601976639	4.80893980233612
2	92.8	91.691060197664	1.10893980233604
3	107.6	103.019631626235	4.58036837376456
4	104.6	100.691060197664	3.90893980233596
5	103	97.633917340521	5.36608265947891
6	106.9	108.548203054807	-1.64820305480682
7	56.3	53.937353998203	2.36264600179695
8	93.4	95.2900793650794	-1.89007936507938
9	109.1	109.706746031746	-0.606746031746072
10	113.8	109.506746031746	4.29325396825394
11	97.4	97.5400793650794	-0.140079365079344
12	72.5	77.5234126984127	-5.02341269841268
13	82.7	84.658235998802	-1.95823599880205
14	88.9	91.558235998802	-2.65823599880203
15	105.9	102.886807427373	3.01319257262654
16	100.8	100.558235998802	0.241764001197966
17	94	97.5010931416592	-3.50109314165918
18	105	108.415378855945	-3.41537885594489
19	58.5	53.8045297993411	4.69547020065889
20	87.6	95.1572551662174	-7.55725516621743
21	113.1	109.573921832884	3.52607816711591
22	112.5	109.373921832884	3.12607816711591
23	89.6	97.4072551662174	-7.80725516621744
24	74.5	77.3905884995508	-2.89058849955076
25	82.7	84.5254117999401	-1.82541179994012
26	90.1	91.4254117999401	-1.32541179994011
27	109.4	102.753983228512	6.64601677148848
28	96	100.42541179994	-4.42541179994009
29	89.2	97.3682689427972	-8.16826894279725
30	109.1	108.282554657083	0.817445342917033
31	49.1	48.8476489967056	0.252351003294395
32	92.9	90.2003743635819	2.6996256364181
33	107.7	104.617041030249	3.08295896975143
34	103.5	104.417041030249	-0.917041030248573
35	91.1	92.4503743635819	-1.35037436358192
36	79.8	72.4337076969152	7.36629230308475
37	71.9	79.5685309973046	-7.66853099730459
38	82.9	86.4685309973046	-3.56853099730458
39	90.1	97.797102425876	-7.69710242587601
40	100.7	95.4685309973046	5.23146900269543
41	90.7	92.4113881401617	-1.71138814016172
42	108.8	103.325673854447	5.47432614555256
43	44.1	48.7148247978437	-4.61482479784367
44	93.6	90.06755016472	3.53244983528002
45	107.4	104.484216831387	2.91578316861337
46	96.5	104.284216831387	-7.78421683138664
47	93.6	92.31755016472	1.28244983528002
48	76.5	72.3008834980533	4.19911650194669
49	76.7	79.4357067984427	-2.73570679844266
50	84	86.3357067984427	-2.33570679844265
51	103.3	97.664278227014	5.63572177298593
52	88.5	95.3357067984426	-6.83570679844264
53	99	92.2785639412998	6.7214360587002
54	105.9	103.192849655586	2.7071503444145
55	44.7	48.5820005989817	-3.88200059898174
56	94	94.7587825696316	-0.758782569631617
57	107.1	109.175449236298	-2.07544923629829
58	104.8	108.975449236298	-4.17544923629829
59	102.5	97.0087825696316	5.49121743036837
60	77.7	76.992115902965	0.707884097035043
61	85.2	84.1269392033543	1.07306079664569
62	91.3	91.0269392033543	0.273060796645696
63	106.5	102.355510631926	4.14448936807428
64	92.4	100.026939203354	-7.62693920335429
65	97.5	96.9697963462114	0.530203653788556
66	107	107.884082060497	-0.884082060497156
67	51.1	53.2732330038934	-2.17323300389338
68	98.6	94.6259583707697	3.97404162923031
69	102.2	109.042625037436	-6.84262503743635
70	114.3	108.842625037436	5.45737496256364
71	99.4	96.8759583707697	2.52404162923031
72	72.5	76.859291704103	-4.35929170410303
73	92.3	83.9941150044924	8.30588499550762
74	99.4	90.8941150044924	8.50588499550764
75	85.9	102.222686433064	-16.3226864330638
76	109.4	99.8941150044924	9.50588499550765
77	97.6	96.8369721473495	0.763027852650485
78	104.7	107.751257861635	-3.05125786163522
79	56.5	53.1404088050314	3.35959119496856

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 89.6 & 84.7910601976639 & 4.80893980233612 \tabularnewline
2 & 92.8 & 91.691060197664 & 1.10893980233604 \tabularnewline
3 & 107.6 & 103.019631626235 & 4.58036837376456 \tabularnewline
4 & 104.6 & 100.691060197664 & 3.90893980233596 \tabularnewline
5 & 103 & 97.633917340521 & 5.36608265947891 \tabularnewline
6 & 106.9 & 108.548203054807 & -1.64820305480682 \tabularnewline
7 & 56.3 & 53.937353998203 & 2.36264600179695 \tabularnewline
8 & 93.4 & 95.2900793650794 & -1.89007936507938 \tabularnewline
9 & 109.1 & 109.706746031746 & -0.606746031746072 \tabularnewline
10 & 113.8 & 109.506746031746 & 4.29325396825394 \tabularnewline
11 & 97.4 & 97.5400793650794 & -0.140079365079344 \tabularnewline
12 & 72.5 & 77.5234126984127 & -5.02341269841268 \tabularnewline
13 & 82.7 & 84.658235998802 & -1.95823599880205 \tabularnewline
14 & 88.9 & 91.558235998802 & -2.65823599880203 \tabularnewline
15 & 105.9 & 102.886807427373 & 3.01319257262654 \tabularnewline
16 & 100.8 & 100.558235998802 & 0.241764001197966 \tabularnewline
17 & 94 & 97.5010931416592 & -3.50109314165918 \tabularnewline
18 & 105 & 108.415378855945 & -3.41537885594489 \tabularnewline
19 & 58.5 & 53.8045297993411 & 4.69547020065889 \tabularnewline
20 & 87.6 & 95.1572551662174 & -7.55725516621743 \tabularnewline
21 & 113.1 & 109.573921832884 & 3.52607816711591 \tabularnewline
22 & 112.5 & 109.373921832884 & 3.12607816711591 \tabularnewline
23 & 89.6 & 97.4072551662174 & -7.80725516621744 \tabularnewline
24 & 74.5 & 77.3905884995508 & -2.89058849955076 \tabularnewline
25 & 82.7 & 84.5254117999401 & -1.82541179994012 \tabularnewline
26 & 90.1 & 91.4254117999401 & -1.32541179994011 \tabularnewline
27 & 109.4 & 102.753983228512 & 6.64601677148848 \tabularnewline
28 & 96 & 100.42541179994 & -4.42541179994009 \tabularnewline
29 & 89.2 & 97.3682689427972 & -8.16826894279725 \tabularnewline
30 & 109.1 & 108.282554657083 & 0.817445342917033 \tabularnewline
31 & 49.1 & 48.8476489967056 & 0.252351003294395 \tabularnewline
32 & 92.9 & 90.2003743635819 & 2.6996256364181 \tabularnewline
33 & 107.7 & 104.617041030249 & 3.08295896975143 \tabularnewline
34 & 103.5 & 104.417041030249 & -0.917041030248573 \tabularnewline
35 & 91.1 & 92.4503743635819 & -1.35037436358192 \tabularnewline
36 & 79.8 & 72.4337076969152 & 7.36629230308475 \tabularnewline
37 & 71.9 & 79.5685309973046 & -7.66853099730459 \tabularnewline
38 & 82.9 & 86.4685309973046 & -3.56853099730458 \tabularnewline
39 & 90.1 & 97.797102425876 & -7.69710242587601 \tabularnewline
40 & 100.7 & 95.4685309973046 & 5.23146900269543 \tabularnewline
41 & 90.7 & 92.4113881401617 & -1.71138814016172 \tabularnewline
42 & 108.8 & 103.325673854447 & 5.47432614555256 \tabularnewline
43 & 44.1 & 48.7148247978437 & -4.61482479784367 \tabularnewline
44 & 93.6 & 90.06755016472 & 3.53244983528002 \tabularnewline
45 & 107.4 & 104.484216831387 & 2.91578316861337 \tabularnewline
46 & 96.5 & 104.284216831387 & -7.78421683138664 \tabularnewline
47 & 93.6 & 92.31755016472 & 1.28244983528002 \tabularnewline
48 & 76.5 & 72.3008834980533 & 4.19911650194669 \tabularnewline
49 & 76.7 & 79.4357067984427 & -2.73570679844266 \tabularnewline
50 & 84 & 86.3357067984427 & -2.33570679844265 \tabularnewline
51 & 103.3 & 97.664278227014 & 5.63572177298593 \tabularnewline
52 & 88.5 & 95.3357067984426 & -6.83570679844264 \tabularnewline
53 & 99 & 92.2785639412998 & 6.7214360587002 \tabularnewline
54 & 105.9 & 103.192849655586 & 2.7071503444145 \tabularnewline
55 & 44.7 & 48.5820005989817 & -3.88200059898174 \tabularnewline
56 & 94 & 94.7587825696316 & -0.758782569631617 \tabularnewline
57 & 107.1 & 109.175449236298 & -2.07544923629829 \tabularnewline
58 & 104.8 & 108.975449236298 & -4.17544923629829 \tabularnewline
59 & 102.5 & 97.0087825696316 & 5.49121743036837 \tabularnewline
60 & 77.7 & 76.992115902965 & 0.707884097035043 \tabularnewline
61 & 85.2 & 84.1269392033543 & 1.07306079664569 \tabularnewline
62 & 91.3 & 91.0269392033543 & 0.273060796645696 \tabularnewline
63 & 106.5 & 102.355510631926 & 4.14448936807428 \tabularnewline
64 & 92.4 & 100.026939203354 & -7.62693920335429 \tabularnewline
65 & 97.5 & 96.9697963462114 & 0.530203653788556 \tabularnewline
66 & 107 & 107.884082060497 & -0.884082060497156 \tabularnewline
67 & 51.1 & 53.2732330038934 & -2.17323300389338 \tabularnewline
68 & 98.6 & 94.6259583707697 & 3.97404162923031 \tabularnewline
69 & 102.2 & 109.042625037436 & -6.84262503743635 \tabularnewline
70 & 114.3 & 108.842625037436 & 5.45737496256364 \tabularnewline
71 & 99.4 & 96.8759583707697 & 2.52404162923031 \tabularnewline
72 & 72.5 & 76.859291704103 & -4.35929170410303 \tabularnewline
73 & 92.3 & 83.9941150044924 & 8.30588499550762 \tabularnewline
74 & 99.4 & 90.8941150044924 & 8.50588499550764 \tabularnewline
75 & 85.9 & 102.222686433064 & -16.3226864330638 \tabularnewline
76 & 109.4 & 99.8941150044924 & 9.50588499550765 \tabularnewline
77 & 97.6 & 96.8369721473495 & 0.763027852650485 \tabularnewline
78 & 104.7 & 107.751257861635 & -3.05125786163522 \tabularnewline
79 & 56.5 & 53.1404088050314 & 3.35959119496856 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25559&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]89.6[/C][C]84.7910601976639[/C][C]4.80893980233612[/C][/ROW]
[ROW][C]2[/C][C]92.8[/C][C]91.691060197664[/C][C]1.10893980233604[/C][/ROW]
[ROW][C]3[/C][C]107.6[/C][C]103.019631626235[/C][C]4.58036837376456[/C][/ROW]
[ROW][C]4[/C][C]104.6[/C][C]100.691060197664[/C][C]3.90893980233596[/C][/ROW]
[ROW][C]5[/C][C]103[/C][C]97.633917340521[/C][C]5.36608265947891[/C][/ROW]
[ROW][C]6[/C][C]106.9[/C][C]108.548203054807[/C][C]-1.64820305480682[/C][/ROW]
[ROW][C]7[/C][C]56.3[/C][C]53.937353998203[/C][C]2.36264600179695[/C][/ROW]
[ROW][C]8[/C][C]93.4[/C][C]95.2900793650794[/C][C]-1.89007936507938[/C][/ROW]
[ROW][C]9[/C][C]109.1[/C][C]109.706746031746[/C][C]-0.606746031746072[/C][/ROW]
[ROW][C]10[/C][C]113.8[/C][C]109.506746031746[/C][C]4.29325396825394[/C][/ROW]
[ROW][C]11[/C][C]97.4[/C][C]97.5400793650794[/C][C]-0.140079365079344[/C][/ROW]
[ROW][C]12[/C][C]72.5[/C][C]77.5234126984127[/C][C]-5.02341269841268[/C][/ROW]
[ROW][C]13[/C][C]82.7[/C][C]84.658235998802[/C][C]-1.95823599880205[/C][/ROW]
[ROW][C]14[/C][C]88.9[/C][C]91.558235998802[/C][C]-2.65823599880203[/C][/ROW]
[ROW][C]15[/C][C]105.9[/C][C]102.886807427373[/C][C]3.01319257262654[/C][/ROW]
[ROW][C]16[/C][C]100.8[/C][C]100.558235998802[/C][C]0.241764001197966[/C][/ROW]
[ROW][C]17[/C][C]94[/C][C]97.5010931416592[/C][C]-3.50109314165918[/C][/ROW]
[ROW][C]18[/C][C]105[/C][C]108.415378855945[/C][C]-3.41537885594489[/C][/ROW]
[ROW][C]19[/C][C]58.5[/C][C]53.8045297993411[/C][C]4.69547020065889[/C][/ROW]
[ROW][C]20[/C][C]87.6[/C][C]95.1572551662174[/C][C]-7.55725516621743[/C][/ROW]
[ROW][C]21[/C][C]113.1[/C][C]109.573921832884[/C][C]3.52607816711591[/C][/ROW]
[ROW][C]22[/C][C]112.5[/C][C]109.373921832884[/C][C]3.12607816711591[/C][/ROW]
[ROW][C]23[/C][C]89.6[/C][C]97.4072551662174[/C][C]-7.80725516621744[/C][/ROW]
[ROW][C]24[/C][C]74.5[/C][C]77.3905884995508[/C][C]-2.89058849955076[/C][/ROW]
[ROW][C]25[/C][C]82.7[/C][C]84.5254117999401[/C][C]-1.82541179994012[/C][/ROW]
[ROW][C]26[/C][C]90.1[/C][C]91.4254117999401[/C][C]-1.32541179994011[/C][/ROW]
[ROW][C]27[/C][C]109.4[/C][C]102.753983228512[/C][C]6.64601677148848[/C][/ROW]
[ROW][C]28[/C][C]96[/C][C]100.42541179994[/C][C]-4.42541179994009[/C][/ROW]
[ROW][C]29[/C][C]89.2[/C][C]97.3682689427972[/C][C]-8.16826894279725[/C][/ROW]
[ROW][C]30[/C][C]109.1[/C][C]108.282554657083[/C][C]0.817445342917033[/C][/ROW]
[ROW][C]31[/C][C]49.1[/C][C]48.8476489967056[/C][C]0.252351003294395[/C][/ROW]
[ROW][C]32[/C][C]92.9[/C][C]90.2003743635819[/C][C]2.6996256364181[/C][/ROW]
[ROW][C]33[/C][C]107.7[/C][C]104.617041030249[/C][C]3.08295896975143[/C][/ROW]
[ROW][C]34[/C][C]103.5[/C][C]104.417041030249[/C][C]-0.917041030248573[/C][/ROW]
[ROW][C]35[/C][C]91.1[/C][C]92.4503743635819[/C][C]-1.35037436358192[/C][/ROW]
[ROW][C]36[/C][C]79.8[/C][C]72.4337076969152[/C][C]7.36629230308475[/C][/ROW]
[ROW][C]37[/C][C]71.9[/C][C]79.5685309973046[/C][C]-7.66853099730459[/C][/ROW]
[ROW][C]38[/C][C]82.9[/C][C]86.4685309973046[/C][C]-3.56853099730458[/C][/ROW]
[ROW][C]39[/C][C]90.1[/C][C]97.797102425876[/C][C]-7.69710242587601[/C][/ROW]
[ROW][C]40[/C][C]100.7[/C][C]95.4685309973046[/C][C]5.23146900269543[/C][/ROW]
[ROW][C]41[/C][C]90.7[/C][C]92.4113881401617[/C][C]-1.71138814016172[/C][/ROW]
[ROW][C]42[/C][C]108.8[/C][C]103.325673854447[/C][C]5.47432614555256[/C][/ROW]
[ROW][C]43[/C][C]44.1[/C][C]48.7148247978437[/C][C]-4.61482479784367[/C][/ROW]
[ROW][C]44[/C][C]93.6[/C][C]90.06755016472[/C][C]3.53244983528002[/C][/ROW]
[ROW][C]45[/C][C]107.4[/C][C]104.484216831387[/C][C]2.91578316861337[/C][/ROW]
[ROW][C]46[/C][C]96.5[/C][C]104.284216831387[/C][C]-7.78421683138664[/C][/ROW]
[ROW][C]47[/C][C]93.6[/C][C]92.31755016472[/C][C]1.28244983528002[/C][/ROW]
[ROW][C]48[/C][C]76.5[/C][C]72.3008834980533[/C][C]4.19911650194669[/C][/ROW]
[ROW][C]49[/C][C]76.7[/C][C]79.4357067984427[/C][C]-2.73570679844266[/C][/ROW]
[ROW][C]50[/C][C]84[/C][C]86.3357067984427[/C][C]-2.33570679844265[/C][/ROW]
[ROW][C]51[/C][C]103.3[/C][C]97.664278227014[/C][C]5.63572177298593[/C][/ROW]
[ROW][C]52[/C][C]88.5[/C][C]95.3357067984426[/C][C]-6.83570679844264[/C][/ROW]
[ROW][C]53[/C][C]99[/C][C]92.2785639412998[/C][C]6.7214360587002[/C][/ROW]
[ROW][C]54[/C][C]105.9[/C][C]103.192849655586[/C][C]2.7071503444145[/C][/ROW]
[ROW][C]55[/C][C]44.7[/C][C]48.5820005989817[/C][C]-3.88200059898174[/C][/ROW]
[ROW][C]56[/C][C]94[/C][C]94.7587825696316[/C][C]-0.758782569631617[/C][/ROW]
[ROW][C]57[/C][C]107.1[/C][C]109.175449236298[/C][C]-2.07544923629829[/C][/ROW]
[ROW][C]58[/C][C]104.8[/C][C]108.975449236298[/C][C]-4.17544923629829[/C][/ROW]
[ROW][C]59[/C][C]102.5[/C][C]97.0087825696316[/C][C]5.49121743036837[/C][/ROW]
[ROW][C]60[/C][C]77.7[/C][C]76.992115902965[/C][C]0.707884097035043[/C][/ROW]
[ROW][C]61[/C][C]85.2[/C][C]84.1269392033543[/C][C]1.07306079664569[/C][/ROW]
[ROW][C]62[/C][C]91.3[/C][C]91.0269392033543[/C][C]0.273060796645696[/C][/ROW]
[ROW][C]63[/C][C]106.5[/C][C]102.355510631926[/C][C]4.14448936807428[/C][/ROW]
[ROW][C]64[/C][C]92.4[/C][C]100.026939203354[/C][C]-7.62693920335429[/C][/ROW]
[ROW][C]65[/C][C]97.5[/C][C]96.9697963462114[/C][C]0.530203653788556[/C][/ROW]
[ROW][C]66[/C][C]107[/C][C]107.884082060497[/C][C]-0.884082060497156[/C][/ROW]
[ROW][C]67[/C][C]51.1[/C][C]53.2732330038934[/C][C]-2.17323300389338[/C][/ROW]
[ROW][C]68[/C][C]98.6[/C][C]94.6259583707697[/C][C]3.97404162923031[/C][/ROW]
[ROW][C]69[/C][C]102.2[/C][C]109.042625037436[/C][C]-6.84262503743635[/C][/ROW]
[ROW][C]70[/C][C]114.3[/C][C]108.842625037436[/C][C]5.45737496256364[/C][/ROW]
[ROW][C]71[/C][C]99.4[/C][C]96.8759583707697[/C][C]2.52404162923031[/C][/ROW]
[ROW][C]72[/C][C]72.5[/C][C]76.859291704103[/C][C]-4.35929170410303[/C][/ROW]
[ROW][C]73[/C][C]92.3[/C][C]83.9941150044924[/C][C]8.30588499550762[/C][/ROW]
[ROW][C]74[/C][C]99.4[/C][C]90.8941150044924[/C][C]8.50588499550764[/C][/ROW]
[ROW][C]75[/C][C]85.9[/C][C]102.222686433064[/C][C]-16.3226864330638[/C][/ROW]
[ROW][C]76[/C][C]109.4[/C][C]99.8941150044924[/C][C]9.50588499550765[/C][/ROW]
[ROW][C]77[/C][C]97.6[/C][C]96.8369721473495[/C][C]0.763027852650485[/C][/ROW]
[ROW][C]78[/C][C]104.7[/C][C]107.751257861635[/C][C]-3.05125786163522[/C][/ROW]
[ROW][C]79[/C][C]56.5[/C][C]53.1404088050314[/C][C]3.35959119496856[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=25559&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=25559&T=4

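Each row of this table satisfies Actuals = Interpolation + Residual (for example, row 75: 85.9 - 102.2227 is about -16.3227, the largest residual in absolute value). If the model is refitted, the same columns can be recovered from the lm object produced by the R code below; a minimal sketch assuming that fitted object is called mylm, as in the code:

interp <- fitted(mylm)     # Interpolation (Forecast) column
res    <- residuals(mylm)  # Residuals (Prediction Error) column
actual <- interp + res     # Actuals column
sum(res^2)                 # Sum Squared Residuals, approx. 1814.6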



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
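
The script below assumes that par1, par2 and par3 already exist in the R session; a minimal sketch of the assignments implied by the parameter listing above (as character strings, since the code converts par1 with as.numeric()):

par1 <- '1'                        # column of the dependent variable
par2 <- 'Include Monthly Dummies'  # seasonal dummy option
par3 <- 'Linear Trend'             # deterministic trend option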
R code (references can be found in the software module):
library(lattice)  # needed for densityplot() below
par1 <- as.numeric(par1)  # column number of the dependent variable
x <- t(y)  # transpose the supplied series so that rows are observations
k <- length(x[1,])  # number of variables (columns)
n <- length(x[,1])  # number of observations (rows)
x1 <- cbind(x[,par1], x[,1:k!=par1])  # move the dependent variable to the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
# optional preprocessing: replace every series by its first differences (1-B)
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){ # add seasonal dummies M1..M11 (month 12 is the reference category)
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){ # append a deterministic linear trend t = 1, ..., n
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
# diagnostic plots: actuals vs. interpolation, residuals, histogram, density, Q-Q, lag plot, ACF/PACF
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')  # provides the table.start/table.row.start/table.element/table.save helpers used below
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
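
The table.start/table.element/table.save helpers and the 'createtable' file are specific to the FreeStatistics.org server, so the script above will not run unchanged elsewhere. A minimal standalone sketch (assuming y and x hold the two columns of Dataseries X as plain numeric vectors; the object names are illustrative) that reproduces the coefficient table, the summary statistics, and the interpolation/residual columns with plain R:

n <- length(y)                                             # 79 observations
M <- outer(((seq_len(n) - 1) %% 12) + 1, 1:11, '==') * 1   # monthly dummies M1..M11 (month 12 = reference)
colnames(M) <- paste0('M', 1:11)
dat <- data.frame(y = y, x = x, M, t = seq_len(n))         # same design as the module: x, M1..M11, linear trend
fit <- lm(y ~ ., data = dat)
summary(fit)           # coefficients, R-squared, F-test as in the tables above
head(fitted(fit))      # Interpolation (Forecast)
head(residuals(fit))   # Residuals (Prediction Error)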