FreeStatistics.org - Multiple Regression

Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 11 Dec 2008 04:59:09 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Dec/11/t1228996821ouozkuzy675hhx2.htm/, Retrieved Fri, 17 May 2024 02:02:38 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=32175, Retrieved Fri, 17 May 2024 02:02:38 +0000

Original text written by user: In collaboration with Katrien Bourdiaudhy, Stéphanie Claes and Kevin Engels
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 222
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
F    D  [Multiple Regression] [Q1 Case: the Seat...] [2008-11-19 13:07:53] [c993f605b206b366f754f7f8c1fcc291]
-   PD    [Multiple Regression] [Q3 seatbelt case] [2008-11-30 21:48:04] [7173087adebe3e3a714c80ea2417b3eb]
-    D      [Multiple Regression] [Multiple Linear R...] [2008-12-11 11:30:21] [c993f605b206b366f754f7f8c1fcc291]
-   P           [Multiple Regression] [Multiple Regressi...] [2008-12-11 11:59:09] [70ba55c7ff8e068610dc28fc16e6d1e2] [Current]
Dataseries X (first column: response y; second column: 0/1 dummy x):
5.014	0
6.153	0
6.441	0
5.584	0
6.427	0
6.062	0
5.589	0
6.216	0
5.809	0
4.989	0
6.706	0
7.174	0
6.122	0
8.075	0
6.292	0
6.337	0
8.576	0
6.077	0
5.931	0
6.288	0
7.167	0
6.054	0
6.468	0
6.401	0
6.927	0
7.914	0
7.728	0
8.699	0
8.522	0
6.481	0
7.502	0
7.778	0
7.424	0
6.941	0
8.574	0
9.169	0
7.701	0
9.035	0
7.158	0
8.195	0
8.124	1
7.073	1
7.017	1
7.390	1
7.776	1
6.197	1
6.889	1
7.087	1
6.485	1
7.654	1
6.501	1
6.313	1
7.826	1
6.589	1
6.729	1
5.684	1
8.105	1
6.391	1
5.901	1
6.758	1
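
The two columns above are the response series and a 0/1 level-shift dummy (the seatbelt-law indicator referenced by the parent computations in the history above). As a hedged illustration only (the file name and column names are assumed, not part of the archived session), the listing could be read into R like this:

# Sketch: load the tab-separated data listing into a data frame.
# "dataseries.txt" is a hypothetical file holding the 60 rows shown above.
seatbelt <- read.table("dataseries.txt", header = FALSE,
                       col.names = c("y", "x"))
str(seatbelt)   # 60 observations: y = response, x = 0/1 dummy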




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=32175&T=0








Multiple Linear Regression - Estimated Regression Equation
y[t] = 7.31533125 + 0.006171875 x[t] - 0.866765625 M1[t] + 0.449634375 M2[t] - 0.492565625 M3[t] - 0.290965625 M4[t] + 0.5772 M5[t] - 0.8614 M6[t] - 0.7642 M7[t] - 0.6466 M8[t] - 0.0616 M9[t] - 1.2034 M10[t] - 0.4102 M11[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=32175&T=1

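The equation above corresponds to an ordinary least squares fit of the response on the x dummy plus eleven monthly dummies, with the twelfth seasonal position as the baseline and no trend term. As a hedged sketch (not the module's own code; it reuses the seatbelt data frame from the loading example and assumes the series starts at seasonal position 1), the same fit can be obtained directly with lm():

# Sketch: equivalent base-R fit; the 12th position in each block of 12 is the omitted baseline, as in the module.
seatbelt$month <- factor(rep(1:12, length.out = nrow(seatbelt)))
seatbelt$month <- relevel(seatbelt$month, ref = "12")
fit <- lm(y ~ x + month, data = seatbelt)
summary(fit)   # coefficients should match the equation above and the OLS table below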







Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter       S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)     7.31533125     0.425973    17.1732                     0                0
x               0.006171875    0.258284     0.0239                     0.981037         0.490518
M1             -0.866765625    0.586709    -1.4773                     0.146256         0.073128
M2              0.449634375    0.586709     0.7664                     0.44729          0.223645
M3             -0.492565625    0.586709    -0.8395                     0.405417         0.202708
M4             -0.290965625    0.586709    -0.4959                     0.622256         0.311128
M5              0.5772         0.584431     0.9876                     0.328392         0.164196
M6             -0.8614         0.584431    -1.4739                     0.147172         0.073586
M7             -0.7642         0.584431    -1.3076                     0.19737          0.098685
M8             -0.6466         0.584431    -1.1064                     0.274195         0.137097
M9             -0.0616         0.584431    -0.1054                     0.916506         0.458253
M10            -1.2034         0.584431    -2.0591                     0.04505          0.022525
M11            -0.4102         0.584431    -0.7019                     0.486215         0.243108

Source: https://freestatistics.org/blog/index.php?pk=32175&T=2

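The reported two-tailed p-values follow from the t-statistics with 47 residual degrees of freedom (60 observations minus 13 estimated parameters), and each one-tailed value is half of the corresponding two-tailed value. A small hedged check in R, using the M10 row of the table above:

# Sketch: recompute the M10 p-values from its t-statistic (not part of the module output).
t.stat <- -2.0591                         # T-STAT for M10 reported above
df.res <- 60 - 13                         # residual degrees of freedom
p.two  <- 2 * pt(-abs(t.stat), df.res)    # approx. 0.045  (2-tail p-value)
p.one  <- p.two / 2                       # approx. 0.0225 (1-tail p-value)
c(p.two, p.one)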







Multiple Linear Regression - Regression Statistics
Multiple R: 0.536051580390871
R-squared: 0.28735129683955
Adjusted R-squared: 0.105398436458159
F-TEST (value): 1.57926232232477
F-TEST (DF numerator): 12
F-TEST (DF denominator): 47
p-value: 0.13072387664634
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.924065766539892
Sum Squared Residuals: 40.133184421875

Source: https://freestatistics.org/blog/index.php?pk=32175&T=3

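These summary statistics are mutually consistent: the adjusted R-squared, the F statistic with its p-value, and the residual standard deviation can all be re-derived from the reported R-squared, the degrees of freedom, and the sum of squared residuals. A hedged verification sketch in R (not part of the module output):

# Sketch: re-derive the summary statistics from the figures reported above.
n  <- 60; k <- 13                                   # observations and estimated parameters
r2 <- 0.28735129683955
adj.r2 <- 1 - (1 - r2) * (n - 1) / (n - k)          # approx. 0.1054
f.stat <- (r2 / (k - 1)) / ((1 - r2) / (n - k))     # approx. 1.5793 on 12 and 47 df
p.val  <- 1 - pf(f.stat, k - 1, n - k)              # approx. 0.1307
sigma  <- sqrt(40.133184421875 / (n - k))           # approx. 0.9241 residual standard deviation
c(adj.r2, f.stat, p.val, sigma)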







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1    5.014   6.448565625   -1.43456562500001
2    6.153   7.764965625   -1.61196562500000
3    6.441   6.822765625   -0.381765625000001
4    5.584   7.024365625   -1.440365625
5    6.427   7.89253125    -1.46553125
6    6.062   6.45393125    -0.391931250000000
7    5.589   6.55113125    -0.96213125
8    6.216   6.66873125    -0.452731250000001
9    5.809   7.25373125    -1.44473125
10   4.989   6.11193125    -1.12293125000000
11   6.706   6.90513125    -0.199131250000000
12   7.174   7.31533125    -0.141331249999998
13   6.122   6.448565625   -0.326565624999999
14   8.075   7.764965625    0.310034374999999
15   6.292   6.822765625   -0.530765625
16   6.337   7.024365625   -0.687365625
17   8.576   7.89253125     0.68346875
18   6.077   6.45393125    -0.37693125
19   5.931   6.55113125    -0.62013125
20   6.288   6.66873125    -0.380731250000000
21   7.167   7.25373125    -0.0867312500000003
22   6.054   6.11193125    -0.05793125
23   6.468   6.90513125    -0.43713125
24   6.401   7.31533125    -0.91433125
25   6.927   6.448565625    0.478434375000001
26   7.914   7.764965625    0.149034374999999
27   7.728   6.822765625    0.905234375
28   8.699   7.024365625    1.674634375
29   8.522   7.89253125     0.62946875
30   6.481   6.45393125     0.0270687499999994
31   7.502   6.55113125     0.95086875
32   7.778   6.66873125     1.10926875
33   7.424   7.25373125     0.170268750000000
34   6.941   6.11193125     0.82906875
35   8.574   6.90513125     1.66886875
36   9.169   7.31533125     1.85366875
37   7.701   6.448565625    1.252434375
38   9.035   7.764965625    1.270034375
39   7.158   6.822765625    0.335234375000001
40   8.195   7.024365625    1.170634375
41   8.124   7.898703125    0.225296875000000
42   7.073   6.460103125    0.612896875
43   7.017   6.557303125    0.459696875
44   7.39    6.674903125    0.715096875
45   7.776   7.259903125    0.516096875
46   6.197   6.118103125    0.0788968749999997
47   6.889   6.911303125   -0.0223031249999998
48   7.087   7.321503125   -0.234503124999999
49   6.485   6.4547375      0.0302625000000019
50   7.654   7.7711375     -0.117137500000001
51   6.501   6.8289375     -0.327937499999999
52   6.313   7.0305375     -0.717537500
53   7.826   7.898703125   -0.0727031250000006
54   6.589   6.460103125    0.128896875
55   6.729   6.557303125    0.171696875
56   5.684   6.674903125   -0.990903125
57   8.105   7.259903125    0.845096875
58   6.391   6.118103125    0.272896875000000
59   5.901   6.911303125   -1.010303125
60   6.758   7.321503125   -0.563503124999999

Source: https://freestatistics.org/blog/index.php?pk=32175&T=4

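Each Interpolation value is the model's fitted value for that observation, and each Residual is the actual value minus the fitted value. As a hedged sketch (reusing the fit object from the earlier lm() example, not the module's own variables), the table can be regenerated as follows:

# Sketch: rebuild the actuals / interpolation / residuals table from a fitted lm object.
tab <- data.frame(Index         = seq_along(fitted(fit)),
                  Actuals       = fitted(fit) + residuals(fit),
                  Interpolation = fitted(fit),
                  Residuals     = residuals(fit))
head(tab)   # the first rows should reproduce the table above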



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ;
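
Before the code below runs, the module is expected to provide the three parameters above as R character strings and the data as a matrix y with one variable per row (the code starts by transposing it). A hedged setup sketch, with object names taken from the code and the data frame from the loading example above:

# Sketch: parameter and data setup assumed by the module code (not part of the archived session).
par1 <- '1'                           # column of the dependent variable after transposition
par2 <- 'Include Monthly Dummies'
par3 <- 'No Linear Trend'
y <- rbind(y = seatbelt$y,            # response series
           x = seatbelt$x)            # 0/1 dummy; rownames become column names after t(y)
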
R code (references can be found in the software module):
library(lattice)
par1 <- as.numeric(par1)              # column number of the dependent variable
x <- t(y)                             # transpose so rows are observations, columns are variables
k <- length(x[1,])                    # number of variables
n <- length(x[,1])                    # number of observations
x1 <- cbind(x[,par1], x[,1:k!=par1])  # put the dependent variable in the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {   # parentheses matter: 1:n-1 would evaluate to 0:(n-1)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){   # seasonal dummies M1..M11; the 12th month is the baseline
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){   # quarterly dummies Q1..Q3; the 4th quarter is the baseline
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){   # append a deterministic time trend t = 1..n
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))        # OLS fit: first column regressed on all remaining columns
(mysum <- summary(mylm))
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')