Free Statistics

Author: (the author of this computation has been verified)
R Software Module: rwasp_regression_trees1.wasp
Title produced by software: Recursive Partitioning (Regression Trees)
Date of computation: Fri, 09 Dec 2011 11:00:06 -0500
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2011/Dec/09/t1323446478q9p57prq9g4tptt.htm/, Retrieved Fri, 03 May 2024 03:36:48 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=153406, Retrieved Fri, 03 May 2024 03:36:48 +0000
Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 75
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Recursive Partitioning (Regression Trees)] [] [2010-12-05 18:59:57] [b98453cac15ba1066b407e146608df68]
- R PD [Recursive Partitioning (Regression Trees)] [Recursive Partiti...] [2011-12-09 16:00:06] [01668b8db120351c61467eadc96b2965] [Current]
Dataseries X (two tab-separated columns; the second column is the response variable, per par1 = 2 below):
12008.00	4.00
9169.00	5.90
8788.00	7.10
8417.00	10.50
8247.00	15.10
8197.00	16.80
8236.00	15.30
8253.00	18.40
7733.00	16.10
8366.00	11.30
8626.00	7.90
8863.00	5.60
10102.00	3.40
8463.00	4.80
9114.00	6.50
8563.00	8.50
8872.00	15.10
8301.00	15.70
8301.00	18.70
8278.00	19.20
7736.00	12.90
7973.00	14.40
8268.00	6.20
9476.00	3.30
11100.00	4.60
8962.00	7.10
9173.00	7.80
8738.00	9.90
8459.00	13.60
8078.00	17.10
8411.00	17.80
8291.00	18.60
7810.00	14.70
8616.00	10.50
8312.00	8.60
9692.00	4.40
9911.00	2.30
8915.00	2.80
9452.00	8.80
9112.00	10.70
8472.00	13.90
8230.00	19.30
8384.00	19.50
8625.00	20.40
8221.00	15.30
8649.00	7.90
8625.00	8.30
10443.00	4.50
10357.00	3.20
8586.00	5.00
8892.00	6.60
8329.00	11.10
8101.00	12.80
7922.00	16.30
8120.00	17.40
7838.00	18.90
7735.00	15.80
8406.00	11.70
8209.00	6.40
9451.00	2.90
10041.00	4.70
9411.00	2.40
10405.00	7.20
8467.00	10.70
8464.00	13.40
8102.00	18.30
7627.00	18.40
7513.00	16.80
7510.00	16.60
8291.00	14.10
8064.00	6.10
9383.00	3.50
9706.00	1.70
8579.00	2.30
9474.00	4.50
8318.00	9.30
8213.00	14.20
8059.00	17.30
9111.00	23.00
7708.00	16.30
7680.00	18.40
8014.00	14.20
8007.00	9.10
8718.00	5.90
9486.00	7.20
9113.00	6.80
9025.00	8.00
8476.00	14.30
7952.00	14.60
7759.00	17.50
7835.00	17.20
7600.00	17.20
7651.00	14.10
8319.00	10.40
8812.00	6.80
8630.00	4.10
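A conditional inference tree can be fitted to a two-column series like the one above. The sketch below is a minimal, self-contained illustration of the module's approach (it assumes the `party` package is installed, and uses only the first ten observations of Dataseries X; column names `X1`/`X2` are illustrative, matching what `data.frame(t(y))` would produce):

```r
library(party)  # ctree(): conditional inference trees, as used by the module

# First ten observations of Dataseries X (predictor, response)
x1 <- c(12008, 9169, 8788, 8417, 8247, 8197, 8236, 8253, 7733, 8366)
x2 <- c(4.0, 5.9, 7.1, 10.5, 15.1, 16.8, 15.3, 18.4, 16.1, 11.3)
x  <- data.frame(X1 = x1, X2 = x2)

# Regression tree with column 2 as the response (par1 = 2, par2 = 'none')
m     <- ctree(X2 ~ ., data = x)
forec <- predict(m)        # fitted value for each observation
resid <- x$X2 - forec      # residuals, as tabulated further below
```

With so few observations the tree typically finds no significant split and predicts the overall mean; the full 96-observation series is what yields the multi-node tree summarized in the tables below.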




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gertrude Mary Cox' @ cox.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=153406&T=0


Goodness of Fit
Correlation: 0.7634
R-squared: 0.5828
RMSE: 3.5787
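These statistics follow directly from the actuals and forecasts tabulated below: the correlation is `cor(forecasts, actuals)`, R-squared is its square, and RMSE is the root mean squared residual. A minimal sketch in base R (the four-row vectors are illustrative stand-ins for the full series):

```r
# Goodness-of-fit metrics, computed the same way as in the module's R code.
actuals   <- c(4.0, 5.9, 7.1, 10.5)          # illustrative values only
forecasts <- c(4.27, 8.42, 8.42, 12.06)
residuals <- actuals - forecasts

correlation <- cor(forecasts, actuals)       # 'Correlation' row
r_squared   <- correlation^2                 # 'R-squared' row
rmse        <- sqrt(mean(residuals^2))       # 'RMSE' row
```

Note that this R-squared is the squared correlation between fitted and observed values, not a variance-decomposition R-squared, so it is bounded between 0 and 1 by construction.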

Source: https://freestatistics.org/blog/index.php?pk=153406&T=1


Actuals, Predictions, and Residuals
#	Actuals	Forecasts	Residuals
1	4	4.27058823529412	-0.270588235294118
2	5.9	8.42	-2.52
3	7.1	8.42	-1.32
4	10.5	12.06	-1.56
5	15.1	15.4307692307692	-0.330769230769231
6	16.8	15.4307692307692	1.36923076923077
7	15.3	15.4307692307692	-0.13076923076923
8	18.4	15.4307692307692	2.96923076923077
9	16.1	15.4307692307692	0.66923076923077
10	11.3	12.06	-0.76
11	7.9	8.42	-0.52
12	5.6	8.42	-2.82
13	3.4	4.27058823529412	-0.870588235294118
14	4.8	12.06	-7.26
15	6.5	8.42	-1.92
16	8.5	8.42	0.0800000000000001
17	15.1	8.42	6.68
18	15.7	15.4307692307692	0.269230769230768
19	18.7	15.4307692307692	3.26923076923077
20	19.2	15.4307692307692	3.76923076923077
21	12.9	15.4307692307692	-2.53076923076923
22	14.4	15.4307692307692	-1.03076923076923
23	6.2	15.4307692307692	-9.23076923076923
24	3.3	4.27058823529412	-0.970588235294118
25	4.6	4.27058823529412	0.329411764705882
26	7.1	8.42	-1.32
27	7.8	8.42	-0.62
28	9.9	8.42	1.48
29	13.6	12.06	1.54
30	17.1	15.4307692307692	1.66923076923077
31	17.8	12.06	5.74
32	18.6	15.4307692307692	3.16923076923077
33	14.7	15.4307692307692	-0.730769230769232
34	10.5	8.42	2.08
35	8.6	12.06	-3.46
36	4.4	4.27058823529412	0.129411764705883
37	2.3	4.27058823529412	-1.97058823529412
38	2.8	8.42	-5.62
39	8.8	4.27058823529412	4.52941176470588
40	10.7	8.42	2.28
41	13.9	12.06	1.84
42	19.3	15.4307692307692	3.86923076923077
43	19.5	12.06	7.44
44	20.4	8.42	11.98
45	15.3	15.4307692307692	-0.13076923076923
46	7.9	8.42	-0.52
47	8.3	8.42	-0.119999999999999
48	4.5	4.27058823529412	0.229411764705882
49	3.2	4.27058823529412	-1.07058823529412
50	5	8.42	-3.42
51	6.6	8.42	-1.82
52	11.1	12.06	-0.960000000000001
53	12.8	15.4307692307692	-2.63076923076923
54	16.3	15.4307692307692	0.86923076923077
55	17.4	15.4307692307692	1.96923076923077
56	18.9	15.4307692307692	3.46923076923077
57	15.8	15.4307692307692	0.36923076923077
58	11.7	12.06	-0.360000000000001
59	6.4	15.4307692307692	-9.03076923076923
60	2.9	4.27058823529412	-1.37058823529412
61	4.7	4.27058823529412	0.429411764705883
62	2.4	4.27058823529412	-1.87058823529412
63	7.2	4.27058823529412	2.92941176470588
64	10.7	12.06	-1.36
65	13.4	12.06	1.34
66	18.3	15.4307692307692	2.86923076923077
67	18.4	15.4307692307692	2.96923076923077
68	16.8	15.4307692307692	1.36923076923077
69	16.6	15.4307692307692	1.16923076923077
70	14.1	15.4307692307692	-1.33076923076923
71	6.1	15.4307692307692	-9.33076923076923
72	3.5	4.27058823529412	-0.770588235294118
73	1.7	4.27058823529412	-2.57058823529412
74	2.3	8.42	-6.12
75	4.5	4.27058823529412	0.229411764705882
76	9.3	12.06	-2.76
77	14.2	15.4307692307692	-1.23076923076923
78	17.3	15.4307692307692	1.86923076923077
79	23	8.42	14.58
80	16.3	15.4307692307692	0.86923076923077
81	18.4	15.4307692307692	2.96923076923077
82	14.2	15.4307692307692	-1.23076923076923
83	9.1	15.4307692307692	-6.33076923076923
84	5.9	8.42	-2.52
85	7.2	4.27058823529412	2.92941176470588
86	6.8	8.42	-1.62
87	8	8.42	-0.42
88	14.3	12.06	2.24
89	14.6	15.4307692307692	-0.830769230769231
90	17.5	15.4307692307692	2.06923076923077
91	17.2	15.4307692307692	1.76923076923077
92	17.2	15.4307692307692	1.76923076923077
93	14.1	15.4307692307692	-1.33076923076923
94	10.4	12.06	-1.66
95	6.8	8.42	-1.62
96	4.1	8.42	-4.32


Source: https://freestatistics.org/blog/index.php?pk=153406&T=2




Parameters (Session):
par1 = 2 ; par2 = none ; par3 = 3 ; par4 = no ;
Parameters (R input):
par1 = 2 ; par2 = none ; par3 = 3 ; par4 = no ;
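Judging from the code below (these meanings are inferred from that code, not from official module documentation), par1 is the 1-based column index of the response variable, par2 is an optional discretization method for the response ('none', 'kmeans', 'quantiles', 'hclust', or 'equal'), par3 is the number of bins or clusters when discretizing, and par4 toggles the 10-fold cross-validation table. This session's settings restated:

```r
# Session parameters as consumed by the module code below.
par1 <- 2        # response = column 2 of the data frame
par2 <- 'none'   # no discretization: fit a regression tree
par3 <- 3        # number of bins/clusters (unused when par2 == 'none')
par4 <- 'no'     # skip the 10-fold cross-validation table
```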
R code (references can be found in the software module):
library(party)   # ctree(): conditional inference trees
library(Hmisc)   # cut2(): quantile-based binning
par1 <- as.numeric(par1)
par3 <- as.numeric(par3)
x <- data.frame(t(y))   # 'y' is supplied by the calling engine
is.data.frame(x)
x <- x[!is.na(x[,par1]),]   # drop rows with a missing response
k <- length(x[1,])   # number of columns
n <- length(x[,1])   # number of rows
colnames(x)[par1]
x[,par1]
# Optional discretization of the response into par3 ordered classes C1..Cpar3:
if (par2 == 'kmeans') {
cl <- kmeans(x[,par1], par3)
print(cl)
clm <- matrix(cbind(cl$centers,1:par3),ncol=2)
clm <- clm[sort.list(clm[,1]),]
for (i in 1:par3) {
cl$cluster[cl$cluster==clm[i,2]] <- paste('C',i,sep='')
}
cl$cluster <- as.factor(cl$cluster)
print(cl$cluster)
x[,par1] <- cl$cluster
}
if (par2 == 'quantiles') {
x[,par1] <- cut2(x[,par1],g=par3)
}
if (par2 == 'hclust') {
hc <- hclust(dist(x[,par1])^2, 'cen')
print(hc)
memb <- cutree(hc, k = par3)
dum <- c(mean(x[memb==1,par1]))
for (i in 2:par3) {
dum <- c(dum, mean(x[memb==i,par1]))
}
hcm <- matrix(cbind(dum,1:par3),ncol=2)
hcm <- hcm[sort.list(hcm[,1]),]
for (i in 1:par3) {
memb[memb==hcm[i,2]] <- paste('C',i,sep='')
}
memb <- as.factor(memb)
print(memb)
x[,par1] <- memb
}
if (par2=='equal') {
ed <- cut(as.numeric(x[,par1]),par3,labels=paste('C',1:par3,sep=''))
x[,par1] <- as.factor(ed)
}
table(x[,par1])
colnames(x)
colnames(x)[par1]
x[,par1]
# Fit the tree: regression if the response was left numeric,
# classification if it was discretized above.
if (par2 == 'none') {
m <- ctree(as.formula(paste(colnames(x)[par1],' ~ .',sep='')),data = x)
}
load(file='createtable')
if (par2 != 'none') {
m <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data = x)
if (par4=='yes') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'10-Fold Cross Validation',3+2*par3,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
a<-table.element(a,'Prediction (training)',par3+1,TRUE)
a<-table.element(a,'Prediction (testing)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Actual',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
a<-table.row.end(a)
for (i in 1:10) {
ind <- sample(2, nrow(x), replace=TRUE, prob=c(0.9,0.1))   # ~90/10 train/test split
m.ct <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data =x[ind==1,])
if (i==1) {
m.ct.i.pred <- predict(m.ct, newdata=x[ind==1,])
m.ct.i.actu <- x[ind==1,par1]
m.ct.x.pred <- predict(m.ct, newdata=x[ind==2,])
m.ct.x.actu <- x[ind==2,par1]
} else {
m.ct.i.pred <- c(m.ct.i.pred,predict(m.ct, newdata=x[ind==1,]))
m.ct.i.actu <- c(m.ct.i.actu,x[ind==1,par1])
m.ct.x.pred <- c(m.ct.x.pred,predict(m.ct, newdata=x[ind==2,]))
m.ct.x.actu <- c(m.ct.x.actu,x[ind==2,par1])
}
}
print(m.ct.i.tab <- table(m.ct.i.actu,m.ct.i.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.i.tab[i,i] / sum(m.ct.i.tab[i,]))
numer <- numer + m.ct.i.tab[i,i]
}
print(m.ct.i.cp <- numer / sum(m.ct.i.tab))
print(m.ct.x.tab <- table(m.ct.x.actu,m.ct.x.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.x.tab[i,i] / sum(m.ct.x.tab[i,]))
numer <- numer + m.ct.x.tab[i,i]
}
print(m.ct.x.cp <- numer / sum(m.ct.x.tab))
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (jjj in 1:par3) a<-table.element(a,m.ct.i.tab[i,jjj])
a<-table.element(a,round(m.ct.i.tab[i,i]/sum(m.ct.i.tab[i,]),4))
for (jjj in 1:par3) a<-table.element(a,m.ct.x.tab[i,jjj])
a<-table.element(a,round(m.ct.x.tab[i,i]/sum(m.ct.x.tab[i,]),4))
a<-table.row.end(a)
}
a<-table.row.start(a)
a<-table.element(a,'Overall',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.i.cp,4))
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.x.cp,4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
}
}
m
bitmap(file='test1.png')
plot(m)
dev.off()
bitmap(file='test1a.png')
plot(x[,par1] ~ as.factor(where(m)),main='Response by Terminal Node',xlab='Terminal Node',ylab='Response')
dev.off()
if (par2 == 'none') {
forec <- predict(m)   # fitted values (terminal-node means)
result <- as.data.frame(cbind(x[,par1],forec,x[,par1]-forec))
colnames(result) <- c('Actuals','Forecasts','Residuals')
print(result)
}
if (par2 != 'none') {
print(cbind(as.factor(x[,par1]),predict(m)))
myt <- table(as.factor(x[,par1]),predict(m))
print(myt)
}
bitmap(file='test2.png')
if(par2=='none') {
op <- par(mfrow=c(2,2))
plot(density(result$Actuals),main='Kernel Density Plot of Actuals')
plot(density(result$Residuals),main='Kernel Density Plot of Residuals')
plot(result$Forecasts,result$Actuals,main='Actuals versus Predictions',xlab='Predictions',ylab='Actuals')
plot(density(result$Forecasts),main='Kernel Density Plot of Predictions')
par(op)
}
if(par2!='none') {
plot(myt,main='Confusion Matrix',xlab='Actual',ylab='Predicted')
}
dev.off()
if (par2 == 'none') {
detcoef <- cor(result$Forecasts,result$Actuals)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goodness of Fit',2,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Correlation',1,TRUE)
a<-table.element(a,round(detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'R-squared',1,TRUE)
a<-table.element(a,round(detcoef*detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'RMSE',1,TRUE)
a<-table.element(a,round(sqrt(mean((result$Residuals)^2)),4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Actuals, Predictions, and Residuals',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'#',header=TRUE)
a<-table.element(a,'Actuals',header=TRUE)
a<-table.element(a,'Forecasts',header=TRUE)
a<-table.element(a,'Residuals',header=TRUE)
a<-table.row.end(a)
for (i in 1:length(result$Actuals)) {
a<-table.row.start(a)
a<-table.element(a,i,header=TRUE)
a<-table.element(a,result$Actuals[i])
a<-table.element(a,result$Forecasts[i])
a<-table.element(a,result$Residuals[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
}
if (par2 != 'none') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Confusion Matrix (predicted in columns / actuals in rows)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
for (i in 1:par3) {
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
}
a<-table.row.end(a)
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (j in 1:par3) {
a<-table.element(a,myt[i,j])
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
}