Free Statistics


Author: *The author of this computation has been verified*
R Software Module: rwasp_regression_trees1.wasp
Title produced by software: Recursive Partitioning (Regression Trees)
Date of computation: Fri, 09 Dec 2011 05:15:45 -0500
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2011/Dec/09/t1323425883okpo12u1v4hf75d.htm/, Retrieved Thu, 02 May 2024 18:55:03 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=153228, Retrieved Thu, 02 May 2024 18:55:03 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 121
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
-       [Recursive Partitioning (Regression Trees)] [] [2011-12-09 10:15:45] [13d85cac30d4a10947636c080219d4f4] [Current]
Feedback Forum

Dataseries X:
57.59	33306600	23.42	2120.30	0.0435
67.82	23898100	25.30	2232.82	0.0346
71.89	23279600	23.90	2205.32	0.0342
75.51	40699800	25.73	2305.82	0.0399
68.49	37646000	24.64	2281.39	0.036
62.72	37277000	24.95	2339.79	0.0336
70.39	39246800	22.15	2322.57	0.0355
59.77	27418400	20.85	2178.88	0.0417
57.27	30318700	21.45	2172.09	0.0432
67.96	32808100	22.15	2091.47	0.0415
67.85	28668200	23.75	2183.75	0.0382
76.98	32370300	25.27	2258.43	0.0206
81.08	24171100	26.53	2366.71	0.0131
91.66	25009100	27.22	2431.77	0.0197
84.84	32084300	27.69	2415.29	0.0254
85.73	50117500	28.61	2463.93	0.0208
84.61	27522200	26.21	2416.15	0.0242
92.91	26816800	25.93	2421.64	0.0278
99.80	25136100	27.86	2525.09	0.0257
121.19	30295600	28.65	2604.52	0.0269
122.04	41526100	27.51	2603.23	0.0269
131.76	43845100	27.06	2546.27	0.0236
138.48	39188900	26.91	2596.36	0.0197
153.47	40496400	27.60	2701.50	0.0276
189.95	37438400	34.48	2859.12	0.0354
182.22	46553700	31.58	2660.96	0.0431
198.08	31771400	33.46	2652.28	0.0408
135.36	62108100	30.64	2389.86	0.0428
125.02	46645400	25.66	2271.48	0.0403
143.50	42313100	26.78	2279.10	0.0398
173.95	38841700	26.91	2412.80	0.0394
188.75	32650300	26.82	2522.66	0.0418
167.44	34281100	26.05	2292.98	0.0502
158.95	33096200	24.36	2325.55	0.056
169.53	23273800	25.94	2367.52	0.0537
113.66	43697600	25.37	2091.88	0.0494
107.59	66902300	21.23	1720.95	0.0366
92.67	44957200	19.35	1535.57	0.0107
85.35	33800900	18.61	1577.03	0.0009
90.13	33487900	16.37	1476.42	0.0003
89.31	27394900	15.56	1377.84	0.0024
105.12	25963400	17.70	1528.59	-0.0038
125.83	20952600	19.52	1717.30	-0.0074
135.81	17702900	20.26	1774.33	-0.0128
142.43	21282100	23.05	1835.04	-0.0143
163.39	18449100	22.81	1978.50	-0.021
168.21	14415700	24.04	2009.06	-0.0148
185.35	17906300	25.08	2122.42	-0.0129
188.50	22197500	27.04	2045.11	-0.0018
199.91	15856500	28.81	2144.60	0.0184
210.73	19068700	29.86	2269.15	0.0272
192.06	30855100	27.61	2147.35	0.0263
204.62	21209000	28.22	2238.26	0.0214
235.00	19541600	28.83	2397.96	0.0231
261.09	21955000	30.06	2461.19	0.0224
256.88	33725900	25.51	2257.04	0.0202
251.53	28192800	22.75	2109.24	0.0105
257.25	27377000	25.52	2254.70	0.0124
243.10	16228100	23.33	2114.03	0.0115
283.75	21278900	24.34	2368.62	0.0114
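The series above has 60 rows of five tab-separated columns; the module is run with par1 = 1, so the first column is the response the regression tree predicts and the remaining four columns are the candidate predictors (the keyword field is empty, so the columns are only known positionally). A minimal parsing sketch, using the first three rows:

```python
# Parse a few rows of the tab-separated series; column 1 is the
# response variable that the regression tree predicts (par1 = 1).
rows = "57.59\t33306600\t23.42\t2120.30\t0.0435\n" \
       "67.82\t23898100\t25.30\t2232.82\t0.0346\n" \
       "71.89\t23279600\t23.90\t2205.32\t0.0342"

data = [[float(v) for v in line.split("\t")] for line in rows.splitlines()]
response = [r[0] for r in data]      # target series (column 1)
predictors = [r[1:] for r in data]   # the remaining four columns
print(response)  # [57.59, 67.82, 71.89]
```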




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gertrude Mary Cox' @ cox.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gertrude Mary Cox' @ cox.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=153228&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gertrude Mary Cox' @ cox.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=153228&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=153228&T=0


Goodness of Fit
Correlation: 0.4018
R-squared: 0.1614
RMSE: 56.3054
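These three statistics are computed from the fitted values: the correlation is between forecasts and actuals, R-squared is simply that correlation squared (0.4018² ≈ 0.1614, matching the module's `detcoef*detcoef` line), and RMSE is the root mean squared residual. A sketch of the same computation in Python, on toy numbers (not the series above):

```python
import math

def goodness_of_fit(actuals, forecasts):
    """Correlation, R-squared (= correlation^2) and RMSE, mirroring the
    module's detcoef and sqrt(mean(residuals^2)) lines."""
    n = len(actuals)
    ma, mf = sum(actuals) / n, sum(forecasts) / n
    cov = sum((a - ma) * (f - mf) for a, f in zip(actuals, forecasts))
    va = sum((a - ma) ** 2 for a in actuals)
    vf = sum((f - mf) ** 2 for f in forecasts)
    corr = cov / math.sqrt(va * vf)
    resid = [a - f for a, f in zip(actuals, forecasts)]
    rmse = math.sqrt(sum(r * r for r in resid) / n)
    return corr, corr ** 2, rmse

# Toy data, just to show the relationship R-squared = correlation^2.
corr, r2, rmse = goodness_of_fit([1.0, 2.0, 4.0, 5.0], [1.5, 2.5, 3.5, 4.5])
```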

\begin{tabular}{lllllllll}
\hline
Goodness of Fit \tabularnewline
Correlation & 0.4018 \tabularnewline
R-squared & 0.1614 \tabularnewline
RMSE & 56.3054 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=153228&T=1

[TABLE]
[ROW][C]Goodness of Fit[/C][/ROW]
[ROW][C]Correlation[/C][C]0.4018[/C][/ROW]
[ROW][C]R-squared[/C][C]0.1614[/C][/ROW]
[ROW][C]RMSE[/C][C]56.3054[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=153228&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=153228&T=1


Actuals, Predictions, and Residuals
#	Actuals	Forecasts	Residuals
1	57.59	128.874807692308	-71.2848076923077
2	67.82	128.874807692308	-61.0548076923077
3	71.89	128.874807692308	-56.9848076923077
4	75.51	128.874807692308	-53.3648076923077
5	68.49	128.874807692308	-60.3848076923077
6	62.72	128.874807692308	-66.1548076923077
7	70.39	128.874807692308	-58.4848076923077
8	59.77	128.874807692308	-69.1048076923077
9	57.27	128.874807692308	-71.6048076923077
10	67.96	128.874807692308	-60.9148076923077
11	67.85	128.874807692308	-61.0248076923077
12	76.98	128.874807692308	-51.8948076923077
13	81.08	128.874807692308	-47.7948076923077
14	91.66	128.874807692308	-37.2148076923077
15	84.84	128.874807692308	-44.0348076923077
16	85.73	128.874807692308	-43.1448076923077
17	84.61	128.874807692308	-44.2648076923077
18	92.91	128.874807692308	-35.9648076923077
19	99.8	128.874807692308	-29.0748076923077
20	121.19	128.874807692308	-7.68480769230769
21	122.04	128.874807692308	-6.83480769230768
22	131.76	128.874807692308	2.88519230769231
23	138.48	128.874807692308	9.60519230769231
24	153.47	128.874807692308	24.5951923076923
25	189.95	201.5425	-11.5925
26	182.22	201.5425	-19.3225
27	198.08	201.5425	-3.46249999999998
28	135.36	201.5425	-66.1825
29	125.02	128.874807692308	-3.85480769230769
30	143.5	128.874807692308	14.6251923076923
31	173.95	128.874807692308	45.0751923076923
32	188.75	128.874807692308	59.8751923076923
33	167.44	128.874807692308	38.5651923076923
34	158.95	128.874807692308	30.0751923076923
35	169.53	128.874807692308	40.6551923076923
36	113.66	128.874807692308	-15.2148076923077
37	107.59	128.874807692308	-21.2848076923077
38	92.67	128.874807692308	-36.2048076923077
39	85.35	128.874807692308	-43.5248076923077
40	90.13	128.874807692308	-38.7448076923077
41	89.31	128.874807692308	-39.5648076923077
42	105.12	128.874807692308	-23.7548076923077
43	125.83	128.874807692308	-3.04480769230769
44	135.81	128.874807692308	6.93519230769232
45	142.43	128.874807692308	13.5551923076923
46	163.39	128.874807692308	34.5151923076923
47	168.21	128.874807692308	39.3351923076923
48	185.35	128.874807692308	56.4751923076923
49	188.5	128.874807692308	59.6251923076923
50	199.91	201.5425	-1.63249999999999
51	210.73	201.5425	9.1875
52	192.06	128.874807692308	63.1851923076923
53	204.62	128.874807692308	75.7451923076923
54	235	201.5425	33.4575
55	261.09	201.5425	59.5475
56	256.88	128.874807692308	128.005192307692
57	251.53	128.874807692308	122.655192307692
58	257.25	128.874807692308	128.375192307692
59	243.1	128.874807692308	114.225192307692
60	283.75	128.874807692308	154.875192307692
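The Forecasts column takes only two values (128.874807692308 for 52 rows, 201.5425 for the remaining 8), because a regression tree predicts every observation with the mean response of its terminal node. This can be checked directly from the table: 201.5425 is the mean of the eight actuals routed to the smaller node (rows 25-28, 50-51, 54-55).

```python
# The eight actuals that the tree routes to its second terminal node
# (rows 25-28, 50-51 and 54-55 of the table above).
node2 = [189.95, 182.22, 198.08, 135.36, 199.91, 210.73, 235.00, 261.09]

# A regression tree forecasts each observation with the mean response
# of its terminal node, so the fitted value for all eight rows is:
prediction = sum(node2) / len(node2)
print(round(prediction, 4))  # 201.5425 — matches the Forecasts column
```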

\begin{tabular}{lllllllll}
\hline
Actuals, Predictions, and Residuals \tabularnewline
# & Actuals & Forecasts & Residuals \tabularnewline
1 & 57.59 & 128.874807692308 & -71.2848076923077 \tabularnewline
2 & 67.82 & 128.874807692308 & -61.0548076923077 \tabularnewline
3 & 71.89 & 128.874807692308 & -56.9848076923077 \tabularnewline
4 & 75.51 & 128.874807692308 & -53.3648076923077 \tabularnewline
5 & 68.49 & 128.874807692308 & -60.3848076923077 \tabularnewline
6 & 62.72 & 128.874807692308 & -66.1548076923077 \tabularnewline
7 & 70.39 & 128.874807692308 & -58.4848076923077 \tabularnewline
8 & 59.77 & 128.874807692308 & -69.1048076923077 \tabularnewline
9 & 57.27 & 128.874807692308 & -71.6048076923077 \tabularnewline
10 & 67.96 & 128.874807692308 & -60.9148076923077 \tabularnewline
11 & 67.85 & 128.874807692308 & -61.0248076923077 \tabularnewline
12 & 76.98 & 128.874807692308 & -51.8948076923077 \tabularnewline
13 & 81.08 & 128.874807692308 & -47.7948076923077 \tabularnewline
14 & 91.66 & 128.874807692308 & -37.2148076923077 \tabularnewline
15 & 84.84 & 128.874807692308 & -44.0348076923077 \tabularnewline
16 & 85.73 & 128.874807692308 & -43.1448076923077 \tabularnewline
17 & 84.61 & 128.874807692308 & -44.2648076923077 \tabularnewline
18 & 92.91 & 128.874807692308 & -35.9648076923077 \tabularnewline
19 & 99.8 & 128.874807692308 & -29.0748076923077 \tabularnewline
20 & 121.19 & 128.874807692308 & -7.68480769230769 \tabularnewline
21 & 122.04 & 128.874807692308 & -6.83480769230768 \tabularnewline
22 & 131.76 & 128.874807692308 & 2.88519230769231 \tabularnewline
23 & 138.48 & 128.874807692308 & 9.60519230769231 \tabularnewline
24 & 153.47 & 128.874807692308 & 24.5951923076923 \tabularnewline
25 & 189.95 & 201.5425 & -11.5925 \tabularnewline
26 & 182.22 & 201.5425 & -19.3225 \tabularnewline
27 & 198.08 & 201.5425 & -3.46249999999998 \tabularnewline
28 & 135.36 & 201.5425 & -66.1825 \tabularnewline
29 & 125.02 & 128.874807692308 & -3.85480769230769 \tabularnewline
30 & 143.5 & 128.874807692308 & 14.6251923076923 \tabularnewline
31 & 173.95 & 128.874807692308 & 45.0751923076923 \tabularnewline
32 & 188.75 & 128.874807692308 & 59.8751923076923 \tabularnewline
33 & 167.44 & 128.874807692308 & 38.5651923076923 \tabularnewline
34 & 158.95 & 128.874807692308 & 30.0751923076923 \tabularnewline
35 & 169.53 & 128.874807692308 & 40.6551923076923 \tabularnewline
36 & 113.66 & 128.874807692308 & -15.2148076923077 \tabularnewline
37 & 107.59 & 128.874807692308 & -21.2848076923077 \tabularnewline
38 & 92.67 & 128.874807692308 & -36.2048076923077 \tabularnewline
39 & 85.35 & 128.874807692308 & -43.5248076923077 \tabularnewline
40 & 90.13 & 128.874807692308 & -38.7448076923077 \tabularnewline
41 & 89.31 & 128.874807692308 & -39.5648076923077 \tabularnewline
42 & 105.12 & 128.874807692308 & -23.7548076923077 \tabularnewline
43 & 125.83 & 128.874807692308 & -3.04480769230769 \tabularnewline
44 & 135.81 & 128.874807692308 & 6.93519230769232 \tabularnewline
45 & 142.43 & 128.874807692308 & 13.5551923076923 \tabularnewline
46 & 163.39 & 128.874807692308 & 34.5151923076923 \tabularnewline
47 & 168.21 & 128.874807692308 & 39.3351923076923 \tabularnewline
48 & 185.35 & 128.874807692308 & 56.4751923076923 \tabularnewline
49 & 188.5 & 128.874807692308 & 59.6251923076923 \tabularnewline
50 & 199.91 & 201.5425 & -1.63249999999999 \tabularnewline
51 & 210.73 & 201.5425 & 9.1875 \tabularnewline
52 & 192.06 & 128.874807692308 & 63.1851923076923 \tabularnewline
53 & 204.62 & 128.874807692308 & 75.7451923076923 \tabularnewline
54 & 235 & 201.5425 & 33.4575 \tabularnewline
55 & 261.09 & 201.5425 & 59.5475 \tabularnewline
56 & 256.88 & 128.874807692308 & 128.005192307692 \tabularnewline
57 & 251.53 & 128.874807692308 & 122.655192307692 \tabularnewline
58 & 257.25 & 128.874807692308 & 128.375192307692 \tabularnewline
59 & 243.1 & 128.874807692308 & 114.225192307692 \tabularnewline
60 & 283.75 & 128.874807692308 & 154.875192307692 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=153228&T=2

[TABLE]
[ROW][C]Actuals, Predictions, and Residuals[/C][/ROW]
[ROW][C]#[/C][C]Actuals[/C][C]Forecasts[/C][C]Residuals[/C][/ROW]
[ROW][C]1[/C][C]57.59[/C][C]128.874807692308[/C][C]-71.2848076923077[/C][/ROW]
[ROW][C]2[/C][C]67.82[/C][C]128.874807692308[/C][C]-61.0548076923077[/C][/ROW]
[ROW][C]3[/C][C]71.89[/C][C]128.874807692308[/C][C]-56.9848076923077[/C][/ROW]
[ROW][C]4[/C][C]75.51[/C][C]128.874807692308[/C][C]-53.3648076923077[/C][/ROW]
[ROW][C]5[/C][C]68.49[/C][C]128.874807692308[/C][C]-60.3848076923077[/C][/ROW]
[ROW][C]6[/C][C]62.72[/C][C]128.874807692308[/C][C]-66.1548076923077[/C][/ROW]
[ROW][C]7[/C][C]70.39[/C][C]128.874807692308[/C][C]-58.4848076923077[/C][/ROW]
[ROW][C]8[/C][C]59.77[/C][C]128.874807692308[/C][C]-69.1048076923077[/C][/ROW]
[ROW][C]9[/C][C]57.27[/C][C]128.874807692308[/C][C]-71.6048076923077[/C][/ROW]
[ROW][C]10[/C][C]67.96[/C][C]128.874807692308[/C][C]-60.9148076923077[/C][/ROW]
[ROW][C]11[/C][C]67.85[/C][C]128.874807692308[/C][C]-61.0248076923077[/C][/ROW]
[ROW][C]12[/C][C]76.98[/C][C]128.874807692308[/C][C]-51.8948076923077[/C][/ROW]
[ROW][C]13[/C][C]81.08[/C][C]128.874807692308[/C][C]-47.7948076923077[/C][/ROW]
[ROW][C]14[/C][C]91.66[/C][C]128.874807692308[/C][C]-37.2148076923077[/C][/ROW]
[ROW][C]15[/C][C]84.84[/C][C]128.874807692308[/C][C]-44.0348076923077[/C][/ROW]
[ROW][C]16[/C][C]85.73[/C][C]128.874807692308[/C][C]-43.1448076923077[/C][/ROW]
[ROW][C]17[/C][C]84.61[/C][C]128.874807692308[/C][C]-44.2648076923077[/C][/ROW]
[ROW][C]18[/C][C]92.91[/C][C]128.874807692308[/C][C]-35.9648076923077[/C][/ROW]
[ROW][C]19[/C][C]99.8[/C][C]128.874807692308[/C][C]-29.0748076923077[/C][/ROW]
[ROW][C]20[/C][C]121.19[/C][C]128.874807692308[/C][C]-7.68480769230769[/C][/ROW]
[ROW][C]21[/C][C]122.04[/C][C]128.874807692308[/C][C]-6.83480769230768[/C][/ROW]
[ROW][C]22[/C][C]131.76[/C][C]128.874807692308[/C][C]2.88519230769231[/C][/ROW]
[ROW][C]23[/C][C]138.48[/C][C]128.874807692308[/C][C]9.60519230769231[/C][/ROW]
[ROW][C]24[/C][C]153.47[/C][C]128.874807692308[/C][C]24.5951923076923[/C][/ROW]
[ROW][C]25[/C][C]189.95[/C][C]201.5425[/C][C]-11.5925[/C][/ROW]
[ROW][C]26[/C][C]182.22[/C][C]201.5425[/C][C]-19.3225[/C][/ROW]
[ROW][C]27[/C][C]198.08[/C][C]201.5425[/C][C]-3.46249999999998[/C][/ROW]
[ROW][C]28[/C][C]135.36[/C][C]201.5425[/C][C]-66.1825[/C][/ROW]
[ROW][C]29[/C][C]125.02[/C][C]128.874807692308[/C][C]-3.85480769230769[/C][/ROW]
[ROW][C]30[/C][C]143.5[/C][C]128.874807692308[/C][C]14.6251923076923[/C][/ROW]
[ROW][C]31[/C][C]173.95[/C][C]128.874807692308[/C][C]45.0751923076923[/C][/ROW]
[ROW][C]32[/C][C]188.75[/C][C]128.874807692308[/C][C]59.8751923076923[/C][/ROW]
[ROW][C]33[/C][C]167.44[/C][C]128.874807692308[/C][C]38.5651923076923[/C][/ROW]
[ROW][C]34[/C][C]158.95[/C][C]128.874807692308[/C][C]30.0751923076923[/C][/ROW]
[ROW][C]35[/C][C]169.53[/C][C]128.874807692308[/C][C]40.6551923076923[/C][/ROW]
[ROW][C]36[/C][C]113.66[/C][C]128.874807692308[/C][C]-15.2148076923077[/C][/ROW]
[ROW][C]37[/C][C]107.59[/C][C]128.874807692308[/C][C]-21.2848076923077[/C][/ROW]
[ROW][C]38[/C][C]92.67[/C][C]128.874807692308[/C][C]-36.2048076923077[/C][/ROW]
[ROW][C]39[/C][C]85.35[/C][C]128.874807692308[/C][C]-43.5248076923077[/C][/ROW]
[ROW][C]40[/C][C]90.13[/C][C]128.874807692308[/C][C]-38.7448076923077[/C][/ROW]
[ROW][C]41[/C][C]89.31[/C][C]128.874807692308[/C][C]-39.5648076923077[/C][/ROW]
[ROW][C]42[/C][C]105.12[/C][C]128.874807692308[/C][C]-23.7548076923077[/C][/ROW]
[ROW][C]43[/C][C]125.83[/C][C]128.874807692308[/C][C]-3.04480769230769[/C][/ROW]
[ROW][C]44[/C][C]135.81[/C][C]128.874807692308[/C][C]6.93519230769232[/C][/ROW]
[ROW][C]45[/C][C]142.43[/C][C]128.874807692308[/C][C]13.5551923076923[/C][/ROW]
[ROW][C]46[/C][C]163.39[/C][C]128.874807692308[/C][C]34.5151923076923[/C][/ROW]
[ROW][C]47[/C][C]168.21[/C][C]128.874807692308[/C][C]39.3351923076923[/C][/ROW]
[ROW][C]48[/C][C]185.35[/C][C]128.874807692308[/C][C]56.4751923076923[/C][/ROW]
[ROW][C]49[/C][C]188.5[/C][C]128.874807692308[/C][C]59.6251923076923[/C][/ROW]
[ROW][C]50[/C][C]199.91[/C][C]201.5425[/C][C]-1.63249999999999[/C][/ROW]
[ROW][C]51[/C][C]210.73[/C][C]201.5425[/C][C]9.1875[/C][/ROW]
[ROW][C]52[/C][C]192.06[/C][C]128.874807692308[/C][C]63.1851923076923[/C][/ROW]
[ROW][C]53[/C][C]204.62[/C][C]128.874807692308[/C][C]75.7451923076923[/C][/ROW]
[ROW][C]54[/C][C]235[/C][C]201.5425[/C][C]33.4575[/C][/ROW]
[ROW][C]55[/C][C]261.09[/C][C]201.5425[/C][C]59.5475[/C][/ROW]
[ROW][C]56[/C][C]256.88[/C][C]128.874807692308[/C][C]128.005192307692[/C][/ROW]
[ROW][C]57[/C][C]251.53[/C][C]128.874807692308[/C][C]122.655192307692[/C][/ROW]
[ROW][C]58[/C][C]257.25[/C][C]128.874807692308[/C][C]128.375192307692[/C][/ROW]
[ROW][C]59[/C][C]243.1[/C][C]128.874807692308[/C][C]114.225192307692[/C][/ROW]
[ROW][C]60[/C][C]283.75[/C][C]128.874807692308[/C][C]154.875192307692[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=153228&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=153228&T=2




Parameters (Session):
par1 = 1 ; par2 = none ; par3 = 5 ; par4 = no ;
Parameters (R input):
par1 = 1 ; par2 = none ; par3 = 5 ; par4 = no ;
R code (references can be found in the software module):
library(party)
library(Hmisc)
par1 <- as.numeric(par1)
par3 <- as.numeric(par3)
x <- data.frame(t(y))
is.data.frame(x)
x <- x[!is.na(x[,par1]),]
k <- length(x[1,])
n <- length(x[,1])
colnames(x)[par1]
x[,par1]
if (par2 == 'kmeans') {
cl <- kmeans(x[,par1], par3)
print(cl)
clm <- matrix(cbind(cl$centers,1:par3),ncol=2)
clm <- clm[sort.list(clm[,1]),]
for (i in 1:par3) {
cl$cluster[cl$cluster==clm[i,2]] <- paste('C',i,sep='')
}
cl$cluster <- as.factor(cl$cluster)
print(cl$cluster)
x[,par1] <- cl$cluster
}
if (par2 == 'quantiles') {
x[,par1] <- cut2(x[,par1],g=par3)
}
if (par2 == 'hclust') {
hc <- hclust(dist(x[,par1])^2, 'cen')
print(hc)
memb <- cutree(hc, k = par3)
dum <- c(mean(x[memb==1,par1]))
for (i in 2:par3) {
dum <- c(dum, mean(x[memb==i,par1]))
}
hcm <- matrix(cbind(dum,1:par3),ncol=2)
hcm <- hcm[sort.list(hcm[,1]),]
for (i in 1:par3) {
memb[memb==hcm[i,2]] <- paste('C',i,sep='')
}
memb <- as.factor(memb)
print(memb)
x[,par1] <- memb
}
if (par2=='equal') {
ed <- cut(as.numeric(x[,par1]),par3,labels=paste('C',1:par3,sep=''))
x[,par1] <- as.factor(ed)
}
table(x[,par1])
colnames(x)
colnames(x)[par1]
x[,par1]
if (par2 == 'none') {
m <- ctree(as.formula(paste(colnames(x)[par1],' ~ .',sep='')),data = x)
}
load(file='createtable')
if (par2 != 'none') {
m <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data = x)
if (par4=='yes') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'10-Fold Cross Validation',3+2*par3,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
a<-table.element(a,'Prediction (training)',par3+1,TRUE)
a<-table.element(a,'Prediction (testing)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Actual',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
a<-table.row.end(a)
for (i in 1:10) {
ind <- sample(2, nrow(x), replace=T, prob=c(0.9,0.1))
m.ct <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data =x[ind==1,])
if (i==1) {
m.ct.i.pred <- predict(m.ct, newdata=x[ind==1,])
m.ct.i.actu <- x[ind==1,par1]
m.ct.x.pred <- predict(m.ct, newdata=x[ind==2,])
m.ct.x.actu <- x[ind==2,par1]
} else {
m.ct.i.pred <- c(m.ct.i.pred,predict(m.ct, newdata=x[ind==1,]))
m.ct.i.actu <- c(m.ct.i.actu,x[ind==1,par1])
m.ct.x.pred <- c(m.ct.x.pred,predict(m.ct, newdata=x[ind==2,]))
m.ct.x.actu <- c(m.ct.x.actu,x[ind==2,par1])
}
}
print(m.ct.i.tab <- table(m.ct.i.actu,m.ct.i.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.i.tab[i,i] / sum(m.ct.i.tab[i,]))
numer <- numer + m.ct.i.tab[i,i]
}
print(m.ct.i.cp <- numer / sum(m.ct.i.tab))
print(m.ct.x.tab <- table(m.ct.x.actu,m.ct.x.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.x.tab[i,i] / sum(m.ct.x.tab[i,]))
numer <- numer + m.ct.x.tab[i,i]
}
print(m.ct.x.cp <- numer / sum(m.ct.x.tab))
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (jjj in 1:par3) a<-table.element(a,m.ct.i.tab[i,jjj])
a<-table.element(a,round(m.ct.i.tab[i,i]/sum(m.ct.i.tab[i,]),4))
for (jjj in 1:par3) a<-table.element(a,m.ct.x.tab[i,jjj])
a<-table.element(a,round(m.ct.x.tab[i,i]/sum(m.ct.x.tab[i,]),4))
a<-table.row.end(a)
}
a<-table.row.start(a)
a<-table.element(a,'Overall',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.i.cp,4))
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.x.cp,4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
}
}
m
bitmap(file='test1.png')
plot(m)
dev.off()
bitmap(file='test1a.png')
plot(x[,par1] ~ as.factor(where(m)),main='Response by Terminal Node',xlab='Terminal Node',ylab='Response')
dev.off()
if (par2 == 'none') {
forec <- predict(m)
result <- as.data.frame(cbind(x[,par1],forec,x[,par1]-forec))
colnames(result) <- c('Actuals','Forecasts','Residuals')
print(result)
}
if (par2 != 'none') {
print(cbind(as.factor(x[,par1]),predict(m)))
myt <- table(as.factor(x[,par1]),predict(m))
print(myt)
}
bitmap(file='test2.png')
if(par2=='none') {
op <- par(mfrow=c(2,2))
plot(density(result$Actuals),main='Kernel Density Plot of Actuals')
plot(density(result$Residuals),main='Kernel Density Plot of Residuals')
plot(result$Forecasts,result$Actuals,main='Actuals versus Predictions',xlab='Predictions',ylab='Actuals')
plot(density(result$Forecasts),main='Kernel Density Plot of Predictions')
par(op)
}
if(par2!='none') {
plot(myt,main='Confusion Matrix',xlab='Actual',ylab='Predicted')
}
dev.off()
if (par2 == 'none') {
detcoef <- cor(result$Forecasts,result$Actuals)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goodness of Fit',2,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Correlation',1,TRUE)
a<-table.element(a,round(detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'R-squared',1,TRUE)
a<-table.element(a,round(detcoef*detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'RMSE',1,TRUE)
a<-table.element(a,round(sqrt(mean((result$Residuals)^2)),4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Actuals, Predictions, and Residuals',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'#',header=TRUE)
a<-table.element(a,'Actuals',header=TRUE)
a<-table.element(a,'Forecasts',header=TRUE)
a<-table.element(a,'Residuals',header=TRUE)
a<-table.row.end(a)
for (i in 1:length(result$Actuals)) {
a<-table.row.start(a)
a<-table.element(a,i,header=TRUE)
a<-table.element(a,result$Actuals[i])
a<-table.element(a,result$Forecasts[i])
a<-table.element(a,result$Residuals[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
}
if (par2 != 'none') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Confusion Matrix (predicted in columns / actuals in rows)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
for (i in 1:par3) {
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
}
a<-table.row.end(a)
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (j in 1:par3) {
a<-table.element(a,myt[i,j])
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
}
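When par4 = 'yes' and the response is discretized, the code above estimates out-of-sample accuracy by repeating, ten times, a random 90/10 split (`ind <- sample(2, nrow(x), replace=T, prob=c(0.9,0.1))`), refitting the tree on the 90% part, and pooling predictions over all rounds. A Python sketch of just that splitting-and-pooling skeleton (the `fit_mean` stand-in is hypothetical; the module refits `ctree()` on each split):

```python
import random

random.seed(1)  # reproducible illustration

def fit_mean(train):
    """Stand-in for the tree fit: predict the training mean
    (hypothetical; the R module refits ctree() on each split)."""
    m = sum(train) / len(train)
    return lambda xs: [m] * len(xs)

data = [float(i) for i in range(100)]
pooled_pred, pooled_actual = [], []
for _ in range(10):
    # Mirror R's sample(2, n, replace=TRUE, prob=c(0.9, 0.1)):
    # each row independently lands in the test set with probability 0.1.
    is_test = [random.random() < 0.1 for _ in data]
    train = [v for v, t in zip(data, is_test) if not t]
    test = [v for v, t in zip(data, is_test) if t]
    model = fit_mean(train)
    pooled_pred += model(test)    # accumulate out-of-sample predictions
    pooled_actual += test         # and the matching actuals
```

The pooled actual/predicted pairs are then cross-tabulated, which is what the code's `m.ct.x.tab` confusion table and `m.ct.x.cp` overall proportion summarize.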