Author: *The author of this computation has been verified*
R Software Module: rwasp_regression_trees1.wasp
Title produced by software: Recursive Partitioning (Regression Trees)
Date of computation: Fri, 09 Dec 2011 13:50:36 -0500
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2011/Dec/09/t13234566603xrki9bt2smfunn.htm/, Retrieved Thu, 02 May 2024 17:42:40 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=153433, Retrieved Thu, 02 May 2024 17:42:40 +0000
Original text written by user: (none)
Is Private? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 102
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Recursive Partitioning (Regression Trees)] [] [2010-12-05 18:59:57] [b98453cac15ba1066b407e146608df68]
- R PD [Recursive Partitioning (Regression Trees)] [Workshop 10 Recur...] [2011-12-09 18:45:00] [de8512d9b386046939a89973b76869e3]
- [Recursive Partitioning (Regression Trees)] [Workshop 10 Recur...] [2011-12-09 18:50:36] [5c44e6aad476a1bab98fc6774eca4c08] [Current]
- MP [Recursive Partitioning (Regression Trees)] [Paper SHW Recursi...] [2011-12-16 14:39:29] [74be16979710d4c4e7c6647856088456]
Dataseries X:
391	12	1	0	1168	5841	12
893	22	23	67	22618	23824	10
872	37	32	48	34777	14336	14
1138	25	24	66	94785	61023	9
874	28	23	69	192565	153197	8
1281	83	30	93	140867	68370	16
865	33	14	37	31081	58391	11
1179	39	28	80	49810	46341	19
1654	24	24	69	15986	25157	10
1222	47	25	81	30727	53907	12
1204	32	38	120	92696	20112	11
1054	67	29	107	95364	76669	8
1587	47	30	83	51513	53782	14
1386	71	36	98	40735	55515	13
1373	44	30	90	57793	59238	14
1468	33	25	73	51715	71299	14
1496	67	27	104	106671	71180	17
1425	105	34	120	69094	73815	16
2547	135	37	129	126846	72413	12
1583	43	26	93	116174	95757	14
1324	56	35	95	60578	69107	15
1420	62	33	98	61370	67808	15
1605	106	32	83	65567	84396	14
1383	59	28	90	79892	108016	14
1381	68	28	107	120293	93913	16
1559	81	31	63	87771	115338	16
1439	69	25	60	57635	85584	13
1403	69	28	72	83737	82981	14
1579	44	42	122	74007	82036	15
1111	46	43	139	86687	112494	12
2035	41	21	78	37238	10901	13
2147	73	31	114	82753	98579	15
2515	123	34	120	69112	85646	17
1530	60	38	93	83123	86146	8
1645	47	30	73	64466	89455	16
1626	124	33	118	102860	96971	10
1831	91	39	138	82875	93176	17
1833	114	28	91	92945	85298	14
1644	100	35	71	84651	106175	16
1641	111	31	98	102372	112283	7
1226	41	29	99	95260	129847	15
1424	92	35	116	74163	127748	9
1677	94	35	133	117478	146761	9
1418	79	30	94	112285	146283	13
1929	101	36	117	99052	121527	16
2352	76	29	96	80670	102996	15
2445	98	32	119	55801	77494	12
1638	105	33	120	72654	131741	16
1900	131	39	132	130115	139296	15
1982	93	41	139	109825	102255	8
2352	81	31	94	85323	130767	16
2186	63	31	119	91721	78876	13
1706	102	30	115	133824	136368	15
1659	131	31	90	161647	181248	15
1904	118	33	106	129838	168237	17
2152	77	31	71	101481	112642	13
1764	78	38	123	66198	143983	16
1964	58	29	104	111813	120336	16
1840	88	28	105	95536	132190	18
1944	133	42	110	101338	103950	19
2144	101	44	164	143558	160604	14
2699	98	33	115	76643	142775	15
2312	120	39	117	103772	120691	15
1973	123	35	124	105195	174141	17
2888	110	52	197	115929	146123	11
2527	96	32	120	83122	136815	16
2429	109	37	86	54990	147866	16
2158	100	43	152	93815	132432	16
3004	57	29	107	89691	105805	14
2452	107	33	124	101494	171975	17
2395	116	31	82	91413	209056	16
3261	113	37	133	135777	122037	17
4041	158	47	168	97668	151511	13
2662	84	36	126	79215	159676	12
2833	129	41	144	105547	170875	14
2253	79	39	140	115762	155135	10
2242	80	30	108	67654	127766	20
2970	118	38	111	106117	131722	16
2922	136	45	160	213688	214921	14
4308	76	34	110	100708	79336	16
3201	121	25	91	119182	195663	14
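
For readers who want to rerun the analysis outside the FreeStatistics.org engine, a minimal sketch follows. It assumes the tab-separated block above has been saved to a local file (the name workshop10.txt is hypothetical) and, consistent with par1 = 4 in the R-input parameters further down, treats the fourth column as the response; the original variable names are not recorded in this dump.

library(party)
# read the 81 x 7 tab-separated block shown above (hypothetical file name)
x <- read.table('workshop10.txt', header=FALSE, sep='\t')
colnames(x) <- paste('V', 1:ncol(x), sep='')   # generic names; the real names are not given here
# conditional inference regression tree with column 4 as the response
m <- ctree(V4 ~ ., data=x)
plot(m)
head(predict(m))   # node-mean forecasts, as tabulated further down the page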




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 2 seconds
R Server: 'Gertrude Mary Cox' @ cox.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 2 seconds \tabularnewline
R Server & 'Gertrude Mary Cox' @ cox.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=153433&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]2 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gertrude Mary Cox' @ cox.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=153433&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=153433&T=0

Goodness of Fit
Correlation: 0.7929
R-squared: 0.6288
RMSE: 18.41

\begin{tabular}{lllllllll}
\hline
Goodness of Fit \tabularnewline
Correlation & 0.7929 \tabularnewline
R-squared & 0.6288 \tabularnewline
RMSE & 18.41 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=153433&T=1

[TABLE]
[ROW][C]Goodness of Fit[/C][/ROW]
[ROW][C]Correlation[/C][C]0.7929[/C][/ROW]
[ROW][C]R-squared[/C][C]0.6288[/C][/ROW]
[ROW][C]RMSE[/C][C]18.41[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=153433&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=153433&T=1
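
The three figures are related: the correlation is computed between the node-mean forecasts and the actuals, the R-squared is simply the square of that (unrounded) correlation, and the RMSE is the root mean square of the residuals. A minimal sketch, assuming the result data frame (columns Actuals, Forecasts, Residuals) constructed in the R code at the bottom of this page:

detcoef <- cor(result$Forecasts, result$Actuals)   # Correlation, reported as 0.7929
detcoef^2                                          # R-squared, reported as 0.6288
sqrt(mean(result$Residuals^2))                     # RMSE, reported as 18.41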

Actuals, Predictions, and Residuals
#	Actuals	Forecasts	Residuals
1	0	62.8181818181818	-62.8181818181818
2	67	62.8181818181818	4.18181818181818
3	48	94.125	-46.125
4	66	62.8181818181818	3.18181818181818
5	69	62.8181818181818	6.18181818181818
6	93	94.125	-1.125
7	37	62.8181818181818	-25.8181818181818
8	80	94.125	-14.125
9	69	62.8181818181818	6.18181818181818
10	81	62.8181818181818	18.1818181818182
11	120	112.75	7.25
12	107	94.125	12.875
13	83	94.125	-11.125
14	98	112.75	-14.75
15	90	94.125	-4.125
16	73	62.8181818181818	10.1818181818182
17	104	94.125	9.875
18	120	112.75	7.25
19	129	112.75	16.25
20	93	94.125	-1.125
21	95	112.75	-17.75
22	98	112.75	-14.75
23	83	94.125	-11.125
24	90	94.125	-4.125
25	107	94.125	12.875
26	63	94.125	-31.125
27	60	62.8181818181818	-2.81818181818182
28	72	94.125	-22.125
29	122	144.428571428571	-22.4285714285714
30	139	144.428571428571	-5.42857142857142
31	78	62.8181818181818	15.1818181818182
32	114	94.125	19.875
33	120	112.75	7.25
34	93	112.75	-19.75
35	73	94.125	-21.125
36	118	112.75	5.25
37	138	144.428571428571	-6.42857142857142
38	91	94.125	-3.125
39	71	112.75	-41.75
40	98	94.125	3.875
41	99	94.125	4.875
42	116	112.75	3.25
43	133	112.75	20.25
44	94	94.125	-0.125
45	117	112.75	4.25
46	96	94.125	1.875
47	119	94.125	24.875
48	120	112.75	7.25
49	132	144.428571428571	-12.4285714285714
50	139	144.428571428571	-5.42857142857142
51	94	94.125	-0.125
52	119	94.125	24.875
53	115	94.125	20.875
54	90	94.125	-4.125
55	106	112.75	-6.75
56	71	94.125	-23.125
57	123	112.75	10.25
58	104	94.125	9.875
59	105	94.125	10.875
60	110	144.428571428571	-34.4285714285714
61	164	144.428571428571	19.5714285714286
62	115	112.75	2.25
63	117	144.428571428571	-27.4285714285714
64	124	112.75	11.25
65	197	144.428571428571	52.5714285714286
66	120	94.125	25.875
67	86	112.75	-26.75
68	152	144.428571428571	7.57142857142858
69	107	94.125	12.875
70	124	112.75	11.25
71	82	94.125	-12.125
72	133	112.75	20.25
73	168	144.428571428571	23.5714285714286
74	126	112.75	13.25
75	144	144.428571428571	-0.428571428571416
76	140	144.428571428571	-4.42857142857142
77	108	94.125	13.875
78	111	112.75	-1.75
79	160	144.428571428571	15.5714285714286
80	110	112.75	-2.75
81	91	62.8181818181818	28.1818181818182

\begin{tabular}{lllllllll}
\hline
Actuals, Predictions, and Residuals \tabularnewline
# & Actuals & Forecasts & Residuals \tabularnewline
1 & 0 & 62.8181818181818 & -62.8181818181818 \tabularnewline
2 & 67 & 62.8181818181818 & 4.18181818181818 \tabularnewline
3 & 48 & 94.125 & -46.125 \tabularnewline
4 & 66 & 62.8181818181818 & 3.18181818181818 \tabularnewline
5 & 69 & 62.8181818181818 & 6.18181818181818 \tabularnewline
6 & 93 & 94.125 & -1.125 \tabularnewline
7 & 37 & 62.8181818181818 & -25.8181818181818 \tabularnewline
8 & 80 & 94.125 & -14.125 \tabularnewline
9 & 69 & 62.8181818181818 & 6.18181818181818 \tabularnewline
10 & 81 & 62.8181818181818 & 18.1818181818182 \tabularnewline
11 & 120 & 112.75 & 7.25 \tabularnewline
12 & 107 & 94.125 & 12.875 \tabularnewline
13 & 83 & 94.125 & -11.125 \tabularnewline
14 & 98 & 112.75 & -14.75 \tabularnewline
15 & 90 & 94.125 & -4.125 \tabularnewline
16 & 73 & 62.8181818181818 & 10.1818181818182 \tabularnewline
17 & 104 & 94.125 & 9.875 \tabularnewline
18 & 120 & 112.75 & 7.25 \tabularnewline
19 & 129 & 112.75 & 16.25 \tabularnewline
20 & 93 & 94.125 & -1.125 \tabularnewline
21 & 95 & 112.75 & -17.75 \tabularnewline
22 & 98 & 112.75 & -14.75 \tabularnewline
23 & 83 & 94.125 & -11.125 \tabularnewline
24 & 90 & 94.125 & -4.125 \tabularnewline
25 & 107 & 94.125 & 12.875 \tabularnewline
26 & 63 & 94.125 & -31.125 \tabularnewline
27 & 60 & 62.8181818181818 & -2.81818181818182 \tabularnewline
28 & 72 & 94.125 & -22.125 \tabularnewline
29 & 122 & 144.428571428571 & -22.4285714285714 \tabularnewline
30 & 139 & 144.428571428571 & -5.42857142857142 \tabularnewline
31 & 78 & 62.8181818181818 & 15.1818181818182 \tabularnewline
32 & 114 & 94.125 & 19.875 \tabularnewline
33 & 120 & 112.75 & 7.25 \tabularnewline
34 & 93 & 112.75 & -19.75 \tabularnewline
35 & 73 & 94.125 & -21.125 \tabularnewline
36 & 118 & 112.75 & 5.25 \tabularnewline
37 & 138 & 144.428571428571 & -6.42857142857142 \tabularnewline
38 & 91 & 94.125 & -3.125 \tabularnewline
39 & 71 & 112.75 & -41.75 \tabularnewline
40 & 98 & 94.125 & 3.875 \tabularnewline
41 & 99 & 94.125 & 4.875 \tabularnewline
42 & 116 & 112.75 & 3.25 \tabularnewline
43 & 133 & 112.75 & 20.25 \tabularnewline
44 & 94 & 94.125 & -0.125 \tabularnewline
45 & 117 & 112.75 & 4.25 \tabularnewline
46 & 96 & 94.125 & 1.875 \tabularnewline
47 & 119 & 94.125 & 24.875 \tabularnewline
48 & 120 & 112.75 & 7.25 \tabularnewline
49 & 132 & 144.428571428571 & -12.4285714285714 \tabularnewline
50 & 139 & 144.428571428571 & -5.42857142857142 \tabularnewline
51 & 94 & 94.125 & -0.125 \tabularnewline
52 & 119 & 94.125 & 24.875 \tabularnewline
53 & 115 & 94.125 & 20.875 \tabularnewline
54 & 90 & 94.125 & -4.125 \tabularnewline
55 & 106 & 112.75 & -6.75 \tabularnewline
56 & 71 & 94.125 & -23.125 \tabularnewline
57 & 123 & 112.75 & 10.25 \tabularnewline
58 & 104 & 94.125 & 9.875 \tabularnewline
59 & 105 & 94.125 & 10.875 \tabularnewline
60 & 110 & 144.428571428571 & -34.4285714285714 \tabularnewline
61 & 164 & 144.428571428571 & 19.5714285714286 \tabularnewline
62 & 115 & 112.75 & 2.25 \tabularnewline
63 & 117 & 144.428571428571 & -27.4285714285714 \tabularnewline
64 & 124 & 112.75 & 11.25 \tabularnewline
65 & 197 & 144.428571428571 & 52.5714285714286 \tabularnewline
66 & 120 & 94.125 & 25.875 \tabularnewline
67 & 86 & 112.75 & -26.75 \tabularnewline
68 & 152 & 144.428571428571 & 7.57142857142858 \tabularnewline
69 & 107 & 94.125 & 12.875 \tabularnewline
70 & 124 & 112.75 & 11.25 \tabularnewline
71 & 82 & 94.125 & -12.125 \tabularnewline
72 & 133 & 112.75 & 20.25 \tabularnewline
73 & 168 & 144.428571428571 & 23.5714285714286 \tabularnewline
74 & 126 & 112.75 & 13.25 \tabularnewline
75 & 144 & 144.428571428571 & -0.428571428571416 \tabularnewline
76 & 140 & 144.428571428571 & -4.42857142857142 \tabularnewline
77 & 108 & 94.125 & 13.875 \tabularnewline
78 & 111 & 112.75 & -1.75 \tabularnewline
79 & 160 & 144.428571428571 & 15.5714285714286 \tabularnewline
80 & 110 & 112.75 & -2.75 \tabularnewline
81 & 91 & 62.8181818181818 & 28.1818181818182 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=153433&T=2

[TABLE]
[ROW][C]Actuals, Predictions, and Residuals[/C][/ROW]
[ROW][C]#[/C][C]Actuals[/C][C]Forecasts[/C][C]Residuals[/C][/ROW]
[ROW][C]1[/C][C]0[/C][C]62.8181818181818[/C][C]-62.8181818181818[/C][/ROW]
[ROW][C]2[/C][C]67[/C][C]62.8181818181818[/C][C]4.18181818181818[/C][/ROW]
[ROW][C]3[/C][C]48[/C][C]94.125[/C][C]-46.125[/C][/ROW]
[ROW][C]4[/C][C]66[/C][C]62.8181818181818[/C][C]3.18181818181818[/C][/ROW]
[ROW][C]5[/C][C]69[/C][C]62.8181818181818[/C][C]6.18181818181818[/C][/ROW]
[ROW][C]6[/C][C]93[/C][C]94.125[/C][C]-1.125[/C][/ROW]
[ROW][C]7[/C][C]37[/C][C]62.8181818181818[/C][C]-25.8181818181818[/C][/ROW]
[ROW][C]8[/C][C]80[/C][C]94.125[/C][C]-14.125[/C][/ROW]
[ROW][C]9[/C][C]69[/C][C]62.8181818181818[/C][C]6.18181818181818[/C][/ROW]
[ROW][C]10[/C][C]81[/C][C]62.8181818181818[/C][C]18.1818181818182[/C][/ROW]
[ROW][C]11[/C][C]120[/C][C]112.75[/C][C]7.25[/C][/ROW]
[ROW][C]12[/C][C]107[/C][C]94.125[/C][C]12.875[/C][/ROW]
[ROW][C]13[/C][C]83[/C][C]94.125[/C][C]-11.125[/C][/ROW]
[ROW][C]14[/C][C]98[/C][C]112.75[/C][C]-14.75[/C][/ROW]
[ROW][C]15[/C][C]90[/C][C]94.125[/C][C]-4.125[/C][/ROW]
[ROW][C]16[/C][C]73[/C][C]62.8181818181818[/C][C]10.1818181818182[/C][/ROW]
[ROW][C]17[/C][C]104[/C][C]94.125[/C][C]9.875[/C][/ROW]
[ROW][C]18[/C][C]120[/C][C]112.75[/C][C]7.25[/C][/ROW]
[ROW][C]19[/C][C]129[/C][C]112.75[/C][C]16.25[/C][/ROW]
[ROW][C]20[/C][C]93[/C][C]94.125[/C][C]-1.125[/C][/ROW]
[ROW][C]21[/C][C]95[/C][C]112.75[/C][C]-17.75[/C][/ROW]
[ROW][C]22[/C][C]98[/C][C]112.75[/C][C]-14.75[/C][/ROW]
[ROW][C]23[/C][C]83[/C][C]94.125[/C][C]-11.125[/C][/ROW]
[ROW][C]24[/C][C]90[/C][C]94.125[/C][C]-4.125[/C][/ROW]
[ROW][C]25[/C][C]107[/C][C]94.125[/C][C]12.875[/C][/ROW]
[ROW][C]26[/C][C]63[/C][C]94.125[/C][C]-31.125[/C][/ROW]
[ROW][C]27[/C][C]60[/C][C]62.8181818181818[/C][C]-2.81818181818182[/C][/ROW]
[ROW][C]28[/C][C]72[/C][C]94.125[/C][C]-22.125[/C][/ROW]
[ROW][C]29[/C][C]122[/C][C]144.428571428571[/C][C]-22.4285714285714[/C][/ROW]
[ROW][C]30[/C][C]139[/C][C]144.428571428571[/C][C]-5.42857142857142[/C][/ROW]
[ROW][C]31[/C][C]78[/C][C]62.8181818181818[/C][C]15.1818181818182[/C][/ROW]
[ROW][C]32[/C][C]114[/C][C]94.125[/C][C]19.875[/C][/ROW]
[ROW][C]33[/C][C]120[/C][C]112.75[/C][C]7.25[/C][/ROW]
[ROW][C]34[/C][C]93[/C][C]112.75[/C][C]-19.75[/C][/ROW]
[ROW][C]35[/C][C]73[/C][C]94.125[/C][C]-21.125[/C][/ROW]
[ROW][C]36[/C][C]118[/C][C]112.75[/C][C]5.25[/C][/ROW]
[ROW][C]37[/C][C]138[/C][C]144.428571428571[/C][C]-6.42857142857142[/C][/ROW]
[ROW][C]38[/C][C]91[/C][C]94.125[/C][C]-3.125[/C][/ROW]
[ROW][C]39[/C][C]71[/C][C]112.75[/C][C]-41.75[/C][/ROW]
[ROW][C]40[/C][C]98[/C][C]94.125[/C][C]3.875[/C][/ROW]
[ROW][C]41[/C][C]99[/C][C]94.125[/C][C]4.875[/C][/ROW]
[ROW][C]42[/C][C]116[/C][C]112.75[/C][C]3.25[/C][/ROW]
[ROW][C]43[/C][C]133[/C][C]112.75[/C][C]20.25[/C][/ROW]
[ROW][C]44[/C][C]94[/C][C]94.125[/C][C]-0.125[/C][/ROW]
[ROW][C]45[/C][C]117[/C][C]112.75[/C][C]4.25[/C][/ROW]
[ROW][C]46[/C][C]96[/C][C]94.125[/C][C]1.875[/C][/ROW]
[ROW][C]47[/C][C]119[/C][C]94.125[/C][C]24.875[/C][/ROW]
[ROW][C]48[/C][C]120[/C][C]112.75[/C][C]7.25[/C][/ROW]
[ROW][C]49[/C][C]132[/C][C]144.428571428571[/C][C]-12.4285714285714[/C][/ROW]
[ROW][C]50[/C][C]139[/C][C]144.428571428571[/C][C]-5.42857142857142[/C][/ROW]
[ROW][C]51[/C][C]94[/C][C]94.125[/C][C]-0.125[/C][/ROW]
[ROW][C]52[/C][C]119[/C][C]94.125[/C][C]24.875[/C][/ROW]
[ROW][C]53[/C][C]115[/C][C]94.125[/C][C]20.875[/C][/ROW]
[ROW][C]54[/C][C]90[/C][C]94.125[/C][C]-4.125[/C][/ROW]
[ROW][C]55[/C][C]106[/C][C]112.75[/C][C]-6.75[/C][/ROW]
[ROW][C]56[/C][C]71[/C][C]94.125[/C][C]-23.125[/C][/ROW]
[ROW][C]57[/C][C]123[/C][C]112.75[/C][C]10.25[/C][/ROW]
[ROW][C]58[/C][C]104[/C][C]94.125[/C][C]9.875[/C][/ROW]
[ROW][C]59[/C][C]105[/C][C]94.125[/C][C]10.875[/C][/ROW]
[ROW][C]60[/C][C]110[/C][C]144.428571428571[/C][C]-34.4285714285714[/C][/ROW]
[ROW][C]61[/C][C]164[/C][C]144.428571428571[/C][C]19.5714285714286[/C][/ROW]
[ROW][C]62[/C][C]115[/C][C]112.75[/C][C]2.25[/C][/ROW]
[ROW][C]63[/C][C]117[/C][C]144.428571428571[/C][C]-27.4285714285714[/C][/ROW]
[ROW][C]64[/C][C]124[/C][C]112.75[/C][C]11.25[/C][/ROW]
[ROW][C]65[/C][C]197[/C][C]144.428571428571[/C][C]52.5714285714286[/C][/ROW]
[ROW][C]66[/C][C]120[/C][C]94.125[/C][C]25.875[/C][/ROW]
[ROW][C]67[/C][C]86[/C][C]112.75[/C][C]-26.75[/C][/ROW]
[ROW][C]68[/C][C]152[/C][C]144.428571428571[/C][C]7.57142857142858[/C][/ROW]
[ROW][C]69[/C][C]107[/C][C]94.125[/C][C]12.875[/C][/ROW]
[ROW][C]70[/C][C]124[/C][C]112.75[/C][C]11.25[/C][/ROW]
[ROW][C]71[/C][C]82[/C][C]94.125[/C][C]-12.125[/C][/ROW]
[ROW][C]72[/C][C]133[/C][C]112.75[/C][C]20.25[/C][/ROW]
[ROW][C]73[/C][C]168[/C][C]144.428571428571[/C][C]23.5714285714286[/C][/ROW]
[ROW][C]74[/C][C]126[/C][C]112.75[/C][C]13.25[/C][/ROW]
[ROW][C]75[/C][C]144[/C][C]144.428571428571[/C][C]-0.428571428571416[/C][/ROW]
[ROW][C]76[/C][C]140[/C][C]144.428571428571[/C][C]-4.42857142857142[/C][/ROW]
[ROW][C]77[/C][C]108[/C][C]94.125[/C][C]13.875[/C][/ROW]
[ROW][C]78[/C][C]111[/C][C]112.75[/C][C]-1.75[/C][/ROW]
[ROW][C]79[/C][C]160[/C][C]144.428571428571[/C][C]15.5714285714286[/C][/ROW]
[ROW][C]80[/C][C]110[/C][C]112.75[/C][C]-2.75[/C][/ROW]
[ROW][C]81[/C][C]91[/C][C]62.8181818181818[/C][C]28.1818181818182[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=153433&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=153433&T=2
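
Only four distinct forecast values appear in the table above (approximately 62.82, 94.13, 112.75 and 144.43). This is expected for a regression tree: every observation that falls in the same terminal node receives the same prediction, namely the mean of the actuals in that node (the repeating decimals hint at the node sizes, e.g. 62.8181... implies a node size that is a multiple of 11, and 144.428571... a multiple of 7). A quick check, again assuming the result data frame from the R code below:

tapply(result$Actuals, result$Forecasts, mean)     # reproduces the four node means
tapply(result$Actuals, result$Forecasts, length)   # number of observations per terminal node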




Parameters (Session):
par1 = pearson ; par2 = equal ; par3 = 2 ; par4 = no ;
Parameters (R input):
par1 = 4 ; par2 = none ; par3 = 3 ; par4 = no ;
R code (references can be found in the software module):
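# Conditional inference trees (party::ctree) on one column of the uploaded data.
# Parameters: par1 = index of the response column; par2 = optional discretization of the
# response ('none', 'kmeans', 'quantiles', 'hclust' or 'equal'); par3 = number of classes
# when discretizing; par4 = 'yes' adds a 10-fold cross validation in the classification case.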
library(party)
library(Hmisc)
par1 <- as.numeric(par1)
par3 <- as.numeric(par3)
x <- data.frame(t(y))
is.data.frame(x)
x <- x[!is.na(x[,par1]),]
k <- length(x[1,])
n <- length(x[,1])
colnames(x)[par1]
x[,par1]
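# Optionally discretize the response (column par1) into par3 ordered classes C1..Cpar3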
if (par2 == 'kmeans') {
cl <- kmeans(x[,par1], par3)
print(cl)
clm <- matrix(cbind(cl$centers,1:par3),ncol=2)
clm <- clm[sort.list(clm[,1]),]
for (i in 1:par3) {
cl$cluster[cl$cluster==clm[i,2]] <- paste('C',i,sep='')
}
cl$cluster <- as.factor(cl$cluster)
print(cl$cluster)
x[,par1] <- cl$cluster
}
if (par2 == 'quantiles') {
x[,par1] <- cut2(x[,par1],g=par3)
}
if (par2 == 'hclust') {
hc <- hclust(dist(x[,par1])^2, 'cen')
print(hc)
memb <- cutree(hc, k = par3)
dum <- c(mean(x[memb==1,par1]))
for (i in 2:par3) {
dum <- c(dum, mean(x[memb==i,par1]))
}
hcm <- matrix(cbind(dum,1:par3),ncol=2)
hcm <- hcm[sort.list(hcm[,1]),]
for (i in 1:par3) {
memb[memb==hcm[i,2]] <- paste('C',i,sep='')
}
memb <- as.factor(memb)
print(memb)
x[,par1] <- memb
}
if (par2=='equal') {
ed <- cut(as.numeric(x[,par1]),par3,labels=paste('C',1:par3,sep=''))
x[,par1] <- as.factor(ed)
}
table(x[,par1])
colnames(x)
colnames(x)[par1]
x[,par1]
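# Regression case: fit a conditional inference tree to the continuous response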
if (par2 == 'none') {
m <- ctree(as.formula(paste(colnames(x)[par1],' ~ .',sep='')),data = x)
}
load(file='createtable')
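# Classification case: fit the tree to the discretized response; if par4 == 'yes',
# estimate training and hold-out accuracy with ten repeated 90/10 splits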
if (par2 != 'none') {
m <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data = x)
if (par4=='yes') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'10-Fold Cross Validation',3+2*par3,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
a<-table.element(a,'Prediction (training)',par3+1,TRUE)
a<-table.element(a,'Prediction (testing)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Actual',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
a<-table.row.end(a)
for (i in 1:10) {
ind <- sample(2, nrow(x), replace=T, prob=c(0.9,0.1))
m.ct <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data =x[ind==1,])
if (i==1) {
m.ct.i.pred <- predict(m.ct, newdata=x[ind==1,])
m.ct.i.actu <- x[ind==1,par1]
m.ct.x.pred <- predict(m.ct, newdata=x[ind==2,])
m.ct.x.actu <- x[ind==2,par1]
} else {
m.ct.i.pred <- c(m.ct.i.pred,predict(m.ct, newdata=x[ind==1,]))
m.ct.i.actu <- c(m.ct.i.actu,x[ind==1,par1])
m.ct.x.pred <- c(m.ct.x.pred,predict(m.ct, newdata=x[ind==2,]))
m.ct.x.actu <- c(m.ct.x.actu,x[ind==2,par1])
}
}
print(m.ct.i.tab <- table(m.ct.i.actu,m.ct.i.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.i.tab[i,i] / sum(m.ct.i.tab[i,]))
numer <- numer + m.ct.i.tab[i,i]
}
print(m.ct.i.cp <- numer / sum(m.ct.i.tab))
print(m.ct.x.tab <- table(m.ct.x.actu,m.ct.x.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.x.tab[i,i] / sum(m.ct.x.tab[i,]))
numer <- numer + m.ct.x.tab[i,i]
}
print(m.ct.x.cp <- numer / sum(m.ct.x.tab))
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (jjj in 1:par3) a<-table.element(a,m.ct.i.tab[i,jjj])
a<-table.element(a,round(m.ct.i.tab[i,i]/sum(m.ct.i.tab[i,]),4))
for (jjj in 1:par3) a<-table.element(a,m.ct.x.tab[i,jjj])
a<-table.element(a,round(m.ct.x.tab[i,i]/sum(m.ct.x.tab[i,]),4))
a<-table.row.end(a)
}
a<-table.row.start(a)
a<-table.element(a,'Overall',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.i.cp,4))
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.x.cp,4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
}
}
m
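# Plot the fitted tree and the response distribution by terminal node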
bitmap(file='test1.png')
plot(m)
dev.off()
bitmap(file='test1a.png')
plot(x[,par1] ~ as.factor(where(m)),main='Response by Terminal Node',xlab='Terminal Node',ylab='Response')
dev.off()
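# Regression case: collect actuals, node-mean forecasts and residuals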
if (par2 == 'none') {
forec <- predict(m)
result <- as.data.frame(cbind(x[,par1],forec,x[,par1]-forec))
colnames(result) <- c('Actuals','Forecasts','Residuals')
print(result)
}
if (par2 != 'none') {
print(cbind(as.factor(x[,par1]),predict(m)))
myt <- table(as.factor(x[,par1]),predict(m))
print(myt)
}
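# Diagnostic plots: kernel densities and actuals versus predictions (regression)
# or a mosaic plot of the confusion matrix (classification)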
bitmap(file='test2.png')
if(par2=='none') {
op <- par(mfrow=c(2,2))
plot(density(result$Actuals),main='Kernel Density Plot of Actuals')
plot(density(result$Residuals),main='Kernel Density Plot of Residuals')
plot(result$Forecasts,result$Actuals,main='Actuals versus Predictions',xlab='Predictions',ylab='Actuals')
plot(density(result$Forecasts),main='Kernel Density Plot of Predictions')
par(op)
}
if(par2!='none') {
plot(myt,main='Confusion Matrix',xlab='Actual',ylab='Predicted')
}
dev.off()
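# Goodness-of-fit table: correlation of forecasts with actuals, R-squared (= correlation^2)
# and root mean squared error of the residuals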
if (par2 == 'none') {
detcoef <- cor(result$Forecasts,result$Actuals)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goodness of Fit',2,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Correlation',1,TRUE)
a<-table.element(a,round(detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'R-squared',1,TRUE)
a<-table.element(a,round(detcoef*detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'RMSE',1,TRUE)
a<-table.element(a,round(sqrt(mean((result$Residuals)^2)),4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Actuals, Predictions, and Residuals',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'#',header=TRUE)
a<-table.element(a,'Actuals',header=TRUE)
a<-table.element(a,'Forecasts',header=TRUE)
a<-table.element(a,'Residuals',header=TRUE)
a<-table.row.end(a)
for (i in 1:length(result$Actuals)) {
a<-table.row.start(a)
a<-table.element(a,i,header=TRUE)
a<-table.element(a,result$Actuals[i])
a<-table.element(a,result$Forecasts[i])
a<-table.element(a,result$Residuals[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
}
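# Classification case: confusion matrix with actuals in rows and predictions in columns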
if (par2 != 'none') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Confusion Matrix (predicted in columns / actuals in rows)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
for (i in 1:par3) {
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
}
a<-table.row.end(a)
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (j in 1:par3) {
a<-table.element(a,myt[i,j])
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
}