Kendall tau Correlation Matrix

Author: verified (the author of this computation has been verified)
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Wed, 14 Dec 2016 13:49:42 +0100

Cite this page as follows:
- Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/14/t14817198506147260s8e4mjsa.htm/, Retrieved Tue, 21 May 2024 04:55:14 +0000
- Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=299378, Retrieved Tue, 21 May 2024 04:55:14 +0000
Original text written by user: (empty)
IsPrivate? No (this computation is public)
User-defined keywords: kendall correlation
Estimated Impact: 61
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
- [Kendall tau Correlation Matrix] [kendall correlation] [2016-12-14 12:49:42] [dfff7639a5c2d8e28b3442052a637c76] [Current]
Dataseries X (columns: IVHB1, IVHB2, IVHB3, IVHB4, TVDC):
2	2	3	4	3,250
4	2	1	4	4,000
4	2	5	4	4,250
4	3	4	4	3,667
3	4	3	3	4,000
4	3	2	5	4,000
1	4	4	4	4,333
4	2	5	4	4,000
3	NA	5	2	4,333
4	4	3	4	4,250
2	2	2	4	4,250
4	2	2	3	3,750
4	5	4	3	4,000
5	4	4	4	3,500
4	2	4	4	4,000
1	3	5	4	4,250
2	1	2	5	4,000
4	3	2	4	3,667
5	4	4	4	4,333
5	5	4	4	4,000
4	5	4	4	3,667
1	1	5	4	4,000
4	4	3	4	3,667
2	2	4	4	4,333
4	4	3	4	3,667
5	4	3	3	4,000
3	3	3	3	3,750
5	4	5	5	4,000
3	2	4	4	4,000
5	2	4	4	3,250
2	4	3	4	3,750
1	2	3	4	4,250
NA	4	5	1	3,667
4	2	3	3	3,250
4	4	3	4	4,250
3	3	3	4	3,667
5	3	5	5	3,500
4	4	3	4	3,500
NA	2	3	4	4,500
4	3	3	4	3,667
2	2	4	3	4,250
3	4	3	4	3,250
1	2	1	5	4,000
3	2	4	4	3,750
3	3	4	3	3,750
3	3	3	3	4,000
4	NA	4	5	3,750
4	4	4	4	3,250
4	5	5	1	3,000
4	4	4	4	4,250
4	4	4	4	4,333
2	4	3	4	4,333
5	2	2	4	2,750
3	2	4	3	3,500
3	1	3	4	3,250
4	3	3	3	3,667
4	4	3	4	4,250
4	3	4	2	4,000
3	3	4	4	3,667
4	2	3	4	4,250
4	3	4	4	4,000
4	2	5	3	4,000
4	4	2	4	4,000
4	3	3	3	3,750
2	2	3	4	3,000
4	4	3	3	4,250
4	5	4	4	3,500
4	4	3	4	3,500
4	3	4	4	4,000
4	2	3	4	3,667
5	3	1	3	3,667
3	4	4	3	3,333
2	4	3	2	3,333
4	4	2	4	4,333
5	5	3	5	3,750
4	4	3	4	4,000
5	4	4	5	3,500
5	4	5	2	3,750
2	3	3	4	4,250
4	2	4	4	4,000
4	4	2	4	2,500
4	4	2	4	4,000
3	4	2	5	4,250
4	2	3	4	4,333
2	2	4	4	5,000
5	1	3	4	4,250
3	NA	5	4	4,500
4	4	4	1	3,667
2	4	4	4	4,250
4	4	3	4	3,500
3	3	4	3	3,667
3	4	3	4	4,250
4	4	5	4	4,000
4	4	4	3	4,250
4	2	4	3	3,667
3	4	3	4	4,000
4	4	4	5	4,500
3	1	1	3	4,500
3	4	4	4	4,000
1	2	4	3	4,000
4	3	4	4	4,333
3	3	4	5	3,750
3	4	4	3	3,250
5	3	3	4	3,667
5	4	5	4	4,333
4	4	3	NA	4,000
5	4	5	5	4,000
4	4	4	4	3,667
4	5	4	4	4,000
4	5	4	5	4,000
4	2	4	3	3,333
3	1	3	3	3,667
4	3	4	3	3,000
3	3	3	4	4,667
4	1	3	4	4,000
2	4	3	4	4,000
1	4	3	4	4,333
5	2	2	4	4,000
4	4	4	4	3,500
3	3	3	3	3,750
4	4	2	4	3,500
4	4	4	5	4,000
4	2	4	4	3,750
4	2	3	3	4,333
2	4	4	4	3,750
4	4	5	4	4,000
4	2	4	3	4,000
4	2	NA	3	3,667
4	2	4	4	3,667
3	2	4	2	2,750
4	5	4	4	4,000
5	2	5	3	4,500
2	NA	2	4	3,333
5	2	4	4	2,750
4	4	4	4	4,000
3	5	5	4	4,500
NA	4	4	3	3,000
2	4	4	2	3,750
2	3	5	5	4,750
2	3	2	3	4,250
4	1	4	4	3,333
4	4	5	4	3,500
5	5	3	4	4,000
3	4	4	5	3,250
3	4	4	4	4,250
4	5	3	4	3,500
4	4	5	3	4,750
4	5	5	1	3,500
4	5	3	4	4,000
4	3	2	5	3,000
4	5	4	4	4,000
4	1	5	4	4,000
2	3	3	4	3,750
5	2	3	5	3,000
4	2	4	4	3,667
4	NA	3	4	4,250
4	4	2	4	3,333
4	2	3	4	3,667
4	5	3	4	4,500
2	4	4	3	3,750
3	5	1	5	4,500
3	3	4	3	3,750
4	2	3	4	3,667
4	4	3	4	4,000
4	2	2	5	3,333
4	3	3	4	4,000
3	3	3	4	3,333
3	2	5	2	4,000
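
The five columns correspond to the series IVHB1 through IVHB4 and TVDC used in the tables below. A minimal sketch (not part of the archived computation) of loading the data into R, assuming the block above is saved as a tab-separated file named dataseries.txt (a hypothetical filename); note the decimal commas in the last column, handled via dec=',':

# Read the tab-separated dataset; column names are taken from the
# correlation tables on this page, decimal commas become decimal points.
x <- read.table('dataseries.txt', header=FALSE, sep='\t', dec=',',
                na.strings='NA',
                col.names=c('IVHB1','IVHB2','IVHB3','IVHB4','TVDC'))
str(x)  # 168 observations of 5 variables, with a few NAs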




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 1 second
R Server: Big Analytics Cloud Computing Center
Source: https://freestatistics.org/blog/index.php?pk=299378&T=0








Correlations for all pairs of data series (method=kendall)
        IVHB1    IVHB2    IVHB3    IVHB4    TVDC
IVHB1   1        0.118    0.065    0.105    -0.13
IVHB2   0.118    1        0.074    0.085    0.059
IVHB3   0.065    0.074    1        -0.123   0.051
IVHB4   0.105    0.085    -0.123   1        0.074
TVDC    -0.13    0.059    0.051    0.074    1
Source: https://freestatistics.org/blog/index.php?pk=299378&T=1
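
A hedged reproduction sketch: the module's code (listed at the bottom of this page) drops incomplete cases with na.omit() and then runs cor.test() on every pair, so base R's cor() with listwise deletion should give essentially the same matrix (assuming x is the data frame loaded above):

# Kendall tau correlation matrix on complete cases, rounded as in the table.
round(cor(na.omit(x), method='kendall'), 3)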








Correlations for all pairs of data series with p-values
pair           Pearson r   Spearman rho   Kendall tau
IVHB1;IVHB2    0.1509      0.1384         0.1177
  p-value      (0.0584)    (0.0829)       (0.0803)
IVHB1;IVHB3    0.0666      0.0748         0.0646
  p-value      (0.4056)    (0.3504)       (0.3424)
IVHB1;IVHB4    0.0623      0.1176         0.1054
  p-value      (0.4366)    (0.1413)       (0.131)
IVHB1;TVDC     -0.1962     -0.1633        -0.1305
  p-value      (0.0135)    (0.0403)       (0.0414)
IVHB2;IVHB3    0.0984      0.0863         0.0736
  p-value      (0.2186)    (0.2808)       (0.2733)
IVHB2;IVHB4    0.0258      0.0999         0.0854
  p-value      (0.7476)    (0.2116)       (0.2156)
IVHB2;TVDC     0.0767      0.0752         0.0591
  p-value      (0.3382)    (0.348)        (0.3495)
IVHB3;IVHB4    -0.1883     -0.1357        -0.1234
  p-value      (0.0178)    (0.0892)       (0.0768)
IVHB3;TVDC     0.0738      0.0611         0.0507
  p-value      (0.3565)    (0.4458)       (0.4273)
IVHB4;TVDC     0.1294      0.0902         0.0736
  p-value      (0.1051)    (0.2598)       (0.2616)
Source: https://freestatistics.org/blog/index.php?pk=299378&T=2
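
Each row pair in this table comes from three calls to cor.test(), one per method. A minimal sketch for a single pair (assuming x is the data frame loaded above; with tied ranks R computes approximate p-values and may warn that exact values cannot be obtained):

xc <- na.omit(x)  # the module uses listwise deletion before testing
for (m in c('pearson', 'spearman', 'kendall')) {
    ct <- cor.test(xc$IVHB1, xc$TVDC, method=m)
    cat(m, ': estimate =', round(ct$estimate, 4),
        ', p-value =', round(ct$p.value, 4), '\n')
}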








Meta Analysis of Correlation Tests
Number of significant correlations divided by the total number of correlations
Type I error   Pearson r   Spearman rho   Kendall tau
0.01           0           0              0
0.02           0.2         0              0
0.03           0.2         0              0
0.04           0.2         0              0
0.05           0.2         0.1            0.1
0.06           0.3         0.1            0.1
0.07           0.3         0.1            0.1
0.08           0.3         0.1            0.2
0.09           0.3         0.3            0.3
0.1            0.3         0.3            0.3
Source: https://freestatistics.org/blog/index.php?pk=299378&T=3
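
The meta-analysis rows are simple tallies: for each Type I error level alpha = 0.01, ..., 0.10, the fraction of the ten pairwise p-values falling below alpha. A sketch for the Pearson column, using the p-values from the table above:

# Ten Pearson p-values from the pairwise table, in row order.
p.pearson <- c(0.0584, 0.4056, 0.4366, 0.0135, 0.2186,
               0.7476, 0.3382, 0.0178, 0.3565, 0.1051)
alpha <- seq(0.01, 0.10, by=0.01)
sapply(alpha, function(a) mean(p.pearson < a))
# 0.0 0.2 0.2 0.2 0.2 0.3 0.3 0.3 0.3 0.3  (matches the Pearson column)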




Parameters (Session):
par1 = kendall ;
Parameters (R input):
par1 = kendall ;
R code (references can be found in the software module):
# Lower panel for pairs(): prints the p-value of the correlation test
# (method given by par1) for each pair of series, scaled to fit the panel.
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
    usr <- par('usr'); on.exit(par(usr))
    par(usr = c(0, 1, 0, 1))
    rr <- cor.test(x, y, method=par1)
    r <- round(rr$p.value, 2)
    txt <- format(c(r, 0.123456789), digits=digits)[1]
    txt <- paste(prefix, txt, sep='')
    if (missing(cex.cor)) cex <- 0.5/strwidth(txt)
    text(0.5, 0.5, txt, cex = cex)
}
# Diagonal panel for pairs(): draws a rescaled histogram of each series.
panel.hist <- function(x, ...)
{
    usr <- par('usr'); on.exit(par(usr))
    par(usr = c(usr[1:2], 0, 1.5))
    h <- hist(x, plot = FALSE)
    breaks <- h$breaks; nB <- length(breaks)
    y <- h$counts; y <- y/max(y)
    rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
# Drop incomplete observations from x and from the columns of y
# (y holds the series in rows).
x <- na.omit(x)
y <- t(na.omit(t(y)))
# Scatterplot matrix: histograms on the diagonal, lowess smoothers above,
# correlation-test p-values below (main is supplied by the software module).
bitmap(file='test1.png')
pairs(t(y), diag.panel=panel.hist, upper.panel=panel.smooth,
      lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')  # table.start(), table.element(), ... helpers
n <- length(y[,1])        # number of data series (here 5)
print(n)
# First table: the full n x n correlation matrix for the chosen method.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
# Header row with the series names.
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
    a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
# One row per series; each cell is the cor.test() estimate for the pair.
for (i in 1:n) {
    a<-table.row.start(a)
    a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
    for (j in 1:n) {
        r <- cor.test(y[i,],y[j,],method=par1)
        a<-table.element(a,round(r$estimate,3))
    }
    a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n -n)/2               # number of distinct pairs: (5*5-5)/2 = 10
mycorrs <- array(0, dim=c(10,3))   # significance counts per alpha level/method
# Second table: Pearson, Spearman and Kendall estimates with p-values
# for every distinct pair of series.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)  # leftover call; its result is not used
for (i in 1:(n-1))
{
    for (j in (i+1):n)
    {
        a<-table.row.start(a)
        dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
        a<-table.element(a,dum,header=TRUE)
        rp <- cor.test(y[i,],y[j,],method='pearson')
        a<-table.element(a,round(rp$estimate,4))
        rs <- cor.test(y[i,],y[j,],method='spearman')
        a<-table.element(a,round(rs$estimate,4))
        rk <- cor.test(y[i,],y[j,],method='kendall')
        a<-table.element(a,round(rk$estimate,4))
        a<-table.row.end(a)
        a<-table.row.start(a)
        a<-table.element(a,'p-value',header=T)
        a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
        a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
        a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
        a<-table.row.end(a)
        # Tally significance at alpha = 0.01, 0.02, ..., 0.10 for the
        # meta analysis below.
        for (iii in 1:10) {
            iiid100 <- iii / 100
            if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
            if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
            if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
        }
    }
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
# Third table: for each Type I error level, the fraction of the ncorrs
# pairwise tests that is significant, per correlation method.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
    iiid100 <- iii / 100
    a<-table.row.start(a)
    a<-table.element(a,round(iiid100,2),header=T)
    a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
    a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
    a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
    a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
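
A hedged usage note: outside the FreeStatistics server the table.start()/table.element() helpers (loaded from the 'createtable' file) are not available, but the scatterplot matrix itself can be reproduced locally with the panel functions defined above, assuming x is the data frame loaded from the dataset at the top of this page (the output filename pairs.png is hypothetical):

par1 <- 'kendall'   # method passed to cor.test() inside panel.tau
png('pairs.png')
pairs(na.omit(x), diag.panel=panel.hist, upper.panel=panel.smooth,
      lower.panel=panel.tau, main='Kendall tau Correlation Matrix')
dev.off()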