Statistical Computations at FreeStatistics.org

Author: *The author of this computation has been verified*
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Mon, 08 Dec 2014 20:28:50 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/08/t1418070571y2anpfqhtda6082.htm/, Retrieved Fri, 17 May 2024 08:09:01 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=264217, Retrieved Fri, 17 May 2024 08:09:01 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 59
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Kendall tau Correlation Matrix] [] [2014-12-08 20:28:50] [c4557137b9b718365486b3b7af9cd43b] [Current]
Dataseries X:
4 2 2 3
4 5 5 5
3 3 5 5
6 6 6 6
4 4 4 3
4 4 4 5
5 5 5 4
5 3 6 5
7 7 7 7
7 6 7 6
3 4 3 5
7 7 6 6
2 3 5 6
5 5 7 7
6 5 7 7
5 5 6 6
4 3 4 4
6 5 5 5
5 5 6 6
7 6 7 7
7 5 7 7
7 7 6 6
6 6 5 5
5 4 6 6
5 6 5 6
4 5 6 5
5 6 4 6
5 5 5 5
6 5 5 6
6 5 4 6
2 2 2 2
6 5 6 5
3 5 7 5
6 6 6 6
4 5 4 4
3 5 5 7
7 6 5 5
5 5 5 5
5 5 6 6
4 4 5 6
4 4 4 3
6 3 5 6
5 5 6 6
5 2 4 6
4 2 4 4
7 5 6 6
4 5 4 4
6 5 5 7
6 6 6 7
4 4 4 4
5 3 5 5
5 4 6 5
4 4 5 5
5 6 6 6
6 6 6 6
5 6 6 6
5 3 2 3
4 5 5 6
6 4 5 5
5 5 5 4
6 5 6 5
4 5 6 7
3 3 4 5
5 4 4 4
5 5 4 5
5 5 5 5
4 6 6 6
5 4 6 6
5 4 5 7
5 4 3 4
5 5 5 5
5 6 5 5
6 5 4 5
6 5 6 6
5 1 6 6
2 3 6 5
4 4 4 5
6 7 6 5
4 3 3 3
4 5 5 5
5 5 4 6
5 5 6 6
4 4 6 5
4 5 5 7
3 4 4 4
5 5 6 5
6 6 6 6
5 5 5 7
4 5 5 6
5 5 6 5
4 4 5 6
4 2 5 3
4 7 7 7
2 2 4 3
4 5 4 4
4 6 6 6
5 5 5 5
7 5 6 4
4 3 3 5
6 6 5 6
5 5 6 4
4 5 6 7
2 5 5 4
6 6 6 7
4 4 5 5
4 4 4 7
7 7 4 7
6 5 5 5
5 6 5 6
5 6 5 5
6 5 5 6
6 6 5 6
4 5 5 6
1 1 2 2
5 3 4 3
4 4 5 5
7 5 5 7
5 6 6 5
6 5 5 5
6 5 6 6
5 5 5 5
5 5 5 5
4 4 5 5
6 6 7 6
4 4 4 4
5 5 5 5
4 4 3 3
4 4 7 7
6 6 7 7
4 4 7 5
6 2 4 5
5 6 5 6
6 4 6 6
5 5 5 5
4 4 5 4
6 5 5 6
3 4 5 5
5 5 6 6
5 5 5 6
6 7 6 6
2 3 3 3
5 5 4 5
6 6 6 6
5 4 4 4
6 5 5 6
5 4 4 4
5 7 7 7
4 5 5 6
5 4 5 5
6 4 6 5
6 5 6 7
5 5 6 5
4 5 5 5
3 4 2 4
6 6 6 6
7 7 7 7
7 6 7 7
6 5 6 5
6 6 5 6
6 3 4 6
4 4 4 6
6 5 5 7
5 5 5 6
6 5 5 6
4 5 4 4
3 4 4 4
5 6 5 5
5 5 5 5
7 6 6 7
4 5 6 4
7 7 7 7
5 5 5 6
5 4 5 5
5 6 5 6
5 6 5 5
5 5 4 6
5 4 5 5
4 1 2 4
4 5 4 4
6 5 4 4
5 5 5 5
4 5 4 4
6 4 5 6
5 5 5 6
3 3 2 4
4 7 5 7
4 7 5 6
4 6 5 7
4 7 4 6
5 6 5 4
4 5 4 5
3 6 6 6
6 7 5 6
5 5 6 6
7 4 4 5
5 4 4 4
5 5 5 4
4 4 4 4
5 3 5 6
6 6 4 7
2 2 2 2
6 6 4 6
6 6 5 6
4 3 4 4
4 4 4 5
6 7 1 7
5 6 7 7
5 5 5 3
5 4 5 6
4 5 7 5
5 6 5 5
6 6 6 6
5 6 5 6
5 5 5 7
5 6 6 6
3 5 4 5
5 2 3 5
6 5 5 6
4 6 4 5
4 3 5 6
5 5 5 6
5 5 5 5
6 1 6 6
5 5 4 5
4 4 4 4
4 5 4 5
5 4 7 7
5 6 5 6
5 6 6 6
4 5 6 5
7 6 5 6
6 7 6 6
7 6 6 6
5 5 5 5
5 6 6 6
5 6 5 5
6 5 5 7
7 7 4 5
3 3 2 3
5 5 5 6
7 7 6 7
5 5 3 6
4 6 5 6
4 4 4 4
5 4 6 6
5 5 6 6
4 3 3 6
3 5 5 5
7 4 5 7
6 5 7 6
5 5 4 6
5 5 6 4
4 5 5 4
1 1 1 1
3 3 4 4
5 5 6 6
4 4 4 5
3 7 7 6
6 3 5 6
4 4 5 5
5 5 3 6
5 4 5 6
5 3 3 4
6 6 6 6
5 5 5 6
5 4 5 5
5 5 4 5
7 6 7 7
5 6 6 6
6 5 6 6
5 5 5 5
5 3 5 4
5 5 6 5
6 6 6 5
6 5 5 6
4 4 3 5
5 5 5 5
3 4 5 4




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 2 seconds
R Server: 'Herman Ole Andreas Wold' @ wold.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 2 seconds \tabularnewline
R Server & 'Herman Ole Andreas Wold' @ wold.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264217&T=0


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=264217&T=0









Correlations for all pairs of data series (method=kendall)
        AMS2    AMS9    AMS16   AMS23
AMS2    1       0.394   0.342   0.402
AMS9    0.394   1       0.396   0.417
AMS16   0.342   0.396   1       0.449
AMS23   0.402   0.417   0.449   1

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series (method=kendall) \tabularnewline
  & AMS2 & AMS9 & AMS16 & AMS23 \tabularnewline
AMS2 & 1 & 0.394 & 0.342 & 0.402 \tabularnewline
AMS9 & 0.394 & 1 & 0.396 & 0.417 \tabularnewline
AMS16 & 0.342 & 0.396 & 1 & 0.449 \tabularnewline
AMS23 & 0.402 & 0.417 & 0.449 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264217&T=1


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=264217&T=1


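The matrix above can also be reproduced outside the module with base R. The sketch below is illustrative only: it assumes the Dataseries X block has been saved to a file (here called dataseries.txt, a hypothetical name) and that its four columns correspond to AMS2, AMS9, AMS16 and AMS23.

# Illustrative sketch (not part of the original module output).
# Assumes the raw data block above was saved as 'dataseries.txt' and that the
# four columns are AMS2, AMS9, AMS16 and AMS23 (assumed names).
ams <- read.table('dataseries.txt',
                  col.names = c('AMS2', 'AMS9', 'AMS16', 'AMS23'))
tau_matrix <- cor(ams, method = 'kendall')   # Kendall tau for every pair of series
round(tau_matrix, 3)                         # should match the table above up to rounding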







Correlations for all pairs of data series with p-values
pair           Pearson r   Spearman rho   Kendall tau
AMS2;AMS9      0.4774      0.4537         0.3938
p-value        (0)         (0)            (0)
AMS2;AMS16     0.4293      0.3945         0.3419
p-value        (0)         (0)            (0)
AMS2;AMS23     0.5204      0.4631         0.4021
p-value        (0)         (0)            (0)
AMS9;AMS16     0.4862      0.4554         0.3956
p-value        (0)         (0)            (0)
AMS9;AMS23     0.5257      0.4852         0.4175
p-value        (0)         (0)            (0)
AMS16;AMS23    0.5704      0.5104         0.4492
p-value        (0)         (0)            (0)
(p-values are rounded to four decimals, so (0) means p < 0.00005)

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series with p-values \tabularnewline
pair & Pearson r & Spearman rho & Kendall tau \tabularnewline
AMS2;AMS9 & 0.4774 & 0.4537 & 0.3938 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
AMS2;AMS16 & 0.4293 & 0.3945 & 0.3419 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
AMS2;AMS23 & 0.5204 & 0.4631 & 0.4021 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
AMS9;AMS16 & 0.4862 & 0.4554 & 0.3956 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
AMS9;AMS23 & 0.5257 & 0.4852 & 0.4175 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
AMS16;AMS23 & 0.5704 & 0.5104 & 0.4492 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264217&T=2


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=264217&T=2


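Each pair in the table above is tested three times (Pearson, Spearman, Kendall). As a rough sketch of how one row could be reproduced with base R's cor.test(), assuming the same hypothetical ams data frame as in the earlier sketch:

# Illustrative sketch: estimate and p-value for one pair under all three methods.
# With tied ranks R warns that exact p-values cannot be computed and falls back
# to approximations, which is expected here because the series contain many ties.
for (m in c('pearson', 'spearman', 'kendall')) {
  ct <- cor.test(ams$AMS2, ams$AMS9, method = m)
  cat(sprintf('%-9s estimate = %.4f   p-value = %.4g\n',
              m, unname(ct$estimate), ct$p.value))
}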







Meta Analysis of Correlation Tests
Number of significant by total number of Correlations
Type I error   Pearson r   Spearman rho   Kendall tau
0.01           1           1              1
0.02           1           1              1
0.03           1           1              1
0.04           1           1              1
0.05           1           1              1
0.06           1           1              1
0.07           1           1              1
0.08           1           1              1
0.09           1           1              1
0.1            1           1              1

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Correlation Tests \tabularnewline
Number of significant by total number of Correlations \tabularnewline
Type I error & Pearson r & Spearman rho & Kendall tau \tabularnewline
0.01 & 1 & 1 & 1 \tabularnewline
0.02 & 1 & 1 & 1 \tabularnewline
0.03 & 1 & 1 & 1 \tabularnewline
0.04 & 1 & 1 & 1 \tabularnewline
0.05 & 1 & 1 & 1 \tabularnewline
0.06 & 1 & 1 & 1 \tabularnewline
0.07 & 1 & 1 & 1 \tabularnewline
0.08 & 1 & 1 & 1 \tabularnewline
0.09 & 1 & 1 & 1 \tabularnewline
0.1 & 1 & 1 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264217&T=3


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=264217&T=3


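Each cell of the meta-analysis table is the number of significant pairwise tests divided by the total number of pairs (6), so a value of 1 means every correlation is significant at that type I error level. A minimal sketch of that tally, assuming the p-values have been collected into a hypothetical 6 x 3 matrix pvals (columns: Pearson, Spearman, Kendall):

# Illustrative sketch: fraction of significant correlation tests per alpha level.
# 'pvals' is an assumed 6 x 3 matrix of p-values (rows = pairs; columns =
# Pearson, Spearman, Kendall).
alphas <- seq(0.01, 0.10, by = 0.01)
frac_sig <- t(sapply(alphas, function(a) colMeans(pvals < a)))
rownames(frac_sig) <- alphas
round(frac_sig, 2)   # rows = type I error levels, columns = methods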



Parameters (Session):
par1 = kendall ;
Parameters (R input):
par1 = kendall ;
R code (references can be found in the software module):
par1 <- 'kendall'   # correlation method (session parameter par1 = kendall)
# Lower-panel function for pairs(): prints the p-value of cor.test(x, y, method = par1)
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(0, 1, 0, 1))
rr <- cor.test(x, y, method=par1)
r <- round(rr$p.value,2)   # the panel displays the test p-value, not the tau estimate
txt <- format(c(r, 0.123456789), digits=digits)[1]
txt <- paste(prefix, txt, sep='')
if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
text(0.5, 0.5, txt, cex = cex)
}
# Diagonal-panel function for pairs(): draws a rescaled histogram of each series
panel.hist <- function(x, ...)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(usr[1:2], 0, 1.5) )
h <- hist(x, plot = FALSE)
breaks <- h$breaks; nB <- length(breaks)
y <- h$counts; y <- y/max(y)
rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
# Scatterplot matrix; 'x', 'y' (one data series per row) and the title 'main'
# are prepared by the module environment before this code runs
bitmap(file='test1.png')
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')   # loads the table.start/table.row/table.element helper functions
n <- length(y[,1])         # number of data series (4)
n
# Table 1: n x n correlation matrix using the selected method (par1)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n -n)/2                 # number of distinct pairs (6 for 4 series)
mycorrs <- array(0, dim=c(10,3))     # significance counts per alpha level and method
# Table 2: one row per pair with Pearson, Spearman and Kendall estimates and p-values
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)    # echoed to the raw output
for (i in 1:(n-1))
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
# tally how many pairs are significant at alpha = 0.01, 0.02, ..., 0.10
for (iii in 1:10) {
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
# Table 3: meta analysis - fraction of significant correlations per type I error level
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))   # proportion significant (Pearson)
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))   # proportion significant (Spearman)
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))   # proportion significant (Kendall)
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
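The listing above is not self-contained: x, y and main, as well as the table.* helpers loaded from 'createtable', are supplied by the hosting module. As a rough standalone sketch, the scatterplot-matrix step can be imitated with the hypothetical ams data frame from the earlier sketches and the panel functions defined above:

# Illustrative sketch: standalone scatterplot matrix using the panel functions above.
# Assumes the hypothetical data frame 'ams' (columns AMS2, AMS9, AMS16, AMS23).
par1 <- 'kendall'                        # method consulted by panel.tau
png('pairs_kendall.png')                 # plain png() instead of the server's bitmap()
pairs(ams,
      diag.panel  = panel.hist,          # histograms on the diagonal
      upper.panel = panel.smooth,        # scatterplots with a lowess smoother
      lower.panel = panel.tau,           # p-value of cor.test(method = par1) per pair
      main = 'Kendall tau Correlation Matrix')
dev.off()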