Getting Started with neuroSCC

library(neuroSCC)

Introduction

This vignette guides you through the initial data preparation steps necessary to use the neuroSCC package for analyzing neuroimaging data with Simultaneous Confidence Corridors (SCC).


1. Loading and Inspecting Neuroimaging Data

We’ll begin by loading a sample PET neuroimaging file included with the package and inspecting its structure.

niftiFile <- system.file("extdata", "syntheticControl1.nii.gz", package = "neuroSCC")

# Load and clean data using neuroCleaner
petData <- neuroCleaner(niftiFile)

# Inspect the data
head(petData)
#>   z x y pet
#> 1 1 1 1   0
#> 2 1 1 2   0
#> 3 1 1 3   0
#> 4 1 1 4   0
#> 5 1 1 5   0
#> 6 1 1 6   0
str(petData)
#> 'data.frame':    902629 obs. of  4 variables:
#>  $ z  : int  1 1 1 1 1 1 1 1 1 1 ...
#>  $ x  : int  1 1 1 1 1 1 1 1 1 1 ...
#>  $ y  : int  1 2 3 4 5 6 7 8 9 10 ...
#>  $ pet: int  0 0 0 0 0 0 0 0 0 0 ...
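
Because neuroCleaner returns the image in long format (one row per voxel, with columns z, x, y, and pet as shown above), individual axial slices can be pulled out with ordinary data-frame subsetting. A minimal sketch on a toy data frame with the same layout:

``` r
# Toy long-format PET data frame mimicking the structure returned by
# neuroCleaner(): one row per voxel, columns z, x, y, pet
petToy <- expand.grid(z = 1:2, x = 1:3, y = 1:3)
petToy$pet <- seq_len(nrow(petToy))

# Select every voxel on a single axial slice
slice1 <- subset(petToy, z == 1)
nrow(slice1)  # 9 voxels: the 3 x 3 plane at z = 1
```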

2. Creating a Database from Multiple PET Images

Now, let’s demonstrate how to create a structured database using multiple PET files available in the package.

# Create database for control subjects
controlPattern <- "^syntheticControl.*\\.nii\\.gz$"
databaseControls <- databaseCreator(pattern = controlPattern, control = TRUE, quiet = FALSE)
#> Processing CN_number 1 - File 1 of 2
#> Processing CN_number 2 - File 2 of 2

# Inspect the created database
head(databaseControls)
#>   CN_number z x y pet
#> 1         1 1 1 1   0
#> 2         1 1 1 2   0
#> 3         1 1 1 3   0
#> 4         1 1 1 4   0
#> 5         1 1 1 5   0
#> 6         1 1 1 6   0
table(databaseControls$CN_number)
#> 
#>      1      2 
#> 902629 902629
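
Since the database stacks every subject's voxels into one long data frame keyed by CN_number, per-subject summaries reduce to standard split-apply operations. A toy sketch using the same column names as the output above:

``` r
# Toy database with the same columns as databaseCreator() output
dbToy <- data.frame(
  CN_number = rep(1:2, each = 3),
  z = 1, x = 1, y = 1:3,
  pet = c(2, 4, 6, 1, 3, 5)
)

# Mean PET intensity per subject
aggregate(pet ~ CN_number, data = dbToy, FUN = mean)
```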

3. Creating the Data Matrix

We’ll transform the PET database into a matrix format suitable for SCC analysis.

# Create matrix for Z-slice 35
matrixControls <- matrixCreator(databaseControls, paramZ = 35, quiet = FALSE)
#> Processing Subject 1
#> Processing Subject 2

# Inspect matrix structure
dim(matrixControls)
#> [1]    2 9919
str(matrixControls)
#>  num [1:2, 1:9919] 0 0 0 0 0 0 0 0 0 0 ...
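
The resulting layout is one row per subject and one column per voxel of the chosen Z-slice. The reshaping idea can be sketched on toy data (this illustrates the layout only, not the package's internal code):

``` r
# Sketch of the subjects x voxels layout that matrixCreator() produces:
# each row is one subject, each column one voxel of the chosen Z-slice
longToy <- data.frame(
  subject = rep(1:2, each = 4),
  voxel   = rep(1:4, times = 2),
  pet     = c(0, 1, 2, 3, 4, 5, 6, 7)
)
wide <- matrix(longToy$pet, nrow = 2, byrow = TRUE)
dim(wide)  # 2 subjects x 4 voxels
wide[1, ]  # subject 1's voxel intensities
```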

4. Normalizing the Data Matrix

Normalization adjusts for global intensity differences between subjects.

# Perform mean normalization
normalizedMatrix <- meanNormalization(matrixControls, returnDetails = FALSE)
#> 
#>  Mean before normalization: 4.345851
#> 
#>  Normalization completed.
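
Conceptually, mean normalization divides each subject's row by its own mean intensity, so that global differences in overall signal level do not dominate the comparison. A base-R sketch of this idea (assumed behavior for illustration; meanNormalization() may differ in detail):

``` r
# Toy sketch of row-wise mean normalization: divide each subject's
# voxel vector by its own mean intensity
toyMatrix  <- rbind(c(2, 4, 6), c(10, 20, 30))
normalized <- sweep(toyMatrix, 1, rowMeans(toyMatrix), "/")
rowMeans(normalized)  # both rows now average 1
```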

5. Extracting Contours for Triangulation

Contours from the neuroimage are used to set boundaries for SCC computations.

# Extract contours from sample data
contours <- neuroContour(niftiFile, paramZ = 35, plotResult = TRUE)

Figure: contours extracted from the brain imaging data


# Check contours structure
length(contours)
#> [1] 1
str(contours[[1]])
#> 'data.frame':    345 obs. of  2 variables:
#>  $ x: num  9 8.1 8.1 8.1 8.1 8.1 8.1 8.1 8.1 8.1 ...
#>  $ y: num  43.1 44 45 46 47 48 49 50 51 52 ...
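
As the structure above shows, each contour is simply a two-column data frame of ordered (x, y) boundary points. A toy contour with the same layout (a circle, for illustration):

``` r
# Toy contour in the same two-column (x, y) layout as the data frames
# returned by neuroContour(): an ordered trace of boundary points
theta      <- seq(0, 2 * pi, length.out = 50)
contourToy <- data.frame(x = cos(theta), y = sin(theta))
ncol(contourToy)  # 2 columns, matching the structure above
```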

6. Conditional Triangulation Setup (optional)

The triangulation step requires the external Triangulation package, which is not currently on CRAN. If it is not already installed, install it with:

remotes::install_github("FIRST-Data-Lab/Triangulation")

The following example runs the triangulation only if the package is available:


if (!requireNamespace("Triangulation", quietly = TRUE)) {
  cat("Triangulation package is not installed.\nInstall it using: remotes::install_github('FIRST-Data-Lab/Triangulation')\n")
} else {
  # Perform triangulation with the first contour
  mesh <- Triangulation::TriMesh(contours[[1]], n = 15)
  
  # Inspect mesh
  print(mesh[["V"]][1:10, ])
  print(mesh[["Tr"]][1:10, ])
}
#> Warning in tri.mesh(X, Y): success: collinearities reduced through jitter
#> Warning in tri.mesh(X, Y): dataset started with 3 colinear points, jitter
#> added!

Figure: Delaunay triangulation over the extracted contour

#>              X        Y
#>  [1,] 13.15333 38.42000
#>  [2,] 13.15333 43.47333
#>  [3,] 13.15333 48.52667
#>  [4,] 13.15333 53.58000
#>  [5,] 13.15333 58.63333
#>  [6,] 13.15333 63.68667
#>  [7,] 18.20667 28.31333
#>  [8,] 18.20667 33.36667
#>  [9,] 18.20667 38.42000
#> [10,] 18.20667 43.47333
#>       node1 node2 node3
#>  [1,]     1   542   541
#>  [2,]     1   541   540
#>  [3,]     1   540   538
#>  [4,]     1   538   537
#>  [5,]     1   537   536
#>  [6,]     1   536   535
#>  [7,]     1   535   533
#>  [8,]     1   533   532
#>  [9,]     1   532   531
#> [10,]     1   531     8
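
The mesh is returned as a list whose V component stores vertex coordinates and whose Tr component stores triangles, each row of Tr indexing three rows of V. A toy mesh in the same layout makes the indexing explicit:

``` r
# Toy mesh in the same V / Tr layout: V stores vertex coordinates and
# each row of Tr lists the three V-row indices forming one triangle
V  <- rbind(c(0, 0), c(1, 0), c(0, 1), c(1, 1))
Tr <- rbind(c(1, 2, 3), c(2, 4, 3))
V[Tr[1, ], ]  # coordinates of the first triangle's three vertices
```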

What’s Next?

With the data prepared, you are now ready to perform Simultaneous Confidence Corridor analyses. The package's remaining vignettes walk through the SCC estimation steps; feel free to explore them to continue your analysis journey with neuroSCC.