This vignette demonstrates how to use the riskdiff package to perform causal inference using Inverse Probability of Treatment Weighting (IPTW). IPTW is a powerful method for estimating causal effects from observational data by creating a pseudo-population where treatment assignment is independent of measured confounders.
IPTW is particularly useful when:

- treatment cannot be randomized, but the important confounders have been measured
- you want a marginal (population-average) effect rather than a conditional one
- the treatment assignment mechanism is easier to model than the outcome

IPTW relies on three main assumptions:

1. Exchangeability (no unmeasured confounding): all common causes of treatment and outcome are measured and included in the propensity score model
2. Positivity: every subject has a non-zero probability of receiving each treatment level
3. Consistency: a subject's observed outcome equals their potential outcome under the treatment actually received
Let’s start with a simple example using the Cachar cancer screening dataset:
# Load the data
data(cachar_sample)
# Quick look at the data
head(cachar_sample)
#>   id age    sex  residence smoking tobacco_chewing areca_nut alcohol
#> 1  1  53 female urban slum      No             Yes       Yes      No
#> 2  2  25   male      rural      No              No       Yes      No
#> 3  3  18 female      rural      No              No       Yes      No
#> 4  4  28 female      rural      No             Yes       Yes      No
#> 5  5  51   male      rural      No             Yes        No      No
#> 6  6  25 female      rural      No              No        No      No
#>   abnormal_screen head_neck_abnormal age_group tobacco_areca_both
#> 1               0                  0     40-60                Yes
#> 2               0                  0  Under 40                 No
#> 3               0                  0  Under 40                 No
#> 4               0                  0  Under 40                Yes
#> 5               0                  0     40-60                 No
#> 6               0                  0  Under 40                 No
table(cachar_sample$areca_nut, cachar_sample$abnormal_screen)
#>      
#>          0    1
#>   No   751   27
#>   Yes 1348  374
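These raw counts already suggest an association. A crude risk difference computed directly from the table, which ignores confounding, gives a useful point of comparison for the weighted estimates below:

```r
# Crude (unadjusted) risk difference from the 2x2 table above --
# it makes no confounding adjustment, so it will differ from the IPTW estimate
risk_exposed   <- 374 / (1348 + 374)   # abnormal screens among areca nut users
risk_unexposed <- 27 / (751 + 27)      # abnormal screens among non-users
rd_crude <- risk_exposed - risk_unexposed
round(rd_crude, 3)   # ~0.182, i.e. about an 18.2 percentage point difference
```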
First, we estimate propensity scores and calculate weights:
# Calculate ATE weights for areca nut use
iptw_result <- calc_iptw_weights(
  data = cachar_sample,
  treatment = "areca_nut",
  covariates = c("age", "sex", "residence", "smoking", "tobacco_chewing"),
  weight_type = "ATE",
  verbose = TRUE
)
#> Fitting propensity score model using logistic regression
#> Sample size: 2500
#> Treatment prevalence: 0.689
#> Trimmed 20 weights below 0.428 and 25 weights above 4.075
#> Weight summary:
#>   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
#> 0.4277  0.7291  0.7957  0.9874  1.0941  4.0753
#>
#> Effective sample size: 1886.4
#> Maximum standardized difference: 0.057
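Conceptually, ATE weights are the inverse of the probability of the treatment actually received: 1/PS for treated subjects and 1/(1 − PS) for controls. A minimal sketch of that idea on simulated data (illustrative, not the package's internals; all variable names here are made up):

```r
# Sketch of ATE weighting on simulated data
set.seed(42)
n  <- 2000
x  <- rnorm(n)                                 # a measured confounder
a  <- rbinom(n, 1, plogis(0.6 * x))            # treatment depends on x
ps <- fitted(glm(a ~ x, family = binomial()))  # estimated propensity scores
w  <- ifelse(a == 1, 1 / ps, 1 / (1 - ps))     # ATE weights

# Each weighted arm approximates the full pseudo-population of size n
c(sum(w[a == 1]), sum(w[a == 0]))
```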
# Examine the results
print(iptw_result)
#> Inverse Probability of Treatment Weighting Results
#> =================================================
#>
#> Propensity Score Model: logistic regression
#> Weight Type: ATE
#> Sample Size: 2500
#> Effective Sample Size: 1886.4
#>
#> Weight Summary:
#>   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
#> 0.4277  0.7291  0.7957  0.9874  1.0941  4.0753
#>
#> Covariate Balance:
#>          variable std_diff_unweighted std_diff_weighted
#>               age          0.55269148      -0.007137053
#>               sex          0.21677976       0.049132648
#>         residence         -0.02416585      -0.018322013
#>           smoking         -0.09149158      -0.050166259
#>   tobacco_chewing          1.02667004       0.057051627
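The standardized differences in this table compare covariate means between arms on a common scale; weighting should shrink them toward zero. A minimal version of that computation for a single numeric covariate (conventions for the pooled SD vary across software; this sketch uses the unweighted pooled SD, one common choice, and is not necessarily the package's exact formula):

```r
# Standardized difference: (weighted) mean difference divided by the
# pooled unweighted standard deviation of the two groups
std_diff <- function(x, a, w = rep(1, length(x))) {
  m1 <- weighted.mean(x[a == 1], w[a == 1])
  m0 <- weighted.mean(x[a == 0], w[a == 0])
  s  <- sqrt((var(x[a == 1]) + var(x[a == 0])) / 2)
  (m1 - m0) / s
}
```

Passing the IPTW weights as `w` gives the post-weighting balance; the default `w` of all ones gives the unweighted column.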
Before interpreting results, we should check key assumptions:
# Comprehensive assumption checking
assumptions <- check_iptw_assumptions(iptw_result, verbose = TRUE)
#> IPTW Assumptions Check
#> ======================
#>
#> Overall Assessment: ✓ PASS
#>
#> 1. Positivity (Non-zero probability of treatment)
#> Propensity score range: [0.258, 0.976]
#> Extreme values (PS < 0.05): 0
#> Extreme values (PS > 0.95): 84
#> Status: ✓ OK
#>
#> 2. Covariate Balance
#> Maximum standardized difference: 0.057
#> Status: ✓ BALANCED
#>
#> 3. Weight Distribution
#> Weight range: [0.43, 4.08]
#> Effective sample size ratio: 0.755
#> Status: ✓ OK
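The effective sample size reported above is conventionally the Kish formula, (Σw)² / Σw² — the number of equally-weighted observations that would carry the same information as the weighted sample. A sketch of that formula (the package may compute it per group or after trimming, so treat this as the general idea rather than its exact implementation):

```r
# Kish effective sample size
ess <- function(w) sum(w)^2 / sum(w^2)

ess(rep(1, 100))        # 100: equal weights lose no information
ess(c(rep(1, 99), 10))  # ~59.7: one large weight sharply reduces the ESS
```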
Visual diagnostics help assess whether weighting achieved balance:
# Create balance plots (requires ggplot2)
if (requireNamespace("ggplot2", quietly = TRUE)) {
  plots <- create_balance_plots(iptw_result, plot_type = "both")
  print(plots$love_plot)
  print(plots$ps_plot)
}
Now we can estimate the causal effect:
# Estimate ATE using IPTW
rd_causal <- calc_risk_diff_iptw(
  data = cachar_sample,
  outcome = "abnormal_screen",
  treatment = "areca_nut",
  covariates = c("age", "sex", "residence", "smoking", "tobacco_chewing"),
  weight_type = "ATE",
  verbose = TRUE
)
#> Calculating IPTW weights...
#> Fitting propensity score model using logistic regression
#> Sample size: 2500
#> Treatment prevalence: 0.689
#> Trimmed 20 weights below 0.428 and 25 weights above 4.075
#> Weight summary:
#>   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
#> 0.4277  0.7291  0.7957  0.9874  1.0941  4.0753
#>
#> Effective sample size: 1886.4
#> Maximum standardized difference: 0.057
#> Propensity score model diagnostics:
#>   max_std_diff_unweighted max_std_diff_weighted mean_abs_std_diff_unweighted
#> 1                 1.02667            0.05705163                    0.3823597
#>   mean_abs_std_diff_weighted
#> 1                 0.03636192
#> Outcome conversion check:
#> Original outcome: 0 0 0 0 0
#> Converted outcome: 0 0 0 0 0
#> Any NAs in outcome? FALSE
#> Sample sizes: Treated = 1722 , Control = 778
#> Treated outcomes: 0 0 0 0 1 1 0 0 0 0 ... [truncated; 1722 values]
#> Treated weights: 0.7390998 1.555489 2.594031 0.8850145 0.7254262 ... [truncated; 1722 values]
#> Control outcomes: 0 0 0 0 0 0 0 0 0 0 ... [truncated; 778 values]
#> Control weights: 4.07533 0.4544844 0.5192941 1.481792 0.5341788 ... [truncated; 778 values]
# Estimate the causal risk difference using the ATE weights
# (covariates assumed to match the weighting model fitted above)
rd_causal <- calc_risk_diff_iptw(
  data = cachar_sample,
  outcome = "abnormal_screen",
  treatment = "areca_nut",
  covariates = c("age", "sex", "residence", "smoking", "tobacco_chewing"),
  weight_type = "ATE",
  verbose = TRUE
)
#> All finite (treated): TRUE TRUE
#> All finite (control): TRUE TRUE
#> Risk in treated: 0.1933559
#> Risk in control: 0.04109176
#> Risk difference: 0.1522642
#> Standard error: 0.01420998
#> Z-statistic: 10.7153
#> P-value: 0
#>
#> IPTW Risk Difference Results:
#> Risk difference: 15.23 %
#> 95% CI: (12.44%, 18.01%)
#> P-value: 0
print(rd_causal)
#> IPTW-Standardized Risk Difference Results
#> =========================================
#>
#> Treatment: areca_nut
#> Weight Type: ATE
#> Effective Sample Size: 1886.4
#>
#> Risk in Treated: 19.3%
#> Risk in Control: 4.1%
#> Risk Difference: 15.23%
#> 95%CI: (12.44%, 18.01%)
#> P-value: <0.001
summary(rd_causal)
#> IPTW Risk Difference Analysis Summary
#> ====================================
#>
#> Treatment Variable: areca_nut
#> Weight Type: ATE weights
#> Effective Sample Size: 1886.4
#>
#> Effect Estimates:
#> -----------------
#> Risk in Treated Group: 19.3%
#> Risk in Control Group: 4.1%
#> Risk Difference: 15.23%
#> 95%Confidence Interval: (12.44%, 18.01%)
#> P-value: <0.001
#> Result: Statistically significant at α = 0.05
#>
#> Interpretation:
#> ---------------
#> The ATE shows a large increased risk of 15.2 percentage points.
#>
#> Note: Statistical significance does not necessarily imply clinical
#> significance. Consider the magnitude of effect in context.
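The effective sample size reported in the output follows the Kish formula, ESS = (sum of weights)^2 / (sum of squared weights). A minimal sketch of that computation (our own illustrative helper, not a riskdiff function):

```r
# Kish effective sample size: (sum of weights)^2 / sum of squared weights.
# Equal weights recover the full n; variable weights shrink the ESS.
ess <- function(w) sum(w)^2 / sum(w^2)

ess(rep(1, 100))                  # 100: equal weights lose no precision
ess(c(rep(0.5, 50), rep(2, 50)))  # about 73.5: unequal weights cost precision
```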
IPTW can estimate different causal estimands depending on the research question:
### Average Treatment Effect (ATE)

The effect of treatment if the entire population received treatment versus if none did:
rd_ate <- calc_risk_diff_iptw(
data = cachar_sample,
outcome = "abnormal_screen",
treatment = "areca_nut",
covariates = c("age", "sex", "residence", "smoking"),
weight_type = "ATE"
)
cat("ATE: The average causal effect of areca nut use in the population\n")
#> ATE: The average causal effect of areca nut use in the population
cat("Risk Difference:", scales::percent(rd_ate$rd_iptw, accuracy = 0.01), "\n")
#> Risk Difference: 17.21%
### Average Treatment Effect in the Treated (ATT)

The effect among those who actually received treatment:
rd_att <- calc_risk_diff_iptw(
data = cachar_sample,
outcome = "abnormal_screen",
treatment = "areca_nut",
covariates = c("age", "sex", "residence", "smoking"),
weight_type = "ATT"
)
cat("ATT: The average causal effect among areca nut users\n")
#> ATT: The average causal effect among areca nut users
cat("Risk Difference:", scales::percent(rd_att$rd_iptw, accuracy = 0.01), "\n")
#> Risk Difference: 18.29%
### Average Treatment Effect in the Controls (ATC)

The effect among those who did not receive treatment:
rd_atc <- calc_risk_diff_iptw(
data = cachar_sample,
outcome = "abnormal_screen",
treatment = "areca_nut",
covariates = c("age", "sex", "residence", "smoking"),
weight_type = "ATC"
)
cat("ATC: The average causal effect among non-users of areca nut\n")
#> ATC: The average causal effect among non-users of areca nut
cat("Risk Difference:", scales::percent(rd_atc$rd_iptw, accuracy = 0.01), "\n")
#> Risk Difference: 14.78%
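The three estimands correspond to different weight formulas applied to the same propensity scores. A hand-rolled sketch (the `iptw_w` helper is illustrative, not part of riskdiff):

```r
# Weight formulas by estimand, given treatment indicator t (0/1) and
# propensity score ps = P(T = 1 | X):
#   ATE: 1/ps for treated,        1/(1 - ps) for controls
#   ATT: 1 for treated,           ps/(1 - ps) for controls
#   ATC: (1 - ps)/ps for treated, 1 for controls
iptw_w <- function(t, ps, type = c("ATE", "ATT", "ATC")) {
  type <- match.arg(type)
  switch(type,
    ATE = ifelse(t == 1, 1 / ps, 1 / (1 - ps)),
    ATT = ifelse(t == 1, 1, ps / (1 - ps)),
    ATC = ifelse(t == 1, (1 - ps) / ps, 1)
  )
}

t  <- c(1, 0)
ps <- c(0.8, 0.2)
iptw_w(t, ps, "ATE")  # 1.25 1.25
iptw_w(t, ps, "ATT")  # 1.00 0.25
```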
For small samples or when assumptions are questionable, bootstrap confidence intervals may be more robust:
rd_bootstrap <- calc_risk_diff_iptw(
data = cachar_sample,
outcome = "head_neck_abnormal",
treatment = "tobacco_chewing",
covariates = c("age", "sex", "residence", "areca_nut"),
bootstrap_ci = TRUE,
boot_n = 500, # Use more in practice (1000+)
verbose = FALSE
)
print(rd_bootstrap)
#> IPTW-Standardized Risk Difference Results
#> =========================================
#>
#> Treatment: tobacco_chewing
#> Weight Type: ATE
#> Effective Sample Size: 1982.5
#>
#> Risk in Treated: 23.0%
#> Risk in Control: 11.8%
#> Risk Difference: 11.17%
#> 95%CI: (7.96%, 14.10%)
#> P-value: <0.001
#>
#> Note: Confidence intervals based on 500 bootstrap replicates
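The percentile bootstrap underlying these intervals can be sketched by hand on toy data (the `boot_rd` helper is ours, not a riskdiff function):

```r
# Percentile bootstrap for a weighted risk difference on simulated data
set.seed(99)
n <- 400
d <- data.frame(
  t = rbinom(n, 1, 0.5),        # treatment indicator
  y = rbinom(n, 1, 0.2),        # binary outcome
  w = runif(n, 0.5, 2)          # stand-in IPTW weights
)

# One bootstrap replicate: resample rows, recompute the weighted RD
boot_rd <- function(d) {
  b <- d[sample(nrow(d), replace = TRUE), ]
  weighted.mean(b$y[b$t == 1], b$w[b$t == 1]) -
    weighted.mean(b$y[b$t == 0], b$w[b$t == 0])
}

reps <- replicate(500, boot_rd(d))
quantile(reps, c(0.025, 0.975))  # percentile 95% CI
```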
You can specify different link functions for the propensity score model:
# Logistic regression (default)
ps_logit <- calc_iptw_weights(
data = cachar_sample,
treatment = "tobacco_chewing",
covariates = c("age", "sex", "residence", "areca_nut"),
method = "logistic"
)
# Probit regression
ps_probit <- calc_iptw_weights(
data = cachar_sample,
treatment = "tobacco_chewing",
covariates = c("age", "sex", "residence", "areca_nut"),
method = "probit"
)
# Compare propensity score distributions
cat("Logistic PS range:", round(range(ps_logit$ps), 3), "\n")
#> Logistic PS range: 0.155 0.783
cat("Probit PS range:", round(range(ps_probit$ps), 3), "\n")
#> Probit PS range: 0.152 0.783
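Under the hood, either link can be fit with base R's `glm()`; a toy sketch on simulated data (not the cachar_sample analysis):

```r
# Propensity scores from glm() with logit vs. probit links
set.seed(1)
d <- data.frame(x = rnorm(200))
d$t <- rbinom(200, 1, plogis(0.5 * d$x))  # treatment depends on x

ps_logit  <- fitted(glm(t ~ x, data = d, family = binomial(link = "logit")))
ps_probit <- fitted(glm(t ~ x, data = d, family = binomial(link = "probit")))

# Both links return probabilities in (0, 1) and usually differ only slightly
range(ps_logit)
range(ps_probit)
```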
Stabilized weights often have smaller variance, which makes the weighted estimates more stable:
# Unstabilized weights
ps_unstab <- calc_iptw_weights(
data = cachar_sample,
treatment = "areca_nut",
covariates = c("age", "sex", "residence"),
stabilize = FALSE
)
# Stabilized weights (default)
ps_stab <- calc_iptw_weights(
data = cachar_sample,
treatment = "areca_nut",
covariates = c("age", "sex", "residence"),
stabilize = TRUE
)
cat("Unstabilized weight variance:", round(var(ps_unstab$weights), 2), "\n")
#> Unstabilized weight variance: 1.64
cat("Stabilized weight variance:", round(var(ps_stab$weights), 2), "\n")
#> Stabilized weight variance: 0.11
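Stabilization multiplies each unstabilized weight by the marginal probability of the treatment actually received. A toy sketch of the computation (simulated data, not the package's internal code):

```r
# Stabilized ATE weights: multiply the unstabilized weight by the
# marginal probability of the observed treatment level
set.seed(42)
t  <- rbinom(500, 1, 0.6)
ps <- plogis(rnorm(500, qlogis(0.6), 0.5))  # toy propensity scores

w_unstab <- ifelse(t == 1, 1 / ps, 1 / (1 - ps))
p_t      <- mean(t)                          # marginal P(T = 1)
w_stab   <- ifelse(t == 1, p_t / ps, (1 - p_t) / (1 - ps))

var(w_stab) < var(w_unstab)  # TRUE here: stabilization shrinks the variance
```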
Extreme weights can be problematic and may need trimming:
# Check for extreme weights
summary(ps_stab$weights)
#> Min. 1st Qu. Median Mean 3rd Qu. Max.
#> 0.5573 0.8052 0.9171 1.0007 1.1053 2.6419
# Trim at 1st and 99th percentiles
ps_trimmed <- calc_iptw_weights(
data = cachar_sample,
treatment = "areca_nut",
covariates = c("age", "sex", "residence"),
trim_weights = TRUE,
trim_quantiles = c(0.01, 0.99)
)
cat("Original weight range:", round(range(ps_stab$weights), 2), "\n")
#> Original weight range: 0.56 2.64
cat("Trimmed weight range:", round(range(ps_trimmed$weights), 2), "\n")
#> Trimmed weight range: 0.56 2.64
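The percentile trimming shown above caps (winsorizes) weights at the chosen quantiles rather than dropping observations; a minimal sketch with an illustrative helper (not the package's internal code):

```r
# Winsorize weights at the 1st and 99th percentiles
trim_w <- function(w, lower = 0.01, upper = 0.99) {
  q <- quantile(w, c(lower, upper))
  pmin(pmax(w, q[[1]]), q[[2]])  # cap below and above; keep all rows
}

set.seed(7)
w   <- rlnorm(1000)  # toy right-skewed weights
w_t <- trim_w(w)

range(w)    # original extremes
range(w_t)  # capped at the 1st/99th percentiles
```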
## Comparison with Traditional Regression
Let's compare IPTW results with traditional regression adjustment:
# Traditional regression-based risk difference
rd_regression <- calc_risk_diff(
data = cachar_sample,
outcome = "abnormal_screen",
exposure = "areca_nut",
adjust_vars = c("age", "sex", "residence", "smoking"),
link = "auto"
)
# IPTW-based causal risk difference
rd_iptw <- calc_risk_diff_iptw(
data = cachar_sample,
outcome = "abnormal_screen",
treatment = "areca_nut",
covariates = c("age", "sex", "residence", "smoking"),
weight_type = "ATE"
)
# Compare results
comparison_table <- data.frame(
Method = c("Regression Adjustment", "IPTW (ATE)"),
Risk_Difference = scales::percent(c(rd_regression$rd, rd_iptw$rd_iptw), accuracy = 0.01),
CI_Lower = scales::percent(c(rd_regression$ci_lower, rd_iptw$ci_lower), accuracy = 0.01),
CI_Upper = scales::percent(c(rd_regression$ci_upper, rd_iptw$ci_upper), accuracy = 0.01),
P_Value = sprintf("%.3f", c(rd_regression$p_value, rd_iptw$p_value))
)
print(comparison_table)
#> Method Risk_Difference CI_Lower CI_Upper P_Value
#> (Intercept) Regression Adjustment 16.90% 9.72% 24.08% 0.000
#> IPTW (ATE) 17.21% 14.85% 19.56% 0.000
If covariates remain imbalanced after weighting:
# Check which variables have poor balance
assumptions <- check_iptw_assumptions(iptw_result)
poor_balance_vars <- assumptions$balance$poor_balance_vars
if (length(poor_balance_vars) > 0) {
cat("Variables with poor balance:", paste(poor_balance_vars, collapse = ", "), "\n")
# Try including interactions or polynomial terms
iptw_improved <- calc_iptw_weights(
data = cachar_sample,
treatment = "areca_nut",
covariates = c("age", "I(age^2)", "sex", "residence",
"smoking", "age:sex"), # Add interactions
weight_type = "ATE"
)
}
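The standardized differences behind these balance checks can be computed by hand. A sketch for a single covariate (our own helper, not the package's implementation):

```r
# Weighted standardized mean difference (SMD) for one covariate:
# difference in weighted group means over the pooled (unweighted) SD
smd <- function(x, t, w) {
  m1 <- weighted.mean(x[t == 1], w[t == 1])
  m0 <- weighted.mean(x[t == 0], w[t == 0])
  s  <- sqrt((var(x[t == 1]) + var(x[t == 0])) / 2)
  (m1 - m0) / s
}

set.seed(3)
x <- rnorm(200)
t <- rbinom(200, 1, 0.5)
w <- rep(1, 200)

smd(x, t, w)            # |SMD| < 0.1 is a common balance rule of thumb
smd(x + 0.5 * t, t, w)  # injecting a mean shift inflates the SMD
```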
When subjects have very high or low propensity scores:
# Check propensity score distribution
iptw_result <- calc_iptw_weights(
data = cachar_sample,
treatment = "areca_nut",
covariates = c("age", "sex", "residence")
)
# Identify subjects with extreme scores
extreme_low <- which(iptw_result$ps < 0.05)
extreme_high <- which(iptw_result$ps > 0.95)
if (length(extreme_low) > 0 || length(extreme_high) > 0) {
cat("Consider trimming sample to region of common support\n")
# Restrict to common support
common_support <- iptw_result$ps >= 0.05 & iptw_result$ps <= 0.95
data_restricted <- cachar_sample[common_support, ]
# Re-analyze with restricted sample
rd_restricted <- calc_risk_diff_iptw(
data = data_restricted,
outcome = "abnormal_screen",
treatment = "areca_nut",
covariates = c("age", "sex", "residence")
)
}
Testing different model specifications:
# Simple model
ps_simple <- calc_iptw_weights(
data = cachar_sample,
treatment = "areca_nut",
covariates = c("age", "sex")
)
# Complex model with interactions
ps_complex <- calc_iptw_weights(
data = cachar_sample,
treatment = "areca_nut",
covariates = c("age", "I(age^2)", "sex", "residence",
"smoking", "tobacco_chewing", "age:sex")
)
# Compare balance
check_iptw_assumptions(ps_simple, verbose = FALSE)
check_iptw_assumptions(ps_complex, verbose = FALSE)
IPTW assumes no unmeasured confounding. Sensitivity analysis can assess robustness:
# Simulate an unmeasured confounder
set.seed(123)
cachar_sample$unmeasured_confounder <- rbinom(nrow(cachar_sample), 1, 0.3)
# Compare results with and without the unmeasured confounder
rd_without_u <- calc_risk_diff_iptw(
data = cachar_sample,
outcome = "abnormal_screen",
treatment = "areca_nut",
covariates = c("age", "sex", "residence")
)
rd_with_u <- calc_risk_diff_iptw(
data = cachar_sample,
outcome = "abnormal_screen",
treatment = "areca_nut",
covariates = c("age", "sex", "residence", "unmeasured_confounder")
)
cat("Without unmeasured confounder:", scales::percent(rd_without_u$rd_iptw), "\n")
cat("With unmeasured confounder:", scales::percent(rd_with_u$rd_iptw), "\n")
cat("Difference:", scales::percent(abs(rd_without_u$rd_iptw - rd_with_u$rd_iptw)), "\n")
When reporting IPTW analyses, include:

- The propensity score model specification (covariates and link function)
- The estimand (ATE, ATT, or ATC) and why it matches the research question
- Weight diagnostics: distribution, any trimming rules, and the effective sample size
- Covariate balance before and after weighting (e.g., standardized differences)
- Sensitivity analyses addressing potential unmeasured confounding
IPTW provides a powerful framework for causal inference from observational data. The riskdiff package makes these methods accessible while providing essential diagnostics and visualizations. Remember that causal inference requires careful thought about study design, confounders, and assumptions; the methods are only as good as these foundational elements.
For more advanced applications, consider methods like:

- Marginal structural models for time-varying treatments
- Doubly robust estimation combining IPTW with outcome modeling
- Machine learning approaches for propensity score estimation
## References

Austin, P. C. (2011). An introduction to propensity score methods for reducing the effects of confounding in observational studies. Multivariate Behavioral Research, 46(3), 399-424.
Hernán, M. A., & Robins, J. M. (2020). Causal inference: What if. Boca Raton: Chapman & Hall/CRC.
Robins, J. M., Hernán, M. A., & Brumback, B. (2000). Marginal structural models and causal inference in epidemiology. Epidemiology, 11(5), 550-560.