6 Structural Equation Modeling
You can download the R code used in this lab by right-clicking this link and selecting “Save Link As…” in the drop-down menu: sem.R
6.1 Loading R packages
Load the required packages for this lab into your R environment.
library(rio)
library(ggplot2)
library(tidyr)
library(lavaan)
library(semTools)
library(modsem)
6.2 Loading Data
Load the data into your environment. For this lab we will use a dataset based on N = 441 children whose caregivers completed a survey about family environment and child behavior. You can download the data by right-clicking this link and selecting “Save Link As…” in the drop-down menu: data/projectkids.csv. Make sure to save it in the folder you are using for this class.
The full dataset and more information about this project can be found here: https://www.ldbase.org/datasets/72ab9852-8ebc-4ba0-bb1f-5f1c347e2572.
kids <- read.csv("data/projectkids.csv")
The dataset includes item responses to 5 Reading Problem items from the Colorado Learning Disability Questionnaire (CLDQ; rated on a five-point scale from never to always), 9 Attention items from the Strengths and Weaknesses of ADHD Symptoms and Normal Behavior Scale (SWAN; rated on a seven-point scale from far below to far above average), and a composite mean score of the Homework Problems Checklist (based on 19 items, rated on a four-point scale from never to very often).
Here are some example items from each scale:
CLDQ Reading Problems (item stem: Decide how well each statement describes your child):
Q1: Does/did your child have difficulty with spelling?
Q5: Does/did your child read below grade or expectancy level?
Q6: Does/did your child require extra help in school because of problems in reading and spelling?
SWAN Attention Items (item stem: How does your child compare to other children of the same age?):
Q1: Gives close attention to detail and avoids careless mistakes
Q2: Sustains attention on tasks or play activities
Q3: Listens when spoken to directly
Q5: Organizes tasks and activities
HPC Items (item stem: Circle the best answer that best describes your child’s homework habits):
Q1: Fails to bring home assignments and materials
Q4: Refuses to do homework assignment
Q6: Must be reminded to sit down and start homework
Q12: Easily frustrated by homework assignments
The hypothesis we want to test is whether there is an indirect effect of Attention, via Homework problems (or lack thereof), on Reading Problems. In other words, do kids with higher levels of attention experience fewer homework problems, which in turn is associated with fewer reading problems?
Note: This data file includes additional variables that we will use in a future R Lab.
6.3 Data Exploration
Before analyzing the data, we can look at the distribution of the variables to see if they follow a normal distribution (one of the main assumptions of the ML estimator that lavaan uses by default) or if we can see skew in the distributions.
kids %>%
  pivot_longer(everything()) %>%
  ggplot(aes(x = value)) +
  geom_histogram() +
  facet_wrap(vars(name), scales = "free")
Do the histograms look “normal” enough? What can we do if there are issues with normality?
Package semTools includes a set of functions to evaluate the skew and kurtosis of observed variables:
# Univariate skew and kurtosis
apply(kids, 2, skew)
cldq_1 cldq_2 cldq_4 cldq_5 cldq_6 cldq_18
skew (g1) 0.9726340 2.7834505 1.6533033 1.6591248 1.6700060 1.0359613
se 0.1167748 0.1166424 0.1166424 0.1167748 0.1166424 0.1169078
z 8.3291402 23.8631175 14.1741228 14.2078958 14.3173193 8.8613554
p 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000
cldq_19 cldq_20 chaos1 chaos2 chaos3 chaos4
skew (g1) 1.1701202 1.6292702 1.385669 1.065559 1.2892938 1.436448
se 0.1169078 0.1169078 0.117175 0.117175 0.1175793 0.117175
z 10.0089172 13.9363727 11.825635 9.093743 10.9653157 12.258996
p 0.0000000 0.0000000 0.000000 0.000000 0.0000000 0.000000
chaos5 chaos6 swan_1 swan_2 swan_3
skew (g1) -7.606825e-01 9.620434e-01 -0.0384075 -0.28474452 -0.1664108
se 1.170411e-01 1.175793e-01 0.1235604 0.12403473 0.1235604
z -6.499274e+00 8.182084e+00 -0.3108398 -2.29568375 -1.3467971
p 8.070833e-11 2.220446e-16 0.7559224 0.02169397 0.1780456
swan_4 swan_5 swan_6 swan_7 swan_8
skew (g1) -0.08387429 -0.09631494 -0.25689087 -0.1524466 -0.02334995
se 0.12371791 0.12324720 0.12356041 0.1237179 0.12356041
z -0.67794785 -0.78147772 -2.07907099 -1.2322114 -0.18897595
p 0.49780476 0.43452158 0.03761083 0.2178701 0.85011167
swan_9 hpc_mean
skew (g1) -0.07898916 1.4079328
se 0.12340351 0.1166424
z -0.64008847 12.0705092
p 0.52211509 0.0000000
apply(kids, 2, kurtosis)
cldq_1 cldq_2 cldq_4 cldq_5 cldq_6
Excess Kur (g2) 0.2298165 7.5171117 1.660876e+00 1.464507e+00 1.488210e+00
se 0.2335497 0.2332847 2.332847e-01 2.335497e-01 2.332847e-01
z 0.9840156 32.2229039 7.119522e+00 6.270642e+00 6.379370e+00
p 0.3251078 0.0000000 1.082912e-12 3.595619e-10 1.778178e-10
cldq_18 cldq_19 cldq_20 chaos1 chaos2
Excess Kur (g2) -0.2886126 0.638098194 1.908537e+00 1.442886e+00 -0.08115418
se 0.2338155 0.233815534 2.338155e-01 2.343500e-01 0.23434997
z -1.2343601 2.729066730 8.162577e+00 6.156973e+00 -0.34629482
p 0.2170688 0.006351385 2.220446e-16 7.414842e-10 0.72912116
chaos3 chaos4 chaos5 chaos6 swan_1
Excess Kur (g2) 0.44657989 1.996702 -0.40430533 0.55090353 -0.51524242
se 0.23515854 0.234350 0.23408229 0.23515854 0.24712083
z 1.89905880 8.520172 -1.72719311 2.34268988 -2.08498181
p 0.05755675 0.000000 0.08413299 0.01914529 0.03707095
swan_2 swan_3 swan_4 swan_5 swan_6
Excess Kur (g2) -0.2199770 -0.45229948 -0.678691670 -0.62596197 -0.3902963
se 0.2480695 0.24712083 0.247435830 0.24649441 0.2471208
z -0.8867558 -1.83027667 -2.742899730 -2.53945709 -1.5793745
p 0.3752104 0.06720858 0.006089928 0.01110247 0.1142502
swan_7 swan_8 swan_9 hpc_mean
Excess Kur (g2) -0.61695200 -0.3109368 -0.3559437 1.9502122
se 0.24743583 0.2471208 0.2468070 0.2332847
z -2.49338182 -1.2582379 -1.4421946 8.3597932
p 0.01265327 0.2083057 0.1492475 0.0000000
Which variables are significantly skewed? And which have significant issues with kurtosis?
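As a side note, semTools also includes Mardia's multivariate tests, if you want a single omnibus check of multivariate skew and kurtosis across all items (a quick sketch; because our data contain missing values, we pass complete cases only):
# Multivariate (Mardia's) skew and kurtosis on complete cases
mardiaSkew(na.omit(kids))
mardiaKurtosis(na.omit(kids))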
6.4 Measurement Model
In the first step of model estimation, we will specify a CFA with all measured constructs and covariances between all factors. For the HPC mean score, we can specify a single-indicator factor. To do so, we need some estimate of reliability. This checklist is a highly reliable scale with a previously found Cronbach’s alpha of .96 (to be honest, this may mean that many of the items have so much overlap that the measured construct is quite narrow in its operational definition). Next, we need to know the variance in the HPC mean score in our sample, and then use the appropriate formula to compute the residual variance estimate:
var(kids$hpc_mean)
[1] 0.3928782
# residual = (1 - .96) * 0.393
(1 - .96) * 0.393
[1] 0.01572
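If you need to do this for several single-indicator factors, a small helper keeps the formula residual = (1 - reliability) * var(indicator) in one place (a sketch; single_ind_resid is a hypothetical name, not part of the lab code):
# Residual variance for a single-indicator factor
single_ind_resid <- function(x, reliability) {
  (1 - reliability) * var(x, na.rm = TRUE)
}
single_ind_resid(kids$hpc_mean, .96)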
Next, we can specify and estimate the full measurement model. As some of the observed variables are skewed, we will use the mlr estimator. In addition, our data contain some missing values. For this lab, we will assume that these data are missing at random (MAR) and use full information maximum likelihood (fiml) to still use all available data.
When using mlr, we get additional versions of approximate fit indices. Use the robust version if available, otherwise use the scaled version.
big_cfa <- '
readprob =~ cldq_1 + cldq_2 + cldq_4 + cldq_5 + cldq_6
attention =~ swan_1 + swan_2 + swan_3 + swan_4 + swan_5 + swan_6 +
             swan_7 + swan_8 + swan_9
hwp =~ 1*hpc_mean
hpc_mean ~~ 0.01572*hpc_mean
'
fit_cfa <- cfa(big_cfa, kids,
               estimator = "mlr",
               missing = "fiml")
summary(fit_cfa, fit.measures = T, estimates = F)
lavaan 0.6-19 ended normally after 65 iterations
Estimator ML
Optimization method NLMINB
Number of model parameters 47
Number of observations 441
Number of missing patterns 13
Model Test User Model:
Standard Scaled
Test Statistic 294.259 214.704
Degrees of freedom 88 88
P-value (Chi-square) 0.000 0.000
Scaling correction factor 1.371
Yuan-Bentler correction (Mplus variant)
Model Test Baseline Model:
Test statistic 5812.658 3750.370
Degrees of freedom 105 105
P-value 0.000 0.000
Scaling correction factor 1.550
User Model versus Baseline Model:
Comparative Fit Index (CFI) 0.964 0.965
Tucker-Lewis Index (TLI) 0.957 0.959
Robust Comparative Fit Index (CFI) 0.968
Robust Tucker-Lewis Index (TLI) 0.962
Loglikelihood and Information Criteria:
Loglikelihood user model (H0) -7407.271 -7407.271
Scaling correction factor 1.639
for the MLR correction
Loglikelihood unrestricted model (H1) -7260.142 -7260.142
Scaling correction factor 1.464
for the MLR correction
Akaike (AIC) 14908.542 14908.542
Bayesian (BIC) 15100.727 15100.727
Sample-size adjusted Bayesian (SABIC) 14951.571 14951.571
Root Mean Square Error of Approximation:
RMSEA 0.073 0.057
90 Percent confidence interval - lower 0.064 0.049
90 Percent confidence interval - upper 0.082 0.065
P-value H_0: RMSEA <= 0.050 0.000 0.076
P-value H_0: RMSEA >= 0.080 0.106 0.000
Robust RMSEA 0.070
90 Percent confidence interval - lower 0.059
90 Percent confidence interval - upper 0.082
P-value H_0: Robust RMSEA <= 0.050 0.003
P-value H_0: Robust RMSEA >= 0.080 0.089
Standardized Root Mean Square Residual:
SRMR 0.034 0.034
What does the chi-square test of model fit tell us?
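If you only want the robust (or scaled) versions of the indices reported above, you can request them directly from fitMeasures() (assuming lavaan's standard fit-measure names):
# Pull only the robust fit indices from the fitted model
fitMeasures(fit_cfa, c("cfi.robust", "tli.robust", "rmsea.robust", "srmr"))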
We will also use local fit information to give us more insight into the fit of our measurement model.
residuals(fit_cfa, type = "cor.bollen")$cov
cldq_1 cldq_2 cldq_4 cldq_5 cldq_6 swan_1 swan_2 swan_3 swan_4 swan_5
cldq_1 0.000
cldq_2 0.078 0.000
cldq_4 0.034 0.028 0.000
cldq_5 -0.056 -0.024 -0.002 0.000
cldq_6 -0.020 -0.021 -0.010 0.023 0.000
swan_1 -0.095 0.021 -0.009 -0.056 -0.051 0.000
swan_2 -0.031 0.051 0.028 -0.009 -0.002 0.064 0.000
swan_3 0.031 0.100 0.059 0.011 0.005 0.013 0.065 0.000
swan_4 -0.017 0.078 0.061 0.001 -0.008 -0.026 0.001 0.050 0.000
swan_5 -0.054 0.061 0.065 0.005 0.004 -0.008 -0.035 -0.063 0.023 0.000
swan_6 -0.069 0.010 -0.018 -0.094 -0.067 0.008 0.008 -0.029 -0.014 0.012
swan_7 0.000 0.084 0.071 -0.004 -0.016 -0.029 -0.034 -0.031 -0.006 0.045
swan_8 -0.057 0.028 0.036 -0.029 -0.023 -0.017 0.011 -0.004 -0.035 0.002
swan_9 -0.039 0.015 0.047 -0.034 -0.037 0.001 -0.020 0.018 0.014 -0.022
hpc_mean 0.065 -0.021 -0.040 -0.003 0.025 -0.028 0.013 0.042 -0.002 0.001
swan_6 swan_7 swan_8 swan_9 hpc_mn
cldq_1
cldq_2
cldq_4
cldq_5
cldq_6
swan_1
swan_2
swan_3
swan_4
swan_5
swan_6 0.000
swan_7 0.004 0.000
swan_8 0.036 0.004 0.000
swan_9 -0.010 0.021 -0.002 0.000
hpc_mean 0.036 -0.009 -0.042 0.000 0.000
None of the correlation residuals are > |.10|, indicating that remaining misfit might be trivial. Note that many people (including you) may stop at this point and continue to specify the structural model. However, this is a lab, so I want to show you some strategies for mindful model adjustment.
The approximate fit indices (especially RMSEA) indicate that global fit is not (approximately) amazing. I will use the modification indices to see if there are any parameters that, when added to the model, would meaningfully improve model fit. To ensure that we don't get distracted by trivial options, we can set a somewhat large minimum.value by which the model Chi-square needs to improve for a parameter to be included in the output.
modindices(fit_cfa, sort. = TRUE, minimum.value = 15)
lhs op rhs mi epc sepc.lv sepc.all sepc.nox
145 swan_1 ~~ swan_2 29.330 0.169 0.169 0.311 0.311
176 swan_5 ~~ swan_7 25.010 0.167 0.167 0.303 0.303
163 swan_3 ~~ swan_5 22.381 -0.179 -0.179 -0.267 -0.267
124 cldq_5 ~~ cldq_6 21.551 0.121 0.121 0.428 0.428
87 cldq_1 ~~ cldq_5 17.822 -0.110 -0.110 -0.248 -0.248
162 swan_3 ~~ swan_4 17.421 0.135 0.135 0.242 0.242
154 swan_2 ~~ swan_3 17.308 0.144 0.144 0.231 0.231
The largest modification index is associated with the residual covariance between the first two items of the SWAN (measuring attention). Both of these items contain the word ‘attention’ (see above), whereas none of the other items on this subscale contain that word. Thus, it may be reasonable that parents respond very similarly to these items because they share more in common than they do with the other items on the subscale.
6.5 Improving the Measurement Model
After adding this residual covariance to the model, we estimate and evaluate the model again.
big_cfa2 <- '
readprob =~ cldq_1 + cldq_2 + cldq_4 + cldq_5 + cldq_6
attention =~ swan_1 + swan_2 + swan_3 + swan_4 + swan_5 + swan_6 +
             swan_7 + swan_8 + swan_9
hwp =~ 1*hpc_mean
hpc_mean ~~ 0.01572*hpc_mean
swan_1 ~~ swan_2
'
fit_cfa2 <- cfa(big_cfa2, kids,
                estimator = "mlr",
                missing = "fiml")
summary(fit_cfa2, fit.measures = T, estimates = F)
lavaan 0.6-19 ended normally after 73 iterations
Estimator ML
Optimization method NLMINB
Number of model parameters 48
Number of observations 441
Number of missing patterns 13
Model Test User Model:
Standard Scaled
Test Statistic 265.499 195.622
Degrees of freedom 87 87
P-value (Chi-square) 0.000 0.000
Scaling correction factor 1.357
Yuan-Bentler correction (Mplus variant)
Model Test Baseline Model:
Test statistic 5812.658 3750.370
Degrees of freedom 105 105
P-value 0.000 0.000
Scaling correction factor 1.550
User Model versus Baseline Model:
Comparative Fit Index (CFI) 0.969 0.970
Tucker-Lewis Index (TLI) 0.962 0.964
Robust Comparative Fit Index (CFI) 0.973
Robust Tucker-Lewis Index (TLI) 0.968
Loglikelihood and Information Criteria:
Loglikelihood user model (H0) -7392.891 -7392.891
Scaling correction factor 1.658
for the MLR correction
Loglikelihood unrestricted model (H1) -7260.142 -7260.142
Scaling correction factor 1.464
for the MLR correction
Akaike (AIC) 14881.782 14881.782
Bayesian (BIC) 15078.057 15078.057
Sample-size adjusted Bayesian (SABIC) 14925.727 14925.727
Root Mean Square Error of Approximation:
RMSEA 0.068 0.053
90 Percent confidence interval - lower 0.059 0.045
90 Percent confidence interval - upper 0.078 0.062
P-value H_0: RMSEA <= 0.050 0.001 0.259
P-value H_0: RMSEA >= 0.080 0.019 0.000
Robust RMSEA 0.065
90 Percent confidence interval - lower 0.053
90 Percent confidence interval - upper 0.077
P-value H_0: Robust RMSEA <= 0.050 0.023
P-value H_0: Robust RMSEA >= 0.080 0.019
Standardized Root Mean Square Residual:
SRMR 0.034 0.034
We can test if this model is a better fit to our data than the previous model (note that I used @nested here to extract only the Chi-squared difference table).
comp_12 <- compareFit(fit_cfa, fit_cfa2)
comp_12@nested
Scaled Chi-Squared Difference Test (method = "satorra.bentler.2001")
lavaan->unknown():
lavaan NOTE: The "Chisq" column contains standard test statistics, not the
robust test that should be reported per model. A robust difference test is
a function of two standard (not robust) statistics.
Df AIC BIC Chisq Chisq diff Df diff Pr(>Chisq)
fit_cfa2 87 14882 15078 265.50
fit_cfa 88 14908 15101 294.26 11.369 1 0.0007469 ***
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
What does the Chi-square difference test tell us?
Let’s see if there are any other large modification indices that are meaningfully larger than other modification indices:
modindices(fit_cfa2, sort. = TRUE, minimum.value = 15)
lhs op rhs mi epc sepc.lv sepc.all sepc.nox
163 swan_3 ~~ swan_5 23.062 -0.182 -0.182 -0.273 -0.273
125 cldq_5 ~~ cldq_6 21.523 0.121 0.121 0.428 0.428
176 swan_5 ~~ swan_7 20.546 0.150 0.150 0.281 0.281
154 swan_2 ~~ swan_3 18.952 0.146 0.146 0.225 0.225
162 swan_3 ~~ swan_4 18.763 0.141 0.141 0.254 0.254
88 cldq_1 ~~ cldq_5 17.806 -0.110 -0.110 -0.248 -0.248
There are still some modification indices > 15, but the first one does not necessarily make theoretical sense. The suggested modification is to add a residual covariance between SWAN items 3 and 5 (see above for content), but if we look at the epc column (which stands for expected parameter change), it expects that covariance to be negative. A negative residual covariance indicates that there may be an omitted common cause that affects each item in the opposite way. In other words, items 3 and 5 may have less in common than the measurement model predicted. While the content of items 3 and 5 is not extremely similar, they are still both hypothesized to measure attention on this well-validated scale. Thus, adding this residual covariance does not seem theoretically justified. It may just be a peculiarity of our sample.
The next set of modification indices have very similar values; thus, at least in statistical terms, there is no real reason to pick one over the other. CLDQ items 5 and 6 (see above) are both about reading problems that have emerged in the school setting (while other questions on this subscale are about reading problems more generally), so there may be theoretical justification to add this residual covariance. In addition, the epc for this modification indicates that the standardized residual correlation (in the sepc.all column) between these two items is .43, which can be considered a large correlation. Compared to the next largest modification index, there is more evidence that adding this residual covariance is theoretically justified.
6.6 Improving the Measurement Model (Round 2)
After adding this residual covariance to the model, we estimate and evaluate the model again.
big_cfa3 <- '
readprob =~ cldq_1 + cldq_2 + cldq_4 + cldq_5 + cldq_6
attention =~ swan_1 + swan_2 + swan_3 + swan_4 + swan_5 + swan_6 +
             swan_7 + swan_8 + swan_9
hwp =~ 1*hpc_mean
hpc_mean ~~ 0.01572*hpc_mean
swan_1 ~~ swan_2
cldq_5 ~~ cldq_6
'
fit_cfa3 <- cfa(big_cfa3, kids,
                estimator = "mlr",
                missing = "fiml")
summary(fit_cfa3, fit.measures = T, estimates = F)
lavaan 0.6-19 ended normally after 66 iterations
Estimator ML
Optimization method NLMINB
Number of model parameters 49
Number of observations 441
Number of missing patterns 13
Model Test User Model:
Standard Scaled
Test Statistic 246.103 184.173
Degrees of freedom 86 86
P-value (Chi-square) 0.000 0.000
Scaling correction factor 1.336
Yuan-Bentler correction (Mplus variant)
Model Test Baseline Model:
Test statistic 5812.658 3750.370
Degrees of freedom 105 105
P-value 0.000 0.000
Scaling correction factor 1.550
User Model versus Baseline Model:
Comparative Fit Index (CFI) 0.972 0.973
Tucker-Lewis Index (TLI) 0.966 0.967
Robust Comparative Fit Index (CFI) 0.976
Robust Tucker-Lewis Index (TLI) 0.971
Loglikelihood and Information Criteria:
Loglikelihood user model (H0) -7383.193 -7383.193
Scaling correction factor 1.688
for the MLR correction
Loglikelihood unrestricted model (H1) -7260.142 -7260.142
Scaling correction factor 1.464
for the MLR correction
Akaike (AIC) 14864.386 14864.386
Bayesian (BIC) 15064.749 15064.749
Sample-size adjusted Bayesian (SABIC) 14909.246 14909.246
Root Mean Square Error of Approximation:
RMSEA 0.065 0.051
90 Percent confidence interval - lower 0.056 0.042
90 Percent confidence interval - upper 0.075 0.060
P-value H_0: RMSEA <= 0.050 0.005 0.422
P-value H_0: RMSEA >= 0.080 0.005 0.000
Robust RMSEA 0.062
90 Percent confidence interval - lower 0.049
90 Percent confidence interval - upper 0.074
P-value H_0: Robust RMSEA <= 0.050 0.060
P-value H_0: Robust RMSEA >= 0.080 0.006
Standardized Root Mean Square Residual:
SRMR 0.036 0.036
We can test if this model is a better fit to our data than the previous model.
comp_23 <- compareFit(fit_cfa3, fit_cfa2)
comp_23@nested
Scaled Chi-Squared Difference Test (method = "satorra.bentler.2001")
lavaan->unknown():
lavaan NOTE: The "Chisq" column contains standard test statistics, not the
robust test that should be reported per model. A robust difference test is
a function of two standard (not robust) statistics.
Df AIC BIC Chisq Chisq diff Df diff Pr(>Chisq)
fit_cfa3 86 14864 15065 246.1
fit_cfa2 87 14882 15078 265.5 6.1408 1 0.01321 *
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
What does the Chi-square difference test tell us?
Let’s see if there are any other large modification indices that are meaningfully larger than other modification indices:
modindices(fit_cfa3, sort. = TRUE, minimum.value = 15)
lhs op rhs mi epc sepc.lv sepc.all sepc.nox
163 swan_3 ~~ swan_5 23.119 -0.182 -0.182 -0.274 -0.274
176 swan_5 ~~ swan_7 20.511 0.150 0.150 0.280 0.280
154 swan_2 ~~ swan_3 18.939 0.146 0.146 0.225 0.225
162 swan_3 ~~ swan_4 18.746 0.141 0.141 0.253 0.253
We might consider adding the residual covariance between SWAN item 5 and 7. Both items mention ‘activities’, but there are many other items on this subscale that contain the word ‘activities’, so there is nothing special about this pair of items. Any further modifications may be purely data-driven (i.e., exploratory).
In addition to model fit, we can also consider how well the factors explain variance among the observed items.
lavInspect(fit_cfa3, what = "rsquare")
hpc_mean cldq_1 cldq_2 cldq_4 cldq_5 cldq_6 swan_1 swan_2
0.960 0.524 0.567 0.829 0.706 0.775 0.715 0.717
swan_3 swan_4 swan_5 swan_6 swan_7 swan_8 swan_9
0.584 0.823 0.790 0.720 0.792 0.674 0.759
Are the R-squared values sufficiently high to support that our measurement model is accounting for enough common variance among the items?
Finally, we can examine the reliability of our factors using coefficient Omega (note that the output does not include the hwp factor, since that's a single-indicator factor, and you need multiple indicators to compute coefficient Omega):
compRelSEM(fit_cfa)
readprob attention
0.920 0.962
Overall, I’m satisfied with the measurement model component of the SEM. The two residual covariances will need to be replicated in new samples to confirm that there is some shared cause among them that is separate from their main latent factors.
Note that R-squared and reliability are not affected by residual covariances, but they would be affected by added cross-loadings.
6.7 Structural Model
Now that we have finalized the measurement model, we can move on to the structural model. Let’s specify and estimate the hypothesized structural model:
big_sem <- '
# Measurement Model
readprob =~ cldq_1 + cldq_2 + cldq_4 + cldq_5 + cldq_6
attention =~ swan_1 + swan_2 + swan_3 + swan_4 + swan_5 + swan_6 +
             swan_7 + swan_8 + swan_9
hwp =~ 1*hpc_mean
hpc_mean ~~ 0.01572*hpc_mean
swan_1 ~~ swan_2
cldq_5 ~~ cldq_6
# Structural Model
readprob ~ b*hwp
hwp ~ a*attention
# Indirect Effects
ind.att.hwp := a*b
'
fit_sem <- sem(big_sem, kids,
               estimator = "mlr",
               missing = "fiml")
summary(fit_sem, fit.measures = T, estimates = F)
lavaan 0.6-19 ended normally after 60 iterations
Estimator ML
Optimization method NLMINB
Number of model parameters 48
Number of observations 441
Number of missing patterns 13
Model Test User Model:
Standard Scaled
Test Statistic 249.799 187.394
Degrees of freedom 87 87
P-value (Chi-square) 0.000 0.000
Scaling correction factor 1.333
Yuan-Bentler correction (Mplus variant)
Model Test Baseline Model:
Test statistic 5812.658 3750.370
Degrees of freedom 105 105
P-value 0.000 0.000
Scaling correction factor 1.550
User Model versus Baseline Model:
Comparative Fit Index (CFI) 0.971 0.972
Tucker-Lewis Index (TLI) 0.966 0.967
Robust Comparative Fit Index (CFI) 0.976
Robust Tucker-Lewis Index (TLI) 0.971
Loglikelihood and Information Criteria:
Loglikelihood user model (H0) -7385.041 -7385.041
Scaling correction factor 1.702
for the MLR correction
Loglikelihood unrestricted model (H1) -7260.142 -7260.142
Scaling correction factor 1.464
for the MLR correction
Akaike (AIC) 14866.082 14866.082
Bayesian (BIC) 15062.356 15062.356
Sample-size adjusted Bayesian (SABIC) 14910.027 14910.027
Root Mean Square Error of Approximation:
RMSEA 0.065 0.051
90 Percent confidence interval - lower 0.056 0.042
90 Percent confidence interval - upper 0.075 0.060
P-value H_0: RMSEA <= 0.050 0.004 0.401
P-value H_0: RMSEA >= 0.080 0.005 0.000
Robust RMSEA 0.062
90 Percent confidence interval - lower 0.050
90 Percent confidence interval - upper 0.074
P-value H_0: Robust RMSEA <= 0.050 0.055
P-value H_0: Robust RMSEA >= 0.080 0.006
Standardized Root Mean Square Residual:
SRMR 0.046 0.046
We can compare this model to a just-identified structural model that includes the direct effect from Attention to Reading Problems:
big_sem2 <- '
# Measurement Model
readprob =~ cldq_1 + cldq_2 + cldq_4 + cldq_5 + cldq_6
attention =~ swan_1 + swan_2 + swan_3 + swan_4 + swan_5 + swan_6 +
             swan_7 + swan_8 + swan_9
hwp =~ 1*hpc_mean
hpc_mean ~~ 0.01572*hpc_mean
swan_1 ~~ swan_2
cldq_5 ~~ cldq_6
# Structural Model
readprob ~ b*hwp + c*attention
hwp ~ a*attention
# Indirect Effects
ind.att.hwp := a*b
tot.att := c + a*b
'
fit_sem2 <- sem(big_sem2, kids,
                estimator = "mlr",
                missing = "fiml")
comp_34 <- compareFit(fit_sem, fit_sem2)
comp_34@nested
Scaled Chi-Squared Difference Test (method = "satorra.bentler.2001")
lavaan->unknown():
lavaan NOTE: The "Chisq" column contains standard test statistics, not the
robust test that should be reported per model. A robust difference test is
a function of two standard (not robust) statistics.
Df AIC BIC Chisq Chisq diff Df diff Pr(>Chisq)
fit_sem2 86 14864 15065 246.1
fit_sem 87 14866 15062 249.8 3.5066 1 0.06112 .
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
What does the Chi-square difference test tell us?
6.8 Parameter Estimate Interpretation
Now that we’ve finalized our structural model, we can interpret the parameter estimates. We will re-estimate the final model, using bootstrapping to get confidence intervals for the indirect effect. Note that we cannot combine mlr
with bootstrapping, because the bootstrap method itself is a way of addressing non-normality. So we change the estimator back to ml
.
fit_semb <- sem(big_sem, kids,
                estimator = "ml",
                missing = "fiml",
                se = "bootstrap",
                bootstrap = 1000,
                iseed = 8789)
parameterEstimates(fit_semb, boot.ci.type = "bca.simple",
                   ci = TRUE, se = TRUE,
                   zstat = FALSE, pvalue = FALSE,
                   output = "text")
Latent Variables:
Estimate Std.Err ci.lower ci.upper
readprob =~
cldq_1 1.000 1.000 1.000
cldq_2 0.765 0.064 0.622 0.877
cldq_4 1.272 0.076 1.134 1.438
cldq_5 1.257 0.083 1.113 1.444
cldq_6 1.326 0.083 1.186 1.509
attention =~
swan_1 1.000 1.000 1.000
swan_2 0.940 0.035 0.865 1.001
swan_3 0.831 0.049 0.735 0.929
swan_4 1.092 0.043 1.015 1.183
swan_5 1.175 0.049 1.085 1.271
swan_6 1.032 0.048 0.940 1.123
swan_7 1.104 0.051 1.013 1.204
swan_8 1.020 0.051 0.937 1.139
swan_9 1.030 0.040 0.956 1.109
hwp =~
hpc_mean 1.000 1.000 1.000
Regressions:
Estimate Std.Err ci.lower ci.upper
readprob ~
hwp (b) 0.687 0.074 0.545 0.833
hwp ~
attention (a) -0.360 0.022 -0.403 -0.318
Covariances:
Estimate Std.Err ci.lower ci.upper
.swan_1 ~~
.swan_2 0.171 0.050 0.075 0.282
.cldq_5 ~~
.cldq_6 0.118 0.044 0.030 0.214
Intercepts:
Estimate Std.Err ci.lower ci.upper
.cldq_1 1.988 0.053 1.877 2.086
.cldq_2 1.327 0.038 1.249 1.401
.cldq_4 1.646 0.054 1.535 1.746
.cldq_5 1.663 0.059 1.545 1.784
.cldq_6 1.662 0.058 1.553 1.782
.swan_1 4.541 0.073 4.410 4.703
.swan_2 4.720 0.069 4.597 4.876
.swan_3 4.838 0.067 4.710 4.974
.swan_4 4.624 0.075 4.476 4.774
.swan_5 4.427 0.083 4.272 4.593
.swan_6 4.646 0.078 4.501 4.813
.swan_7 4.529 0.080 4.379 4.701
.swan_8 4.055 0.079 3.908 4.220
.swan_9 4.541 0.073 4.397 4.693
.hpc_mean 1.676 0.031 1.619 1.738
.readprob 0.000 0.000 0.000
attention 0.000 0.000 0.000
.hwp 0.000 0.000 0.000
Variances:
Estimate Std.Err ci.lower ci.upper
.hpc_mean 0.016 0.016 0.016
.cldq_1 0.570 0.058 0.460 0.690
.cldq_2 0.279 0.039 0.213 0.369
.cldq_4 0.207 0.047 0.127 0.321
.cldq_5 0.418 0.071 0.290 0.570
.cldq_6 0.324 0.044 0.247 0.434
.swan_1 0.623 0.072 0.497 0.790
.swan_2 0.546 0.062 0.439 0.687
.swan_3 0.769 0.080 0.628 0.944
.swan_4 0.400 0.041 0.327 0.482
.swan_5 0.573 0.070 0.460 0.751
.swan_6 0.648 0.102 0.490 0.916
.swan_7 0.499 0.093 0.357 0.732
.swan_8 0.784 0.082 0.645 0.986
.swan_9 0.525 0.058 0.424 0.649
.readprob 0.449 0.065 0.329 0.582
attention 1.561 0.141 1.294 1.857
.hwp 0.174 0.021 0.139 0.218
Defined Parameters:
Estimate Std.Err ci.lower ci.upper
ind.att.hwp -0.247 0.031 -0.310 -0.187
lavInspect(fit_semb, what = "rsquare")
hpc_mean cldq_1 cldq_2 cldq_4 cldq_5 cldq_6 swan_1 swan_2
0.960 0.524 0.568 0.831 0.704 0.773 0.715 0.717
swan_3 swan_4 swan_5 swan_6 swan_7 swan_8 swan_9 readprob
0.584 0.823 0.790 0.719 0.792 0.674 0.759 0.283
hwp
0.537
The direct effect of attention on homework problems is negative (b = -.360, 95% CI = [-.403, -.318]), indicating that kids with higher attention levels experience fewer homework problems. The direct effect of homework problems on reading problems is positive (b = .687, 95% CI = [.545, .833]), indicating that kids with more homework problems experience more reading problems. This means that the indirect effect of attention on reading problems through homework problems is negative (b = -.247, 95% CI = [-.310, -.187]), which indicates that the decrease in homework problems associated with a one-unit increase in attention is in turn associated with a decrease in reading problems. Overall, the model explains 54% of the variability in homework problems and 28% of the variability in reading problems.
In a paper, you could report all parameter values (including those from the measurement model) as unstandardized + 95% CI and standardized + SE in a table and/or figure.
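One way to assemble such a table (a sketch; the column selection and file name are just examples, not part of the lab code):
# Merge unstandardized estimates + bootstrap CIs with standardized estimates + SEs
unstd <- parameterEstimates(fit_semb, boot.ci.type = "bca.simple",
                            ci = TRUE)[, c("lhs", "op", "rhs",
                                           "est", "ci.lower", "ci.upper")]
std <- standardizedSolution(fit_semb)[, c("lhs", "op", "rhs", "est.std", "se")]
partable <- merge(unstd, std, by = c("lhs", "op", "rhs"))
export(partable, "sem_parameters.csv") # rio's export() writes the table to a file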
6.9 Interaction Effects with Latent Variables
Next, I will use these data to illustrate the process of estimating interaction effects between latent factors. Here, we will test the hypothesis that Attentiveness moderates the association between Reading Problems and Homework Problems (which acts as the outcome variable here!).
We can use the modsem package to help estimate these interaction effects. To include the interaction effect in the model specification, you do the following (note the structural model section):
big_sem3 <- '
# Measurement Model
readprob =~ cldq_1 + cldq_2 + cldq_4 + cldq_5 + cldq_6
attention =~ swan_1 + swan_2 + swan_3 + swan_4 + swan_5 + swan_6 +
             swan_7 + swan_8 + swan_9
hwp =~ 1*hpc_mean
hpc_mean ~~ 0.01572*hpc_mean
swan_1 ~~ swan_2
cldq_5 ~~ cldq_6
# Structural Model
hwp ~ readprob + attention + readprob:attention
'
There are different methods for estimating an interaction effect between latent variables. Three main options are: the product indicator approach (using double centering), latent moderated structural equations (LMS), and quasi-maximum likelihood estimation (QML). Of these, LMS and QML take more time to estimate and require complete data (but may be more accurate than the product indicator approach and have options for comparing model fit with and without the interaction effect). Here, I will demonstrate the product indicator approach (but example code is also included for the other two methods).
fit_sem_mod_pi <- modsem(model = big_sem3,
                         data = kids,
                         missing = "fiml",
                         method = "dblcent")
# verbose = TRUE prints model estimation progress in the console
# fit_sem_mod_lms <- modsem(model = big_sem3,
#                           data = kids,
#                           method = "lms", verbose = TRUE)
# fit_sem_mod_qml <- modsem(model = big_sem3,
#                           data = kids,
#                           method = "qml", verbose = TRUE)
Parameter Estimate Interpretation
With the product indicator approach, the modsem package helps compute all the product terms and write all of the lavaan model syntax. The output of this model is very extensive (lots of loadings, lots of residual variances, indicator intercepts, and a bunch of residual covariances which need to be fixed to 0). We will check that all factor loadings are significant and then focus on the regression estimates:
summary(fit_sem_mod_pi, std = T)
modsem (version 1.0.5, approach = dblcent):
lavaan 0.6-19 ended normally after 593 iterations
Estimator ML
Optimization method NLMINB
Number of model parameters 457
Number of observations 441
Number of missing patterns 13
Model Test User Model:
Test statistic 4555.221
Degrees of freedom 1433
P-value (Chi-square) 0.000
Parameter Estimates:
Standard errors Standard
Information Observed
Observed information based on Hessian
Latent Variables:
Estimate Std.Err z-value P(>|z|) Std.lv Std.all
readprob =~
cldq_1 1.000 0.780 0.713
cldq_2 0.774 0.051 15.278 0.000 0.603 0.750
cldq_4 1.280 0.070 18.225 0.000 0.998 0.903
cldq_5 1.295 0.078 16.622 0.000 1.009 0.850
cldq_6 1.367 0.078 17.506 0.000 1.066 0.893
attention =~
swan_1 1.000 1.251 0.846
swan_2 0.939 0.036 26.302 0.000 1.175 0.847
swan_3 0.830 0.044 18.766 0.000 1.039 0.764
swan_4 1.091 0.044 25.033 0.000 1.365 0.907
swan_5 1.173 0.049 24.116 0.000 1.468 0.889
swan_6 1.030 0.046 22.217 0.000 1.289 0.848
swan_7 1.103 0.046 24.104 0.000 1.379 0.890
swan_8 1.018 0.049 20.969 0.000 1.273 0.821
swan_9 1.028 0.044 23.315 0.000 1.287 0.871
hwp =~
hpc_mean 1.000 0.613 0.980
readprobattention =~
cldq_1swan_1 1.000 0.897 0.534
cldq_2swan_1 0.961 0.085 11.357 0.000 0.862 0.656
cldq_4swan_1 1.537 0.121 12.738 0.000 1.378 0.791
cldq_5swan_1 1.776 0.144 12.355 0.000 1.592 0.791
cldq_6swan_1 1.883 0.147 12.819 0.000 1.688 0.830
cldq_1swan_2 0.999 0.068 14.589 0.000 0.895 0.542
cldq_2swan_2 0.919 0.093 9.887 0.000 0.824 0.659
cldq_4swan_2 1.428 0.132 10.787 0.000 1.281 0.764
cldq_5swan_2 1.717 0.158 10.880 0.000 1.539 0.797
cldq_6swan_2 1.806 0.163 11.105 0.000 1.619 0.828
cldq_1swan_3 0.836 0.075 11.132 0.000 0.750 0.470
cldq_2swan_3 0.678 0.085 7.976 0.000 0.608 0.484
cldq_4swan_3 1.186 0.125 9.504 0.000 1.063 0.621
cldq_5swan_3 1.357 0.142 9.531 0.000 1.217 0.633
cldq_6swan_3 1.432 0.146 9.831 0.000 1.284 0.666
cldq_1swan_4 1.115 0.074 15.061 0.000 0.999 0.582
cldq_2swan_4 0.976 0.100 9.721 0.000 0.875 0.644
cldq_4swan_4 1.557 0.142 10.952 0.000 1.396 0.797
cldq_5swan_4 1.786 0.165 10.853 0.000 1.601 0.806
cldq_6swan_4 1.874 0.169 11.094 0.000 1.680 0.836
cldq_1swan_5 1.200 0.081 14.857 0.000 1.076 0.552
cldq_2swan_5 1.146 0.118 9.692 0.000 1.028 0.653
cldq_4swan_5 1.734 0.159 10.908 0.000 1.554 0.794
cldq_5swan_5 2.032 0.190 10.723 0.000 1.822 0.788
cldq_6swan_5 2.146 0.193 11.094 0.000 1.924 0.841
cldq_1swan_6 1.163 0.082 14.100 0.000 1.042 0.560
cldq_2swan_6 1.182 0.117 10.097 0.000 1.059 0.694
cldq_4swan_6 1.708 0.157 10.909 0.000 1.531 0.797
cldq_5swan_6 1.968 0.183 10.751 0.000 1.765 0.796
cldq_6swan_6 2.076 0.187 11.095 0.000 1.862 0.842
cldq_1swan_7 1.092 0.078 13.929 0.000 0.979 0.531
cldq_2swan_7 1.089 0.115 9.472 0.000 0.976 0.631
cldq_4swan_7 1.605 0.148 10.825 0.000 1.439 0.784
cldq_5swan_7 1.876 0.174 10.778 0.000 1.682 0.803
cldq_6swan_7 2.009 0.183 10.960 0.000 1.801 0.830
cldq_1swan_8 1.106 0.086 12.928 0.000 0.992 0.519
cldq_2swan_8 1.117 0.119 9.412 0.000 1.001 0.617
cldq_4swan_8 1.634 0.156 10.505 0.000 1.465 0.734
cldq_5swan_8 1.898 0.178 10.649 0.000 1.702 0.773
cldq_6swan_8 2.027 0.186 10.877 0.000 1.817 0.805
cldq_1swan_9 1.072 0.079 13.593 0.000 0.961 0.529
cldq_2swan_9 1.023 0.104 9.842 0.000 0.917 0.660
cldq_4swan_9 1.569 0.147 10.708 0.000 1.407 0.767
cldq_5swan_9 1.799 0.170 10.604 0.000 1.613 0.774
cldq_6swan_9 1.882 0.172 10.955 0.000 1.687 0.822
Regressions:
Estimate Std.Err z-value P(>|z|) Std.lv Std.all
hwp ~
readprob 0.108 0.037 2.913 0.004 0.137 0.137
attention -0.309 0.021 -14.716 0.000 -0.630 -0.630
readprobattntn -0.139 0.031 -4.430 0.000 -0.203 -0.203
Covariances:
Estimate Std.Err z-value P(>|z|) Std.lv Std.all
.swan_1 ~~
.swan_2 0.169 0.035 4.829 0.000 0.169 0.290
.cldq_5 ~~
.cldq_6 0.087 0.028 3.118 0.002 0.087 0.260
.cldq_1swan_1 ~~
.cldq_2swan_2 0.000 0.000 0.000
.cldq_2swan_3 0.000 0.000 0.000
.cldq_2swan_4 0.000 0.000 0.000
.cldq_2swan_5 0.000 0.000 0.000
.cldq_2swan_6 0.000 0.000 0.000
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_2 0.000 0.000 0.000
.cldq_4swan_3 0.000 0.000 0.000
.cldq_4swan_4 0.000 0.000 0.000
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_2 0.000 0.000 0.000
.cldq_5swan_3 0.000 0.000 0.000
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_2 0.000 0.000 0.000
.cldq_6swan_3 0.000 0.000 0.000
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_1swan_2 0.000 0.000 0.000
.cldq_1swan_2 ~~
.cldq_2swan_3 0.000 0.000 0.000
.cldq_2swan_4 0.000 0.000 0.000
.cldq_2swan_5 0.000 0.000 0.000
.cldq_2swan_6 0.000 0.000 0.000
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_1swan_2 0.000 0.000 0.000
.cldq_1swan_2 ~~
.cldq_4swan_3 0.000 0.000 0.000
.cldq_4swan_4 0.000 0.000 0.000
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_1swan_2 0.000 0.000 0.000
.cldq_1swan_2 ~~
.cldq_5swan_3 0.000 0.000 0.000
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_1swan_2 0.000 0.000 0.000
.cldq_1swan_2 ~~
.cldq_6swan_3 0.000 0.000 0.000
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_1swan_3 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_1swan_3 0.000 0.000 0.000
.cldq_1swan_3 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_2swan_5 0.000 0.000 0.000
.cldq_2swan_6 0.000 0.000 0.000
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_1swan_3 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_1swan_3 0.000 0.000 0.000
.cldq_1swan_3 ~~
.cldq_4swan_4 0.000 0.000 0.000
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_1swan_3 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_1swan_3 0.000 0.000 0.000
.cldq_1swan_3 ~~
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_1swan_3 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_1swan_3 0.000 0.000 0.000
.cldq_1swan_3 ~~
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_1swan_4 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_2swan_6 0.000 0.000 0.000
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_1swan_4 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_1swan_4 ~~
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_1swan_4 0.000 0.000 0.000
.cldq_1swan_4 ~~
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_2swan_4 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_1swan_5 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_1swan_5 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_1swan_5 ~~
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_1swan_5 0.000 0.000 0.000
.cldq_1swan_5 ~~
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_2swan_4 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_2swan_5 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_1swan_6 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_1swan_6 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_1swan_6 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_1swan_6 0.000 0.000 0.000
.cldq_1swan_6 ~~
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_2swan_4 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_2swan_5 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_2swan_6 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_1swan_7 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_4swan_6 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_1swan_7 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_1swan_7 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_1swan_7 0.000 0.000 0.000
.cldq_1swan_7 ~~
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_2swan_4 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_2swan_5 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_2swan_6 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_2swan_7 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_1swan_8 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_4swan_6 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_4swan_7 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_1swan_8 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_5swan_7 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_1swan_8 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_6swan_7 ~~
.cldq_1swan_8 0.000 0.000 0.000
.cldq_1swan_8 ~~
.cldq_6swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_2swan_4 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_2swan_5 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_2swan_6 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_2swan_7 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_2swan_8 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_4swan_6 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_4swan_7 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_4swan_8 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_5swan_7 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_5swan_8 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_6swan_7 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_6swan_8 ~~
.cldq_1swan_9 0.000 0.000 0.000
.cldq_2swan_1 ~~
.cldq_4swan_2 0.000 0.000 0.000
.cldq_4swan_3 0.000 0.000 0.000
.cldq_4swan_4 0.000 0.000 0.000
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_2 0.000 0.000 0.000
.cldq_5swan_3 0.000 0.000 0.000
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_2 0.000 0.000 0.000
.cldq_6swan_3 0.000 0.000 0.000
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_2swan_2 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_4swan_3 0.000 0.000 0.000
.cldq_4swan_4 0.000 0.000 0.000
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_2swan_2 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_5swan_3 0.000 0.000 0.000
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_2swan_2 0.000 0.000 0.000
.cldq_2swan_2 ~~
.cldq_6swan_3 0.000 0.000 0.000
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_2swan_3 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_2swan_3 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_4swan_4 0.000 0.000 0.000
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_2swan_3 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_2swan_3 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_2swan_3 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_2swan_3 0.000 0.000 0.000
.cldq_2swan_3 ~~
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_2swan_4 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_2swan_4 ~~
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_2swan_4 0.000 0.000 0.000
.cldq_2swan_4 ~~
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_2swan_5 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_2swan_5 ~~
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_2swan_5 0.000 0.000 0.000
.cldq_2swan_5 ~~
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_2swan_6 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_2swan_6 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_2swan_6 0.000 0.000 0.000
.cldq_2swan_6 ~~
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_4swan_6 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_7 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_7 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_2swan_7 0.000 0.000 0.000
.cldq_2swan_7 ~~
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_4swan_6 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_4swan_7 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_8 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_5swan_7 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_8 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_6swan_7 ~~
.cldq_2swan_8 0.000 0.000 0.000
.cldq_2swan_8 ~~
.cldq_6swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_6 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_7 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_8 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_5swan_7 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_5swan_8 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_6swan_7 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_6swan_8 ~~
.cldq_2swan_9 0.000 0.000 0.000
.cldq_4swan_1 ~~
.cldq_5swan_2 0.000 0.000 0.000
.cldq_5swan_3 0.000 0.000 0.000
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_2 0.000 0.000 0.000
.cldq_6swan_3 0.000 0.000 0.000
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_4swan_2 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_5swan_3 0.000 0.000 0.000
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_4swan_2 0.000 0.000 0.000
.cldq_4swan_2 ~~
.cldq_6swan_3 0.000 0.000 0.000
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_4swan_3 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_4swan_3 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_4swan_3 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_4swan_3 0.000 0.000 0.000
.cldq_4swan_3 ~~
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_4swan_4 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_4swan_4 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_4swan_4 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_4swan_4 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_4swan_4 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_4swan_4 0.000 0.000 0.000
.cldq_4swan_4 ~~
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_4swan_5 0.000 0.000 0.000
.cldq_4swan_5 ~~
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_6 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_4swan_6 0.000 0.000 0.000
.cldq_4swan_6 ~~
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_7 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_4swan_7 0.000 0.000 0.000
.cldq_4swan_7 ~~
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_5swan_7 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_8 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_6swan_7 ~~
.cldq_4swan_8 0.000 0.000 0.000
.cldq_4swan_8 ~~
.cldq_6swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_7 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_8 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_6swan_7 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_6swan_8 ~~
.cldq_4swan_9 0.000 0.000 0.000
.cldq_5swan_1 ~~
.cldq_6swan_2 0.000 0.000 0.000
.cldq_6swan_3 0.000 0.000 0.000
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_5swan_2 0.000 0.000 0.000
.cldq_5swan_2 ~~
.cldq_6swan_3 0.000 0.000 0.000
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_5swan_3 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_5swan_3 0.000 0.000 0.000
.cldq_5swan_3 ~~
.cldq_6swan_4 0.000 0.000 0.000
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_5swan_4 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_5swan_4 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_5swan_4 0.000 0.000 0.000
.cldq_5swan_4 ~~
.cldq_6swan_5 0.000 0.000 0.000
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_5swan_5 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_5swan_5 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_5swan_5 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_5swan_5 0.000 0.000 0.000
.cldq_5swan_5 ~~
.cldq_6swan_6 0.000 0.000 0.000
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_5swan_6 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_5swan_6 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_5swan_6 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_5swan_6 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_5swan_6 0.000 0.000 0.000
.cldq_5swan_6 ~~
.cldq_6swan_7 0.000 0.000 0.000
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_5swan_7 0.000 0.000 0.000
.cldq_5swan_7 ~~
.cldq_6swan_8 0.000 0.000 0.000
.cldq_6swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_6swan_7 ~~
.cldq_5swan_8 0.000 0.000 0.000
.cldq_5swan_8 ~~
.cldq_6swan_9 0.000 0.000 0.000
.cldq_6swan_1 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_2 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_3 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_4 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_5 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_6 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_7 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_6swan_8 ~~
.cldq_5swan_9 0.000 0.000 0.000
.cldq_1swan_1 ~~
.cldq_1swan_2 1.270 0.105 12.063 0.000 1.270 0.644
.cldq_1swan_3 1.031 0.090 11.522 0.000 1.031 0.516
.cldq_1swan_4 1.288 0.110 11.758 0.000 1.288 0.650
.cldq_1swan_5 1.538 0.129 11.914 0.000 1.538 0.667
.cldq_1swan_6 1.354 0.115 11.791 0.000 1.354 0.619
.cldq_1swan_7 1.418 0.121 11.769 0.000 1.418 0.639
.cldq_1swan_8 1.361 0.120 11.379 0.000 1.361 0.587
.cldq_1swan_9 1.353 0.116 11.678 0.000 1.353 0.617
.cldq_2swan_1 0.371 0.041 9.126 0.000 0.371 0.264
.cldq_4swan_1 0.506 0.053 9.592 0.000 0.506 0.334
.cldq_5swan_1 0.531 0.057 9.337 0.000 0.531 0.304
.cldq_6swan_1 0.577 0.058 9.862 0.000 0.577 0.358
.cldq_1swan_2 ~~
.cldq_1swan_3 1.045 0.090 11.648 0.000 1.045 0.535
.cldq_1swan_4 1.277 0.108 11.840 0.000 1.277 0.659
.cldq_1swan_5 1.447 0.124 11.666 0.000 1.447 0.641
.cldq_1swan_6 1.362 0.114 11.953 0.000 1.362 0.636
.cldq_1swan_7 1.321 0.115 11.440 0.000 1.321 0.608
.cldq_1swan_8 1.382 0.118 11.665 0.000 1.382 0.608
.cldq_1swan_9 1.382 0.115 12.006 0.000 1.382 0.644
.cldq_2swan_2 0.296 0.034 8.721 0.000 0.296 0.227
.cldq_4swan_2 0.532 0.052 10.178 0.000 0.532 0.355
.cldq_5swan_2 0.545 0.057 9.581 0.000 0.545 0.336
.cldq_6swan_2 0.596 0.060 9.980 0.000 0.596 0.391
.cldq_1swan_3 ~~
.cldq_1swan_4 1.131 0.095 11.918 0.000 1.131 0.576
.cldq_1swan_5 1.094 0.104 10.542 0.000 1.094 0.478
.cldq_1swan_6 1.073 0.096 11.186 0.000 1.073 0.494
.cldq_1swan_7 1.065 0.099 10.792 0.000 1.065 0.484
.cldq_1swan_8 1.046 0.099 10.577 0.000 1.046 0.455
.cldq_1swan_9 1.077 0.097 11.122 0.000 1.077 0.495
.cldq_2swan_3 0.688 0.063 10.997 0.000 0.688 0.445
.cldq_4swan_3 1.057 0.088 11.947 0.000 1.057 0.559
.cldq_5swan_3 1.038 0.092 11.286 0.000 1.038 0.495
.cldq_6swan_3 1.090 0.096 11.372 0.000 1.090 0.538
.cldq_1swan_4 ~~
.cldq_1swan_5 1.668 0.137 12.153 0.000 1.668 0.736
.cldq_1swan_6 1.479 0.122 12.124 0.000 1.479 0.688
.cldq_1swan_7 1.500 0.127 11.820 0.000 1.500 0.688
.cldq_1swan_8 1.412 0.125 11.337 0.000 1.412 0.619
.cldq_1swan_9 1.445 0.123 11.784 0.000 1.445 0.671
.cldq_2swan_4 0.228 0.026 8.628 0.000 0.228 0.157
.cldq_4swan_4 0.361 0.039 9.198 0.000 0.361 0.244
.cldq_5swan_4 0.372 0.040 9.235 0.000 0.372 0.227
.cldq_6swan_4 0.312 0.039 7.967 0.000 0.312 0.203
.cldq_1swan_5 ~~
.cldq_1swan_6 1.671 0.141 11.889 0.000 1.671 0.667
.cldq_1swan_7 1.975 0.156 12.662 0.000 1.975 0.777
.cldq_1swan_8 1.722 0.149 11.581 0.000 1.722 0.648
.cldq_1swan_9 1.690 0.142 11.876 0.000 1.690 0.673
.cldq_2swan_5 0.198 0.032 6.127 0.000 0.198 0.102
.cldq_4swan_5 0.489 0.050 9.809 0.000 0.489 0.253
.cldq_5swan_5 0.480 0.053 9.015 0.000 0.480 0.208
.cldq_6swan_5 0.484 0.052 9.336 0.000 0.484 0.241
.cldq_1swan_6 ~~
.cldq_1swan_7 1.587 0.134 11.883 0.000 1.587 0.658
.cldq_1swan_8 1.566 0.133 11.758 0.000 1.566 0.622
.cldq_1swan_9 1.546 0.129 12.002 0.000 1.546 0.649
.cldq_2swan_6 0.305 0.036 8.417 0.000 0.305 0.180
.cldq_4swan_6 0.623 0.059 10.537 0.000 0.623 0.349
.cldq_5swan_6 0.588 0.062 9.541 0.000 0.588 0.284
.cldq_6swan_6 0.612 0.061 9.983 0.000 0.612 0.333
.cldq_1swan_7 ~~
.cldq_1swan_8 1.665 0.142 11.700 0.000 1.665 0.651
.cldq_1swan_9 1.671 0.137 12.195 0.000 1.671 0.692
.cldq_2swan_7 0.415 0.042 9.829 0.000 0.415 0.221
.cldq_4swan_7 0.466 0.047 9.960 0.000 0.466 0.262
.cldq_5swan_7 0.448 0.047 9.486 0.000 0.448 0.229
.cldq_6swan_7 0.478 0.050 9.655 0.000 0.478 0.252
.cldq_1swan_8 ~~
.cldq_1swan_9 1.662 0.137 12.117 0.000 1.662 0.658
.cldq_2swan_8 0.592 0.059 10.040 0.000 0.592 0.284
.cldq_4swan_8 0.794 0.075 10.628 0.000 0.794 0.358
.cldq_5swan_8 0.734 0.075 9.750 0.000 0.734 0.321
.cldq_6swan_8 0.796 0.077 10.361 0.000 0.796 0.364
.cldq_1swan_9 ~~
.cldq_2swan_9 0.381 0.038 10.127 0.000 0.381 0.237
.cldq_4swan_9 0.616 0.058 10.548 0.000 0.616 0.339
.cldq_5swan_9 0.611 0.061 9.989 0.000 0.611 0.299
.cldq_6swan_9 0.563 0.058 9.694 0.000 0.563 0.311
.cldq_2swan_1 ~~
.cldq_2swan_2 0.537 0.049 10.883 0.000 0.537 0.577
.cldq_2swan_3 0.445 0.046 9.655 0.000 0.445 0.409
.cldq_2swan_4 0.640 0.057 11.301 0.000 0.640 0.622
.cldq_2swan_5 0.680 0.066 10.268 0.000 0.680 0.576
.cldq_2swan_6 0.638 0.060 10.558 0.000 0.638 0.586
.cldq_2swan_7 0.593 0.061 9.665 0.000 0.593 0.499
.cldq_2swan_8 0.626 0.061 10.178 0.000 0.626 0.495
.cldq_2swan_9 0.595 0.054 10.942 0.000 0.595 0.576
.cldq_4swan_1 0.368 0.041 9.047 0.000 0.368 0.349
.cldq_5swan_1 0.368 0.044 8.365 0.000 0.368 0.302
.cldq_6swan_1 0.426 0.045 9.379 0.000 0.426 0.379
.cldq_2swan_2 ~~
.cldq_2swan_3 0.532 0.047 11.199 0.000 0.532 0.514
.cldq_2swan_4 0.635 0.055 11.490 0.000 0.635 0.650
.cldq_2swan_5 0.627 0.063 10.008 0.000 0.627 0.559
.cldq_2swan_6 0.668 0.060 11.129 0.000 0.668 0.645
.cldq_2swan_7 0.520 0.058 9.032 0.000 0.520 0.460
.cldq_2swan_8 0.586 0.059 10.015 0.000 0.586 0.487
.cldq_2swan_9 0.557 0.052 10.697 0.000 0.557 0.567
.cldq_4swan_2 0.365 0.036 10.091 0.000 0.365 0.359
.cldq_5swan_2 0.366 0.039 9.290 0.000 0.366 0.333
.cldq_6swan_2 0.400 0.041 9.707 0.000 0.400 0.388
.cldq_2swan_3 ~~
.cldq_2swan_4 0.516 0.051 10.016 0.000 0.516 0.452
.cldq_2swan_5 0.397 0.058 6.867 0.000 0.397 0.304
.cldq_2swan_6 0.467 0.054 8.650 0.000 0.467 0.387
.cldq_2swan_7 0.376 0.055 6.842 0.000 0.376 0.285
.cldq_2swan_8 0.465 0.055 8.421 0.000 0.465 0.331
.cldq_2swan_9 0.471 0.049 9.538 0.000 0.471 0.412
.cldq_4swan_3 0.803 0.072 11.126 0.000 0.803 0.544
.cldq_5swan_3 0.814 0.075 10.823 0.000 0.814 0.498
.cldq_6swan_3 0.895 0.080 11.140 0.000 0.895 0.566
.cldq_2swan_4 ~~
.cldq_2swan_5 0.901 0.078 11.614 0.000 0.901 0.728
.cldq_2swan_6 0.819 0.070 11.669 0.000 0.819 0.718
.cldq_2swan_7 0.749 0.071 10.555 0.000 0.749 0.601
.cldq_2swan_8 0.799 0.071 11.214 0.000 0.799 0.603
.cldq_2swan_9 0.754 0.063 11.906 0.000 0.754 0.697
.cldq_4swan_4 0.309 0.030 10.291 0.000 0.309 0.281
.cldq_5swan_4 0.315 0.031 10.184 0.000 0.315 0.258
.cldq_6swan_4 0.302 0.031 9.712 0.000 0.302 0.264
.cldq_2swan_5 ~~
.cldq_2swan_6 0.976 0.085 11.466 0.000 0.976 0.745
.cldq_2swan_7 0.998 0.091 11.027 0.000 0.998 0.697
.cldq_2swan_8 0.987 0.088 11.203 0.000 0.987 0.649
.cldq_2swan_9 0.843 0.075 11.276 0.000 0.843 0.678
.cldq_4swan_5 0.288 0.036 8.049 0.000 0.288 0.203
.cldq_5swan_5 0.265 0.038 6.945 0.000 0.265 0.156
.cldq_6swan_5 0.312 0.039 7.979 0.000 0.312 0.211
.cldq_2swan_6 ~~
.cldq_2swan_7 0.839 0.079 10.646 0.000 0.839 0.635
.cldq_2swan_8 0.869 0.078 11.092 0.000 0.869 0.619
.cldq_2swan_9 0.776 0.068 11.336 0.000 0.776 0.677
.cldq_4swan_6 0.309 0.037 8.470 0.000 0.309 0.243
.cldq_5swan_6 0.260 0.038 6.930 0.000 0.260 0.176
.cldq_6swan_6 0.365 0.039 9.376 0.000 0.365 0.278
.cldq_2swan_7 ~~
.cldq_2swan_8 0.951 0.086 11.116 0.000 0.951 0.620
.cldq_2swan_9 0.779 0.071 10.943 0.000 0.779 0.623
.cldq_4swan_7 0.359 0.041 8.708 0.000 0.359 0.263
.cldq_5swan_7 0.342 0.041 8.293 0.000 0.342 0.228
.cldq_6swan_7 0.360 0.044 8.201 0.000 0.360 0.247
.cldq_2swan_8 ~~
.cldq_2swan_9 0.797 0.070 11.320 0.000 0.797 0.599
.cldq_4swan_8 0.752 0.065 11.584 0.000 0.752 0.435
.cldq_5swan_8 0.717 0.066 10.852 0.000 0.717 0.402
.cldq_6swan_8 0.722 0.066 10.973 0.000 0.722 0.422
.cldq_2swan_9 ~~
.cldq_4swan_9 0.430 0.041 10.568 0.000 0.430 0.350
.cldq_5swan_9 0.420 0.042 9.924 0.000 0.420 0.305
.cldq_6swan_9 0.441 0.042 10.516 0.000 0.441 0.361
.cldq_4swan_1 ~~
.cldq_4swan_2 0.413 0.043 9.635 0.000 0.413 0.358
.cldq_4swan_3 0.354 0.041 8.539 0.000 0.354 0.247
.cldq_4swan_4 0.436 0.051 8.492 0.000 0.436 0.387
.cldq_4swan_5 0.491 0.058 8.531 0.000 0.491 0.387
.cldq_4swan_6 0.417 0.052 8.062 0.000 0.417 0.338
.cldq_4swan_7 0.422 0.054 7.873 0.000 0.422 0.348
.cldq_4swan_8 0.459 0.055 8.366 0.000 0.459 0.318
.cldq_4swan_9 0.410 0.049 8.420 0.000 0.410 0.327
.cldq_5swan_1 0.750 0.064 11.630 0.000 0.750 0.571
.cldq_6swan_1 0.718 0.064 11.271 0.000 0.718 0.593
.cldq_4swan_2 ~~
.cldq_4swan_3 0.392 0.040 9.754 0.000 0.392 0.270
.cldq_4swan_4 0.461 0.049 9.484 0.000 0.461 0.403
.cldq_4swan_5 0.430 0.051 8.426 0.000 0.430 0.335
.cldq_4swan_6 0.439 0.049 9.038 0.000 0.439 0.350
.cldq_4swan_7 0.407 0.049 8.314 0.000 0.407 0.331
.cldq_4swan_8 0.431 0.050 8.579 0.000 0.431 0.294
.cldq_4swan_9 0.414 0.046 9.062 0.000 0.414 0.325
.cldq_5swan_2 0.756 0.066 11.424 0.000 0.756 0.599
.cldq_6swan_2 0.778 0.067 11.603 0.000 0.778 0.656
.cldq_4swan_3 ~~
.cldq_4swan_4 0.388 0.047 8.273 0.000 0.388 0.273
.cldq_4swan_5 0.344 0.049 7.004 0.000 0.344 0.215
.cldq_4swan_6 0.360 0.047 7.728 0.000 0.360 0.231
.cldq_4swan_7 0.316 0.047 6.737 0.000 0.316 0.206
.cldq_4swan_8 0.350 0.048 7.276 0.000 0.350 0.192
.cldq_4swan_9 0.328 0.044 7.514 0.000 0.328 0.207
.cldq_5swan_3 1.477 0.116 12.736 0.000 1.477 0.738
.cldq_6swan_3 1.424 0.116 12.273 0.000 1.424 0.736
.cldq_4swan_4 ~~
.cldq_4swan_5 0.591 0.068 8.682 0.000 0.591 0.470
.cldq_4swan_6 0.542 0.062 8.780 0.000 0.542 0.442
.cldq_4swan_7 0.562 0.065 8.685 0.000 0.562 0.466
.cldq_4swan_8 0.577 0.065 8.861 0.000 0.577 0.403
.cldq_4swan_9 0.542 0.059 9.161 0.000 0.542 0.435
.cldq_5swan_4 0.509 0.047 10.840 0.000 0.509 0.409
.cldq_6swan_4 0.504 0.047 10.708 0.000 0.504 0.433
.cldq_4swan_5 ~~
.cldq_4swan_6 0.588 0.068 8.594 0.000 0.588 0.426
.cldq_4swan_7 0.641 0.075 8.543 0.000 0.641 0.473
.cldq_4swan_8 0.731 0.077 9.496 0.000 0.731 0.453
.cldq_4swan_9 0.517 0.063 8.163 0.000 0.517 0.369
.cldq_5swan_5 0.671 0.063 10.732 0.000 0.671 0.397
.cldq_6swan_5 0.666 0.061 10.946 0.000 0.666 0.452
.cldq_4swan_6 ~~
.cldq_4swan_7 0.610 0.068 8.999 0.000 0.610 0.462
.cldq_4swan_8 0.636 0.068 9.313 0.000 0.636 0.405
.cldq_4swan_9 0.520 0.060 8.741 0.000 0.520 0.381
.cldq_5swan_6 0.731 0.066 11.126 0.000 0.731 0.469
.cldq_6swan_6 0.672 0.063 10.741 0.000 0.672 0.486
.cldq_4swan_7 ~~
.cldq_4swan_8 0.714 0.074 9.586 0.000 0.714 0.463
.cldq_4swan_9 0.530 0.062 8.481 0.000 0.530 0.395
.cldq_5swan_7 0.564 0.053 10.701 0.000 0.564 0.397
.cldq_6swan_7 0.590 0.055 10.786 0.000 0.590 0.428
.cldq_4swan_8 ~~
.cldq_4swan_9 0.532 0.062 8.523 0.000 0.532 0.333
.cldq_5swan_8 1.025 0.086 11.904 0.000 1.025 0.541
.cldq_6swan_8 1.025 0.086 11.880 0.000 1.025 0.565
.cldq_4swan_9 ~~
.cldq_5swan_9 0.857 0.072 11.947 0.000 0.857 0.551
.cldq_6swan_9 0.810 0.069 11.819 0.000 0.810 0.587
.cldq_5swan_1 ~~
.cldq_5swan_2 0.459 0.051 8.988 0.000 0.459 0.319
.cldq_5swan_3 0.536 0.054 9.876 0.000 0.536 0.292
.cldq_5swan_4 0.661 0.067 9.841 0.000 0.661 0.456
.cldq_5swan_5 0.759 0.080 9.513 0.000 0.759 0.434
.cldq_5swan_6 0.700 0.072 9.778 0.000 0.700 0.423
.cldq_5swan_7 0.683 0.071 9.615 0.000 0.683 0.444
.cldq_5swan_8 0.619 0.066 9.423 0.000 0.619 0.360
.cldq_5swan_9 0.616 0.063 9.799 0.000 0.616 0.379
.cldq_6swan_1 0.808 0.071 11.417 0.000 0.808 0.578
.cldq_5swan_2 ~~
.cldq_5swan_3 0.424 0.047 8.992 0.000 0.424 0.244
.cldq_5swan_4 0.445 0.055 8.068 0.000 0.445 0.324
.cldq_5swan_5 0.372 0.062 6.023 0.000 0.372 0.224
.cldq_5swan_6 0.429 0.058 7.448 0.000 0.429 0.273
.cldq_5swan_7 0.414 0.057 7.259 0.000 0.414 0.284
.cldq_5swan_8 0.512 0.057 8.957 0.000 0.512 0.314
.cldq_5swan_9 0.510 0.054 9.405 0.000 0.510 0.331
.cldq_6swan_2 0.856 0.076 11.267 0.000 0.856 0.669
.cldq_5swan_3 ~~
.cldq_5swan_4 0.574 0.060 9.616 0.000 0.574 0.328
.cldq_5swan_5 0.604 0.068 8.843 0.000 0.604 0.285
.cldq_5swan_6 0.548 0.062 8.797 0.000 0.548 0.274
.cldq_5swan_7 0.524 0.061 8.601 0.000 0.524 0.282
.cldq_5swan_8 0.456 0.057 8.056 0.000 0.456 0.219
.cldq_5swan_9 0.570 0.057 10.023 0.000 0.570 0.290
.cldq_6swan_3 1.632 0.129 12.695 0.000 1.632 0.761
.cldq_5swan_4 ~~
.cldq_5swan_5 0.927 0.094 9.880 0.000 0.927 0.554
.cldq_5swan_6 0.767 0.081 9.450 0.000 0.767 0.486
.cldq_5swan_7 0.741 0.082 9.056 0.000 0.741 0.504
.cldq_5swan_8 0.633 0.074 8.528 0.000 0.633 0.385
.cldq_5swan_9 0.697 0.072 9.653 0.000 0.697 0.449
.cldq_6swan_4 0.557 0.050 11.038 0.000 0.557 0.430
.cldq_5swan_5 ~~
.cldq_5swan_6 0.890 0.097 9.180 0.000 0.890 0.466
.cldq_5swan_7 0.963 0.102 9.463 0.000 0.963 0.542
.cldq_5swan_8 0.708 0.090 7.855 0.000 0.708 0.356
.cldq_5swan_9 0.739 0.084 8.806 0.000 0.739 0.393
.cldq_6swan_5 0.745 0.068 10.997 0.000 0.745 0.423
.cldq_5swan_6 ~~
.cldq_5swan_7 0.888 0.091 9.781 0.000 0.888 0.529
.cldq_5swan_8 0.696 0.081 8.637 0.000 0.696 0.371
.cldq_5swan_9 0.728 0.078 9.352 0.000 0.728 0.410
.cldq_6swan_6 0.759 0.069 10.992 0.000 0.759 0.473
.cldq_5swan_7 ~~
.cldq_5swan_8 0.762 0.085 8.943 0.000 0.762 0.436
.cldq_5swan_9 0.750 0.079 9.537 0.000 0.750 0.454
.cldq_6swan_7 0.619 0.057 10.790 0.000 0.619 0.409
.cldq_5swan_8 ~~
.cldq_5swan_9 0.692 0.073 9.493 0.000 0.692 0.375
.cldq_6swan_8 1.122 0.092 12.213 0.000 1.122 0.600
.cldq_5swan_9 ~~
.cldq_6swan_9 0.916 0.075 12.182 0.000 0.916 0.592
.cldq_6swan_1 ~~
.cldq_6swan_2 0.318 0.046 6.884 0.000 0.318 0.255
.cldq_6swan_3 0.304 0.048 6.275 0.000 0.304 0.186
.cldq_6swan_4 0.472 0.062 7.600 0.000 0.472 0.377
.cldq_6swan_5 0.502 0.069 7.272 0.000 0.502 0.357
.cldq_6swan_6 0.437 0.062 7.013 0.000 0.437 0.322
.cldq_6swan_7 0.417 0.066 6.312 0.000 0.417 0.303
.cldq_6swan_8 0.375 0.062 6.086 0.000 0.375 0.247
.cldq_6swan_9 0.377 0.056 6.735 0.000 0.377 0.283
.cldq_6swan_2 ~~
.cldq_6swan_3 0.257 0.042 6.104 0.000 0.257 0.163
.cldq_6swan_4 0.367 0.052 7.084 0.000 0.367 0.304
.cldq_6swan_5 0.346 0.055 6.287 0.000 0.346 0.255
.cldq_6swan_6 0.333 0.052 6.414 0.000 0.333 0.254
.cldq_6swan_7 0.289 0.053 5.411 0.000 0.289 0.218
.cldq_6swan_8 0.280 0.051 5.477 0.000 0.280 0.191
.cldq_6swan_9 0.261 0.046 5.650 0.000 0.261 0.203
.cldq_6swan_3 ~~
.cldq_6swan_4 0.344 0.054 6.385 0.000 0.344 0.217
.cldq_6swan_5 0.270 0.057 4.774 0.000 0.270 0.152
.cldq_6swan_6 0.328 0.055 6.008 0.000 0.328 0.191
.cldq_6swan_7 0.268 0.056 4.789 0.000 0.268 0.154
.cldq_6swan_8 0.271 0.053 5.117 0.000 0.271 0.141
.cldq_6swan_9 0.236 0.048 4.888 0.000 0.236 0.140
.cldq_6swan_4 ~~
.cldq_6swan_5 0.628 0.082 7.652 0.000 0.628 0.461
.cldq_6swan_6 0.543 0.072 7.500 0.000 0.543 0.413
.cldq_6swan_7 0.610 0.080 7.645 0.000 0.610 0.457
.cldq_6swan_8 0.538 0.074 7.309 0.000 0.538 0.364
.cldq_6swan_9 0.484 0.066 7.311 0.000 0.484 0.375
.cldq_6swan_5 ~~
.cldq_6swan_6 0.603 0.081 7.447 0.000 0.603 0.409
.cldq_6swan_7 0.724 0.095 7.648 0.000 0.724 0.483
.cldq_6swan_8 0.666 0.087 7.691 0.000 0.666 0.402
.cldq_6swan_9 0.543 0.074 7.306 0.000 0.543 0.375
.cldq_6swan_6 ~~
.cldq_6swan_7 0.589 0.081 7.225 0.000 0.589 0.407
.cldq_6swan_8 0.516 0.075 6.918 0.000 0.516 0.323
.cldq_6swan_9 0.520 0.069 7.566 0.000 0.520 0.372
.cldq_6swan_7 ~~
.cldq_6swan_8 0.687 0.087 7.893 0.000 0.687 0.423
.cldq_6swan_9 0.542 0.075 7.274 0.000 0.542 0.382
.cldq_6swan_8 ~~
.cldq_6swan_9 0.449 0.068 6.614 0.000 0.449 0.286
readprob ~~
attention -0.442 0.060 -7.371 0.000 -0.454 -0.454
readprobattntn -0.378 0.054 -6.975 0.000 -0.541 -0.541
attention ~~
readprobattntn 0.214 0.062 3.431 0.001 0.191 0.191
Intercepts:
Estimate Std.Err z-value P(>|z|) Std.lv Std.all
.cldq_1 1.988 0.052 38.131 0.000 1.988 1.817
.cldq_2 1.327 0.038 34.654 0.000 1.327 1.650
.cldq_4 1.646 0.053 31.277 0.000 1.646 1.489
.cldq_5 1.662 0.057 29.407 0.000 1.662 1.401
.cldq_6 1.662 0.057 29.235 0.000 1.662 1.392
.swan_1 4.543 0.073 62.258 0.000 4.543 3.074
.swan_2 4.722 0.069 68.838 0.000 4.722 3.403
.swan_3 4.840 0.067 71.835 0.000 4.840 3.561
.swan_4 4.626 0.074 62.469 0.000 4.626 3.075
.swan_5 4.430 0.081 54.472 0.000 4.430 2.682
.swan_6 4.648 0.075 61.949 0.000 4.648 3.059
.swan_7 4.531 0.076 59.326 0.000 4.531 2.924
.swan_8 4.057 0.077 52.896 0.000 4.057 2.615
.swan_9 4.543 0.073 62.421 0.000 4.543 3.077
.hpc_mean 1.676 0.030 56.204 0.000 1.676 2.676
.cldq_1swan_1 0.015 0.084 0.180 0.857 0.015 0.009
.cldq_2swan_1 0.013 0.066 0.191 0.849 0.013 0.010
.cldq_4swan_1 0.023 0.087 0.267 0.790 0.023 0.013
.cldq_5swan_1 0.024 0.100 0.242 0.809 0.024 0.012
.cldq_6swan_1 0.029 0.101 0.285 0.775 0.029 0.014
.cldq_1swan_2 0.022 0.083 0.269 0.788 0.022 0.013
.cldq_2swan_2 0.013 0.063 0.213 0.831 0.013 0.011
.cldq_4swan_2 0.025 0.084 0.299 0.765 0.025 0.015
.cldq_5swan_2 0.029 0.096 0.304 0.761 0.029 0.015
.cldq_6swan_2 0.034 0.098 0.348 0.728 0.034 0.017
.cldq_1swan_3 0.016 0.080 0.195 0.846 0.016 0.010
.cldq_2swan_3 0.011 0.063 0.171 0.864 0.011 0.009
.cldq_4swan_3 0.020 0.086 0.233 0.815 0.020 0.012
.cldq_5swan_3 0.023 0.096 0.237 0.813 0.023 0.012
.cldq_6swan_3 0.024 0.097 0.247 0.805 0.024 0.012
.cldq_1swan_4 0.026 0.086 0.301 0.763 0.026 0.015
.cldq_2swan_4 0.018 0.068 0.270 0.787 0.018 0.014
.cldq_4swan_4 0.031 0.087 0.352 0.725 0.031 0.018
.cldq_5swan_4 0.037 0.099 0.374 0.708 0.037 0.019
.cldq_6swan_4 0.037 0.100 0.368 0.713 0.037 0.018
.cldq_1swan_5 0.017 0.098 0.176 0.860 0.017 0.009
.cldq_2swan_5 0.017 0.079 0.215 0.830 0.017 0.011
.cldq_4swan_5 0.026 0.098 0.262 0.793 0.026 0.013
.cldq_5swan_5 0.031 0.115 0.273 0.785 0.031 0.014
.cldq_6swan_5 0.032 0.114 0.278 0.781 0.032 0.014
.cldq_1swan_6 0.008 0.093 0.082 0.935 0.008 0.004
.cldq_2swan_6 0.002 0.076 0.025 0.980 0.002 0.001
.cldq_4swan_6 0.011 0.096 0.119 0.906 0.011 0.006
.cldq_5swan_6 0.019 0.111 0.168 0.867 0.019 0.008
.cldq_6swan_6 0.018 0.110 0.166 0.868 0.018 0.008
.cldq_1swan_7 0.023 0.093 0.247 0.805 0.023 0.012
.cldq_2swan_7 0.017 0.078 0.216 0.829 0.017 0.011
.cldq_4swan_7 0.025 0.092 0.268 0.789 0.025 0.013
.cldq_5swan_7 0.030 0.105 0.283 0.777 0.030 0.014
.cldq_6swan_7 0.030 0.108 0.282 0.778 0.030 0.014
.cldq_1swan_8 0.025 0.096 0.258 0.796 0.025 0.013
.cldq_2swan_8 0.017 0.081 0.215 0.830 0.017 0.011
.cldq_4swan_8 0.025 0.100 0.248 0.804 0.025 0.012
.cldq_5swan_8 0.027 0.110 0.248 0.804 0.027 0.012
.cldq_6swan_8 0.030 0.113 0.263 0.793 0.030 0.013
.cldq_1swan_9 0.017 0.091 0.182 0.856 0.017 0.009
.cldq_2swan_9 0.016 0.069 0.234 0.815 0.016 0.012
.cldq_4swan_9 0.023 0.092 0.249 0.804 0.023 0.012
.cldq_5swan_9 0.033 0.104 0.317 0.752 0.033 0.016
.cldq_6swan_9 0.029 0.102 0.285 0.775 0.029 0.014
Variances:
Estimate Std.Err z-value P(>|z|) Std.lv Std.all
.hpc_mean 0.016 0.016 0.040
.cldq_1 0.589 0.044 13.436 0.000 0.589 0.492
.cldq_2 0.282 0.021 13.175 0.000 0.282 0.437
.cldq_4 0.226 0.025 8.870 0.000 0.226 0.185
.cldq_5 0.390 0.037 10.557 0.000 0.390 0.277
.cldq_6 0.290 0.033 8.906 0.000 0.290 0.203
.swan_1 0.619 0.049 12.555 0.000 0.619 0.284
.swan_2 0.546 0.044 12.495 0.000 0.546 0.283
.swan_3 0.768 0.058 13.210 0.000 0.768 0.416
.swan_4 0.400 0.035 11.432 0.000 0.400 0.177
.swan_5 0.574 0.048 11.956 0.000 0.574 0.211
.swan_6 0.648 0.051 12.620 0.000 0.648 0.281
.swan_7 0.500 0.042 11.881 0.000 0.500 0.208
.swan_8 0.785 0.061 12.880 0.000 0.785 0.326
.swan_9 0.525 0.043 12.316 0.000 0.525 0.241
.cldq_1swan_1 2.014 0.132 15.263 0.000 2.014 0.715
.cldq_2swan_1 0.980 0.068 14.426 0.000 0.980 0.569
.cldq_4swan_1 1.136 0.077 14.763 0.000 1.136 0.374
.cldq_5swan_1 1.517 0.098 15.416 0.000 1.517 0.374
.cldq_6swan_1 1.289 0.095 13.629 0.000 1.289 0.311
.cldq_1swan_2 1.930 0.124 15.559 0.000 1.930 0.707
.cldq_2swan_2 0.886 0.059 14.932 0.000 0.886 0.566
.cldq_4swan_2 1.168 0.076 15.445 0.000 1.168 0.416
.cldq_5swan_2 1.362 0.093 14.680 0.000 1.362 0.365
.cldq_6swan_2 1.202 0.090 13.317 0.000 1.202 0.314
.cldq_1swan_3 1.981 0.119 16.670 0.000 1.981 0.779
.cldq_2swan_3 1.207 0.076 15.790 0.000 1.207 0.765
.cldq_4swan_3 1.806 0.118 15.265 0.000 1.806 0.615
.cldq_5swan_3 2.217 0.142 15.663 0.000 2.217 0.600
.cldq_6swan_3 2.073 0.145 14.335 0.000 2.073 0.557
.cldq_1swan_4 1.947 0.131 14.915 0.000 1.947 0.661
.cldq_2swan_4 1.077 0.073 14.762 0.000 1.077 0.585
.cldq_4swan_4 1.119 0.078 14.278 0.000 1.119 0.365
.cldq_5swan_4 1.383 0.096 14.378 0.000 1.383 0.350
.cldq_6swan_4 1.215 0.093 12.999 0.000 1.215 0.301
.cldq_1swan_5 2.642 0.177 14.904 0.000 2.642 0.695
.cldq_2swan_5 1.420 0.103 13.803 0.000 1.420 0.574
.cldq_4swan_5 1.416 0.103 13.814 0.000 1.416 0.370
.cldq_5swan_5 2.022 0.141 14.296 0.000 2.022 0.379
.cldq_6swan_5 1.530 0.119 12.872 0.000 1.530 0.292
.cldq_1swan_6 2.376 0.152 15.624 0.000 2.376 0.686
.cldq_2swan_6 1.209 0.086 14.107 0.000 1.209 0.519
.cldq_4swan_6 1.345 0.093 14.478 0.000 1.345 0.364
.cldq_5swan_6 1.805 0.122 14.813 0.000 1.805 0.367
.cldq_6swan_6 1.424 0.109 13.066 0.000 1.424 0.291
.cldq_1swan_7 2.445 0.161 15.198 0.000 2.445 0.718
.cldq_2swan_7 1.443 0.098 14.697 0.000 1.443 0.602
.cldq_4swan_7 1.297 0.094 13.817 0.000 1.297 0.385
.cldq_5swan_7 1.562 0.112 13.955 0.000 1.562 0.356
.cldq_6swan_7 1.467 0.114 12.928 0.000 1.467 0.311
.cldq_1swan_8 2.673 0.172 15.501 0.000 2.673 0.731
.cldq_2swan_8 1.631 0.103 15.828 0.000 1.631 0.619
.cldq_4swan_8 1.836 0.118 15.528 0.000 1.836 0.461
.cldq_5swan_8 1.953 0.128 15.226 0.000 1.953 0.403
.cldq_6swan_8 1.792 0.125 14.304 0.000 1.792 0.352
.cldq_1swan_9 2.385 0.153 15.618 0.000 2.385 0.721
.cldq_2swan_9 1.086 0.071 15.228 0.000 1.086 0.564
.cldq_4swan_9 1.388 0.092 15.053 0.000 1.388 0.412
.cldq_5swan_9 1.747 0.110 15.892 0.000 1.747 0.402
.cldq_6swan_9 1.372 0.099 13.845 0.000 1.372 0.325
readprob 0.608 0.073 8.372 0.000 1.000 1.000
attention 1.565 0.147 10.618 0.000 1.000 1.000
.hwp 0.145 0.012 12.428 0.000 0.386 0.386
readprobattntn 0.804 0.144 5.569 0.000 1.000 1.000
The regression estimates show that Reading Problems are positively associated with Homework Problems, whereas Attention is negatively associated with Homework Problems. In addition, the interaction effect is significant and negative: each one-unit increase in Attention reduces the association between Reading Problems and Homework Problems by about \(0.14\). We can understand the interaction effect better by plotting the Johnson-Neyman regions (remember that the x-axis is mean-centered):
plot_jn(x = "readprob", z = "attention", y = "hwp", model = fit_sem_mod_pi, max_z = 4)
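In a Johnson-Neyman plot, the y-axis shows the simple slope of the outcome (hwp) on the focal predictor (readprob) at each level of the moderator (attention), together with its confidence band. Writing \(b_1\) for the main effect and \(b_3\) for the interaction, this simple slope is \(b_1 + b_3 \cdot \text{attention}\); the regions of significance are the ranges of attention where the confidence band excludes zero.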
This figure shows that the association between Reading Problems and Homework Problems is:
- significant and positive (i.e., reporting more reading problems is associated with more homework problems) for attention levels < 0.2;
- not significant for attention levels between 0.2 and 2.04;
- significant and negative (i.e., reporting more reading problems is associated with fewer homework problems) for attention levels > 2.04.
These findings (which are highly exploratory and not grounded in any theory!) may suggest that higher attention buffers against, or compensates for, the impact of reading problems on homework problems.
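To check the same logic numerically, here is a minimal sketch (not part of the lab's original code) that computes the model-implied simple slope of hwp on readprob at a few mean-centered attention values. The value of b1 below is a hypothetical placeholder; substitute the readprob coefficient from the model summary above. b3 is the interaction estimate of about \(-0.14\) reported earlier.

# Minimal sketch: model-implied simple slope of hwp on readprob
# at selected (mean-centered) levels of attention.
# b1 is a placeholder; replace it with the readprob coefficient
# from the model summary. b3 is the interaction estimate (~ -0.14).
b1 <- 0.15   # hypothetical main effect of readprob on hwp
b3 <- -0.14  # interaction effect (readprob x attention)

attention_vals <- c(-1, 0, 1, 2, 3)   # mean-centered moderator values
slopes <- b1 + b3 * attention_vals    # conditional (simple) slopes
data.frame(attention = attention_vals, slope = slopes)

# The point estimate of the slope crosses zero at attention = -b1/b3
# (about 1.07 with these placeholder values); the Johnson-Neyman bounds
# (0.2 and 2.04 above) bracket this point because they also account for
# the slope's sampling uncertainty.
-b1 / b3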
6.10 Summary
In this R lab, you were introduced to the steps involved in specifying, estimating, evaluating, comparing, and interpreting a full structural equation model. In the next R lab, you will learn about measurement invariance testing, a method that combines CFA with multiple-group comparisons.