ACDE model and estimability Why can’t we estimate (co)variances due to A, C, D and E simultaneously in a standard twin design?

Covariances: MZ
cov(y_i1, y_i2 | MZ) = cov(MZ) = σ²_A + σ²_D + σ²_C

Covariance: DZ
cov(y_i1, y_i2 | DZ) = cov(DZ) = ½σ²_A + ¼σ²_D + σ²_C

Functions of covariances
2 cov(DZ) − cov(MZ) = σ²_C − ½σ²_D
2 (cov(MZ) − cov(DZ)) = σ²_A + (3/2)σ²_D
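These two combinations can be checked numerically. A minimal sketch, using made-up component values (sA2, sD2, sC2 are hypothetical numbers, not estimates from any data set):

```python
# Numerical check of the two covariance combinations above.
# Component values are hypothetical, chosen only for illustration.
sA2, sD2, sC2 = 0.5, 0.2, 0.1

cov_mz = sA2 + sD2 + sC2               # cov(MZ) = A + D + C
cov_dz = 0.5 * sA2 + 0.25 * sD2 + sC2  # cov(DZ) = A/2 + D/4 + C

# 2*cov(DZ) - cov(MZ) should equal sC2 - 0.5*sD2
print(2 * cov_dz - cov_mz)    # → 0.1 - 0.1 = 0.0
# 2*(cov(MZ) - cov(DZ)) should equal sA2 + 1.5*sD2
print(2 * (cov_mz - cov_dz))  # → 0.5 + 0.3 = 0.8
```

Note that neither combination isolates σ²_A, σ²_C or σ²_D on its own; this is the root of the identification problem developed below.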

Linear model
y_ij = μ + b_i + w_ij
σ²_y = σ²_b + σ²_w
y, b and w are random variables
t = σ²_b / σ²_y
– intra-class correlation = fraction of total variance that is attributable to differences among pairs
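A minimal sketch of estimating t from pair data via the usual between/within mean squares (ANOVA estimator). The pairs below are hypothetical values invented for illustration:

```python
# Estimate the intraclass correlation t = sb2 / (sb2 + sw2) from pair data.
# The pair data are hypothetical.
pairs = [(1.2, 1.0), (0.4, 0.6), (-0.3, -0.1), (0.9, 1.1), (-1.0, -0.8)]

n = len(pairs)
grand_mean = sum(a + b for a, b in pairs) / (2 * n)

# Within-pair mean square: squared deviations of twins from their pair mean
ms_within = sum((a - (a + b) / 2) ** 2 + (b - (a + b) / 2) ** 2
                for a, b in pairs) / n
# Between-pair mean square: pair means around the grand mean (2 twins per pair)
ms_between = 2 * sum(((a + b) / 2 - grand_mean) ** 2 for a, b in pairs) / (n - 1)

sw2 = ms_within                      # estimate of sigma_w^2
sb2 = (ms_between - ms_within) / 2   # E[MSB] = sw2 + 2*sb2
t = sb2 / (sb2 + sw2)                # intraclass correlation
print(round(t, 3))                   # → 0.972 for these hypothetical pairs
```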

Data: “sufficient statistics” (= sums of squares / mean squares)
MZ
– variation between pairs (= covariance)
– variation within pairs (= residual)
DZ
– variation between pairs (covariance)
– variation within pairs (residual)
4 summary statistics, so why can’t we estimate all four underlying components?

Causal components

            Between pairs             Within pairs
MZ          σ²_A + σ²_D + σ²_C        σ²_E
DZ          ½σ²_A + ¼σ²_D + σ²_C      ½σ²_A + ¾σ²_D + σ²_E
Difference  ½σ²_A + ¾σ²_D             ½σ²_A + ¾σ²_D

Different combinations of values of σ²_A and σ²_D give the same observed difference between MZ and DZ (co)variances: confounding (dependency). Only 3 components can be estimated.
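The confounding can be shown directly. The sketch below (all component values hypothetical) constructs two different (σ²_A, σ²_D, σ²_C) settings that imply identical expected variance and MZ/DZ covariances:

```python
# Two different causal-component settings with identical observables.
# All numbers are hypothetical, chosen to satisfy the model equations.
def observables(sA2, sD2, sC2, sE2):
    var = sA2 + sD2 + sC2 + sE2
    cov_mz = sA2 + sD2 + sC2
    cov_dz = 0.5 * sA2 + 0.25 * sD2 + sC2
    return var, cov_mz, cov_dz

m1 = observables(sA2=0.6, sD2=0.0, sC2=0.2, sE2=0.2)  # no dominance
m2 = observables(sA2=0.3, sD2=0.2, sC2=0.3, sE2=0.2)  # dominance traded in

# Same (var, cov_mz, cov_dz) despite different components
print(all(abs(x - y) < 1e-9 for x, y in zip(m1, m2)))  # → True
```

Both settings keep σ²_A + σ²_D + σ²_C and ½σ²_A + ¼σ²_D + σ²_C fixed, so no amount of twin data of this kind can distinguish them.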

In terms of (co)variances

“Observed”   Expected
MZ var       σ²_A + σ²_D + σ²_C + σ²_E
MZ cov       σ²_A + σ²_D + σ²_C
DZ var       σ²_A + σ²_D + σ²_C + σ²_E
DZ cov       ½σ²_A + ¼σ²_D + σ²_C

The MZ and DZ variances have the same expectation, so they identify only σ²_E (as variance minus MZ covariance). That leaves the two covariance equations with three unknowns (σ²_A, σ²_D, σ²_C).

Assumption σ²_D = 0: the ACE model

     Between pairs     Within pairs
MZ   σ²_A + σ²_C       σ²_E
DZ   ½σ²_A + σ²_C      ½σ²_A + σ²_E

4 mean squares, 3 unknowns
– maximum likelihood estimation (e.g., Mx)
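Under the ACE assumption the three components even solve in closed form from the expected statistics. A sketch with hypothetical observed values (in practice the components are estimated by maximum likelihood, e.g. in Mx, as the slide notes):

```python
# Closed-form ACE solution from expected (co)variances, assuming sD2 = 0.
# The observed values below are hypothetical.
var, cov_mz, cov_dz = 1.0, 0.8, 0.5

sA2 = 2 * (cov_mz - cov_dz)  # from cov(MZ) - cov(DZ) = A/2
sC2 = 2 * cov_dz - cov_mz    # from 2*cov(DZ) - cov(MZ) = C
sE2 = var - cov_mz           # from var - cov(MZ) = E

print(sA2, sC2, sE2)  # approximately 0.6, 0.2, 0.2
```

With four mean squares and three unknowns the model is over-identified, which is why a fitting criterion such as maximum likelihood is used rather than equation solving.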