# Moving further: Categorical count data

- Word counts
- Speech error counts
- Metaphor counts
- Active construction counts

Hissing Koreans Winter & Grawunder (2012)

No. of Cases Bentz & Winter (2013)

Poisson Model

The Poisson Distribution: named after Siméon Poisson. In 1898, Ladislaus Bortkiewicz used it to model deaths by horse kick in Prussian army corps: corps with few horses had few deaths and low variability; corps with many horses had more deaths and higher variability.

Poisson Regression = generalized linear model with Poisson error structure and log link function

The Poisson Model: `log(Y) = b0 + b1*X1 + b2*X2`, equivalently `Y = exp(b0 + b1*X1 + b2*X2)` (the log link applies to the response, not to the predictors)

In R (with lme4; generalized mixed models use glmer(), not lmer()): glmer(my_counts ~ my_predictors + (1|subject), mydataset, family="poisson")

Poisson model output is on the log scale; exponentiate the coefficients to get the predicted mean rate.
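As a minimal sketch (the coefficient values below are made up for illustration, not taken from any fitted model), exponentiating the linear predictor turns log-scale output into a predicted mean rate:

```r
# Hypothetical Poisson coefficients, on the log scale (invented values)
b0 <- 1.5   # intercept
b1 <- 0.4   # slope
# Predicted mean rate at predictor value X = 2:
# exponentiate the linear predictor to leave the log scale
rate <- exp(b0 + b1 * 2)
round(rate, 2)  # ≈ 9.97
```

The same back-transformation applies to the coefficients of any fitted Poisson model.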

Poisson Model

Moving further: Binary categorical data

- Focus vs. no-focus
- Yes vs. no
- Dative vs. genitive
- Correct vs. incorrect

Bentz & Winter (2013) Case yes vs. no ~ Percent L2 speakers

Logistic Regression = generalized linear model with binomial error structure and logistic link function

The Logistic Model: `p(Y) = logit^(-1)(b0 + b1*X1 + b2*X2)`

In R (with lme4; again glmer(), not lmer()): glmer(binary_variable ~ my_predictors + (1|subject), mydataset, family="binomial")

Probabilities and Odds

- Probability of an event: number of times the event occurs / total number of trials
- Odds of an event: number of times the event occurs / number of times it does not occur

Intuition about odds: N = 12 marbles, 2 of them blue. What are the odds that I pick a blue marble? Answer: 2/10 (blue vs. not blue), whereas the probability is 2/12.
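The marble example can be checked directly in R (assuming 2 blue and 10 non-blue marbles, which is what the answer 2/10 implies):

```r
blue  <- 2
other <- 10
p_blue    <- blue / (blue + other)  # probability: 2/12
odds_blue <- blue / other           # odds: 2/10
round(p_blue, 3)  # 0.167
odds_blue         # 0.2
```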

Log odds = logit function

Representative values:

| Probability | Odds  | Log odds (= "logits") |
|-------------|-------|-----------------------|
| 0.1         | 0.111 | -2.197                |
| 0.2         | 0.25  | -1.386                |
| 0.3         | 0.428 | -0.847                |
| 0.4         | 0.667 | -0.405                |
| 0.5         | 1     | 0                     |
| 0.6         | 1.5   | 0.405                 |
| 0.7         | 2.33  | 0.847                 |
| 0.8         | 4     | 1.386                 |
| 0.9         | 9     | 2.197                 |
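The table values can be reproduced in R, using odds = p / (1 − p) and log odds = log(odds):

```r
p        <- c(0.1, 0.2, 0.5, 0.8, 0.9)
odds     <- p / (1 - p)
log_odds <- log(odds)
round(odds, 3)      # 0.111 0.250 1.000 4.000 9.000
round(log_odds, 3)  # -2.197 -1.386  0.000  1.386  2.197
```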

Snijders & Bosker (1999: 212)

Bentz & Winter (2013)

Case yes vs. no ~ Percent L2 speakers

```
            Estimate Std. Error z value Pr(>|z|)
(Intercept)   1.4576     0.6831   2.134  0.03286
Percent.L2   -6.5728     2.0335  -3.232  0.00123
```

The intercept (1.4576) is the log odds when Percent.L2 = 0.

Bentz & Winter (2013)

Case yes vs. no ~ Percent L2 speakers (same model output as above): the slope (-6.5728) gives how much the log odds decrease for each unit increase in Percent.L2 (here coded as a proportion, so 0.3 = 30%).

Bentz & Winter (2013)

Case yes vs. no ~ Percent L2 speakers (same model output as above): the coefficients are logits ("log odds"). Exponentiate to get odds: exp(-6.5728) ≈ 0.0014. Transform by the inverse logit to get probabilities.

Odds:

- Odds > 1: numerator more likely (the event happens more often than not)
- Odds < 1: denominator more likely (the event is more likely not to happen)
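A quick sanity check in R, using two values from the table above: odds of 4 correspond to a probability of 0.8 (more likely to happen), and odds of 0.25 to a probability of 0.2 (more likely not to happen):

```r
# Convert odds back to a probability: p = odds / (1 + odds)
odds_to_p <- function(odds) odds / (1 + odds)
odds_to_p(4)     # 0.8  -> event more likely than not
odds_to_p(0.25)  # 0.2  -> event more likely not to happen
```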


Case yes vs. no ~ Percent L2 speakers: transforming the intercept into a probability gives logit.inv(1.4576) ≈ 0.81.

Bentz & Winter (2013): about 80% (makes sense)

Case yes vs. no ~ Percent L2 speakers: predicted probability when Percent.L2 = 0 is logit.inv(1.4576) ≈ 0.81; when Percent.L2 = 0.3 it is logit.inv(1.4576 + -6.5728*0.3) ≈ 0.37.

Bentz & Winter (2013)

logit function: logit(p) = log(p / (1 − p)); inverse logit function: logit^(-1)(x) = exp(x) / (1 + exp(x))

This is the famous "logistic function": logit^(-1)

Inverse logit function (transforms back to probabilities), defined in R as: `logit.inv = function(x){exp(x)/(1+exp(x))}`
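Using this function, the predictions from the Bentz & Winter model above can be reproduced:

```r
# Inverse logit: transforms log odds back to probabilities
logit.inv <- function(x) { exp(x) / (1 + exp(x)) }
# Intercept transformed to a probability:
round(logit.inv(1.4576), 2)                  # 0.81
# Predicted probability at Percent.L2 = 0.3:
round(logit.inv(1.4576 + -6.5728 * 0.3), 2)  # 0.37
```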

General Linear Model → Generalized Linear Model → Generalized Linear Mixed Model


Generalized Linear Model

= "Generalizing" the General Linear Model to cases that don't include continuous response variables (in particular, categorical ones)
= Consists of two things: (1) an error distribution, (2) a link function

- Logistic regression: binomial error distribution, logit link function
- Poisson regression: Poisson error distribution, log link function

In R:

lm(response ~ predictor)
glm(response ~ predictor, family="binomial")
glm(response ~ predictor, family="poisson")

Categorical data: dichotomous/binary → logistic regression; counts → Poisson regression

General structure:

- Linear Model: continuous ~ any type of variable
- Logistic Regression: dichotomous ~ any type of variable
- Poisson Regression: count ~ any type of variable
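A sketch with simulated data (all variable names and coefficient values are invented for illustration), showing the three model calls side by side:

```r
set.seed(1)
x <- rnorm(100)
y_cont  <- 2 + 3 * x + rnorm(100)                      # continuous response
y_count <- rpois(100, lambda = exp(0.5 + 1 * x))       # count response
y_bin   <- rbinom(100, 1, prob = plogis(0.5 + 1 * x))  # binary response

lm(y_cont ~ x)                        # linear model
glm(y_count ~ x, family = "poisson")  # Poisson regression
glm(y_bin ~ x, family = "binomial")   # logistic regression
```

Each call has the same `response ~ predictor` structure; only the family (error distribution plus link function) changes.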

For the generalized linear mixed model, you only have to specify the family (and use glmer() instead of lmer()):

lmer(…)
glmer(…, family="poisson")
glmer(…, family="binomial")

That’s it (for now)

