
1
Moving further: categorical count data
- Word counts
- Speech error counts
- Metaphor counts
- Active construction counts

2
Hissing Koreans: Winter & Grawunder (2012)

3
No. of cases: Bentz & Winter (2013)

6
Poisson Model

7
The Poisson Distribution
Siméon Poisson; 1898: Ladislaus Bortkiewicz analyzed deaths by horse kick in Prussian army corps:
- Army corps with few horses: few deaths, low variability
- Army corps with lots of horses: many deaths, high variability
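The mean-equals-variance property of the Poisson distribution can be illustrated with a quick simulation in R (a sketch; the rates 0.7 and 10 are invented stand-ins for the few-horse and many-horse corps):

```r
# Poisson counts: the variance tracks (and equals) the mean
set.seed(42)                        # arbitrary seed for reproducibility
few  <- rpois(10000, lambda = 0.7)  # hypothetical rate: corps with few horses
many <- rpois(10000, lambda = 10)   # hypothetical rate: corps with many horses
c(mean(few), var(few))    # both near 0.7: few deaths, low variability
c(mean(many), var(many))  # both near 10: many deaths, high variability
```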

9
Poisson Regression = generalized linear model with Poisson error structure and log link function

10
The Poisson Model: Y ~ Poisson(λ), with log(λ) = b0 + b1*X1 + b2*X2 (the log of the predicted mean rate, not of the predictors, equals the linear combination)

11
In R: glmer(my_counts ~ my_predictors + (1|subject), data = mydataset, family = "poisson") (current versions of lme4 fit generalized mixed models with glmer() rather than lmer())

12
Poisson model output is on the log scale: exponentiate the predicted log values to get the predicted mean rate.
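A minimal sketch of this back-transformation, using glm() on simulated counts (the coefficients 0.5 and 1.2 are invented for the example):

```r
set.seed(1)
x <- runif(200)
y <- rpois(200, lambda = exp(0.5 + 1.2 * x))  # true log-rate: 0.5 + 1.2*x
mod <- glm(y ~ x, family = "poisson")
coef(mod)        # coefficients on the log scale
exp(coef(mod))   # exponentiate to recover the predicted mean rate
```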

13
Poisson Model

14
Moving further: binary categorical data
- Focus vs. no-focus
- Yes vs. no
- Dative vs. genitive
- Correct vs. incorrect

15
Bentz & Winter (2013): Case yes vs. no ~ Percent L2 speakers

20
Logistic Regression = generalized linear model with binomial error structure and logistic link function

21
The Logistic Model: p(Y) = logit⁻¹(b0 + b1*X1 + b2*X2)

22
In R: glmer(binary_variable ~ my_predictors + (1|subject), data = mydataset, family = "binomial")
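Without random effects, the same kind of model can be fit with plain glm(); a sketch on simulated data (the coefficients 1.5 and -6 are invented, loosely echoing the Bentz & Winter example below):

```r
set.seed(2)
x <- runif(200)                        # e.g. proportion of L2 speakers
p <- plogis(1.5 - 6 * x)               # true probabilities via the inverse logit
y <- rbinom(200, size = 1, prob = p)   # binary outcome (yes = 1, no = 0)
mod <- glm(y ~ x, family = "binomial")
coef(mod)   # estimates on the log-odds (logit) scale
```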

23
Probabilities and Odds
Probability of an event: p = (# times the event occurs) / (# total outcomes)
Odds of an event: odds = (# times the event occurs) / (# times it does not occur) = p / (1 - p)

24
Intuition about odds: N = 12 marbles, 2 of them blue. What are the odds that I pick a blue marble? Answer: 2/10 (2 blue vs. 10 non-blue; the probability, by contrast, would be 2/12)
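In R, with 2 blue and 10 non-blue marbles (the split implied by the slide):

```r
blue  <- 2
other <- 10
prob <- blue / (blue + other)       # probability: 2/12
odds <- blue / other                # odds: 2/10
all.equal(odds, prob / (1 - prob))  # TRUE: odds = p / (1 - p)
```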

25
Log odds = logit function

26
Representative values

Probability | Odds  | Log odds (= "logits")
0.1         | 0.111 | -2.197
0.2         | 0.25  | -1.386
0.3         | 0.428 | -0.847
0.4         | 0.667 | -0.405
0.5         | 1     |  0
0.6         | 1.5   |  0.405
0.7         | 2.33  |  0.847
0.8         | 4     |  1.386
0.9         | 9     |  2.197
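The table can be reproduced in R; qlogis() is R's built-in logit function:

```r
p <- seq(0.1, 0.9, by = 0.1)
data.frame(probability = p,
           odds        = round(p / (1 - p), 3),
           log_odds    = round(qlogis(p), 3))  # qlogis(p) = log(p / (1 - p))
```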

27
Snijders & Bosker (1999: 212)

28
Bentz & Winter (2013)

29
Case yes vs. no ~ Percent L2 speakers

            Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)   1.4576      0.6831    2.134   0.03286
Percent.L2   -6.5728      2.0335   -3.232   0.00123

The intercept: log odds when Percent.L2 = 0

30
Bentz & Winter (2013)

31
Case yes vs. no ~ Percent L2 speakers

            Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)   1.4576      0.6831    2.134   0.03286
Percent.L2   -6.5728      2.0335   -3.232   0.00123

The slope: how much the log odds decrease for each unit increase in Percent.L2

32
Bentz & Winter (2013)

33
Case yes vs. no ~ Percent L2 speakers

            Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)   1.4576      0.6831    2.134   0.03286
Percent.L2   -6.5728      2.0335   -3.232   0.00123

Logits ("log odds") → exponentiate → odds
Logits ("log odds") → transform by inverse logit → probabilities

34
Case yes vs. no ~ Percent L2 speakers

            Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)   1.4576      0.6831    2.134   0.03286
Percent.L2   -6.5728      2.0335   -3.232   0.00123

Logits ("log odds") → exponentiate: exp(-6.5728) → odds
Logits ("log odds") → transform by inverse logit → probabilities

35
Case yes vs. no ~ Percent L2 speakers

            Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)   1.4576      0.6831    2.134   0.03286
Percent.L2   -6.5728      2.0335   -3.232   0.00123

Logits ("log odds") → exponentiate: exp(-6.5728) = 0.001397878 → odds
Logits ("log odds") → transform by inverse logit → probabilities

36
Odds > 1: numerator more likely (the event happens more often than not)
Odds < 1: denominator more likely (the event is more likely not to happen)

38
Case yes vs. no ~ Percent L2 speakers

            Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)   1.4576      0.6831    2.134   0.03286
Percent.L2   -6.5728      2.0335   -3.232   0.00123

Logits ("log odds") → transform by inverse logit: logit.inv(1.4576) = 0.81

39
Bentz & Winter (2013): about 80% (makes sense)

40
Case yes vs. no ~ Percent L2 speakers

            Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)   1.4576      0.6831    2.134   0.03286
Percent.L2   -6.5728      2.0335   -3.232   0.00123

Logits ("log odds"): logit.inv(1.4576) = 0.81
logit.inv(1.4576 + -6.5728*0.3) = 0.37
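These two predictions can be checked directly with the coefficients from the table (using the logit.inv() helper defined later in the slides):

```r
logit.inv <- function(x) { exp(x) / (1 + exp(x)) }  # inverse logit
b0 <- 1.4576    # intercept: log odds at Percent.L2 = 0
b1 <- -6.5728   # slope: change in log odds per unit of Percent.L2
round(logit.inv(b0), 2)             # 0.81
round(logit.inv(b0 + b1 * 0.3), 2)  # 0.37
```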

41
Bentz & Winter (2013)

42
logit(p) = log(p / (1 - p)) = the logit function
logit⁻¹(x) = exp(x) / (1 + exp(x)) = the inverse logit function

43
This is the famous "logistic function": logit⁻¹(x) = exp(x) / (1 + exp(x))

44
Inverse logit function (transforms back to probabilities):
logit.inv <- function(x) { exp(x) / (1 + exp(x)) }
(this defines the function in R)
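A quick sanity check on the helper (note that R also ships this function built in, as plogis()):

```r
logit.inv <- function(x) { exp(x) / (1 + exp(x)) }
logit.inv(0)                        # 0.5: log odds of 0 = a 50/50 chance
all.equal(logit.inv(2), plogis(2))  # TRUE: plogis() is the built-in inverse logit
```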

45
General Linear Model → Generalized Linear Model → Generalized Linear Mixed Model

48
Generalized Linear Model
= "generalizing" the General Linear Model to response variables that are not continuous (in particular, categorical ones)
= consists of two things: (1) an error distribution, (2) a link function

49
The two components for each model:
(1) Error distribution: logistic regression → binomial distribution; Poisson regression → Poisson distribution
(2) Link function: logistic regression → logit link function; Poisson regression → log link function

50
(1) Error distribution: logistic regression → binomial; Poisson regression → Poisson
(2) Link function: logistic regression → logit; Poisson regression → log

In R:
lm(response ~ predictor)
glm(response ~ predictor, family = "binomial")
glm(response ~ predictor, family = "poisson")

51
Categorical data:
- Dichotomous/binary → logistic regression
- Count → Poisson regression

52
General structure
Linear model:         continuous ~ any type of variable
Logistic regression:  dichotomous ~ any type of variable
Poisson regression:   count ~ any type of variable

53
For the generalized linear mixed model… … you only have to specify the family (with glmer() in current lme4):
lmer(…)
glmer(…, family = "poisson")
glmer(…, family = "binomial")

54
That’s it (for now)
