
Overview of 2010 EHC-CAPI Field Test and Objectives
Jason Fields
Housing and Household Economic Statistics Division, US Census Bureau
Presentation to the ASA/SRM SIPP Working Group, November 17, 2009

“Re-SIPP” Development
- Following successful completion of the EHC Paper Field Test
- Develop the 2010 plan to test an electronic EHC instrument
- Broad involvement across the Census Bureau: DID, FLD, TMO, DSD, HHES, DSMD, SRD

Primary Goals of 2010 Test
(1) Strong evidence of comparable data quality
   - How well do the calendar year 2009 data from the 2010 EHC-CAPI Field Test match data from the 2008 SIPP panel?
   - Especially for income transfer programs
(2) Strong evidence to guide development and refinement before implementation in 2013 as the production SIPP instrument

Basic Design Features (1)
- 8,000 Sample Addresses
   - could have been larger!
   - enough sample and budget to support research and field activities
- “High Poverty” Sample Stratum (illustrated in the sketch following this list)
   - to evaluate how well income transfer program data are collected
- State-Based Design
   - likely (possible?) access to administrative records
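
To make the stratum design concrete, here is a minimal sketch of a stratified draw with a high-poverty oversample. Everything in it is an illustrative assumption: the address frame, the poverty flag, the 20% stratum share, and the 2x oversampling factor are hypothetical, not the field test's actual specification.

```python
import random

random.seed(2010)

# Hypothetical address frame: (address_id, state, high_poverty_flag).
# The poverty flag and the 2x oversampling factor below are illustrative
# assumptions, not the actual 2010 field-test sample design.
frame = [(i, random.choice(["TX", "LA"]), random.random() < 0.2)
         for i in range(100_000)]

def stratified_sample(frame, n_total, oversample=2.0):
    """Draw a sample that oversamples the high-poverty stratum."""
    high = [a for a in frame if a[2]]
    low  = [a for a in frame if not a[2]]
    # Allocate proportionally, then boost the high-poverty share.
    share_high = len(high) / len(frame) * oversample
    n_high = min(round(n_total * share_high), len(high))
    n_low = n_total - n_high
    return random.sample(high, n_high) + random.sample(low, n_low)

sample = stratified_sample(frame, 8_000)
print(sum(1 for a in sample if a[2]), "high-poverty addresses of", len(sample))
```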

RO     State               Sample N   Notes
BOS    Connecticut              204
       Massachusetts            465
       New York (upstate)       366    covers upstate (non-NYC) NY
       Rhode Island             120
       (RO subtotal)          1,155
NY     New York (NYC)         1,681    covers NYC portion of NY
PHIL   Maryland                 280
CHI    Illinois                 620    excludes 57 IL addresses in KC-RO
       Wisconsin                132
       (RO subtotal)            752
DAL    Texas                  1,382
       Louisiana                325
       (RO subtotal)          1,707
LA     California             2,407    excludes 445 CA addresses in SEA-RO

TOTAL N: 7,982
TOTAL ADMIN RECS (?) N: 6,736
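
The subtotals and the overall sample size in the table above can be checked directly from the per-state figures:

```python
# Sample counts per regional office, taken from the table above.
sample_n = {
    "BOS":  {"Connecticut": 204, "Massachusetts": 465,
             "New York (upstate)": 366, "Rhode Island": 120},
    "NY":   {"New York (NYC)": 1_681},
    "PHIL": {"Maryland": 280},
    "CHI":  {"Illinois": 620, "Wisconsin": 132},
    "DAL":  {"Texas": 1_382, "Louisiana": 325},
    "LA":   {"California": 2_407},
}

for ro, states in sample_n.items():
    print(f"{ro:5s} subtotal: {sum(states.values()):5,d}")

total = sum(sum(states.values()) for states in sample_n.values())
print("TOTAL N:", f"{total:,d}")   # 7,982, matching the table
```

The administrative-records figure (6,736) is not derivable from the table alone, so it is left as reported.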

Basic Design Features (2)
- Field Period: early January to mid-March 2010
   - collect data about calendar year 2009
- Field Representative training in Dec/Jan
   - goal: minimize the number of FRs with post-training “down-time”
   - evaluation and improvement of training
- Use FRs with a wide range of experience
- Expand RO involvement

Research Agenda
1. Quantify likely cost savings
2. Test the data processing system
3. Evaluate data quality
4. Evaluate “field support” materials
5. Evaluate FR training
6. Identify & document instrument “bugs”
7. Identify “interview process” issues
8. Identify usability issues (esp. EHC)
HOW CAN WE IMPROVE FOR 2013?

Special Methods
1. Quantify likely cost savings
   - new cost code(s) established
   - timing interview length
   - the trade-off between one 12-month-recall interview and three interviews per year (sketched below)
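
The cost question turns on trading three short-recall interviews per year for one longer annual EHC interview. A back-of-the-envelope sketch of that trade-off follows; all figures are illustrative assumptions, since producing the real numbers is exactly what the new cost codes and interview timing are meant to do.

```python
# All figures below are illustrative assumptions, not field-test results.
visits_sipp = 3        # classic SIPP: three interviews per year (4-month recall)
visits_ehc  = 1        # EHC design: one interview per year (12-month recall)

cost_per_visit  = 150.0   # assumed fixed cost per visit (travel, contact attempts)
cost_per_minute = 1.0     # assumed cost per interview minute

minutes_sipp = 60      # assumed length of one classic SIPP interview
minutes_ehc  = 90      # assumed length of the longer annual EHC interview

annual_sipp = visits_sipp * (cost_per_visit + cost_per_minute * minutes_sipp)
annual_ehc  = visits_ehc  * (cost_per_visit + cost_per_minute * minutes_ehc)

print(f"classic SIPP, per household-year: ${annual_sipp:,.0f}")   # $630
print(f"annual EHC, per household-year:   ${annual_ehc:,.0f}")    # $240
print(f"implied saving: {1 - annual_ehc / annual_sipp:.0%}")      # 62%
```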


Special Methods
2. Test the data processing system
   - The data collected in this test will be used to develop and test a new data processing system.


Special Methods
3. Evaluate data quality
   - administrative records
   - recording of selected interviews
   - extract SIPP 2008 panel data; compare CY2009 estimates from the two surveys (a comparison sketch follows below)
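
A minimal sketch of the kind of estimate comparison involved, assuming each survey produces a weighted CY2009 estimate (here, a hypothetical program-participation rate) with a standard error; all numbers are made up for illustration.

```python
import math

# Hypothetical CY2009 estimates of one program-participation rate;
# point estimates and standard errors are invented for illustration.
ehc_rate,  ehc_se  = 0.112, 0.004   # 2010 EHC-CAPI field test
sipp_rate, sipp_se = 0.118, 0.003   # 2008 SIPP panel

# Treat the two surveys as independent samples and test the difference.
diff = ehc_rate - sipp_rate
se_diff = math.sqrt(ehc_se**2 + sipp_se**2)
z = diff / se_diff

print(f"difference: {diff:+.3f}  (z = {z:.2f})")
# |z| < 1.96 would be consistent with comparable data quality at the 5% level.
```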

(Details) Interview Recording
- close-to-RO FRs (approximately 80)
- 3 recording windows (early Jan, late Jan, mid Feb)
- message: “record the next two interviews”
- approximately 480 recorded interviews (see the arithmetic below)
- with consent; adults only (21+)
- record R’s entire continuous “turn”
- in the RO, with the assistance of the ROCS, transfer recordings to the secure HQ network
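
The approximately-480 figure follows directly from the design parameters above: about 80 close-to-RO FRs, each asked to record two interviews in each of three windows.

```python
frs = 80             # close-to-RO field representatives
windows = 3          # early Jan, late Jan, mid Feb
per_window = 2       # "record the next two interviews"

print(frs * windows * per_window)   # 480 recorded interviews
```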


Special Methods
4. Evaluate “field support” materials (advance letter, brochure, calendar aid)
   - Respondent debriefing instrument block
   - FR debriefing sessions
   - recording of selected interviews

(Details) R Debriefing Block
- at end of interview (status=“complete”)
- focus on “field support” materials:
   - advance letter, brochure, calendar aid
- very brief question set (outlined in the sketch below):
   - “did you see [X]?”
   - “did you read [X]?”
   - “did [X] have [+/-/0] impact?”
- with most convenient respondent
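
The block crosses each field-support material with the same three short questions. A sketch of that grid in Python follows; the actual instrument is a Blaise block, and the question wording here paraphrases the slide rather than quoting production text.

```python
# Sketch of the respondent-debriefing question grid. The real instrument
# is a Blaise block; these question texts paraphrase the slide and the
# function name is invented for illustration.
MATERIALS = ["advance letter", "brochure", "calendar aid"]
QUESTIONS = [
    "Did you see the {m}?",
    "Did you read the {m}?",
    "Did the {m} have a positive, negative, or no impact?",
]

def debriefing_items():
    """Yield the nine debriefing questions: 3 materials x 3 questions."""
    for m in MATERIALS:
        for q in QUESTIONS:
            yield q.format(m=m)

for item in debriefing_items():
    print(item)
```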


(Details) FR Debriefings
- at (or near) end of field period
- at least one session per RO
- with 8-10 FRs/SFRs
- guided 2-3 hour discussion
- wide range of issues, e.g., training, EHC procedures, usability, interview “process” issues, etc.
- improvements for 2013


Special Methods
5. Evaluate FR training
   - recording of selected interviews
   - certification (and other) testing
   - HQ (and RO) training observation
   - HQ (and RO) interview observation
   - FR debriefing sessions
   - FR feedback instrument block
   - FR training assessment form
   - Trainers’ debriefing

(Details) HQ/RO Interview Observations
- intensive HQ/RO observation of field test
- key observation themes:
   - use of EHC techniques (landmarks, cross-domain referencing, calendar aid)
   - instrument usability/navigation
   - FR preparedness/training
   - R interest/engagement
- R debriefing regarding landmarks


(Details) FR Feedback Block
- at end of interview (status=“complete”)
- brief set of Qs about:
   - use of EHC methods (domains; success)
   - EHC instrument bugs
   - perceived +/- R reactions
   - training gaps


Special Methods
6. Identify & document instrument “bugs”
   - HQ (and RO) interview observations
   - FR debriefing sessions
   - FR feedback instrument block
   - item-level notes

(Details) Item-Level Notes
- accessible throughout the Blaise interview
- FR training will encourage & instruct
- focus on “bugs” (instrument not working as planned; see the classification sketch below), e.g.:
   - wrong/missing fills
   - garbled wording
   - wrong/missing Qs
   - FR “work-arounds”
   - missing help screens
   - confusing/inappropriate/redundant/etc. Qs
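
One plausible way to work with such notes after the field period is to tag each free-text note against the bug categories above. A sketch with hypothetical note records and keyword lists follows; the item names, note texts, and keywords are all invented for illustration.

```python
# Sketch of post-hoc classification of FR item-level notes into the
# bug categories listed above (records and keywords are hypothetical).
BUG_CATEGORIES = {
    "fill":       ["fill", "wrong name", "blank name"],
    "wording":    ["garbled", "typo", "wording"],
    "routing":    ["skipped", "wrong question", "missing question"],
    "workaround": ["work-around", "workaround", "entered manually"],
    "help":       ["no help", "help screen"],
}

def classify(note_text):
    """Return every matching bug category for a note, or 'other'."""
    text = note_text.lower()
    return [cat for cat, keys in BUG_CATEGORIES.items()
            if any(k in text for k in keys)] or ["other"]

notes = [
    ("TJOBEARN", "fill showed wrong name for employer"),
    ("EHC_RES",  "help screen missing for this item"),
]
for item, text in notes:
    print(item, "->", classify(text))
```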


Special Methods
7. Identify “interview process” issues (interview “flow,” R interest/engagement, EHC interaction, mix of structured/unstructured Qs)
   - HQ (and RO) interview observations
   - FR debriefing sessions
   - FR feedback instrument block
   - recording of selected interviews


Special Methods
8. Identify usability issues (esp. EHC) (instrument navigation, FRs’ ability to access and use special features of the EHC)
   - HQ (and RO) interview observations
   - FR debriefing sessions
   - FR feedback instrument block
   - recording of selected interviews
   - FR testing sessions at HQ

Summary: Research Agenda
1. Quantify likely cost savings
2. Test the data processing system
3. Evaluate data quality
4. Evaluate “field support” materials
5. Evaluate FR training
6. Identify & document instrument “bugs”
7. Identify “interview process” issues
8. Identify usability issues (esp. EHC)

Lots of Extra “Stuff”: the 2010 Test is Loaded
- Data quality
- Instrument quality
- Training quality
GOAL: Fully Exploit the Test’s Information Potential
- Improvements/Refinements for 2013

What’s Missing from 2010?
- Attrition/mover effects in an annual interview
- Year-to-year data quality: seams between waves of a 12-month reference period interview
- Wave 2+ instrument and procedures
- In Development: 2011/2012 Testing Plans

Thanks! Questions?
Contact: Jason.M.Fields@census.gov

