
Future directions for DDI


1 Future directions for DDI
Steven McEachern and Jared Lyle, AAPOR DDI panel, May 2017

2 What have we seen today
Discovering data (CLOSER)
Managing data and metadata (MIDUS)
Integrating and harmonising metadata (BLS)
Depositing data, transparency (Roper)
So what else could you do?

3 Re-using questions
Questions can be reused; at the moment, you can simply download the information and import it into a questionnaire design tool, or save it to a PDF for reference.

4 Re-using questions – with DDI-Lifecycle
Behind the scenes, there is the ability for, say, a questionnaire design tool to capture this question, including:
What language it is in
The code list being used
Response cardinality
Self-provenance (where it came from) within the blob of information
A persistent URN
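As a rough sketch of what such a tool could capture, the snippet below builds a DDI-Lifecycle-style question record as XML. The element names, the URN, and the question text are illustrative approximations of DDI 3.2, not the exact published schema:

```python
# Sketch of the machine-actionable information a questionnaire design tool
# could capture for a reused question, expressed as DDI-Lifecycle-style XML.
# Element names and the URN below are illustrative assumptions; consult the
# published DDI 3.2 schema for the exact structure.
import xml.etree.ElementTree as ET

question = ET.Element("QuestionItem")

# Persistent identifier for the question (hypothetical URN)
ET.SubElement(question, "URN").text = "urn:ddi:uk.example:q-12345:1.0.0"

# Question text, tagged with the language it is in
text = ET.SubElement(question, "QuestionText")
literal = ET.SubElement(
    text, "LiteralText",
    {"{http://www.w3.org/XML/1998/namespace}lang": "en"}  # serialized as xml:lang
)
literal.text = "How satisfied are you with your current job?"

# Reference to the code list (response categories) the question uses
ET.SubElement(question, "CodeDomainReference").text = "cl-satisfaction-5pt"

# Response cardinality: exactly one answer expected
ET.SubElement(question, "ResponseCardinality",
              {"minimumResponses": "1", "maximumResponses": "1"})

xml_str = ET.tostring(question, encoding="unicode")
print(xml_str)
```

Because each of these pieces is a discrete element rather than free text, a design tool can import the question and reconstruct its language, code list, and provenance without human interpretation.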

5 Creating new instruments
European Social Survey (ESS)
QDDT – NSD, Norway
Pogues – INSEE, France

6 European Social Survey (ESS)
[Diagram: questionnaire workflow in the European Social Survey (ESS), with the elements Questionnaire, Module, Concept, Question, Response Domain, and Instrument. NSD © 2016]

7 QDDT conceptual model, DDI 3.2 based
[Diagram: model elements include Study, Universe, Concept, QuestionItem, QuestionGrid, QuestionAsInInstrument, ResponseDomain (with CodeDomain, CategoryDomain, DateTimeDomain, ScaleDomain, TextDomain, and NumericDomain), Instruction, ExternalAid, StatementItem, ControlConstruct, and Instrument.]
IASSIST, Bergen, May 31 – June 3
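To make the diagram's relationships concrete, here is a heavily simplified sketch of the QDDT conceptual model as Python dataclasses. The class names follow the slide, but the fields and relationships are assumptions made for illustration and omit most of DDI 3.2:

```python
# Simplified sketch of the QDDT conceptual model (DDI 3.2 based).
# Class names follow the slide; the fields and relationships shown here are
# illustrative assumptions, not the full DDI 3.2 model.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ResponseDomain:
    # Specialised in DDI 3.2 as CodeDomain, CategoryDomain, NumericDomain,
    # TextDomain, DateTimeDomain, ScaleDomain, ...
    kind: str

@dataclass
class QuestionItem:
    text: str
    concept: str                      # the Concept the question measures
    response_domain: ResponseDomain
    instruction: Optional[str] = None # interviewer/respondent Instruction

@dataclass
class ControlConstruct:
    # Sequencing and logic; StatementItem and QuestionAsInInstrument are
    # among the constructs that appear inside an instrument.
    questions: List[QuestionItem] = field(default_factory=list)

@dataclass
class Instrument:
    study: str                        # the Study the instrument belongs to
    universe: str                     # the Universe the study covers
    constructs: List[ControlConstruct] = field(default_factory=list)

# Example: one question wired into an instrument
q = QuestionItem(
    text="What is your age?",
    concept="Demographics",
    response_domain=ResponseDomain(kind="NumericDomain"),
)
instrument = Instrument(study="ESS", universe="Adult population",
                        constructs=[ControlConstruct(questions=[q])])
print(instrument.constructs[0].questions[0].response_domain.kind)
```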

8

9 INSEE - General Architecture
[Diagram: pipeline from questionnaire design through persistence, conversion & translation, and transformation to questionnaire publication. Components include Pogues, Pogues (back-office), RMéS, ENO, and Coltrane, with metadata supply developed with Pogues. Some components are already in production; others are still in development.]

10 Questionnaire design - Guidelines

11 Same source, multiple outputs
Data deposit (Roper Center)
“Automated” TI reporting
Codebook/data dictionary creation (Data Archives)

12 Roper Center - Metadata
Sample: National adult, including an oversample of 18-33 year olds
Sample Notes: This study contains sampling using landline telephones and cellular phones
Sample Size: 1,821
Response Rate: Landline=AAPOR RR3: 8.7 percent, Cell=AAPOR RR3: 8.6 percent
Estimated Sample Error: +/- 2.5 percentage points at the 95 percent confidence level
[Speaker notes] So let’s take a look again at that report from Pew. This is what Roper’s database of metadata for archived studies looks like now. It contains the necessary information in an organized and standardized form, but there’s a fair amount of information in each field. To make this conform with DDI standards, we have to introduce greater granularity in the metadata.

13 DDI Standards
Original metadata:
Sample: National adult, including an oversample of 18-33 year olds
Sample Notes: This study contains sampling using landline telephones and cellular phones
Sample Size: 1,821
Response Rate: Landline=AAPOR RR3: 8.7 percent, Cell=AAPOR RR3: 8.6 percent
Estimated Sample Error: +/- 2.5 percentage points at the 95 percent confidence level
DDI-granular metadata:
Geographical Location (GeographicLocation): US
Universe (StudyUnitUniverseRef): Adult population
Sample1 (SourceDescription): All (Source Type) Adults
Sample2 (SourceDescription): Oversample (Source Type) Age 18-33
Sample Size1 (NumberofResponses): 1,821 – Sample1, Mode1 and Mode2: Telephone interviews/landline and Telephone interviews/cell phone
Sample Size2 (NumberofResponses): 481 – Sample1, Mode1: Telephone interviews/landline
Sample Size3 (NumberofResponses): 1,125 – Sample1, Mode2: Telephone interviews/cell phone
Sample Size4 (NumberofResponses): 215 – Sample2, Mode2: Screened cell phone (18-33 oversample)
Mode1 (ModeOfCollection): Telephone interviews/landline
Mode2 (ModeOfCollection): Telephone interviews/cell phone
Response Rate 1 (SpecificResponseRate): 8.7%
Response Rate 1 Definition (Description): AAPOR RR3
Response Rate 2 (SpecificResponseRate): 8.6%
Response Rate 2 Definition (Description): AAPOR RR3
Margin of error (SamplingError): +/- 2.5 (percentage points)
[Speaker notes] DDI does help to drive that push toward greater granularity – as does the AAPOR Transparency Initiative, by encouraging survey research organizations to provide more information about methods. But again, we could achieve that without adopting DDI. By adopting DDI standards, we now have a way of identifying each piece of metadata we are including that is consistent with other organizations and that expedites machine readability. We’re currently in beta testing for a data processing system that reads and writes to DDI. This processing system will be used internally, but there’s an external component as well, which will allow data providers to submit data. I’m going to show you some of what that looks like.
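The granularity step the slide describes, splitting one free-text field into several discrete, DDI-aligned fields, can be sketched as below. The target field names mirror the slide (ModeOfCollection, SpecificResponseRate, SamplingError); the parsing rules are illustrative assumptions, not Roper's actual processing system:

```python
# Minimal sketch of introducing DDI-style granularity: split Roper's combined
# free-text metadata fields into discrete fields. Parsing rules here are
# illustrative assumptions, not Roper's actual processing system.
import re

flat = {
    "Response Rate": "Landline=AAPOR RR3: 8.7 percent, Cell=AAPOR RR3: 8.6 percent",
    "Estimated Sample Error": "+/- 2.5 percentage points at the 95 percent confidence level",
}

granular = {}

# One (mode, definition, rate) triple per component of the combined field
for i, part in enumerate(flat["Response Rate"].split(","), start=1):
    mode, rest = part.split("=", 1)
    definition, rate = rest.split(":", 1)
    granular[f"Mode{i} (ModeOfCollection)"] = mode.strip()
    granular[f"ResponseRate{i} (SpecificResponseRate)"] = rate.strip()
    granular[f"ResponseRate{i} (Description)"] = definition.strip()

# Pull the numeric margin of error out of the free-text sampling-error field
m = re.search(r"\+/-\s*[\d.]+", flat["Estimated Sample Error"])
granular["MarginOfError (SamplingError)"] = m.group(0)

print(granular)
```

Once each value lives in its own named field, it can be mapped one-to-one onto DDI elements, which is what makes the metadata consistent across organizations and machine readable.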

14 Depositor information
Sponsor & survey organization
Grant funding
Submit a single survey, or a group or series
[Speaker notes] The first page of the deposit provides basic information about the depositor and the organizations involved in the survey – which also gives essential provenance information for the archive.

15 DDI planning
Manifesto for long-term data infrastructure in the social sciences
Currently in development from the DDI Alliance
Looking at metadata and data usage requirements across the data lifecycle

16 Example: Element Registry and Survey Design

17 Discussion
What kinds of metadata and data management problems are you trying to solve?
What would be the next step for you to implement DDI in your own processes?
How can DDI and the DDI Alliance help?

