Data lifecycle Data Management & Survey Conception

1 Data lifecycle
Data Management & Survey Conception. The four phases of the lifecycle: Survey Conception, Data System Architecture, Data Collection Management, Data Analysis & Dissemination.

2 Type of info per usage (Introduction)

3 From Data to Information
Introduction. The Operation Data Manager should be involved in all the steps of a "Data Lifecycle": Survey Conception, Data System Architecture, Collection Management, Data Analysis & Dissemination. Any break in this cycle ends with the failure of the system:
- A data collection form that is ill-designed, either because it does not satisfy operational information requirements or because it is flawed from a technical standpoint
- A well-designed survey with a poorly designed, and therefore poorly maintained, database
- A structurally well-designed database with no data, because data collection cycles have not been integrated or respected
- A well-populated database without implemented reports and queries, and therefore no output

4 Layers of data collection
Before the form (Survey Conception):
- Avoid reinventing the wheel: check what has been designed and piloted before.
- Consult all stakeholders: avoid duplication of effort and assessment fatigue among beneficiaries.
Layers of data collection:
- Collect simple base reference data first.
- Then collect detailed information from samples defined against that base reference.
- Vary data collection frequency according to how frequently the phenomenon being tracked or measured changes.
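The layered approach above can be sketched in a few lines: collect the full base reference list first, then draw a sample from it for the detailed follow-up survey. A minimal illustration; the site names and the 10% sampling fraction are invented for the example:

```python
import random

def draw_detail_sample(base_records, frac=0.1, seed=42):
    """Select a subset of base reference records for detailed follow-up.

    A fixed seed keeps the sample reproducible across runs.
    """
    rng = random.Random(seed)
    k = max(1, int(len(base_records) * frac))
    return rng.sample(base_records, k)

# Base reference data: the simple site list collected first.
sites = [f"site-{i:03d}" for i in range(50)]
sample = draw_detail_sample(sites, frac=0.1)
print(len(sample))  # 5 sites selected for the detailed survey
```

Because the sample is defined from the base reference, every detailed record can later be joined back to a known site.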

5 Good practices for Data Collection Forms
Survey Conception

Good practices for Data Collection Forms:
- Questionnaires used in survey research should be clear and well presented.
- Think about the form of the questions.
- Keep the survey as short as possible.
- Make definitions of data elements consistent with standard definitions and analytic conventions.
- Plan clearly how answers will be analyzed.
- Test the survey for "understandability" and respondent effort through focus groups.

Clear and well presented:
- Avoid using capital (upper case) letters only; this format is hard to read.
- Number questions and group them clearly by subject, asking each respondent the same questions in the same order.
- Give clear instructions and include headings to make the questionnaire easier to follow.
- Provide definitions and clarifying information for individual items and terms.
- Be clear about the date or time period the survey should reflect.
- Ensure that the return address, fax number, and contact information appear on the form itself as well as in the accompanying letter; consider including an addressed, postage-paid envelope to return the survey.

Form of the questions:
- Avoid "double-barreled" questions (two or more questions in one, i.e. do not combine two separate ideas and request a single response).
- Avoid questions containing double negatives, and leading or ambiguous questions.
- Use standard language; avoid jargon and abbreviations.
- If closed questions are used, the interviewer should also have a range of pre-coded responses available.
- Make categories as concrete as possible: replace vague quantifying words (e.g., frequently, often, sometimes) with specific numerical ranges whenever possible.
- If response categories are supposed to be mutually exclusive and exhaustive, make sure all possible categories are included on the survey. If in doubt about whether categories are exhaustive, include a response option of "other, please specify."
- Indicate subpopulations to include and/or exclude in the reporting.
- If data may be unknown or missing, include an "unknown/missing" option; provide a "not applicable" response for questions that may not apply to all respondents.
- Design the item sequence so that it increases the respondent's ability to complete the survey. Keep topic-related questions together and provide transitions between topics. For example, list each type of data in a separate section with a clear heading that identifies the topic (e.g., general information, demographics, population movement).
- Keep definitions and cohorts consistent within one survey and across related surveys. If data for a cohort are reported in more than one section of the survey, indicate that the grand totals for each section should match (example: the total by age should equal the total by ethnicity).

Keep the survey as short as possible:
- Ensure that the response burden does not exceed a reasonable length and is justified by the use of the data.
- Examine each item in the data collection instrument to make sure the information is needed and will be used; avoid requesting information of marginal use.
- Avoid requesting data that can be obtained from another available survey or database (example: do not ask for disaggregated data that will not be used).

Consistent definitions and analytic conventions:
- Make definitions of data elements consistent with standard definitions and analytic conventions (i.e., calculations or methodologies) when appropriate and feasible.
- Where appropriate, use definitions developed nationally, so that the data reported will be comparable to data reported by other agencies and organizations at the institutional, state, and federal levels.
- If standard definitions and/or analytic conventions are used, indicate their sources; a number of resources are identified in the appendices.
- Determine whether another organization is already collecting data related to the items you plan to collect; if so, obtain a copy of that survey and consider reusing its definitions and analytic conventions as a starting point. Another option is to ask respondents to report data directly from the other survey, or to request a copy of the completed survey.
- Explain any deviations from accepted usage. If more than one analytic convention is commonly used for a given measure, indicate the preferred convention, and ask respondents to state whether they used a different convention and why.

Plan clearly how answers will be analyzed:
- Questions may be open (the respondent composes the reply) or closed (pre-coded response options are available, e.g. multiple-choice questions). Whether using open or closed questions, data managers should plan clearly how answers will be analyzed.
- Closed questions with pre-coded response options are most suitable for topics where the possible responses are known; they are quick to administer and can be easily coded and analyzed.
- Open questions should be used where possible replies are unknown or too numerous to pre-code. They are more demanding for respondents, but if well answered can provide useful insight into a topic; however, they can be time consuming to administer and difficult to analyze.
- Open questions are used more frequently in unstructured interviews, whereas closed questions typically appear in structured interview schedules. A structured interview is like a questionnaire administered face to face with the respondent.

Test the survey:
- Whenever possible, test the survey for "understandability" and respondent effort through focus groups, cognitive laboratory sessions, or pilot testing.
- Whenever possible, conduct a "trial run" in which results are made available only to survey respondents before releasing them publicly.
- The purpose of these activities is to ensure that each item is understandable to the respondent, the technical terms used are appropriate, the questions are clear and unambiguous, the items elicit a single response, and the survey is not too much of a burden for the respondent.
- Seek periodic review of the survey instrument from experienced, knowledgeable individuals.
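Several of these practices (pre-coded closed responses, an "other, please specify" option, an "unknown/missing" code) can also be enforced at data-entry time. A minimal sketch, with a hypothetical water-source question and invented numeric codes:

```python
# Hypothetical pre-coded response options for one closed question, following
# the good practices above: concrete categories, "other" and "unknown" included.
WATER_SOURCE_CODES = {
    1: "piped network",
    2: "protected well",
    3: "surface water",
    8: "other, please specify",
    9: "unknown/missing",
}

def validate_response(code, other_text=None):
    """Return the decoded label, enforcing the coding rules."""
    if code not in WATER_SOURCE_CODES:
        raise ValueError(f"code {code} is not a pre-coded option")
    if code == 8 and not other_text:
        raise ValueError('"other" responses must be specified')
    return WATER_SOURCE_CODES[code]

print(validate_response(2))              # protected well
print(validate_response(8, "rainwater")) # other, please specify
```

Rejecting invalid codes at entry time is far cheaper than cleaning them out of the database later.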

6 Data model
Data System Architecture. Data models are the key to interoperability (i.e. easy data exchange with partners). Implementing partners should not have to draft and decide on a core data model: it should be the same everywhere, adapted locally only where necessary, with supporting guidelines available.
Importance of a common referential:
- Site/community assessment: site infrastructure inventory, base indicators
- Beneficiary registration: demographics, bio data, vulnerability, needs
- Multi-sectoral assessment: health, education, water
- Delivered assistance and organization: who's doing what where, project activity descriptions, performance indicators, activity monitoring
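As a rough illustration of what a small shared core model could look like, the sketch below links activities ("who's doing what where") to a common site referential. All entity and field names here are invented for the example, not the actual shared model:

```python
from dataclasses import dataclass, field

@dataclass
class Site:
    """One entry in the common site referential."""
    site_id: str
    name: str
    population: int = 0

@dataclass
class Activity:
    """One 'who's doing what where' record, keyed to the referential."""
    org: str        # who
    sector: str     # what
    site_id: str    # where (foreign key into the site referential)
    indicator: str
    value: float

@dataclass
class Registry:
    sites: dict = field(default_factory=dict)
    activities: list = field(default_factory=list)

    def who_what_where(self):
        # Joining on site_id only works because every partner shares
        # the same referential: that is the interoperability payoff.
        return [(a.org, a.sector, self.sites[a.site_id].name)
                for a in self.activities]

reg = Registry()
reg.sites["S1"] = Site("S1", "Goma camp", 12000)
reg.activities.append(Activity("NGO-A", "Water", "S1", "litres/person/day", 14.5))
print(reg.who_what_where())  # [('NGO-A', 'Water', 'Goma camp')]
```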

7 Building an Interface for data collection
Data System Architecture. Options for the collection interface:
- Mobile
- Offline desktop
- Web/server based
- OCR*-ready form (can be scanned)
- Integration of external data sources (ETL**)
- Offering analysis capacity (OLAP*** and statistics)
* OCR (optical character recognition): the mechanical or electronic translation of scanned images of handwritten, typewritten or printed text into machine-encoded text.
** Extract, transform, and load (ETL): a process in database usage that involves extracting data from outside sources, transforming it to fit operational needs (which can include quality levels), and loading it into the end target (database or data warehouse).
*** An OLAP (online analytical processing) cube is a data structure that allows fast analysis of data.
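The ETL process described in the footnote can be sketched end to end in a few lines. A minimal sketch with hypothetical field names: a semicolon-delimited partner export is extracted, cleaned (the "quality level" step), and loaded into an SQLite target:

```python
import csv
import io
import sqlite3

# A partner's raw export (hypothetical columns), with messy whitespace.
raw = "site;pop\nGoma camp; 12000 \nBunia site;8500\n"

def extract(text):
    """Extract: read rows from the outside source."""
    return list(csv.DictReader(io.StringIO(text), delimiter=";"))

def transform(rows):
    """Transform: strip whitespace, cast types, drop unusable rows."""
    out = []
    for r in rows:
        try:
            out.append((r["site"].strip(), int(r["pop"].strip())))
        except (ValueError, KeyError):
            continue
    return out

def load(rows, conn):
    """Load: insert into the end target database."""
    conn.execute("CREATE TABLE IF NOT EXISTS sites (name TEXT, population INTEGER)")
    conn.executemany("INSERT INTO sites VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT COUNT(*) FROM sites").fetchone()[0])  # 2
```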

8 Reports are part of the data system
Data System Architecture. Queries and tools to extract data from the databases need to be designed along with the database itself. They must enable reporting officers to:
- Set up queries and reports without advanced IT knowledge
- Be clear on the standard indicators these queries should be based on
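One way to let reporting officers run reports without deep IT knowledge is to ship a catalogue of vetted, parameterized queries built on the standard indicators, so officers only pick a report name and a filter value. A minimal sketch; the table, report, and indicator names are invented:

```python
import sqlite3

# Catalogue of standard, pre-written report queries (hypothetical examples).
STANDARD_REPORTS = {
    "population_by_site": "SELECT name, population FROM sites WHERE population >= ?",
}

def run_report(conn, report, *params):
    """Run a named standard report; officers never write raw SQL."""
    sql = STANDARD_REPORTS[report]
    return conn.execute(sql, params).fetchall()

# Demo data, loaded the same way the ETL step would populate it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sites (name TEXT, population INTEGER)")
conn.executemany("INSERT INTO sites VALUES (?, ?)",
                 [("Goma camp", 12000), ("Bunia site", 8500)])
print(run_report(conn, "population_by_site", 10000))  # [('Goma camp', 12000)]
```

Because the SQL lives in one catalogue, the standard indicators behind each report are documented in exactly one place.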

9 Data collection strategies
Data collection management. Three strategies:
- Direct coordination with partners (e.g. the Somali protection cluster)
- Establishment of a "data collection project" (e.g. UNOPS Goma)
- A specific contract with a dedicated partner (e.g. CartONG in Uganda)

10 Avoid conflict of interest
Data collection management: implementation matrix.

11 PDF reports and maps
Data Dissemination. Targets mostly local partners and decision makers. Can be disseminated through:
- A mailing list (cf. the Somali protection cluster)
- A Google group (cf. the Goma Update)
- A website (cf. ReliefWeb)

12 GeoPortal and Open Data API
Data Dissemination.
GeoPortal:
- A tool to ensure institutional memory and "Master Data" management
- Can be a tool for desk officers to visualize a situation and use map extracts in their reporting
Data API:
- Can be used for global dissemination (cf. the World Bank Data API or Google Public Data)
- Offers material for data journalism (i.e. computer-assisted reporting on data by journalists)
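A data API typically returns machine-readable JSON that journalists and partners can process directly. The sketch below parses a tiny hand-made payload shaped like the World Bank Data API's paged output (a metadata header followed by the records); the figures are illustrative, not real data:

```python
import json

# Illustrative payload only, imitating the [metadata, records] shape
# of a paged open-data API response.
resp = json.loads("""
[
  {"page": 1, "pages": 1, "total": 2},
  [
    {"country": {"value": "Uganda"}, "date": "2010", "value": 33149417},
    {"country": {"value": "Uganda"}, "date": "2009", "value": 32126003}
  ]
]
""")

def records(payload):
    """Drop the paging metadata and flatten each record to a tuple."""
    meta, rows = payload
    return [(r["country"]["value"], r["date"], r["value"]) for r in rows]

for country, year, value in records(resp):
    print(country, year, value)
```

Once the payload is reduced to plain tuples, it drops straight into a spreadsheet, a chart library, or a data-journalism pipeline.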

13 Data, Law & License
Data Dissemination. For all data sets that do not fall under the "Guidelines for the Regulation of Computerized Personal Data Files" (for instance protection data) … the Open Database License (ODbL) can give a legal frame to all our data collection activities.

14 Providing support for the 4 phases of the process
Conclusion. Four specific types of expertise that are difficult to combine in one profile:
- Statistician/analyst: creating a questionnaire and compiling and analyzing the resulting statistics
- IS architect: building the information system
- Manager: managing the stakeholder consultation process during the design phase, the collection in the field, and the dissemination of results
- Data journalist: developing sound and sexy reports
Next steps: identify the gaps among the "Operation Data Management" officers network, and define the training and support needs for each of these specific domains.

