Presentation on theme: "Expected Outputs & Lessons Learned"— Presentation transcript:

1 Expected Outputs & Lessons Learned
Léa Hakim & Neda Jafar, Statistics Coordination Unit, ESCWA
Workshop on Building Country Capacity to Maintain a Central Repository of Data
Amman, March 2007

2 Expected Outputs
By the end of today's session:
1. Preliminary Report
2. Action plan for clean-up

3 Expected Outputs
1. Preliminary Report
2. Action plan for clean-up

4 Preliminary Report
Consolidation of working session outputs as per templates shared. Status of:
- Database management
- DevInfo as a Central Repository of Data (CRD)
- Data coverage
- Linking producers and users of data
- Dissemination

5 Preliminary Report
Structure and Content Review Summary:
1. DevInfo summary report (Excel sheets)
2. MDG indicator I-U-S analysis (template provided)

6 1. DevInfo summary report: Categories (Current – Proposed)
Indicator, Unit, Subgroup, I-U-S, Time period, Area, Sector, Goal, Framework, Institution, Theme, Convention, Source
Specify indicator classification(s) employed
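The categories above correspond to the fields of a single data record in the database. As a rough illustration only (the field names below are assumptions based on the category list, not the actual DevInfo schema), such a record can be modeled like this:

```python
from dataclasses import dataclass

@dataclass
class Record:
    """One data value in an I-U-S style model.

    The Indicator-Unit-Subgroup (I-U-S) triple identifies what is
    measured; the remaining fields classify where, when, and by whom.
    Field names are illustrative, not the actual DevInfo schema.
    """
    indicator: str    # e.g. "Under-five mortality rate"
    unit: str         # e.g. "Deaths per 1,000 live births"
    subgroup: str     # e.g. "Total", "Female", "Urban"
    time_period: str  # e.g. "2005"
    area: str         # e.g. "Jordan"
    source: str       # e.g. "JOR_MOH_DHS_2002"
    value: float
```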

7 1. DevInfo summary report: Review of subgroups
Source: Report on DevInfo 2004 Technical Support for UNDG Project “Building Capacity and Statistical Literacy for MDG Monitoring”, Community Systems Foundation, p. 12

8 1. DevInfo summary report: Review of sources
Source: Report on DevInfo 2004 Technical Support for UNDG Project “Building Capacity and Statistical Literacy for MDG Monitoring”, Community Systems Foundation, p. 9

9 2. MDG Monitoring Indicator I-U-S Analysis
Source: Report on DevInfo 2004 Technical Support for UNDG Project “Building Capacity and Statistical Literacy for MDG Monitoring”, Community Systems Foundation, p. 19

10 2. MDG Monitoring Indicator I-U-S Analysis
- Complete the template in reviewing data content
- Customize area categories
- Include a review of whether changes were implemented for the final report [column “Action Taken (Y, N)” in the template shared]
Note: Electronic I-U-S analysis template distributed.
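A minimal sketch of the kind of completeness check the I-U-S analysis supports, assuming the database has been exported to a CSV file named devinfo_export.csv with Indicator, Unit, Subgroup, and Value columns (the file name and column names are assumptions, not the actual DevInfo export format):

```python
import csv
from collections import defaultdict

# Count data values per Indicator-Unit-Subgroup combination so that
# combinations with no values can be flagged for clean-up.
counts = defaultdict(int)
with open("devinfo_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        ius = (row["Indicator"], row["Unit"], row["Subgroup"])
        counts[ius] += 1 if row["Value"].strip() else 0

for (indicator, unit, subgroup), n in sorted(counts.items()):
    if n == 0:
        print(f"No data: {indicator} | {unit} | {subgroup}")
```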

11 Preliminary Report: Conclusions for next steps
Outline decisions taken for next steps during this training, e.g.:
- Focus on MDG indicators only for the first launch
- Establish area codes at the sub-national level
- Add proxies/national development indicators

12 Final Report Outline
- Introduction
- Objectives
- Status of CRD
- Structure and Content Review
- Conclusions
- Challenges in clean-up
- Lessons learned
- Evaluation report (UNICEF M&E)
- Annexes: Action Plan, Timetable, Preliminary Report
Note: Electronic report template distributed.

13 Expected Outputs
1. Preliminary Report
2. Action plan for clean-up

14 Beyond CRD Training
- Complete database (DB) review
- Complete DevInfo DB clean-up
- UNICEF M&E officer evaluation
- Submission of final report
- Use and dissemination of DB
- Country roll-out

15 2. Action plan
Template columns: Main Task | Task Components | Person(s) responsible | March | April
Main tasks:
I. Full conversion to DevInfo v5.0
II. Completion of DB review
III. Implementation of clean-up
IV. Evaluation
V. Submission of report
VI. Launch of national DevInfo DB
Note: Detailed electronic action plan template distributed.

16 Lessons Learned

17 Strategy and Planning: Exchange of experiences
Invest in preparation prior to release:
- Set up a complete DevInfo team with clear functions & responsibilities
- Necessity of a Technical/Steering Committee
- Establish feedback between the DevInfo team, statistics producers, and UN agencies
- Clear strategy and plan of action
- Emphasis on quality
- Prioritize the MDG DB
- Raise awareness among decision-makers to make use of available statistics

18 Database Management
- Start with the planning process
- Start small, think BIG!
- Importance of establishing: a DevInfo team; Steering and Technical Committees; formal agreements with line ministries
- Involve users from the beginning
- DB management requires experts for assessing data and metadata

19 DevInfo as CRD
- DevInfo is user-friendly as a program and interface
- Web-enabling option
- Institutional link between DBs and DevInfo
- Customized national DevInfo
- Use of the GIS mapping facility
- Data organized by themes for targeted policy-making
- Storage of other activities’ data, such as surveys & censuses
- Ready-tailored metadata and indicators of the MDG framework
- Availability of international sources of data

20 Data Coverage
- Sex-disaggregated data
- Small-area-disaggregated data
- Time series
- Different sources
- Strengths and weaknesses of the DB
- Establish processes for data collection
- Review the list of indicators in the national database relative to the global MDG list (see the sketch after this list)
- Add country-specific development indicators
- Review metadata on indicators and sources
- Review geographical areas/maps
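One way to carry out the indicator-list review is a simple set comparison between the national database's indicator names and the global MDG list. A minimal sketch, assuming one indicator name per line in each file (the file names are assumptions; a real review also has to reconcile wording differences between lists):

```python
# Compare the national indicator list against the global MDG list.
def load(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

national = load("national_indicators.txt")
mdg_global = load("mdg_global_indicators.txt")

print("Missing from national DB:", sorted(mdg_global - national))
print("Country-specific additions:", sorted(national - mdg_global))
```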

21 Producer–User Dialogue
- Establish linkages with line ministries
- Regular forums
- Unification of standards and methods
- Unification of classifications and definitions
- Data entry by line ministries through the web
- Train users to understand statistics, indicators, and data analysis

22 Dissemination
- Press releases
- Web-enabling
- Dissemination through workshops
- Addition of DevInfo links to NSO sites (including training material)
- Provision of the DevInfo CD with the national MDGR
- Knowledge transfer
- Roll-out advocacy campaign

23 Structure & Content
- Spell check
- Thematic databases
- Always refer to global DB structures: import MDG indicators and goals from the global DB, do not type them in
- Planning of DB content is essential
- Continuous check of data quality
- Metadata: follow standard naming of sources; document all changes made
- Assessment of data availability has to be done BEFORE creating a template
- The source of data should specify the original producer, e.g. MOH for health indicators
- Naming of sources should follow the pattern “PAL_MOH_DHS_2003” (see the sketch after this list)
- Set up different DBs for large amounts of information, e.g. censuses & surveys
- Remove indicators with no data values to avoid over-reporting
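The naming pattern shown (“PAL_MOH_DHS_2003”, i.e. country_producer_survey_year) lends itself to an automated check. A minimal sketch, assuming that four-part convention holds for every source name (the exact rule for each part is an assumption):

```python
import re

# Assumed convention: COUNTRY_PRODUCER_SURVEY_YEAR, e.g. "PAL_MOH_DHS_2003".
SOURCE_PATTERN = re.compile(r"^[A-Z]{3}_[A-Z]+_[A-Z]+_\d{4}$")

def nonconforming_sources(sources):
    """Return the source names that do not follow the convention."""
    return [s for s in sources if not SOURCE_PATTERN.match(s)]

print(nonconforming_sources(["PAL_MOH_DHS_2003", "Ministry of Health 2003"]))
# -> ['Ministry of Health 2003']
```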

24 Mapping
- Continent and country level: apply ISO coding
- Ensure each area ID is connected to its area name and vice versa (see the sketch after this list)
- Start working towards a target for infrastructure maps
- The area ID is the key to the mapping
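A minimal sketch of the “each area ID connected to its area name and vice versa” check, assuming the area list is available as (area_id, area_name) pairs (the input format is an assumption):

```python
from collections import defaultdict

def check_area_mapping(pairs):
    """Flag area IDs mapped to more than one name, and names mapped to
    more than one ID, so the ID <-> name link stays one-to-one."""
    names_by_id = defaultdict(set)
    ids_by_name = defaultdict(set)
    for area_id, name in pairs:
        names_by_id[area_id].add(name)
        ids_by_name[name].add(area_id)
    for area_id, names in names_by_id.items():
        if len(names) > 1:
            print(f"ID {area_id} has several names: {sorted(names)}")
    for name, ids in ids_by_name.items():
        if len(ids) > 1:
            print(f"Name {name!r} has several IDs: {sorted(ids)}")

check_area_mapping([("JOR", "Jordan"),
                    ("JOR", "Hashemite Kingdom of Jordan")])
```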

25 Data Quality
- During data entry, prefer inserting values rather than formulas
- Review data and consistency manually; do not rely on automated methods alone
- Do not include indicators with missing values
- Avoid duplication
- Ensure completeness of data values and time series (see the sketch after this list)
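A minimal sketch of two of the checks above, duplicate detection and time-series gaps, assuming records of the form (indicator, area, year, value); the sample data is purely illustrative:

```python
from collections import Counter, defaultdict

records = [
    ("Literacy rate", "Amman", 2000, 89.1),
    ("Literacy rate", "Amman", 2002, 90.4),
    ("Literacy rate", "Amman", 2002, 90.4),  # duplicate entry
]

# 1. Duplicates: the same indicator/area/year entered more than once.
dupes = [k for k, n in Counter(r[:3] for r in records).items() if n > 1]
print("Duplicates:", dupes)

# 2. Time-series gaps: years missing between the first and last value.
years = defaultdict(set)
for indicator, area, year, _ in records:
    years[(indicator, area)].add(year)
for key, ys in years.items():
    missing = set(range(min(ys), max(ys) + 1)) - ys
    if missing:
        print(f"Gaps for {key}: {sorted(missing)}")
```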

26 Thank you

