Metrics Planning Group (MPG) Report to Plenary Clyde Brown ESDSWG Nov 3, 2011
MPG Focus on Product Data Quality and Citation Metrics
Product Quality Metrics

Overall Objective: Given that the objective of the MEaSUREs program is to produce and deliver readily usable, high-quality Earth Science Data Records:
– Define program-level metric(s) that permit assessment of the steps taken and progress made to ensure that high-quality products are provided by MEaSUREs projects and by the MEaSUREs program as a whole.
– Develop a draft recommendation for Product Quality Metric(s) that would then go through the regular MPARWG review process.

Recommendations from the October 2010 ESDSWG meeting:
– Develop a checklist of a small number of questions that represent progress in the product quality area.
– Product quality was considered to be a combination of scientific quality, the completeness of associated documentation and ancillary information, and the effectiveness of supporting services.
– Responsibility for product quality is shared between the projects generating the ESDRs and the DAACs that eventually archive and distribute them.
Product Quality Metrics

Completed work on the question checklists and reached agreement on a first version to work with.

Next steps:
– Projects and DAACs compile an initial set of checklists; P.I.s send them to Rama.
– Rama creates a strawman set of project-level summary roll-ups and an aggregated program-level roll-up, and sends them back to the P.I.s.
– Hold a telecon to discuss and modify the summary roll-ups.
– Draft the MPARWG recommendation for product quality metrics (i.e., the agreed summary roll-ups).
Draft Project Checklist

Science Quality Level | Response
1. Have the data been evaluated by community representatives? (Summarize results) | Y / N w/results
2. Is the data set complete as proposed and consistently processed? (Explain 'partial'.) | Y / P / N
3. Is error understood, including in its spatial or temporal dimensions? | Y / P / N
4. Have the data been validated, i.e., 'assessed for estimated accuracy, errors, and uncertainties, to the extent possible by comparison with alternative measurements,' by the project prior to release? | Y / P / N
5. Are the new data original or different from existing products? (Explain how, and in what ways.) | Y / N
6. Have promised improvements in the new data compared to existing products been achieved? | Y / N
7. Have the ESDR's algorithm or analysis method, product description, and product evaluation results been published in the peer-reviewed literature? | Y / N

Documentation Quality Level | Response
1. Has a data format description been provided to the DAAC? | Y / N
2. Have descriptions of the algorithm and processing steps been provided to the DAAC? | Y / N
3. Has documentation of metadata been provided to the DAAC? | Y / N

Usage and Satisfaction | Response
1. If the project is distributing products, is the targeted community using the data? | Y (trend) / N / NA
2. Is the broader community using the data? | Y (trend) / N / NA
3. Are users satisfied with the data product? | Y / N / NA

Per a reviewer recommendation, responses to all questions could include comments and would use a text format, such as Word, that facilitates commenting.
Draft DAAC Checklist

Science Quality Level | Response
1. Have the new data been compared with existing products? (Summarize results) | Y / P / N

Documentation Quality Level | Response
1. Is the data format well and completely described, and/or is a commonly accepted, appropriate standard format used? | Y / N
2. Are the algorithm and processing steps described? | Y / N
3. Does the documentation include complete metadata? | Y / N

Accessibility / Support Services Quality | Response
1. Is it easy for users to discover the data? | Y / N
2. Is it easy for users to access the data? | Y / N
3. Are tools and services that enable reading and use of the data readily available? | Y / N
4. Is it easy for users to use the data? | Y / N
5. Can users get help with discovery, access, and use of the data? | Y / N

Usage and Satisfaction | Response
1. For products distributed by the DAAC, is the targeted community using the data? | Y (trend) / N
2. For products distributed by the DAAC, is the broader community using the data? | Y (trend) / N
3. For products distributed by the DAAC, are users satisfied with the data product? | Y / N

Per a reviewer recommendation, responses to all questions could include comments and would use a text format, such as Word, that facilitates commenting.
Citations Metrics

A change to the baseline adding new Citations Metrics was recommended by the MPARWG in 2009 and approved by NASA HQ in October 2010 for the FY2011 reporting year. NASA HQ requested a report on the first year of citations metrics. The expectation was that MEaSUREs projects that are REASoN continuations would be in the best position to begin reporting Citations Metrics in 2011.

By September 30, 14 projects had reported on citations:
– 6 of the 7 MEaSUREs projects that are REASoN continuations reported.
– 8 new MEaSUREs projects reported.
– 1,708 citations in peer-reviewed publications were reported (excluding ISCCP), along with 235 citations in non-peer-reviewed publications.

The goal of this session was to examine the experience and lessons learned from the first-year effort and chart a course for citations metrics reporting in 2012. The report to NASA HQ will reflect the results of the first year of citations metrics reporting and the way forward agreed to here.
Citations Metrics

Reviewed the citation metrics for FY2011 and the methods used by the projects, in order to identify best practices and assess the level of effort.

Next steps:
– Develop guidance for projects based on this year's experience and the results of our discussion.
– Citation metrics for FY2012 will be collected by September 30 to allow for the annual report to NASA HQ.
Future Work

MPG will continue to function on an ad hoc basis to consider metrics issues as they arise, e.g.:
– Metrics for distributed services
– Ensuring that data accesses by online services are accounted for (e.g., which data granule(s) were accessed to produce a plot)