External Review Results
November 30, 2011
Reta Beebe
Structure of the Build 1d Review
Based on concerns raised by MC members at the Flagstaff F2F meeting (Aug 22-23) about readiness for external review, the Build 1d external assessment was divided into Part 1 (review of documents) and Part 2 (review of implementation).
- The call for the Part 1 review was issued Sept 6 with a two-week review period.
- The call for the Part 2 review was issued Sept 29 with a two-week review period.
Responders
1. Elias Barbinis – JPL, Radio
2. Ed Bell – NSSDC
3. Jim Bell – ASU, MIR
4. David Choi – LPL/U of A, Dynamics, heavy PDS user
5. Bob Deen – JPL, Imaging
6. Alexandria DeWolfe – LASP/U of C, MAVEN SOC
7. Larry Granroth – PDS-PPI / U of I
8. Steve Levoe – JPL, Imaging
9. Jerry Manweiler – Fund Tech, Cassini/MIMI archivist
10. Sarah Mattson – U of A, imaging data node
11. Bea Mueller – PSI, ground-based, comet collection
12. Kim Murray – ASU/Mars Space Flight Facility
13. Conor Nixon – NASA GSFC, Cassini/CIRS archivist
14. Mike Reid – JHU/APL, Messenger SOC
15. Ninel Smirnova – SRI, Russian Academy of Sciences, Grunt
16. Paul Withers – Boston University, ATMOS user & reviewer
Summary
- The sixteen responders submitted a total of 405 comments.
- Every responder appreciated the opportunity to provide input and was complimentary of the effort. About half finished Part 2.
- The comments have been classified and compiled into a single spreadsheet. Many comments overlap, and some have already been addressed. Some comments resulted from a misunderstanding on the part of the reviewer.
- The original and compiled spreadsheets have been posted for access by the DDWG.
- Suggested changes are under consideration by the DDWG; some updates have already been made.
Summary of Responses
1. Principally focused on documents, with the vast majority of issues being editorial comments. The Part 1 documents received more comments. Examples: consolidate documents further and reduce overlap, address inconsistencies, etc.
2. Many cautioned that the true test is in doing something with PDS4.
3. Need better indexes, appendices, page numbering, etc.
4. Some commented that there is a lot of terminology.
5. Some suggested improvements in the use of XML Schema (e.g., more inheritance, resolving conflicting terminology).
6. More examples, including PDS3 vs. PDS4.
7. Support UTF-8/Unicode.
8. Support delimited tables.
9. How to make PDF/A; how to archive software.
10. Capture HTML resources.
11. Clarification on bundle versus collection of collections.
12. Some discipline extensions need to be improved (e.g., coordinate systems).
Common Themes Between the 1d Review and Internal Exercises
- It is important to exercise the end-to-end process and gain experience.
- PDS4 can better leverage XML Schema.
- There is a need for more examples and for clarity in each step of the process of building PDS4 products. Bundles and collections in particular have been called out.
- Documents need improvement.
Next Steps
- The results have been turned over to the DDWG to plan the short-term and long-term improvements to the emergent PDS4 standard. The DDWG members will present on these issues and the short-/long-term plan in the afternoon session.
- The key outstanding issues impacting Build 2b have been integrated into the Build 2b release plan.
- Finalize the scope of Build 2c at the March MC meeting and plan for its release in May. Between now and March, nodes should feed their needs to Faith so they can be used to finalize the scoping.
Conclusion From Reviews, External & Internal
- Examples and hands-on exercise of the system are badly needed at this point to show proof of concept.
- Migration should be a great venue for this exercise: working through incrementally more difficult datasets should expose the holes as we go.