Judges Score Card Overview
Mark Turner, NASA Ames Research Center

Ground Tournaments
– Purpose is to provide interim monetary awards and feedback to Teams
– Four Ground Tournaments, six months apart (GT4 is required for EM-1 launch)
– Evaluation, analysis, and simulations will be performed by ARC's Mission Design Center, WFF's Mission Planning Lab, and Subject Matter Experts
January 7, 2015

GT Concept of Operations (flow)
– Teams prepare data: 40% Probability of Mission Success; 60% Compliance with SLS Interface Requirements & Specific Challenge Rules
– Judges' simulations and review by ARC-MDC, WFF-MPL, and SMEs; review by SLS PIM, SLS S&MA, and the SLS Safety Panel (producing questions for clarification and Safety Hazard Report action items)
– Judges and the Centennial Program Office compile scores and evaluations
– Judges perform final scoring and ranking, leading to awards and to scores (and feedback) returned to Teams
Acronyms: ARC-MDC: Ames Mission Design Center; WFF-MPL: Wallops Mission Planning Laboratory; SMEs: Subject Matter Experts

Purpose of the Score Card
– Provide a method to judge each team objectively against a set of clearly defined metrics
– Provide each team a clear picture of what data it will need to provide for each Ground Tournament (GT)
– Provide Teams a clear picture of how they will be evaluated at each GT

Score Card Development
– Subject Matter Experts from ARC, WFF, and GSFC developed and reviewed the Score Card over several months.
– ARC-MDC and WFF-MPL developed the detailed inputs needed from Teams to run simulations.
– Two judges reviewed the Score Card.

Scorecard Evaluation Criteria
Probability of Mission Success – 40% of total score
– Likelihood of Achieving Communications Goals
– Likelihood of Achieving Orbit or Distance Goals
– Likelihood of Meeting Longevity Goals
– Systems Design Maturity Relative to GT
Compliance with SLS Interface Requirements & Specific Challenge Rules – 60% of total score
– Specific Challenge Rules
– SLS Secondary Payload Deployment System (SPDS) IDRD requirements (beginning with GT2)
– Safety Panel Hazard Report:
  GT1: identify potential safety hazards
  GT2 & GT3: identify how you plan to address these hazards
  GT4: provide data to prove hazards have been mitigated
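The stated 40/60 weighting can be sketched as a simple weighted sum. This is an illustration only, assuming each category is scored on a 0–100 scale; the overview does not specify the actual scale or the sub-weights within each category.

```python
# Hypothetical sketch of the scorecard's top-level weighted total.
# Assumes (not stated in the overview) that each category is a 0-100 score.

MISSION_SUCCESS_WEIGHT = 0.40  # Probability of Mission Success
COMPLIANCE_WEIGHT = 0.60       # Compliance with SLS interface reqts & rules

def total_score(mission_success: float, compliance: float) -> float:
    """Combine the two scorecard categories using the stated weights."""
    return (MISSION_SUCCESS_WEIGHT * mission_success
            + COMPLIANCE_WEIGHT * compliance)

# Example: a team scoring 80 on mission success and 90 on compliance
print(total_score(80, 90))  # 0.4*80 + 0.6*90 = 86.0
```

Because compliance carries the larger weight, a team that neglects the SLS interface requirements loses more ground than one with a less mature mission-success case.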

Inputs from Teams
– Each category identifies REQUIRED inputs and RECOMMENDED inputs (as noted in the tabs on the Score Card).
– REQUIRED inputs identify data critical for judges to evaluate each Team:
  – Required data is the minimum acceptable data for judging the competition.
  – By minimizing the amount of required inputs, each team can allocate its resources as needed to develop a successful mission design.
– RECOMMENDED inputs identify data NASA would typically develop for in-house missions:
  – Given that teams are operating with limited resources, each team can decide which recommended data it will provide.
  – Recommended data gives the judges additional insight into how robust and complete the proposed mission design is.
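The required/recommended split lends itself to a simple pre-submission self-check. The input names below are invented placeholders, not the actual Score Card tabs; the sketch only shows the shape of such a check.

```python
# Hypothetical self-check a team might run before a GT submission.
# Input names are placeholders, NOT the real Score Card tab names.

REQUIRED = {"mission_overview", "orbit_design", "comm_link_budget"}
RECOMMENDED = {"thermal_analysis", "mass_margin_history"}

def check_submission(provided):
    """Return (missing required, missing recommended) input sets.

    Missing REQUIRED items block judging; missing RECOMMENDED items
    only reduce the judges' insight into the design.
    """
    provided = set(provided)
    return REQUIRED - provided, RECOMMENDED - provided

missing_req, missing_rec = check_submission(["mission_overview",
                                             "orbit_design"])
print(missing_req)  # {'comm_link_budget'} -- must be fixed before submitting
```

Keeping the two tiers separate mirrors the slide's point: required gaps are disqualifying, recommended gaps are a trade each team makes against its resources.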


Expectations
– Teams will reach higher design fidelity as the series of GTs progresses: lower fidelity is expected at GT1 and high fidelity at GT4.
– Judges' grading will become more stringent as the series of GTs progresses.

Evaluation Criteria for GTs
GT1 – Evaluate the mission architecture and concepts to meet Challenge Goals, and evaluate plans to meet Rules & Interface Requirements.
GT2 – Evaluate plans and designs to meet Challenge Goals, Rules & Interface Requirements. Determine whether the integrated design is mature enough to continue with final design and fabrication.
GT3 – Evaluate readiness to begin system Assembly, Integration & Testing, and progress toward meeting Challenge Goals, Rules & Interface Requirements.
GT4 – Evaluate readiness for launch and the likelihood of meeting Challenge Goals, and ensure final compliance with Rules & Interface Requirements. Three flight selections shall be made at GT4, with one additional team as backup.
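The GT4 selection rule above (top three for flight, one backup) can be sketched as a straightforward ranking. Team names and scores are invented for illustration; the overview does not say how ties would be broken.

```python
# Minimal sketch of the GT4 selection rule: rank by final score,
# take the top three for flight plus one backup. Data is invented.

def select_flight_teams(scores, n_flight=3, n_backup=1):
    """Return (flight teams, backup teams) ranked by descending score."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:n_flight], ranked[n_flight:n_flight + n_backup]

scores = {"Team A": 91.5, "Team B": 88.0, "Team C": 84.2,
          "Team D": 79.9, "Team E": 77.1}
flight, backup = select_flight_teams(scores)
print(flight, backup)  # ['Team A', 'Team B', 'Team C'] ['Team D']
```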

Schedule
SPUG: SLS Secondary Payload User's Guide
IDRD: Interface Definition Requirement Document

Scorecard Examples