
Judges Score Card Overview Mark Turner NASA Ames Research Center.




1 Judges Score Card Overview Mark Turner NASA Ames Research Center

2 Ground Tournaments
- Purpose is to provide interim monetary awards and feedback to Teams
- Four Ground Tournaments, six months apart (GT4 is required for the EM-1 launch)
- Evaluation, analysis, and simulations will be performed by ARC's Mission Design Center, WFF's Mission Planning Lab, and Subject Matter Experts
January 7, 2015

3 GT Concept of Operations
Teams prepare data:
- 40% – Probability of Mission Success
- 60% – Compliance with SLS Interface Requirements & Specific Challenge Rules
Judges: simulations and review by ARC-MDC, WFF-MPL, and SMEs; review by SLS PIM, SLS S&MA, and the SLS Safety Panel. Questions for clarification and Safety Hazard Report action items go back to Teams.
Judges send scores and evaluations to the Centennial Program Office; final scoring and ranking determine awards, and scores (with feedback) go to Teams.
ARC-MDC: Ames Mission Design Center
WFF-MPL: Wallops Mission Planning Laboratory
SMEs: Subject Matter Experts

4 Purpose of the Score Card
- Provide a method to objectively judge each team against a set of clearly defined metrics
- Provide each team a clear picture of what data it will need to provide for each Ground Tournament (GT)
- Provide Teams a clear picture of how they will be evaluated for each GT

5 Score Card Development
Subject Matter Experts from ARC, WFF, and GSFC developed and reviewed the Score Card over several months. ARC-MDC and WFF-MPL developed the detailed inputs needed from Teams to run simulations. Two judges reviewed the Score Card.

6 Scorecard Evaluation Criteria
Probability of Mission Success – 40% of total score
- Likelihood of Achieving Communications Goals
- Likelihood of Achieving Orbit or Distance Goals
- Likelihood of Meeting Longevity Goals
- Systems Design Maturity Relative to GT
Compliance with SLS Interface Requirements & Specific Challenge Rules – 60% of total score
- Specific Challenge Rules
- SLS Secondary Payload Deployment System (SPDS) IDRD requirements (beginning with GT2)
- Safety Panel Hazard Report:
  GT1 – identify potential safety hazards
  GT2 & GT3 – identify how you plan to address these hazards
  GT4 – provide data to prove hazards have been mitigated
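The two-category weighting above can be illustrated with a short sketch. This is not the judges' actual scoring tool; the function name, the 0–100 sub-score scale, and the example values are assumptions made for illustration only.

```python
# Illustrative sketch of the 40/60 scorecard weighting (not the official tool).
# Sub-scores on a 0-100 scale are an assumed convention.

WEIGHTS = {
    "mission_success": 0.40,  # Probability of Mission Success
    "compliance": 0.60,       # SLS Interface Requirements & Challenge Rules
}

def total_score(mission_success: float, compliance: float) -> float:
    """Combine the two category scores using the stated 40%/60% weights."""
    return (WEIGHTS["mission_success"] * mission_success
            + WEIGHTS["compliance"] * compliance)

# Example: the heavier compliance weight dominates the total.
print(total_score(mission_success=70, compliance=90))  # 0.4*70 + 0.6*90 ≈ 82.0
```

Because compliance carries 60% of the weight, a team strong on compliance can outscore a team with a somewhat higher mission-success rating.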

7 Inputs from Teams
Each category identifies REQUIRED inputs and RECOMMENDED inputs (as noted in the tabs on the Score Card).
REQUIRED inputs identify data critical for judges to evaluate each Team:
- Required data is the minimum acceptable data for judging the competition
- By minimizing the amount of required inputs, each team can allocate its resources as needed to develop a successful mission design
RECOMMENDED inputs identify data NASA would typically develop for in-house missions:
- Given that teams are operating with limited resources, each team can decide which recommended data it will provide
- Recommended data gives the judges additional insight into how robust and complete the proposed mission design is

8 Scorecard Evaluation Criteria
Probability of Mission Success – 40% of total score
- Likelihood of Achieving Communications Goals
- Likelihood of Achieving Orbit or Distance Goals
- Likelihood of Meeting Longevity Goals
- Systems Design Maturity Relative to GT
Compliance with SLS Interface Requirements & Specific Challenge Rules – 60% of total score
- Specific Challenge Rules
- SLS Secondary Payload Deployment System (SPDS) IDRD requirements (beginning with GT2)
- Safety Panel Hazard Report:
  GT1 – identify potential safety hazards
  GT2 & GT3 – identify how you plan to address these hazards
  GT4 – provide data to prove hazards have been mitigated

9 Expectations
Teams will reach higher design fidelity as the series of GTs progresses: less fidelity is expected at GT1, and high fidelity is expected at GT4. Judges' grading will become more stringent as the series of GTs progresses.

10 Evaluation Criteria for GTs
GT1 – Evaluate the mission architecture and concepts to meet Challenge Goals, and evaluate plans to meet Rules & Interface Requirements.
GT2 – Evaluate plans and designs to meet Challenge Goals, Rules & Interface Requirements. Determine if the integrated design is mature enough to continue with final designs and fabrication.
GT3 – Evaluate the readiness to begin system Assembly, Integration & Testing, and progress toward meeting Challenge Goals, Rules & Interface Requirements.
GT4 – Evaluate the readiness for launch and the likelihood of meeting Challenge Goals, and ensure final compliance with Rules & Interface Requirements. Three Flight selections shall be made at GT4, with one additional team as backup.
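The GT4 selection step (three flight selections plus one backup) amounts to ranking teams by final score and slicing the list. A minimal sketch, assuming invented team names and scores:

```python
# Hypothetical final GT4 scores; the names and values are invented
# for illustration and are not actual competition data.
final_scores = {
    "Team A": 88.5,
    "Team B": 91.2,
    "Team C": 79.0,
    "Team D": 84.3,
    "Team E": 90.1,
}

# Rank teams by final score, highest first.
ranked = sorted(final_scores, key=final_scores.get, reverse=True)

flight_selections = ranked[:3]  # three Flight selections at GT4
backup = ranked[3]              # one additional team as backup

print(flight_selections)  # ['Team B', 'Team E', 'Team A']
print(backup)             # Team D
```

The real judging process also weighs compliance findings and safety items, so this rank-and-slice view is only the final-ordering step.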

11 Schedule
SPUG: SLS Secondary Payload User's Guide
IDRD: Interface Definition Requirement Document

12 Scorecard Examples


