Industry Engagement Request Questionnaire 1 Assessment Methodology

1 Industry Engagement Request Questionnaire 1 Assessment Methodology
Meeting with Companies, May 3, 2013

2 Purpose
To provide companies with details on the assessment methodology that will be used in assessing your responses to Questionnaire 1;
To provide clarifications and answers to questions; and
To provide companies with information on next steps.
We are committed to an open, fair and transparent process.

3 Industry Engagement Request Questionnaire 1
Canadian fighter mission and threat information across two time horizons (2020–2030 and 2030+).
Responses to include defined mission configurations: Air-to-Air, Air-to-Ground, and Air-to-Surface.
Responses to include capability developments (planned upgrades).
Responses to include descriptions of capability, production and supportability.
Companies are being asked to provide 3 separate configurations for each of the 2 time horizons, so potentially 6 different configurations could be included in the responses. Questions 1 to 18 of the questionnaire are specific to “capabilities”, while questions 19 to 24 focus on “production and supportability”.

4 PRINCIPLES
The following principles will guide all participants involved in the assessment of the responses:
Consistency – Team leads will ensure that the assessment is well documented, including all qualitative and professional military judgement aspects. Individual aircraft are being assessed against each Canada First Defence Strategy (CFDS) mission. No aircraft will be screened out; all options will remain on the table. The final product will be a high-level risk assessment.
Objective – The National Fighter Procurement Secretariat will provide oversight through all steps of the assessment process.
Use of Third Parties – Responses will be shared with the Independent Review Panel (the Panel) members, and any significant differences of opinion between assessment teams will be brought to the attention of the Panel.

5 Overview of Q1 Assessment
The “Capabilities” questions of the Industry Engagement Request will be assessed using a 3-step assessment process:
Step 1 – Assessment of Measures of Performance (MOP);
Step 2 – Assessment of Measures of Effectiveness (MOE); and
Step 3 – Assessment of Aerospace Capabilities weighted against Canada First Defence Strategy (CFDS) mission and criticality, resulting in an “Operational Risk” assessment of each aircraft for each of the missions.
The “Production and Supportability” questions will be assessed using a 2-step assessment process:
Step 1 – Assessment of responses by technical Subject Matter Expert teams on Military Strategic Assessment Factors – Impact Statements; and
Step 2 – Assessment of responses by Royal Canadian Air Force senior leadership from a missions perspective, resulting in a “Strategic Risk” assessment of each aircraft for each of the missions.
The end result of the “Capabilities” assessment will thus be translated into an “Operational Risk”, and the “Production and Supportability” assessment will result in a “Strategic Risk”. These two assessments will then be consolidated into an overall risk profile for each aircraft against each mission.
These assessments will not produce any “pass” or “fail” results. The end product will not include any point rating, but rather an assessment of the benefits and risks associated with each aircraft in meeting the needs of each mission. As indicated in the Terms of Reference, the publicly released summary report will respect commercial sensitivities and classified information restrictions.

6 Step 1 of Capabilities Assessment
Assessments are being performed using responses to the 17 areas of “Capabilities” contained in the Industry Engagement Request. National Defence teams are assessing specific areas, with oversight from the National Fighter Procurement Secretariat. Qualitative assessment, independent of missions. Results in a Measures of Performance assessment for each aircraft.
Sensors: Radio Frequency Sensors; Electro-optical or Infrared Sensors
Weapons: Air-to-Air Weapons; Air-to-Ground Weapons; Air-to-Surface Weapons; Non-Kinetic Weapons (Electronic Attack)
Self Protection: Radio Frequency Self Protection; Infrared Self Protection; Countermeasures
Avionics: Data Link; Pilot Workload; Communications; Sensor Integration
Performance: Radio Frequency Signature; Infrared Signature; Engine/Airframe; Combat Radius/Endurance
The 1st part of the assessment will focus on the companies’ responses to questions 1 to 18 of the questionnaire. These 18 questions refer specifically to the “Measures of Performance” (or MOPs) detailed in the Industry Engagement Request. There are 17 different Measures of Performance. Separate teams from National Defence have been set up to assess each of the 17 Measures of Performance shown on this slide. Each team will be made up of up to 5 subject-matter experts from the Department of National Defence. Each member of each team will assess the systems based on their area of expertise (e.g., Radio Frequency Sensors). Members will discuss their views and their assessments with the goal of reaching consensus on the Measures of Performance rating. These team discussions will be held with the oversight of the National Fighter Procurement Secretariat to ensure consistency of assessment methodology between teams. Each rating will be fully documented.

7 Evaluation Grid for Step 1
Performance rating of individual systems.
Rating scale from 0 to 10 to allow sufficient fidelity.
Classified systems information represented by a number (supported by classified references).
Rating: Definition
10-9 Excellent: Without appreciable deficiencies
8-7 Very Good: Limited only by minor deficiencies
6-5 Good: Limited by moderate deficiencies
4-3 Poor: Limited by major deficiencies
2-1 Very Poor: Significantly limited by major deficiencies
0 Non-Existent
For the assessment of the 17 Measures of Performance, the assessment teams will use this assessment grid. This part of the assessment does not take the various Canada First Defence Strategy missions into consideration; the missions will be accounted for later in the assessment.
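The rating grid can be captured as a small lookup. A minimal sketch in Python, assuming the 0–10 scale reads as 10–9 Excellent down to 0 Non-Existent (the exact band boundaries are inferred from the slide layout, not stated explicitly):

```python
def mop_band(rating: int) -> str:
    """Map a 0-10 Measure of Performance rating to its qualitative band.

    Band boundaries are inferred from the evaluation grid on this slide.
    """
    if not 0 <= rating <= 10:
        raise ValueError("rating must be between 0 and 10")
    bands = [
        (9, "Excellent"),     # 10-9: without appreciable deficiencies
        (7, "Very Good"),     # 8-7: limited only by minor deficiencies
        (5, "Good"),          # 6-5: limited by moderate deficiencies
        (3, "Poor"),          # 4-3: limited by major deficiencies
        (1, "Very Poor"),     # 2-1: significantly limited by major deficiencies
        (0, "Non-Existent"),  # 0: capability not present
    ]
    for floor, label in bands:
        if rating >= floor:
            return label

print(mop_band(8))  # Very Good
```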

8 Step 2 of Capabilities Assessment
Three Royal Canadian Air Force teams (four members each) are performing this assessment, with oversight from the National Fighter Procurement Secretariat, including fighter pilots and at least one Air Force Air Weapon Controller on each team. Significant differences of opinion between assessment teams will be brought to the attention of the Panel.
The following inputs will be used in this assessment:
Scores and input from the results of Step 1;
Companies’ responses to the Industry Engagement Request, including the mission configuration systems;
Vignettes detailed in Appendix “A” of the Industry Engagement Request;
Detailed threat capabilities from the Canadian Forces Threat Assessment;
Detailed fighter task lists; and
Other information available to Government.
Fighter Measures of Effectiveness areas of consideration are detailed in the Industry Engagement Request.
Once the 17 Measures of Performance (MOPs) have been assessed and assigned a point rating, three separate teams from the Air Force will conduct the second step of the capabilities assessment. This will result in “Measures of Effectiveness” (MOE) ratings, on a scale of one to ten. The second step of the assessment will be informed by the results of Step 1 (MOPs). The teams will also use the Vignettes that were included at the end of Questionnaire 1, as well as the detailed threat capabilities and associated fighter task lists (these latter two components were not included in the Questionnaire due to security considerations). The teams will also consider open source and Government-to-Government information. Any material differences of opinion between the assessment teams will be brought to the attention of the Independent Review Panel.

9 Information on Missions as per the Canada First Defence Strategy (CFDS)
1 – Conduct daily domestic and continental operations, including in the Arctic and through NORAD (Vignette 1)
2 – Support a major international event in Canada, such as the 2010 Olympics (Vignette 2)
3 – Respond to a major terrorist attack (9/11-type scenario under CFDS Mission 1 through NORAD) (Vignette 5)
4 – Support civilian authorities during a crisis in Canada such as a natural disaster (No Vignette)
5 – Lead and/or conduct a major international operation for an extended period [including complex peace enforcement operations and coalition state-on-state war-fighting] (Vignettes 3 & 4)
6 – Deploy forces in response to crises elsewhere in the world for shorter periods (Vignette 6)
Of the 6 Canada First Defence Strategy missions above, only 5 map to the 6 vignettes. Mission 4 (support civilian authorities during a crisis in Canada such as a natural disaster) will be analyzed together with Mission 2, on the understanding that the fighter’s role in this mission is generally minor. This slide provides information on the missions as per the Canada First Defence Strategy and the related Vignettes of the Industry Engagement Request. All but one mission is reflected in the Vignettes; Mission 4 of the Canada First Defence Strategy will be analyzed together with Mission 2. Further information on the Vignettes can be found in Appendix “A” to the Questionnaire.

10 Information on Aerospace Capabilities
Defensive Counter Air (DCA)
Offensive Counter Air (OCA)
Strategic Attack
Close Air Support (CAS)
Land Strike
Tactical Support to Maritime Operations (TASMO)
Intelligence, Surveillance, and Reconnaissance (ISR)
This slide refers to the various Aerospace Capabilities that were referenced on the previous slide; not all aerospace capabilities are required for all CFDS missions. For example, when defending Canada, the “Land Strike” aerospace capability would not be assessed.

11 Information on Fighter Measures of Effectiveness
Awareness
Survivability
Reach and Persistence
Responsiveness
Lethality
Interoperability

12 Step 2 of Capabilities Assessment Overview
Step 2 will first generate “raw” scores for Measures of Effectiveness (MOEs). Raw results will then be weighted against the applicable mission to produce an Operational Level Aerospace Capability Score for each aircraft against each mission over the 2 timeframes. Mission weightings are sensitive and are not included as part of this presentation.
This second step of the Capabilities assessment introduces the missions, as well as the varying aerospace capabilities, such as Defensive Counter Air, that would be required to fulfill a given mission. Using the results of Step 1 (Measures of Performance), the Vignettes, open source and government-to-government information, as well as detailed threat capabilities and associated fighter task lists, the assessment teams will determine a raw score for each Measure of Effectiveness for each of the two time horizons. The Measures of Effectiveness mission weightings will then be applied against the raw scores, resulting in an “Operational Level Aerospace Capability Score”.

13 Evaluation Grid for Step 2 Raw Scores
Effectiveness rating of contributing system performances.
Multiple systems will contribute to each Measure of Effectiveness.
Will vary across all missions.
Rating scale from 1 to 10 to allow sufficient fidelity.
Qualitative, and thus requires Subject Matter Expert input.
Will result in “raw” scores for Step 2.
Rating: Definition
10-9 Excellent: No appreciable limitations in delivering the desired effect
8-7 Very Good: Only minor limitations in delivering the desired effect
6-5 Good: Moderate limitations in delivering the desired effect
4-3 Poor: Major limitations in delivering the desired effect
2-1 Very Poor: Severe limitations in delivering the desired effect
The grid used for the 2nd step of the Capabilities assessment is similar to the one used in the first step. The 2nd step involves conducting a qualitative assessment of Measures of Effectiveness using the results of the Measures of Performance as well as other available information.

14 1. Defence of Canada – Baseline and Arctic
Step 2 Example
CFDS Mission: 1. Defence of Canada – Baseline and Arctic
Aerospace Capability: Defensive Counter Air
Vignette 1: The Canadian fighter is conducting normal daily and contingency North American Aerospace Defence (NORAD) missions at normal alert levels, and is prepared to react to elevated alert levels. Operations are conducted from main operating bases as well as Forward Operating Locations, and missions may be over land and over water. Operations are being conducted in a combined and/or joint environment with NORAD forces. The threats for these missions can be air- and maritime-based.
Threats:
Non-state civilian aircraft (including low/slow)
State long-range aviation
Classified Threat List (NORAD)
2030+: State long-range aviation and escorts; non-state use of stand-off missile technologies
Tasks:
- Rapidly establish and maintain air presence/policing/deterrence for a tactically significant period of time in an area of operations.
- Detect, validate, characterize, assess, and warn of unwanted and unauthorized air activity approaching or operating within North American domestic airspace, including potential attacks against North America. (NCI , 1.2.1)
- Detect, Identify, Determine Intent, Monitor, Intercept, Report, Negate, or Destroy potential air threats.
This example shows how the “Capabilities” assessment comes together using Vignette 1 (daily domestic/continental operations) and the Defensive Counter Air Aerospace Capability.

15 Step 2 Example – Assessment of Raw Scores
Defensive Counter Air (MOEs) using Vignette 1 in the 2020 to 2030 timeframe:
MOE | Aircraft X | Aircraft Y
Lethality | 2 | 9
Survivability | 4 |
Responsiveness | 6 |
Reach and Persistence | |
Interoperability | 5 | 8
Awareness | |
Step 1 results (Measures of Performance):
Tech Area | Contributing System | Aircraft X | Aircraft Y
Sensors | Radio Frequency Sensors | 3 | 8
Sensors | Electro-optical or Infrared Sensors | 7 |
Weapons | Air-to-Air Weapons | 2 |
Weapons | Air-to-Ground Weapons | |
Weapons | Air-to-Surface Weapons | |
Weapons | Non-Kinetic Weapons | 5 |
Self Protection | Radio Frequency Self Protection | 9 |
Self Protection | Infrared Self Protection | |
Self Protection | Countermeasures | |
Avionics | Data Link | 10 |
Avionics | Communications | |
Avionics | Sensor Integration | |
Avionics | Pilot Workload | |
Performance | Radio Frequency Signature | 4 |
Performance | Infrared Signature | 6 |
Performance | Engine/Airframe | |
Performance | Combat Radius/Endurance | |
For the purposes of the example, only two aircraft (Aircraft X and Aircraft Y) are assessed. The second table provides the results from the 17 teams that performed the Step 1 assessments (MOPs). This summary shows the results of the assessment for the Air-to-Air Weapons configuration, applicable to Defensive Counter Air. The 17 teams will create assessment summaries for each configuration submitted by the companies. Using the information contained in the assessment of the 17 Measures of Performance (as well as other available information), the Step 2 teams will collectively discuss and determine a score for each of the Measures of Effectiveness (such as Lethality) shown in the first table.

16 Step 2 Example – Capability Raw Scores
Mission: 1. Defence of Canada (Baseline and Arctic)
Aerospace Capability: Defensive Counter Air
Measures of Effectiveness (Aircraft X) – Lethality, Survivability, Reach and Persistence, Responsiveness, Interoperability, Awareness: 2, 4, 9, 5, 6
Raw Score: 5.8
The same roll-up is produced for the other aerospace capabilities: Offensive Counter Air, Strategic Attack, Close Air Support, Land Strike, TASMO and ISR.
This example uses the scores shown on the previous slide. The Raw Score of 5.8 is arrived at by averaging the scores of each Measure of Effectiveness within a given aerospace capability. This translates into moderate limitations (illustrated by yellow) in the Defensive Counter Air aerospace capability within a Defence of Canada mission scenario. Example Only
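The roll-up described above is a plain average of the MOE scores. A minimal sketch in Python with illustrative values; the sixth score (Awareness = 9) is hypothetical, chosen only to make the arithmetic concrete, since not all scores are shown on the slide:

```python
from statistics import mean

# Illustrative MOE scores for one aerospace capability.
moe_scores = {
    "Lethality": 2,
    "Survivability": 4,
    "Reach and Persistence": 9,
    "Responsiveness": 5,
    "Interoperability": 6,
    "Awareness": 9,  # hypothetical value, not taken from the slide
}

# The raw aerospace-capability score is the unweighted mean of its MOE scores.
raw_score = round(mean(moe_scores.values()), 1)
print(raw_score)  # 5.8
```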

17 Step 2 Example – Capability Raw Scores
Mission: 1. Defence of Canada (Baseline and Arctic)
Aerospace Capability: Defensive Counter Air
Measures of Effectiveness (Aircraft Y) – Lethality, Survivability, Reach and Persistence, Responsiveness, Interoperability, Awareness: 9, 6, 8
Raw Score: 7.66
Slide 17 shows the raw score of Aircraft Y for the example. This translates to minor limitations (illustrated by light green) in the Defensive Counter Air Aerospace Capability within a Defence of Canada mission scenario. Example Only

18 Step 2 - Capability (Weighted)
Applying a weighting function to each Measure of Effectiveness allows the capability to be properly quantified within that mission.
Mission: 1. Defence of Canada (Baseline and Arctic)
Aerospace Capability (Weighted), Measures of Effectiveness (Aircraft X): Lethality, Survivability, Reach and Persistence, Responsiveness, Interoperability, Awareness
Weighted Score: 6.7
This example applies the weightings for Defence of Canada against the raw scores to show the next step of the assessment. For this particular mission, Aircraft X’s score moves from a raw score of 5.8 to 6.7 following the application of the weightings. For clarity, the original (non-weighted) Measures of Effectiveness colour-coded ratings have been left on this chart to illustrate the overall effect of weighting on the roll-up of Measures of Effectiveness for each Aerospace Capability. In this particular case, Lethality and Survivability are clearly not as important as Reach and Persistence to the Defensive Counter Air capability. Example Only
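The weighting step can be sketched the same way. The actual mission weightings are sensitive and unpublished, so the weights below are purely hypothetical, picked only to reproduce the 5.8 to 6.7 movement in the example (the Awareness score is likewise an assumed value):

```python
# Hypothetical MOE raw scores for Aircraft X (Awareness value assumed).
scores = {
    "Lethality": 2,
    "Survivability": 4,
    "Reach and Persistence": 9,
    "Responsiveness": 5,
    "Interoperability": 6,
    "Awareness": 9,
}

# Hypothetical mission weightings; the real ones are sensitive and unpublished.
# They emphasize Reach and Persistence over Lethality and Survivability,
# as the narration suggests for Defensive Counter Air in Defence of Canada.
weights = {
    "Lethality": 0.10,
    "Survivability": 0.10,
    "Reach and Persistence": 0.30,
    "Responsiveness": 0.20,
    "Interoperability": 0.10,
    "Awareness": 0.20,
}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1

# Weighted aerospace-capability score: the weighted mean of the MOE scores.
weighted = round(sum(scores[m] * weights[m] for m in scores), 1)
print(weighted)  # 6.7
```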

19 Step 2 - Capability (Weighted)
Applying a weighting function to each Measure of Effectiveness allows the capability to be properly quantified within that mission.
Mission: 1. Defence of Canada (Baseline and Arctic)
Aerospace Capability (Weighted), Measures of Effectiveness (Aircraft Y): Lethality, Survivability, Reach and Persistence, Responsiveness, Interoperability, Awareness
Weighted Score: 7.1
Aircraft Y’s score has moved from a raw score of 7.66 to 7.1 following the application of the weightings. This illustrates the importance of Reach and Persistence, and of Responsiveness, to the Defensive Counter Air Aerospace Capability. Example Only

20 Step 3 of Capabilities Assessment
This last step of the Capabilities Assessment takes into account: a) the weighting of each Fighter Aerospace Capability within a particular mission (DCA – XX%, OCA – XX%, TASMO – XX%, ISR – XX%); and b) criticality: Mission Critical (MC), Mission Essential (ME), or Mission Routine (MR).
The Operational Level Score obtained at the end of Step 2 is aggregately weighted and assessed by the Royal Canadian Air Force against the mission criticality matrix. Aerospace capability limitations will carry risk forward dependent on their mission criticality (the translation matrix on slide 22 expands on this).
The final step, Step 3, of the “Capabilities” assessment takes into consideration the criticality of a given mission. No assessment teams are used for this part of the assessment; rather, a Translation Matrix is used that works as follows: each mission assigns a Criticality Factor to each of the applicable Aerospace Capabilities, based on mission criticality as defined by the Royal Canadian Air Force’s Capability Based Planning cycle. Only three possible Criticality Factors are used: Mission Critical, Mission Essential and Mission Routine. The specific Mission Criticality and the Aerospace Capability weightings against CFDS missions are sensitive information. For example, Defensive Counter Air for missions in Canada could be a Mission Critical factor. This is explained using an example in the next slides.

21 Step 3a – Operational Mission Weighting
Operational level scoring (before mission criticality) is tabulated by CFDS mission and vignette (CFDS 1/V1, CFDS 2/4/V2, CFDS 3/V5, CFDS 5/V3 and V4, CFDS 6/V6) for each aircraft (Aircraft A through E, plus Aircraft Y in this example). The weighting of fighter roles against CFDS missions and the mission criticality (Critical/Essential/Routine) of each Aerospace Capability are then applied to produce the Operational Risk table.
This example is for missions in Canada (Vignette 1); therefore only three Aerospace Capabilities apply: Defensive Counter Air (DCA), Tactical Support to Maritime Operations (TASMO) and Intelligence, Surveillance and Reconnaissance (ISR). In this example, Aircraft Y scored 7.1 for DCA (light green), 6.1 for TASMO (yellow) and 7.3 for ISR (light green). As a result there are 2 light green risk boxes and one yellow box. Before applying the mission criticality matrix, Aircraft Y would score 7.0 (or light green). However, the mission criticality of each Aerospace Capability must be considered (the example used does not represent the actual mission criticality for CFDS Mission 1): DCA is considered Mission Critical, TASMO is also Mission Critical, and ISR is considered Mission Essential. Once each of the Aerospace Capabilities is run through the Mission Criticality Translation Matrix, the result is Significant Operational Risk (yellow) for that aircraft. The next slide explains how the 2 light greens and one yellow result in an overall assessment of yellow.

22 Step 3b - Mission Criticality Translation Matrix
Weighted Capability Assessment of each Aerospace Capability + Criticality of Aerospace Capability = Maximum Operational Assessment
Definitions:
Mission Critical: A capability that delivers a direct effect as part of its primary function. This effect is evaluated as critical to mission success, and failure to employ it will pose risk to mission success.
Mission Essential: A capability that is an essential enabler to Mission Critical capabilities; the lack of one or more Mission Essential capabilities will pose risk to mission success.
Mission Routine: A capability that is required for the mission, but either as a routine supporting function or with a very low likelihood of employment. Only in cases where multiple Mission Routine capabilities constitute a systems failure will any significant risk be posed to the mission.
On the previous slide, DCA was considered Mission Critical and was assessed as light green (7.1). Using the matrix shown on this slide, a light green score for a Mission Critical element means the final Operational Risk assessment cannot score better than light green. TASMO was also considered Mission Critical and was assessed as yellow (6.1); a yellow score for a Mission Critical element means the final Operational Risk assessment cannot score better than yellow. Finally, ISR was considered Mission Essential and was assessed as light green (7.3); a light green score for a Mission Essential element means the final Operational Risk assessment cannot score better than dark green. Under the principle of the matrix, the overall operational score is set by the single score carrying the highest risk. In this example, yellow is the highest risk factor and therefore becomes the Operational score for that mission.
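One plausible reading of the translation matrix is a cap-and-worst-case rule: each capability’s colour, relaxed by zero, one or two bands depending on its criticality, caps the overall assessment, and the final rating is the worst of those caps. A minimal sketch under that assumption; the band ordering and the relaxation amounts are inferred from the worked example, not stated in the source:

```python
# Risk bands from lowest to highest risk (colour coding from the slides;
# the orange/red bands are inferred from the five risk definitions).
BANDS = ["dark green", "light green", "yellow", "orange", "red"]

# Assumed relaxation per criticality: Mission Critical carries its colour
# forward unchanged, Mission Essential relaxes one band, Mission Routine two.
RELAXATION = {"Mission Critical": 0, "Mission Essential": 1, "Mission Routine": 2}

def operational_risk(capabilities):
    """capabilities: list of (band, criticality) pairs for one mission.

    Returns the overall operational risk band for that mission: the worst
    (highest-risk) of the per-capability caps.
    """
    caps = []
    for band, criticality in capabilities:
        idx = BANDS.index(band) - RELAXATION[criticality]
        caps.append(max(idx, 0))  # cannot relax below the lowest-risk band
    return BANDS[max(caps)]

# Worked example from the slides: DCA light green (Critical),
# TASMO yellow (Critical), ISR light green (Essential) -> yellow overall.
print(operational_risk([
    ("light green", "Mission Critical"),
    ("yellow", "Mission Critical"),
    ("light green", "Mission Essential"),
]))  # yellow
```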

23 Operational level scoring (before Mission criticality) 2020-2030
Step 3 – “Operational Risk” Assessment
Rating: Definition
10-9: Insignificant limitations in accomplishing the mission
8-7: Minor limitations in accomplishing the mission
6-5: Moderate limitations in accomplishing the mission
4-3: Major limitations in accomplishing the mission
2-1: Severe limitations in accomplishing the mission
Operational level scoring (before mission criticality) covers CFDS 1 (Vignette 1), CFDS 2/4 (Vignette 2), CFDS 3 (Vignette 5), CFDS 5 (Vignettes 3 and 4) and CFDS 6 (Vignette 6) for Aircraft A through E (Example).
Risk definitions:
Low operational risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
Medium operational risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
Significant operational risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
High operational risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
Very high operational risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
Step 3 End State: applying the Mission Criticality Translation Matrix produces the “Operational Risk” table (CFDS 1 through CFDS 6, Aircraft A through E, Example).
This slide summarizes the previous two slides, showing how an aircraft’s assessment has moved from light green (minor operational limitations) to yellow (significant operational risk). This completes the “Capabilities” portion of the assessment methodology.

24 Assessment of the “Production and Supportability” Questions
Step 1 – Review of responses by technical Subject Matter Expert teams – no scoring.
Step 2 – Assessment of responses by Royal Canadian Air Force senior leadership from a Canada First Defence Strategy missions perspective, resulting in a “Military Strategic Risk” assessment of each aircraft against the missions.
Both steps will be overseen by the National Fighter Procurement Secretariat. Significant differences of opinion between assessment teams will be brought to the attention of the Panel.
This slide outlines the assessment methodology for the “Production and Supportability” questions. These are questions 19 to 24 of the IER, grouped into the following 4 Strategic Assessment Factors:
1. Acquisition
2. Supportability and Force Management
3. Integration
4. Growth Potential
There is a 2-step process for this part of the assessment. The first step will be conducted by teams of operational Subject Matter Experts (LCol/Maj level) from the Royal Canadian Air Force with extensive backgrounds in the respective areas of the Strategic Assessment Factors. They will review each of the responses, supplemented by other available information such as open source and Government-to-Government data. The Step 1 reviews will result in a qualitative assessment documented with comments from the assessment teams, but without any scoring. The comments will provide a risk analysis of each response in the 4 areas of Production and Supportability. These 4 areas will be referred to as “strategic” components. The analysis completed by the Step 1 teams will be provided to senior Air Force staff (e.g., Colonel level) for assessment. This team will consider the analysis provided in Step 1 and determine risk ratings (or scores) for each of the overall “strategic” component risks against each mission, in the same 2 time horizons that were assessed for Capabilities.

25 Factor Analysis through Deductive Reasoning
Where does the industry response fit inside the framework?
The slide diagram shows Factors feeding Considerations (“So what?”), which feed Deductions (“So what?”). Step 1 produces the Strategic Assessment (“There is a possible risk that…”), framed by the mission need and the assessed threat. Step 2 then considers likelihood, possible impacts and required mitigations to produce the Strategic Risk Assessment.
Source documentation: CFDS; industry responses to the Industry Engagement Request; open source study; Government-to-Government information; and other products from the Evaluation of Options (used during Step 2 only).
Slide 25, entitled “Factor Analysis through Deductive Reasoning”, illustrates how this 2-step assessment will be conducted and what information will be used during the assessment. These “strategic” assessment teams will use professional military judgement based on a standard deductive reasoning process. Given a set of Factors, the teams will apply their experience and subject matter expertise to run through a “so what” analysis of considerations; this will be the output of Step 1. The Step 2 team will then deduce the Strategic Risk to accomplishing the mission through further analysis of the considerations (likelihood of occurrence, impact statements and mitigation measures). The responses to the questionnaire are one of the source documents feeding into this deductive reasoning process.

26 Acquisition
This Strategic Assessment Factor (SAF) is an assessment of factors, other than cost, that would affect acquisition of aircraft types. This includes:
planned production periods of aircraft types;
the manufacturer’s ability to continue or re-establish production;
the ability of the manufacturer to offer a complete “cradle to grave” program, including training, in-service/life-cycle support and disposal;
the manufacturer’s successful completion of any required developmental work before the aircraft are ready for acquisition; and
the ability of the aircraft type to be certified for airworthiness under Canadian Forces regulations.

27 Supportability and Force Management
An assessment of the overall/long-term supportability and of aspects related to the management of the fighter force. This includes factors such as:
the required quantity of aircraft;
managed readiness postures and sustainment ratios;
the training system and production of appropriately trained pilots and maintenance/support personnel (including use of simulators);
training considerations for maintenance/aircrew personnel;
aircraft mission availability rates and aircrew/technician workload;
long-term availability of components needed for aircraft maintenance;
implementation and sustainment of supporting infrastructure;
the ability to operate from required locations;
weapons compatibility and support; and
the suitability of a reprogramming capability, including its flexibility and responsiveness to new threats, theatres of operation or new capabilities.

28 Integration This Strategic Assessment Factor assesses broad interoperability within the Canadian Forces and with allied forces. This includes interoperability with air-to-air refuelling services (other than Canadian Forces), common ground/spares support with allies and their supply lines, training systems, data sharing, communications, STANAGs, and the ability to feed data into Canadian Forces and Government of Canada networks taking into consideration national security requirements.

29 Growth Potential
This Strategic Assessment Factor assesses the growth potential and technological flexibility to respond to unforeseen future advances in threat capabilities, to implement required enhancements to fighter technology, and to evolve as needed to meet the Canadian Forces’ needs. Factors include analysis of the architecture of aircraft types, and power and cooling capability for new systems.

30 Strategic Assessment Factors Other reports resulting
Response to IER Step 1 Aircraft - SAF Assessment Technical SME level working groups Problem space defined and scoped by the use of a “possible risk” based questionnaire SME comment on likelihood; then Comment on possible impacts and required mitigations to the overall fighter program Information to be collated and considered in Step 2. Strategic Assessment Factors Other reports resulting from evaluation of options Acquisition Supportability & Force Management Integration Growth Potential Future Security Env’t Open Source/Other Info and Gov-to-Gov Info Definition Low strategic risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X. Medium strategic risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X. Significant strategic risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X. High strategic risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X. Very high strategic risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X. End State - Strategic Risk Strategic Risk CFDS 1 CFDS 2/4 CFDS 3 CFDS 5 CDFDS 6 V1 V2 V5 V3 V4 V6 Aircraft A Aircraft B Aircraft C Aircraft D Aircraft E Example Step 2 SAF - CFDS assessment Senior leadership level Working group (fighter heavy) Problem space defined and scoped by the Task 2 report - “Considering the likelihood of possible impacts to the overall Fighter Program, and based on your Professional Military Judgment, how would you rate the strategic risk to the RCAF ability to successfully fulfill the fighter contribution to each CFDS mission?” Slide 30 provides a pictorial view of the 2 steps involved in the strategic assessment of the responses. The box in the top left hand side of the slide lists the documents and factors that will be used to conduct the assessment of the 4 Strategic Assessment Factors shown in the middle of the slide. 
Listed on the top right-hand side of the slide are the Step 1 team reviews that will be conducted. These reviews will identify possible risks, the likelihood and impacts of those risks, and potential mitigation strategies. The work of the Step 1 teams will be overseen by the NFPS. Listed on the bottom right-hand side of the slide are the factors that senior Air Force staff will consider in Step 2 when reviewing the Step 1 analysis; these will be used to determine risk ratings (or scores) for each of the 4 strategic components, against each mission and in each of the 2 time horizons. This assessment will be done through the lens of the CFDS missions and performed in a group discussion environment with NFPS oversight. At that point, a strategic risk rating will be assigned to each of the 4 strategic factors for each mission. The "End State" box at the bottom of the slide takes all 4 strategic factors and assigns an overall risk rating per mission. This step will be done by the same team that performed the Step 2 assessment and will be based on professional military judgement.
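The slides describe a five-level risk scale and a judgement-based roll-up of the 4 Strategic Assessment Factors into one per-mission rating. A minimal sketch of that structure is below. The deck says the roll-up is made by professional military judgement, not by a formula; taking the worst factor rating (`max`) is purely an illustrative, conservative stand-in, and the function name is ours, not the Secretariat's.

```python
from enum import IntEnum


class Risk(IntEnum):
    """Five-level strategic risk scale from the slides (Low to Very High)."""
    LOW = 1
    MEDIUM = 2
    SIGNIFICANT = 3
    HIGH = 4
    VERY_HIGH = 5


# The four Strategic Assessment Factors named on slide 30.
FACTORS = (
    "Acquisition",
    "Supportability & Force Management Integration",
    "Growth Potential",
    "Future Security Environment",
)


def end_state_strategic_risk(factor_ratings: dict) -> Risk:
    """Roll the 4 factor ratings into one per-mission strategic risk.

    Illustrative only: the real roll-up is professional military
    judgement; here we conservatively take the worst factor rating.
    """
    return max(factor_ratings[f] for f in FACTORS)


ratings = {
    "Acquisition": Risk.MEDIUM,
    "Supportability & Force Management Integration": Risk.LOW,
    "Growth Potential": Risk.SIGNIFICANT,
    "Future Security Environment": Risk.MEDIUM,
}
print(end_state_strategic_risk(ratings).name)  # SIGNIFICANT
```

Because `Risk` is an `IntEnum`, the five levels order naturally, which is what makes a worst-case roll-up a one-liner.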

31 Final Stage – Overall Mission Risk Assessment
[Diagram: Strategic Risk and Operational Risk are fused to produce the End State: Mission Risk]

32 End State Mission Risk (fusion of Operational and Strategic risk)
Risk rating definitions:
- Low risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
- Medium risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
- Significant risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
- High risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.
- Very high risk to the RCAF ability to successfully fulfill the fighter contribution to CFDS mission X.

[Example matrices: Operational Risk, Strategic Risk and the resulting Mission Risk for Aircraft A to E and Y (variants V1 to V6) against CFDS missions 1, 2/4, 3, 5 and 6]

Mission Risk (fusion of Operational and Strategic risk):
- RCAF to apply professional military judgement
- C Air Force endorsement of Final Report

Speaker notes: Mission Risk will be determined by a working group of senior Air Force personnel, who will consider the Operational Risk and the Strategic Risk using a mission risk translation matrix as a decision support tool. This slide provides an example of a situation where the Operational Risk is rated as yellow and the Strategic Risk as light green. When both are run through the translation matrix, the result is an overall Mission Risk of yellow for Mission 1.

33 Mission Risk
Mission Risk will be an outcome of the fusion of Strategic and Operational risk by the same RCAF senior leadership assessment teams who took part in Step 2 of the Strategic Assessments. Decision support tools, such as but not limited to a mission risk translation matrix and the DND/CF Integrated Risk Management Guidelines, will be used to determine a rating. The final assessment will be informed by, but not limited to, the decision support tools provided. Assessors will be allowed to deviate from the mission risk translation matrix by plus or minus one risk level. The rationale behind a final assessment that deviates from the mission risk translation matrix will be provided to the Independent Review Panel for review.
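The translation matrix itself is not published, but the rules above (fuse two five-level ratings, permit a deviation of at most one level, require a rationale for any deviation) can be sketched. The fusion rule used here, rounding the average of the two inputs toward the higher level, is an assumption for illustration only; the function and parameter names are ours.

```python
from enum import IntEnum


class Risk(IntEnum):
    """Five-level risk scale from the slides."""
    LOW = 1
    MEDIUM = 2
    SIGNIFICANT = 3
    HIGH = 4
    VERY_HIGH = 5


def translate(operational: Risk, strategic: Risk) -> Risk:
    """Illustrative stand-in for the mission risk translation matrix.

    The real matrix is not published; a ceiling average of the two
    inputs is an assumed, conservative fusion rule.
    """
    return Risk(-(-(operational + strategic) // 2))  # ceiling of the mean


def mission_risk(operational: Risk, strategic: Risk,
                 adjustment: int = 0, rationale: str = "") -> Risk:
    """Apply the matrix, allowing a +/- one level deviation with rationale."""
    if adjustment not in (-1, 0, 1):
        raise ValueError("deviation from the matrix is limited to one risk level")
    if adjustment != 0 and not rationale:
        raise ValueError("a deviation must be justified for the Review Panel")
    base = translate(operational, strategic)
    return Risk(min(max(base + adjustment, Risk.LOW), Risk.VERY_HIGH))
```

For example, `mission_risk(Risk.MEDIUM, Risk.SIGNIFICANT)` yields `Risk.SIGNIFICANT` under this assumed rule, and an assessor could lower that to `Risk.MEDIUM` only by supplying a rationale for the Panel.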

34 Summing Up We are committed to an open, fair and transparent process
IER and methodology posted publicly on a Government website. All questions and answers during the IER shared simultaneously with all companies. Details of the assessment methodology shared with companies in advance of the IER response deadline. NFPS overseeing the entire assessment process. Significant differences of opinion between assessment teams will be brought to the attention of the Panel. The process will result in an assessment of the benefits and risks associated with each aircraft against each mission. Canada will release a summary report that respects commercial sensitivities and classified information restrictions.
