Evaluation and Improvement

Presentation on theme: "Evaluation and Improvement" — Presentation transcript:

1 Evaluation and Improvement
HSEEP Exercise Evaluation and Improvement
Welcome to the HSEEP Exercise Evaluation Training Course. This course was developed by ODP with support from the contractors who helped put together the HSEEP Exercise Evaluation and Improvement guidance document. A feedback form is included in your notebook. We need your help to make this course more helpful to you and others, so please complete it at the end of each section, no later than the end of each day. You’ll be reminded again later.
Instructor Notes: Introduce yourself and the rest of the instructors and support staff, including the facility host if appropriate. Include relevant aspects of your background to establish credibility, or allow each instructor to do the same. Make general administrative remarks:
Locations of restrooms and other facilities (e.g., phones)
Where to pick up phone/fax messages
Time allocated for breaks and lunch
Exit locations
Rules of engagement:
Use cell phones outside the classroom
Set pagers/beepers to vibrate
No smoking in the facility
No test, but there will be a course completion certificate, based on full participation

2 ODP’s Mission
Primary responsibility within the executive branch to build and sustain the preparedness of the United States to reduce vulnerabilities to, prevent, respond to, and recover from acts of terrorism (Homeland Security Act).
ODP was created within the Department of Justice in 1998 to enhance preparedness to respond to acts of terrorism. ODP moved into the new Department of Homeland Security, along with 21 other agencies, on March 1, 2003. With that move, ODP was given the broader mission stated above. The assignment of expanded responsibilities broadens ODP’s constituency from a primarily state and local focus to include federal departments and agencies, tribal governments, the private sector, and the international community. ODP’s original emphasis on response capabilities has been broadened to include efforts to reduce vulnerabilities and to prevent and recover from acts of terrorism.

3 ODP’s Responsibilities
Grant programs for planning, equipment, training, and exercises
National training program
National exercise program
ODP’s mission is implemented through an integrated program of grants, training, and exercises. For those of you who are contractors, your focus is on developing and conducting exercises. Do you know how those exercises fit into the larger integrated program? How many of you are familiar with the Assessment and Strategy Development process implemented under the grant programs? How many are familiar with ODP’s training programs and the course catalogue? I am going to provide a quick overview of ODP’s programs, the assessment and strategy process, and our national training and exercise program. I will then talk about the Homeland Security Exercise and Evaluation Program and how the evaluation of performance under exercises serves as a primary measure of effectiveness for all of ODP’s programs.

4 Grant Programs
State Homeland Security Program
Law Enforcement Terrorism Prevention Program
Citizen Corps Program
Urban Areas Security Initiative Program
Fire Fighter Assistance Program

5 State Homeland Security Program
Purpose: to enhance the capacity of states and local jurisdictions to prevent, respond to, and recover from terrorism
Provides funds for:
Homeland security and emergency operations planning
The purchase of specialized equipment
The design, development, and conduct of state CBRNE and cyber security training programs, and attendance of ODP-sponsored training
The design, development, conduct, and evaluation of CBRNE and cyber security exercises
Implementing State Homeland Security Assessments and Strategies
Funds are provided to states, D.C., and the territories (FY 2003: $1.9B; FY 2004: $1.9B)

6 Law Enforcement Terrorism Prevention Program
Provides law enforcement communities with funds to support the following prevention activities:
Information sharing to preempt terrorist attacks
Target hardening to reduce the vulnerability of selected high-value targets
Recognition of potential or developing threats
Interoperable communications
Intervention of terrorists before they can execute a threat
Funds can be used for planning, organization, training, exercises, and equipment

7 Citizen Corps Program
Provides funds to support Citizen Corps Councils with planning, outreach, and management of Citizen Corps programs and activities:
Bring together the appropriate leadership to form and sustain a Citizen Corps Council
Develop and implement a plan for the community to engage all citizens in homeland security, community preparedness, and family safety
Conduct public education and outreach to inform the public about their role in crime prevention, mitigation, emergency preparedness, and public health measures
Develop and implement Citizen Corps programs offering training and volunteer opportunities to support first responders, disaster relief groups, and community safety efforts, including the four charter federal programs: Community Emergency Response Teams (CERT), Neighborhood Watch, Volunteers in Police Service (VIPS), and Medical Reserve Corps (MRC)
Coordinate Citizen Corps activities with other DHS-funded programs and other federal initiatives

8 Urban Areas Security Initiative Program
New program in FY 2003, designed to address the unique needs of large urban areas (50 cities)
Conduct a jurisdictional assessment and develop an Urban Area Homeland Security Strategy
Direct grants to local jurisdictions for planning, equipment, training, exercises, and administrative and operational activities related to heightened threat levels

9 Fire Fighter Assistance
Protects the public and fire fighters against fire and fire-related hazards
Fire fighting operations and safety
Fire prevention
Fire fighting vehicles

10 Strategy Process Overview
Step 1: Assessments, conducted at the local and state levels
Step 2: Statewide Homeland Security Strategy, created at the regional and state level
Step 3: State Assistance Plan, created by ODP
The states manage the assessment and strategy process, engaging local jurisdictions in the assessments. The state designates local jurisdictions for this purpose, which may be done at a county or regional level. For the Urban Areas Security Initiative Program, the assessment must be coordinated with and developed jointly by the core city, the core county, jurisdictions contiguous to the core city and county, or jurisdictions with which the core city and county have established mutual aid agreements. The Urban Areas develop a strategy that is coordinated with the state strategy. The state and Urban Area use the strategy to identify and allocate all homeland security resources. End result: capability improvements.

11 Strategy Participants
State and local jurisdictions
All first responder disciplines:
Fire Service
HazMat
Emergency Medical Services
Law Enforcement
Emergency Management
Public Safety Communications
Public Health
Health Care
Public Works
Government Administrative
Private Sector
Non-Profit/Voluntary Sector
The assessment and strategy process is an inclusive process with active participants from the full spectrum of homeland security disciplines.

12 Assessment Overview
Statewide Homeland Security Strategy
Risk Assessment: Threat Assessment, Vulnerability Assessment, Agricultural Vulnerability Assessment
Needs Assessment: Required Capabilities, Current Capabilities, Planning Factors, CBRNE* Scenarios
Shortfalls or “Gaps”
* CBRNE: Chemical, Biological, Radiological, Nuclear, Explosive
The assessment process is comprehensive and is supported by an on-line assessment tool. It requires that local jurisdictions and state agencies conduct a risk and needs assessment. The risk assessment includes a threat and a vulnerability assessment. Jurisdictions may also choose to conduct an agricultural vulnerability assessment, which is new with the current assessment tool. Jurisdictions also conduct a needs assessment that looks at capabilities and needs.

13 Threat Assessment
Who: Local, state, and federal law enforcement officials
What:
Identify the number of Potential Threat Elements (PTEs)
Identify threat factors (existence, violent history, intentions, WMD capability, and targeting)
Identify motivations (political, religious, environmental, racial, or special interest)
Identify WMD capabilities (CBRNE)
The threat assessment measures the existence of potential threat elements located within the jurisdiction, and their capability, targeting, motivations, and history. Because the underlying information is sensitive, only the rating is included in the assessment. The assessment is used to determine the most probable kind of WMD incident that could occur at a potential target. This is useful information to get from the local jurisdiction when working with them to determine the type of exercise to conduct.

14 Vulnerability Assessment
Who: All response disciplines at local, state, and federal levels
What:
Identify critical infrastructure/potential targets
Evaluate targets for:
Level of visibility
Criticality of target site
Impact outside of jurisdiction
Access to target
Target threat of hazard
Target site population capacity
Potential collateral mass casualties
The vulnerability assessment provides a vulnerability profile and rating for all potential targets within the jurisdiction.

15 Capabilities and Needs: Planning
The results from the risk assessment process (threat and vulnerability) provide a link to the capabilities and needs assessment process. Through the capabilities and needs assessment, local jurisdictions assess their capabilities by discipline in five areas:
Planning
Organization
Equipment
Training
Exercises

16 State Homeland Security Strategy
Developed by the state based on local needs
Provides a blueprint for planning homeland security efforts to enhance preparedness and for the use of resources

17 State Assistance Plans
ODP uses the strategies and needs assessment data to tailor and formulate a State/Metro Assistance Plan (SAP/MAP) for each state
A SAP/MAP is a blueprint for the delivery of ODP training, exercise, technical assistance, and equipment services

18 National Training Program
Training for federal, state, and local homeland security professionals
Based on critical tasks to prevent, respond to, or recover from a terrorist incident
Over 40 courses available

19 ODP Training Program
ODP offers more than 40 courses. Examples:
Live chemical agents training – Center for Domestic Preparedness
Live explosives training – New Mexico Institute of Mining and Technology
Radiological and nuclear agents training – Nevada Test Site
Advanced emergency medical training using human patient simulators – Texas A&M
Training on bioterrorism – Louisiana State University

20 National Exercise Program
Responsible for the National Exercise Program
Threat- and performance-based exercises at the federal, state, local, and international levels
Strategy and Exercise Planning Workshops to define exercise needs and a plan for each state

21 Assess Program Success Through Exercises
Performance measures for ODP’s grant, training, and exercise programs are tied to performance of critical tasks
Percent of jurisdictions that can perform critical tasks as demonstrated through exercises:
500,000+ population
100,000+ population
50,000+ population

22 Overview of HSEEP
Threat- and performance-based exercises
A cycle of exercises of increasing complexity to improve preparedness
This section of the course describes some basic principles of the exercise evaluation and improvement activities. This section will be given by ODP for all contractor and state sessions. Additional slides and materials are to be added by ODP.

23 HSEEP Manuals
Volume I: Program Overview and Doctrine
Volume II: Exercise Evaluation and Improvement
Volume III: Exercise Development
Volume IV: Sample Exercise Documents and Formats

24 Vol I: HSEEP Overview and Doctrine
ODP’s exercise and evaluation doctrine
Uniform approach for exercise design, development, conduct, and evaluation
Exercise design and implementation process
Suite of common scenarios (TBD)

25 Vol II: Exercise Evaluation and Improvement
Defines the exercise evaluation and improvement process
Provides a uniform set of evaluation guides
Defines the data analysis process
Includes a standardized After-Action Report template

26 Vol III: Exercise Development
Defines the exercise planning and design process
Provides guidance for the development and conduct of various types of exercises

27 Vol IV: Sample Documents
Provides sample letters, planning documents, checklists, scenarios, etc.
Reduces development time for the exercise design team

28 Exercise Evaluation
Assess preparedness at the federal, state, and local levels
Validate strengths and identify improvement opportunities, resulting in improved preparedness
Provide a guide for resource allocations
Exercises provide a means to train and practice prevention, response, and recovery capabilities in a risk-free environment and to assess and improve performance. The goal of exercise evaluation is to validate strengths and identify improvement opportunities for the participating organization(s). This is accomplished by: observing the exercise and collecting supporting data; analyzing the data to compare performance against expected outcomes; and determining what changes need to be made to the procedures, plans, staffing, equipment, communications, organizations, and interagency coordination to ensure expected outcomes. The information obtained during an exercise can help review performance at several different levels, listed on this slide; we’ll discuss these in more detail in a moment. The level of analysis conducted on the exercise data will vary depending on the type of exercise. This training focuses on the most complicated exercise, the full-scale, multi-jurisdictional exercise. In addition, we will discuss the differences involved in conducting a tabletop exercise.

29 Evaluation Enhancements
Focus on performance of critical tasks and mission outcomes
Use of uniform evaluation tools
Enhanced data analysis
Debriefing meeting with key officials
Improvement Plan
Track implementation of improvements
Suite of common scenarios (TBD)

30 Exercise Evaluation Methodology Development
Exercise Evaluation Working Group
Builds on:
Responder Guidelines
ODP exercise experience
CSEP and other programs
Will continue to enhance and improve

31 Exercise Evaluation and Improvement Process
Evaluation planning, observation, and analysis (data collection and analysis):
Step 1: Plan and Organize the Evaluation
Step 2: Observe the Exercise and Collect Data
Step 3: Analyze Data
Step 4: Develop After Action Report
Improving preparedness:
Step 5: Conduct Debriefing
Step 6: Identify Improvements
Step 7: Finalize After Action Report
Step 8: Track Implementation
Here we’ll give you a very brief overview of the exercise design process and how evaluation fits into it. The overall process can be thought of in two major phases: evaluation planning, observation, and analysis (Steps 1 through 4), and improving preparedness (Steps 5 through 8).
Instructor’s Notes: The rest of this section should be done quickly since most of it is covered in Sections III and IV.

32 Levels of Analysis
Performance is analyzed at three levels:
Task level
Agency/discipline/function level
Mission level (within and across communities)

33 Levels of Analysis: Task Level Performance
Answers the question: did the person or team do the right thing, the right way, at the right time?
Helps assess the need for training, equipment, personnel, etc.
Task = work with measurable output that has utility
Task Level Performance: At the most fundamental level, an exercise evaluation can look at the ability to perform individual prevention and response tasks. A task can be defined as “work with a measurable output that has utility.” Analysis at this level will answer the question: did the individuals or team carry out the task the way that you expected and in a way that achieved the goal of the function? In other words, did the person or small team do the right thing the right way at the right time? The evaluation of the performance of individual tasks can help determine whether personnel, training, and equipment are sufficient for the individuals/teams to do their job. Such information is useful for team leaders and first-line supervisors when determining training needs, scheduling maintenance, and making routine purchases.
Note: DHS/ODP developed Emergency Responder Guidelines that identify the essential tasks that response agencies must perform to effectively prevent, respond to, and recover from a threat or act of terrorism, including those involving the use of CBRNE weapons.
The task in this picture is a firefighter sizing up a fire. The outcome or result of this task is an assessment of the extent and type of fire, so that firefighters can determine how to attack it. {Note: need more action-oriented task.}

34 Levels of Analysis: Agency/Discipline/Function Level Performance
Multiple teams
Answers the question: did the larger team or organization perform its duties in accordance with plans and policies?
Helps assess communication, coordination, planning, budgets, etc.
Exercise evaluation also assesses the performance of agencies (e.g., a police or fire department), disciplines (e.g., local, state, and federal law enforcement agencies), and functions, often as defined within the Incident Command System (e.g., a HazMat team, an Emergency Operations Center, or fire services). The purpose of evaluation at this level is to answer the question: did the larger team or organization accomplish its duties correctly in accordance with approved plans, policies, procedures, and agreements? Or did the team deviate from the planned response in an appropriate and successful way to meet the need, threat, and resources available at the time? The analysis at this level is useful for assessing such issues as advanced planning and preparation; how the members work together at the discipline, department, or organizational level; and how well team members communicate across organizational and jurisdictional boundaries. This information is used by department managers and agency executives at the state and local level in developing annual operating plans and budgets, in communicating with political officials, in setting long-range training and planning goals, and in developing interagency and interjurisdictional agreements. In this example, the agency or discipline represented is fire or Emergency Medical Services (EMS). Multiple tasks are being performed to evacuate this patient, such as monitoring the patient’s medical signs, transporting the patient on a stretcher, and using appropriate PPE.

35 Levels of Analysis: Mission Level Performance
Answers the question: were the mission level outcomes achieved?
Addresses jurisdictional preparedness
Outcomes = results
Mission Level Performance: As public officials know, success in a real emergency is measured by outcomes (or results). The public expects its government, law enforcement, and response agencies to prevent terrorist attacks if possible and, if attacked, to respond to and recover from these attacks quickly and effectively, mitigate the associated hazards, care for the victims, and protect the public. By focusing on performance and the root causes of variances from expected outcomes, public officials will be able to target their limited resources on improvements that will have the greatest effect on terrorism preparedness. In this example, we have multiple disciplines and agencies represented in an Emergency Operations Center (EOC); their mission is emergency management for public safety.
Why are we concerned about outcomes? Example: In the TOPOFF 2000 exercise, a military team was fully trained and equipped to rescue and decontaminate victims. In the scenario, there was a chemical WMD release at a port facility. After the live victims had been evacuated, there were many contaminated dead bodies remaining on the pier. The bodies were baking in the sun. Seagulls had arrived to pick at the remains. The victims’ families were traumatized, and the media were in a frenzy. But this military team wouldn’t remove the bodies to a makeshift morgue because it wasn’t in their procedure; their procedure only called for handling live victims, not dead ones. So they followed procedures, but missed the point: victim care, in this case care for the victims’ remains and the well-being of the victims’ families. So while we want to measure adherence to procedures, we also want to make sure we achieve the desired results.

36 Mission Outcomes
Pre-event: Prevention/Deterrence
Emergency response: Emergency Assessment, Emergency Management, Hazard Mitigation, Public Protection, Victim Care
Post-event: Investigation/Apprehension, Recovery/Remediation
For purposes of the HSEEP, ODP has defined eight mission level outcomes:
Prevention/Deterrence – the ability to prevent or deter terrorist actions
Emergency Assessment – the ability to detect an event, determine its impact, classify the event, conduct environmental monitoring, and make government-to-government notifications
Emergency Management – the ability to direct, control, and coordinate a response, provide emergency public information to the population at risk and the population at large, and manage resources
Incident/Hazard Mitigation – the ability to control and contain an incident at its source and to reduce the magnitude of its impact; this outcome also includes all response tasks conducted at the incident scene except those specifically associated with victim care
Public Protection – the ability to keep uninjured people from becoming injured once an incident has occurred
Victim Care – the ability to medically treat victims and handle human remains
Investigation/Apprehension – the ability to find the cause and source of the attack, prevent secondary attacks, and identify and apprehend those responsible
Recovery/Remediation – the ability to restore essential services and businesses, clean up the environment and render the affected area safe, compensate victims, and restore a sense of well-being to the community
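These eight outcomes form the fixed taxonomy that later steps tag observations against (timelines, EEGs, and Part 4 of the AAR). As a minimal illustrative sketch (HSEEP prescribes no software; the enum and field names below are assumptions made for this example), the taxonomy might be captured like this in Python:

```python
# Illustrative sketch only: HSEEP defines the eight mission outcomes as a
# taxonomy, not as software. The enum and field names are assumptions.
from enum import Enum

class MissionOutcome(Enum):
    PREVENTION_DETERRENCE = "Prevention/Deterrence"
    EMERGENCY_ASSESSMENT = "Emergency Assessment"
    EMERGENCY_MANAGEMENT = "Emergency Management"
    HAZARD_MITIGATION = "Incident/Hazard Mitigation"
    PUBLIC_PROTECTION = "Public Protection"
    VICTIM_CARE = "Victim Care"
    INVESTIGATION_APPREHENSION = "Investigation/Apprehension"
    RECOVERY_REMEDIATION = "Recovery/Remediation"

# Tagging each observation with an outcome keeps later roll-ups (timelines,
# AAR sections) consistent across evaluators and exercises.
observation = {
    "time": "0910",
    "text": "EAS message from EOC received by fax",
    "outcome": MissionOutcome.EMERGENCY_MANAGEMENT,
}
print(observation["outcome"].value)
```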

37 Evaluation Requirements
Determine what outcomes will be evaluated, based on exercise objectives
Identify activities to be evaluated
Identify which functions should be observed
Determine where observations will take place
Identify the appropriate evaluation tools
Commonly, many of the same people are involved in both the Exercise Planning Team and the Exercise Evaluation Team. Certainly the Exercise Director will participate and monitor the evaluation team’s planning, although the team should be led by the Lead Evaluator. The Exercise Lead Controller should also be involved. Exercise planning and evaluation planning are best accomplished concurrently. Let’s look now at the general requirements for successful exercise evaluation planning. The evaluation planning team will use the EXPLAN and MSEL to plan the evaluation, as follows: the team will first use the exercise goals and objectives to determine what performance outcomes should be evaluated. Once the outcomes to be evaluated are determined, the team identifies what activities should be evaluated. Based on these activities, the team identifies which functions (e.g., individuals, teams, disciplines, organizations) should be evaluated. From the functions, the team can identify where the observations should take place (i.e., what locations) and which specific tasks should be evaluated. Once these steps have been completed, the team can identify or develop the appropriate evaluation tools for the evaluators to use.

38 Exercise Evaluation Guides
ODP has developed Exercise Evaluation Guides (EEGs) that:
Identify the activities that the evaluator should be observing
Provide consistency in tasks across exercises
Link individual tasks to disciplines and outcomes
At the end of this section, we will discuss the ODP Exercise Evaluation Guides. These are tools that ODP has developed to create consistency in the evaluation process across all exercises.
Instructor Notes: Ask the students how an evaluator knows what to look for once they have been assigned to a particular location. Explain that EEGs provide a consistent way of defining those tasks. In the set of EEGs, there is basically one guide for every task to be observed; the Evaluation Team simply selects those EEGs pertinent to the particular exercise as part of the planning. Direct students to the EEGs in the notebooks. Explain how the guides are organized according to the eight mission outcomes described earlier; thus, there will be eight sets of EEGs once completed. Since the first set, pertaining to tasks that lead to the prevention and deterrence outcome, is not yet developed, the EEGs in the notebook start with the second set, under the outcome of Assessment; all EEGs under this outcome start with Roman numeral II. ODP wants to have these EEGs used for every ODP-sponsored exercise. This way, the AARs can be written following this structure, so that data can later be rolled up consistently to the national level, as described in the beginning of this course. Explain that the next three slides walk through the structure of an individual task EEG. EEGs can be sorted by function/discipline or outcome.

39 The EVALPLAN
Exercise-specific information
Plans, policies, procedures, and agreements
Evaluator recruiting and assignments
Evaluator training and instructions
Good evaluation planning should result in the development of an exercise Evaluation Plan (EVALPLAN), and as we have already said, evaluation planning starts as early as possible in exercise design and planning. The EVALPLAN provides an overview of the exercise and the plan for its evaluation. It is distributed to exercise planners, controllers, and evaluators. It should include the purpose of the exercise, a list of tasks and outcomes to be evaluated, and a list of participating jurisdictions, as well as administrative and logistical information for the exercise. There may be some duplication of material between the EXPLAN and the EVALPLAN; this allows evaluators to obtain all the important information they need in one planning document. The EVALPLAN typically consists of:
Exercise-specific information – the scenario/MSEL, the map of the play site (including evaluation locations), the exercise schedule (including the evaluation schedule), and the appropriate plans, policies, procedures, and agreements
Evaluator team information – how many evaluators are needed, where they will be located, and how they are organized
Evaluator instructions – what to do before they arrive (e.g., review exercise materials, jurisdictional plans and procedures, and the EVALPLAN) as well as how to proceed upon arrival
Ask students for examples, good and bad, from their experience, and whether (and when) the evaluation planning was or was not incorporated into the exercise development process.

40 Recruiting and Assigning Evaluators
Setting expectations – evaluators must be available for:
Pre-exercise training and briefing
Pre-exercise site visit
The entire exercise (hours to days)
Post-exercise hot-wash
Post-exercise data analysis (1 day)
Contribution to the draft AAR
Expectations: Evaluators are generally expected to be available for the pre-exercise training and briefing/site visit, the exercise itself, the post-exercise hot-wash, and data analysis and contribution to the AAR. This time commitment is usually equivalent to one day before the exercise, the exercise day(s), and one full day after the exercise.
Instructor Note: Facilitate a discussion as to how these expectations can be met, for example through incentives and the highest possible level of support from management and leadership.

41 Recording Observations
The emphasis is on: Who? What? When? Where? How? Why?
Record observations through:
Use of Evaluation Guides
Blank sheets of paper
Collect exercise documents
One way to record and capture information is to ask the questions who, what, when, where, how, and why. Information can be captured using the ODP EEGs or even just blank sheets of paper with a column for recording events and times. The EEGs provide space to create a chronological record of the action to address the above questions. Although the evaluator should be familiar with the expected outcomes and steps outlined in the “What to Look For” section of the guide and the questions in the “Data Analysis Questions and Measures” section, he or she should not try to use these as a checklist. It is important to concentrate on simply recording what is happening; the analysis of how well the exercise met expectations is done later, during the analysis phase.

42 Record Significant Activities
Initiating scenario events
Facility activities
Response actions
Key decisions made by players
Deviations from plans and procedures
Completion times of events
Here are a few items usually considered significant events during an exercise:
Initiating scenario events (e.g., the release begins)
Facility staffing, activation, time, and completion patterns (for example, who is there vs. who should be at the facility, and when did they arrive)
Actions of responders
Key decisions made by players such as directors, coordinators, judges, and politicians, the times the decisions were initiated, and the times they were completed
Deviations from plans and procedures
Exact times events were completed
Evaluators should remember that real-world emergency events can occur and should be recorded as well. Of course, a real-world event could delay exercise play, cancel play at one or more exercise locations, or cancel the entire exercise. Perhaps the best way to record significant events is to use a timeline format in taking your notes.

43 Evaluator Summary
Compile observations into a chronological narrative of events
Describe outcomes achieved or not achieved, using the questions below and the evaluation guides:
What happened?
What was supposed to happen?
If there is a difference, why?
What is the impact of that difference?
What should be learned from this?
What improvements might be recommended?
Once the exercise is completed and the evaluators have collected the appropriate additional information, each evaluator should compile his or her observations into a chronological narrative of events, describing outcomes achieved or not achieved. For any outcomes that are not achieved, the evaluator and the evaluator’s team should analyze the sequence of events and attempt to determine the cause of the issue, using the questions above. The evaluator should also refer to the specific questions provided at the end of each evaluation form, which may help in conducting a more detailed analysis of the specific events observed. Evaluators will then bring their individual narratives to the team analysis, described below.

44 Data Analysis
Conduct hotwash
Develop timeline of significant events
Analyze performance:
Individuals
Teams/Functions
Outcomes
For the next several viewgraphs, we will discuss data analysis of the exercise activities. This includes conducting a post-exercise hotwash, developing a timeline of events and the initial narrative of what happened, and finally analyzing the performance of the individuals, teams, functions, and outcomes. But first, the player hotwash.

45 Player Hotwash
Usually held immediately following exercise play
Typically facilitated by the evaluator
Provides an opportunity for:
Player self-assessment
An interactive discussion
Clarification of observed events
Assessment of exercise simulations
The hotwash is best held as soon as possible after exercise play ends, and before players start departing from the area. It shouldn’t take more than an hour. The hotwash is typically facilitated by the evaluator assigned to the particular location. It allows the evaluator to ask questions to clarify points or situations; it is important not to make judgmental or subjective statements. The hotwash:
Allows players to participate in a self-assessment of the exercise play
Facilitates an interactive discussion to clarify actual activities during the response
Gives a general assessment of how the organization performed in the exercise
Fills in information for evaluators
Provides an opportunity for players to comment on how well the exercise was planned and conducted, and on the effectiveness of the mock-ups and simulations

46 Timeline Development
Make a team timeline of actions
Focus on significant actions
Identify the appropriate outcome for each activity
Sample timeline:
Time | Observations | Location | Team | Outcome
0852 | Three staff members arrive at JIC – PIO, Deputy, Admin. Ass’t; begin setting up (computers, removing displays from storage, job aids at work stations, etc.) | JIC | Rushmore Co. EM | Emergency Management
0905 | First media call to JIC requesting info on event. PIO provides initial incident info and tells reporter to watch for news release shortly re: JIC activation. | JIC | Rushmore Co. PIO | Emergency Management
0906 | Forest Co. PIO and Assistant arrive at EOC; PIO immediately calls in to EOC | EOC | Forest Co. | Emergency Management
0910 | EAS message from EOC received by fax | EOC | – | Emergency Management
0912 | EAS copied and distributed at JIC to all staff work stations; additional copies on tables in media room | JIC | JIC staff | Emergency Management
The participants then identify the mission outcomes associated with particular activities and events, shown here in the far right column; these are all tied to the Emergency Management outcome. Separate timelines are produced for each exercise location. During the analysis and report writing activities, the timelines will be shared as necessary to facilitate the analysis. A consolidated timeline will also be produced, as discussed a little later. If the timelines are produced and combined electronically, there are many ways to sort the data to help in discovering just what happened and the root cause of an issue.
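The notes above mention that electronically combined timelines can be sorted in many ways. Here is a minimal sketch of that consolidation step, using entries adapted from the sample table; the field names and the merge-and-sort approach are assumptions made for illustration, not an HSEEP format:

```python
# Minimal sketch of consolidating per-location exercise timelines into one
# chronological record. Field names are illustrative assumptions; times are
# HHMM strings as in the slide's sample table.
from operator import itemgetter

jic_timeline = [
    {"time": "0852", "location": "JIC", "team": "Rushmore Co. EM",
     "observation": "Three staff members arrive at JIC and begin setup"},
    {"time": "0912", "location": "JIC", "team": "JIC staff",
     "observation": "EAS message copied and distributed to all work stations"},
]
eoc_timeline = [
    {"time": "0906", "location": "EOC", "team": "Forest Co.",
     "observation": "Forest Co. PIO and assistant arrive at EOC"},
    {"time": "0910", "location": "EOC", "team": "Forest Co.",
     "observation": "EAS message from EOC sent by fax"},
]

# The consolidated timeline is the per-location records merged and sorted by
# time, which makes cross-location sequences (and gaps) easy to spot.
consolidated = sorted(jic_timeline + eoc_timeline, key=itemgetter("time"))
for entry in consolidated:
    print(entry["time"], entry["location"], "-", entry["observation"])
```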

47 Analysis of Performance
Analysis of activities:
What tasks were to be accomplished
How well they were performed
Root causes of differences between expected and actual performance
Recommendations
Following the timeline development, the group can then analyze the events as follows:
Review the site-specific objectives and tasks to be accomplished at that location
Determine which tasks went well and which need improvement
Identify the strengths and weaknesses in carrying out those tasks
Determine why an action was not accomplished – the root cause analysis
Recommend an improvement action – including, if possible, recommendations on who will carry out the action, what the action is, and when it should be accomplished
The next two slides discuss these activities in more detail. The evaluation group from each location presents the results of its findings to the larger exercise evaluation group. The process is then repeated across locations and jurisdictions.

48 Root Cause Analysis
1. Why did it happen?
2. Why did that happen?
3. Why was that?
4. And why was that?
5. And why was that?
6. And so on…
Each step must completely explain the step above, down to the basic underlying causal factor: the root cause.
For each action, the participants should search for the “root cause” to determine the reason that an expected action did not occur or was not performed as expected. A number of different analysis tools are available for root cause analysis. One commonly used tool is the “why staircase” (or 5-why technique). This tool is used to help determine why there was a difference between what was planned and what actually occurred. It also helps an analysis team detect flaws in its reasoning. The analysis team should keep asking why something happened or did not happen until they are satisfied that they have found the cause. It is important to reach this level of understanding in order to make recommendations that enhance preparedness.
Instructor Notes: Time permitting, ask students for a real-life example of a real or exercise issue that they’re aware of. Here are a few examples: An order was given to evacuate an area after a chemical release, but an elderly shut-in was not evacuated. Why? (No one checked, because she wasn’t on a list, because the list-making process was flawed, because not all sources for identifying special-needs people were used, etc.) A supply of atropine delivered to a staging area was discovered to be beyond its shelf life. Why? (No one was assigned responsibility to check expiration dates, because it wasn’t in anyone’s procedures, because…?)
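As a minimal sketch of the why staircase treated as a data-recording exercise (the function and field names are assumptions made for illustration, not part of HSEEP), the chain from the evacuation example in the notes might be recorded like this:

```python
# Illustrative sketch of recording a why-staircase (5-why) chain. The
# function and field names are assumptions made for this example.
def why_staircase(finding, answers):
    """Pair each successive "why?" with the analyst's answer.

    Each answer should completely explain the step above it; the last
    answer is treated as the candidate root cause.
    """
    chain = []
    question = "Why did it happen?"
    for answer in answers:
        chain.append((question, answer))
        question = f"Why is it that {answer}?"
    return {"finding": finding, "chain": chain, "root_cause": answers[-1]}

# Chain adapted from the instructor notes' evacuation example.
result = why_staircase(
    "An elderly shut-in was not evacuated after the order was given",
    [
        "no one checked on her",
        "she was not on the evacuation list",
        "the list-making process was flawed",
        "not all sources for identifying special-needs people were used",
    ],
)
for question, answer in result["chain"]:
    print(question, "->", answer)
print("Candidate root cause:", result["root_cause"])
```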

49 Integrated Analysis
Compares observations from different locations and functions
Allows further identification of:
Successes and best practices
New gaps and problems
Root causes
Recommendations for improvement
We began this morning by discussing analysis within single lanes: functions/disciplines/locations. We now want to cross outside our lanes to examine performance across functions/disciplines/locations. This is called “integrated analysis.” Remember the Subaru commercial with the frustrated girl in kindergarten drawing and being told to “stay within the lines”? It flashes forward to her 4-wheeling in the mud and says “sometimes you just have to break the rules” – meaning you have to go outside the lines and interact with the rest of the environment. Integrated analysis is where we get creative and expand beyond our basic analysis of what happened at each function/discipline/location. We’re looking to see how each group interacts with others, and where disconnects may be occurring. This further helps us to identify:
Additional successes and best practices
Additional gaps or problems
Root causes (of new problems or those previously identified)
Recommendations for improvement
How is this done? Following the analysis by each group, a more limited group of evaluators, controllers, and players meets to analyze performance across the various functions and locations. Location-specific analysis focuses on assessing the performance of individual (or team) tasks and, to some extent, functions and disciplines. An integrated analysis cuts across locations and further identifies functional or discipline performance and, most important, mission-critical performance.

50 Recommendations for Improvement
Questions for identifying recommendations for improvement:
What training and/or equipment is needed?
What changes need to be made to plans and procedures, or organizational structures?
What changes could be made to the management processes?
Once a root cause is determined, the individual evaluator and the team should use the following questions as a guide for developing recommendations for improvement:
What changes need to be made to plans, policies, procedures, and relationships/agreements to support the performance of essential tasks?
What changes need to be made to organizational structures to support the performance of essential tasks?
What changes need to be made to leadership and management processes to support the performance of essential tasks?
What training is needed to support the performance of essential tasks?
What changes to or additional resources are needed to support the performance of essential tasks?
Note that these are the initial recommendations of the individual evaluators and their evaluation teams. They play an important part in developing the final set of recommendations that will be contained in the draft and final After-Action Reports. Although some recommendations will be immediately on target and adopted fully in the final report, others will bear modifying once all exercise information is in and the senior evaluators, exercise director, and players confer.
Instructor Notes: Time permitting, ask students for examples of each item.

51 The After-Action Report (AAR)
Serves as a feedback tool
Summarizes what happened
Identifies successes and recommendations for improvement
May include lessons learned to share with other jurisdictions
Helps jurisdictions focus resources on their greatest needs
So let’s begin writing our report. The AAR is the capstone of the exercise. It is the tool used to provide feedback to the participating jurisdiction(s) on their performance during the exercise. The AAR provides a summary of what happened and recommendations for improvements. It may include “lessons learned”: knowledge gained from an innovation or experience that provides valuable evidence, positive or negative, recommending how to approach a similar problem in the future. Although every recommendation that comes out of the analysis process may result in a lesson learned for the participating jurisdictions, it is those that may have applicability to other jurisdictions that should be highlighted as lessons learned in the AAR.

52 After-Action Report
Prepared in two stages:
Draft AAR – completed immediately after the exercise for review; the community adds improvement steps/corrective actions
Final AAR
The first draft is prepared by the whole evaluation team – with input from controllers, SIMCELL members, and participants – while on site. A draft report is then prepared by the evaluation team leaders for review by participants, who add how they will address the recommendations. The review is conducted in a follow-up visit to debrief the exercise. Then the final report is prepared.

53 AAR Format
Executive Summary
Part 1: Exercise Overview
Part 2: Exercise Goals and Objectives
Part 3: Exercise Events Synopsis
Part 4: Analysis of Mission Outcomes
Part 5: Analysis of Critical Task Performance
Part 6: Conclusion
Appendix A: Improvement Plan Matrix
Refer to Appendix D in your HSEEP Volume II; it has no page numbers, but it is the very last section of the manual. Notice Parts 4 and 5, Analysis of Mission Outcomes and Analysis of Critical Task Performance: this is the meat of the report. The instruction and practical activities of the last two days are intended to help you prepare this report. If you use the evaluation process as defined in the HSEEP, then drafting the report should be almost anticlimactic. It won’t quite write itself, but you should be nearly there.
Instructor Notes: Ask students to describe the features of the Executive Summary; this will give the flavor of the report. Ask about the audience, the content (exercise design, successes, improvements, follow-up), the length, and the introduction (report purpose).

54 Improvement Process
Improving preparedness activities:
Step 5: Conduct Debriefing
Step 6: Identify Improvements
Step 7: Finalize After Action Report
Step 8: Track Implementation
The effort of an exercise is wasted if the lessons from the exercise are not translated into actions that result in improvements to the capabilities tested. The draft After Action Report (AAR) will present observations and recommendations based on the data collection and analysis completed by the evaluation team. The evaluation team will assist the jurisdiction(s) that conducted the exercise in turning those recommendations into action. They will debrief the exercise to the participating agency officials and, as appropriate, to public officials, and assist them in identifying and documenting corrective actions for program improvement.

55 Exercise Debrief
Provides a forum for jurisdiction officials to:
Hear the results of the analysis
Validate the findings and recommendations in the draft AAR
Begin development of the Improvement Plan
The presentation includes a discussion of the exercise objectives; what happened during the exercise; any differences between expected performance and actual performance; the reasons for those differences and their impact on the response; lessons learned; and recommendations for improvement. The debrief should be interactive, with the jurisdiction officials validating the observations and recommendations and/or providing insights into activities that might have been missed or misinterpreted by the evaluation team. The draft AAR is then modified to incorporate any clarifying information. The debrief should also include a facilitated discussion of ways the jurisdiction can build on identified strengths and begin to address recommendations for improvement through the development of an Improvement Plan.

56 Improvement Plan
Developed by the local jurisdiction during the debrief
Identifies how recommendations will be addressed:
What actions
Who is responsible
Timeline for completion
The Improvement Plan (IP) is the means by which the lessons learned from the exercise are turned into concrete, measurable steps that result in improved response capabilities. An initial IP should be developed at the debrief, while all of the key officials are together. The initial IP should identify what actions will be taken to address each recommendation presented in the draft AAR, who or what agency will be responsible for taking the action, and the timeline for completion. If more information is needed to answer these questions, the initial IP should at least identify which agency will explore the issue further. The IP should be realistic and establish priorities for the use of limited resources. During the meeting, the facilitator should assist the officials in identifying sources of funding, or in exploring alternative solutions if funds will not be immediately available.
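Each IP entry reduces to the three fields on this slide: the action, who is responsible, and the timeline for completion. Here is a minimal sketch of tracking such entries; the class, field names, and sample data are assumptions made for illustration (ODP’s actual Exercise Management System, described two slides later, was still under development):

```python
# Illustrative sketch of an Improvement Plan entry: what action will be
# taken, who is responsible, and when it is due. Names are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class ImprovementAction:
    recommendation: str  # recommendation from the draft AAR
    action: str          # concrete, measurable step to be taken
    responsible: str     # agency or official responsible for the action
    due: date            # timeline for completion
    completed: bool = False

plan = [
    ImprovementAction(
        recommendation="Evacuation list omitted special-needs residents",
        action="Rebuild the list from all identifying sources; verify quarterly",
        responsible="County Emergency Management",
        due=date(2004, 6, 30),
    ),
]

# Tracking implementation (Step 8) is then a review of the open items.
for item in (a for a in plan if not a.completed):
    print(f"{item.due}: {item.action} ({item.responsible})")
```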

57 Finalize AAR
The Improvement Plan is included in the final AAR
The final AAR is submitted to ODP through the State Administrative Agency
Generally, the initial IP will be included in the final AAR. The final AAR should follow the same format as the draft AAR discussed previously, with the addition of the improvement steps that will be taken. The improvement steps addressing a specific recommendation will generally be listed in the AAR immediately following that recommendation.

58 Monitor Implementation
The ODP Exercise Management System (under development) will provide:
A centralized calendar of exercises across the country
Electronic submission of AAR/IPs to the SAA and ODP
Monitoring of Improvement Plan implementation
DHS/ODP is developing a secure internet-based Exercise Management System that will provide a centralized calendar of exercises across the country, provide for the electronic submission of AAR/IPs to the SAA and DHS/ODP, and monitor the implementation of IPs. The system is being designed so that all information flows through the SAA, providing them with a tool to enhance the management of their exercise program. All AAR/IPs and follow-up information will be designated “For Official Use Only.”

59 Sharing Lessons Learned
Ready-Net – a Web-based, secure information network
National repository for best practices and lessons learned
Accessible to approved users within the response community
Administered by the Memorial Institute for the Prevention of Terrorism
DHS/ODP will provide copies of the AARs to the Memorial Institute for the Prevention of Terrorism’s (MIPT) Ready-Net, a Web-based best practices and lessons learned information network for first responders and emergency planners nationwide. MIPT Ready-Net will serve as the national repository for best practices and lessons learned. Ready-Net will analyze the information and pull out the best practices, lessons learned, and trends. It will be accessible to approved users within the response community through the DHS/ODP secure portal. All AAR information will be secure and will be provided to approved users in summary form and/or with all identifying information removed.

60 Benefits of HSEEP Approach
Nationwide consistency
More useful after action reports and improvement plans
Ability of jurisdictions to focus resources on greatest needs
ENHANCED PREPAREDNESS
As ODP contractors, we are looking to you to help states and local jurisdictions implement the HSEEP evaluation process. It helps to have everyone using the same, familiar approach at different exercises; it improves state and local preparedness; and it results in realistic plans that can be implemented. Because planning and conducting an exercise requires a significant commitment of resources, it is important to maximize the benefits gained from the exercise through implementation of the DHS/ODP evaluation and improvement process.

61 Exercise Evaluation Training Course
2½ days on the exercise evaluation methodology
6 sessions to train ODP staff and contractors as change agents (225 people)
Training for SAAs, February–May 2004
ODP Exercise Design Course being revised to deliver a consistent message

62 Goal for Working Group
Review and modify the Exercise Evaluation Guides for radiological and biological attacks:
Are the right tasks identified?
Do other tasks need to be added?
Are the conditions and typical steps logical and complete?
Are the follow-up analysis questions the right questions to assess performance?

