1 MORS Irregular Warfare II Analysis Workshop
NATO UNCLASSIFIED MORS Irregular Warfare II Analysis Workshop Final Outbrief for SOCOM Chair: Renee G. Carlucci, CAA Co-Chair: Don Timian, ATEC MORS convened its second Irregular Warfare workshop in February 2009. This workshop was hosted by Special Operations Command (SOCOM) and held at the Davis Conference Center at MacDill AFB. This brief provides an overview from each of the working groups as well as each group's specific recommendations for SOCOM. 3 - 6 February 2009 Davis Conference Center MacDill AFB, Florida

2 Agenda Overview Tutorial and Plenary Session Observations
Key Issues and Discussion Items Gaps Recommendations This report follows this agenda, beginning with a short overview and the lessons learned from the tutorial and plenary speakers. It then covers observations gleaned from the working groups and the state of the assumptions, definitions, and terms that underpin the key issues and discussions that follow. Finally, it provides recommendations based on everything learned over the span of the workshop.

3 Overview Title: Irregular Warfare Analysis Workshop Organization
3-6 Feb 09, held at the SECRET/REL to AUS/CAN/GBR/USA level at MacDill AFB, Tampa, FL Program chairs: Ms. Renee Carlucci, Mr. Don Timian, and LTC Clark Heidelbaugh 5 Working Groups: Global Engagement; Stability, Security, Transition, and Reconstruction (SSTR); Information Operations (IO)/Psychological Operations (PSYOP)/Social Sciences; Counterinsurgency (COIN); and Thinking Models About IW; plus a Synthesis Group Objectives To frame the toughest IW problems that SOCOM and DoD are facing To bring special operators, analysts, and problem solvers together to help define IW analysis problems, explore techniques to deal with these problems, identify what has worked and not worked, and determine recommended ways ahead Context Changing strategy with greater emphasis on Irregular Warfare (NDS, QDR, IW JOC, etc.) Continuing emphasis on building partnership capacity (BPC Roadmap, TSC, etc.) Limited policy, processes, tools, and methods for Irregular Warfare activities Focus on partnering operators with analysts and problem solvers to gain operator perspectives Scope: Military dimension of conflict in Phases 0-5 of DoD’s campaign planning construct Tools/methods, algorithms, historical/current data sources, on-going analysis, opportunities for collaboration on future analysis and tool development The purpose of the US IW Analysis Workshop was to frame the toughest IW problems that US Special Operations Command (USSOCOM) and the Department of Defense (DoD) are facing and to refine options to address them. The conference combined special operators, analysts, and problem solvers to help define possible IW analysis problems, explore techniques to deal with these problems, share what has been attempted in the past, identify what has worked or not worked, and determine recommended ways ahead. The workshop attempted to challenge thinking and assumptions within DoD about IW. Generally, DoD is coming to the consensus that SOF cannot do the IW missions alone; DoD must leverage and enable general purpose forces to achieve our national objectives in IW operations. Furthermore, IW relies heavily on interagency and coalition operations. Importantly, it was hoped that this conference would serve to enhance collaboration between the US military, our coalition partners, and interagency analysts. The US IW Analysis Workshop builds upon previous Military Operations Research Society (MORS) sponsored workshops, but does not duplicate past MORS IW events. The previous (December 2007) MORS-sponsored IW workshop focused on broad international participation as a means to increase the transfer of information and know-how among US, allied, and coalition partner analysts. The objective of the February 2009 workshop centered on bringing together the US agencies and organizations that deal with IW, as well as our coalition partners – in a classified forum – which allowed for sharing and collaborating on the latest developments of new capabilities and approaches that analyze IW in support of decision-makers. In particular, the meeting served to focus on addressing and matching analytical gaps to SOCOM requirements.

4 Tutorial, Plenary, and Keynote Speakers
Irregular Warfare Joint Operations Concept Mr. Jeffery (Gus) Dearolph, Deputy Director Internal, SOCOM J10 Lessons from the Irregular Warfare Methods, Models, Techniques COL Jeff Appleget, TRAC Summary of Improving Cooperation Among Nations for Irregular Warfare Analysis Workshop Dr. Al Sweetser, Director, OSD-PAE SAC Operational Design LTC Reb Yancey, SOCOM SORR-J8-Studies Sponsor’s Welcome Mr. E. B. Vandiver III, FS Dir, Center for Army Analysis Keynote Speaker Mr. William J. A. Miller Dir, Strategy, Plans, and Policy, SOCOM We were very fortunate to have our workshop supported by six outstanding briefers. We were taught by Mr. Jeffery Dearolph about the Irregular Warfare Joint Operations Concept and by Colonel Jeffrey Appleget about the lessons derived from the TRADOC Analysis Center’s study on Irregular Warfare Methods, Models and Techniques. Our plenary speaker Dr. Al Sweetser spoke about the very successful first IW workshop held at the Naval Postgraduate School last year and was followed by LTC Reb Yancey, who introduced us to the Systemic Operational Design process. One of our two sponsors, Mr. E. B. Vandiver III, provided some important context for the meeting. Our keynote speaker, Mr. Miller, spoke about his perspectives on IW as the SOCOM Director for Strategy, Plans, and Policy.

5 Irregular Warfare Joint Operations Concept Mr. Dearolph
There is continuing friction with the IW definition across Services, agencies, interagency, and among allies There is a lack of grand strategy and a failure to understand population Key IW factors are: indirect, enduring, persistent, proactive, population-centric, respect for legitimate sovereignty, linked to over-arching strategy Consists of: Key missions (e.g., FID, UW, COIN, CT, Stab Ops) Key activities (e.g., Strategic communications, IO, PSYOP, Intel, Counter-intel, Support to law enforcement) IW Military Leadership JFCOM for General Purpose Forces (GPF) SOCOM for Special Operations Forces (SOF) The main workshop was preceded by a tutorial on the Irregular Warfare Joint Operating Concept (IW JOC) given by Mr. Jeffery (Gus) Dearolph, Deputy Director Internal of SOCOM’s J10 Irregular Warfare Directorate. The tutorial was extremely well attended, with over forty percent of the conference attendees sitting in. Mr. Dearolph’s excellent tutorial provided a solid overview of IW terminology and issues. There is continuing friction with the IW definition across the Services, agencies, interagency, and allies. The term “Irregular Warfare” is not well liked by many outside DoD. The largest issue stems from the lack of a “grand strategy” for IW. In addition, there is (1) a failure to understand the population-centric nature of global competition and (2) a tendency to confuse strategic theory with operational missions due to the linkages between policy, strategy, and the various levels of conflict.

6 Lessons from the Irregular Warfare Methods, Models, Techniques COL Jeff Appleget
“IW focus is on the population” “COIN is the key when insurgents exert more influence on local populations than the national government” IW MmAWG Study established a 7-element framework Identified 35 gaps, 34 related to data and social sciences Analytical Approach Now, top-down, Western perspective (DIMEFIL-PMESII) Soon, bottom-up employing social sciences expertise Track strategic level Methods, Models, and Tools (MmTs) Iterative development of “key data” is central Overall needs Create credible, relevant MmTs to address decision maker issues Make social scientists integral members of the analysis team Continue community-wide dialogue through IW Working Group Following Mr. Dearolph’s brief, COL Jeff Appleget gave a tutorial on the findings and status of methods, models, and tools germane to IW. COL Appleget is responsible to the Director of TRADOC Analysis Center (TRAC), Mr. Mike Bauman, FS, for TRAC’s IW Campaign Plan development and, together with Mr. Steve Stephens (USMC), is co-chair of the (Joint) IW Working Group that oversees the research and development of IW analytic Methods, Models, and Tools (MMT). COL Appleget reviewed the results of the IW Methods, Models, and Analysis Working Group (IW MmAWG) Study, which established a 7-element framework and identified 35 gaps, 34 of which are related to data and social sciences. Through the IW Working Group, COL Appleget and his team continue to identify ongoing activities from across DoD, the Interagency, and our coalition partners that can contribute to filling gaps and to future collaboration. Both tutorials included healthy discourse, giving all in attendance a common frame of reference and a foundation for the deliberations of the working groups.

7 Improving Cooperation Among Nations for Irregular Warfare Analysis Workshop (NPS) Dr. Al Sweetser, Director, OSD-PAE SAC There is value in having international participants from many different nations Emphasized importance of a “Whole of Government” approach Useful to conceptualize the problem as “Complex Adaptive Systems” (e.g., act, react, re-react, …) Consider a hybrid approach (e.g., wargame – model – wargame) Dr. Al Sweetser, the Director of the OSD Program Analysis and Evaluation (OSD-PAE) Simulation & Analysis Center (SAC), chaired and provided an overview of the previous (December 2007) MORS-sponsored IW workshop, which focused on broad international participation as a means to increase the transfer of information and know-how among U.S., allied, and coalition partner analysts. The December workshop at the Naval Postgraduate School provided an opportunity to explore ways to enhance collaboration on and improve the performance of IW analyses within and across agencies and governments, with workshop attendees representing twenty-one different countries. A key conference lesson was the value of a broad perspective and the need for a “whole of government” approach for IW. Participants felt strongly that wargames help facilitate collaboration among analysts with different backgrounds (e.g., differing military services or countries).

8 Systemic Operational Design (SOD) LTC Reb Yancey, SOCOM J8-Studies
IW is a “wicked problem” Akin to relearning COIN analysis approaches (Vietnam / Iraq) SOD employs a structured method of inquiry that enables a commander to: Make sense of a complex situation Capture understanding Share the resulting visualization SOD is a method of inquiry, is based on discourse, and creates a learning system Requires accepting humility and valuing heresy Means challenging the information and the boss To deal with a dynamic, complex system, one needs to explore the interactions among the key parts (“hermeneutics”) LTC Reb Yancey, a senior Operations Research analyst at USSOCOM, provided an introduction at the plenary to Operational Design. LTC Yancey was trained by BG (Ret) Shimon Naveh. During the last two years, LTC Yancey has utilized Operational Design concepts on numerous projects for SOCOM. He reminded the group that IW is a “wicked problem” requiring a broader approach than traditional computational methods typically allow. Operational Design employs a structured method of inquiry enabling a commander to make sense of a complex situation, capture understanding, and share the resulting visualization. The method, based on discourse, creates a learning system that requires accepting humility and valuing heresy. In order to deal with a dynamic, complex system, one needs to explore the interactions among the key parts.

9 Keynote Speaker Mr. William J. A. Miller, SOCOM Dir, Strategy, Plans, & Policy
“IW is about populations” In analyzing IW issues, a Lanchester view is not useful “Behave” not kill our way to victory Shape vs. exploit, synthesis not analysis, transforming is satisficing whereas solutions are optimizing, presence changes the problem Be as “un-wrong” as can be in conceptualizing a global perspective on issues Globalization challenges and threats to the US—Migration, Crime, Extremism SOCOM Challenges: Be up-stream (leverage), turn down the heat (affect), engage in dialogue with senior decision makers The Keynote Address was given by Mr. William J. A. “Joe” Miller, the Director of SOCOM’s Strategy, Plans, and Policy (J-5) Directorate. Mr. Miller provided some powerful comments and insights. He emphasized that IW is about supporting populations vice meeting kill quotas. We need to “behave” -- not kill -- our way to victory. He stressed that in analyzing IW issues a Lanchester view is not useful, and that when conceptualizing a global perspective and attempting to identify the underlying factors that result in friction, our goal should not be to ensure our analysis is correct, but to be as “un-wrong” as possible. Mr. Miller went on to describe how SOCOM is utilizing Operational Design in order to gain a better visualization of the globalization challenges to the United States. From the SOCOM perspective, these challenges are primarily migration, crime, and extremism.

10 Working Group Observations
The working groups (WGs) were highly partitioned by their titles and topic areas (tough to find overlap) WGs employed from 4 to 9 presentations in their sessions—a total of 30 different workshop presentations WGs ranged in size from 16 to 50 members—the “modeling IW WG” had the highest numbers WGs recognized that they have more challenges and tasks than they can handle in a three-day workshop WGs have heart and intellectual energy but are limited by clock time and “soak time” WGs would like to “sit in” on other working groups (series vs. parallel information meetings) From Wednesday afternoon until Friday morning the workshop was separated into five working groups plus a synthesis group. Each of the groups addressed a segment of the problem space, with Synthesis looking for common themes and addressing overarching high-level questions.

11 General Observations We are still struggling with the exact meaning and breadth of irregular warfare (bounding and characterization) We are not familiar with the agencies that understand or have jurisdiction for DIMEFIL and PMESII “Models and Tools” do not equal “computer programs and computer models” Wargaming with the right players offers a powerful technique for discovery Graphics in a storyboard approach have a prominent place in IW for displaying and understanding influences Everyone is talking about data: its definition, its meaning, its form, who is collecting it, processing it, and storing it No consensus on what information does exist, should exist, or who is or should be responsible—regardless, the complexity of the situation transcends the data VV&A is still an open topic for IW models and data Here are some general observations made across the six working groups and the three-day conference. There is still a lot of debate on the terms, definitions, and meanings associated with elements of irregular warfare. This is occurring internally within the Department of Defense, but is compounded by terms and definitions used by our interagency and coalition partners. We need a common lexicon in order to better communicate with each other.

12 WG 1: Global Engagement Findings/Recommendations
WG1: Global Engagement Charter: Provide recommendations on appropriate analytical techniques to prioritize, plan, and assess Theater Security Cooperation activities to assist the COCOMs in addressing the analytical challenges that they currently confront. Key Questions: (1) How should objectives and indicators be structured? (2) How does an assessment division with four to ten people measure and maintain the baseline? (3) How can you determine the right activities to support partner nations while making the most progress towards desired end-states? (4) Is it possible to measure cause and effect in a complex system? (5) What is missing from this process? Takeaways: Many of the effects are potentially unquantifiable (and will remain so). The challenge remains informing decision makers given this constraint. Interagency analytical resources can assist and are essential Don’t just accept objectives or rush to create them. Need to focus on shaping objectives as well as measuring progress. Reframe. IW analysis will affect traditional analytical paradigms Messy data, Cause & Effect, No easy “one-size-fits-all” toolset Effective Security Cooperation exceeds the boundaries of DoD’s authorities and capabilities Working Group 1 – Global Engagement was chaired by Mr. Andy Caldwell, OSD-PAE, and co-chaired by Col. Thomas Feldhausen, Joint Staff, J-5. Mr. Caldwell is an exchange analyst from the Ministry of Defence of the United Kingdom. As part of the preparations for the workshop, the chairs for the working group visited three combatant commands: SOUTHCOM, AFRICOM, and PACOM. Interviews with each command’s Strategy, Assessment, and Engagement divisions provided an understanding of the Theater Campaign Plan (TCP) planning cycle and specifically the analytical challenges confronted by each command in planning, assessing, and measuring progress in meeting the TCP’s objectives. Global Engagement is a large topic, and for it to be tackled in a two-day workshop it was necessary to focus on a specific aspect. As an analytical community we have applied our resources to gain a better understanding of Counterinsurgency (COIN); Stability, Security, Transition, and Reconstruction (SSTR); and IW. The majority of efforts have focused on winning the nation’s wars and being prepared for the next set of operations. It is also necessary to focus on taking actions now that will allow us to avoid having to undertake all but the most necessary enduring future operations of this nature. Global engagement requires us to consider prevention as well as cure. Many parts of the U.S. government, international organizations, and regional partners take part in this activity. It includes ensuring the provision of basic services, effective governance, the rule of law, social justice, and security. The DoD contribution is through Security Cooperation, and Theater Security Cooperation is the name for the role of the Combatant Commands in meeting this challenge. From the interviews with the COCOMs, supported by some presentations, the working group focused on trying to address five challenges related to measuring and assessing theater security cooperation activities. The result of their work is a series of detailed findings and suggestions for each challenge. Some of the major takeaways include keeping objectives and indicators living.
The working group suggests that these be re-evaluated quarterly to determine whether we are having success in evaluating individual objectives and, if not, to flag the objective-indicators for rework next time through the cycle. The data is messy. We are uncomfortable with that, so we keep trying to improve the data reliability so that it fits traditional Ops Research methods. Forget it. We can’t get there from here. So, we are going to have to forget about building large super models that explain everything. Let’s tackle this one small tactical problem at a time. That’s what we did in WWII, where Ops Research began. The first analyses involved where to place radar stations, and at the time there was no model of how radar contributes to air defense. The Ops researchers had to make their own determinations on important factors, such as how easy it was to defend the site, the warning time it provided, redundancy between stations, and atmospheric conditions. We did the same in the Battle of the Atlantic. We put analysts in the bombers with clipboards to identify whether changing one variable (e.g., the depth at which a depth charge detonates) improved kill rate. Years, even decades, later we knew enough about maritime warfare to build campaign models, but we didn’t do that during WWII. We didn’t know enough yet. Finally, another statement of the obvious, but these challenges extend beyond DoD’s traditional boundaries. That also applies to the Ops Research community. If we could repeat this conference in a year’s time with Interagency analysts in the audience, that would be another significant step forward for the ORSA community.

13 Assessment Techniques
Analysis Method | Description | Examples | Use
Quantitative data | Collection of input/output data associated with activities and generic country indicators | TSCMIS database; country indices on corruption, economic growth, security, etc. | Baseline
Polling and surveys | Public opinion or opinion of targeted groups | View of the U.S. before and after USNS Comfort port visit | Trends
Content analysis | Survey popular media for identified themes | Failed States Index (Fund for Peace) | Trends
Expert opinion | Subject matter expertise and focus groups | TSC Working Group objective identification | Forecasting
Modeling/simulation and gaming | Simplified representation of a complex system | COMPOEX (PACOM) | Forecasting
Interviews with the COCOMs identified five techniques currently in use, supporting three purposes: establishing a baseline, identifying trends, and forecasting. The collection or creation of quantitative data mainly focused on inputs and outputs, most of which were recorded in the Theater Security Cooperation Management Information System (TSCMIS). The collection of quantitative data might also include basic country facts such as GDP or crime statistics. This information is used to inform the baseline, as inputs to other decisions, or as a record of activities and outputs (but not necessarily outcomes) achieved. Polling, surveys, and content analysis were also techniques mentioned by the COCOMs. Although content analysis was mentioned, no specific examples in the COCOMs were identified. One example might be the Failed States Index, an open source measure of country instability. The methods used there are applicable to the COCOMs, and potentially greater use of these techniques (or tailoring the outputs of other content analysis providers) may be valuable. Polling, surveys, and content analysis are useful techniques for identifying trends. Finally, there was heavy use of expert opinion and some attempts (not always successful) to conduct modeling, simulation, and gaming. Both expert opinion and modeling are used for forecasting: they are being used to identify how a variable may react to engagement activities or other factors. None of these techniques is perfect (or even close to perfect if we consider the challenges with forecasting), but those in use cover the recording of basic data for assessments, monitoring performance over past years, and analysis (mainly subject matter expertise) to identify which activities will move the COCOM closer to achieving its end-states.
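Of the techniques in the table, content analysis is the easiest to prototype in-house. Below is a minimal, illustrative sketch of theme counting over media text in Python; the theme lexicon and sample excerpts are invented, and a real content-analysis effort would need a proper codebook, translation, and human validation rather than simple keyword matching.

```python
import re
from collections import Counter

# Hypothetical theme lexicon (theme -> indicative keywords). A real
# content-analysis codebook would be far richer and language-aware.
THEMES = {
    "security": ["violence", "attack", "police", "crime"],
    "governance": ["election", "corruption", "minister", "court"],
    "economy": ["jobs", "inflation", "market", "wages"],
}

def score_themes(texts):
    """Count keyword hits per theme across a collection of texts."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z']+", text.lower())
        for theme, keywords in THEMES.items():
            counts[theme] += sum(words.count(k) for k in keywords)
    return counts

# Invented excerpts standing in for a daily press sample.
sample = [
    "Police responded to an attack near the market on election day.",
    "Rising inflation leaves wages flat; minister promises new jobs.",
]
print(score_themes(sample))  # e.g., Counter({'economy': 4, ...})
```

Tracked over weeks or months, theme counts like these give exactly the kind of trend signal the table associates with content analysis.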

14 Activities (14 engagement tools)
Defense Support to Public Diplomacy; Facilities & Infrastructure support projects; Humanitarian Assistance; Information Sharing/Intelligence Cooperation; International Armaments Cooperation; Security Assistance; Cross-Cutting Programs; Combined/Multinational Education; Combined/Multinational Exercises; Combined/Multinational Experimentation; Combined/Multinational Training; Counter/Non-Proliferation; Counter Narcotics Assistance; Defense & Military Contacts. The final slide to orient you covers the 14 engagement tools or activities through which the COCOMs currently conduct TSC. These tools are the ones in use by SOUTHCOM, and this generic list is given in the GEF.

15 WG1 Findings-- Suggestions
Q1. How should objectives and indicators be written, structured and prioritized? Ideally, a comprehensive set of metrics should be identified; where that is not possible, indicators should be MOE rather than MOP. Beware, decomposition can be endless. “If you can’t measure the objective then you have no objective!” Involvement of the analyst in structuring the specific language used in objectives is essential. It must mean something analytically. Embed an analyst in the strategy division? SME qualitative indicators: are they valid and consistent between different experts? May not be able to trend the data; there will be limits here. Prefer something that is quantitative, but don’t take judgment out of the process. Consider prioritizing indicators on the basis of which are most important, not which are easiest to collect. (An embedded analyst can assist.) Don’t forget to re-evaluate what indicators you are using: the Iraq experience of looking at MOP rather than MOE to assess progress. And reframe the problem at the objective level, reprioritize when necessary; goals should be achievable. The working group considered the first question: how should objectives be written, structured and prioritized? There are some objectives which may neatly break down into a comprehensive and exclusive set of indicators. For example, WMD management breaks down into about six activities or measures, response time being an example of one of those. If you can measure all six of those indicators you can form a comprehensive assessment of a country’s capability (and therefore your progress towards intermediate objectives). However, not all problems break down easily into comprehensive and exclusive sets of indicators. For example, a health service can be measured by the number of available clinics, available doctors and nurses, and patients treated. But you might miss one key indicator, for example the availability of prescription medicines, and without that critical indicator you could report good progress when in fact the opposite was true. This is what happened in Iraq, where the CPA was using MOP, measures of performance: measures that told you something about a sub-system but which did not tell you about progress towards strategic goals. So, where you cannot identify a comprehensive and exclusive set of MOP you need to turn to MOE, measures of effectiveness. In this example, child mortality rates coupled with one or two other MOE may allow you to identify whether changes in mortality are down to the performance of the health system or some other variable, such as the outbreak of civil conflict or disease. Decomposition of indicators can be endless. Considerable design needs to go into which MOE and MOP tell you the most about your objectives while minimizing the number of data items you need to collect. Using someone with an Ops Research background can assist with structuring the language and the scope of objectives. The key point here is that if you can’t measure progress towards an objective then you may as well not have it, because how will you be able to tell whether anything you are doing is making a difference? Those in the strategy area would disagree with that, as it is still a requirement of the Command, even if it is difficult to measure. But assessment resources are limited, and prioritizing what you are going to measure is vital to making the best use of the assessments staff.
In AFRICOM, the strategy division is currently considering adding one Ops Research analyst to their staff to perform exactly this function. Even if this is not possible, the involvement of the assessment staff in structuring the objectives is important. Where this happens you are more likely to have a set of objectives that you can evaluate after conducting engagements, and that allows confidence when it comes to resource reprioritization. The working group was unable to reach any firm conclusion on the use of qualitative indicators. Some objectives will not lend themselves well to numeric expression. In fact, SOUTHCOM found this was the case and expressed the values of some of their indicators qualitatively. The danger here lies in how the SMEs are used to judge the level achieved against an indicator. Judgment is important, but there needs to be some scientific rigor in how qualitative values are assigned. If the SME changes there should not be a significant difference in interpretation of what’s good and what’s bad. There needs to be consistency between experts. You can test this by asking 10 experts the same question individually, as sketched below. If they each choose the same qualitative value (say, out of a list of 4 available) then your value bands are well defined. If there is variability you will need to tighten the language. Finally, even if you successfully populate your indicators, don’t just revisit them next time through the cycle. Take time to identify whether the indicators were useful, whether the list can be changed (or slimmed down), and whether it is the best set to evaluate your objectives. We all fall into the trap of thinking about measures when we design them initially, but then not thinking about them again until someone asks, “why do we collect that again? What do we use this data item for?” Keep thinking. We’re paid to think every time we look at a set of indicators, not just the first time we see them.
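The ten-expert consistency test is simple to mechanize. The sketch below, with invented ratings and band labels, computes the share of experts choosing the most common band; high agreement suggests the value bands are well defined, while low agreement is the cue to tighten the band language. The agreement statistic here is deliberately simple, and a more formal inter-rater measure could be substituted.

```python
from collections import Counter

# Ten hypothetical expert ratings of one indicator against a four-band
# qualitative scale; the labels and data are illustrative only.
BANDS = ["poor", "fair", "good", "excellent"]
ratings = ["good", "good", "fair", "good", "good",
           "good", "fair", "good", "good", "good"]
assert all(r in BANDS for r in ratings), "rating outside the defined scale"

def band_agreement(ratings):
    """Return the modal band and the fraction of experts who chose it."""
    modal_band, modal_count = Counter(ratings).most_common(1)[0]
    return modal_band, modal_count / len(ratings)

band, agreement = band_agreement(ratings)
# High agreement: the bands are probably well defined.
# Low agreement: tighten the band language and re-test.
print(f"modal band: {band}, agreement: {agreement:.0%}")  # good, 80%
```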

16 WG1 Findings-- Suggestions
Q2. How should a baseline be established and maintained? Identify the indicators before looking for data; this allows you to identify gaps in the data you collect. Have to put effort into thinking about inclusive measures – no laundry list End states & steady-states. Use, wherever possible, existing government or reliable data sources. Be aware of the origin of data where sources may not be reliable. Be aware of the dangers of active versus passive data collection Some indicators are useful for forecasting, others not. There will be some universal measures, such as child mortality rates, which will be good indicators across a range of objectives. Aggregate diverse data elements into a composite index. Will show trends. Need correct SMEs to interpret the data. Most data will be messy. If you measure too often you may affect the system. If you can’t present your data reliably you’ve failed. Map backgrounds, cartographic displays, and trends work well. For stop-light charts, define the criteria. In answering the question on how the baseline should be established and maintained, the working group focused on the available analytical resources. On the previous slide we suggested keeping the indicators you collect data against at a manageable level. However, you should still have a view on what indicators you would like to collect given more resources, time, and data availability. You may never collect against the full set, but at least you know where the risk lies in what you are collecting against and how full (or otherwise) the picture is that you are reporting on. The group said “no laundry list”. The laundry list has its uses; it might be a stepping stone to a shorter list of measures. But think critically about the laundry list of measures before you start collecting the data. You may find one measure that tells you the same thing three other measures are telling you. The point here is don’t just think of everything you could measure and leave it at that. Think about what you could measure, and then continue to think about variations on that set of measures that give you almost the same insight but with a reduced collection requirement. Also, consider whether your objectives tell you anything about the target you are aiming at with your plan. For example, is it necessary to train a nation’s police force to be as competent and effective as a western nation’s police forces, or is it sufficient to just stamp out corruption and inculcate an awareness of and adherence to human rights as the end-point? There is a wealth of data out there available from other government institutions, international organizations, and non-profits. Be aware of those who do not apply rigorous data collection and management techniques or those who may deliberately manipulate statistics to their advantage. Not all governments publish reliable data, for a variety of reasons ranging from inadequate measurement systems and skills to deliberate deceit. Some of the data the COCOM needs will not be familiar. How do you interpret economic data if you don’t have an economist, or health data if you have not worked in that sector? The key thing here is the data will be messy, and it will be important to bring in experts in different disciplines from time to time to help the assessment staff identify what the data means. Most of us are comfortable with military and security related data. But in other areas, bring in an expert to help. We would be disappointed if a social scientist commented on military capability without using our expertise to interpret data.
We should be conscious of the same issue when interpreting messy data sets dealing with governance, culture, and social justice. Most data collection is likely to be passive, but active data collection, such as polling, may change the system. If you keep asking whether a population agrees with violent extremist ideologies, you may inadvertently be encouraging them to view VEOs as legitimate alternatives to national governance. A scientific principle is that you cannot measure any system without changing it in some way. Even with CENTCOM’s short list of 56 indicators, there needs to be a roll-up for presentation purposes or for engaging senior decision makers. Aggregating disparate but related data into indexes ranging from 1 to 10 may be OK. The index is only meaningful for identifying whether something is getting better or worse. And “better and worse” comes with a caveat. If you are counting hospitals and doctors and both go up, the index for health care goes up. But if you’ve missed the availability of medicines in the index, then health care might actually be getting worse and you’ll have missed it. This again emphasizes the importance of knowing what you have consciously decided not to measure and the importance of focusing on MOE to describe the system rather than MOP. If you build an index, define either end of the scale. If you are measuring police officers per capita, pick a country in the region that is the ideal standard and put that at the top, a 10, and then choose the worst country and call that a 1. Score everyone else in between. Aggregating the number of police officers with the number of crimes or other indicators of civil disorder into a relative index will start to build a measure that tells you how internal security or the rule of law is changing over time. Finally, the indicators need to be presented in an effective way. Spreadsheets with hundreds of numbers won’t do, but neither will a single-slide roll-up with a stop-light chart. Techniques that may be useful include the use of maps with a cartographic overlay showing problem areas, or areas where values have changed significantly since the last assessment. Trend data is useful, not as a predictor but to provide context for what has happened over time, and although stop-light charts were frowned upon by the working group it was also acknowledged that they are still a useful tool, as long as the decision maker is aware of the values used to determine transition from red to amber to green and is not just shown a chart that will be translated as “worry about this, but don’t worry about that”. Give context to the color, so the decision maker can decide whether they need to worry or not.
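The index construction just described (anchor the ideal country at 10 and the worst at 1, score everyone else in between, then aggregate) and the advice to define explicit stop-light transition criteria both reduce to a few lines of arithmetic. A minimal sketch follows; all raw values, weights, and thresholds are invented for illustration.

```python
def scale_1_to_10(value, worst, best):
    """Linearly map a raw indicator onto a 1-10 scale. 'worst' and 'best'
    are the anchor countries chosen for the bottom and top of the scale;
    for indicators where lower is better, simply swap the anchors."""
    score = 1 + 9 * (value - worst) / (best - worst)
    return max(1.0, min(10.0, score))  # clamp countries beyond the anchors

# Invented raw data for a notional country being assessed.
police_per_100k = scale_1_to_10(value=180.0, worst=50.0, best=400.0)
crimes_per_100k = scale_1_to_10(value=900.0, worst=2500.0, best=300.0)

# Composite "internal security" index: an unweighted mean here; real
# weights would come from the assessment staff and SMEs.
index = (police_per_100k + crimes_per_100k) / 2

def stoplight(index, red_below=4.0, green_at=7.0):
    """Stop-light rating with explicitly defined transition criteria,
    so the decision maker knows exactly what each color means."""
    if index < red_below:
        return "RED"
    return "GREEN" if index >= green_at else "AMBER"

print(f"index = {index:.1f} -> {stoplight(index)}")  # index = 5.9 -> AMBER
```

The point of writing the thresholds down, as the working group urged, is that the color carries its criteria with it rather than hiding them.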

17 WG1 Findings-- Suggestions
Q3. How should developing partner nations’ security forces be evaluated and supported? Focus on sustainability (institutional change, 15 years) Trust and confidence Build the professional military education school house before going out on the rifle range Target/create/instill/develop the cadre of professionals Assessment methods for building security institutions Defense Resource Management Study Project (DRMS) difficult to implement for under-developed institutions Comprehensive baseline surveys must be conducted, e.g., by a U.S. Country Team or SOF site survey. Consider the host nation’s security forces – not just military. Can we do that with other U.S. government institutions? Authorities and treaties are issues. Other allies where required. Assessment measures must be tailored to each country’s unique security requirements, authorities, and situation Existing U.S. assessment measures may be considered for establishing a baseline or appropriate framework A negotiation on a suitable role/end-state for each partner nation’s forces Leverage the capacity of other allies to help build regional capacity You don’t necessarily need a U.S. level of performance to be successful One of the key objectives of TSC is to help nations provide for their own security. In NATO, the challenges faced by EUCOM revolved around modernizing forces, changing doctrine, instilling professional officer and NCO cadres, and building up interoperability. Although not every former Eastern European country had military forces that made NATO integration easy, they did at least have rudimentary military institutions that could be rebuilt and reshaped. The challenge in some parts of the world, especially Africa, is very different. There may be very little there to build on. How do you begin to address issues such as ethos, training, and professionalism when simple infrastructure systems, such as payroll, don’t even exist? The focus here needs to be on building sustainable military institutions: finding the “tipping point” where a military (or other security force) can sustain itself and hopefully improve itself, even in the absence of U.S. training and support. The first observation is that time and effort are expended initially in building trust and relationships. So, don’t expect to see quick improvements in capability. Measure instead willingness to share information, ask for help, and allow access. This might be the first 2 or 3 years of engaging a new partner. The next 7 might be just building up the basic institutions. Most Combatant Commanders would expect to see a measurable increase in capability over their tenure. But the reality is the benefit might not be noticeable, in capability terms, for 10 years. Rushing to the firing range to train a battalion to shoot straight will not have a lasting effect unless the institutions are there for the security forces to take ownership of their own capability maintenance. The MOE need to express this long-term view. Measuring short-term increases in capability will lead engagement teams to focus on unsustainable capability growth at the expense of creating self-sufficiency. With it will come the burden of having to repeat the same basic training for many more years beyond the notional 10 or so required to put that army or police force into a position to learn, build, and sustain itself.
Assessment methods currently in use to assess an entire MOD include the Defense Resource Management Study Project (DRMS), which deploys a team of experienced DoD professionals, typically about 4, to work with the country for two or more years. But this can only be done for a few countries at a time, and although it is not impossible to work with countries that have no MoD infrastructure, it certainly makes things harder. A “DRMS lite” need only include the assessment phase, and that information can be gathered in several ways: ideally by the country team, but even small SOF detachments can make an evaluation of key infrastructure attributes in relatively short order. Baselining the starting position for partner nations is important before engaging the nation in a range of capability development activities. In some countries the first engagement activity might be hiring a consultant from the private sector to help establish the basics, like building a payroll that prevents corruption, or basic literacy programs for soldiers, well before taking them out on the firing range. Beyond institutions we need to baseline capability. The assessment should be tailored to each country’s unique security requirements and other constraints. For example, if the military in that country is used to Russian equipment and doctrine, then building upon that is going to be more valuable than re-educating them in NATO doctrine. Of course, there is a judgment here. Russian doctrine and equipment may be adequate for national defense but may not be suited to irregular warfare. In this case it would need to be replaced. If the old doctrine was likely to do more harm than good then we need to make that point. Where it is a good start, let’s use that as our standard for measurement. Also, certainly in Africa there are ties that still go back to the colonial era. The French particularly still have regional influence and undertake TSC in some African nations. Where possible, leveraging allied contributions to developing regional security capacity should be part of the plan. Where progress is inadequate, the engagement may be through a partner rather than directly with the country where capacity is being built up. Authorities do not necessarily lend themselves well to this approach, but there are other influence mechanisms at the Combatant Commander’s disposal to encourage allies to develop capabilities that contribute to collective regional security.

18 WG1 Findings-- Suggestions
Q4. How would you begin to address analyzing cause and effect? Can’t easily get to cause and effect. Is measuring effect enough for the COCOMs to make good decisions? Without cause and effect how do we build models? Need to be realistic about the level of perfection that can be achieved; “better than a coin toss” may be an appropriate standard Make more structured use of SMEs Use techniques to add scientific rigor to SME contributions: pair-wise comparison, gaming, structured interviews, role-playing, value-focused thinking; SME selection remains important; encourage diversity of opinion - groupware Try to think through to potential second & third order effects Other techniques that may be valuable Historical analysis, electronic markets, risk-consequence management Near real time data required for insights on causal relationships. Modeling needs to be issue specific, at least initially. Need to be able to look under the hood (no black boxes; we need insights, not just answers) Need to understand the lag between action and response in the system System dynamics What is the ideal refresh rate for indicators and reframing objectives? It may be different from one indicator or objective to the next. Analyzing cause and effect is the “Holy Grail” in TSC but remains notoriously hard to achieve. We cannot model, either in computer models or conceptually, how to achieve effects if we do not have a basic understanding of how we can cause those effects to occur. But maybe measuring the effect is good enough. Techniques such as historical analysis will tell you which factors are significant in achieving an effect, but why they are the dominant factors is not part of the analysis. Historical analysis will tell you the degradation between hitting the target on the firing range and hitting the target on the battlefield for a British rifleman is 100:1, but it won’t tell you why (this is actually a good ratio, believe it or not). But if you conduct a similar analysis for a partner nation, you will be able to identify their capability in battlefield conditions by measuring performance in a controlled environment, such as the firing range. You don’t know why the degradation will be 100:1 or 1000:1, but at least you know the effect of the engagement activity on their capability. The most important observation was that “we are not going to be able to model our way out of this one anytime soon”, and this leaves SMEs as the best chance at understanding cause and effect. If the use of SMEs improves your decision making to better than a coin toss, then at least that adds value. Perfection and optimization in making every dollar count is not an achievable goal. Getting more things right than wrong, though, is achievable. The COCOMs mainly used their SMEs in workshop sessions. But there are other ways to engage SMEs where, rather than just relying on their experience and anecdotal guidance, you can identify the consistency in their advice. Structured techniques exist for capturing SME input. Structured interviews, gaming, and pair-wise comparisons will get more out of the SME. But a word of caution: you also need to establish the credentials of the SME. It has to be the right SME. For Interagency, allied, and partner nation contributions we don’t often look beyond their affiliation to the parent organization or country. Danielle Miller, from OSD/PA&E Joint Data Support, will publish a paper on this topic in the near future. Selecting the right SME is as important as how the SME is used.
Finally, some effects won’t materialize for years whereas others may be instantaneous. So some indicators we will need to re-evaluate monthly, whereas others we may not need to consider even yearly. It is important to understand the refresh rate for indicators; it should be part of the indicator description. Economic, cultural, and demographic indicators are generational. If you reassess them too often, you may decide to begin or discontinue activities before the effect of that activity works its way into your measurements. System dynamics can help here. Don’t micro-manage your activities and engagements. In some cases, changing the activities and engagements even annually may in fact be micro-managing the system to the detriment of achieving the COCOM’s long-term objectives.
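Of the structured elicitation techniques listed above, pair-wise comparison is the most mechanical. The sketch below derives priority weights from a single pairwise-comparison matrix using the geometric-mean approximation associated with the Analytic Hierarchy Process; the factors and judgments are invented, and nothing here implies the COCOMs used this particular method.

```python
import math

# Hypothetical pairwise-comparison matrix from one SME session. Entry
# M[i][j] > 1 means factor i is judged more important than factor j
# (an AHP-style 1-9 scale); factors and judgments are invented.
factors = ["governance", "security", "economy"]
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def priority_weights(matrix):
    """Geometric-mean method: the geometric mean of each row,
    normalized, approximates the principal-eigenvector weights."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

for name, weight in zip(factors, priority_weights(M)):
    print(f"{name}: {weight:.2f}")  # governance ~0.65, security ~0.23, ...
```

Collecting a matrix like this from each expert, rather than one open-ended opinion, is what makes it possible to check the consistency of SME advice across sessions and across experts.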

19 WG1 Findings-- Suggestions
Q5. What is missing from the process? Consider the link between the indicators required for the baseline and those measuring the effects of activities. Is there a common set? Activity identification is immediately resource constrained Need to identify the unconstrained requirement to estimate risk Where in the process do we do the risk evaluation? Policy incentives to encourage regional development NATO was a strong incentive for development Stronger links between COCOMs and OSD/PA&E and Policy Understanding the resource constraint earlier in the process will assist with assessing IPL requests and creating new authorities, policies, and funding vehicles. Design new engagement tools to meet regional security challenges Potential misalignment of assessment resources to assessment requirements – continue to prioritize objectives and indicators. The final question was “what is missing from this process?” Well, in our theoretical Theater Campaign Planning cycle we take measurements in two places: to establish the baseline and to measure the value (or otherwise) of our engagement activities. Normally the assessment division will do both, so chances are there is a link and some commonality in the measurements taken. But, for the record, consciously returning to the indicators and baseline as the starting point for assessing activities is a good place to start. No one activity will move a baseline indicator. But if, for example, a link can be established between the measurement of activities and an indicator, it may at least give you a cue as to how you want to express your data. If your indicator requires you to count the number of clinics, there is little value in expressing the output of an activity as the number of square feet of new hospitals constructed. Make them compatible (and don’t be afraid to change the indicator rather than the measures used for the activity if that makes more sense). Another observation the working group offered was that once the objectives are identified, the engagement team immediately begins constraining what activities they plan to undertake to fit available resources, authorities, and partner nation permissions. A common process in DoD is to first of all express the unconstrained demand. So, if the objective is to establish a fully professional military with COIN capabilities within 5 years, what would you need to do, given a free hand and unconstrained resources, to achieve that? The value of this is that it allows you to identify the risk (or shortfall against your target) once resource constraints are applied. The unconstrained demand is evidence for more resources and new authorities. In addition, it allows you to prioritize. If your resource-constrained engagements are only 1% of what is required to achieve the objective, then you may conclude that doing nothing and allocating those resources to another task, where that effort will make a bigger difference against another objective, is more valuable. In saying this, the group did not suggest stopping all engagements where there was little chance of progress. All engagements allow access, trust, confidence building, and relationship development, all of which would be essential if regional priorities changed to focus on that partner as the lynch-pin to regional security (consider Pakistan’s importance to U.S. interests the day before and then the day after 9/11).
But if you can maintain trust and confidence by expending fewer resources and consciously decide that is the holding pattern for that country, then that will allow you to allocate your resources more economically. This is not an analytical observation as such but a policy one. NATO enlargement presented a significant carrot to partner nations to improve their security. NATO membership guaranteed security. Guaranteed security attracts external investment, which accelerates progress towards EU membership and the benefits of joining. We are not suggesting that we form regional alliances with the obligations of NATO in other parts of the world, but we are suggesting identifying the carrots (not necessarily DoD ones) that can be linked to progress in improving security. If they already exist, do we currently communicate that as a benefit of engagement, or do we leave the partner nation to draw their own conclusions? Should this be part of the objective prioritization process? Countries with the opportunity to gain benefits for improving security capacity are more likely to make rapid progress towards improvements. Although program review is the formal process for resourcing COCOM requests for additional resources, there are advantages, within PA&E, in understanding the specific issues and shortfalls before program review begins. PA&E’s irregular warfare division already has links to each of the COCOMs, but not necessarily to the engagement and assessment divisions. Visibility of these divisions’ challenges earlier in the year will allow more thinking time to understand the relative importance of a request and therefore more time to evaluate it. Similarly, there are parts of OSD/Policy that would benefit from regular liaison. One of AFRICOM’s challenges is that their authorities are outdated for the modern and dynamic security environment found in their AOR. Although amending or creating new authorities is never going to be a fast process, understanding the requirement 6 months before frustration becomes a problem can only be beneficial to the process of seeking change. There are 14 engagement tools. Is there a 15th out there that we currently don’t consider but could, if the issue were raised? If we took the constraint off the engagement tools we currently use, could we think of another way of developing partner nations? That won’t give us permission to do it, but that’s another example of a policy issue. Push it to OSD/Policy as an option and start the process of debating whether additional engagement tools can be legally added to the toolbox. Finally, the assessment division is always going to be stretched thin. The current process identifies a reprioritization of indicators and objectives every year. Maybe a second look at priorities should be undertaken once the baseline is completed (or partially completed). If you can’t measure progress towards one high priority, maybe that should trigger a reassessment as to whether it is worth concentrating in the short term on a smaller set of equally high priority objectives that can be measured. The feedback loop between resource priority and baseline assessment is unclear. Resource priorities are established in line with strategic priorities, but there is also a pragmatic element to this. Demonstrating success against one objective is more likely to result in making the case for increasing or maintaining funding from the Services in subsequent years.
Evidence of effect is important, particularly if budgets are frozen or reduced in response to the global economic crisis.
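The unconstrained-demand idea from Q5 can be expressed as simple bookkeeping: state what each objective would require given a free hand, compare it with the resourced plan, and flag objectives whose coverage is so low (the 1% case mentioned above) that a holding pattern or reallocation deserves consideration. All figures and the 5% cue in this sketch are invented assumptions.

```python
# Each objective: (unconstrained requirement, resourced effort), in some
# common unit of effort or cost. All figures are invented.
objectives = {
    "professional military in 5 yrs": (1000.0, 10.0),
    "counter-narcotics capacity": (200.0, 120.0),
    "maritime security": (150.0, 90.0),
}

REALLOCATION_CUE = 0.05  # below 5% coverage, question the investment

for name, (required, funded) in objectives.items():
    coverage = funded / required
    shortfall = required - funded  # the evidence for more resources
    flag = (" <- consider holding pattern / reallocation"
            if coverage < REALLOCATION_CUE else "")
    print(f"{name}: coverage {coverage:.0%}, shortfall {shortfall:.0f}{flag}")
```

The shortfall column is exactly the evidence the working group wanted for program review: a documented gap between the unconstrained requirement and what is actually resourced.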

20 WG 2: Stability Operations Findings/Recommendations
WG2: Stability Operations Charter: Identify SO challenges and problem areas to be solved and identify analytical methods that might help solve those areas Critical Insights: No single method, model, or simulation will provide a complete answer, but many can provide results to help inform decisions in one or more areas. Several of the tools can be used immediately, and many under development have promise. Identification of metrics is absolutely critical. Identification and collection of relevant data is difficult but must be done. Key Takeaways Though Stability Operations is only a part of Irregular Warfare, it still presents a large problem space Challenge areas presented by different agencies had some common threads: determination of demand/requirements, prioritization of efforts/risk management, determination/use of metrics, attaining a “whole of government” approach Many challenge areas are not adequately addressed by current analytical methods, models, and techniques Many promising methods, models, and techniques are in development Working Group 2 – Stability, Security, Transition, and Reconstruction Operations was chaired by COL Dean Mengel, Center for Army Analysis, and co-chaired by Mr. William Krondak of the Training and Doctrine Command’s Analysis Center (TRAC). This working group focused on exploring the challenges and possible solutions for Stability Operations (SO). Some of the challenge areas were presented by USSOCOM representatives. They highlighted the need for the Combatant Commands (COCOMs) to identify the Security Force Assistance (SFA) requirements and demands and, under the new guidance, for USSOCOM to prioritize the requirements across COCOMs. They noted that there did not appear to be a coordinated plan for development of SFA capabilities across the Services. Finally, they indicated an issue regarding the identification and tracking of personnel with appropriate SFA-related skills, training, and experience. Briefers presented several issue areas focused on understanding the actual needs of the host nation or region with regard to the recognized stability sectors. They noted that identifying the international and regional partners and their capabilities was critical to success. As the working group deliberated, the need for an overarching strategy or “vision” regarding the application of “whole of government” resources became apparent. A number of methods, models, and tools were presented and discussed by the working group. However, they only address a small part of the problem. None of them is capable of solving all aspects of SO, and perhaps it is desirable to employ multiple approaches and combine the results. Compounding the difficulties experienced by those trying to tackle these issues is the lack of overarching strategies and goals that would lead to an organization being created to lead, oversee, and integrate the activities of the various institutions required. The working group, like the entire workshop, recognized that these types of inclusive efforts need to continue and, where possible, be expanded to include as many participants from this community of interest as possible.

21 Working Group # 2 Challenge Areas
Challenge Area | Description | Severity/Impact | Difficulty with Solutions
1. Foreign Security Forces <USSOCOM> | Need a coherent plan for building and training Security Force Assistance (SFA) forces. | Supply of forces may not be available to meet demand from COCOMs. | Coordination with interagency elements is inadequate.
2. Identification of SFA Requirements (missions, etc.) | Need to understand and identify the demands driving SFA requirements. | Related to the challenge area above; need to identify demand to plan resources and schedule training. | Need clarification by identifying total US govt demand and then identifying the DOD piece.
3. Prioritization of SFA Requirements | Need tools/methods to prioritize SFA activities. | Each COCOM has high priority requirements but not enough resources to fill needs. | Who has authority to prioritize between COCOM requirements?
4. Personnel Tracking | Determine and track training, skill sets, and experience relevant to SO. | Some missions may require special skills or experience. Who has them? | Need more than skill identification; consider implications for career path.
This chart provides a short summary of the challenge areas presented by the USSOCOM representatives, LtCol Caputo and MAJ Mills. They highlighted the need for the Combatant Commands (COCOMs) to identify the Security Force Assistance (SFA) requirements and demands and, under the new guidance, for USSOCOM to prioritize the requirements across COCOMs. They noted that there did not appear to be a coordinated plan for development of SFA capabilities across the Services. Finally, they indicated an issue regarding the identification and tracking of personnel with appropriate SFA-related skills, training, and experience. In addition to showing the issue area, the agency that presented the issue, and a short description, the charts also show some of the working group’s thoughts regarding the severity or impact of the issue and some of the difficulties that may be encountered in trying to use the various methods, models, and simulations to address it.

22 Working Group # 2 Challenge Areas
5a. Information Preparation of the Operational Environment <IDA>
Description: Determine what needs to be done within each sector; determine causes and fixes; reconstruction requirements for self-governance; how to recognize when "self-governing" has been achieved.
Severity/Impact: Without this information, resources may be misapplied or inappropriate actions may be taken.
Difficulty with solutions: Who has responsibility for this? Different agencies have different perspectives. Needs must be relevant to the host nation.

5b. Information Preparation of the Operational Environment <IDA>
Description: Determine potential partners and what they can do: the affected government/society; international partners (donor nations, humanitarian and financial organizations, non-governmental organizations); US government agencies.
Severity/Impact: Without this information, inefficient or ineffective efforts may result.
Difficulty with solutions: Who has responsibility for leading or coordinating this effort?

Martin Lidy of the Institute for Defense Analyses (IDA) presented several issue areas regarding the need to understand the actual needs of the host nation or region with regard to the stability sectors. He noted that identifying the international and regional partners and their capabilities was critical to success.

23 Working Group # 2 Challenge Areas
5c. Information Preparation of the Operational Environment <IDA>
Description: Determine how to achieve unity of effort: collaborative and cooperative architectures; public diplomacy and strategic communications.
Severity/Impact: Without unity of effort, inefficient or ineffective efforts may be initiated that fail to meet needs and waste resources; this sends a negative message to the host nation.
Difficulty with solutions: Information sharing is hindered by a lack of common terminology and by political issues.

5d. Information Preparation of the Operational Environment <IDA>
Description: Determine how to measure progress toward achieving objectives: quantitative metrics and qualitative metrics.
Severity/Impact: Without proper metrics to measure progress, there is no way to determine whether certain projects or interventions are working or remain appropriate.
Difficulty with solutions: Need prior identification of goals/objectives. When is "good enough" achieved?

Mr. Lidy continued by citing the issues of how to achieve unity of effort and of identifying the metrics needed to measure progress toward the objectives, along with the working group's thoughts on the severity of each issue and the difficulties of addressing it.

24 Working Group # 2 Challenge Areas
6. What capability and capacity does DoD need for sectors other than security? <OSD SOLIC>
Description: DoD is both a supported and a supporting agency for SO, and therefore must know what is needed. What factors should be considered when prioritizing support? Need metrics to evaluate performance.
Severity/Impact: Supply of forces of the appropriate type may not be available to meet demand from the COCOMs.
Difficulty with solutions: Requires decisions and guidance from outside DoD. "Restore" is relatively clear, but "support" is more open-ended.

7. Security Force Assistance <OSD SOLIC>
Description: Need to identify overall SFA demand; need a process to identify and prioritize the SFA needs of partners; need metrics to evaluate performance.
Severity/Impact: Meeting overall demand has implications for the SOF/GPF and AC/RC mix.

Mr. Shawn Steene of OSD presented two major issues and several related subordinate questions. He reiterated concerns identified by previous speakers regarding the identification of needs and SFA demands, as well as the identification of appropriate metrics. As the working group deliberated, the need for an overarching strategy or "vision" regarding the application of "whole of government" resources became apparent.

25 Working Group # 2 Challenge Areas
8. Lethal and non-lethal capabilities <PKSOI>
Description: Must use a mix of methods to set conditions that support the other instruments of power.
Severity/Impact: Must establish security for progress without totally alienating the relevant populations.
Difficulty with solutions: Non-lethal capabilities are more than rubber bullets and tear gas.

9. How should the military support reconstruction and stabilization policy and strategy? <PKSOI>
Description: This requires actions across Doctrine, Organization, Training, Materiel, Leadership, Personnel, and Facilities.
Severity/Impact: This could be a significant resource issue; leaders must be able to make informed decisions.
Difficulty with solutions: Need to clearly understand other agencies' capabilities and intent with respect to this area.

10. How do information and information operations support and nest within stability operations? <PKSOI>
Description: Information operations and strategic communications must be informed by data and must send consistent messages.
Severity/Impact: Inconsistent or late information operations and strategic communications make the US look bad and can be exploited by rivals and opposition media.
Difficulty with solutions: Can the problem be solved analytically?

Mr. Mike Esper of the U.S. Army Peacekeeping and Stability Operations Institute (PKSOI) presented several open-ended issues regarding the application of military capabilities in the stability operations environment. The issue of information and information operations generated discussion that reiterated the need for unity of effort and an overarching approach or "vision."

26 Working Group # 2 Challenge Areas
11. Do the joint and Service task lists sufficiently address the range of activities required to conduct joint stability operations? <PKSOI>
Description: Must ensure unit missions and Mission Essential Task Lists are updated and that doctrine and training are appropriate.
Severity/Impact: Supply of forces of the appropriate type and capability may not be available to meet demand from the COCOMs.
Difficulty with solutions: Army and joint task lists were recently reviewed as part of the Army's SO Capabilities-Based Assessment; the other Services' task lists must still be reviewed.

12. Is the military's current approach sufficient for operations where the focus is on "relevant populations" and not an enemy force? <PKSOI>
Description: This requires actions across Doctrine, Organization, Training, Materiel, Leadership, Personnel, and Facilities.
Severity/Impact: This could be a significant resource issue; leaders must be able to make informed decisions.
Difficulty with solutions: Must be able to identify the status of the military's ongoing assessment efforts. Can the problem be solved analytically? How do you measure a "sufficient" approach?

Mr. Esper's question regarding joint and Service task lists had been partially addressed by the HQDA-sponsored Stability Operations Capabilities-Based Assessment conducted by the TRADOC Analysis Center (TRAC) and the Center for Army Analysis (CAA) beginning in 2006. The working group noted that the issues raised by Mr. Esper required the Services to examine their capabilities across the range of doctrine, organization, training, materiel, leadership, personnel, and facilities (DOTMLPF).

27 Working Group # 2 Challenge Areas
13. How do we best determine appropriate MOEs and MOPs for full-spectrum operations? <PKSOI>
Description: Must identify metrics that cover the range of sectors, span the strategic through tactical levels, and cover immediate response, transition, and sustaining efforts.
Severity/Impact: Without proper metrics to measure progress, there is no way to determine whether certain projects or interventions are working or remain appropriate.
Difficulty with solutions: Can the metrics be modeled and/or simulated? If some metrics are "qualitative," how do you evaluate them?

14. How does the military support emerging security initiatives and DoD policy on Security Sector Reform? <PKSOI>
Description: Must identify how support requirements change with changing policies in this area.
Severity/Impact: This could be a significant resource issue; leaders must be able to make informed decisions.

As with previous speakers, Mr. Esper noted the need for Measures of Effectiveness (MOE) and Measures of Performance (MOP) that are relevant and appropriate for the stability operations area. (A minimal sketch of one way to handle qualitative metrics follows.)
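One recurring answer to the "how do you evaluate qualitative metrics?" question is to place SME judgments on an ordinal scale so they can at least be tracked and compared over time. The sketch below is a minimal illustration of that idea; the scale labels, sector names, and ratings are assumptions, not an established assessment standard.

```python
# Minimal sketch: folding qualitative SME assessments into an ordinal
# scale so they can be tracked alongside quantitative MOEs. The scale,
# sectors, and ratings are illustrative assumptions only.
from statistics import median

ORDINAL = {"failing": 1, "poor": 2, "adequate": 3, "good": 4,
           "self-sustaining": 5}

sme_ratings = {  # one rating per SME, per stability sector
    "governance": ["poor", "adequate", "poor"],
    "rule_of_law": ["adequate", "adequate", "good"],
}

for sector, ratings in sme_ratings.items():
    values = [ORDINAL[r] for r in ratings]
    spread = max(values) - min(values)  # crude disagreement flag
    print(f"{sector}: median={median(values)} spread={spread}")
```

Reporting the spread alongside the median keeps the SME disagreement visible instead of hiding it inside a single averaged score.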

28 Working Group # 2 Challenge Areas
15. How does the US military's approach nest within the emerging body of interagency, intergovernmental, and allied approaches to reconstruction and stabilization? <PKSOI>
Description: Must identify how force requirements change with changing approaches in this area.
Severity/Impact: This could be a significant resource issue; leaders must be able to make informed decisions.
Difficulty with solutions: Can the problem be solved analytically?

Finally, Mr. Esper's issue regarding how the military's approach nests within the overall government and allied approaches to reconstruction and stabilization initiated working group discussion of the ability to apply structured approaches to this issue.

29 Working Group # 2 Methods, Models, Simulations
1. Integrated Gaming System <TRAC FLVN>
Description: Flexible definition of infrastructure and factions; stochastic.
Areas addressed: Faction satisfaction; infrastructure status; military impacts; operational insights.
Current status: In use.

2. PSOM <UK DSTL / JS J8>
Description: Strategic- and operational-level training assessment; social behavioral response; stochastic/deterministic.
Areas addressed: High-level pol/mil; gain insights on the operational impacts of high-level decisions and resource allocations.

3. Nexus Network Learner <OSD PA&E>
Description: Societal assessment; Bayesian; stochastic; agent-based; modular; adaptive to data and other models.
Areas addressed: Assess the DIME impact of different COAs on social changes; examine modification of behavior.
Current status: In use, with continuing development.

The next session involved presentations by various agencies regarding the methods, models, and simulations that could potentially be applied to the issues. Kerry Lenninger of TRAC provided an overview of the Methods, Models, and Simulations for Stability Operations Analysis effort, which assessed the applicability of 26 tools in this area. TRAC obtained and applied the Integrated Gaming System (IGS) and the Peace Support Operations Model (PSOM); she presented the insights gained and the requirements for using these tools. Dr. Deborah Duong presented information on her agent-based approach, NEXUS, which assesses behavioral changes of populations in response to various stimuli and is designed to be composable with various other models and approaches. (A toy sketch in that spirit follows.)
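To make the agent-based idea concrete, the sketch below shows a toy population whose attitudes drift in response to an external stimulus and to peer opinion. It is far simpler than NEXUS and is not its design: every agent rule, parameter, and value here is invented purely for illustration.

```python
# Toy agent-based sketch of population attitude shift in response to a
# stimulus, in the spirit of (but far simpler than) tools like NEXUS.
import random

random.seed(1)

class Agent:
    def __init__(self):
        self.support = random.uniform(-1, 1)  # -1 hostile .. +1 supportive

    def react(self, stimulus, neighbors):
        peer = sum(n.support for n in neighbors) / len(neighbors)
        # attitude drifts toward a blend of the stimulus and peer opinion
        self.support += 0.1 * (0.5 * stimulus + 0.5 * peer - self.support)

population = [Agent() for _ in range(100)]
for step in range(50):
    stimulus = 0.3  # e.g., a sustained positive information campaign
    for a in population:
        neighbors = random.sample(population, 5)
        a.react(stimulus, neighbors)

# average support after 50 steps
print(sum(a.support for a in population) / len(population))
```

Even a toy like this illustrates why the working groups stressed learning over prediction: the output depends heavily on assumed influence rules that are hard to validate against real populations.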

30 Working Group # 2 Methods, Models, Simulations
4. Primary Force Estimator (PRIME – ATLAS model) <CAA>
Description: Task-based, using approved rules of allocation; includes geospatial considerations; quick-turn results; deterministic; Army forces only.
Current status: Under development.

5. Wargaming <CAA>
Description: Human-in-the-loop board-game method; focused on security; regional and overall theater focus.
Areas addressed: Assess the force levels required to respond to different levels of violence; integration with the other DIME aspects of a campaign.
Current status: Established and in use; expansion of capabilities underway.

Trudy Ferguson of CAA presented the developmental tool PRIME-ATLAS, which is designed to determine what Army forces are needed to fulfill certain missions and tasks (a notional sketch of the task-to-force idea follows). LTC Dave Sanders of CAA provided an overview of the wargame that CAA has been using to support commanders' decisions in Operation Enduring Freedom (OEF) and Operation Iraqi Freedom (OIF).
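The core mechanic described for PRIME-ATLAS — tasks mapped to unit requirements through rules of allocation — can be sketched in a few lines. The rules, unit types, and demand figures below are entirely notional stand-ins, not CAA's actual allocation rules.

```python
# Hedged sketch of a task-based force estimator: each task carries a
# rule of allocation mapping demand to units. All rules and figures
# are notional, not the approved PRIME-ATLAS rule set.
import math

ALLOCATION_RULES = {
    # task -> (unit type, units required per unit of demand)
    "secure_population": ("infantry_bn", 1 / 50_000),  # per 50k people
    "train_police": ("mp_company", 1 / 500),           # per 500 trainees
    "route_clearance": ("engineer_company", 1 / 100),  # per 100 km
}

demands = {"secure_population": 300_000, "train_police": 2_000,
           "route_clearance": 450}

force_list = {}
for task, amount in demands.items():
    unit, rate = ALLOCATION_RULES[task]
    force_list[unit] = force_list.get(unit, 0) + math.ceil(amount * rate)

print(force_list)
# {'infantry_bn': 6, 'mp_company': 4, 'engineer_company': 5}
```

The deterministic, quick-turn character noted on the slide follows directly from this structure: change a demand and the force list recomputes immediately.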

31 Working Group # 2 Methods, Models, Simulations
6. Task Event Outcome (TEO) IW Analysis <TRAC>
Description: Tactical focus; human-in-the-loop wargame; quick turnaround; ties into Lines of Effort (LOE).
Areas addressed: Analyze changes in organization, equipment, and TTP; identify required changes in MOE (different metrics); capture, analyze, and disseminate experience and information gained in the field; link actions through LOEs to higher PMESII states.
Current status: Requirements under development.

7. Workshops <CAA>
Description: Provides an established structure for examining non-quantitative issues; SME-based; senior-level reviews.
Areas addressed: Investigate a wide variety of issues related to stability operations.
Current status: Available and in use.

LTC Russ Schott and Mr. Paul Works of TRAC discussed the developmental tool designed to use and evaluate a "task-event-outcome" approach to SO and IW; the tool focuses on tactical small-unit and soldier issues. Mr. Greg Andreozzi of CAA presented the workshop approach that CAA has used successfully for many years to address complex issues, and he specifically discussed its application in the recent HQDA SO CBA.

32 Working Group # 2 Methods, Models, Simulations
8. Contingency Operations Tiger Team (COTT) <USACE / TRAC>
Description: Provides recommendations relating to USACE and ERDC R&D, analysis, and studies for reconstruction, stability, contingency, aid, and relief efforts.
Areas addressed: Understand challenges and build collaborative solutions to complex problems in reconstruction and stability efforts; develop, identify, and validate potential R&D solutions to strategic- and mission-level stability and reconstruction challenges; link capabilities from different sources or programs.
Current status: COTT formed; collaborations being established.

9. Agent-based model for cultural geography in SO <COTT/USACE/TRAC>
Description: Stochastic; agent-based; stand-alone tool; tactical focus.
Areas addressed: Evaluate the impact of SO infrastructure projects on social perceptions.
Current status: Under development.

Tim Perkins of TRAC and Ms. Elizabeth Lyon of the U.S. Army Corps of Engineers discussed the Contingency Operations Tiger Team approach, which includes several initiatives for data gathering and assessment. The tool set includes efforts such as Measuring Progress in Conflict Environments (MPICE) and a variety of geospatial visualization tools. Mr. Leroy "Jack" Jackson of TRAC-Monterey presented the development effort focused on representing urban cultural geography in stability operations, an agent-based modeling approach.

33 Working Group # 2 – Session 2 Assessment
The working group next turned to assessing which of the methods, models, and simulations (MMS) presented had the capability to wholly or partially address the issues identified by the earlier speakers. The chart above and those that follow show the issues and the group's assessment of MMS applicability, with (A) marking tools available now. The group noted that some of the MMS were available immediately, while others were still under development and perhaps not "ready for prime time." A consistent view was that no single tool could fully address an issue, but each tool selected could provide results that would partially inform decisions. As the group deliberated, it became apparent that tools applying human-in-the-loop (HITL) techniques were much better suited to the multi-faceted issues relating to populations and political-social problems. The CAA wargame and the CAA workshop approach appeared to be the tools best suited to this HITL approach. The IGS and PSOM also employ subject matter expert (SME) "white cells" or high-level pol/mil games to identify the operational approaches that are then adjudicated in the models.

34 WG2 Findings & Suggestions
Findings:
− Even though everyone agrees that Stability Operations requires whole-of-government, non-government, coalition, and host-nation/public participation, most of our methods, models, and techniques do not account for all of them.
− It appears that many of the challenge areas are indirect results of an absence of overarching strategies and goals.
− It is hard to understand how some tools, methods, and models work without common terms of reference; the same is true for data.

♦ Suggestions:
− Develop common terms of reference for understanding how tools, methods, and models work and for describing data.
− Ensure future collaboration efforts continue and expand to include the entire SO community of interest.

Stability Operations and Irregular Warfare present immense challenges, not only for DoD but for our nation, other nations, and affected populations everywhere. As such, the solutions to these challenges will be all-encompassing and many-faceted. However, most of the methods, models, and tools discovered and discussed during this workshop address only a small part of the problem. None is capable of solving all aspects of SO, and perhaps it is impossible for a single tool to accomplish this. No single approach will work, and consequently no single analysis tool will be able to support the efforts. Compounding the difficulties experienced by those trying to tackle these issues is the lack of overarching strategies and goals, which would otherwise lead to an organization created to lead, oversee, and integrate the activities of the various institutions required. The working group, like the entire workshop, recognizes that these inclusive efforts need to continue and, where possible, expand to include as many participants from this community of interest as possible. During the working group's discussions, it was obvious that we have a problem, within the community of problem solvers, in conveying to each other the capabilities, logic, and products of the various methods, models, and tools that are available and being developed. Given the diverse backgrounds and experiences of those drawn to these issues, this is no surprise. The multi-disciplinary approach brings with it a variety of ideas, practices, and words for communicating them. The same is true for data: its sources, labels, structures, and uses differ across the disciplines. This matters because a multi-faceted, multi-disciplinary approach demands a common basis of understanding if the various methods, models, and tools — along with the data they use and produce — are to be understood, used, and integrated. The creation of a "terms of reference" for the community would be very helpful; such a reference would put all parties working these issues on common ground and facilitate better communication.

35 WG 3: IO/PSYOP/Social Sciences Findings/Recommendations
WG3 Charter: Improve the foundations of information operations/PSYOP analysis; identify existing analytic capabilities and shortfalls; explore the application of quantitative and qualitative methods for improving analytical capabilities; evaluate and recommend concrete applications.

Key Takeaways:
− A coherent taxonomy and lexicon of IO is required: analysts and operators must use the same set of definitions.
− Models, methods, and tools must provide mechanisms for learning and for understanding the problem, not prediction.
− Coordinate PSYOP across related combined, joint, and interagency arenas.
− Develop robust case studies that capture a full problem set to greatly benefit exercises, education, and training.
− Non-kinetic assessment (MOP, MOE) must be in the initial plan.
− Key gaps in PSYOP capabilities must be resolved by other means (traditional social science and ORSA approaches may assist): red teaming, evolutionary development of M&S, enhanced wargaming (Phase 0), and human terrain and media analysis.

This working group, due to the limited time and space available, focused on PSYOP to the exclusion of the other four pillars of Information Operations (IO): Operations Security (OPSEC), deception, Electronic Warfare (EW), and Computer Network Operations (CNO). It also focused on IO to the exclusion of examining the impacts of social science on IW in general. With over 40 members, the group had representation from the entire IO/PSYOP community, with a good mix of social scientists and operations research analysts (approximately 25 percent social scientists). Subject matter experts identified the requirement for leadership to establish a strategic vision or concept for PSYOP, then operational objectives; effectiveness can then follow. We need to determine what our message should be and who the intended audience is. Analysts can assist planners in course-of-action development with tools and methods to measure effects on audiences, task accomplishment, and kinetic versus non-kinetic effects. They can also assist with the development of success assessment criteria, the prioritization of assets, and planning for failure and unintended consequences. Limitations of the existing tools include limited functionality, lack of Verification, Validation, and Accreditation (VV&A), a paucity of pedigreed data, cost (e.g., polling), limited linkage to social science theories, and difficulty of employment. The working group concluded that a coherent taxonomy and lexicon of IO is required, with analysts and operators using the same set of definitions. The models, methods, and tools must provide mechanisms for learning and understanding of the problem, not prediction. Psychological operations must be coordinated across related combined, joint, and interagency arenas. Robust case studies should be developed which capture a full problem set to greatly benefit exercises, education, and training. A non-kinetic assessment with Measures of Performance (MOP) and Measures of Effectiveness (MOE) must be in the initial plan. Key gaps in PSYOP capabilities must be resolved by other means, including red teaming, evolutionary development of M&S, enhanced wargaming (Phase 0), and human terrain and media analysis.

36 Analytic Design Issues
1. How do we appropriately choose models, methods, and tools for OD in PSYOP?
− Use generic tools that can be fine-tuned to the situation through social discourse (like the MPICE project – Measuring Progress in Conflict Environments – which provides a list of MOEs from which an organization's SMEs can choose to tailor to the specific situation).
− Develop different solutions that you can test.
− Know the TYPE of your problem.
− Test and compare using the same data sets (a minimal comparison sketch appears below).
− Get a conformal, standardized data set.
− Use these models in different ways (e.g., examining both links and nodes in network models can provide insights). Some examples: SNA tools, agent-based models, Bayesian models, influence models, Systemic Operational Design.
− Other issues: security classification problems often prevent sufficient collaboration; CONTEXT matters for social issues; tools should be crafted to support analysts, not replace them; and metadata tagging remains a problem.

2. What disciplines should be on the team? How do we choose the right ones and access them?
− Analytic ability and skills regardless of field.
− Open-minded and able to work across disciplines.
− Familiar with both military and OD processes.
− Have both field and background analysis capabilities.
− Build a reserve corps, like Civil Affairs, which can be tapped as needed; some team members must have PSYOP experience.
− Also needed: communications theory (especially campaigns and influence issues), planners, cultural expertise, FAOs, modeling/simulation and other technical people, linguists (translation), social scientists, and psychologists.
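The "test and compare using the same data sets" advice can be made concrete with a very small harness: score each candidate model against one shared data set with one common error measure. The data and both toy models below are stand-ins invented for illustration.

```python
# Minimal sketch of comparing candidate models on one shared,
# standardized data set. Data and models are illustrative stand-ins.

observed = [0.42, 0.45, 0.50, 0.48, 0.55, 0.60]  # e.g., monthly polling

def model_linear_trend(t):
    return 0.40 + 0.03 * t

def model_flat_baseline(t):
    return 0.50

def mean_abs_error(model, data):
    return sum(abs(model(t) - y) for t, y in enumerate(data)) / len(data)

for name, model in [("linear trend", model_linear_trend),
                    ("flat baseline", model_flat_baseline)]:
    print(f"{name}: MAE = {mean_abs_error(model, observed):.3f}")
```

The point is procedural rather than statistical: once the data set and error measure are fixed in advance, tool comparisons stop being arguments about whose demo looked best.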

37 Analytic Design Issues
3. What is the appropriate approach to measure effectiveness? What else needs to be measured?
− Step 1: Know the intent of the campaign or the conditions to be changed.
− Step 2: Then you can set measures up front and constantly refine them over time (iteratively).

4. How should we study the outcomes of our actions?
− COORDINATE – form a friendly network of interservice, interagency, government, and private partners.
− Tailor to sub-groups and integrate.
− Do it in steps – e.g., how much closer did I get to the goal? (For example, if the goal is 50% positive polling, track the trend from the beginning; see the sketch below.)
− Give your partners the collection requirements so they can collaborate.
− Don't rely on a single measure (e.g., not just polling).
− There should be different measures for different timeframes – short, medium, and long:
  Short – single behavior events (e.g., voting, obeying a curfew).
  Medium – trends in behavior (e.g., calling a reporting hotline).
  Long – underlying attitudes (you must understand what attitudes underlie your objectives, and then what behaviors reflect those attitudes, in order to measure them).
− Address both good and bad outcomes.
− Attitudes cannot be measured directly (polling can help but is not entirely reliable).
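The "track trends from the beginning" advice, together with the short/medium/long split, can be sketched in a few lines. All of the numbers and measure names below are invented for illustration; the 50% polling goal comes from the example on the slide.

```python
# Sketch of tracking progress toward a notional 50% positive-polling
# goal, with short-, medium-, and long-term measures kept separate.
# All figures and measure names are invented illustrations.

GOAL = 0.50
polls = [0.31, 0.33, 0.36, 0.35, 0.40, 0.43]  # positive-response share

baseline, latest = polls[0], polls[-1]
progress = (latest - baseline) / (GOAL - baseline)  # share of gap closed
print(f"{progress:.0%} of the way from baseline to the 50% goal")

measures = {
    "short (single events)": {"curfew_compliance": 0.82},
    "medium (behavior trends)": {"tipline_calls_per_week": [12, 19, 27]},
    "long (underlying attitudes)": {"positive_polling": polls},
}
# each timeframe is reported separately rather than as one merged score
for horizon, m in measures.items():
    print(horizon, m)
```

Keeping the timeframes separate matters because, as the slide notes, attitudes cannot be measured directly: a short-term behavioral spike is evidence, not proof, of attitudinal change.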

38 Analytic Design Issues
Gap: We need to fund longer-term studies on what kinds of observable behaviors reflect the attitudes we are likely to seek (e.g., what behaviors underlie acceptance of a "market democracy"?).

Further issue: Give people something positive, something to say "yes" to – something which reflects their self-interests and values. This approach might be more effective (studies could be sponsored to determine this), and it is also more likely to provide the types of objectives which lend themselves to observable, measurable behaviors.

39 WG 4: Counterinsurgency Findings/Recommendations
WG4 Charter: Explore various analytical tools and methods for use in planning and conducting counterinsurgency.

Findings/Recommendations:
− COIN analytical techniques applicable to general purpose forces (GPF) are equally applicable to special operations forces (SOF).
− Strengths and weaknesses of COIN analytical support for GPF are the same for SOF.
− Tools include deployed analysts and human-in-the-loop (HITL) computer-supported wargaming; COIN M&S is not mature (caution on the ability to model human behavior).
− Recommend USSOCOM develop a structure to provide analytical support to COIN forces.
− Recommend USSOCOM consider interdisciplinary teams.
− Recommend SOF training/education/familiarization with the benefits of analytical support.

This working group examined past, present, and future modeling and simulation (M&S) as well as non-M&S analysis capabilities that are applicable to USSOCOM counterinsurgency (COIN) planning and execution. DoD has deployed operations research analysts in the past to provide analytical support directly to general purpose force COIN operations and is presently doing so. DoD analysts have developed a wide array of non-M&S analytical tools and techniques that address COIN problem areas, all of which are potentially applicable to USSOCOM forces during COIN execution. The working group recommends that USSOCOM develop a structure to provide the same type of analytical support to special operations forces that operations research analysts are currently providing to general purpose forces. Traditionally, M&S has been a primary analytical technique for military planning; this is not the case for COIN. A good analytical technique for supporting COIN planning now is computer-supported wargaming, though its use is context specific: the technology is not mature enough to support some applications (e.g., programmatic issues). The difficulty stems from the inability of current M&S technology to capture human behavior in a satisfactory manner. DoD operations research analysts are generally not satisfied with our COIN M&S capabilities. Traditional combat modeling is rooted in the physical sciences, and our initial forays into simulating COIN operations have been designed and built in the same mold; it is not working very well. The DoD operations analysis community is working hard to bring the capability to where we want it to be, but we are not there yet.

40 Present – COIN Execution
COIN analytical techniques applicable to general purpose forces (GPF) are equally applicable to special operations forces (SOF): IED analysis, polling, social network analysis, ISR network analysis, assessment analysis, trend analysis, criminal activity profiling, RIO analysis, etc.
Strengths and weaknesses of COIN analytical support for GPF are the same for SOF: assessing influence on the population.
Problem area – data is at times obtained from untrained host-nation collectors.

As we have noted, operations research analysts have deployed in the past to support general purpose force COIN operations, and they are presently deployed in Iraq and Afghanistan. Analysts have developed a wide array of analytical tools and techniques that address COIN problem areas, all of which are potentially applicable to USSOCOM forces during COIN execution. Again, as noted earlier, there is a continuing, frustrating inability to address problems and questions that deal with human behavior and the effects of actions and policies on it. An additional problem area that will surface when providing analytical support to USSOCOM forces is obtaining data. Currently, there are many difficulties supporting general purpose forces regarding data provided by U.S. personnel. At best, the data is "dirty"; much effort is spent scrubbing, collating, and verifying it. It is like making sausage. Analytically supporting USSOCOM forces will, more than likely, involve obtaining data from host-nation sources, producing even dirtier data. (An illustrative scrubbing sketch follows.)
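The scrubbing work described above is mundane but dominates analyst time. The sketch below illustrates the kind of normalization involved — inconsistent dates, place names, and missing counts in field reports. The field names and records are hypothetical, not any actual reporting format.

```python
# Illustrative sketch of scrubbing "dirty" field-report data before
# analysis. Field names and report records are hypothetical examples.
from datetime import datetime

raw_reports = [
    {"date": "2009-01-15", "district": " Kandahar ", "ied_events": "3"},
    {"date": "15/01/2009", "district": "KANDAHAR",   "ied_events": ""},
]

def scrub(rec):
    # normalize dates reported in either of two common formats
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            rec["date"] = datetime.strptime(rec["date"], fmt).date()
            break
        except ValueError:
            continue
    # normalize place-name spelling and whitespace
    rec["district"] = rec["district"].strip().title()
    # empty counts become explicit missing values, not zeros
    rec["ied_events"] = int(rec["ied_events"]) if rec["ied_events"] else None
    return rec

clean = [scrub(dict(r)) for r in raw_reports]
print(clean)
```

With host-nation collectors the same pipeline would need far more of this: transliteration variants, duplicate reports, and counts of unknown provenance.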

41 Present – COIN Planning
Human-in-the-loop (HITL) computer-supported wargaming: an adequate way to provide insights now; federations of specialized simulations; wargame integration toolkits; must use caution – not mature enough for some contexts.
Models and simulation: warm and fuzzy – not! Emerging but still in its infancy.

Similarly, the analytical requirements for supporting COIN planning are the same for USSOCOM as for the other combatant commanders. Traditionally, a primary analytical technique for military planning has been computer-simulation models of combat; this is not the case for COIN. A good analytical technique for supporting COIN planning now is computer-supported wargaming. Analysts from the Office of the Secretary of Defense (OSD) and the Joint Staff, with assistance from the Services and combatant commanders, have developed a way to support wargames with a federation of specialized models, some of which address the political, military, economic, social, information, and infrastructure (PMESII) aspects of COIN operations. There are wargame integration tools, such as OZ, that increase the efficiency of computer-supported wargames by integrating wargames, simulations, rule-based systems, and data for the purpose of analysis; branching the game and recording it for statistical and data-mining analysis; and streamlining the process of using many wargame adjudication modules. A word of warning must be mentioned: the use of computer-supported wargames to support COIN planning is context specific. The technology is not mature enough to support some applications (e.g., programmatic issues). The difficulty stems from the inability of current M&S technology to satisfactorily capture human behavior. Operations research analysts are generally not satisfied with our COIN M&S capabilities. Traditional combat modeling is rooted in the physical sciences, and our initial forays into simulating COIN operations have been designed and built in the same mold; it is not working very well. The DoD operations analysis community is working hard to bring the capability to where we want it to be, but we are not there yet.

42 Present – COIN Planning
Substantial efforts are ongoing, in M&S as well as non-M&S.
Data is problematic across the board.
Context specificity: strategic, operational, tactical.
It is difficult to separate analytical implications between the levels of war (the "strategic corporal").
We realize the need for a conceptual framework for understanding and integrating causality across all levels of analysis.
Iterative process/dialogue.

Considerable effort within the DoD operations research community is going toward improving our non-M&S COIN analytical capability as well as our M&S COIN capability. Obtaining data is a challenge in both arenas. In the past, with our traditional computer combat models, we could address the strategic, operational, and tactical levels of war with different approaches and techniques. This is not the case with COIN. The COIN environment complicates things inasmuch as tactical events have direct operational and strategic impact, and strategic decisions dictate operational and tactical constraints. We in the DoD operations research community are realizing the need for a more expansive analytical conceptual framework and are reaching out to the community of social scientists. We are discovering that our linear processes will have to yield in favor of iterative processes that feature continuous dialogue with operators and decision makers.

43 Future – COIN Execution
Recommend USSOCOM develop a structure to provide analytical support to COIN forces:
− Established during planning – every operation is different.
− Diverse operating environments – varying footprints.
− Reachback analytical support.
− Support through GPF – when GPF are available.
Recommend SOF training/education/familiarization with the benefits of analytical support.

We recommend that USSOCOM develop a structure to provide the same type of analytical support to special operations forces that operations research analysts are currently providing to general purpose forces. Every COIN operation is unique for general purpose forces, and this is especially true for special operations forces. The operational planning phase should establish the why, where, when, and how of analytical support for each COIN operation. Forward-deploying operations research analysts will not be an option in most cases; maximizing reachback support and exploiting general purpose forces, when available, are options, and other creative approaches may surface. Simultaneously, we recommend that USSOCOM educate and familiarize special operations forces personnel on what operations research can bring to the table. An excellent example is the analysis handbook for commanders developed by the Center for Army Analysis.

44 Future – COIN Planning
Recommend USSOCOM consider interdisciplinary teams: centralized, decentralized, or hybrid.
Recommend USSOCOM look into a conceptual analytical framework to provide analytical support to USSOCOM COIN planning:
− Mr. Miller's trinity (crime, migration, extremism).
− Left of boom.
− Forecast the next hot spot.
− Correlation, not causality.

USSOCOM needs to leverage the talents and expertise of multiple social science disciplines to supplement the analytical processes in support of planning. We say more about interdisciplinary teams on a later slide, but here we want to point out how USSOCOM could proceed. Centralized, decentralized, and hybrid approaches are ways to organize analytical expertise to best support the command. A centralized approach incorporates dedicated social scientists within the command's analytical team; this approach is expensive to man and maintain. A decentralized approach taps into social science expertise when needed; this approach is more cost-effective, but it presents challenges for archiving lessons learned, sharing information among analysts, and maintaining continuity of analytical approaches. A hybrid approach could consist of minimally staffing social scientists and acquiring additional expertise when needed. In addition, we recommend that USSOCOM consider cultivating a conceptual analytical framework based on the triad with which Mr. Miller challenged the workshop during the keynote address. The emphasis is on prediction and prevention, and the reliance is on correlation, not causality. Although several analytical tools currently attempt to do this, none are built from the perspective that Mr. Miller set forth. (A hedged sketch of the correlation-screening idea follows.)
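The "correlation, not causality" framing can be illustrated with a small screening exercise: check whether indicator levels (crime, migration) correlate with subsequent violence, then flag districts whose current indicators fit the pattern. Everything below — data, threshold, and district indexing — is invented for illustration and makes no causal claim.

```python
# Hedged sketch of correlation-based hot-spot screening in the spirit
# of Mr. Miller's trinity. All data and thresholds are invented.
from statistics import correlation  # requires Python 3.10+

crime     = [2, 5, 9, 4, 7, 12]   # per-district indicator levels
migration = [1, 4, 8, 3, 6, 11]
violence  = [0, 3, 7, 2, 5, 10]   # incidents in the following quarter

print("crime vs violence:    ", round(correlation(crime, violence), 2))
print("migration vs violence:", round(correlation(migration, violence), 2))

# flag districts whose current indicators exceed the historical means
flags = [i for i, (c, m) in enumerate(zip(crime, migration))
         if c > sum(crime) / len(crime)
         and m > sum(migration) / len(migration)]
print("watch list (district indices):", flags)
```

This is deliberately "left of boom" triage: a correlated indicator justifies a closer look and earlier collection, not an inference that crime or migration causes the violence.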

45 WG 5: Thinking Models Findings/Recommendations
WG5 Charter: Frame the context of the IW problem properly, break IW operations down into their natural components, and investigate the subject through discourse and the application of systems thinking.

Findings:
− There are many ways to see and represent IW – different languages and logics.
− There is a lack of common terms and understanding about IW.
− IW analysis at the strategic, operational, and tactical levels may require different cognitive models, techniques, and representations.
− Modeling is difficult – we must learn to think differently and focus on uncovering indirect opportunities.
− We need tools that improve research capabilities and enhance thought and shared understanding.
− We need decision makers to shape and provide guidance: frame the problem; use visualization – make the whiteboard a "group thinking pad"; acquire a depth of understanding.
− The Operational Design process requires continuous learning, provides insight rather than answers, entails some risk, and identifies what we know and don't know about the problem.

This working group was charged with trying to answer the question, "How should we be thinking about IW?" It was also asked whether a systemic approach could better frame the problems and lead to a new set of solutions; its assessment is that the outcome is better described as a better understanding of IW. Operational Design, used as a process methodology and led by LTC Yancey, assisted the group in developing "thinking models" of various aspects of IW. A "group thinking pad" was used to visualize the concepts associated with a complex dynamic system, which facilitates group understanding and learning. Operational Design leverages the concepts of ontology (the study of the nature of being and existence) and epistemology (the study of the nature and scope of knowledge). During this two-day exercise, the group was exposed to the Operational Design methodology and participated in the early stages of the investigation process. There is much more work to be done to hammer out the logic of the sub-systems and how they relate to the system as a whole. Once the relationships are represented, they will begin to reveal insights into opportunities that could be exploited to transform the system to a more favorable posture. A working group recommendation is to establish a Community of Interest (COI) across all domains to continue the process of understanding IW as a system. One venue is the bi-weekly VTC conducted by USSOCOM J-10 and the TRAC-led IW Working Group.

46 Gaps
There is a gap between our analytical capability and our commanders' operational needs.
The repository of the IW "body of knowledge" has not been clearly identified (an IW online library is needed).
There is a relational, supportive, and authority gap between the military and "the interagencies" on IW.
We do not understand interagency lines of communication.
We do not understand how to balance government capacity for "restoration of services," security, and economic development.
We do not know the modeling requirements for IW analysis.
Many do not know about the IW community hubs, potential data sources, or samples of IW activities available from Joint Data Support.

Some common themes identified across the working groups are: there is a relational, supportive, and authority gap between the military and "the interagencies" on IW, and the challenges extend well beyond DoD's traditional boundaries, requiring interagency and coalition collaboration; there is a gap between our analytical capability and our commanders' operational needs; and the repository of the IW "body of knowledge" has not been clearly identified (i.e., there is a need for a comprehensive IW online library).

47 Key Issues & Discussion Items
Our current metrics don't capture the qualitative aspects of conflict that commanders need.
We have voids in our data and very little cause-and-effect data (e.g., temporal effects require years or decades of observations).
There is no "owner" of a common lexicon.
We lack sufficient analysts/SMEs with DIMEFIL (Diplomatic, Informational, Military, Economic, Financial, Intelligence, Law Enforcement) experience.
We must identify the differences between "indicators" and "effects" and understand that some effects are not quantifiable (e.g., measuring persuasion and influence).
We have not retained our history of IW. How do we bring it back? We need to leverage that operational experience and those earlier insights.
There are different levels of IW that require very different tools.

As noted by our opening tutorial speaker, the largest issue with IW stems from the lack of an overarching strategy appropriately linking not only "whole of government" activities but also those of non-governmental and coalition partners. We also need to develop a common lexicon to communicate better with each other, and to improve our ability to build, retain, and share our knowledge, methods, and tools in support of the planning and conduct of IW activities. We need broad interdisciplinary teams to help us think about people, populations, and IW in general. We need to be able to capture the qualitative aspects of conflict as well as the quantitative ones; a large variety of interdisciplinary qualitative and quantitative approaches will be required to address the various problem areas. We have not retained our history of IW, and we need to bring it back by leveraging that operational experience and those earlier insights. Many of the analytical methods, models, and tools currently in use to support the Department can also be adapted to meet USSOCOM's requirements.

48 Recommendations
Identify, create, and sustain credible IW data: it will require iteration to decide on the data needed and to characterize it (e.g., metadata, pedigree) – see the notional pedigree sketch below.
Develop a lexicon of key terms: current definitions are not acceptable to the interagency and coalition partners.
Continue the dialogue on methods, models, and tools (MMTs) to support IW analyses: this workshop represents a significant step forward, and more dialogue is needed with whole-of-government participation.
MORS provide a forum to help organize the needed information: create a common template to compare and contrast key IW models and tools, and continue to support efforts to identify key gaps and priorities to guide future actions.
MORS and sponsors assist in bringing the various IW Communities of Interest (COI) together, e.g., the IW Working Group, Human, Social, Cultural Behavior (HSCB) modeling, and the MORS Social Science Community of Practice (COP).
Support Service initiatives to put operations research analysts on SOF operational staffs.
Invite more allies and interagency partners to these meetings.

As an initial step toward meeting the identified challenges, the following recommendations are made: identify, create, and sustain credible IW data, which will require iteration to decide on the data needed and to characterize it (metadata, pedigree) in order to meet analysts' needs; develop a common lexicon of key terms (some current definitions are not acceptable to coalition and interagency partners); continue the dialogue on the methods, models, and tools which can support IW analyses (the IW workshop was a good start, and more dialogue is needed with whole-of-government participation); have MORS provide a forum to help organize the needed information (a wiki site; a common template to compare and contrast key IW models and tools); and have MORS and the sponsors assist in bringing the various IW Communities of Interest (COI) together.
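The "metadata; pedigree" recommendation can be made concrete with a small record structure: every IW data set carries its provenance, collection method, caveats, and transformation history so later analysts can judge its credibility. The fields below are assumptions for illustration, not an established DoD metadata standard.

```python
# Notional sketch of a data-pedigree record for IW data sets.
# Field names and the example content are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class DataPedigree:
    name: str
    source: str              # who collected it
    collection_method: str   # poll, field report, open source, ...
    classification: str
    caveats: list = field(default_factory=list)
    transformations: list = field(default_factory=list)  # scrubbing steps

poll = DataPedigree(
    name="Province X opinion poll, Q1",
    source="contracted host-nation polling firm",
    collection_method="face-to-face survey, n=1200",
    classification="UNCLASSIFIED",
    caveats=["access to two districts limited by security"],
)
poll.transformations.append("responses reweighted to census age bands")
print(poll)
```

Carrying the transformation history with the data is what turns "dirty data was scrubbed" from folklore into something the next analyst can audit and, if necessary, undo.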

49 Questions?

