Slide 1: Outbrief, IW II
MORS Irregular Warfare II Analysis Workshop, 3-6 February 2009
Davis Conference Center, MacDill AFB, Florida
Chair: Renee G. Carlucci, CAA; Co-Chair: Don Timian, ATEC
Final Outbrief for SOCOM
NATO UNCLASSIFIED

Slide 2: Agenda
- Overview
- Tutorial and Plenary Session
- Observations
- Key Issues and Discussion Items
- Gaps
- Recommendations

Slide 3: Overview
- Title: Irregular Warfare Analysis Workshop
- Organization
  - 3-6 Feb 09, held at the SECRET/REL to AUS/CAN/GBR/USA level at MacDill AFB, Tampa, FL
  - Program chairs: Ms. Renee Carlucci, Mr. Don Timian, and LTC Clark Heidelbaugh
  - 5 working groups: Global Engagement; Stability, Security, Transition, and Reconstruction (SSTR); Information Operations (IO)/Psychological Operations (PSYOP)/Social Sciences; Counterinsurgency (COIN); and Thinking Models About IW
  - Synthesis Group
- Objectives
  - Frame the toughest IW problems that SOCOM and DoD are facing
  - Bring special operators, analysts, and problem solvers together to help define IW analysis problems, explore techniques to deal with these problems, identify what has and has not worked, and determine recommended ways ahead
- Context
  - Changing strategy with greater emphasis on Irregular Warfare (NDS, QDR, IW JOC, etc.)
  - Continuing emphasis on building partnership capacity (BPC Roadmap, TSC, etc.)
  - Limited policy, processes, tools, and methods for Irregular Warfare activities
  - Focus on partnering operators with analysts and problem solvers to gain operator perspectives
- Scope
  - Military dimension of conflict in Phases 0-5 of DoD's campaign planning construct
  - Tools/methods, algorithms, historical/current data sources, ongoing analysis, and opportunities for collaboration on future analysis and tool development

Slide 4: Tutorial, Plenary, and Keynote Speakers
- Irregular Warfare Joint Operations Concept: Mr. Jeffery (Gus) Dearolph, Deputy Director Internal, SOCOM J10
- Lessons from the Irregular Warfare Methods, Models, Techniques: COL Jeff Appleget, TRAC
- Summary of the Improving Cooperation Among Nations for Irregular Warfare Analysis Workshop: Dr. Al Sweetser, Director, OSD PA&E SAC
- Operational Design: LTC Reb Yancey, SOCOM SORR-J8-Studies
- Sponsor's Welcome: Mr. E. B. Vandiver III, FS, Director, Center for Army Analysis
- Keynote Speaker: Mr. William J. A. Miller, Director, Strategy, Plans, and Policy, SOCOM

Slide 5: Irregular Warfare Joint Operations Concept (Mr. Dearolph)
- There is continuing friction over the IW definition across Services, agencies, the interagency, and among allies
- There is a lack of grand strategy and a failure to understand the population
- Key IW factors: indirect, enduring, persistent, proactive, population-centric, respect for legitimate sovereignty, linked to an over-arching strategy
- IW consists of:
  - Key missions (e.g., FID, UW, COIN, CT, Stability Ops)
  - Key activities (e.g., strategic communications, IO, PSYOP, intelligence, counter-intelligence, support to law enforcement)
- IW military leadership:
  - JFCOM for General Purpose Forces (GPF)
  - SOCOM for Special Operations Forces (SOF)

Slide 6: Lessons from the Irregular Warfare Methods, Models, Techniques (COL Jeff Appleget)
- "IW focus is on the population"
- "COIN is the key when insurgents exert more influence on local populations than the national government"
- The IWMmAWG study established a 7-element framework
  - Identified 35 gaps, 34 of them related to data and the social sciences
- Analytical approach:
  - Now: top-down, Western perspective (DIMEFIL-PMESII)
  - Soon: bottom-up, employing social sciences expertise
  - Track strategic-level Methods, Models, and Tools (MmTs)
- Iterative development of "key data" is central
- Overall needs:
  - Create credible, relevant MmTs to address decision-maker issues
  - Make social scientists integral members of the analysis team
- Continue community-wide dialogue through the IW Working Group

Slide 7: Improving Cooperation Among Nations for Irregular Warfare Analysis Workshop (NPS) (Dr. Al Sweetser, Director, OSD PA&E SAC)
- There is value in having international participants from many different nations
- Emphasized the importance of a "Whole of Government" approach
- Useful to conceptualize the problem as a "Complex Adaptive System" (e.g., act, react, re-react, ...)
- Consider a hybrid approach (e.g., wargame-model-wargame)

Slide 8: Systemic Operational Design (SOD) (LTC Reb Yancey, SOCOM J8-Studies)
- IW is a "wicked problem"
- Akin to relearning COIN analysis approaches (Vietnam / Iraq)
- SOD employs a structured method of inquiry that enables a commander to:
  - Make sense of a complex situation
  - Capture understanding
  - Share the resulting visualization
- SOD is a method of inquiry, is based on discourse, and creates a learning system
- Requires accepting humility and valuing heresy
- Means challenging the information and the boss
- To deal with a dynamic complex system, one needs to explore the interactions among the key parts ("hermeneutics")

Slide 9: Keynote Speaker (Mr. William J. A. Miller, SOCOM Director, Strategy, Plans, and Policy)
- "IW is about populations"
- In analyzing IW issues, a Lanchester view is not useful
- "Behave," not kill, our way to victory: shape vs. exploit; synthesis, not analysis; transforming is satisficing whereas solutions are optimizing; presence changes the problem
- Be as "un-wrong" as possible in conceptualizing a global perspective on issues
- Globalization challenges and threats to the US: migration, crime, extremism
- SOCOM challenges: be upstream (leverage), turn down the heat (affect), engage in dialogue with senior decision makers
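The "Lanchester view" the keynote dismisses can be made concrete. The sketch below integrates the classical square-law attrition equations (dB/dt = -aR, dR/dt = -bB) with simple Euler steps; all force sizes and kill-rate coefficients are hypothetical, and the point is only to illustrate the force-on-force framing that the speaker argues does not fit population-centric IW:

```python
# Minimal sketch of Lanchester's square-law attrition model, the
# force-on-force framing argued to be ill-suited to IW analysis.
# Force sizes and kill-rate coefficients are hypothetical.

def lanchester_square(blue, red, a, b, dt=0.1, steps=1000):
    """Euler integration of dB/dt = -a*R, dR/dt = -b*B until a side is annihilated."""
    for _ in range(steps):
        # Simultaneous update: both derivatives use the pre-step values.
        blue, red = blue - a * red * dt, red - b * blue * dt
        if blue <= 0 or red <= 0:
            break
    return max(blue, 0.0), max(red, 0.0)

# Square law: fighting strength scales with the square of numbers, so a
# 2:1 numerical edge beats a 2:1 edge in per-shooter lethality.
blue_left, red_left = lanchester_square(blue=1000, red=500, a=0.02, b=0.01)
```

Here the analytic invariant b*B^2 - a*R^2 is positive for blue, so blue wins with roughly 700 remaining; nothing in the model says anything about populations, which is the keynote's objection.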

Slide 10: Working Group Observations
- The working groups (WGs) were highly partitioned by their titles and topic areas (tough to find overlap)
- WGs employed from 4 to 9 presentations in their sessions, a total of 30 different workshop presentations
- WGs ranged in size from 16 to 50 members; the "modeling IW" WG had the highest numbers
- WGs recognized that they have more challenges and tasks than they can handle in a three-day workshop
- WGs have heart and intellectual energy but are limited by clock time and "soak time"
- WGs would like to "sit in" on other working groups (serial vs. parallel information meetings)

Slide 11: General Observations
- We are still struggling with the exact meaning and breadth of irregular warfare (bounding and characterization)
- We are not familiar with the agencies that understand or have jurisdiction over DIMEFIL and PMESII
- "Models and tools" do not equal "computer programs and computer models"
- Wargaming with the right players offers a powerful technique for discovery
- Graphics in a storyboard approach have a prominent place in IW for displaying and understanding influences
- Everyone is talking about data: its definition, its meaning, its form, and who is collecting, processing, and storing it
- No consensus on what information does exist, should exist, or who is or should be responsible; regardless, the complexity of the situation transcends the data
- VV&A remains an open issue for IW models and data

Slide 12: WG 1, Global Engagement: Findings/Recommendations
- Charter: Provide recommendations on appropriate analytical techniques to prioritize, plan, and assess Theater Security Cooperation activities, to assist the COCOMs in addressing the analytical challenges they currently confront.
- Key questions:
  1. How should objectives and indicators be structured?
  2. How does an assessment division with four to ten people measure and maintain the baseline?
  3. How can you determine the right activities to support partner nations while making the most progress toward desired end-states?
  4. Is it possible to measure cause and effect in a complex system?
  5. What is missing from this process?
- Takeaways:
  - Many of the effects are potentially unquantifiable (and will remain so); the challenge remains informing decision makers given this constraint
  - Interagency analytical resources can assist and are essential
  - Don't just accept objectives or rush to create them; focus on shaping objectives as well as measuring progress. Reframe.
  - IW analysis will affect traditional analytical paradigms: messy data, cause and effect, no easy "one-size-fits-all" toolset
  - Effective security cooperation exceeds the boundaries of DoD's authorities and capabilities

Slide 13: Assessment Techniques: Baseline, Trends, Forecasting

Quantitative data
   Description: Collection of input/output data associated with activities and generic country indicators.
   Examples: TCSMIS database; country indices on corruption, economic growth, security, etc.
Polling and surveys
   Description: Public opinion or opinion of targeted groups.
   Examples: View of the U.S. before and after a USNS Comfort port visit.
Content analysis
   Description: Survey popular media for identified themes.
   Examples: Failed States Index (Fund for Peace).
Expert opinion
   Description: Subject matter expertise and focus groups.
   Examples: TSC Working Group objective identification.
Modeling/simulation and gaming
   Description: Simplified representation of a complex system.
   Examples: COMPOEX (PACOM).

Slide 14: Activities (14 engagement tools)
1. Combined/Multinational Education
2. Combined/Multinational Exercises
3. Combined/Multinational Experimentation
4. Combined/Multinational Training
5. Counter-/Non-Proliferation
6. Counter-Narcotics Assistance
7. Defense & Military Contacts
8. Defense Support to Public Diplomacy
9. Facilities & Infrastructure Support Projects
10. Humanitarian Assistance
11. Information Sharing/Intelligence Cooperation
12. International Armaments Cooperation
13. Security Assistance
14. Cross-Cutting Programs

Slide 15: WG1 Findings and Suggestions
Q1. How should objectives and indicators be written, structured, and prioritized?
- Ideally, a comprehensive set of metrics should be identified; where that is not possible, indicators should be MOEs rather than MOPs.
- Beware: decomposition can be endless. "If you can't measure the objective, then you have no objective!"
- Involvement of the analyst in structuring the specific language used in objectives is essential; it must mean something analytically. Embed an analyst in the strategy division?
- SME qualitative indicators: are they valid and consistent between different experts? You may not be able to trend the data; there will be limits here. Prefer something quantitative, but don't take judgment out of the process.
- Consider prioritizing indicators on the basis of which are most important, not which are easiest to collect (an embedded analyst can assist).
- Don't forget to re-evaluate which indicators you are using; the Iraq experience was one of looking at MOPs rather than MOEs to assess progress. Reframe the problem at the objective level and reprioritize when necessary; goals should be achievable.

Slide 16: WG1 Findings and Suggestions
Q2. How should a baseline be established and maintained?
- Identify the indicators before looking for data; this allows you to identify gaps in the data you collect.
  - Put effort into thinking about inclusive measures, not a laundry list
  - End states and steady states
- Use, wherever possible, existing government or reliable data sources. Be aware of the origin of data where sources may not be reliable.
  - Be aware of the dangers of active versus passive data collection
  - Some indicators are useful for forecasting, others not. There will be some universal measures, such as child mortality rates, that will be good indicators across a range of objectives.
- Aggregate diverse data elements into a composite index; it will show trends.
  - Need the correct SMEs to interpret the data; most data will be messy.
  - If you measure too often, you may affect the system.
- If you can't present your data reliably, you've failed. Map backgrounds, cartographic displays, and trends work well; for stop-light charts, define the criteria.
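The aggregation step recommended above (diverse data elements into a composite index that shows trends) can be sketched in a few lines. The indicator names, values, ranges, and equal weighting below are invented for illustration only: each indicator is min-max normalized to a common 0-1 scale, "lower is better" measures (such as child mortality) are flipped, and the normalized scores are averaged:

```python
# Sketch of aggregating diverse data elements into a composite index.
# Indicator names, values, ranges, and equal weighting are illustrative.

def normalize(value, lo, hi, lower_is_better=False):
    """Min-max scale to [0, 1]; flip indicators where lower is better."""
    score = (value - lo) / (hi - lo)
    return 1.0 - score if lower_is_better else score

def composite_index(indicators):
    """Equal-weight average of normalized indicator scores."""
    scores = [normalize(v, lo, hi, flip) for v, lo, hi, flip in indicators]
    return sum(scores) / len(scores)

# Hypothetical country baseline: (value, range_min, range_max, lower_is_better)
year_1 = [
    (45.0, 0.0, 100.0, True),   # child mortality per 1,000 (lower is better)
    (2.5, -5.0, 10.0, False),   # GDP growth, percent
    (30.0, 0.0, 100.0, True),   # corruption index (lower is better)
]
year_2 = [
    (38.0, 0.0, 100.0, True),
    (3.5, -5.0, 10.0, False),
    (28.0, 0.0, 100.0, True),
]

trend = composite_index(year_2) - composite_index(year_1)  # positive = improving
```

As the slide cautions, the resulting number is only as good as the SMEs who choose the indicators, ranges, and weights; the index's value is in showing a trend, not in its absolute magnitude.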

Slide 17: WG1 Findings and Suggestions
Q3. How should developing partner nations' security forces be evaluated and supported?
- Focus on sustainability (institutional change, 15 years)
  - Trust and confidence
  - Build the professional military education schoolhouse before going out on the rifle range
  - Target/create/instill/develop the cadre of professionals
- Assessment methods for building security institutions
  - The Defense Resource Management Study (DRMS) project is difficult to implement for under-developed institutions
  - Comprehensive baseline surveys must be conducted (e.g., a U.S. Country Team or SOF site survey). Consider the host nation's security forces, not just its military.
  - Can we do that with other U.S. government institutions? Authorities and treaties are issues. Involve other allies where required.
- Assessment measures must be tailored to each country's unique security requirements, authorities, and situation
  - Existing U.S. assessment measures may be considered for establishing a baseline or an appropriate framework
  - Negotiate a suitable role/end-state for each partner nation's forces
  - Leverage the capacity of other allies to help build regional capacity
  - You don't necessarily need a U.S. level of performance to be successful

Slide 18: WG1 Findings and Suggestions
Q4. How would you begin to address analyzing cause and effect?
- You can't easily get to cause and effect. Is measuring effect enough for the COCOMs to make good decisions?
  - Without cause and effect, how do we build models?
  - Be realistic about the level of perfection that can be achieved; "better than a coin toss" may be an appropriate standard
- Make more structured use of SMEs
  - Use techniques to add scientific rigor to SME contributions: pair-wise comparison, gaming, structured interviews, role-playing, value-focused thinking
  - SME selection remains important; encourage diversity of opinion (groupware)
  - Try to think through to potential second- and third-order effects
- Other techniques that may be valuable
  - Historical analysis, electronic markets, risk-consequence management
  - Near-real-time data is required for insights on causal relationships
  - Modeling needs to be issue-specific, at least initially, and we need to be able to look under the hood (no black boxes; we need insights, not just answers)
- Understand the lag between action and response in the system
  - System dynamics
  - What is the ideal refresh rate for indicators and for reframing objectives? It may differ from one indicator or objective to the next.
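One of the structured-SME techniques listed above, pair-wise comparison, can be sketched with an AHP-style geometric-mean weighting. The factors and the comparison judgments below are hypothetical; the point is that a small matrix of "how much more important is A than B" judgments yields a defensible, normalized set of priority weights:

```python
# Sketch of deriving priority weights from SME pair-wise comparisons
# (AHP-style, geometric-mean approximation). Factors and judgments are
# invented for illustration.
import math

factors = ["security", "governance", "economy"]

# pairwise[i][j] = how much more important factor i is than factor j;
# the matrix is reciprocal (pairwise[j][i] = 1 / pairwise[i][j]).
pairwise = [
    [1.0,   3.0, 5.0],   # security judged 3x governance, 5x economy
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

# Geometric mean of each row, then normalize so the weights sum to 1.
row_gm = [math.prod(row) ** (1 / len(row)) for row in pairwise]
total = sum(row_gm)
weights = [g / total for g in row_gm]
```

The same judgments can be re-elicited from several SMEs and compared, which gives the "valid and consistent between different experts" check called for on the Q1 slide.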

Slide 19: WG1 Findings and Suggestions
Q5. What is missing from the process?
- Consider the link between the indicators required for the baseline and those for measuring the effects of activities. Is there a common set?
- Activity identification is immediately resource-constrained
  - Need to identify the unconstrained requirement to estimate risk
  - Where in the process do we do the risk evaluation?
- Policy incentives to encourage regional development
  - NATO was a strong incentive for development
- Stronger links between the COCOMs and OSD PA&E and Policy
  - Understanding resource constraints earlier in the process will assist with assessing IPL requests and creating new authorities, policies, and funding vehicles
- Design new engagement tools to meet regional security challenges
- Potential misalignment of assessment resources to assessment requirements
  - Continue to prioritize objectives and indicators

Slide 20: WG 2, Stability Operations: Findings/Recommendations
- Charter: Identify SO challenges and problem areas to be solved, and identify analytical methods that might help solve them
- Critical insights:
  - No single method, model, or simulation will provide a complete answer, but many can provide results to help inform decisions in one or more areas
  - Several of the tools can be used immediately, and many under development have promise
  - Identification of metrics is absolutely critical
  - Identification and collection of relevant data is difficult but must be done
- Key takeaways:
  - Though Stability Operations is only a part of Irregular Warfare, it still presents a large problem space
  - Challenge areas presented by different agencies had some common threads: determination of demand/requirements, prioritization of efforts/risk management, determination/use of metrics, and attaining a "whole of government" approach
  - Many challenge areas are not adequately addressed by current analytical methods, models, and techniques
  - Many promising methods, models, and techniques are in development

Slide 21: Working Group 2 Challenge Areas

1. Foreign Security Forces
   Description: Need a coherent plan for building and training Security Force Assistance (SFA) forces.
   Severity/Impact: Supply of forces may not be available to meet demand from the COCOMs.
   Difficulty with solutions: Coordination with interagency elements is inadequate.
2. Identification of SFA Requirements (missions, etc.)
   Description: Need to understand and identify the demands driving SFA requirements.
   Severity/Impact: Related to the challenge area above; need to identify demand in order to plan resources and schedule training.
   Difficulty with solutions: Need clarification by identifying total US government demand and then identifying the DOD piece.
3. Prioritization of SFA Requirements
   Description: Need tools/methods to prioritize SFA activities.
   Severity/Impact: Each COCOM has high-priority requirements but not enough resources to fill needs.
   Difficulty with solutions: Who has authority to prioritize between COCOM requirements?
4. Personnel Tracking
   Description: Determine and track training, skill sets, and experience relevant to SO.
   Severity/Impact: Some missions may require special skills or experience. Who has them?
   Difficulty with solutions: Need more than skill identification; consider implications for career paths.

Slide 22: Working Group 2 Challenge Areas

5a. Information Preparation of the Operational Environment
   Description: Determine what needs to be done within each sector: determine causes/fixes; reconstruction requirements for self-governance; how to recognize when "self-governing" has been achieved.
   Severity/Impact: Without this information, resources may be misapplied or inappropriate actions may be taken.
   Difficulty with solutions: Who has responsibility for this? Different agencies have different perspectives. Needs must be relevant to the host nation.
5b. Information Preparation of the Operational Environment
   Description: Determine potential partners and what they can do: the affected government/society; international partners (donor nations, humanitarian and financial organizations, non-governmental organizations); USG agencies.
   Severity/Impact: Without this information, inefficient or ineffective efforts may result.
   Difficulty with solutions: Who has responsibility for leading or coordinating this effort?

Slide 23: Working Group 2 Challenge Areas

5c. Information Preparation of the Operational Environment
   Description: Determine how to achieve unity of effort: collaborative and cooperative architectures; public diplomacy and strategic communications.
   Severity/Impact: Without unity of effort, inefficient or ineffective efforts may be initiated that fail to meet needs and waste resources. This sends a negative message to the host nation.
   Difficulty with solutions: Information sharing is hindered by a lack of common terminology and by political issues.
5d. Information Preparation of the Operational Environment
   Description: Determine how to measure progress toward achieving objectives: quantitative metrics; qualitative metrics.
   Severity/Impact: Without proper metrics to measure progress, no way exists to determine whether certain projects or interventions are working or remain appropriate.
   Difficulty with solutions: Need prior identification of goals/objectives. When is "good enough" achieved?

Slide 24: Working Group 2 Challenge Areas

6. What capability and capacity does DOD need for sectors other than security?
   Description: DOD is both a supported and a supporting agency for SO, and therefore must know what is needed. What factors should be considered when prioritizing support? Need metrics to evaluate performance.
   Severity/Impact: Supply of forces of the appropriate type may not be available to meet demand from the COCOMs.
   Difficulty with solutions: Requires decisions and guidance outside DOD. "Restore" is relatively clear, but "support" is more open-ended.
7. Security Force Assistance
   Description: Need to identify overall SFA demand; need a process to identify and prioritize the SFA needs of partners; need metrics to evaluate performance.
   Severity/Impact: Meeting overall demand has implications for the SOF/GPF and AC/RC mix.

Slide 25: Working Group 2 Challenge Areas

8. Lethal and non-lethal capabilities
   Description: Must use a mix of methods to set conditions supporting other instruments of power.
   Severity/Impact: Must establish security for progress but not totally alienate relevant populations.
   Difficulty with solutions: Non-lethal capabilities are more than rubber bullets and tear gas.
9. How should the military support reconstruction and stabilization policy and strategy?
   Description: This requires actions in Doctrine, Organization, Training, Materiel, Leadership, Personnel, and Facilities.
   Severity/Impact: This could be a significant resource issue. Leaders must be able to make informed decisions.
   Difficulty with solutions: Need to clearly understand other agencies' capabilities and intent with respect to this area.
10. How do information and info ops support and nest within stability operations?
   Description: Information operations and strategic communications must be informed by data and send consistent messages.
   Severity/Impact: Inconsistent or late info ops and strategic comms make the US look bad and can be exploited by rivals and opposition media.
   Difficulty with solutions: Can the problem be solved analytically?

Slide 26: Working Group 2 Challenge Areas

11. Do the joint and service task lists sufficiently address the range of activities required to conduct joint stability operations?
   Description: Must ensure unit missions and Mission Essential Task Lists are updated and that doctrine and training are appropriate.
   Severity/Impact: Supply of forces of the appropriate type and capability may not be available to meet demand from the COCOMs.
   Difficulty with solutions: Army and Joint task lists were recently reviewed as part of the Army's SO Capabilities Based Assessment; must review the other services' task lists.
12. Is the military's current approach sufficient for operations where the focus is on "relevant populations" and not an enemy force?
   Description: This requires actions in Doctrine, Organization, Training, Materiel, Leadership, Personnel, and Facilities.
   Severity/Impact: This could be a significant resource issue. Leaders must be able to make informed decisions.
   Difficulty with solutions: Must be able to identify the status of the military's ongoing efforts in order to assess. Can the problem be solved analytically? How do you measure a "sufficient" approach?

Slide 27: Working Group 2 Challenge Areas

13. How do we best determine appropriate MOEs and MOPs for full spectrum operations?
   Description: Must identify metrics that cover the range of sectors, span the strategic to tactical levels, and cover immediate response, transition, and sustaining efforts.
   Severity/Impact: Without proper metrics to measure progress, no way exists to determine whether certain projects or interventions are working or remain appropriate.
   Difficulty with solutions: Can the metrics be modeled and/or simulated? If some metrics are "qualitative," how do you evaluate them?
14. How does the military support emerging security initiatives and DOD policy on Security Sector Reform?
   Description: Must identify how support requirements change with changing policies in this area.
   Severity/Impact: This could be a significant resource issue. Leaders must be able to make informed decisions.

Slide 28: Working Group 2 Challenge Areas

15. How does the US military's approach nest within the emerging body of interagency, intergovernmental, and allied approaches to reconstruction and stabilization?
   Description: Must identify how force requirements change with changing approaches in this area.
   Severity/Impact: This could be a significant resource issue. Leaders must be able to make informed decisions.
   Difficulty with solutions: Can the problem be solved analytically?

Slide 29: Working Group 2 Methods, Models, Simulations

1. Integrated Gaming System
   Description: Flexible definition of infrastructure and factions; stochastic.
   Areas addressed: Faction satisfaction; infrastructure status; military impacts; operational insights.
   Current status: In use.
2. PSOM
   Description: Strategic- and operational-level training assessment; social behavioral response; stochastic/deterministic.
   Areas addressed: High-level pol/mil; gain insights on the operational impacts of high-level decisions and resource allocations.
   Current status: In use.
3. Nexus Network Learner
   Description: Societal assessment; Bayesian; stochastic; agent-based; modular; adaptive to data and other models.
   Areas addressed: Assess DIME impact (different COAs) on social changes; examine modification of behavior.
   Current status: In use, with continuing development.

Slide 30: Working Group 2 Methods, Models, Simulations

4. Primary Force Estimator (PRIME, ATLAS model)
   Description: Task-based, using approved rules of allocation; includes geospatial considerations; quick-turn results; deterministic.
   Areas addressed: Army forces only.
   Current status: Under development.
5. Wargaming
   Description: Human-in-the-loop board game method; focused on security; regional and overall theater focus.
   Areas addressed: Assess force levels required to respond to different levels of violence; integration with other DIME aspects of the campaign.
   Current status: Established and in use; expansion of capabilities underway.

Slide 31: Working Group 2 Methods, Models, Simulations

6. Task Event Outcome (TEO) IW Analysis
   Description: Tactical focus; human-in-the-loop wargame; quick turnaround; ties in to Lines of Effort (LOE).
   Areas addressed: Analyze changes in organization, equipment, and TTP; identify required changes in MOEs (different metrics); capture, analyze, and disperse experience/information gained in the field; link actions through LOEs to higher PMESII states.
   Current status: Requirements under development.
7. Workshops
   Description: Provide an established structure to examine non-quantitative issues; SME-based; senior-level reviews.
   Areas addressed: Investigate a wide variety of issues related to stability operations.
   Current status: Available and in use.

Slide 32: Working Group 2 Methods, Models, Simulations

8. Contingency Operations Tiger Team (COTT)
   Description: Provides recommendations relating to USACE and ERDC R&D, analysis, and studies for reconstruction, stability, contingency, aid, and relief efforts.
   Areas addressed: Understand challenges and build collaborative solutions to complex problems in reconstruction and stability efforts; develop, identify, and validate potential R&D solutions to strategic- and mission-level stability and reconstruction challenges; link capabilities from different sources or programs.
   Current status: COTT formed; collaborations being established.
9. Agent-based model for cultural geography in SO
   Description: Stochastic; agent-based; stand-alone tool; tactical focus.
   Areas addressed: Evaluate the impact of SO infrastructure projects on social perceptions.
   Current status: Under development.

Slide 33: Working Group 2, Session 2 Assessment ((A) = Available Tool) [assessment matrix not captured in the transcript]

Slide 34: WG2 Findings & Suggestions
- Findings:
  - Even though everyone agrees that Stability Operations requires whole-of-government, non-government, coalition, and host-nation/public participation, most of our methods, models, and techniques do not account for all of them
  - Many of the challenge areas appear to be indirect results of an absence of overarching strategies and goals
  - It is hard to understand how some tools, methods, and models work without common terms of reference; the same is true for data
- Suggestions:
  - Develop common terms of reference for understanding how tools, methods, and models work and for describing data
  - Ensure future collaboration efforts continue and expand to include the entire SO community of interest

Slide 35: WG 3, IO/PSYOP/Social Sciences: Findings/Recommendations
- Charter: Improve the foundations of information operations/PSYOP analysis; identify existing analytic capabilities and shortfalls; explore the application of quantitative and qualitative methods for improving analytical capabilities; evaluate and recommend concrete applications.
- Key takeaways:
  - A coherent taxonomy and lexicon of IO is required: analysts and operators must use the same set of definitions
  - Models, methods, and tools must provide mechanisms for learning and understanding of the problem, not prediction
  - Coordinate PSYOP across related combined, joint, and interagency arenas
  - Develop robust case studies that capture a full problem set to greatly benefit exercises, education, and training
  - Non-kinetic assessment (MOPs, MOEs) must be in the initial plan
  - Key gaps in PSYOP capabilities must be resolved by other means (traditional social sciences and ORSA approaches may assist): red teaming, evolutionary development of M&S, enhanced wargaming (Phase 0), human terrain and media analysis

Slide 36: Analytic Design Issues
- How do we appropriately choose models, methods, and tools for OD in PSYOP?
  - Generic tools that can be fine-tuned to the situation through social discourse (like the MpiCE project, Measuring Progress in Conflict Environments, which provides a list of MOEs for an organization's SMEs to choose from and tailor to the specific situation)
  - Develop different solutions that you can test
  - Know the TYPE of your problem
  - Test and compare using the same data sets
  - Get a conformal, standardized data set
- What disciplines should be on the team? How do we choose the right ones and access them?
  - Analytic ability/skills, regardless of field
  - Open-minded and able to work across disciplines
  - Familiar with both the military and the OD process
  - Have both field and background analysis capabilities

Slide 37: Analytic Design Issues
- What is the appropriate approach to measure effectiveness? What else needs to be measured?
  - Step 1: Know the intent of the campaign or the conditions to be changed
  - Step 2: Then you can set measures up front and constantly refine them over time (iteratively)
- How should we study the outcomes of our actions?
  - COORDINATE: form a friendly network of interservice, interagency, government, and private partners
  - Tailor to sub-groups and integrate
  - Do it in steps, e.g., how much closer did I get to the goal? (e.g., with a goal of 50% positive polling, track trends from the beginning)
  - Give your partners the collection requirements so they can collaborate
  - Don't rely on a single measure (e.g., not just polling)
  - There should be different measures for different timeframes: short, medium, and long
    - Short: single behavior events (e.g., vote, obey curfew)
    - Medium: trends in behavior (e.g., calling a reporting hotline)
    - Longer term: underlying attitudes (you must understand what attitudes underlie your objectives, and then what behaviors reflect those attitudes, in order to measure them)
  - Address both good and bad outcomes
  - You cannot measure attitudes directly (polling can help but is not entirely reliable)
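The "track trends from the beginning" advice above, together with the warning not to rely on a single measure, can be sketched as follows. The poll values, hotline counts, and the 50% goal are invented for illustration; the sketch tracks what fraction of the baseline-to-goal gap has been closed and cross-checks the attitude measure against a behavioral one:

```python
# Sketch of tracking progress toward a polling goal (the "50% positive
# polling" example) alongside a behavioral indicator. All data invented.

def progress_toward_goal(baseline, current, goal):
    """Fraction of the baseline-to-goal gap closed so far (0 = none, 1 = met)."""
    return (current - baseline) / (goal - baseline)

monthly_positive = [22.0, 25.0, 24.0, 29.0, 33.0]  # hypothetical % positive
baseline, goal = monthly_positive[0], 50.0
closed = progress_toward_goal(baseline, monthly_positive[-1], goal)

# Don't rely on a single measure: pair the attitude poll with a
# behavioral trend (e.g., reporting-hotline calls) before claiming an effect.
hotline_calls = [40, 55, 61, 70, 90]
both_improving = (monthly_positive[-1] > baseline
                  and hotline_calls[-1] > hotline_calls[0])
```

Requiring both series to move together is a crude stand-in for the slide's point that attitudes cannot be measured directly, so a polling trend alone should not drive the assessment.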

Slide 38: Analytic Design Issues
- Gap: need to fund longer-term studies on what kinds of observable behaviors reflect the attitudes we are likely to seek (e.g., what behaviors underlie acceptance of a "market democracy"?)
- Further issue: giving people something positive, something to say "yes" to, something that reflects their self-interests and values. This approach might be more effective (sponsored studies could determine this), and it is also more likely to provide the types of objectives that lend themselves to observable/measurable behaviors.

39  WG 4: Counterinsurgency Findings/Recommendations
• WG 4 Counterinsurgency Charter: Explore various analytical tools and methods for use in planning and conducting counterinsurgency
• Findings/Recommendations:
  – COIN analytical techniques applicable to general purpose forces (GPF) are equally applicable to special operations forces (SOF)
  – Strengths and weaknesses of COIN analytical support for GPF are the same for SOF
  – Tools include: deployed analysts; Human-in-the-Loop (HITL) computer-supported wargaming; COIN M&S is not mature (caution on the ability to model human behavior)
  – Recommend USSOCOM develop a structure to provide analytical support to COIN forces
  – Recommend USSOCOM consider interdisciplinary teams
  – Recommend SOF training/education/familiarization with the benefits of analytical support

40  Present – COIN Execution
• COIN analytical techniques applicable to general purpose forces (GPF) are equally applicable to special operations forces (SOF)
  – IED analysis, polling, social network analysis, ISR network analysis, assessment analysis, trend analysis, criminal activity profiling, RIO analysis, etc.
• Strengths and weaknesses of COIN analytical support for GPF are the same for SOF
  – Assessing influence on the population
• Problem area – at times, data are obtained from untrained host-nation collectors

41  Present – COIN Planning
• Human-in-the-Loop (HITL) computer-supported wargaming
  – An adequate way to provide insights now
  – Federations of specialized simulations
  – Wargame integration toolkits
  – Must use caution; not mature enough for some contexts
• Models and simulation
  – Warm and fuzzy – not!
  – Emerging, but still in its infancy

42  Present – COIN Planning
• Substantial efforts ongoing
  – M&S as well as non-M&S
  – Data are problematic across the board
• Context specificity
  – Strategic, operational, tactical
    » Difficult to separate analytical implications between levels of war (the "Strategic Corporal")
    » Realize the need for a conceptual framework for understanding and integrating causality across all levels of analysis
    » Iterative process/dialogue

43  Future – COIN Execution
• Recommend USSOCOM develop a structure to provide analytical support to COIN forces
  – Established during planning – every operation is different
  – Diverse operating environments – varying footprints
  – Reachback analytical support
  – Support through GPF – when GPF are available
• Recommend SOF training/education/familiarization with the benefits of analytical support

44  Future – COIN Planning
• Recommend USSOCOM consider interdisciplinary teams
  – Centralized
  – Decentralized
  – Hybrid
• Recommend USSOCOM look into a conceptual analytical framework to provide analytical support to USSOCOM COIN planning
  – Mr. Miller's trinity (crime, migration, extremism)
    » Left of boom
    » Forecast the next hot spot
    » Correlation, not causality

45  WG 5: Thinking Models Findings/Recommendations
• WG 5 Charter: Frame the context of the IW problem properly, break down IW operations into their natural components, and investigate the subject through discourse and the application of systems thinking.
• Findings:
  – Many ways to see/represent IW – different languages/logics
  – Lack of common terms/understanding about IW
  – IW analysis at the strategic/operational/tactical levels may require different cognitive models/techniques/representations
  – Modeling is difficult – must learn to think differently
  – Focus on uncovering indirect opportunities
  – Need tools to improve research capabilities that enhance thought and shared understanding
  – Need decision makers to shape/provide guidance:
    » Frame the problem
    » Visualization – make the whiteboard a "group thinking pad"
    » Acquire a depth of understanding
  – The Operational Design process:
    » Requires continuous learning
    » Provides insight, not answers
    » Expect some risks
    » Identifies what we know and don't know about the problem

46  Gaps
• There is a gap between our analytical capability and our commanders' operational needs
• The repository of the IW "body of knowledge" has not been clearly identified (IW online library)
• There is a relational, supportive, and authority gap between the military and "the interagencies" on IW
• We do not understand interagency lines of communication
• We do not understand how to balance government capacity for "restoration of services," security, and economic development
• We do not know the modeling requirements for IW analysis
• Many do not know about IW community hubs, potential data sources, or samples of IW activities available through Joint Data Support

47  Key Issues & Discussion Items
• Our current metrics don't capture the qualitative aspects of conflict that commanders need
• We have voids in our data and very little cause-and-effect data (e.g., temporal effects require years/decades of observations)
• There is no "owner" of a common lexicon
• We lack sufficient analysts/SMEs with DIMEFIL (Diplomatic, Informational, Military, Economic, Financial, Intelligence, Law Enforcement) experience
• Identifying the differences between "indicators" and "effects," and understanding that some effects are not quantifiable (e.g., measuring persuasion and influence)
• We have not retained our history of IW; how do we bring it back? We need to leverage that operational experience and those earlier insights
• There are different levels of IW that require very different tools

48  Recommendations
• Identify, create, and sustain credible IW data
  – It will require iteration to decide on the data needed and to characterize it (e.g., metadata, pedigree)
• Develop a lexicon of key terms
  – Current definitions are not acceptable to the interagency or coalition partners
• Continue the dialogue on models, methods, and tools (MMTs) to support IW analyses
  – This workshop represents a significant step forward
  – More dialogue is needed, with whole-of-government participation
• MORS provide a forum to help organize the needed information
  – Create a common template to compare and contrast key IW models and tools
  – Continue to support efforts to identify key gaps and priorities to guide future actions
• MORS and sponsors assist in bringing the various IW Communities of Interest (COIs) together, e.g.:
  – IW Working Group
  – Human, Social, Cultural Behavior (HSCB) modeling
  – MORS Social Science Community of Practice (COP)
• Support Service initiatives to put operations research analysts on SOF operational staffs
• Invite more allies and the interagency to these meetings

49  Questions?
