Measuring Operational Effects in Afghanistan


Presentation Transcript


  1. Measuring Operational Effects in Afghanistan Dr Sam Duff (HQ RC(S) JEAC OA, Oct 2009 – May 2010) August 2010 DSTL/CP49589

  2. Presentation Outline • Introduction to RC South • Assessment System Development • Top Down Approach • Bottom Up Approach • Meeting in the Middle • How it worked • Why it worked • Analytical Process (?) • Outcomes / Impact • Questions

  3. Introduction to RC South

  4. Introduction to RC South [Organisation chart: HQ ISAF → HQ IJC → Regional Commands North, West, South, East and Centre; within RC South: Task Force Uruzgan, Task Force Helmand, Task Force Zabul and Task Force Kandahar, each with its PRT, alongside other Task Forces, USAID, the Afghan National Army and the Afghan National Police]

  5. OA within HQ RC(S) [Organisation chart: COM RC(S) (2*), supported by COS (1*) and DCOM (1*); directorates for Future Plans, Ops, Support and Stabilisation (all 1*); staff branches including CJ1, CJ2 Int, CJ3 Ops, CJ3/5, CJ4 Logs, CJ4 Med, CJ5 Plans, CJ7 Training, CJ8 Contracts, CJ9, CJOC, CIMIC, LEGAD, POLAD, CULAD, ANSF Dev, SF, FP, PsyOps, Strat Com, PRISM, SOIC, KIFC, KFC and the JEAC]

  6. OA within HQ RC(S) JEAC • Role: • To provide the RC(S) Commander, his Command Team and Headquarters with analytical advice to assist evidence-based decision making • To undertake any ad-hoc task or pre-emptive analysis to meet the requirements of the HQ • To provide the Commander with an understanding of the Campaign and the Effects it is achieving in his area of responsibility (CEA) • To serve as a conduit between Dstl UK and UK operational staff, satisfying information requirements from each

  7. What is Campaign Effects Assessment (CEA)? UK doctrine states that CEA is: • “Evaluation of campaign progress, based on levels of subjective and objective measurement in order to inform decision-making”, JDP 3-00, Chapter 4. • “Analysis conducted at the strategic, operational and tactical level to monitor and assess the cumulative effects of military actions with respect to centres of gravity in order to achieve the overall campaign end state”, JDP 5-00, Paragraph 267.

  8. Campaign Effects Assessment (CEA) ctd. • The purpose of the assessment process is to support the commander’s decision-making process. • The assessment cycle provides advice directly to the commander and recommends adjustments to the plan, as well as facilitating a closer working relationship between the planning staff and the execution staff. • Key items necessary for this advice include: • an overview of the system state • an overview of operational activity • issues related to system state and operational activity • trends and forecasting

  9. Assessment System Development • Requirement: • The existing HQ approach measured progress at the provincial level and relied upon widely available polling information • New focus on the population and the ‘softer’ side of the Campaign • The multiple-stakeholder environment needed to be considered • COM RC(S), a 2* General, wanted to revisit how the effects he wanted to achieve were actually being achieved

  10. Assessment System Development – Top-Down Approach • Needed to cover: • ISAF Lines of Operation • HQ RC(S) Decisive Conditions • HQ RC(S) Supporting Effects • Involvement of military planners [Diagram: hierarchy from ISAF LOOs down through RC(S) DCs to RC(S) SEs]

  11. Assessment System Development – Bottom Up Approach • Indicators of Society • Cover military and non-military areas • Governance • Economic activity • Insurgent activity • Observed behaviours as well as polled perceptions • Collectable on a regular basis • Single variable change per indicator • 5-point step-change scale with a clear descriptor for each point on the scale • Stakeholder inclusion [Diagram: indicators fed by TFs, PRTs and USAID]
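The indicators above are scored on a 5-point step-change scale with a clear descriptor for each point. A minimal sketch of how one such indicator might be represented and scored follows; the indicator name, category and descriptors are invented for illustration and are not drawn from the actual RC(S) indicator set.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One bottom-up indicator, scored on a 5-point step-change scale."""
    name: str          # illustrative only, e.g. "Bazaar activity"
    category: str      # e.g. "Economic activity", "Governance", "Insurgent activity"
    source: str        # stakeholder supplying the observation: "TF", "PRT", "USAID"
    descriptors: tuple # one clear descriptor per scale point (1..5)

    def score(self, observation: str) -> int:
        """Return the 1-5 scale point whose descriptor matches the observation."""
        return self.descriptors.index(observation) + 1

# Illustrative example -- descriptors invented for this sketch
bazaar = Indicator(
    name="Bazaar activity",
    category="Economic activity",
    source="PRT",
    descriptors=(
        "Bazaar closed",
        "Bazaar open intermittently",
        "Bazaar open most days, limited goods",
        "Bazaar open daily, normal range of goods",
        "Bazaar expanding, new stalls opening",
    ),
)

print(bazaar.score("Bazaar open daily, normal range of goods"))  # -> 4
```

Tying each scale point to a written descriptor is what makes the "single variable change per indicator" requirement auditable: the scorer matches an observation to a descriptor rather than assigning a number directly.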

  12. Assessment System Development – Meeting in the Middle • Taking indicators and aligning them with the right SEs in the right combination to give the MOE for each SE • Had to involve military planners • Had to involve stakeholders • Needed high-level endorsement [Diagram: indicators from TFs, PRTs and USAID linked through RC(S) MOEs to RC(S) SEs, RC(S) DCs and ISAF LOOs]
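Aligning indicators with SEs "in the right combination" amounts to a mapping from each SE to its agreed indicators plus a combination rule that yields the MOE. A minimal sketch follows, assuming a simple unweighted mean; the SE title and indicator names are invented for illustration, and the actual combination rule agreed with planners and stakeholders is not given in the slides.

```python
# Hypothetical alignment of indicators to one Supporting Effect (names invented)
SE_MAPPING = {
    "SE 1.1 Population perceives improved security": [
        "Polled perception of security",
        "Freedom of movement on main routes",
        "Night-letter incidence",
    ],
}

def moe_score(se: str, indicator_scores: dict) -> float:
    """Combine the 1-5 scores of the indicators aligned to one SE into its MOE.

    Assumes a simple unweighted mean, purely for illustration.
    """
    aligned = SE_MAPPING[se]
    return sum(indicator_scores[name] for name in aligned) / len(aligned)

scores = {
    "Polled perception of security": 3,
    "Freedom of movement on main routes": 2,
    "Night-letter incidence": 4,
}
print(moe_score("SE 1.1 Population perceives improved security", scores))  # -> 3.0
```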

  13. How it Worked [Process flow, each reporting cycle:] • Indicator matrices compiled for each of the 24 districts • Indicators reviewed by TFs and PRTs; issues elicited • Returns consolidated, reviewed and verified • MOE, SE and DC scores calculated • Changes tracked and researched; issues elicited • Correlation analysis and geospatial analysis of indicators • Presentation prepared and reviewed by Branch Reps and Branch Chiefs • 1* review • Revised presentation prepared and reviewed • 2* review and down-selection • Presentation given to Comd Group • Actions taken by Comd Group • Analysts’ process review and lessons identified for the next iteration
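Within that flow, the "MOE, SE and DC scores calculated" step rolls the district indicator matrices up the campaign hierarchy. A minimal sketch follows, assuming each level is summarised by averaging the level below; the district names, DC/SE labels and values are invented for illustration and the real aggregation rule is not reproduced in the slides.

```python
from statistics import mean

# Hypothetical per-district MOE scores grouped by DC and SE (values invented)
district_moes = {
    "District A": {"DC 1": {"SE 1.1": 3.0, "SE 1.2": 2.5},
                   "DC 2": {"SE 2.1": 4.0}},
    "District B": {"DC 1": {"SE 1.1": 2.0, "SE 1.2": 3.5},
                   "DC 2": {"SE 2.1": 3.0}},
}

def dc_scores(per_district: dict) -> dict:
    """Roll SE scores up to DC scores per district, then average across districts."""
    rolled = {}
    for district, dcs in per_district.items():
        for dc, ses in dcs.items():
            rolled.setdefault(dc, []).append(mean(ses.values()))
    return {dc: round(mean(vals), 2) for dc, vals in rolled.items()}

print(dc_scores(district_moes))  # -> {'DC 1': 2.75, 'DC 2': 3.5}
```

Whatever the exact rule, keeping the roll-up mechanical is what lets changes at DC level be traced back to the specific district indicators that moved.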

  14. Why it Worked • Understood Commander’s intent and pressures • Worked with the military • Achieved stakeholder buy-in from all communities • Was endorsed by COM RC(S) throughout • RC(S)-wide agreement and recognition of the need for a structured mechanism to measure progress

  15. Analytical Process (?) • Validation of methodology for information capture • Verification of score allocation • Multi-point V&V of information collated, issues elicited • Rigorous means of qualifying/justifying scores allocated • TF / PRT / USAID Review • Desk-Level Review • Branch Chief Review • 1* Review • Correlation of indicators (possible relationships or focus on outliers) • Geographical analysis of issue clusters • Further research into ‘new’ issues • Auditable process • Rendered Commanders accountable for enabling and assisting progress in their Areas of Responsibility (AORs)
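For the "Correlation of indicators" step, a minimal sketch of pairwise Pearson correlation across reporting periods, of the kind that could surface possible relationships or flag outliers; the indicator names and time series are invented for illustration.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical indicator time series across successive reporting periods (values invented)
series = {
    "Polled perception of security": [2, 2, 3, 3, 4, 4],
    "Bazaar activity":               [2, 3, 3, 4, 4, 5],
    "IED finds":                     [4, 4, 3, 3, 2, 2],
}

# Pairwise correlation between indicators: strong values may suggest a
# relationship worth researching; weak or reversed ones may highlight outliers.
names = list(series)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r = correlation(series[a], series[b])
        print(f"{a} vs {b}: r = {r:.2f}")
```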

  16. Outcomes / Impact • A Regional Level view of the situation on the ground • A means of communication from ground to RC level • An opportunity for Command level ‘Campaign’ steering • Means of holding TF Commanders accountable for delivering progress in their AORs • So far it has altered: • The positioning of partnered forces • Allocation of assets to our own forces • The allocation of resources for development funding • The transparency of GIRoA finances in some areas • Facilitation of the extension of GIRoA governance processes

  17. Thank you for listening Any questions?
