Forecasting decisions in conflicts: Best methods for supply chain, competition, union-management, and takeover strategy problems
Kesten C. Green
Ehrenberg-Bass Institute for Marketing Science, International Graduate School of Business, University of South Australia
kestencgreen.com | ForPrin.com | AdPrin.com
Ehrenberg Centre for Research in Marketing, London South Bank University, Monday 15th November 2010 at 5:00 PM
Scientific* forecasting methods Procedures for making predictions about matters currently unknown, based on: • empirical comparisons of proper alternatives • ex ante tests of accuracy given stated conditions (e.g., level of knowledge about the situation) *Used interchangeably with “evidence-based”
Evidence-based methods Knowledge advances when multiple hypotheses are tested, especially if the hypotheses challenge accepted wisdom, e.g.: • Market-share objectives harm profits. • Minimum-wage laws harm low-skilled workers. • Regulation harms consumers. • Pre-announced satisfaction surveys harm satisfaction. • Anti-inflammatory drugs harm head injury patients.
Principles for the selection and application of forecasting methods Mid-1990s: the Wharton School’s Scott Armstrong started the “principles of forecasting project” to summarize all knowledge about forecasting in the form of scientific principles. A principle is a condition-action statement. The project led to 139 principles, described in the Principles of Forecasting handbook (39 authors & 123 reviewers). The principles (currently 140 in number) can guide the selection and application of methods.
ForPrin.com features Descriptions of 140 forecasting principles The Forecasting Canon with 9 key rules Answers to Frequently Asked Questions The Forecasting Dictionary Forecasting Methodology & Selection Trees Forecasting Audit Software Resources for practitioners, educators, researchers Special Interest Groups (SIGs) E.g. ConflictForecasting.com
Why not just ask an expert what will happen? Most decisions in business are based on managers’, or other experts’, judgments about what will happen, but… Tetlock (2005) evaluated • 82,361 forecasts • made over 20 years • by 284 professional commentators and advisors on politics and economics. Expertise did not lead to better forecasts… (but experts’ excuses are better than novices’!) Tetlock’s finding is consistent with other findings from research on forecasting by unaided experts.
Today’s issue: “predicting decisions in conflicts” Predicting decisions of • parties with • divergent interests, • who interact Note the focus here is on predicting decisions, not outcomes.
Examples of conflict situations • What reactions to a first strike in Iran? • What offer to make in a union/management negotiation? • How to respond to demands by mob protesters? • How to best resolve legal conflicts? • How will proposed water sharing regimes work? • How to design policy for… • benefits? • tax? • job security? • market regulation? …all require predictions of how people will respond.
Water Dispute In Spring 1975, Syria is filling a new dam on the Euphrates River, thereby slowing the flow of water into Iraq. Syria and Iraq mass troops on their border, both threatening invasion. Saudi Arabia offers mediation.
Telco takeover bid CenturyTel approaches AllTel with an offer to sell its mobile business. AllTel rejects the offer and instead offers to pay a 50%+ premium for the whole business. CenturyTel rejects the counter-offer, and AllTel pursues a takeover bid.
Methods for forecasting decisions in conflicts Methods compared: guessing, unaided judgment, role thinking, game theory, structured analogies, simulated interaction. Baseline — guessing: 28% correct for novices and 28% for experts.
Unaided forecasts from experts on… • conflict management • political science • industrial relations • marketing • judgement & decision making • forecasting • game theory
Unaided judgment method The most commonly used method for predicting decisions in conflict situations. Appropriate when: • experts are unbiased • large changes are unlikely • relationships are known • experts get useful feedback from many similar cases Expert and novice subjects read descriptions and made predictions.
Unaided judgment accuracy [Table: percent accurate forecasts by chance, unaided judgment by novices (UJ-Naive), and unaided judgment by experts (UJ-Experts) for eight situations — Artists Protest, Distribution Channel, Telco Takeover, 55% Pay Plan, Zenith Investment, Personal Grievance, Water Dispute, Nurses Dispute — with the more accurate figures in bold. Unweighted averages: Chance 28, UJ-Naive 29, UJ-Experts 32.]
Effect of experience & time spent on accuracy (percent correct forecasts) Experience: <5 yrs 36; 5 yrs+ 29. Time spent: <30 min 32; 30 min+ 35.
Do game theorists recommend game theory for forecasting? Yes, judging from • textbooks • papers • consultants’ advertisements. Google searches for “game theory” & “prediction” or “forecasting”: 2,170,000 results as of 15 November 2010; adding “conflicts” to the search yielded 789,000.
Game theorists’ forecasts Game theorists made predictions in response to a request headed “Using game theory to predict the outcomes of conflicts”. Hundreds were invited; 23 participated. Respondents selected only some of the situations, yielding 101 forecasts.
Game theorists’ accuracy [Chart: percent accurate forecasts, unaided judgment (UJ) vs game theorists (GT)]
Use of analogies in forecasting Analogical reasoning is commonly used (informally) for prediction: 58% of respondents said their organizations used analogies to forecast competitor actions (Armstrong, Brodie & McIntyre 1987). See also Neustadt & May (1986), Thinking in Time: The Uses of History for Decision Makers, and the example in Kahneman & Lovallo (1993).
Structured Analogies Domain experts individually: • list similar situations • rate similarity to target situation • match outcome to target situation An administrator mechanically derives forecast (e.g., select outcome of the most similar situation as forecast). Little prior evidence on structured analogies
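The administrator's "mechanical" step above can be sketched in code. This is a minimal illustration, not the authors' actual procedure: the data, field layout, and tie-breaking rule are assumptions. It takes each expert's most similar analogy and uses the modal implied outcome as the forecast.

```python
# Sketch of mechanically deriving a structured-analogies forecast.
# Each expert supplies (similarity_rating, implied_outcome) pairs;
# the administrator takes the outcome of each expert's most similar
# analogy, then the modal outcome across experts.
# (Illustrative data and names are assumptions, not from the talk.)
from collections import Counter

def forecast_from_analogies(experts):
    """experts: list of per-expert lists of (similarity, outcome) tuples."""
    top_outcomes = [
        max(analogies)[1]   # outcome of the most similar analogy
        for analogies in experts
        if analogies        # skip experts who offered no analogies
    ]
    # The modal outcome across experts is the mechanical forecast
    return Counter(top_outcomes).most_common(1)[0][0]

# Hypothetical example: three experts rate analogies for a water dispute
experts = [
    [(8, "mediated settlement"), (5, "armed conflict")],
    [(7, "mediated settlement"), (6, "stalemate")],
    [(9, "stalemate"), (4, "mediated settlement")],
]
print(forecast_from_analogies(experts))  # prints: mediated settlement
```

A simpler variant, also consistent with the slide, selects the single most similar analogy across all experts rather than taking a vote.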
Structured analogies accuracy [Chart: percent accurate forecasts, unaided judgment (UJ) vs structured analogies with two or more analogies (SA2+)]
Use of structured analogies (percent correct) Unaided experts’ judgment: 32. Structured analogies with experts’ predictions: 42. Structured analogies with mechanical predictions: 46. Structured analogies with experts who “knew better”: 25.
Number of analogies and familiarity (percent correct, with number of forecasts in parentheses) Number of analogies: one 38 (53); two or more 56 (44). Familiarity with analogies: indirect 37 (45); direct 49 (50); direct + two or more 60 (23). Conclusion: use experts who have direct experience with analogous conflicts.
Collaborating for analogies Collaborators were more experienced and spent more time than solo forecasters. Percent correct (number of forecasts): solo 44 (75); collaborative 42 (22). Conclusions: • Collaboration is not effective when using analogies. • But there is little harm if people want to discuss analogies.
Role thinking process In the film The Fog of War, Robert McNamara concluded that one should put oneself in the shoes of an opponent. Subjects received information about the situations along with information about the roles, and were asked to think about the roles before making a prediction.
Role thinking accuracy [Chart: percent accurate forecasts for role thinking by experts, unaided judgment by experts, and role thinking by novices]
Simulated interaction (SI) process Brief role description, name badge (2+ roles) One-page situation description Simulate interactions (i.e., role play realistic interactions) Interactions take less than an hour Outcome (decision) of the simulation used as a forecast
Simulated interaction accuracy (naïve subjects) [Chart: percent correct predictions, unaided judgment (UJ) vs simulated interaction with naïve subjects (SI (N))]
Summary of accuracy [Chart: percent correct predictions for experts’ unaided judgment (UJ (E)), game theorists (GT), structured analogies with two or more analogies (SA2+), and simulated interaction with naïve subjects (SI (N))]
Combine forecasts “mechanically”: avoid face-to-face meetings • A meta-analysis of 30 studies found a 12% error reduction (range 3% to 24%), with combinations always more accurate than the typical individual forecast. • Under favourable conditions (election forecasts)*, error reductions averaged between 42% and 50% compared to typical individual forecasts. • Each component must contain some information. *Several valid forecasting methods using different information sources, allowing combining within and between methods. Reference: Graefe et al. (2010).
The logic of combining How bad can a second forecast (F2) be and still give an average that is no worse than F1? ------------------------------A--------F1----
The logic of combining (2) How bad can a second forecast (F2) be and still give an average that is no worse than F1? Answer: F2’s error can be the same size as F1’s, or up to three times bigger if it is of the opposite sign. ------- F2 -----------------------A--------F1----
Accuracy of combined forecasts (across 8 situations) Percent correct, individual then combined, with approx. % error reduction in parentheses: Game theorists: 31, 38 (10). Structured analogies (SA): 46, 63 (31). Simulated interaction (SI): 62, 88 (68). SA & SI: (63 + 88)/2 = 75.5, 88 (51).
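The error-reduction figures follow directly from the percent-correct figures: the error rate is 100 minus percent correct, and the reduction is the proportional drop in error rate from individual to combined forecasts. A small sketch (the helper function name is mine) reproduces the column:

```python
# Reproduce the "% error reduction" column from percent-correct figures:
# error rate = 100 - percent correct; reduction = proportional drop in
# error rate from individual to combined forecasts.
def error_reduction(individual_correct, combined_correct):
    ind_err = 100 - individual_correct
    comb_err = 100 - combined_correct
    return 100 * (ind_err - comb_err) / ind_err

print(round(error_reduction(31, 38)))    # game theorists        -> 10
print(round(error_reduction(46, 63)))    # structured analogies  -> 31
print(round(error_reduction(62, 88)))    # simulated interaction -> 68
print(round(error_reduction(75.5, 88)))  # SA & SI               -> 51
```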
Conclusions • Unaided experts’ judgments little better than those of college students or guessing. • Forecasts by game theorists no better than unaided experts’ judgments. • Structured analogies method provides substantial improvements in accuracy. • Forecasts from simulated interaction were most accurate and the method most flexible. • Combining further improves accuracy.
Possible applications of SA and SI for business and government Forecast responses to alternative strategies in conflict situations: • Labour-management disputes • Competitor, supplier, and distributor behavior • Customer reactions to major changes in product, price or service (e.g., “New Coke”) • Behavior in response to new laws or regulations • Diplomatic and national security problems
Summary • Do not rely on experts’ unaided judgmental forecasts • Do require forecasts: • That are scientific (i.e. from evidence-based methods – see ForPrin.com) • For alternative policies or strategies • Of all effects • Of all costs and benefits
References Armstrong, J. S. (2001). Principles of Forecasting: A Handbook for Researchers and Practitioners. Norwell, MA: Kluwer. Graefe, A., Armstrong, J. S., Jones, R. J. & Cuzán, A. G. (2010). Combining forecasts: An application to U.S. Presidential Elections. Working paper. Available at http://dl.dropbox.com/u/3662406/Articles/Graefe_et_al_Combining.pdf Green, K. C. (2002). Forecasting decisions in conflict situations: a comparison of game theory, role-playing, and unaided judgement. International Journal of Forecasting, 18, 321-344. Available at http://www.forecastingprinciples.com/paperpdf/Greenforecastinginconflict.pdf Green, K. C. (2005). Game theory, simulated interaction, and unaided judgement for forecasting decisions in conflicts: Further evidence. International Journal of Forecasting, 21, 463-472. Available at http://www.kestencgreen.com/gt_update_in_IJF21.pdf Green, K. C. & Armstrong, J. S. (2007). The value of expertise for forecasting decisions in conflicts. Interfaces, 37, 287-299. Available at http://kestencgreen.com/green&armstrong2007-expertise.pdf Green, K. C. & Armstrong, J. S. (2007). Structured analogies for forecasting. International Journal of Forecasting, 23, 365-376. Available at http://www.forecastingprinciples.com/files/pdf/INTFOR3581_Publication15.pdf Green, K. C. & Armstrong, J. S. (2011). Role thinking: Standing in other people’s shoes to forecast decisions in conflicts. International Journal of Forecasting, 27, 69-80. Available at http://kestencgreen.com/group_shoes-2009.pdf Green, K. C., Graefe, A. & Armstrong, J. S. (2010). Forecasting principles. In Lovric, M. (ed.), International Encyclopedia of Statistical Science. Springer [In press]. Tetlock, P. E. (2005). Expert Political Judgment: How Good Is It? How Can We Know? Princeton, NJ: Princeton University Press.