
Selecting Better Portfolios Based on Uncertainty-Adjusted Performance Estimates

This research explores the challenge of portfolio selection in the face of uncertain performance estimates, providing insights on how to account for and minimize the optimizer's curse. Case studies and methodologies are presented to guide decision-makers in selecting better portfolios.


Presentation Transcript


  1. Selecting Better Portfolios Based on Uncertainty-Adjusted Performance Estimates Ahti Salo, Juuso Liesiö and Eeva Vilkkumaa, Department of Mathematics and Systems Analysis, Aalto University School of Science and Technology, P.O. Box 11000, 00076 Aalto, FINLAND

  2. Characteristics of project portfolio selection • Large number of proposals • Typically dozens or even hundreds of proposals • Only a fraction can be selected with the available resources • Resources other than money may also matter (e.g., critical competences) • “Value” may be measured with regard to several criteria • International collaboration, innovativeness, feasibility of plans • Reliable information about value is hard to obtain • Different experts may give different ratings • The accuracy of evaluation information may vary from one project to the next

  3. Logic behind the optimizer’s curse • Projects offer different amounts of value (e.g., NPV) • Estimates of the projects’ values are inherently uncertain • Yet decisions must be based on these uncertain estimates • In reality, projects whose values have been overestimated have a higher chance of being selected • Thus the decision maker should expect to be disappointed with the performance of the selected portfolio
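
This selection bias can be illustrated with a short simulation. The sketch below is not from the presentation; the project counts and error variances are illustrative. It draws unbiased but noisy value estimates, selects the highest-scoring projects, and shows that the estimation errors of the selected projects are positive on average, while those of the rejected projects are negative.

```python
# Illustrative sketch (assumed parameters, not from the slides):
# overestimated projects are more likely to be selected.
import numpy as np

rng = np.random.default_rng(0)
n, k, runs = 12, 5, 20_000
err_selected, err_rejected = [], []
for _ in range(runs):
    mu = rng.normal(0.0, 1.0, n)        # true project values (e.g., NPV)
    eps = rng.normal(0.0, 1.0, n)       # estimation errors, zero mean
    v = mu + eps                        # unbiased but noisy estimates
    sel = np.argsort(v)[-k:]            # select the k highest estimates
    mask = np.zeros(n, dtype=bool)
    mask[sel] = True
    err_selected.append(eps[mask].mean())
    err_rejected.append(eps[~mask].mean())

print(f"mean estimation error among selected projects: {np.mean(err_selected):+.2f}")
print(f"mean estimation error among rejected projects: {np.mean(err_rejected):+.2f}")
```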

  4. Example – choose 5 out of 12 projects

  5. Value of information and optimality in DA • Optimizer’s curse: skepticism and postdecision surprise in decision analysis (Smith and Winkler, 2006) • Choose one out of many alternatives • Normally distributed values and errors • Positively correlated errors aggravate the curse • Value of information in project portfolio selection (Keisler, 2004) • For some selection rules, the selected portfolio has much higher value than for other selection rules • It pays off to devote attention to the design of the selection process

  6. http://www.finnsight2015.fi/

  7. Emphasis in the Priority-Setting Process Salo, A. & J. Liesiö. A Case Study in Participatory Priority-Setting for a Scandinavian Research Programme, International Journal of Information Technology and Decision Making 5/1 (2006) 65-88.

  8. Relevance to funding agencies • High expectations may not necessarily be met • E.g., biotechnological research in Finland has not led to the emergence of a strong industrial sector • Management questions: Should projects with higher evaluation uncertainties be selected together with those with lower evaluation uncertainties? Should the level of uncertainty be explicitly accounted for in project selection decisions?

  9. What if evaluation uncertainties vary across projects? • Projects whose value has been overestimated are more likely to be selected • When the competition is strong, it is likely that more selections will be made from projects with high evaluation errors, so that these projects become overrepresented • Thus, one should pay attention not only to the estimates but also to the uncertainty of the estimates • How can such uncertainties be systematically accounted for?

  10. Example – choose 5 projects out of 12 [Figure: estimate vs. value for each project; legend distinguishes projects with low evaluation uncertainties from projects with high evaluation uncertainties]

  11. Selection process • Select k out of n projects with the aim of maximizing the sum of the projects’ ‘true’ values μi, i = 1,...,n • The values μi are generally unknown • Decisions are made based on estimates Vi about μi [Figure: timeline from estimates to portfolio selection to realized values]
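
As a worked restatement of this slide, the selection problem can be written as a small 0-1 program; the decision variables z_i are introduced here for illustration and do not appear on the slides.

```latex
% Selection of k out of n projects: the optimization is carried out on the
% estimates V_i, while the realized value depends on the unknown true values \mu_i.
\[
  \max_{z \in \{0,1\}^n} \; \sum_{i=1}^{n} V_i z_i
  \qquad \text{s.t.} \qquad \sum_{i=1}^{n} z_i = k ,
\]
\[
  \text{whereas the realized portfolio value is } \sum_{i=1}^{n} \mu_i z_i .
\]
```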

  12. Optimizer’s curse in project portfolio selection • Assume that the estimates are unbiased • Overestimated projects are more likely to be selected, so the resulting portfolio value is less than what the estimates suggest (optimizer’s curse; cf. Smith and Winkler, 2006): E[Σi∈S µi − Σi∈S Vi] ≤ 0, where S is the index set of the selected projects.

  13. Optimizer’s curse • Choose 10 projects out of 100 • True values i.i.d. with µi ~ N(0,1) • Unbiased estimates Vi = µi + εi, εi ~ N(0, σ²)
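
A minimal Monte Carlo sketch of this setting (the run count and the grid of σ values are assumptions made here) estimates the gap between the estimated and the realized portfolio value as the estimation error grows.

```python
# Sketch of the slide's setting: choose 10 of 100 projects with
# mu_i ~ N(0,1) and V_i = mu_i + eps_i, eps_i ~ N(0, sigma^2).
import numpy as np

def expected_disappointment(sigma, n=100, k=10, runs=5_000, seed=1):
    """Average difference between estimated and realized portfolio value."""
    rng = np.random.default_rng(seed)
    gap = []
    for _ in range(runs):
        mu = rng.normal(0.0, 1.0, n)            # true values
        v = mu + rng.normal(0.0, sigma, n)      # unbiased estimates
        sel = np.argsort(v)[-k:]                # top-k by estimate
        gap.append(v[sel].sum() - mu[sel].sum())
    return float(np.mean(gap))

for sigma in (0.0, 0.5, 1.0, 2.0):
    print(f"sigma = {sigma:.1f}: estimated minus realized portfolio value "
          f"= {expected_disappointment(sigma):.2f}")
```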

  14. Optimal revision of the estimates • The estimates Vi do not account for the uncertainties • Use Bayesian revised estimates E[µi | Vi] instead as a basis for project selection • For normally distributed values µi ~ N(mi, τi²) and errors εi ~ N(0, σi²), the estimate Vi and the prior mean mi are weighted according to their respective levels of uncertainty: E[µi | Vi] = αi·Vi + (1 − αi)·mi, where αi = τi² / (τi² + σi²)
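
A small numerical sketch of this revision rule, assuming the standard normal-normal model with prior µi ~ N(mi, τi²) and error εi ~ N(0, σi²); the function name and example values are illustrative.

```python
def revised_estimate(v, m, tau, sigma):
    """Posterior mean E[mu | V = v] for a N(m, tau^2) prior and N(0, sigma^2) error."""
    alpha = tau**2 / (tau**2 + sigma**2)   # weight on the observed estimate
    return alpha * v + (1.0 - alpha) * m

# A noisier estimate (larger sigma) is shrunk more strongly towards the prior mean m.
print(revised_estimate(v=2.0, m=0.0, tau=1.0, sigma=0.5))   # 1.6
print(revised_estimate(v=2.0, m=0.0, tau=1.0, sigma=1.0))   # 1.0
```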

  15. Selections based on revised estimates [Figure: estimate, revised estimate E[µ|V], and value for each project; legend distinguishes projects with low evaluation uncertainties from projects with high evaluation uncertainties]

  16. Elimination of the optimizer’s curse • Previous example: choose 10 projects out of 100, true values i.i.d. with µi ~ N(0,1), unbiased estimates Vi = µi + εi, εi ~ N(0, σ²) • With revised estimates, the optimizer’s curse is eliminated, that is, E[Σi∈S* µi − Σi∈S* E[µi | Vi]] = 0, where S* is the index set of projects selected based on the revised estimates [Figure: portfolio value as a function of the standard deviation of the estimation error]
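
Continuing the sketch above under the same assumptions (prior mean 0, prior variance 1, common error variance σ²), the comparison below suggests how the disappointment disappears once the portfolio value is computed from the revised estimates.

```python
# Sketch: 10 of 100, mu ~ N(0,1), eps ~ N(0, sigma^2); compare the expected
# disappointment under raw and Bayes-revised estimates.
import numpy as np

rng = np.random.default_rng(2)
n, k, sigma, runs = 100, 10, 1.0, 5_000
alpha = 1.0 / (1.0 + sigma**2)            # shrinkage weight (prior mean 0, variance 1)
gap_raw, gap_rev = [], []
for _ in range(runs):
    mu = rng.normal(0.0, 1.0, n)
    v = mu + rng.normal(0.0, sigma, n)
    v_rev = alpha * v                     # revised estimates E[mu_i | V_i]
    sel = np.argsort(v_rev)[-k:]          # same selection as with v (monotone transform)
    gap_raw.append(v[sel].sum() - mu[sel].sum())
    gap_rev.append(v_rev[sel].sum() - mu[sel].sum())

print(f"expected disappointment, raw estimates:     {np.mean(gap_raw):.2f}")
print(f"expected disappointment, revised estimates: {np.mean(gap_rev):.2f}")
```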

  17. Revised estimates and portfolio composition • In this example, the projects’ values were identically distributed and the estimation errors had equal variances • In this case, the prioritization of projects remains unchanged when using revised estimates, because the revision applies the same increasing transformation αVi + (1 − α)m to every project and therefore preserves the ranking of the estimates • In general, revised estimates may result in a different project prioritization than the initial estimates

  18. Example on the revision of estimates • Choose 3 projects out of 8 • True values are i.i.d. with µi ~ N(0,1) • Left: all projects are equally difficult to evaluate, Vi = µi + εi, εi ~ N(0, 0.5²), i.e., equal error variances • Right: four projects are harder to evaluate and have error terms with higher variance (εi ~ N(0,1)), which leads to steeper correction slopes

  19. Revision of estimates and portfolio selections • Left: for equal variances, all estimates are revised towards the mean in the same way • Right: the more uncertain “dashed” estimates are revised towards the mean more strongly • Thus different portfolios of three projects would be selected, depending on whether or not the estimates are revised

  20. Portfolio for revised estimates • Example: select 10 out of 100 projects with values µi ~ N(3, 1²) • Two sub-populations of projects: 1) εi ~ N(0, 0.1), small errors; 2) εi ~ N(0, 1), large errors • Revised estimates tend to yield a higher portfolio value (see the sketch below) [Figure: value of the optimal portfolio and of the portfolios selected on revised and unrevised estimates]
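
A sketch of this two-population example; whether N(0, 0.1) and N(0, 1) denote variances or standard deviations is not stated on the slide, so the error levels are taken here as standard deviations 0.1 and 1.0, and the even split into sub-populations is an assumption.

```python
# Sketch: 100 projects with mu_i ~ N(3, 1), half with small and half with
# large estimation errors; select 10 on raw vs. Bayes-revised estimates.
import numpy as np

rng = np.random.default_rng(3)
n, k, runs = 100, 10, 5_000
m, tau = 3.0, 1.0
sigma = np.r_[np.full(n // 2, 0.1), np.full(n // 2, 1.0)]   # assumed std devs
alpha = tau**2 / (tau**2 + sigma**2)

val_raw, val_rev = [], []
for _ in range(runs):
    mu = rng.normal(m, tau, n)                 # true project values
    v = mu + rng.normal(0.0, sigma)            # project-specific estimation noise
    v_rev = alpha * v + (1.0 - alpha) * m      # Bayes-revised estimates
    val_raw.append(mu[np.argsort(v)[-k:]].sum())
    val_rev.append(mu[np.argsort(v_rev)[-k:]].sum())

print(f"realized portfolio value, unrevised estimates: {np.mean(val_raw):.2f}")
print(f"realized portfolio value, revised estimates:   {np.mean(val_rev):.2f}")
```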

  21. Share of correct choices • Using revised estimates increases the share of selected projects that belong to the optimal portfolio, i.e., E[|S* ∩ K|] ≥ E[|S ∩ K|], where K is the index set of the projects in the optimal portfolio • There is a statistically significant difference between the portfolios (α = 0.05) when the share of projects with high evaluation uncertainties is between 25% and 55% [Figure: share of correct choices [%] as a function of the share of projects with large error variance [%]]
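
For completeness, a tiny helper (hypothetical, not from the slides) that computes this share of correct choices |S ∩ K| / k for a given set of estimates.

```python
# Share of selected projects that belong to the optimal portfolio K.
import numpy as np

def share_of_correct_choices(mu, estimates, k):
    optimal = set(np.argsort(mu)[-k:].tolist())          # index set K
    selected = set(np.argsort(estimates)[-k:].tolist())  # selected portfolio
    return len(optimal & selected) / k
```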

  22. Conclusion • Selection based on unrevised estimates • Optimizer’s curse: The value of the portfolio will, on average, be lower than expected • If the proposals come from populations with different levels of estimation errors, the selected portfolio is likely to contain too many projects from the population with high uncertainties • Improving the selection process • Account for evaluation uncertainties by using revised estimates • Build separate portfolios for sub-populations with different levels of evaluation errors (e.g., a separate budget for ‘high-risk’ projects) • But do we know how uncertain the evaluations are?
