Presentation Transcript


  1. The Impact of eGovernment in Europe, Helsinki, 13 September 2006. “Bench-learning in government – reflections from eGEP experience and beyond”. Cristiano Codagnone, Professor, Milan State University; Consultant, RSO SPA. cristiano.codagnone@unimi.it / ccodagnone@rso.it

  2. Agenda • eGEP experience: why bench-learning • Bench-learning design & principles • Introduction: eGEP was an EU-funded study on eGovernment impact, running from 2005 to 2006 (see www.rso.it/egep)

  3. Irreconcilable positions? • A single figure to benchmark several countries is either: • Reality distilled into manageable form for policy consumption • Or apples compared with pears: context and processes overlooked, policy learning and transfer obliterated • Bench-learning is bottom-up benchmarking • A learning process that builds transformative capacity

  4. Hard facts, soft measurement? • €36.5 bln of ICT expenditure by public administrations in 2004 • ICT could increase public administration productivity and boost EU25 GDP by up to 1.54% (2005-2010) • Beyond supply-side benchmarking? • Yes, but with data, feasibility, and comparability challenges

  5. Private vs. public benchmarking • Private sector: • Unit of analysis: organisations • A voluntary and flexible tool, with good data availability • Public sector (national and international): • Units of analysis: organisations, public policies, policy systems • Cooperation, comparability, and feasibility challenges • From management tool to regulatory instrument

  6. Impact measurement difficulty

  Level                                  | Type                            | Cooperation | Comparability | Feasibility | Overall difficulty
  ---------------------------------------|---------------------------------|-------------|---------------|-------------|-------------------
  International (EU25)                   | Policy system benchmark         | 4           | 4             | 4           | HIGH
  Member State (holistic)                | Public policy benchmark         | 3           | 3             | 3           | MEDIUM-HIGH
  Member State (within vertical/region)  | Organisational benchmark        | 2           | 1             | 3           | MEDIUM
  Individual public agency (voluntary)   | Measurement/internal benchmark  | 1           | 0             | 2           | LOW

  Scores: 4 = High; 3 = Medium-high; 2 = Medium; 1 = Low; 0 = Null
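
The slide leaves implicit how the overall difficulty rating follows from the three component scores. A plain average reproduces all four rows; the sketch below (Python, not part of the eGEP material) makes that inferred rule explicit.

```python
# Minimal sketch (not from the eGEP deck): the table does not state how the
# overall rating is derived, but a plain average of the three component
# scores reproduces all four rows.

LABELS = {4: "HIGH", 3: "MEDIUM-HIGH", 2: "MEDIUM", 1: "LOW"}

def overall_difficulty(cooperation: int, comparability: int, feasibility: int) -> str:
    """Map three 0-4 component scores to an overall difficulty label."""
    return LABELS[round((cooperation + comparability + feasibility) / 3)]

# The four benchmark levels of the table:
assert overall_difficulty(4, 4, 4) == "HIGH"         # International (EU25)
assert overall_difficulty(3, 3, 3) == "MEDIUM-HIGH"  # Member State (holistic)
assert overall_difficulty(2, 1, 3) == "MEDIUM"       # Member State (vertical/region)
assert overall_difficulty(1, 0, 2) == "LOW"          # Individual public agency
```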

  7. Why bench-learning? • For benchmarking, only 10 eGEP indicators: • The simplest and most comparable ones • Bench-learning 1: • To focus on the most sophisticated impact indicators • To build measurement capacities bottom up • Bench-learning 2: • An opportunity to look at the complexity of processes • To identify enabling and hindering factors

  8. Bench-learning = bottom-up benchmarking • Bench-learning is: • Voluntary, bottom-up, and learning-oriented • Flexible, with no need for uniform, rigid indicators • Gradually scalable from micro to meso and macro levels: • Groups of similar organisations • Groups of similar verticals / regions • Groups of similar countries • Builds capacity for a portfolio of measurement tools: • Variable benchmarking for horizontal policy learning • Traditional EU25 benchmarking • Micro/meso-level measurement and benchmarking

  9. Bench-learning groups: start simple • Micro level only: single public organisations • 1 champion plus 3-4 learning organisations • Groups assembled from similar countries • Leverage existing collaboration networks • Two options: • Central agencies in the same vertical (holistic) • Local governments in one area (functional) • Third-party facilitators (EU contractors) • Intense and in-depth work

  10. Ownership and governance • Voluntary participation • Participants self-interested in capacity building and learning • Clear mandate and leadership buy-in • Groups not to be assembled by the facilitator • Multi-stakeholder, but with firm governance • Exchange and consensus • But with clear lines of accountability

  11. Four inspiring principles • Mainstream measurement • Know where you start from • Use multidimensional metrics • Don’t just measure – manage!

  12. Mainstream Measurement • Measurement mainstreamed into strategy/policy • Not managed in isolation by an IT department • Measurement targets derived from objectives • Not from technicalities • Responding to law and policy directives can itself be a measurement target • This requires qualitative metrics, possibly less comparable

  13. Know where you start from • Define a baseline for each metric, e.g.: • FTE cost of existing processes • Waiting times for constituents • No meaningful measurement without a baseline • Process and workflow analysis: a must • Needed to define the baseline for some metrics • Helps identify key success factors and bottlenecks
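
To illustrate the baseline principle, here is a minimal sketch with hypothetical metric names and figures (not eGEP data): record the as-is value of each metric before the project starts, then express every later reading as a change against that baseline.

```python
# Minimal sketch, hypothetical figures: no reading is meaningful on its own;
# it becomes meaningful as a change versus a baseline recorded up front.
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    unit: str
    baseline: float  # as-is value, measured before the project starts

    def change(self, current: float) -> float:
        """Relative change versus the baseline (negative = reduction)."""
        return (current - self.baseline) / self.baseline

# Baselines would come from the process/workflow analysis:
fte_cost = Metric("FTE cost of existing process", "EUR/year", 250_000)
waiting = Metric("Waiting time for constituents", "days", 14)

print(f"{fte_cost.name}: {fte_cost.change(200_000):+.0%}")  # -20%
print(f"{waiting.name}: {waiting.change(5):+.0%}")          # -64%
```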

  14. Use multi-dimensional metrics • One-dimensional financial metrics inadequate to fully capture the public value ICT can deliver • Match metrics to objectives: • Hard cash values • Opportunity values • Volume metrics • Qualitative metrics
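
A minimal sketch of what such a multi-dimensional portfolio could look like in practice; the objectives and metric names are hypothetical, while the four metric kinds are the slide's own.

```python
# Minimal sketch (hypothetical objectives and metrics): each objective is
# matched to metrics of more than one kind, not to a single financial figure.
from enum import Enum

class Kind(Enum):
    HARD_CASH = "hard cash value"
    OPPORTUNITY = "opportunity value"
    VOLUME = "volume metric"
    QUALITATIVE = "qualitative metric"

# objective -> list of (metric name, kind)
portfolio = {
    "Reduce administrative burden": [
        ("Staff cost saved per case", Kind.HARD_CASH),
        ("Citizen hours saved per year", Kind.OPPORTUNITY),
    ],
    "Increase service take-up": [
        ("Online transactions per month", Kind.VOLUME),
        ("User satisfaction score", Kind.QUALITATIVE),
    ],
}

for objective, metrics in portfolio.items():
    print(objective)
    for name, kind in metrics:
        print(f"  - {name} ({kind.value})")
```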

  15. Don’t just measure – manage! • Measurement not as one-shot exercise for investment decision • Use it through the lifecycle of the project • to keep costs down • to ensure that expected benefits are delivered • Measurement as pillar of strategic management • Not only for accountability but also for… • … strategic decision making and control, team motivation and risk management
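
One way to picture measurement used through the lifecycle rather than as a one-shot exercise: at each project phase, compare actual cumulative cost and realised benefit against the plan and flag drift early. All names and figures below are hypothetical.

```python
# Minimal sketch, hypothetical data: instead of a one-shot business case,
# review cost and benefit against plan at every phase of the project.

PLAN = {  # phase -> (planned cumulative cost, expected cumulative benefit), EUR
    "design":    (100_000, 0),
    "rollout":   (400_000, 150_000),
    "operation": (500_000, 600_000),
}

def review(phase: str, cost: float, benefit: float) -> None:
    """Flag the phase if costs overrun or expected benefits lag the plan."""
    planned_cost, planned_benefit = PLAN[phase]
    if cost > planned_cost or benefit < planned_benefit:
        print(f"{phase}: off track (cost {cost:,.0f} vs {planned_cost:,.0f}, "
              f"benefit {benefit:,.0f} vs {planned_benefit:,.0f})")
    else:
        print(f"{phase}: on track")

review("design", 120_000, 0)        # off track: cost overrun
review("rollout", 390_000, 90_000)  # off track: benefits not materialising
```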

  16. Annex: Selected Bibliography on Benchmarking and the Open Method of Coordination

  17. Arrowsmith, J., K. Sisson and P. Marginson (2004). What can ‘benchmarking’ offer the open method of coordination?, Journal of European Public Policy, 11(2), pp. 311-328.
  Borrás, S. and B. Greve (2004). Concluding remarks: New method or just cheap talk?, Journal of European Public Policy, 11(2), pp. 329-336.
  Borrás, S. and K. Jacobsson (2004). The open method of co-ordination and new governance patterns in the EU, Journal of European Public Policy, 11(2), pp. 185-208.
  Bowerman, M., G. Francis, A. Ball and J. Fry (2002). The evolution of benchmarking in UK local authorities, Benchmarking: An International Journal, 9(5), pp. 429-449.
  Cox, A. and I. Thompson (1998). On the Appropriateness of Benchmarking, Journal of General Management, 23(3), pp. 1-20.
  Dattakumar, R. and R. Jagadeesh (2003). A review of literature on benchmarking, Benchmarking: An International Journal, 10(3), pp. 176-209.
  De la Porte, C. (2002). Is the Open Method of Coordination Appropriate for Organising Activities at European Level in Sensitive Policy Areas?, European Law Journal, 8(1), pp. 38-58.

  18. De la Porte, C., Ph. Pochet and G. Room (2001). Social benchmarking, policy making and new governance in the EU, Journal of European Social Policy, 11(4), pp. 291-307.
  Dolowitz, D.P. (2003). A Policy-maker’s Guide to Policy Transfer, The Political Quarterly, 74(1), pp. 101-108.
  Dolowitz, D. and D. Marsh (2000). Learning from Abroad: The Role of Policy Transfer in Contemporary Policy-Making, Governance, 13(1), pp. 5-24.
  Dorsch, J.J. and M.M. Yasin (1998). A framework for benchmarking in the public sector: Literature review and directions for future research, International Journal of Public Sector Management, 11(2/3), pp. 91-115.
  Kaiser, R. and H. Prange (2004). Managing diversity in a system of multi-level governance: the open method of co-ordination in innovation policy, Journal of European Public Policy, 11(2), pp. 249-266.
  Kastrinos, N. (2001). Contribution of socio-economic research to the benchmarking of RTD policies in Europe, Science and Public Policy, 28(4), pp. 238-246.

  19. Lundvall, B.-Å. and M. Tomlinson (2001). Learning-by-comparing: reflections on the use and abuse of international benchmarking. In: G. Sweeney (ed.), Innovation, Economic Progress and the Quality of Life, Cheltenham: Edward Elgar, pp. 120-136.
  Lundvall, B.-Å. and M. Tomlinson (2002). International benchmarking as a policy learning tool. In: M.J. Rodrigues (ed.), The New Knowledge Economy in Europe: A Strategy for International Competitiveness and Social Cohesion, Cheltenham: Edward Elgar, pp. 203-231.
  Magd, H. and A. Curry (2003). Benchmarking: achieving best value in public-sector organisations, Benchmarking: An International Journal, 10(3), pp. 261-286.
  Radaelli, C.M. (2003a). The Open Method of Coordination: A new governance architecture for the European Union?, Report 2003:1, Stockholm: Swedish Institute for European Policy Studies.
  Room, G. (2005). Policy Benchmarking in the European Union: Indicators and Ambiguities, Policy Studies, 26(2), pp. 117-132.
  Schütz, H., S. Speckesser and G. Schmid (1998). Benchmarking Labour Market Performance and Labour Market Policies: Theoretical Foundations and Applications, Discussion Paper FS I 98-205, Berlin: Wissenschaftszentrum Berlin für Sozialforschung.
