
Research on Sustainable Development Seminar


Presentation Transcript


  1. Research on Sustainable Development Seminar Center for International Development Harvard University 9 March 2006

  2. Center for International Development to establish Fund for Sustainable Development • “In an effort to address one of the world’s most pressing public problems – sustainable development – Harvard’s Center for International Development (CID) and the Ministry for the Environment and Territory of the Italian Republic will work together to create The Fund for Sustainable Development at the KSG. • The fund will support training and research programs on sustainable development and natural resource management with an international orientation and a vision toward achieving shared prosperity and reducing poverty while protecting the environment.”

  3. Global Environmental Assessments: Lessons from History Bill Clark for the Global Environmental Assessment Project (Ron Mitchell, Dave Cash, Nancy Dickson, Jill Jaeger, Alex Farrell, Sheila Jasanoff, Marybeth Long-Martello)

  4. The Problem… • > 200 international environmental treaties • Most requiring periodic science assessments • Through complex processes engaging hundreds to thousands of participants • 2-3 completed/yr on climate, ozone, acid rain in the 1980s and 1990s • >12 on all topics underway in 2003 • What should we learn from the experience? • Many works advocating particular assessment methods • Growing body of work by reflective practitioners • Benedick, Bolin, Houghton, Mahlman, Jacobs… • Growing number and sophistication of scholarly studies on assessments of single issues, providing depth of analysis • Haas, Litfin, Alcamo, Miller, Parson, Morgan… • Fewer comparative studies providing breadth… • Carnegie Commission (1992), OECD Mega-Science (1998) • Andresen et al. (2000); Social Learning Group (2001); Young (2002)… • Global Environmental Assessment Project…

  5. Global Environmental Assessment Project http://www.ksg.harvard.edu/gea • Multi-year research and training program • international, interdisciplinary team of faculty (20+) and fellows (30+) • workshops for scholars, practitioners • working papers (50+), published articles (40+), seminars • Global climate change and ENSO variability • Stratospheric ozone depletion • Transboundary tropospheric air pollution • Biological, chemical hazards… • Regional assessments within global change context (fisheries, water, coastal zone) • Summary books • Jasanoff and Martello, eds. (2004) Earthly Politics: Local and Global in Environmental Governance • Farrell and Jaeger, eds. (2005) Assessments of Regional & Global Environmental Risks: Designing Processes for Effective Use of Science • Mitchell, Clark, Cash and Dickson, eds. (2006) Global Environmental Assessments: Information and Influence

  6. Findings: What is an “Assessment”? • A social process linking knowledge and action in public policy/decision contexts... • usually entailing the creation of discrete products (e.g. models, forecasts, reports)… • within an institutional framework of rules, norms, expectations (e.g. FCCC, LRTAP).

  7. A Conceptual Framework for thinking about Effective Assessments [Framework diagram: ultimate determinants (historical context, user characteristics, assessment characteristics) act through proximate pathways (saliency, credibility, legitimacy) to shape assessment effectiveness]

  8. A Conceptual Framework for thinking about Effective Assessments [Framework diagram repeated from the previous slide]
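
As a compact way to read the framework on the two preceding slides, the causal chain can be written informally as follows; the symbols E, S, C, L and the functions f and g are illustrative notation introduced here, not part of the original slides:

\[
(S, C, L) = g(\text{historical context},\ \text{user characteristics},\ \text{assessment characteristics}),
\qquad
E = f(S, C, L)
\]

where S, C and L are an audience's perceptions of the assessment's saliency, credibility and legitimacy, and E is the assessment's effectiveness in the issue domain.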

  9. Finding: What do assessments influence? • Environmental pressures, states, impacts • IIASA RAINS for LRTAP SOx-II • Actors’ agendas, strategies or decisions • Ozone Trends Panel (DuPont) • Issue framing, terms of the debate • WMO/UNEP Villach ’86 Climate assessment • R&D priorities, standards for monitoring • IPCC Special Report on Forest Sinks • … or, more generally, the “Issue Domain” • participants, institutions, behaviors, outcomes… • (Compare Sabatier’s “policy subsystem,” Ostrom’s “actor domain”)

  10. A Conceptual Framework for thinking about Effective Assessments [Framework diagram repeated from slides 7 and 8]

  11. Finding: An assessment is more likely to influence actors’ decisions to the extent that it is perceived to be… • Credible (Is it true?) • credibility of its technical arguments to relevant communities • † US CIAP-Impacts vs. WMO ‘Blue Books’ • Salient (Is it relevant?) • salience to the changing needs of specific users and producers • † US NAPAP vs. European RAINS • Legitimate (Is it fair / respectful / accountable?) • fairness of the process to stakeholders • † WRI GWP vs. German Enquete I

  12. Findings: SCL Complexities • S, C, L are more “multiplicative” than “additive” (see the sketch below) • poor perceptions of one cannot be (wholly) offset by good perceptions of the others • Tight tradeoffs exist among saliency, credibility and legitimacy due to the potential power of findings to support or undermine interests… • most ways of improving one dimension undermine the other(s) • It’s (relatively) easy to craft an assessment that a single user/country will perceive to be adequately SCL… • the challenge is designing assessments that are simultaneously perceived to meet SCL standards by multiple users/stakeholders with different goals
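
A minimal sketch of the “multiplicative” point above, in illustrative notation only (E for effectiveness; S, C, L for perceived saliency, credibility and legitimacy; the slides give no explicit formula):

\[
E \;\propto\; S \times C \times L \qquad \text{rather than} \qquad E \;\propto\; S + C + L
\]

Under the multiplicative reading, a near-zero perception on any one dimension pulls effectiveness toward zero no matter how strong the other two are; an additive reading would let strengths offset weaknesses, which is exactly what the slide rules out.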

  13. A Conceptual Framework for thinking about Effective Assessments [Framework diagram repeated from slides 7 and 8]

  14. On what do perceptions of salience, credibility, legitimacy most depend? • Context of the assessment • issue characteristics, linkage, attention cycles

  15. Attention to Global Environmental Issues

  16. Findings: On what do perceptions of salience, credibility, legitimacy most depend? • Context of the assessment • issue characteristics, linkage, attention cycles • Characteristics of the user, target audiences • concern, openness, capacity • Implications for changing user, or changing assessments….

  17. A Conceptual Framework for thinking about Effective Assessments [Framework diagram, now annotated: historical context (issue characteristics, linkage, attention cycle), user characteristics (concern, capacity, openness) and assessment characteristics (science/governance, participation, scope, dissent) shape saliency, credibility and legitimacy, which in turn determine assessment effectiveness]

  18. Characteristics of the Assessment Process • Institutionalization • Participation • Treatment of scope, dissent • Provision for iteration, evaluation, learning

  19. How does the institutionalization of assessment influence effectiveness? • Dilemma: salience vs credibility • enhance communication between science and policy • protect scientists, policy makers from contagion • Concept: the interface as boundary • not a static gulf to be bridged (Carnegie); • rather a dynamic boundary to be negotiated; • embeddedness of assessment institutions

  20. How do participation decisions influence effectiveness? • Dilemma: legitimacy vs value vs credibility • identify, attract, retain relevant participants • “great expectations” vs great numbers • Concept: participation as a means to an end • differentiate roles in the process (e.g. scoping vs. fact-finding vs. policy advice) • match expectations to institutional capacity

  21. How does the treatment of assessment scope influence effectiveness? • Dilemma: saliency vs. credibility • Concept: integrated assessments suffer from bounded rationality and vulnerability to deconstruction; dis-integrated assessments provide focused answers to specific questions • Cause/effect vs. impacts vs. policy options

  22. How does the treatment of uncertainty and dissent influence the effectiveness of assessments? • Dilemma: value vs credibility vs legitimacy • Concept: embracing inconclusiveness • insight oriented vs decision oriented assessment • strategies for treating extreme events • strategies for using dissent

  23. Provision for iteration, evaluation, and social learning • There exists a huge variety of experiments in how to do good assessments… • But the target is moving (changing political context, issue framing, knowledge)… • … and the institutional frameworks tend to be “sticky,” locked into early forms (IPCC); • We don’t learn because it’s hard… but also because we don’t try (a few exceptions…).

  24. Practical implications… • Adjust the design details of scientific assessments to the case and context (attn. IPCC: one size does not fit all… smaller is often better) • Reconceptualize assessment as a process of co-production through which interactions of experts and users define, shape and validate a shared body of usable knowledge… • Work for an international system of research and assessment, coupling global knowledge and local use through national institutions.

  25. Summary of Findings on Influential Assessments • Assessments vary in the type of influence they have, not just the amount (influence on what?) • Influence of a given assessment varies across audiences (influence on whom?) • Influence for a given audience depends on its attribution of saliency and legitimacy, not just credibility, to the assessment (influence through what pathways?) • Such attributions, and thus influence, are best achieved through processes of “co-production” that involve users in the design of assessments • Successful co-production requires matching the capacity of users with the demands of the assessment (and adjusting both)

  26. Further information… • Global Environmental Assessment Project • http://www.ksg.harvard.edu/gea • Science, Environment and Development Group (CID) • http://www.ksg.harvard.edu/sed/ • Bill Clark • Science, Environment and Development Group • Center for International Development • John F. Kennedy School of Government, Harvard University • william_clark@harvard.edu
