
National environmental indicators: measuring what matters?


Presentation Transcript


  1. National environmental indicators: measuring what matters? Elisabeth A. Graffy ASU -- March 30, 2007

  2. Guideposts • Summarize a national initiative to design a national indicator system • Purpose, Participants, Process • Focus in on challenges, tensions, paradoxes • Discuss current and potential role(s) for academia

  3. Why design a system for national environmental indicators? • Respond to claims of unmet needs for national indicators in policy, management, and public discourse. • Despite many indicator efforts and assessments, knowledge remains • Fragmented and incomplete • Inaccessible • Incomplete and/or not sufficiently relevant • Of mixed quality and trustworthiness

  4. Some leading U.S. indicator efforts • Heinz Center: State of the Nation’s Ecosystems • NAS Key National Indicator Initiative (aka State of the USA) • EPA’s Report on the Environment • Sustainable Resource Roundtables (4) • GAO and OSTP report dozens of federal assessments, indicator systems • (Global, state, local, corporate efforts)

  5. Rationale for a national system “The Nation does not now produce complete, consistent, and credible statistics and indicators about environmental conditions and trends that are needed to guide government and business decisions and to inform public discourse.”

  6. Major participants in dialogue • Federal agencies • CEQ, EPA, DOI, NOAA, USDA-NRCS, FS • Indirect: GAO • Non-federal entities • Heinz Center, State of the USA, NCSE • State and local government • University and Business • National Academy of Public Administration

  7. Participants emerged over time [Diagram: a collaborative process linking the CEQ interagency indicator coordination effort; the federal agencies (DOI, EPA, USDA, FS, NOAA) and their agency reports; the Roundtables (Forests, Rangelands, Minerals, Water); the Heinz State of the Nation’s Ecosystems report; the NAS KNII; academic, corporate, and congressional inputs; and stakeholders and experts, all converging in the Collaboration on Indicators on the Nation’s Environment and Natural Resources (CINE) Planning Group]

  8. Common aspirational goal Complete, credible, and consistently reliable • National (scalable) • Routinely used by policy-makers, businesses, and citizens • Increasingly trusted over time • Match IT and use/access trends

  9. Proposed operational goal • Primary goal: achieve consensus among diverse partners on the selection of Core National Indicators and a process for periodic review and updating; ensure consistent production and reporting of these indicators; ensure the reliability of related statistical and data activities • Corollary responsibilities: align priorities and protocols; support consistency of tiers; promote broad public access [Diagram: a tiered structure linking Inventory and Monitoring Data and Indicators, Policy, Planning and Management Indicators, and Core National Indicators to public discourse]
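  (Illustrative aside, not part of the original slides or the CINE design: the three indicator tiers named above could be sketched as a small data model. The Python sketch below is a minimal, hypothetical illustration; the names IndicatorTier, Indicator, and core_set are assumptions introduced here.)

  from dataclasses import dataclass, field
  from enum import Enum

  class IndicatorTier(Enum):
      """The three tiers named on the slide, from raw data up to the consensus set."""
      INVENTORY_AND_MONITORING = 1
      POLICY_PLANNING_MANAGEMENT = 2
      CORE_NATIONAL = 3

  @dataclass
  class Indicator:
      """One indicator, tagged with its tier and the programs that supply its data."""
      name: str
      tier: IndicatorTier
      source_programs: list[str] = field(default_factory=list)

  def core_set(indicators):
      """Return only the consensus Core National Indicators from a mixed inventory."""
      return [i for i in indicators if i.tier is IndicatorTier.CORE_NATIONAL]

  Modeling the tier as an attribute of each indicator, rather than keeping separate lists per tier, is one way to keep the tiers consistent when many agencies contribute to the same inventory.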

  10. Process of dialogue and design • Several years of intermittent activity • Now, coordinated series of meetings • Define federal interests and role(s) • Define non-federal interests and role(s) • Develop agreement on feasible options • Develop implementation strategy that successfully accommodates politics, Politics, organizational change…. • Focus on institutions, not indicators

  11. Implied design tensions • Conceptual • Institutional • Informational • Political

  12. Conceptual Tensions: What are indicators? (1) Competing conceptions include: • Statistics • Indices • Aggregated data • Question-driven • Science-based • Values-based • Policy-defined • Valid, reliable • Fact-based • Comprehensive • Selective • Progress markers • Descriptive (status) • Diagnostic (problems) • Rational decision tools • “Truth to power” • Dispute resolvers • Myth-dispellers • Collaborative origin • Policy-relevant • Apolitical

  13. Conceptual Tensions: What are indicators? (2) • Dispute enhancers • Translated scientific knowledge • Co-produced/joint knowledge • Boundary objects • Serviceable truths • Usable knowledge • Metaphoric, symbolic

  14. A problem • Indicator practitioners may be only loosely aware of the broader (and potentially very broad) intellectual context. • Disciplinary theorists may be only loosely aware of that potentially broad intellectual context and of opportunities to contribute.

  15. Conceptual Tensions: What are indicators for? [Diagram: “Relation of Indicator Type to Potential Social Functions”, mapping indicator types (Salient Topics; Core National Indicators; Policy, Planning and Management Indicators; Inventory and Monitoring Data and Indicators) to potential social functions (public understanding; social learning and action; synthesis and narrative; fusion of technical, cultural, economic, spiritual, …; parameters of legitimized common knowledge; accountability to specified goals; data collection; expanding the frontiers of knowledge)]

  16. Conceptual Tensions: Who are indicators for? [Diagram: “Relation of Indicator Types to Potentially Interested People”: Salient Topics for the generally informed public; Core National Indicators for the public and people with topic or issue interests; Policy, Planning and Management Indicators for managers and policy wonks; Inventory and Monitoring Data and Indicators for scientists]

  17. Conceptual Tensions: Independence, relevance, & politicization • Do environmental condition indicators relate to program planning and performance? • Should indicators address “hot” issues? • Should indicators cover ecological aspects only, or economic and social aspects too? • Should indicators encompass condition only, or causes and implications too? • “Build it & they will come” or “build to suit”? • What are the real risks, and for whom?

  18. Institutional Tensions: Hydra • Existing programs are dispersed, autonomous • 58 Heinz national indicators used data from 20 Federal programs in 15 agencies in 6 departments • Decentralized by historical design (e.g., fires) • Many co-existing missions with constituencies • How much change is needed and acceptable? • Federal and non-federal interests exist • What roles, rights, and authorities can/do each assume? • Science and policy domains are inevitable • What model(s) guide(s) this interface or boundary?

  19. Institutional Tensions: Language • Institutional arrangements: defined roles and responsibilities, not an “agency” • Vision: the aspiration • Goal: the concrete results to be achieved • Critical functions: the tasks/abilities/activities minimally necessary for the goals • Design: the process of defining roles and responsibilities • Criteria: enable goal-based comparison and evaluation of design options

  20. Informational Tensions: DRIP (data rich, information poor) • A lot of information -- uncoordinated, fragmented, unaligned, incomplete • Heinz Center Report: ½ of 103 indicators have data • More data are collected than analyzed • Redundancy and inquiry vs. targeted needs and scarce resources? • What is “national” about an indicator? • Geographic scale, iconic value, ecological uniqueness, “sentinel”, constituency • Everglades, bald eagle, children, mercury, bees, Lake Tahoe, Sky Islands

  21. Political Tensions: First Steps • Poor understanding of the conditions for • Acceptance by involved entities • Legitimacy in all relevant sectors • Authority sufficient for functional capacity • Ability to weather administrations, Congresses, and controversies • Lessons from other cases, the literature, and consultations are inconclusive • Default stance: incremental, low-risk • Loss of relevance or long-term viability?

  22. Role(s) for academia • Take a broad view of whose research matters: natural sciences, organizational change & strategy, sociology, public affairs, law, political theory, science policy, resource management, communication, history, media • Help develop common concepts, terminology, methods • Promote innovation and experimentation that links theory and practice

  23. Role(s) for academia • Why bother? • Growth areas for research • Differentiates and prepares students • May have real-world impact
