
Towards indicators for ‘opening up’ science and technology policy

Stockholm, October 2013. Ismael Rafols, Tommaso Ciarli, Patrick van Zwanenberg and Andy Stirling. Ingenio (CSIC-UPV), Universitat Politècnica de València & SPRU — Science and Technology Policy Research, University of Sussex.



Presentation Transcript


  1. Stockholm, October 2013. Towards indicators for ‘opening up’ science and technology policy. Ismael Rafols, Tommaso Ciarli, Patrick van Zwanenberg and Andy Stirling. Ingenio (CSIC-UPV), Universitat Politècnica de València & SPRU — Science and Technology Policy Research, University of Sussex. Building on work with Loet Leydesdorff and Alan Porter.

  2. Paper born out of reflection on the contrast between two ways of looking at interdisciplinary research: journal rankings versus interdisciplinary maps.

  3. On the role of scientific advice in policy (scientometrics is the science of science – hence a form of scientific advice). The linearity-autonomy model of scientific advice (Jasanoff, 2011): • Scientific knowledge is the best possible foundation for public decisions • Scientists should establish the facts that matter independently • S&T indicators produce evidence of these facts. However, this (enlightenment) model has been challenged: • The mechanisms to establish facts and make decisions are a social process • “knowledge enables power, but power structures knowledge” (Stirling, 2012) • Modes of advice: the pure scientist vs. the honest broker (Pielke, 2007). What is (or should be) the role of STI indicators in policy advice? Closing down vs. opening up.

  4. The challenge: problems with the current use of S&T indicators • Use of conventional S&T indicators is problematic (as with many technologies, in particular those closely associated with power, e.g. nuclear) • Narrow inputs (only publications!) • Scalar outputs (rankings!) • Aggregated solutions – missing variation • Opaque selections and classifications (privately owned databases) • Large, leading scientometric groups embedded in government / consultancy, with limited possibility of public scrutiny • Sometimes even mathematically debatable: the Impact Factor of journals (only 2 years, ambiguity in document types); the average number of citations per publication in skewed distributions.

  5. From S&T indicators for justification and disciplining… Justification in decision-making • Weak justification: “Give me a number, any number!” • Strong justification: “Show in numbers that X is the best choice!” S&T indicators have a performative role: • They don’t just measure; they don’t ‘just happen to be used’ in science policy (neutral) • They are a constitutive part of the incentive structure for “disciplining” (loaded) • They signal to stakeholders what is important; institutions use these techniques to discipline subjects • They articulate framings, goals and narratives on performance, collaboration, interdisciplinarity…

  6. … towards S&T indicators as tools for deliberation • Yet it is possible to design indicators that foster plural reflection rather than justifying or reinforcing dominant perspectives • This shift is facilitated by trends pushed by ICT and visualisation tools • More inputs (publications and patents, but also news, webs, etc.) • Multidimensional outputs (interactive maps) • Multiple solutions – highlighting variation, confidence intervals • More inclusive and contrasting classifications (by-passing private data ownership? PubMed, arXiv) • More possibilities for open scrutiny (new research groups)

  7. 1. Conceptual framework: “broadening out” vs. “opening up” policy appraisal

  8. Policy use of S&T indicators: Appraisal. Appraisal: ‘the ensemble of processes through which knowledges are gathered and produced in order to inform decision-making and wider institutional commitments’ (Leach et al., 2010). Breadth: extent to which appraisal covers diverse dimensions of knowledge. Openness: degree to which outputs provide an array of options for policies.

  9. Policy use of S&T indicators: Appraisal • Appraisal: ‘the ensemble of processes through which knowledges are gathered and produced in order to inform decision-making and wider institutional commitments’ (Leach et al., 2010) • Example: allocation of resources based on research “excellence” • Breadth: extent to which appraisal covers diverse dimensions of knowledge – Narrow: citations/paper – Broad: citations, peer interviews, stakeholder views, media coverage, altmetrics • Openness: degree to which outputs provide an array of options for policies – Closed: fixed composite measure of variables → unitary and prescriptive – Open: consideration of various dimensions → plural and conditional

  10. Appraisal methods: broad vs. narrow & closing vs. opening [2×2 diagram: horizontal axis = effect of appraisal ‘outputs’ on decision-making, from closing-down to opening-up; vertical axis = range of appraisal inputs (issues, perspectives, scenarios, methods), from narrow to broad] (Leach et al., 2010)

  11. Appraisal methods: broad vs. narrow & closing vs. opening [same 2×2 diagram, populated with appraisal methods: cost-benefit analysis, risk assessment, open hearings, structured interviews, sensitivity analysis, citizens’ juries, q-method, consensus conferences, decision analysis, scenario workshops, narrative-based participant observation, multi-criteria mapping] (Stirling et al., 2007)

  12. Appraisal methods: broad vs. narrow & closing vs. opening [same 2×2 diagram, with “Most conventional S&T indicators??” placed in the narrow, closing-down quadrant]

  13. Broadening out S&T indicators [2×2 diagram with an arrow from “Conventional S&T indicators??” (narrow) towards broad: incorporation of plural analytical dimensions – global & local networks, hybrid lexical–actor networks, etc. – and new analytical inputs: media, blogosphere]

  14. Appraisal methods: broad vs. narrow & closing vs. opening [2×2 diagram placing journal rankings, university rankings and the European Innovation Scoreboard in the narrow, closing-down quadrant: unitary measures that are opaque, tend to favour established perspectives… and are easily translated into prescription]

  15. Opening up S&T indicators [2×2 diagram with an arrow from “Conventional S&T indicators??” towards opening-up: making explicit the underlying conceptualisations and creating heuristic tools to facilitate exploration – NOT about the uniquely best method, the unitary best explanation or the single best prediction]

  16. 2. Examples of opening up: a. Broadening out AND opening up; b. Opening up WITH narrow inputs

  17. 1. Preserving multiple dimensions in broad appraisals [2×2 diagram with arrows from “Conventional S&T indicators??” towards both broadening out and opening up] (Leach et al., 2010)

  18. Composite innovation indicators (25–30 indicators): the European (Union) Innovation Scoreboard. Grupp and Schubert (2010) show that the ranking order is highly dependent on the weighting of the indicators – a sensitivity analysis (see the sketch below).
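
A minimal sketch, in Python, of the kind of weight-sensitivity check this slide refers to (not Grupp and Schubert’s actual method or data): the same score matrix yields quite different rank orders depending on how the sub-indicators are weighted. Country labels and scores are invented.

```python
# Illustrative only: how ranks from a weighted composite shift when weights change.
import numpy as np

rng = np.random.default_rng(42)

countries = ["A", "B", "C", "D", "E"]
# rows = countries, columns = normalised sub-indicators (e.g. publications,
# patents, trademarks), all scaled to [0, 1]; toy values
scores = rng.random((5, 8))

def ranks(weights):
    """Rank countries (1 = best) under a given weight vector."""
    composite = scores @ (weights / weights.sum())
    order = (-composite).argsort()          # indices sorted by descending score
    r = np.empty_like(order)
    r[order] = np.arange(1, len(countries) + 1)
    return r

# Equal weights vs. many random weightings
baseline = ranks(np.ones(scores.shape[1]))
samples = np.array([ranks(rng.random(scores.shape[1])) for _ in range(1000)])

for i, c in enumerate(countries):
    print(f"{c}: equal-weight rank {baseline[i]}, "
          f"rank range under random weights {samples[:, i].min()}-{samples[:, i].max()}")
```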

  19. Solution: representing multiple dimensions (critique by Grupp and Schubert, 2010). Use of spider diagrams allows comparing like with like [figure: Innovation Scoreboard spider diagram, e.g. the ‘Community trademarks’ indicator]. U-rank: university performance comparison tools (Univ. Twente). A minimal spider-diagram sketch follows.
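
A minimal matplotlib sketch of the spider-diagram idea: two hypothetical universities compared on several dimensions at once instead of being collapsed into a single composite rank. The dimension names and scores are invented.

```python
import numpy as np
import matplotlib.pyplot as plt

dims = ["Publications", "Citations", "Teaching", "Knowledge transfer", "International"]
uni_a = [0.8, 0.6, 0.4, 0.7, 0.5]   # hypothetical scores in [0, 1]
uni_b = [0.5, 0.5, 0.8, 0.4, 0.9]

angles = np.linspace(0, 2 * np.pi, len(dims), endpoint=False).tolist()
angles += angles[:1]                 # close the polygon

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for values, label in [(uni_a, "University A"), (uni_b, "University B")]:
    vals = values + values[:1]
    ax.plot(angles, vals, label=label)
    ax.fill(angles, vals, alpha=0.15)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dims)
ax.set_ylim(0, 1)
ax.legend(loc="lower right")
plt.show()
```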

  20. U-Map: Comparison of Universities in Multiple Dimensions

  21. 2. Examples of opening up: b. Opening up WITH narrow inputs

  22. Opening up S&T indicators [repeat of the slide-15 diagram: arrow from “Conventional S&T indicators??” towards opening-up – making explicit the underlying conceptualisations and creating heuristic tools to facilitate exploration, NOT about the uniquely best method, the unitary best explanation or the single best prediction] (Leach et al., 2010)

  23. Interdisciplinarity as diversity

  24. Rafols, Porter and Leydesdorff (2010): a global map of science based on the 222 SCI–SSCI Subject Categories [map with labelled macro-fields: Agri Sci, Ecol Sci, Geosciences, Infectious Diseases, Env Sci & Tech, Clinical Med, Chemistry, Matls Sci, Biomed Sci, Engineering, Cognitive Sci., Health & Social Issues, Psychology, Physics, Business & MGT, Computer Sci, Social Studies, Econ, Polit. & Geography]

  25. Warwick Business School: Subject Categories of publications (nodes labelled if >0.5% of publications)

  26. Manchester MIoIR: Subject Categories of publications (nodes labelled if >0.5% of publications)

  27. Heuristics of diversity (Stirling, 1998; 2007). Diversity: ‘attribute of a system whose elements may be apportioned into categories’. Characteristics: Variety – number of distinctive categories; Balance – evenness of the distribution; Disparity – degree to which the categories are different. Associated measures: Shannon entropy $-\sum_i p_i \ln p_i$; Herfindahl concentration $\sum_i p_i^2$; dissimilarity $\sum_{ij} d_{ij}$; generalised diversity (Stirling) $\Delta = \sum_{ij\,(i \neq j)} (p_i p_j)^\alpha (d_{ij})^\beta$ (computed in the sketch below).
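
A small Python sketch of the measures listed above, using a toy distribution of publications over three categories and an invented disparity matrix; with α = β = 1 the generalised diversity reduces to the Rao-Stirling diversity used later in the talk.

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])           # proportions over 3 categories (toy)
d = np.array([[0.0, 0.8, 0.9],
              [0.8, 0.0, 0.4],
              [0.9, 0.4, 0.0]])          # disparity d_ij in [0, 1] (toy)

shannon = -np.sum(p * np.log(p))         # Shannon entropy
herfindahl = np.sum(p ** 2)              # Herfindahl (concentration)

def stirling_diversity(p, d, alpha=1.0, beta=1.0):
    """Generalised diversity: sum over i != j of (p_i p_j)^alpha * (d_ij)^beta.
    With alpha = beta = 1 this is the Rao-Stirling diversity of slide 33."""
    n = len(p)
    total = 0.0
    for i in range(n):
        for j in range(n):
            if i != j:
                total += (p[i] * p[j]) ** alpha * d[i, j] ** beta
    return total

print(f"Shannon: {shannon:.3f}  Herfindahl: {herfindahl:.3f}  "
      f"Rao-Stirling: {stirling_diversity(p, d):.3f}")
```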

  28. Comparing the degree of interdisciplinarity of two university units: is Manchester more interdisciplinary??

  29. Multiple concepts of interdisciplinarity: conspicuous lack of consensus, but most indicators aim to capture the following concepts. Integration (diversity & coherence): • Research that draws on diverse bodies of knowledge • Research that links different disciplines. Intermediation: • Research that lies between or outside the dominant disciplines.

  30. Assessing interdisciplinarity – Diversity [overlay map: ISSTI Edinburgh, WoS Subject Categories of references]

  31. Assessing interdisciplinarity – Coherence [ISSTI Edinburgh: observed/expected cross-citations]

  32. Assessing interdisciplinarity – Intermediation [ISSTI Edinburgh: references]

  33. Summary: IS (blue) units are more interdisciplinary than BMS (orange) – more diverse (Rao-Stirling diversity), more coherent (observed/expected cross-citation distance), more interstitial (average similarity). A sketch of the observed/expected ratio follows.
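
A hedged sketch of one way to read the observed/expected cross-citation measure of coherence (not necessarily the authors’ exact computation): compare the citation flows between the categories of a unit’s publications with what independence between categories would predict. The citation counts are invented.

```python
import numpy as np

# cites[i, j]: citations from category i to category j within a unit's output (toy)
cites = np.array([[30,  5, 10],
                  [ 4, 20,  2],
                  [12,  3, 25]], dtype=float)

row = cites.sum(axis=1, keepdims=True)
col = cites.sum(axis=0, keepdims=True)
expected = row * col / cites.sum()       # expectation under independence

obs_over_exp = cites / expected
# Values > 1 flag category pairs that cite each other more than expected,
# i.e. evidence that the unit's diverse categories are actually being linked.
print(np.round(obs_over_exp, 2))
```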

  34. 2. Excellence: opening up perspectives – provide different perspectives on performance (alternative measures of the same type of indicator)

  35. Are measures of “excellence” consistent and robust? Citations: not stable to changes in classification and granularity (Zitt et al., 2005; Adams et al., 2008). Is basic always better than applied? [chart for clinical neurology: shares of good / average / bad papers along a more-basic to more-applied axis] (Van Eck, Waltman et al., 2013)

  36. Measures of “excellence” Which one is more meaningful??

  37. The new Leiden Ranking (2011–12) • Different measures of performance: MNC, MNCS, MNCJ, Top 10% • Under different conditions (fractional counting, language) • Includes confidence intervals (bootstrapping; see the sketch below)
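
A sketch of a percentile-bootstrap confidence interval for a mean normalised citation score, in the spirit of the stability intervals reported in the Leiden Ranking; the per-publication scores are simulated, not real data.

```python
import numpy as np

rng = np.random.default_rng(0)
# One normalised citation score per publication of a university (simulated):
# citations divided by the world average for the same field and year.
mncs_per_pub = rng.lognormal(mean=-0.2, sigma=1.0, size=500)

def bootstrap_ci(x, n_boot=10_000, level=0.95):
    """Percentile bootstrap confidence interval for the mean of x."""
    means = np.array([rng.choice(x, size=len(x), replace=True).mean()
                      for _ in range(n_boot)])
    lo, hi = np.quantile(means, [(1 - level) / 2, 1 - (1 - level) / 2])
    return x.mean(), lo, hi

mean, lo, hi = bootstrap_ci(mncs_per_pub)
print(f"MNCS = {mean:.2f}  (95% bootstrap CI: {lo:.2f} to {hi:.2f})")
```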

  38. 3. Summary and conclusions

  39. S&T indicators as tools to open up the debate • ‘Conventional’ use of indicators (‘pure scientist’ – Pielke): purely analytical character (i.e. free of normative assumptions); instruments of objectification of dominant perspectives; aimed at legitimising / justifying decisions (e.g. excellence); unitary and prescriptive advice • Opening up scientometrics (‘honest broker’ – Pielke): aimed at locating the actors in their context and dynamics → not predictive or explanatory, but exploratory; construction of indicators is based on a choice of perspectives → make explicit the possible choices on what matters; supporting debate → making science policy more ‘socially robust’; plural and conditional advice. Barré (2001, 2004, 2010), Stirling (2008)

  40. Strategies for opening up, or how to “keep it complex” yet “manageable” • Presenting contrasting perspectives – at least TWO, in order to give a taste of choice • Simultaneous visualisation of multiple properties / dimensions – allowing users to take their own perspective • Interactivity – allowing users to give their own weights to criteria / factors and to manipulate the visuals.

  41. Is ‘opening up’ worth the effort? (1) Sustaining diversity in the S&T system. A decrease in diversity is a potential unintended consequence of the evaluation machine. Why diversity matters: • Systemic (‘ecological’) understanding of S&T – S&T outcomes depend on synergistic interactions between disparate elements • Dynamic understanding of excellence and relevance – new social needs, challenges and expectations from S&T • Managing diverse portfolios to hedge against uncertainty in research – Office of Portfolio Analysis (National Institutes of Health), http://dpcpsi.nih.gov/opa/ • Opening the possibility for S&T to work for the disenfranchised – topics outside dominant science (e.g. neglected diseases)

  42. Is ‘opening up’ worth the effort? (2) Building robustness against bias. Do conventional indicators tend to favour incumbents? Hypothesis: elites and incumbents (directly or not) influence the choice of indicators, which tend to benefit them… “knowledge enables power, but power structures knowledge” (Stirling, 2012) • Crown indicator – standard measure of performance (~1990–2010): ‘systematic underrating of low-ranked scientists’ (Opthof and Leydesdorff, 2010) – not spotted for 15 years! (see the sketch below) • Journal rankings in Business and Management: systematic underrating of interdisciplinary (heterodox) departments (Rafols et al., 2012) • Others?? The h-index?? Favours established academics over younger ones.
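
A toy illustration of the technical point behind the Opthof and Leydesdorff critique: the old crown indicator divided total citations by total field expectations (a ratio of sums), whereas the newer MNCS averages per-paper ratios, and the two can differ substantially for the same set of papers. The numbers below are invented.

```python
import numpy as np

citations = np.array([0, 1, 2, 40])        # citations per paper (toy)
expected  = np.array([1, 1, 2, 10])        # field/year expected citations (toy)

crown = citations.sum() / expected.sum()   # ratio of sums (old crown-indicator style)
mncs  = np.mean(citations / expected)      # mean of per-paper ratios (MNCS style)

print(f"crown-style = {crown:.2f}, MNCS-style = {mncs:.2f}")
# The ratio-of-sums version is dominated by papers with high expected citation
# rates, one mechanism by which it can underrate some scientists' portfolios.
```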

  43. Conventional policy dynamics (Stirling, 2010) [diagram: ‘lock-in’ to policy favoured by incumbent power structures. Social appraisal – multiple practices and processes for informing social agency (emergent and unstructured as well as deliberately designed): S&T indicators, risk assessment, cost-benefit analysis, research excellence, expert judgements / ‘evidence base’ – operates with restricted options, knowledges, uncertainties, participation and incomplete knowledges; it feeds a guidance / narrative with a narrow scope of attention and simple ‘unitary’ prescriptions (“best / optimal / legitimate”), narrowing governance commitments among possible futures of complex, dynamic, inter-coupled and mutually-reinforcing socio-technical configurations in science]

  44. Breadth, plurality and diversity (Stirling, 2010) [diagram: dynamic portfolios pursuing diverse trajectories. Social appraisal as broad-based processes of ‘precautionary appraisal’ – multiple methods, criteria, options, frames, uncertainties, contexts, properties and perspectives – ‘opening up’ with ‘plural conditional’ outputs to policymaking: viable options under different conditions, dissonant views, sensitivities, scenarios, maps, equilibria, pathways and discourses, supporting governance commitments to multiple possible pathways and trajectories (sustainability)]

  45. S&T indicators as tools to open up the debate • ‘Conventional’ use of indicators: instruments of objectification; analytical character (i.e. free of normative assumptions); aimed at making decisions (e.g. excellence); unitary and prescriptive advice • Opening up scientometrics: construction of indicators is based on a choice of perspectives → implicit normative choice on what matters; aimed at locating the actors in their context and dynamics → not predictive or explanatory, but exploratory; supporting debate → making science policy more ‘socially robust’; plural and conditional advice. Barré (2001, 2004, 2010), Stirling (2008)

  46. Heuristics of diversity (Stirling, 1998; 2007). Diversity: ‘attribute of a system whose elements may be apportioned into categories’. Characteristics: Variety – number of distinctive categories; Balance – evenness of the distribution; Disparity – degree to which the categories are different. Associated measures: Shannon entropy $-\sum_i p_i \ln p_i$; Herfindahl concentration $\sum_i p_i^2$; dissimilarity $\sum_{ij} d_{ij}$; generalised diversity (Stirling) $\Delta = \sum_{ij\,(i \neq j)} (p_i p_j)^\alpha (d_{ij})^\beta$

  47. Rafols, Porter and Leydesdorff (2010): a global map of science based on the 222 SCI–SSCI Subject Categories [map with labelled macro-fields, as in slide 24]. Construction: • CD-ROM version of the JCR of the SCI and SSCI, 2009 • Matrix of cross-citations between journals (9,000 × 9,000) • Collapse to the ISI Subject Category matrix (222 × 222) • Create a similarity matrix using Salton’s cosine (see the sketch below)
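
A sketch of the last step on this slide: computing Salton’s cosine similarities between the citing profiles of categories, here on a toy 4×4 cross-citation matrix instead of the 222×222 Subject Category matrix.

```python
import numpy as np

# cross[i, j]: citations from category i to category j (toy counts)
cross = np.array([[120, 30,  5,  0],
                  [ 25, 90, 10,  2],
                  [  4, 12, 60, 20],
                  [  1,  3, 18, 70]], dtype=float)

# Salton's cosine between the citing profiles of categories i and j:
# cos(i, j) = <row_i, row_j> / (||row_i|| * ||row_j||)
norms = np.linalg.norm(cross, axis=1)
cosine = (cross @ cross.T) / np.outer(norms, norms)

print(np.round(cosine, 2))
# This similarity matrix is what feeds the layout of the global map of science,
# and 1 - cosine can serve as the disparity d_ij in the diversity measures above.
```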

  48. Diversity indexes: Stirling generalised diversity
