Researchers’ views on research evaluation and the Danish Bibliometric Research Indicator
Agenda
• Introduction
• Methods
• Results
 - Influence of main field on attitudes towards research evaluation elements
 - Influence of publication activity on attitudes towards DBRI
 - Themes in respondents’ comments
• Conclusions
Introduction
• LIS practice and bibliometrics
• DBRI launched in 2009
• Previous studies on the effects of evaluation-based funding on publication behaviour (e.g. Butler, 2003; Gläser et al., 2002) and on how bibliometrics impacts the science system (Weingart, 2005)
• 2012: Evaluation of DBRI (Sivertsen & Schneider, 2012)
Objectives of the survey
• To investigate researchers’ attitudes towards research evaluation in the form of the h-index, publication counts and citation counts
• To explore their views on the Danish Bibliometric Research Indicator (DBRI) and how it may have affected their research
Methods
• 400 researchers from 5 major Danish universities (University of Copenhagen, Aarhus University, University of Southern Denmark, Aalborg University, Roskilde University); 80 from each university
• Systematic random sampling
• Email with a link to the online questionnaire, a short description of the survey, and information about anonymity for respondents and their institutions
• 161 respondents, i.e. a response rate of 40 %
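Purely as an illustration of the sampling design named above, here is a minimal Python sketch of systematic random sampling; the staff list, its size and the helper name are hypothetical stand-ins, since the slide does not describe the actual sampling frame.

```python
import random

def systematic_sample(population, n):
    """Systematic random sampling: a random start within the first
    interval, then every k-th element of the ordered list."""
    k = len(population) // n       # sampling interval
    start = random.randrange(k)    # random start in the first interval
    return population[start::k][:n]

# Hypothetical staff list for one university; the survey drew 80 per university.
staff = [f"researcher_{i}" for i in range(2000)]
sample = systematic_sample(staff, 80)
assert len(sample) == 80

# Response rate reported on the slide: 161 of 400 invited
print(f"{161 / 400:.1%}")  # 40.2%, i.e. the reported 40 %
```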
Representativeness of the sample
Percentage per main field compared to statistics from Danish Universities (Danske Universiteter, n.d.)
Comments on research evaluation aspects
• One size doesn’t fit all!
• Too general and undifferentiated
• Disciplines/fields are diverse
• Lack of context
• Quantity, not quality

“The three measures are totally dependent on the type of research, the specialty and the size of the subject area, the ‘sex factor’ and the number of persons interested in the subject.”

“Publication culture and channels differ very much in the various fields. It is impossible to compare quantitatively […]”
Citation counts
“Numbers of citations can be misleading because of a few high-impact papers, a large number of reviews, or co-authorship on high-impact papers with little direct involvement by the researcher.”

“If you work in a smaller research field (e.g. pituitary gland neoplasms) it will be less cited than a broader field (e.g. diabetes, type 2), but that does not mean that one type of research is more important than the other.”

“Researchers within [the same] network cite each other not only because it is relevant, but because it boosts the citation impact factor. Strategically a sensible action, but does that say anything about quality and impact?”
h-index
“There is not necessarily a connection between a researcher’s productivity and his/her h-index/number of publications. A researcher’s publications can easily have high impact (i.e. be read by many) without necessarily being cited for it.”

“The h-index is not accurate, as it does not take into account the amount of time a person has been publishing […]”

“[…] A researcher, for example, can have a high h-index as co-author on many high-impact papers, without having contributed much to the work. A scientist can also publish many [papers] with little or no impact […]”
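For context, the h-index these comments target is the largest h such that a researcher has h publications each cited at least h times. A minimal Python sketch with made-up citation counts shows how the definition ignores career length and individual contribution, the points raised above:

```python
def h_index(citations):
    """Largest h such that h publications each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Made-up citation counts for two careers of very different length:
veteran = [120, 45, 30, 12, 9, 7, 5, 4, 4, 2]   # decades of publishing
newcomer = [15, 11, 9, 6, 3]                     # a few years in
print(h_index(veteran))   # 6
print(h_index(newcomer))  # 4 -- time since first publication is ignored
```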
‘Dismissal’
“[…] The numbers are easily manipulated by researchers and the system is grossly exploited by the journals, who demand exorbitant prices for publishing scientific articles.”

“[The three measures] are good for nothing, except for disciplining (the idea is to have as many areas to measure on as possible so that the individual is always behind). They impact the publication style so that people, for example, perform minor changes in texts and publish them again […]”
Comments on the DBRI
• Adversely affects publication behaviour
• DBRI authority file
• Societal consequences
• ‘Dismissal’
Publication behaviour
“Now I only think of points and no longer of recognition. I intentionally cut my research into bits suitable for DBRI publication and I am no longer interested in creating new coherent understanding/knowledge.”

“DBRI crucially increases the motivation for more slicing […]. DBRI has done nothing but give rise to suboptimisation regarding publication channels. Consequently, people angle for more publications, not better publications […]”

“DBRI forces me to publish in prestigious channels, but not in the channels that are read by the people who apply my research […]”
Publication behaviour (continued)
“The winning strategy is to publish as much as possible in the lowest of the top 20% journals. […] A second possible winning strategy is to type fast, since the difference between 3 and 1 point is not stark. So you may be able to type three times more 1-point papers and still come out on top. […]”

“I do not believe that coercion in relation to publishing and the focus on particular journals increase quality. On the contrary, it is an alignment that pleases the journal publishers’ demands and interests […]”

“I publish in a more conscious way now, but I do not think (the dissemination of) my research has either improved or worsened because of this – it is just different.”
DBRI authority file of publication channels
“[…] A two-level separation as in the Danish system is not sufficient: the very top journals have exponentially more impact and visibility than the lower journals within the top 20% […]”

“[…] Many journals accepted by DBRI are in my opinion of little value and with extremely low impact […]”

“[…] The classification of journal levels is enveloped in mystery. It does not always depend on quality, but on where the committee members themselves publish. Therefore, it is crucial for institutions to have persons in these committees so that the journals you yourself publish in are placed in the top level.”
DBRI authority file of publication channels (continued)
“The committees make lists permeated by subjective choices and personal interpretations […]”

“[…] The classification of a journal (level 1 or 2) seems very random – it seems to be decided by where the committee members themselves publish.”

“[…] A publication is accepted by a level 2 journal one year and when it finally comes out a year later, the very same journal has become a level 1 journal. This motivates going for the low-hanging fruit […]”

“The system [the authority files] should be prospective; everything else is an arbitrary lottery.”
Societal consequences
“The purpose of the DBRI is to make researchers manipulate their personal research indicator by producing boring, easily publishable assembly-line research instead of obtaining original and innovative results.”

“[…] In my opinion the state pays for research that high-ranking journals can ‘patent’, and after that the state can pay the publishers to provide access to the very same research […]”
‘Dismissal’
“DBRI is a very precise measure of absolutely nothing and has been invented to the delight of bookkeepers.”

“The indicator damages Danish research and must be discontinued ASAP.”

“The system is useless.”

“I consider DBRI a necessary evil and loyally participate in the work to ensure ‘damage control’ and fairness.”
‘Dismissal’ (continued)
“The assessment of research quality requires peer assessment of research quality. This is the way it is done in the world’s leading research countries. Why has Denmark opted for this route? […] The only answer must be that it is a job-creation scheme for librarians?”

“DBRI is in my opinion probably the most stupid initiative in the modern history of Danish research policy.”

“I believe that DBRI will significantly change the publication tradition with increased slicing and increased self- and friend-citations. […] The only positive thing is that the DBRI might increase job security for research librarians.”
Some conclusions
• No surprises?
• DBRI
• Cautions
• Further research
References
Butler, L. (2003). Modifying publication practices in response to funding formulas. Research Evaluation, 12(1), 39-46. doi: 10.3152/147154403781776780

Danske Universiteter. (n.d.). Universiteternes statistiske beredskab: Personale, universiteterne 2007-2012. København: Danske Universiteter. Retrieved 01/10/2013, from http://www.dkuni.dk/Statistik/Universiteternes-statistiske-beredskab

Gläser, J., Laudel, G., Hinze, S., & Butler, L. (2002). Impact of evaluation-based funding on the production of scientific knowledge: What to worry about, and how to find out. Fraunhofer ISI. Retrieved 09/17/2013.

Sivertsen, G., & Schneider, J. (2012). Evaluering av den bibliometriske forskningsindikator. Oslo: Nordisk institutt for studier av innovasjon, forskning og utdanning. Retrieved 09/25/2013, from fivu.dk/forskning-og-innovation/statistik-og-analyser/den-bibliometriske-forskningsindikator/endelig-rapport-august-2012.pdf

Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117-131. doi: 10.1007/s11192-005-0007-7