
Altmetrics - How do I rate thee? Let me count the tweets!

Mike Taylor, Research Specialist, http://orcid.org/0000-0002-8534-5985



Presentation Transcript


  1. Altmetrics - How do I rate thee? Let me count the tweets! Mike Taylor Research Specialist http://orcid.org/0000-0002-8534-5985

  2. Mike Taylor of Elsevier Labs discusses altmetrics: how it interacts with open access, social impact and ORCID, and why it has a long way to go before it becomes truly significant

  3. Elsevier Labs • “Researchers” and “developers”, i.e., experimental architects • Researchers’ specialities include text-mining, NLP, semantics, ontologies, etc.

  4. Elsevier Labs (and me) • Research Specialist • My projects are: altmetrics, contributorship, networks, identity / Orcid, author profiles • Not an academic (although…) • Not entirely commercially focused

  5. In relation to altmetrics: • Work with and support academic researchers • Support the movement towards an integrated model of references / citations / mentions, including bibliometrics • Support and encourage innovation in this area in Elsevier • Publish data and findings • Develop position as thought leader

  6. Previous credits include: • ORCID • Collaborate on technical architecture • Previously working on similar project with EU university • End of year 1 – great success • www.orcid.org/statistics • www.orcidlive.org

  7. Why bother with ORCID? • Disambiguation is a growing problem • Importance of the personal • Permanent labels are good things: DOIs, ISSNs, ISBNs, ORCID iDs • 430,000 iDs minted • 102,000 contain ‘works’
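One practical reason permanent identifiers are easy to work with: an ORCID iD carries an ISO 7064 MOD 11-2 check digit, so obviously malformed iDs can be rejected before any lookup. A minimal Python sketch, using the speaker's iD from the title slide as the test case:

```python
def orcid_check_digit(base_digits):
    """Compute the ISO 7064 MOD 11-2 check character for the
    first 15 digits of an ORCID iD."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid):
    """Validate a full ORCID iD such as '0000-0002-8534-5985'."""
    digits = orcid.replace("-", "").upper()
    if len(digits) != 16 or not digits[:15].isdigit():
        return False
    return orcid_check_digit(digits[:15]) == digits[15]

print(is_valid_orcid("0000-0002-8534-5985"))  # True: the iD on the title slide
```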

  8. The problem • Disambiguation technologies are reliant on good metadata, and are geared towards western/northern names • Poor metadata / Asian names are difficult • E.g., 3 Korean family names cover >50% of the population • Compare with the US, where it takes several thousand names, “Spangler” being the tipping point
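To make the coverage point concrete, a hedged sketch of the arithmetic involved; the share figures below are illustrative placeholders, not census data, and the function simply counts how many of the most common family names are needed to reach a target share of the population:

```python
def names_to_cover(shares, target=0.5):
    """Count how many of the most common family names are needed
    to cover `target` share of a population."""
    covered, count = 0.0, 0
    for share in sorted(shares, reverse=True):
        covered += share
        count += 1
        if covered >= target:
            return count
    return None  # the listed names never reach the target share

# Illustrative shares only, not census figures.
top_heavy = [0.22, 0.15, 0.09, 0.06, 0.04]  # a few names dominate
long_tail = [0.008] * 60                    # no name dominates

print(names_to_cover(top_heavy))  # 4 with these made-up numbers
print(names_to_cover(long_tail))  # None: even 60 names fall short
```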

  9. Elsevier infrastructure • #1: Scopus > ORCID integration (free to use, no need to have a Scopus account) • Scopus display uses the ORCID API • Editorial submission system • Integration into metadata hub • Searchable field on ScienceDirect
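As a hedged sketch of the sort of lookup such integrations rest on (an assumption about today's ORCID public API v3.0, which postdates this talk, not the integration Elsevier built), here is a call with the requests library; v3.0 returns work summaries grouped by shared identifier under "group":

```python
import requests

def fetch_orcid_works(orcid_id):
    """Fetch the public work summaries attached to an ORCID record."""
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"
    resp = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
    resp.raise_for_status()
    # v3.0 groups works that share an identifier (e.g. the same DOI).
    return resp.json().get("group", [])

works = fetch_orcid_works("0000-0002-8534-5985")  # the iD from the title slide
print(f"{len(works)} work groups on this public record")
```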

  10. Infrastructure / cultural issues • Brazil, Korea, Denmark = excellent metadata • China, Italy, India = dreadful metadata

  11. Infrastructure / cultural issues • Brazil, Korea, Denmark = excellent metadata • China, Italy, India = dreadful metadata • Chinese attitude towards relative reward • Research tools are primarily English language • Poverty of infrastructure… • …which impacts on altmetrics

  12. Altmetrics • The collection of social network data • Term coined in 2010 by Jason Priem on Twitter • http://altmetrics.org/manifesto/ • Ambitions: filtering, hidden impact, replacement for peer review

  13. Altmetrics: a potentially brilliant development with a terrible name • “Alt” to what – not really “metrics” either • The data in altmetrics is … from whatever is available. • Calling it “alt” potentially alienates “metrics” people

  14. Pragmatic and technocratic • E.g., Mendeley is included, Zotero isn’t, Colwiz isn’t • Big old pile of data: Twitter, GitHub, Dryad, Facebook, blogs, usage data (sometimes), re-use data (sometimes) • Is mostly reliant on DOIs (caveats apply) • Collect what you can, how you can • (not the best basis for clear activity)
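Since most of this matching hinges on DOIs, here is a hedged sketch of the kind of normalisation step an aggregator needs before mentions of the same article can be counted together; the regex and the example DOI are illustrative assumptions, and the slide's caveats (missing or mangled DOIs) are exactly what this cannot fix:

```python
import re

# Rough pattern for a DOI: registrant prefix "10.NNNN" plus a suffix.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/\S+", re.IGNORECASE)

def normalise_doi(text):
    """Pull a DOI out of a URL, 'doi:' prefix or plain string, strip
    trailing punctuation and lower-case it, so mentions of the same
    article collapse to one key."""
    match = DOI_PATTERN.search(text)
    if not match:
        return None
    return match.group(0).rstrip(".,;)").lower()

# The same (placeholder) article mentioned three different ways.
mentions = [
    "https://doi.org/10.1234/EXAMPLE.5678",
    "doi:10.1234/example.5678",
    "See 10.1234/example.5678.",
]
print({normalise_doi(m) for m in mentions})  # one normalised DOI
```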

  15. Research! Papers! Start-ups! • Very exciting: Altmetric.com, Plum Analytics, Grow Kudos, impactstory.org • Lots of papers published (though not research-heavy; this is starting to happen) • Several “special issues” • Couple of PhDs in progress • I’ve heard a book might get published • NISO

  16. Research findings • Seems to be a correlation between Mendeley adds and citation rates • There are definitely patterns of things that happen together (“impact flavors” – Piwowar, Priem et al) • There are definitely differences between disciplines • No OA advantage obvious (yet?) • N is too small
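The Mendeley finding is a rank-correlation result; as a hedged sketch of how such a check is typically run, here is a Spearman correlation with scipy on placeholder per-article counts (purely illustrative, not real data):

```python
from scipy.stats import spearmanr

# Placeholder per-article counts, purely illustrative.
mendeley_readers = [12, 3, 45, 0, 7, 22, 5, 31]
citations        = [4, 1, 19, 0, 2, 9, 3, 11]

res = spearmanr(mendeley_readers, citations)
print(f"Spearman rho = {res.correlation:.2f}, p = {res.pvalue:.3f}")
```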

  17. Altmetric.com • Owned by Digital Science / Macmillan • The “donut” • Quick demo of Altmetric.com • Appears on all Scopus articles (with Altmetric.com data) since June 2012 • Being trialled on ScienceDirect • Lots of criticism
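A hedged sketch of how a platform can pull the numbers behind the donut, using Altmetric.com's free per-DOI endpoint; the DOI below is a placeholder, and the field names are taken from the public v1 API as I understand it:

```python
import requests

def altmetric_summary(doi):
    """Fetch Altmetric.com's public summary for a DOI; the API answers
    404 when no attention has been recorded for that article."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=30)
    if resp.status_code == 404:
        return None
    resp.raise_for_status()
    return resp.json()

data = altmetric_summary("10.1234/placeholder-doi")  # substitute a real DOI
if data:
    print(data.get("score"), data.get("cited_by_tweeters_count"))
else:
    print("No recorded attention for this DOI")
```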

  18. Altmetrics data is variable Which is why I favour small, low-judgment buckets of data classes (what does it take to…): • Social activity • Component re-use • Scholarly commentary • Scholarly activity • Mass media
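A hedged sketch of what those low-judgment buckets could look like in code: a plain lookup from mention source to data class, with anything unrecognised left unclassified rather than guessed at. The source names and their assignments are illustrative assumptions, not a published taxonomy:

```python
# Illustrative mapping from mention source to low-judgment data class.
DATA_CLASSES = {
    "twitter": "social activity",
    "facebook": "social activity",
    "github": "component re-use",
    "dryad": "component re-use",
    "blog": "scholarly commentary",
    "f1000": "scholarly commentary",
    "mendeley": "scholarly activity",
    "citeulike": "scholarly activity",
    "newspaper": "mass media",
}

def classify(source):
    """Map a mention's source to one of the five buckets; never guess."""
    return DATA_CLASSES.get(source.lower(), "unclassified")

print(classify("Twitter"))  # social activity
print(classify("Colwiz"))   # unclassified
```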

  19. Nuclear error editorial, Nature

  20. Supervolcano, Nature Earth

  21. Right-handed vs left-handed tail wagging (Current Biology)

  22. Altmetrics – 7 use cases • Prediction of ultimate citation, identifying potentially impactful authors • Measuring / recognizing component re-use / preparatory work, reproducibility • Hidden impact (impact without citation) • Real-time filtering, real-time evaluation • Platform / publisher / institution comparison • Measuring social reach, estimating social impact • Altmetrics is of interest by itself

  23. Data classes vs use cases: guesswork that needs verification and data!

  24. Research project • Are the classes internally viable? • Do they survive disruption by uptake, new contributors, how do we normalize? • Are the classes distinct and discrete? • How (and when and why) do they interact? • (Questions I hope to address in articles over next few years and in PhD)

  25. Wider role – in community • Support open standards • Support researchers with data (etc) • Form links between bibliometricians and altmetricians • Be a generator of ideas • Support special editions, workshops etc

  26. Wider role – in Elsevier • Encourage support for my community work • Champion open standards for metrics (everyone is doing this) • Support product development and outreach • For example:

  27. Things that we can do • Real-time suggestions (allied to much improved search) • Hidden research • Social impact statements for researchers • Re-use indicators to support open data • ‘Evaluation / impact’ network with FundRef, ORCID, DOIs, etc.
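A hedged sketch of the kind of evaluation / impact graph the last bullet points at: nodes for a FundRef funder ID, an ORCID iD and a DOI, with edges for funding and authorship links, built with networkx. All identifiers except the speaker's ORCID iD are placeholders:

```python
import networkx as nx

G = nx.Graph()

# Placeholder identifiers, except the ORCID iD from the title slide.
doi = "doi:10.1234/placeholder"
orcid = "orcid:0000-0002-8534-5985"
funder = "fundref:10.13039/501100000000"  # placeholder FundRef funder ID

G.add_node(doi, kind="article")
G.add_node(orcid, kind="person")
G.add_node(funder, kind="funder")

G.add_edge(orcid, doi, relation="authored")
G.add_edge(funder, doi, relation="funded")

# Everything the article is connected to, one hop out.
print(list(G.neighbors(doi)))
```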
