By the People, with the people: User-centered assessment for user-centered crowdsourcing at the Library of Congress
Lauren Algee, By the People Community Manager, Senior Innovation Specialist, Library of Congress Labs (@Crowd_LOC, @algeebraten)

Presentation Transcript


  1. By the People, with the people: User-centered assessment for user-centered crowdsourcing at the Library of Congress Lauren Algee, By the People Community Manager Senior Innovation Specialist, Library of Congress Labs @Crowd_LOC @algeebraten

  2. By the People • Digital Collections • Open source platform - Concordia • Program of engagement • Community of practice. Goals: • Connect all Americans to the Library of Congress • Transcribe and tag pages to improve: • Search • Readability (by individuals and accessibility technologies) • Computationality

  3. By the People by the numbers In 9 months: • 5,200 registered users • 17,000 actions by anonymous users • 37,500 pages transcribed and awaiting review • 18,000 pages complete • 6,200 transcriptions published on loc.gov How can we tell the rest of the story?

  4. Approaching user-centered assessment: Challenges and opportunities • Balance collection and engagement goals: • Engage lifelong learners • Enhance digital collections • Check our assumptions • Merge quantitative and qualitative • “Yes, and…” • Use our strengths • Informed and actionable

  5. Data sources. Qualitative: • User surveys • Twitter • E-Newsletter (GovDelivery) • History Hub (Jive) • Email • Events feedback • Existing research. Quantitative: • User surveys • Twitter • E-Newsletter (GovDelivery) • History Hub (Jive) • Website metrics (Adobe) • Crowdsourcing activity database (Kibana)
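
The quantitative side names Kibana as the front end for the crowdsourcing activity database; Kibana typically sits on top of an Elasticsearch index, so a minimal sketch of the kind of query that could feed a weekly-activity dashboard panel is shown below. The index name, field names, and action value are illustrative assumptions, not the actual Concordia schema.

```python
# Hypothetical sketch: weekly counts of transcription actions from an
# Elasticsearch index of the sort a Kibana dashboard would read from.
# "concordia-activity", "timestamp", "action", and "transcription_saved"
# are assumed names, not the Library's real schema.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local endpoint

query = {
    "size": 0,
    "query": {"term": {"action": "transcription_saved"}},
    "aggs": {
        "per_week": {
            "date_histogram": {"field": "timestamp", "calendar_interval": "week"}
        }
    },
}

response = es.search(index="concordia-activity", body=query)
for bucket in response["aggregations"]["per_week"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])
```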

  6. Review workflow

  7. 3 months in – review bottleneck

  8. 5 months later - completed & awaiting review

  9. “Big improvement”

  10. Platform driven by program. Weekly review rates, Jan – March 2019
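
A rough sketch of how weekly review rates like the ones charted here could be computed from an exported activity log; the CSV export and its column and action names ("timestamp", "action", "review_accepted") are assumptions for illustration, not By the People's actual data model.

```python
# Minimal sketch: weekly counts of review actions, Jan-March 2019, from a
# hypothetical CSV export of the crowdsourcing activity database.
import pandas as pd

activity = pd.read_csv("activity_export.csv", parse_dates=["timestamp"])

# Keep only review actions inside the reporting window.
reviews = activity[activity["action"] == "review_accepted"]
window = reviews[(reviews["timestamp"] >= "2019-01-01") &
                 (reviews["timestamp"] < "2019-04-01")]

# Resample to calendar weeks and count review actions per week.
weekly_review_rate = window.resample("W", on="timestamp").size()
print(weekly_review_rate)
```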

  11. Data quality

  12. Branch Rickey scouting reports data analysis

  13. Volunteer questions/comments about features

  14. From user query to new feature “I did a little digging on the internet to see if I could identify names of battles that I couldn't make out in the letters I transcribed.  I was able to find references to Bermuda Hundred and Chester Station that I'd like to put into a letter I transcribed.  How can I pull that out of review status and make those edits?  The transcription currently has [??] in it for those words.” https://github.com/LibraryOfCongress/concordia/issues/690

  15. Volunteer questions/comments about transcription instructions

  16. Onboarding survey – April 2019 What would help you get started?

  17. Centering Users to Center Collections

  18. Thank you! Even more at crowd.loc.gov & labs.loc.gov
