CSR Quick Feedback Pilot

Presentation Transcript


  1. CSR Quick Feedback Pilot
  Mary Ann Guadagno, PhD
  Senior Scientific Review Officer, CSR Office of the Director

  2. Pilot Objective
  • To collect feedback on CSR peer review via a survey.
  • Evaluate the utility of asking reviewers in chartered study sections about their assessments of the meeting experience:
    • Quality of Prioritization
    • Collective Expertise
    • Assignment of Applications to Reviewers
    • Quality of Discussion

  3. Pilot Scope
  • Two CSR Integrated Review Groups (IRGs):
    • Genes, Genomes, and Genetics (GGG) – Dr. Richard Panniers
    • Brain Disorders and Clinical Neuroscience (BDCN) – Dr. Samuel Edwards
  • 18 CSR study sections (January–March 2014)
  • Very short questionnaire: 4 agreement statements plus 1 open-ended comment field, answerable in about 5 minutes
  • Delivered via email
  • Completed near the end of the study section meeting

  4. Agreement Statements and Comments – Online
  • S1 – The panel was able to prioritize applications according to their impact/scientific merit.
  • S2 – The roster of reviewers was an appropriate assembly of scientific expertise for the set of applications in the meeting.
  • S3 – Assignment of applications to reviewers made appropriate use of their broad expertise.
  • S4 – The nature of the scientific discussions supported the ability of the panel to evaluate the applications being reviewed.
  • General Comments – In addition to the answers you provided in this questionnaire, please add any other comments in the text box below.

  5. Overall Feedback Was Favorable (n = 248)
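
  The slide above reports only the headline result; the underlying chart (n = 248) is not reproduced in this transcript. As a purely illustrative aside, the sketch below shows one way agreement responses to S1–S4 could be tabulated into percent-agreement figures. It is a minimal sketch under stated assumptions: the response records, the agree/disagree scale labels, and the percent_agree helper are hypothetical, not the pilot's actual data, survey software, or analysis code.

```python
from collections import Counter

STATEMENTS = ["S1", "S2", "S3", "S4"]

# Each record holds one reviewer's answers to statements S1-S4 on a simple
# agree/disagree scale. These records are invented placeholders for
# illustration only; they are not the pilot's actual responses (n = 248).
responses = [
    {"S1": "Agree", "S2": "Agree", "S3": "Disagree", "S4": "Agree"},
    {"S1": "Agree", "S2": "Agree", "S3": "Agree", "S4": "Agree"},
    {"S1": "Disagree", "S2": "Agree", "S3": "Agree", "S4": "Agree"},
]

def percent_agree(records, statement):
    """Percentage of respondents answering 'Agree' to the given statement."""
    counts = Counter(r[statement] for r in records)
    return 100.0 * counts["Agree"] / len(records)

for s in STATEMENTS:
    print(f"{s}: {percent_agree(responses, s):.0f}% agree (n = {len(responses)})")
```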

  6. Verbatim Comments from Reviewers
  • CSR panels are generally high quality.
  • Clear commitment of all reviewers to fairly review applications.
  • Video review once a year is a great idea.
  • Assignments are balanced and appropriate.
  • Differing score calibration by reviewers is a problem.
  • Scoring is uneven among reviewers. Still have score inflation.
  • Should separate overall scientific impact rating from technical merit.
  • IAM was difficult to move back and forth between so many discussions.

  7. What Did We Learn?
  • Identification of reviewer likes and concerns.
  • Some SRGs and some practices received constructive feedback.
  • Strengths and limitations of the methodology.
  • Technical issues – email, survey software, compliance, ease of analysis.
  • Input for future surveys – next steps:
    • Platform evaluation
    • Input from program observers
    • Change over time

  8. Acknowledgements
  • Charles Dumais
  • George Chacko
  • Mei-Ching Chen
  • Paul Kennedy
  • Amanda Manning
  • Adrian Vancea
  • Richard Panniers and GGG SROs
  • Samuel Edwards and BDCN SROs
  • Michael Micklin
