
The (Mis)Use Of Statistics In Evaluating ASP Programs


Presentation Transcript


  1. The (Mis)Use Of Statistics In Evaluating ASP Programs O.J. Salinas & Jon McClanahan, University of North Carolina School of Law

  2. Successful Program? Bar Prep Program Negatively Correlated With Bar Passage!

  3. Successful Program? ASP Program Causes Significant Improvements in Grades!

  4. Evaluating ASP Programs

  5. Sampling Basics
     • Population of Interest
     • Measurement
     • Sample(s)
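A minimal Python sketch of this vocabulary (the class size, GPAs, and sample size are made up for illustration, not from the presentation): the population of interest is everyone you want to draw conclusions about, the sample is the subset you actually observe, and the measurement is the statistic computed on that sample.

import numpy as np

rng = np.random.default_rng(42)

# Population of interest: final GPAs for an entire hypothetical 1L class.
population_gpas = rng.normal(3.1, 0.4, 600).clip(0.0, 4.0)

# Sample: the subset of students we actually observe.
sample = rng.choice(population_gpas, size=60, replace=False)

# Measurement: the statistic computed on the sample and used to reason about the population.
print("population mean GPA:", population_gpas.mean().round(3))
print("sample mean GPA:    ", sample.mean().round(3))

With a genuinely random sample the two means come out close; the next slide lists the ways real ASP samples stop being random.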

  6. Introducing Error During Sampling
     • Misidentifying Population of Interest
     • Sampling Bias
     • Self-Selection (Volunteer) Bias
     • Exclusion Bias
     • Using Improper Comparison Groups
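To make the self-selection point concrete, here is a minimal simulation sketch in Python (all numbers are hypothetical, not from the presentation): a bar prep program that genuinely helps can still show a negative raw correlation with bar passage, as in the slide 2 headline, when weaker students are the ones who volunteer.

import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Latent preparedness of each student (e.g., driven by prior grades).
preparedness = rng.normal(0, 1, n)

# Self-selection: weaker students are far more likely to volunteer for the program.
enrolls = rng.random(n) < 1 / (1 + np.exp(2 * preparedness))

# The program genuinely helps: it adds a positive boost to each enrollee's passage odds.
passes = rng.random(n) < 1 / (1 + np.exp(-(preparedness + 0.5 * enrolls)))

# The raw correlation is still negative, because enrollees started out weaker.
print("corr(enrolls, passes):", np.corrcoef(enrolls.astype(float), passes.astype(float))[0, 1])
print("pass rate, enrolled:    ", passes[enrolls].mean())
print("pass rate, not enrolled:", passes[~enrolls].mean())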

  7. Identifying Outcome Measures
     • Objective Measures
       • Performance in Individual Courses
       • GPA / Change in GPA
       • Performance on the Bar Examination
     • Subjective Measures
       • Program-specific Evaluation Forms
       • School-wide Evaluation Forms

  8. Introducing Error During Measurement
     • Objective Measures
       • Instrument Bias
       • Confounding Variables
     • Subjective Measures
       • “Experiential” Bias
       • Loaded and Compound Questions
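A short sketch of the confounding-variable problem, again with hypothetical numbers: if weaker-credentialed students are routed into an ASP program, a naive participants-versus-nonparticipants GPA comparison mixes the program's effect with the credential gap. Adjusting for the confounder, here with an ordinary least-squares regression, recovers an estimate near the true effect built into the simulation.

import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Incoming credentials (standardized index, e.g., an LSAT/UGPA composite).
credentials = rng.normal(0, 1, n)

# The school tends to refer weaker-credentialed students into the program.
in_program = (credentials + rng.normal(0, 0.5, n)) < 0

# First-year GPA depends on credentials plus a modest true program effect.
true_effect = 0.15
gpa = 3.0 + 0.3 * credentials + true_effect * in_program + rng.normal(0, 0.2, n)

# Naive comparison: confounded by credentials, the program looks harmful.
print("naive gap:", gpa[in_program].mean() - gpa[~in_program].mean())

# Adjusted estimate: regress GPA on program status AND credentials.
X = np.column_stack([np.ones(n), in_program, credentials])
coef, *_ = np.linalg.lstsq(X, gpa, rcond=None)
print("adjusted program effect:", coef[1])  # close to the true +0.15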

  9. (Some) Study Design Best Practices
     • Take stock before implementation.
     • Proceed step-by-step in making program changes.
     • Take steps to reduce sampling bias, through outreach and alternatives.
     • Quantify and capture related data.
     • Evaluate along several measures.
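One way to combine "take stock before implementation" with a proper comparison group is to compare changes rather than levels. The difference-in-differences sketch below uses hypothetical cohort GPAs (not data from the presentation): the targeted group's improvement is reported net of whatever shift the comparison group experienced over the same period.

import numpy as np

rng = np.random.default_rng(7)

# Hypothetical mean 1L GPAs: one cohort before the program launched and one after,
# each split into students the program targets and students it does not.
pre_target      = rng.normal(2.70, 0.30, 80)
pre_comparison  = rng.normal(3.10, 0.30, 120)
post_target     = rng.normal(2.85, 0.30, 80)
post_comparison = rng.normal(3.12, 0.30, 120)

# Raw post-minus-pre change in the targeted group alone can mislead
# if grading or the applicant pool shifted for everyone.
change_target     = post_target.mean() - pre_target.mean()
change_comparison = post_comparison.mean() - pre_comparison.mean()

# Difference-in-differences: the targeted group's change net of the background change.
print("change, targeted group:   ", round(change_target, 3))
print("change, comparison group: ", round(change_comparison, 3))
print("difference-in-differences:", round(change_target - change_comparison, 3))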

  10. Avoiding Pitfalls in Analyzing Data and Communicating Results
     • Provide Context and Baselines.
     • Acknowledge Limitations in Experiment Design and Data Analysis.
     • Distinguish Between Profiling and Predicting.
     • Avoid Overgeneralizations.
     • Remember: Not Everything is Quantifiable.

  11. Thank You! O.J. Salinas (osalinas@email.unc.edu) & Jon McClanahan (jonmc@email.unc.edu)
