Knots, Nooses, and Loops What to Do With This Mess
Timeline of Assessment • January 2001: “Student Assessment of Learning in the College Introductory and Intermediate Writing Course” • Student self-assessment of their learning • 2001 assessment of the eight new learning outcomes • 2002: “A Student Assessment of Their Improvement in Writing Ability in English 1010 and 2010” • Follow-up on peer review groups, the lowest-rated learning goal in the January 2001 study • 2006: English 2010 pilot assessment of student writing • First assessment to look directly at student writing • Fall 2007: Assessment of professional writing, English 2100 • 2007: Follow-up on the English 2010 assessment • Follow-up on citation, the lowest-scoring trait in the 2006 assessment • Spring 2007: “Assessment Report: English 1010” • Holistic blind reading of student work • Spring 2008: “Genre Translation: A Descriptive Assessment” • Spring 2008: “Increasing Student Retention: An Ethnographic Study of English 1010” • First assessment to look at broader issues affecting student success and retention • Audience assessment • Spring 2008: “Manifestations of Anxiety in the Writing Classroom”
A Focus On English 1010 • English 1010: Front Door Course • Bookend Assessments in the Midst of Change • Spring 2007: “Assessment Report: English 1010” • Spring 2008: “Genre Translation: A Descriptive Assessment” • Spring 2008: “Increasing Student Retention: An Ethnographic Study of English 1010”
English 1010 Outcome Goals • Rhetorical Strategies, including adapting to differences in purpose, audience, and genre • Critical Thinking Processes, including summary, analysis, synthesis, and argumentation • Composing Processes, such as invention, drafting, revision, editing, peer feedback, and self-assessment • Conventions of Writing, especially the conventions of citing multiple texts and incorporating them into one’s own writing
Traits • The student writer takes a point of view on an issue. • The student writer uses and documents sources appropriately for the writing task. • The student writer can analyze effectively, using appropriate evidence. • The student writer writes persuasively. • The student writer, in written self-assessment, effectively illustrates how they adapt their writing to address particular situations.
Finding a Total Score Across Traits
What Have We Gained From This Assessment? • Collegial Conversations • Transparency • A Clearer Picture of the Course • Good Questions to Ask
What To Do With This Mess? • What does the data really tell us? • What about the assessment tool and the context of the readings? • What have we learned about our courses and our students as a result of this assessment? • What actions will we take based on this assessment?
Increasing Student Retention: An Ethnographic Study of English 1010 • Nationwide groundswell of activity focused on student retention and engagement • CCSSE (Community College Survey of Student Engagement) • SENSE (Survey of Entering Student Engagement), part of the Starting Right initiative; preliminary results of the pilot program were published in 2007 • Survey of students at 22 colleges, mostly in Texas, conducted during the fourth and fifth weeks of the semester • Understanding What Happens at the Front Door • English 1010: • A major front door course at SLCC • A focus on a particular discipline
Methodology • Methodology of Our Study: • Conduct an audience assessment of our “readers” • Employ triangulation of multiple data sources • Pedagogical Practices: our study is NOT an attempt to quantify whether certain pedagogical practices are effective; rather, it examines how students “take up” established writing practices and how we as teachers might adjust the implementation of those practices to improve engagement and retention.
Data Collection • Class surveys (10 sections, 271 respondents) • General info from IR • Two student focus groups (21 participants) • Instructor focus group (4 participants) • Instructor Email Survey (16 respondents) • Phone Calls (over 20 attempted, 5 completed)
At a Glance: Statistical Data • 1,893 students were enrolled in English 1010 in spring 2008 • 61.2% of students received A’s or B’s • 10.8% of students received C’s • 3.5% of students received D’s • 16.1% of students received E’s • 0.3% of students received I’s • 7.8% of students withdrew • In other words, 27.7% of students (approximately 525) did not successfully complete English 1010 • 59% of withdrawals took place in weeks 4-8 of the semester
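As a quick sanity check, the 27.7% non-completion figure above follows from adding the D, E, and I grades to the withdrawals. A minimal Python sketch, using only the rounded percentages reported on the slide (exact head counts are not given, so the student estimate is approximate):

```python
# Percentages of English 1010 students (spring 2008) who did not
# successfully complete the course, as reported on the slide.
enrolled = 1893
unsuccessful_pct = {"D": 3.5, "E": 16.1, "I": 0.3, "W": 7.8}

# Sum the unsuccessful-outcome percentages and estimate a head count.
total_pct = sum(unsuccessful_pct.values())      # about 27.7%
est_students = enrolled * total_pct / 100       # roughly 524-525 students

print(f"{total_pct:.1f}% non-completion, about {est_students:.0f} students")
```

Because each percentage is rounded to one decimal place, the head-count estimate lands within a student or two of the slide's "approximately 525."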
At a Glance: Student Survey • 52% of students were first-year students • 17% of students had been at SLCC 5 or more semesters • 60% of students graduated from high school in 2004 or earlier • 18% of students graduated from high school in 2007 • 55% of students work 31 or more hours per week • 29% of students work between 11 and 30 hours per week • 58.7% of students were part-time students • 41.3% of students were full-time students • 35% of students spent 3 or fewer hours per week doing homework for English 1010 • 52% of students spent between 4 and 6 hours per week doing homework
At a Glance: Student Focus Groups • Students report that time management is essential for success in college and in English 1010 • Students report that English 1010 is a time-consuming class, more so than they originally expected it to be • Students report that attending class every day is essential for success in English 1010 • Students report that the primary reasons for dropping English 1010 include overloaded schedules and falling behind in the course work • Students report that it is important to feel connected to the class: to the teacher, the other students, and the topics they are writing about
What Can We Do as Instructors? • Stop flipping a broken switch and expecting the light to come on • “They don’t seem ready for college work and college expectations. No late work [policy] surprises them. Having to read the text surprises them” (SLCC instructor). • “I think students are lazier these days. Some don’t even spell check” (SLCC instructor). • Why do we find ourselves surprised each semester when our new students do not meet our expectations?
What Can We Do as Instructors? • Instructor training should focus on *realistic* audience assessment: • Two themes from our study seem hell-bent on a collision. Teachers who persist in believing the next batch of students will be, or should be, prepared collide on the first day with students who think they are prepared, based on a misperception of college, of time, and of what it means to write. The remnants of this collision will always be frustration, confusion, and resentment. • So, why are we continually surprised? Maybe we get focused on the “I,” the teacher, and do not make the move to the “you,” the reader: the students who read our syllabi, who listen to our lectures, and who read the comments we make on their papers.
What Have We Gained From This Assessment? • The value of listening to student voices • A clearer picture of our audience • A clearer picture of how students “take up” our course • Collegial conversations • More good questions to ask
What to Do With This Mess? • So, how does all this data affect our curricular design? • Shifting course leaders and textbooks • Programmatic issues: uniformity vs. freedom • Adjunct issues • Bridging the courses • Institutional pressures