The Networked Education Database


Presentation Transcript


  1. The Networked Education Database
  Matthew Pittinsky, Ph.D. Candidate, Teachers College, Columbia University

  2. NED: A Vision
  • Schools have long invested in student administrative systems.
  • Schools are now adopting eLearning systems equipped with gradebooks, class rosters, and Web-based survey (assessment) tools.
  • Both are Internet-enabled.
  • Could “generic” and custom data be collected through school systems automatically and anonymously, massively reducing the cost and complexity of educational research?

  3. NED The Problem…

  4. Original Data Collection
  • Requires precious classroom time.
  • Informed consent is difficult to secure.
  • Customizing instruments for context across sites and time is costly and discourages certain types of data collection (e.g. sociometric).
  • Data entry and coding inhibit sharing and re-use.
  • Incomplete responses undermine results and inhibit certain types of data collection (e.g. full classrooms).

  5. Major Secondary Datasets
  • International studies (e.g. TIMSS).
  • Federal studies (e.g. NELS, HSB).
  • State data warehouses (e.g. the Florida K-20 Education Data Warehouse (EDW)).
  • Sponsored private studies (e.g. AddHealth).

  6. Secondary Datasets: Issues
  • Require tough trade-offs when operationalizing specific research questions (e.g. classmate-effect studies).
  • Often based on stratified samples, not whole classrooms and schools (e.g. same-teacher class periods).
  • Rarely longitudinal within academic years.
  • Rarely contextualized (e.g. relationship questions that require a roster).

  7. NED The Solution…

  8. NED: A Dataset
  • Classroom-level data.
  • Same-teacher data.
  • Sociometric & social-psychological data.
  • Longitudinal data.
  • Multi-site data.
  • At scale…

  9. NED: A Data Collection Model
  • Asynchronous (outside class time).
  • Automatic (pre-scheduled; handles adds/drops).
  • Contextual (draws on system data to generate questions; see the sketch after this list).
  • Non-duplicative (uses already stored or entered data where possible).
  • Complete (form checks).
  • Anonymous (unique ID).
  • Efficient (paperless and coded).
  • Sustainable (self-perpetuating).
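
  As a rough illustration of the “contextual” property, the minimal Java sketch below phrases a sociometric question from a class roster. The roster, names, and question wording are hypothetical, not the pilot’s actual implementation.

     import java.util.List;

     // Minimal sketch: phrase a sociometric question from the class roster.
     // Roster contents and question text are hypothetical.
     public class ContextualQuestion {
         public static void main(String[] args) {
             List<String> roster = List.of("Student A", "Student B", "Student C");
             String respondent = "Student B";
             System.out.println("Which of these classmates do you work with most often?");
             for (String classmate : roster) {
                 if (!classmate.equals(respondent)) { // exclude self-reference
                     System.out.println("  [ ] " + classmate);
                 }
             }
         }
     }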

  10. How NED Works
  • School installs the NED extension and marks participating classes.
  • The school’s eLearning system automatically posts the survey on a schedule (with an announcement).
  • Participant provides consent.
  • Survey is delivered through the eLearning system GUI.
  • Survey draws on class context (roster, subject matter, student information, etc.) when phrasing customized questions.
  • Survey enforces certain completion rules.
  • Survey responses are stored in special encrypted tables that self-delete after posting.
  • Survey responses and pre-existing data (demographic, gradebook) are packaged and securely posted to NED (see the packaging sketch after this list).
  • New students are “caught up” when added to a course.
  • Teachers know how many students have completed, but not who.
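
  To make the packaging step concrete, here is a minimal sketch of merging newly collected survey responses with pre-existing system data under one anonymous participant ID. The field names and flat key-value layout are assumptions for illustration; the pilot’s actual feed format is not shown here.

     import java.util.LinkedHashMap;
     import java.util.Map;

     // Hypothetical sketch of the packaging step: survey responses and
     // pre-existing system data are merged under the same anonymous ID
     // before being posted to NED. All field names are illustrative.
     public class FeedPackager {
         public static void main(String[] args) {
             String participantId = "S017-C042-9f3a1c2b4d6e";

             Map<String, String> packaged = new LinkedHashMap<>();
             packaged.put("participant", participantId);
             // Pre-existing data NED reuses instead of re-asking (non-duplicative).
             packaged.put("demographic.grade", "10");
             packaged.put("gradebook.avg", "87.5");
             // Newly collected survey responses.
             packaged.put("survey.q1", "4");
             packaged.put("survey.q2", "2");

             packaged.forEach((k, v) -> System.out.println(k + "=" + v));
             // In the pilot this package was posted over a secure protocol;
             // the transport itself is out of scope for this sketch.
         }
     }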

  11. Scaling NED

  12. Participant Anonymity
  • Data arrives at NED as secondary data (anonymous and coded).
  • Survey responses are tagged with unique participant IDs.
  • Generic data (gradebook, demographic, course overlap) are tagged with the same ID and automatically merged with survey responses.
  • The structure of the unique participant ID allows sorting by class and school; however, data arrives at NED without any knowledge of the participant’s school or classroom identity (one possible ID construction is sketched below).
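
  The deck does not specify the ID structure. As one plausible construction, the sketch below builds an ID whose school and class segments are sortable opaque codes while the student segment is a one-way hash salted with a per-site secret, so the receiving end can group records without recovering identities. The layout, salting, and truncation are all assumptions.

     import java.nio.charset.StandardCharsets;
     import java.security.MessageDigest;
     import java.security.NoSuchAlgorithmException;

     // Hypothetical structured participant ID: opaque school and class codes
     // support sorting/grouping, while the student segment is a salted
     // one-way hash, so NED never receives the student's real identity.
     public class ParticipantId {
         static String build(String schoolCode, String classCode,
                             String localStudentId, String siteSecret)
                 throws NoSuchAlgorithmException {
             MessageDigest sha = MessageDigest.getInstance("SHA-256");
             byte[] digest = sha.digest(
                     (siteSecret + ":" + localStudentId).getBytes(StandardCharsets.UTF_8));
             StringBuilder hex = new StringBuilder();
             for (byte b : digest) hex.append(String.format("%02x", b));
             // A short prefix of the hash suffices for grouping records.
             return schoolCode + "-" + classCode + "-" + hex.substring(0, 12);
         }

         public static void main(String[] args) throws NoSuchAlgorithmException {
             System.out.println(build("S017", "C042", "jdoe123", "per-site-secret"));
         }
     }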

  13. Participant Confidentiality
  • Survey responses are encrypted and automatically self-delete on the local server (see the sketch below).
  • No school official has access to student responses or completion status.
  • Data transmission is via a secure protocol.
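
  A minimal sketch of the encrypt-then-purge idea follows, assuming AES-GCM; the pilot’s actual algorithm and storage schema are not specified in the deck.

     import javax.crypto.Cipher;
     import javax.crypto.KeyGenerator;
     import javax.crypto.SecretKey;
     import javax.crypto.spec.GCMParameterSpec;
     import java.nio.charset.StandardCharsets;
     import java.security.SecureRandom;
     import java.util.Base64;

     // Minimal sketch: hold responses only in encrypted form, then purge
     // the local copy once the feed has been posted. Algorithm choice
     // (AES-GCM) and flow are assumptions, not the pilot's implementation.
     public class EncryptedResponseStore {
         public static void main(String[] args) throws Exception {
             SecretKey key = KeyGenerator.getInstance("AES").generateKey();
             byte[] iv = new byte[12];
             new SecureRandom().nextBytes(iv);

             Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
             cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
             byte[] sealed = cipher.doFinal("Q1=4;Q2=Student C".getBytes(StandardCharsets.UTF_8));

             // Only ciphertext ever touches the local table; no school
             // official can read responses or completion status from it.
             System.out.println("stored (encrypted): " + Base64.getEncoder().encodeToString(sealed));

             // After a confirmed secure post to NED, the row self-deletes,
             // e.g. DELETE FROM ned_responses WHERE posted = 1 (illustrative SQL).
         }
     }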

  14. NED Data Feed: Example
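
  A hypothetical feed record, assuming the elements named on the surrounding slides (anonymous participant ID, demographic and gradebook data, survey responses); every field name and value below is illustrative, not the pilot’s actual format:

     participant_id:  S017-C042-9f3a1c2b4d6e
     demographic:     grade=10; sex=F
     gradebook:       avg=87.5; assignments_complete=14/15
     survey:          Q1=4; Q2=2; Q3=S017-C042-77aa01b2c3d4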

  15. NED The Pilot…

  16. NED Pilot
  • Custom extension to the Blackboard Learning System.
  • Solicited 15 sites; 5 agreed; 3 ultimately participated.
  • All secondary schools (2 private, 1 public) in 3 different states and regions.
  • 18 teachers, 37 classes, 732 participants.
  • Three pre-scheduled NED survey administrations (October, January, May), each “live” for two weeks.
  • NED staff know site names, but not the names of participating schools (if the site is a district), teachers, or classes.
  • Students were given an incentive to participate.
  • Surveys included questions drawn from other datasets so responses can be compared.
  • Approximately 250 development hours.

  17. NED Status
  • First administration launched October 16 and ends today (October 30).
  • Several showstopper technical issues were identified and resolved.
  • One site dropped out.
  • Participation rates and incomplete responses will be assessed November 1.

  18.–25. NED Pilot: Walkthrough

  26. Implementation Issues
  • Not a standard “building block”; required custom coding.
  • Bb installations vary, affecting custom code.
  • Not Bb’s standard survey tool.
  • Survey formatting is limited.
  • Save-and-start, adaptive, and timing features are limited.
  • Low ease of use (e.g. self-reference is not grayed out in sociometric questions; matrix questions scroll off screen without freezing the roster).
  • Gradebook entries are user-defined, without a standard taxonomy.
  • Many schools create one mega-site for all class periods.
  • Pilot leans away from core subjects.
  • Pilot leans away from same-teacher course sections.
  • Many schools do not use Bb as their gradebook or student profile of record.
  • Relying on teacher responses for student-level data is not always viable (e.g. mixed age-grade classes).

  27. Implementation Issues (continued)
  • Required an “enterprise license” of Blackboard.
  • Data transmission via local SQL scripts, not a Web service.
  • The IP address of a sending site could allow matching of a school name with the unique ID schema.
  • Different participant IDs across classes (if a student changed class periods).
  • System reports are fragmented and unusable without additional programming.
  • Total eligible population is not included in system reports.
  • Ideal survey length is difficult to assess.
  • Concerns about anonymity and class-time impact arose during site solicitation.
  • Will students participate?

  28. Future Directions
  • Implement through standardized APIs and via the eLearning system’s own survey tool (a transport sketch follows this list).
  • Pilot with a larger number of sites.
  • Pilot with smaller, more frequent surveys.
  • Pilot with full site participation across all classes and grades.
  • Pilot with a full age-grade population over time.
  • Include non-eLearning systems (e.g. TPR) and non-Bb eLearning systems.
  • Formalize a vendor NED interface program.
  • Expand to higher education.
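
  As a sketch of the standardized-API direction, a site could push its packaged, anonymized feed over HTTPS instead of running local SQL scripts. The endpoint URL and payload below are placeholders, not a real NED interface.

     import java.net.URI;
     import java.net.http.HttpClient;
     import java.net.http.HttpRequest;
     import java.net.http.HttpResponse;

     // Sketch of the web-service direction: post the packaged, anonymized
     // feed over HTTPS rather than running local SQL scripts. The endpoint
     // and payload are placeholders; the call fails until pointed at a
     // real NED endpoint.
     public class FeedPoster {
         public static void main(String[] args) throws Exception {
             String payload = "participant=S017-C042-9f3a1c2b4d6e&survey.q1=4";
             HttpRequest request = HttpRequest.newBuilder()
                     .uri(URI.create("https://ned.example.org/feeds")) // placeholder URL
                     .header("Content-Type", "application/x-www-form-urlencoded")
                     .POST(HttpRequest.BodyPublishers.ofString(payload))
                     .build();
             HttpResponse<String> response = HttpClient.newHttpClient()
                     .send(request, HttpResponse.BodyHandlers.ofString());
             System.out.println("NED responded: " + response.statusCode());
         }
     }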

  29. NED: Imagine
  • A national dataset.
  • Fed from tens of thousands of sites.
  • Collecting unique classroom-level data.
  • Throughout the academic year and a student’s educational career.
  • With minimal site-specific maintenance.
  • Efficiently and cost-effectively.

  30. NED Team
  • Technical
     • Tim Streightiff, lead developer
     • Linda Merryman, project director
     • Basheer Azizi, database engineer
  • Functional
     • Matthew Pittinsky, principal investigator
     • Gary Natriello, principal investigator

  31. The Networked Education Database
  A Joint Research Project
  Contact:
  • Matthew Pittinsky: mp2055@columbia.edu
  • Gary Natriello: gjn6@columbia.edu
  http://edlab.tc.columbia.edu/index.php?q=node/904
  www.blackboard.com
