Guide to the Smarter Balanced IT Architecture
Connecticut Assessment Forum, August 14, 2012
Agenda
• Smarter Balanced Update
• IT Architecture Project Overview
• Defining the Architecture – Period A
• Governance and Implementation – Period B
• Architecture Summary Report
• Technical Standards for Interoperability
• Q & A
Overview of Smarter Balanced Architecture Initiative
• The Approach: This effort is larger than any single company's agenda; it requires an open and collaborative approach.
• The Team: We have assembled a collaborative with the thought leadership (industry, academia, industry associations, and individual contributors) to ensure the vision is grounded in assessment and education best practices, and with the technical capability to deliver.
• The Process: Experienced and agile; sustainability via a community approach.
• The Goals: Open, flexible, standards-based, and componentized.
Assessment Lifecycle

Content Development
• Planning & blueprinting
• Item types
• Content development & universal design
• Learning standard alignment
• Content and data reviews
• Test form construction
• Field testing
• Item banking & statistics
• Content exchange / interoperability

Pre-Test Administration
• Administration planning & scheduling
• Registration, assignment
• Form sampling
• Online infrastructure readiness assessment
• Pre-session planning (paper / online) & setup
• Alternate form assignment

Test Administration
• Test form delivery
• Platform (paper, online, mobile) presentation
• Item content & tools
• Adaptive testing
• Response collection
• Proctoring controls
• Form content security
• Desktop security
• Accessibility
• Testing anomalies

Scoring
• Computer scoring
• Professional scoring
• Algorithmic (AI) scoring
• Portfolio scoring
• Sub test / strand scoring
• Attemptedness
• Performance levels
• Scaling / norming
• Growth scores
• Range finding

Post-Test Administration
• Psychometric analysis
• Equating
• Score tables - scaling, norming
• Performance levels / cut scores
• Field test analysis
• Aligning results with curriculum / instruction
• Program and teacher effectiveness

Reporting
• Individual reporting
• Diagnostic reporting
• Informing & personalizing instruction
• Performance on standards
• Dashboard / summary reporting
• Aggregation / disaggregation
• Exchanging results / data
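The "Adaptive testing" step above can be illustrated with a minimal sketch: a computer-adaptive engine typically picks the next item that is most informative at the examinee's current ability estimate. This is a generic illustration under a simple 1PL (Rasch) model, not Smarter Balanced's actual selection algorithm; the item bank, function names, and difficulties are hypothetical.

```python
import math

def item_information(theta, b):
    """Fisher information of a 1PL (Rasch) item with difficulty b at ability theta."""
    p = 1.0 / (1.0 + math.exp(-(theta - b)))  # probability of a correct response
    return p * (1.0 - p)

def pick_next_item(theta, bank, administered):
    """Pick the unadministered item difficulty with the highest information at theta."""
    candidates = [b for b in bank if b not in administered]
    return max(candidates, key=lambda b: item_information(theta, b))

# Hypothetical item bank, keyed by difficulty.
bank = [-2.0, -1.0, 0.0, 1.0, 2.0]
print(pick_next_item(0.3, bank, administered={0.0}))  # → 1.0
```

Under the 1PL model, information peaks when item difficulty matches ability, so the engine selects the remaining item closest to the current ability estimate (here, 1.0 for an ability of 0.3 once the 0.0 item has been used).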
Project Approach

Analysis (weeks 0-3)
• Drivers and Tradeoffs
• Goals and Capabilities
• User Personas
• Scenarios

Envisioning and Decomposition (weeks 3-7)
• Scenario Mapping
• Conceptual Models
• Logical Models
• Technology Models

Validation and Documentation (weeks 7-10)
• Consolidate Library
• Validate Architecture
• Detailed Architecture Requirements

Governance and OSS Strategy (weeks 10-12)
• Architecture Governance Model
• Define Architecture Review Process
• Define OSS Management Process

Execution (Jan 2012 -> ongoing)
• Execute Architecture Governance Model
• Assist Vendor Selection and Oversight
• Monitor OSS Development Activities
• Coordinate QA Activities
Process
• Workshops
• Showcases
• Reviews
• Final Deliverables
Governance Structure
• Executive Committee
• Architecture Review Board
• Architecture Core Team
• Extended Architecture Team
Architecture Overview
Architecture Summary Report available at: http://www.smarterbalanced.org/smarter-balanced-assessments/technology/
Title: Assessment System Architecture and Technology Phase 1 Report
Standards Groups
• CEDS (Common Education Data Standards)
• IMS Global Learning Consortium
• SIF Association (SIFA)
• AIF (Assessment Interoperability Framework)
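As a concrete example of what these interoperability standards enable, IMS Global's QTI format expresses assessment items as XML that any conformant delivery system can consume. The snippet below is a minimal, hypothetical QTI-style item (identifier, title, and prompt are invented) parsed with Python's standard library; it is a sketch of the general idea, not a complete or validated QTI document.

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified QTI 2.1-style item (illustration only).
QTI_ITEM = """
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
                identifier="item-001" title="Sample Math Item"
                adaptive="false" timeDependent="false">
  <itemBody>
    <p>What is 7 x 6?</p>
  </itemBody>
</assessmentItem>
"""

NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p1"}

root = ET.fromstring(QTI_ITEM)
print(root.get("identifier"))               # → item-001
prompt = root.find("qti:itemBody/qti:p", NS)
print(prompt.text)
```

Because the item's structure and metadata live in a shared, namespaced schema rather than a vendor-specific format, content authored in one system can be banked, exchanged, and delivered by another — which is the point of the standards groups listed above.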