Situated Tutors Tutorial Sae Schatz MESH Solutions, LLC – a DSCI Company
Schedule • Part 1: Background • Part 2: Theory • Part 3: Technical Details • Part 4: Use Case • Part 5: Recommendations
ITS Effectiveness [Figure: Bloom's (1984) two-sigma graph, plotting number of students against performance for classroom learning vs. an average (human) tutor; the average tutored student performs near the top 2% of the classroom group. Bloom, B. S. (1984).]
The Challenge • Effective for higher-order cognitive skills: One-on-One (Human) Tutoring; Apprentice Learning • Efficient for training declarative/procedural skills: Didactic Lecture to a Group; Computer-Based Training • Currently, higher-order skills training is either effective or efficient, not both
Situated Tutors Intelligent Tutors Instructional Simulations
Static Computer-Based Learning Same for everyone
Intelligent Tutors Different for different students
ITS Effectiveness [Figure: the Bloom (1984) comparison again, with results for several ITSs, including SHERLOCK (1988), Ecolab (1999), and Andes (2005), plotted against the classroom-learning and average-(human)-tutor benchmarks, approaching the top 2%.]
ITS Effectiveness & Efficiency • Typical effectiveness gains of ITSs: 0.48–0.61σ (Dede, 2008); 1.0σ (Lane, 2006); LISP tutor = 48% improvement on posttest (Anderson, 1990)* • Typical efficiency gains of ITSs: one third of the time vs. classroom (Lajoie & Lesgold, 1992); 4σ efficiency gain over traditional CBT (Romero et al., 2006); Air Force electronics tutor for 20 hr = 48 months of OJT (Lesgold et al., 1990); LISP tutor = 30% less time vs. classroom (Anderson, 1990) *same study as above
Pros and Cons Intelligent Tutors Cons • Lacks Intrinsic Feedback • Usually Declarative/Procedural • Usually More Defined Domains • Usually Single-user Intelligent Tutors Pros • Adaptive • Manpower Efficiency • Embedded Pedagogy/Andragogy
Situated Tutors Intelligent Tutors Instructional Simulations
Simulation-Based Training • Instructional simulations include those simulations that employ a systematic instructional methodology (scenario-based training, for our purposes) as well as accurately represent the problem-solving domain (Salas, Bowers, & Rhodenizer, 1998; Oser et al., 1997). • Average performance gains of SBT vs. classroom: 72% fewer errors in practice with SBT (Haque & Srinivasan, 2006, meta-analysis) • Average efficiency gains of SBT: 84% less time vs. traditional instruction (Haque & Srinivasan, 2006, meta-analysis)
But, in practice, SBT often falls short… • Low Efficiency: heavy instructor workload; instructors must be SMEs, instructional designers, and technologists; deployed systems may have no instructional staff (e.g., McCarthy, 2008; Loftin et al., 2004; Smith-Jentsch et al., 1998). • Low Effectiveness: if instructors cannot meet all requirements or cope with the workload involved, suboptimal training may result; this may even produce negative training (e.g., Loftin et al., 2004; NRC, 1985; Houck & Thomas, 1991; Andradóttir et al., 1997; Air Force, 1991; Wray et al., 2004)
Pros and Cons Simulation Cons • One-Size-Fits(?) All • Relies on Instructor for Pedagogy • Relies on Instructor for Sequencing • Heavy Instructor Workload • No Good Instructor = Poor Training Simulation Pros • Good Transfer of Training • Often Supports Team Training • Supports Complex Contexts
Situated Tutors Intelligent Tutors Instructional Simulations
Situated Tutors • INTELLIGENT TUTOR + SIMULATION-BASED LEARNING = SITUATED TUTOR • Situated tutors are a special class of intelligent tutoring systems that combine the features of an intelligent tutor with the scenario-based, situated learning environment of instructional simulations.
Situated Tutors • From the INTELLIGENT TUTOR: automation, adaptation, careful operationalization of the domain, embedded instructional support, extrinsic feedback • From SIMULATION-BASED LEARNING: situated learning context, support for higher-order cognitive skills, less determinate domains, intrinsic feedback, team training • Combined: the SITUATED TUTOR
Situated Tutors Situated tutors are computer-based instructional technologies that, at a minimum, include a simulated learning or training environment of Interactive Multimedia Instruction (IMI) Level 3 or above and instruct with intelligent adaptation. Further, these features are, at least, loosely federated with each other.
Simulation Depth • IMI LEVEL 1, Page turner: does not include any simulation-like features. Example: a basic website, like the Red Cross's Preparing for Events • IMI LEVEL 2, Medium simulation: supports limited interactivity, such as asking and scoring a response to a question. Example: interactive courseware or website scripting, like this quiz from Discovery.com • IMI LEVEL 3, High simulation: surface simulation with 2–3 levels of complex branching. Example: a highly interactive simulation, such as Darfur is Dying, a robust serious game made in Flash • IMI LEVEL 4, Full simulation: rich interactivity and branching, extensive high-fidelity surface simulation capabilities. Example: a video game, such as America's Army • Key question: Does the system offer sufficient psychological fidelity and freedom of action to support the training of higher-order cognitive skills?
Sophistication of Adaptation • ADAPTATION AS PREFERENCE, Learner choice: allows the learner to control the nature of interactions; generally diminishes outcomes. Example: self-sought instruction, like this flash game • ROLE ADAPTATION, Categorical: broad learner-selected categories, such as by MOS, often distinguish the content presented. Example: many training websites, such as the GPRIME medical trainer • MACRO ADAPTATION, Tailored pre-training: individual learner KSAs and traits affect pre-task adaptation. Example: often found in CBT systems; supports sequencing and ATI, e.g., some LMSs • MICRO ADAPTATION, Tailored during-training: tailored intervention is triggered based on during-task actions. Example: found in conventional ITSs; supports immediate feedback, e.g., the PAT intelligent tutor • ACTIVE ADAPTATION, Overall: a combination of effective macro- and micro-adaptations. Example: facilitates immediate feedback and long-range sequencing, e.g., Rosetta Stone • Key question: Does the system support tailored pre-task adaptation (e.g., instructional sequencing) and during-task adaptation (e.g., personalized hinting and feedback)?
Degree of Component Integration • NO FEDERATION, Separated: no data are passed between the simulation and ITS components; generally, an ITS lesson is delivered and then the student is told to use the simulation. Example: Microsoft Flight Simulator training web site • LOOSELY FEDERATED, Side-by-side: the ITS and simulation exchange only outcome data; the two systems are often physically separated. Example: many military situated tutors follow this model; protocols such as IPA, DTECS, and SITA facilitate this integration, e.g., FBCB2/Tactical Decision-Making ITS • TIGHTLY FEDERATED, Full integration: the ITS and simulation components can exchange data constantly; ITS features often "overlay" the simulation. Example: the most sophisticated military systems, such as PORTS TAO ITS; the I/SIS protocol can be used (but is rarely applied) • Key question: Does the system support simultaneous functioning and robust data interchange between the ITS and SBT components?
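The loose vs. tight federation distinction can be illustrated with a minimal sketch. This is not code from any of the named systems or protocols (IPA, DTECS, SITA, I/SIS); every class name, method, and message field below is hypothetical.

```python
class LooselyFederatedITS:
    """Side-by-side model: only end-of-scenario outcome data cross the boundary."""
    def __init__(self):
        self.learner_record = []  # stand-in for a learner model

    def on_scenario_complete(self, outcome):
        # Only summary outcomes arrive, e.g. {"score": 0.72, "errors": 3},
        # so the ITS can adapt only *between* scenarios.
        self.learner_record.append(outcome)


class TightlyFederatedITS(LooselyFederatedITS):
    """Full integration: the ITS sees every in-scenario event as it happens,
    enabling during-task interventions that 'overlay' the simulation."""
    def on_sim_event(self, event):
        self.learner_record.append(event)
        if event.get("error"):
            return {"overlay_hint": "Reconsider your last action."}
        return None  # no intervention needed
```

In the loose model the ITS can adapt only between scenarios; constant data interchange is what makes during-task (micro) adaptation possible in the tight model.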
Find & Classify Situated Tutors Definitions from: Schatz, S., Oakes, C., Folsom-Kovarik, J. T., & Dolletski-Lazar, R. (2012). ITS + SBT: A Review of Operational Situated Tutors. Military Psychology. Detailed literature review of 86 situated tutors
Situated Tutors Effectiveness • ELECT BiLAT (aka VCAT; USC Institute for Creative Technologies): 59% better than video only. Ablative test: 56% of learners in the video-only condition, 67% in the no-coach condition, and 89% in the coach condition were successful (Lane et al., 2008). • ExpertCop (Furtado & Vasconcelos, 2006): 87% improved with simulation + ITS. 87% of police officers explained additional crimes with both the ITS and simulation vs. the simulation alone.
Situated Tutors Efficiency • TAO ITS (Stottler Henke): over 2000% more efficient per class. Previously, one instructor was needed for every two students in a class of 42; now one instructor manages the whole class (Stottler & Panichas, 2006). • IATS (Madni, 2010): 98% cost saving vs. the face-to-face approach. AFRL's IATS reduced costs from $1,172 per seat/year to $28 per seat/year for shipboard maintenance training.
Traditional Intelligent Tutor Domain Model Pedagogical Model Learner Model Learner Data Instructional Methods Domain Content
Situated Tutor Game Engine Domain Model Pedagogical Model Learner Model Learner Data Instructional Methods Domain Content
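As one purely illustrative reading of the situated-tutor diagram (the ITS triad of domain, learner, and pedagogical models coupled to a game engine), the components might be wired together as below. The skill names, mastery-update rule, and thresholds are all assumptions for the sketch, not part of any system described in this tutorial.

```python
class DomainModel:
    """Domain content: the expert's expected action for each situation."""
    def __init__(self, expert_policy):
        self.expert_policy = expert_policy  # maps state -> correct action


class LearnerModel:
    """Learner data: per-skill mastery estimates, updated from observations."""
    def __init__(self):
        self.mastery = {}

    def update(self, skill, correct):
        old = self.mastery.get(skill, 0.5)
        # simple running estimate; a real ITS might use Bayesian knowledge tracing
        self.mastery[skill] = 0.8 * old + 0.2 * (1.0 if correct else 0.0)


class PedagogicalModel:
    """Instructional methods: hint when mastery of the relevant skill is low."""
    def intervene(self, learner, skill):
        return "hint" if learner.mastery.get(skill, 0.5) < 0.4 else None


class SituatedTutor:
    """The game engine streams (state, skill, action) events into the ITS triad."""
    def __init__(self, domain):
        self.domain = domain
        self.learner = LearnerModel()
        self.pedagogy = PedagogicalModel()

    def on_engine_event(self, state, skill, action):
        correct = action == self.domain.expert_policy[state]
        self.learner.update(skill, correct)
        return self.pedagogy.intervene(self.learner, skill)
```

The only structural difference from the traditional ITS on the previous slide is the event source: learner data arrive from the game engine's simulation state rather than from discrete problem-answer pairs.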
Macro-Adaptation: content selection; preset hints/coaching; preset teaching approach; scaffolded challenge level; preset scenario variables. Driven by historic inputs: prior knowledge, general aptitude, constitutional attributes, affective attributes, learner preferences. • Micro-Adaptation: give hints/coaching; change teaching approach; change challenge level; adjust scenario story; give intrinsic feedback. Driven by immediate inputs: current performance, system use/abuse, affective state (e.g., boredom, confusion, delight, flow, and frustration).
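A minimal sketch of that macro/micro split: macro-adaptation runs once before training on historic inputs, while micro-adaptation runs continuously on immediate inputs. The input fields and thresholds below are hypothetical, chosen only to make the split concrete.

```python
def macro_adapt(history):
    """Pre-task: configure content, challenge level, and preset hints
    from historic inputs (e.g., prior knowledge)."""
    novice = history["prior_knowledge"] < 0.3
    return {
        "challenge_level": "novice" if novice else "intermediate",
        "preset_hints": novice,
    }


def micro_adapt(moment):
    """During-task: trigger an intervention from immediate inputs
    (current performance and affective state)."""
    if moment["affect"] == "frustration":
        return "lower_challenge"
    if moment["performance"] < 0.5:
        return "give_hint"
    return None  # learner is coping; do not interrupt
```

In an active-adaptation system (in this deck's terminology), both functions run: macro-adaptation sets the scenario up, and micro-adaptation steers it while it plays out.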
ITS Learner Model Varieties (spanning a spectrum from less detailed to more detailed) • Overlay Models: ignore details of how students learn and instead track what students have learned in a simple way, similar to a checklist. Moderately effective for macro-adaptation. • Classifier Models (Bayesian Networks, Dynamic Bayesian Networks, Decision Trees, Neural Networks, etc.): Bayesian networks and other classifiers are efficient but have lower detail than some other model types. • Constraint-Based Models: monitor the immediate problem state; as long as a learner never reaches a state that the model identifies as wrong, he or she may perform any action. Highly effective. • Example-Tracing Models (or "Pseudotutors"): authors define incorrect responses for single questions and are less concerned with cognitive theories, hence example-tracing systems were called pseudo-intelligent tutors or pseudotutors. • Perturbation Models (or "Buggy Models"): try to describe all the incorrect knowledge the learner may have. Can require extensive investment and have mixed results. • Model-Tracing Systems (Production Rule Models, ACT-R Models, or "Cognitive Tutors"): use a model-tracing algorithm (i.e., rules drawn from a general model of human cognition), hence called "cognitive tutors." Slow to develop, but have high returns. • Others: Behavior Transition Networks, Case Libraries, Finite-State Automata, and more.
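The "checklist" character of an overlay model, the simplest variety on this spectrum, can be shown in a few lines; the skill names below are hypothetical.

```python
class OverlayModel:
    """Models the learner as a subset of the expert's skill set:
    no theory of how learning happens, just a checklist of what has
    been demonstrated so far."""
    def __init__(self, expert_skills):
        self.expert_skills = set(expert_skills)
        self.demonstrated = set()

    def observe(self, skill):
        """Check off a skill the learner has just demonstrated."""
        if skill in self.expert_skills:
            self.demonstrated.add(skill)

    def gaps(self):
        """Skills the expert has that the learner has not yet shown."""
        return self.expert_skills - self.demonstrated
```

The `gaps()` set is exactly the kind of coarse, whole-learner summary that suits macro-adaptation (e.g., selecting the next scenario) rather than fine-grained during-task intervention.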
Process • Review of literature • Interviews with SMEs and stakeholders • Concept designs for team • Learning objectives • Dynamic tailoring requirements • GOTS/COTS trade-off analysis • Hardware/software feasibility and cost analysis • Iterative requirements authoring • Iterative development • Iterative testing
Concept Creation Virtual Ville PercepTS Virtual Ville Concept
Virtual Ville Spatial Layout Multiple OPs (or Combat Outposts) OT / Control Room AAR & Vicarious Learning Room TOC
Concept Creation: Conceptual Architecture INPUT/OUTPUT SAF Behavior Authoring Interface Training Content Authoring Interface Observer Trainer Terminal Optical Interface Devices Radio Interface Device Visualization System SYNTHETIC TRAINING ENVIRONMENT Positional Tracking System Optical System Visualization Radio Interface Controllers Simulation Environment (Torque) Additional Simulation Plug-Ins Dynamic Tailoring Toolkit NPC Controller (SAF Behaviors) Virtual Team Speech Generation Speech Recognition Assessment Module (Micro/Macro) AAR Module Scenario and Lesson Toolkit Metrics Authoring Toolkit Domain Module Dynamic Tailoring Module Trainee Module AUTHORING AND MANAGEMENT Virtual Environment Database Scenario Content Database Patterns of Life Database Dynamic Tailoring Database Trainee Records Database INSTRUCTIONAL AND EXPERT KNOWLEDGE DATABASES
#1: Report Situated Tutor Development • Report systems' (a) interactivity (including IMI levels), (b) forms of adaptation, and (c) integration of features
#2: Report Situated Tutor Evaluation Results • Empirically assess situated tutors’ effectiveness and efficiency • Use ablative conditions, not just versus classroom • Remark on real-world impacts (e.g., reduced cost per seat, increased readiness reports)
#3: Expand Intrinsic Adaptation • Investigate novel situated tutor methods • Carefully assess adaptations' impacts • Document categorical types of intrinsic adaptation, their best uses, and potential pitfalls to avoid • Consider macro-adaptive approaches, such as dynamic scenario generation, too
#4: Expand Higher-Order Instruction • Emphasize sophisticated cognitive, affective, and psychosocial competencies • Examine instructional strategies—specifically for situated tutors—that engender higher-order skills