
Fusing physical and cognitive spaces: Using wireless networked sensors to assess the who, what, where, when, and how of student learning
Gregory K. W. K. Chung, UCLA/CRESST
Mani B. Srivastava, Department of Electrical Engineering, UCLA


Presentation Transcript


  1. Fusing physical and cognitive spaces: Using wireless networked sensors to assess the who, what, where, when, and how of student learning
  Gregory K. W. K. Chung, UCLA/CRESST
  Mani B. Srivastava, Department of Electrical Engineering, UCLA
  Annual Conference of the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), September 14-15, 2000, Los Angeles, CA

  2. Measuring Behavior
  • Current techniques
    • Real-time observation with sampling
    • Observation of videotaped or audiotaped data
  • Characteristics
    • Time-consuming and prone to error
    • Rarely capture the temporal properties of behavior
    • Major advantage: a human in the loop categorizes the observations

  3. Measuring Behavior
  • Sensor-based techniques
    • Computationally measure physical properties of a person and related objects
    • Computationally derive observations from sensor data (see the sketch below)
  • Vast improvement in observation capabilities
    • Scalability (large number of observations)
    • Efficiency (more information per unit cost)
    • Timeliness (rapid turnaround)
    • Accuracy
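To make "computationally derive observations from sensor data" concrete, here is a minimal sketch in Python: raw position fixes are turned into a simple "these students are near each other" observation. The student IDs, coordinates, and the 1-meter threshold are illustrative assumptions, not values from the presentation.

```python
"""Minimal sketch: deriving a behavioral observation from raw sensor data.

Hypothetical example -- the IDs, coordinates, and the 1-meter threshold
are illustrative assumptions, not values from the presentation.
"""
from math import dist

# Raw readings: (student_id, x, y) position fixes at one time instant, in meters.
readings = [
    ("S1", 1.0, 2.0),
    ("S2", 1.5, 2.3),
    ("S3", 6.0, 4.0),
]

PROXIMITY_THRESHOLD_M = 1.0  # assumed cutoff for "near each other"

def derive_proximity_observations(readings, threshold=PROXIMITY_THRESHOLD_M):
    """Return pairs of students whose positions fall within the threshold."""
    observations = []
    for i, (id_a, xa, ya) in enumerate(readings):
        for id_b, xb, yb in readings[i + 1:]:
            if dist((xa, ya), (xb, yb)) <= threshold:
                observations.append((id_a, id_b))
    return observations

print(derive_proximity_observations(readings))  # [('S1', 'S2')]
```

The same pattern applies to other streams: orientation readings can be turned into "facing" observations, acoustic readings into "speaking" observations, and so on.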

  4. Measuring Behavior
  • Sensor-based techniques (continued)
    • Measure the who, what, where, when, and how of human-human and human-object interactions
  • Key challenges
    • Develop algorithms that aggregate sensor data into measures that accurately capture the construct of interest, are meaningful and credible, and are in a form usable by different end users (see the aggregation sketch below)
    • Relate behavioral measurements to cognitive processes and task outcomes
    • Approximate the 24/7 human observer
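One way such aggregation could work is sketched below: per-second proximity events are rolled up into a percent-of-time summary that an end user such as a teacher could read directly. The event stream, window length, and the specific measure are illustrative assumptions, not part of the presentation.

```python
"""Minimal sketch: aggregating low-level sensor events into a reportable measure.

Hypothetical example -- the event stream, window size, and the
"percent of time in proximity" measure are illustrative assumptions.
"""
from collections import defaultdict

# Derived events: (timestamp_sec, student_pair, in_proximity), sampled once per second.
events = [(t, ("S1", "S2"), t < 180) for t in range(0, 300)]

WINDOW_SEC = 300  # aggregate into 5-minute windows (assumed)

def percent_time_in_proximity(events, window_sec=WINDOW_SEC):
    """Aggregate per-second proximity events into % of each window spent together."""
    counts = defaultdict(lambda: [0, 0])  # (pair, window) -> [in_proximity, total]
    for timestamp, pair, in_proximity in events:
        window = timestamp // window_sec
        counts[(pair, window)][0] += int(in_proximity)
        counts[(pair, window)][1] += 1
    return {
        key: round(100.0 * hits / total, 1)
        for key, (hits, total) in counts.items()
    }

# S1 and S2 were within range for the first 180 of 300 seconds -> 60% of the window.
print(percent_time_in_proximity(events))  # {(('S1', 'S2'), 0): 60.0}
```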

  5. Wireless Networked Sensors
  • Integrate sensing and short-range communication functions in a single unit
  • Low power consumption (long operational life)
  • Small form factor (can be embedded in everyday objects)
  • RF communication (avoids line-of-sight problems)
  • Tetherless, bidirectional connection to the Internet
  • Remote measurement and control capability
  • Embed “intelligence” and interactivity in everyday objects

  6. Sample of Sensor Types
  • Acoustic
  • Light
  • Image/video
  • Touch/pressure
  • Temperature
  • Identification
  • Position (x, y, z)
  • Proximity (x', y', z')
  • Orientation (360°)
  • Movement (acceleration)
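A practical detail implied by this list is that readings from heterogeneous sensor types need a common record format before they can be combined downstream. The sketch below shows one such format; the field names and units are assumptions for illustration, not a format described in the presentation.

```python
"""Minimal sketch: one record format for readings from heterogeneous sensor types.

Hypothetical example -- field names and units are illustrative assumptions.
"""
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SensorReading:
    node_id: str              # identity of the wireless sensor node
    timestamp: float          # seconds since the start of the session
    sensor_type: str          # e.g., "position", "orientation", "acoustic"
    value: Tuple[float, ...]  # sensor-specific payload

# Examples covering several of the sensor types listed on this slide:
readings = [
    SensorReading("badge-S1", 12.0, "position", (1.0, 2.0, 0.0)),   # x, y, z in meters
    SensorReading("badge-S1", 12.0, "orientation", (135.0,)),       # heading in degrees
    SensorReading("badge-S1", 12.1, "acoustic", (0.62,)),           # normalized loudness
    SensorReading("block-07", 12.1, "movement", (0.05, 0.0, 9.8)),  # acceleration (m/s^2)
]

for r in readings:
    print(r.node_id, r.sensor_type, r.value)
```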

  7. Potential Application
  • Describing interaction
    • Student-object
    • Student-student
    • Student-teacher
    • Teacher-object
  • Triangulate multiple measures of interaction to successively refine inferences about interaction (see the sketch below)
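The triangulation idea can be illustrated with a toy score that combines three weak indicators of a student-student interaction; the indicators, weights, and interpretation are invented for illustration, since the presentation names only the general strategy.

```python
"""Minimal sketch: triangulating several weak indicators of interaction.

Hypothetical example -- the indicators and weights are illustrative assumptions.
"""

def interaction_score(in_proximity, facing_each_other, talking_in_turn):
    """Combine three boolean indicators into a crude interaction score in [0, 1]."""
    weights = {"proximity": 0.4, "orientation": 0.3, "acoustic": 0.3}  # assumed
    return (weights["proximity"] * in_proximity
            + weights["orientation"] * facing_each_other
            + weights["acoustic"] * talking_in_turn)

# Each additional, independent measure strengthens (or weakens) the inference.
print(interaction_score(True, False, False))  # 0.4 -- proximity alone is ambiguous
print(interaction_score(True, True, True))    # 1.0 -- all three measures agree
```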

  8. Example: Deriving observations of a small-group object categorization task
  Object position, student position, student orientation, and object-proximity data allow the following questions to be answered (the slide's diagram shows three students, S1-S3, working around a set of shapes):
  1. How many objects are categorized correctly by shape? (12: squares, triangles, circles)
  2. What object are the students focused on? (the rhombus)
  3. How many objects remain to be categorized? (1: the rhombus)
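A minimal sketch of how question 2 could be answered from position and orientation data alone: each student is assigned the object whose bearing is closest to that student's heading, and the group's focus is the most common assignment. The coordinates, headings, and the nearest-bearing rule are illustrative assumptions, not the presenters' algorithm.

```python
"""Minimal sketch: inferring the object of focus from position and orientation.

Hypothetical example -- coordinates, headings, and the nearest-bearing rule
are illustrative assumptions.
"""
from math import atan2, degrees
from collections import Counter

objects = {"rhombus": (2.0, 2.0), "square-1": (0.0, 4.0), "circle-3": (4.0, 0.0)}

# (x, y) position in meters and heading in degrees for each student.
students = {"S1": ((1.0, 1.0), 45.0), "S2": ((3.0, 1.0), 135.0), "S3": ((2.0, 4.0), 270.0)}

def focused_object(position, heading_deg, objects):
    """Pick the object whose bearing from the student is closest to the heading."""
    x, y = position
    def bearing_error(item):
        ox, oy = item[1]
        bearing = degrees(atan2(oy - y, ox - x)) % 360.0
        return min(abs(bearing - heading_deg), 360.0 - abs(bearing - heading_deg))
    return min(objects.items(), key=bearing_error)[0]

focus = {sid: focused_object(pos, hdg, objects) for sid, (pos, hdg) in students.items()}
print(focus)                                   # per-student focus
print(Counter(focus.values()).most_common(1))  # group's shared focus, e.g. the rhombus
```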

  9. Example: Deriving observations of small-group instruction
  Position, orientation, and acoustic data allow the following questions to be answered (the slide's diagram shows a teacher, T, and four students, S1-S4):
  1. Who is paying attention to the teacher? (S1, S2)
  2. Which students are participating? (S1, S2)
  3. What is the nature of the utterance? (S2: a question)
  4. Which students are not paying attention or participating? (S3, S4)
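A minimal sketch of questions 1 and 2, assuming that "paying attention" can be approximated by facing the teacher within a tolerance and "participating" by recent acoustic energy above a threshold; classifying the nature of an utterance (question 3) would require speech processing and is not sketched. All positions, headings, and thresholds are invented for illustration.

```python
"""Minimal sketch: classifying attention and participation during small-group instruction.

Hypothetical example -- the facing tolerance and loudness cutoff are
illustrative assumptions; the presentation only lists the sensor streams.
"""
from math import atan2, degrees

TEACHER_POS = (0.0, 0.0)
FACING_TOLERANCE_DEG = 30.0   # assumed: "paying attention" = roughly facing the teacher
SPEECH_THRESHOLD = 0.5        # assumed: normalized acoustic energy above this = speaking

# Per-student snapshot: position (m), heading (deg), recent acoustic energy (0-1).
students = {
    "S1": ((1.0, 1.0), 225.0, 0.6),
    "S2": ((-1.0, 1.0), 315.0, 0.8),
    "S3": ((0.0, -2.0), 180.0, 0.0),
    "S4": ((2.0, -1.0), 0.0, 0.2),
}

def facing_teacher(position, heading_deg, tolerance=FACING_TOLERANCE_DEG):
    """True if the student's heading points at the teacher within the tolerance."""
    x, y = position
    bearing = degrees(atan2(TEACHER_POS[1] - y, TEACHER_POS[0] - x)) % 360.0
    error = abs(bearing - heading_deg)
    return min(error, 360.0 - error) <= tolerance

attending = [s for s, (pos, hdg, _) in students.items() if facing_teacher(pos, hdg)]
speaking = [s for s, (_, _, energy) in students.items() if energy >= SPEECH_THRESHOLD]
print("Paying attention:", attending)  # ['S1', 'S2']
print("Participating:", speaking)      # ['S1', 'S2']
```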

  10. Potential Application
  • Describing the classroom environment
  • Measures of:
    • Amount of lecture, independent, and small-group instruction
    • Student resource use
    • Student roaming profiles (see the sketch below)
    • Teacher-student interaction
    • Student-student interaction
    • Student attention
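As one example of these measures, a roaming profile could be computed directly from a student's position trace. The trace, sampling rate, and the distance-per-window summary below are illustrative assumptions.

```python
"""Minimal sketch: a student roaming profile from a position trace.

Hypothetical example -- the trace, sampling rate, and the "distance traveled
per window" summary are illustrative assumptions.
"""
from math import dist

# (timestamp_sec, x, y) position fixes for one student, sampled every 30 s.
trace = [
    (0, 1.0, 1.0), (30, 1.2, 1.1), (60, 4.0, 1.0),
    (90, 4.1, 3.0), (120, 4.0, 3.1), (150, 1.0, 1.0),
]

def roaming_profile(trace, window_sec=60):
    """Total distance traveled (m) in each window -- a crude measure of roaming."""
    profile = {}
    for (t0, x0, y0), (t1, x1, y1) in zip(trace, trace[1:]):
        window = t0 // window_sec
        profile[window] = profile.get(window, 0.0) + dist((x0, y0), (x1, y1))
    return profile

# Windows with near-zero movement suggest seated work; large values suggest roaming.
print(roaming_profile(trace))
```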

  11. Next Steps
  • NSF Information Technology Research grant (2000-2002)
    • UCLA Electrical Engineering is the lead department (PI: Srivastava); the UCLA Computer Science Department and CRESST are partners
  • Develop technology: wireless protocols, network architectures, middleware architecture, data management and mining, user profiling, speech recognition
  • Application domain: assessing young children's (K-1) problem-solving development

  12. Next Steps
  • Qualitative analyses of the classroom, children's interactions with each other, and children's interactions with objects
  • Develop measures using sensor data
  • Validate measures against human observations (see the agreement sketch below)
  • Develop a sensor-based assessment of children's problem-solving skills
    • Use play or another manipulative-based task that requires a demonstration of performance
    • Use an extended task to gather data over time
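Validating sensor-derived measures against human observations would typically reduce to an agreement statistic over the same coded intervals. The sketch below uses Cohen's kappa on invented codes; the presentation does not specify which statistic or coding scheme would actually be used.

```python
"""Minimal sketch: validating sensor-derived codes against human observations.

Hypothetical example -- the two code sequences are invented, and Cohen's kappa
is just one common chance-corrected agreement statistic.
"""
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders over the same intervals."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1.0 - expected)

# Codes assigned to the same ten 30-second intervals by a human observer
# and by the sensor-based system (e.g., "on"-task vs. "off"-task).
human  = ["on", "on", "off", "on", "off", "on", "on", "off", "on", "on"]
sensor = ["on", "on", "off", "on", "on",  "on", "on", "off", "off", "on"]
print(round(cohens_kappa(human, sensor), 2))  # 0.52
```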
