
Gazelle Task 7: Connectathon Test Process


Presentation Transcript


  1. Gazelle Task 7: Connectathon Test Process. Connectathon Manager Training. Steve Moore, Mallinckrodt Institute of Radiology

  2. What the Participants Want • They sit in their chair • Each of their test partners magically shows up, sits next to them and tests for 15 minutes • Instructions are read to them in their native language, leading them through each test • Each test is automatically evaluated • The system reminds them of work to do in a friendly voice: “Turn right in 250 meters.”

  3. The Real Process • Participant selects a test and negotiates with test partners for a start time • Test is completed; participant signals event • Time elapses (1 min, 60 min, 120 min) • Monitor comes by and evaluates results • Monitor records results • Connectathon Manager sums/publishes results
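
  For readers who like to see the flow as data, here is a minimal sketch (Python, purely illustrative; the status names are assumptions, not Gazelle's actual vocabulary) of the states a test instance passes through in the process above:

      from enum import Enum, auto

      class TestInstanceStatus(Enum):
          # Illustrative states only; Gazelle's own status values may differ.
          STARTED = auto()       # participant picked the test and agreed a start time with partners
          COMPLETED = auto()     # participant signalled that the test is done
          UNDER_REVIEW = auto()  # a monitor has come by and is evaluating the results
          VERIFIED = auto()      # monitor recorded the result
          PUBLISHED = auto()     # Connectathon Manager summed and published the results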

  4. Small Connectathon • Participant reads a test from a paper list and taps a partner on the shoulder • When the test is done, participant writes name and table number on the wall • Monitor comes by and evaluates results • Monitor records results in a notebook • Connectathon Manager reads notebook entries and sums results

  5. Gazelle Enabled Test Process • The next slides walk you through the process using the Gazelle tool • While we show the slides, you should be executing the steps in Gazelle

  6. Select Test, Partners • Gazelle presents a list of tests tailored to your registration • Participant decides which test to run now (or in the future) • Participant selects a test; Gazelle lists possible partners • Participant negotiates start time with partner • Participant starts test
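
  As a rough illustration of the selection step, the sketch below (Python, with made-up record layouts; not Gazelle's actual data model) filters the test list by a system's registration and lists possible partners for a chosen test:

      def tests_for(system, all_tests):
          # A system only sees tests that involve an actor/profile pair it registered.
          return [t for t in all_tests
                  if any(pair in system["registrations"] for pair in t["actor_profile_pairs"])]

      def possible_partners(test, all_systems, me):
          # Any other system registered for one of the roles the test needs is a candidate partner.
          return [s for s in all_systems
                  if s is not me
                  and any(pair in s["registrations"] for pair in test["actor_profile_pairs"])]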

  7. Test Execution • Covered in detail in a different session • Participants are given a set of steps to execute • Participants check off each step, walking through the entire process

  8. End Test, Evaluation • Participant selects a button to end the test • An entry is generated on a worklist that is readable by one or more monitors • One monitor selects an item, which locks it • Monitor evaluates test results / data • Monitor signs off and marks the test complete (for all participants) • Monitor does not tell participants they are done for a profile / the Connectathon

  9. Meta Tests • Some actor / profile pairs have similar requirements • Example: Radiology / Cardiology Image Display Actors • Separate tests are written for the separate profiles, but there is no reason to require 3 instances of each of two nearly identical tests • A Meta Test binds similar tests together; the participant is told to run 3 tests from the Meta Test group
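
  A small sketch of the Meta Test idea (Python; the test identifiers and field names are placeholders, not real IHE test names):

      meta_test = {
          "name": "Image Display meta test",
          "members": ["display-test-radiology", "display-test-cardiology"],  # similar tests, different profiles
          "runs_required": 3,  # run any 3 instances from across the group
      }

      def meta_test_satisfied(meta, completed_test_ids):
          # completed_test_ids: one entry per completed test instance.
          # Runs of any member test count against the single group requirement.
          runs_in_group = [t for t in completed_test_ids if t in meta["members"]]
          return len(runs_in_group) >= meta["runs_required"]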

  10. Assigning Monitors to Tests • Apply common sense • For a small event, all monitors probably evaluate all profiles • For a large event, monitors specialize • Enter the assignments in Gazelle • The assignments are then mapped to test results

  11. Monitor Worklists • Several monitors are assigned to a profile (or a domain, or a test) • When a test is finished, each monitor can see tests of interest • Monitor selects a test instance, which locks that instance • Monitor can finish the work or return the instance to the list to let others take it
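
  To make the claim-and-lock behaviour concrete, here is a minimal in-memory sketch (Python; Gazelle's real worklist is database-backed, and the names here are assumptions):

      class MonitorWorklist:
          def __init__(self):
              self.items = {}  # instance id -> {"profile": str, "owner": monitor name or None}

          def add(self, instance_id, profile):
              # A new entry appears when a participant ends a test.
              self.items[instance_id] = {"profile": profile, "owner": None}

          def visible_to(self, monitor_profiles):
              # A monitor sees the unclaimed instances in the profiles assigned to them.
              return [iid for iid, item in self.items.items()
                      if item["owner"] is None and item["profile"] in monitor_profiles]

          def claim(self, monitor, instance_id):
              # Selecting an instance locks it so no other monitor picks it up.
              item = self.items[instance_id]
              if item["owner"] is None:
                  item["owner"] = monitor
                  return True
              return False

          def release(self, monitor, instance_id):
              # Returning the instance to the list lets another monitor take it.
              if self.items[instance_id]["owner"] == monitor:
                  self.items[instance_id]["owner"] = None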

  12. Test Scoring • Each test should have instructions indicating evaluation criteria • Each test should have individual test steps to be reviewed • Monitor asks to see the test in real time or asks the participants to show evidence • Monitor evaluates and scores each step • Monitor provides final pass/fail status.
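
  A toy sketch of per-step scoring rolled up into a final verdict (Python; the step structure is an assumption, not the Gazelle schema):

      def score_test(steps):
          # steps: list of {"description": str, "passed": bool}, one per reviewed step.
          failed = [s for s in steps if not s["passed"]]
          return {
              "steps_reviewed": len(steps),
              "steps_failed": len(failed),
              "verdict": "pass" if not failed else "fail",  # any failed step fails the test
          }

      print(score_test([
          {"description": "Sender transmitted the message", "passed": True},
          {"description": "Receiver shows the expected data", "passed": True},
      ])["verdict"])  # -> pass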

  13. Critical Status • This is appropriate for large events • Ungraded tests begin to pile up • Many are redundant, but they take up space in the worklist • When this is enabled, a participant can flag a few tests and have them move up in priority in the queue
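
  One way to picture the critical flag is as a sort key on the monitors' queue; the sketch below (Python, illustrative only) puts flagged instances first and otherwise keeps oldest-first order:

      def order_queue(entries):
          # entries: list of {"instance_id": ..., "critical": bool, "finished_at": float}
          # False sorts before True, so negating "critical" puts flagged entries first.
          return sorted(entries, key=lambda e: (not e["critical"], e["finished_at"]))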

  14. Final Evaluation • Eric will cover this in more detail • This is the final determination of pass/fail for actor/profile pairs
