
Thoughts on HCI Requirements Elicitation








  1. Thoughts on HCI Requirements Elicitation Glenn Fink 31 August 2006

  2. Agenda • What is requirements elicitation? • What I Did • What Worked • What Didn’t Work • What’s Hard Work • What I Haven’t Tried (Yet) • Conclusions

  3. What is requirements elicitation? • Finding out what the users want • Who are they? • What do they do? • Where do they have problems? • Designing prototype solutions • As simple, cheap, and quick as possible • Working with the users to improve the prototypes (evolutionary prototyping)

  4. What I Did • Discovering what system administrators need for security • Finding out how (whether) information visualization could help • Repeatedly: • Building prototypes • Evaluating the results

  5. Understanding the user community's true needs required evolutionary prototyping • [Timeline figure, July 2003 – August 2006: prototypes psgraph, Network Pixel Map, Network Eye, GL VISUAL, Portall, and Host-Network Visualizer (HoNe), refined through user interviews and paper prototypes]

  6. Summative Usability Evaluation • This was the last of a series of evaluations • I tested 27 system administrators performing intrusion-detection tasks • My visualization significantly improved scores • Subjects said that HoNe was better than their existing tools

  7. Users preferred and got the most insight from the VC condition • Statistically significant at the 0.001 level (χ² = 135, df = 9) • Many users asked for a copy of the visualization • Users complained when going from visualization to text • I received many unsolicited positive comments
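A significance claim like the one on this slide (χ² = 135 with 9 degrees of freedom) can be sanity-checked numerically. The sketch below is my illustration, not code from the talk: it approximates the chi-square survival function with stdlib-only Simpson integration of the upper incomplete gamma function (the `chi2_sf` helper name is hypothetical).

```python
import math

def chi2_sf(x, df, steps=20000):
    """Approximate P(X > x) for a chi-square variable with `df` degrees
    of freedom, via Simpson integration of the upper incomplete gamma
    function. Stdlib-only; adequate for a quick significance check."""
    k = df / 2.0
    # Integrate t^(k-1) * e^(-t) from x/2 outward; e^(-t) decays fast,
    # so a finite upper bound well past x/2 captures essentially all mass.
    a = x / 2.0
    b = a + 50.0 * max(k, 1.0)
    h = (b - a) / steps
    f = lambda t: t ** (k - 1.0) * math.exp(-t)
    total = f(a) + f(b)
    for i in range(1, steps):
        total += (4 if i % 2 else 2) * f(a + i * h)
    upper_incomplete_gamma = total * h / 3.0
    return upper_incomplete_gamma / math.gamma(k)

p = chi2_sf(135.0, 9)
print(f"p-value for chi-square = 135, df = 9: {p:.3g}")
print("significant at the 0.001 level?", p < 0.001)
```

With χ² = 135 the p-value comes out far below 0.001 (the df = 9 critical value at that level is only about 27.9), consistent with the slide's claim. In practice a library routine such as `scipy.stats.chi2.sf` would be the sturdier choice; slide 10's advice to get help with statistics applies here too.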

  8. What Worked • Semi-structured interviews • HCI Experts • Domain Experts • Expert Users • Audio recording, taking notes afterward • Paper and PowerPoint prototypes • Brainstorming sessions with users • Carefully designed usability evaluations • Rewarding participants

  9. What Didn’t Work • Transcribing audio interviews verbatim • High-fidelity prototypes • Costly to build, but never enough • Small problems hide core issues • Changing the interview protocol during the study • May not be avoidable in exploratory studies • Careless errors in usability evaluation design

  10. What’s Hard Work • Distilling quantifiable facts from semi-structured interviews • Making sure users understand low-fidelity prototypes the same way you do • Written interviews (Domain Experts) • Requires follow-up live interviews • You may never get the results • Solid statistical analysis (get help)

  11. What I Haven’t Tried (yet) • Broad Surveys • Diary Studies • Field studies (tag-along)

  12. Conclusions • Research is always built slowly, brick by painful brick • It’s relatively easy to: • Schedule interviews • Take notes • Get a good feeling for the subject • It’s relatively hard to: • Talk on the same wavelength as your subjects • Distill useful facts from interview notes • Write up your findings coherently
