
Research Methods – Measuring User Experience




  1. Research Methods – Measuring User Experience

  2. What might we measure in relation to user experience?

  3. Measures of User Experience • Experience of a specific emotion • Experience of a type of emotional response • Experience of a type of pleasure • Experience of “flow state”

  4. Lazzaro: Four Keys to More Emotion without Story • Hard Fun • Easy Fun • Serious Fun • People Fun • Emotions: fear, surprise, disgust, naches/kvell, fiero, schadenfreude, wonder

  5. Frome – Game Generated Emotion • Game Emotions • Emotions of competition • Narrative emotions • Emotions from engaging with artwork • Artefact emotions • Emotions of aesthetic evaluation • Ecological Emotions • Response to what the artwork represents

  6. Player Pleasures

  7. Csikszentmihalyi - Flow

  8. Nielsen – Usability Attributes • Learnability • Memorability • Efficiency • Errors and their severity • Subjective satisfaction

  9. Juul & Norton • Different from productivity-based software • User challenge /difficulty is expected (sought out) • Challenge can be in any aspect of the games, including the interface

  10. Mandryk – physiological measures for evaluating player experience

  11. Heuristic Evaluation • Traditionally – examining compliance with recognised usability principles

  12. Nielsen – Usability Heuristics • Visibility of system status • Match between system and real world • User control and freedom • Consistency & standards • Error prevention • Error diagnosis and recovery • Recognition rather than recall • Flexibility & efficiency of use • Aesthetic and minimalist design • Help and documentation
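In practice, evaluators applying these heuristics record each problem they find against a heuristic, usually with a severity rating (Nielsen's scale runs from 0, not a problem, to 4, usability catastrophe). A minimal sketch of one way to structure such findings is below; the field names and the example finding are illustrative, not taken from the slides.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One issue recorded during a heuristic evaluation (illustrative structure)."""
    heuristic: str    # e.g. "Visibility of system status"
    location: str     # screen or moment where the issue was observed
    description: str  # what went wrong
    severity: int     # 0 (not a problem) .. 4 (usability catastrophe), per Nielsen

findings = [
    Finding("Error prevention", "zoom screen",
            "No confirmation before discarding edits", 3),
]
```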

  13. Pinelle, Wong & Stach – Game Usability • Unpredictable / inconsistent response to user’s actions • Does not allow enough customization • Artificial intelligence problems • Mismatch between camera/view and action • Does not let user skip non-playable content • Clumsy input scheme • Difficult to control actions in the game • Does not provide enough information on game status • Does not provide adequate training and help • Command sequences are too complex • Visual representations are difficult to interpret • Response to user’s action not timely enough

  14. Desurvire, Caplan & Toth – Heuristic Evaluation for Playability (HEP) • Gameplay • Game story • Game mechanics • Game usability

  15. Physiological Data • Galvanic skin response (GSR) • Respiration • Blood volume pulse (BVP) • Heart rate variability (HRV) • Electromyography (EMG) • Pupil dilation (PD) • Arousal (GSR, Resp, BVP, HR) • Mental effort (HRV, PD, EMG) • Valence (EMG, HRV, PD)

  16. Electrodermal Activity • Galvanic skin response (GSR) • Measures variation in electrodermal activity between the tonic baseline and phasic responses (a rough tonic/phasic split is sketched below) • Measured via eccrine sweat glands – densest on the palms of the hands and soles of the feet
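A rough sketch of one way to separate the tonic baseline from phasic responses in a GSR recording: smooth the raw signal heavily to estimate the tonic level and treat the residual as phasic activity. The sampling rate, window length, and synthetic signal here are assumptions for illustration; real EDA analysis normally uses dedicated toolkits.

```python
import numpy as np

def split_eda(raw, fs=32, window_s=10.0):
    """Crude tonic/phasic split: a long moving average approximates the
    tonic (baseline) level; the remainder is treated as phasic activity.
    fs (Hz) and window_s are illustrative assumptions, not fixed standards."""
    win = int(fs * window_s)
    kernel = np.ones(win) / win
    tonic = np.convolve(raw, kernel, mode="same")   # slow-moving baseline
    phasic = raw - tonic                            # fast skin-conductance responses
    return tonic, phasic

# Example: 60 s of synthetic GSR-like data at 32 Hz
signal = 5 + 0.3 * np.random.randn(60 * 32)
tonic, phasic = split_eda(signal)
```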

  17. Cardiovascular • Blood pressure – pressure needed to push blood through the circulatory system • Blood volume – how much blood is being pushed around • Heart rate – number of beats per minute • Heart rate variability – variation in the time between successive beats (a worked example follows)
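As a small worked example, heart rate and two common heart rate variability summaries can be computed from a list of inter-beat intervals. The interval values below are invented for illustration.

```python
import numpy as np

def heart_metrics(ibi_ms):
    """Heart rate and two common HRV summaries from inter-beat intervals (ms)."""
    ibi = np.asarray(ibi_ms, dtype=float)
    hr = 60000.0 / ibi.mean()                     # beats per minute
    sdnn = ibi.std(ddof=1)                        # overall variability (SDNN)
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))   # beat-to-beat variability (RMSSD)
    return hr, sdnn, rmssd

# Hypothetical inter-beat intervals for one participant
hr, sdnn, rmssd = heart_metrics([812, 790, 845, 830, 801, 865, 820])
```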

  18. Muscles • Electromyography – measure of muscle activity • Brow • Jaw • Cheek

  19. Arousal • Increases in galvanic skin response • Increased respiration • Decreased blood volume pulse • Increased heart rate

  20. Mental Effort • Decreased heart rate variability • Greater pupil dilation • Increases in jaw clenching or brow-raising • Increased respiration rate • Decreased variability of respiration rate

  21. Positive vs. Negative Emotions • Valence of an emotion • Facial muscle analysis of brow and cheek • Heart rate • Irregularity of respiration • Pupil diameter

  22. Physiological Data - Advantages • Continuously collected to evaluate process not just outcome • Doesn’t interfere with experience • High bandwidth – lots of data • Can be used to infer underlying emotions

  23. Physiological Data - Disadvantages • High variability between individuals • Sensor error, interference and noise are prevalent • Requires baseline and normalization techniques (sketched below) • Can be invasive and impact performance
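A minimal sketch of the kind of baseline normalization mentioned above: express each participant's task-time signal as z-scores relative to their own resting recording, which reduces the between-person variability noted in the first bullet. The function and variable names are illustrative assumptions.

```python
import numpy as np

def baseline_normalize(task_signal, rest_signal):
    """Express a physiological signal as z-scores relative to a resting
    baseline for the same participant, reducing between-person differences.
    Both arrays are assumed to come from one participant and one sensor."""
    mu = np.mean(rest_signal)
    sigma = np.std(rest_signal)
    return (np.asarray(task_signal) - mu) / sigma

# e.g. GSR during play, normalized against a rest recording for that participant
# normalized = baseline_normalize(gsr_play, gsr_rest)
```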

  24. System Gathered Data • Time on task • Number/type of errors • Choices made • Number of times the help system was used • Number of times an area/page was visited • Any user input
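These measures are usually derived from an event log written by the system itself. A minimal illustrative logger is sketched below; the class, field names, and example events are assumptions, not part of the Kodak study described on the following slides.

```python
import time
import json

class SessionLog:
    """Minimal illustrative logger for system-gathered measures: timestamps
    every event so time on task, error counts, choices, help usage and
    page visits can be derived afterwards."""
    def __init__(self, participant_id):
        self.participant_id = participant_id
        self.events = []

    def record(self, kind, detail=None):
        self.events.append({"t": time.time(), "kind": kind, "detail": detail})

    def save(self, path):
        with open(path, "w") as f:
            json.dump({"participant": self.participant_id,
                       "events": self.events}, f, indent=2)

# Hypothetical usage:
# log = SessionLog("P01")
# log.record("task_start", "remove_red_eye")
# log.record("error", "tapped outside image")
# log.record("help_opened")
# log.record("task_end", "remove_red_eye")
# log.save("P01_session.json")
```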

  25. Research Case Study: Red-eye Removal • Eastman Kodak – Removal of red-eye defect from images in direct print kiosks

  26. Red-Eye: Pre-Artefact • Research, evaluation/review of existing systems • Scoping parameters for system design – range of pupil sizes showing the red-eye defect • Negotiated system requirements and specifications • Touch screen • Screen resolution • Amount of zoom

  27. Red-Eye: Building Artefacts • System-captured data • Time on task – how long to adjust each of the three • How many times something was undone, and what was undone

  28. Red-Eye: User Testing • 24 participants – Kodak factory workers, varied in age and gender • Three versions of the system – all participants used all three • Variation in the order in which the versions were tested (a counterbalancing sketch follows this slide) • Used talk-aloud protocol – video-recorded sessions • Post-test questionnaire – subjective/qualitative
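One common way to vary presentation order across 24 participants and three versions is to cycle through all six possible orders so each order is used equally often. Whether the original study counterbalanced exactly this way is not stated on the slide; the version labels and participant IDs below are assumptions.

```python
from itertools import permutations

versions = ["A", "B", "C"]             # labels for the three versions (assumed)
orders = list(permutations(versions))  # all 6 possible presentation orders

# Cycle through the six orders so each is used four times across 24 participants.
assignments = {f"P{i + 1:02d}": orders[i % len(orders)] for i in range(24)}

# assignments["P01"] -> ('A', 'B', 'C'), assignments["P02"] -> ('A', 'C', 'B'), ...
```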

  29. Red-Eye: Data Analysis • Time on task analysis • Error rates/types • Talk-aloud comment classification • Which version users said they preferred/found easiest • Correlation between: • Order used and user preference • Order used and time on task • Order used and talk-aloud comment types
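For the order-related correlations, a rank correlation such as Spearman's rho is a natural fit because presentation order is ordinal. The sketch below uses invented per-trial records; the study's actual analysis method is not specified on the slides.

```python
from scipy import stats

# Hypothetical per-trial records: (presentation_order, time_on_task_seconds)
trials = [(1, 94), (2, 71), (3, 65), (1, 88), (2, 80), (3, 62)]

order = [t[0] for t in trials]
time_on_task = [t[1] for t in trials]

# Spearman's rho suits the ordinal "order used" variable; a negative rho
# would suggest participants got faster on later-presented versions.
rho, p = stats.spearmanr(order, time_on_task)
```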

  30. Sources • http://www.nngroup.com/articles/ten-usability-heuristics/ • http://www.useit.com/papers/heuristic/heuristic_evaluation.html • http://userbehavioristics.com/downloads/usingheuristics.pdf • http://mi-lab.org/wp-content/blogs.dir/1/files/publications/uxInGames_Koeffel_et_al.pdf

  31. Crawford, C. (1982) “Why do people play games?” in The Art of Computer Game Design. [Online] Available at: http://www.scribd.com/doc/140200/Chris-Crawford-The-Art-of-Computer-Game-Design (Last accessed 31 January 2013) • Frome, J. (2007) “Eight Ways Videogames Generate Emotion” in Situated Play, Proceedings of DiGRA 2007 Conference. [Online] Available at: http://www.digra.org/dl/db/07311.25139.pdf (Last accessed 28 January 2013) • Lazzaro, N. (2004) Why we play videogames: Four keys to more emotion without story. XEODesign. [Online] Available at: http://xeodesign.com/xeodesign_whyweplaygames.pdf (Last accessed 7 February 2013)

  32. Pinelle, D., Wong, N. & Stach, T. (2008) “Heuristic Evaluation for Games: Usability Principles for Video Game Design” in Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 2008), 1453-1462. [Online] Available at: http://hci.usask.ca/publications/2008/p1453-pinelle.pdf • Isbister, K. & Schaffer, N. (eds.) (2008) Game Usability: Advice from the Experts for Advancing the Player Experience. London: Morgan Kaufmann. • Juul, J. & Norton, M. “Easy to Use and Incredibly Difficult: On the Mythical Border between Interface and Gameplay”. [Online] Available at: http://www.jesperjuul.net/text/easydifficult/ • http://armorgames.com/play/4309/this-is-the-only-level
