
Sketching Interfaces for Specification


  1. Sketching Interfaces for Specification

  2. Readings • James A. Landay and Brad A. Myers, “Sketching Interfaces: Toward More Human Interface Design”, IEEE Computer, vol. 34, no. 3, pp. 56-64, 2001. http://ieeexplore.ieee.org/iel5/2/19651/00910894.pdf • B.P. Bailey, J.A. Konstan, and J.V. Carlis, “DEMAIS: Designing Multimedia Applications with Interactive Storyboards”, Proceedings of the International Conference on Multimedia, Ottawa, 2001, pp. 241-250. http://doi.acm.org/10.1145/500141.500179

  3. Supporting Early Design • Key tasks being supported • Design space exploration • Medium for expression and “visual thinking” • Increasing concreteness • Providing inspiration and points of departure • Understanding the implications of design choices • Communication • Vehicle for talking about design options with others

  4. Sketching is Good • Almost everyone starts in the early stages with very rough pencil-and-paper sketches • Get rough ideas down quickly • Uncertainty • Uncertainty and ambiguity are important to have at this stage (important to creativity) • Can’t have the details before you have done the design • Fluidity is critical • Investment must be very low • Changes must be very easy • Must stay “light on your feet” and not get solidified into a small range of alternatives too early

  5. Need ability to test / operate as early as possible • But also need to be able to evaluate the real meaning / consequence of things • Must make them “operable” or executable at some level to get a real understanding • Big win • Herein lies the hard problem: Ambiguity and uncertainty are very antithetical to being able to “execute” something

  6. Getting past the divide • Being able to “execute” something which is ambiguous and only partially defined • Need to infer intent • Need sensible interpretation of all partial specifications • Hard: probably need to limit domain • Also need graceful path from early/ambiguous to later/precise (“real”) • Would rather not be forced to “throw it away” after a certain point

  7. SILK • Sketch • On digitizing tablet rather than paper • Interactive rather than static setting • Turn sketches into something that has enough structure and meaning that rich computational things can be done with it • Try to get fluidity benefits of pen but add a lot more • Storyboard • Used as central mechanism for specifying behavior • Limited, but well targeted (and already familiar to many) • Transition to real widgets • Gives (some) “upward path” for results

  8. SILK modes • Sketch • Annotate • Decorate • Run • Also a pen “command” mode • Special button on pen for command gestures • Modes help a lot with recognition

  9. SILK interfaces • Sketches • Window for sketching (specifying and placing) interactive components (widgets) • Screen at a time or dialog box at a time • Sort of assuming static layout and somewhat moded style for result • Storyboard • Window which contains collected copies of sketched screens • Place to specify behavior

  10. Sketching widgets • Attempt to recognize meaningful objects • Key to executability • Make recognized sketches into operating widgets • Still appear as sketches • Later transform into “real” widgets

  11. Widget set • Seven widgets in fixed set • Menu bar • Scrolling window • Palette • Button • Radio button • Check box • Text field • Plus rows and columns (“panels”) • Fixed set shows up as a significant limitation in user tests, but supporting an arbitrary set introduces big programming issues

  12. Widgets recognized from primitive components • Four primitive gestures recognized • Rectangle • Squiggly Line • Straight Line • Ellipse • All single stroke • May require slight adjustment by user • Makes life a lot easier for recognizer • No segmentation • Rubine recognizer

  13. Recognition • Rubine recognizer • Single stroke • Feature based • E.g., number of inflection points, angles, point-to-point distances, etc. • Not person independent • Trained with 15-20 examples • Also feedback during use (via corrections made)
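
A minimal Python sketch of the flavor of this approach: compute a small subset of Rubine's 13 single-stroke features and score gesture classes with per-class linear weights trained from those 15-20 examples. The feature subset and weight format here are illustrative assumptions, not SILK's actual code.

```python
import math

def rubine_features(points):
    """A few of Rubine's single-stroke features; points is a list of
    (x, y) samples (the full feature set also uses timestamps)."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    diag = math.hypot(max(xs) - min(xs), max(ys) - min(ys))   # bbox diagonal
    end_dist = math.hypot(points[-1][0] - points[0][0],
                          points[-1][1] - points[0][1])       # first-to-last
    length, abs_turn, prev_angle = 0.0, 0.0, None
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        length += math.hypot(dx, dy)                          # total path length
        angle = math.atan2(dy, dx)
        if prev_angle is not None:
            d = (angle - prev_angle + math.pi) % (2 * math.pi) - math.pi
            abs_turn += abs(d)                                # total |turning|
        prev_angle = angle
    return [diag, end_dist, length, abs_turn]

def classify(points, weights):
    """Rubine-style linear classifier: each gesture class c has weights
    (w0, w1, ..., wn) fit from training examples; the best class
    maximizes w0 + sum(wi * fi) over the features fi."""
    f = rubine_features(points)
    score = lambda w: w[0] + sum(wi * fi for wi, fi in zip(w[1:], f))
    return max(weights, key=lambda c: score(weights[c]))
```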

  14. Recognition • Steps • Individual component recognition • Spatial relationship analysis • Widget inference • Look for higher level groupings • Feedback • Possible user correction

  15. Recognition steps • Spatial relationship analysis • Relationships recognized • Contains / is contained by • Near • Left, right, above, below • Vertical or horizontal sequence • (of the same type)
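
A sketch of what these relationship tests might look like over component bounding boxes. The thresholds are illustrative assumptions; the reading does not give SILK's actual tolerances.

```python
import math

def contains(a, b):
    """Box a contains box b; boxes are (x1, y1, x2, y2)."""
    return a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2] and a[3] >= b[3]

def near(a, b, threshold=20):
    """Gap between two boxes (0 if they overlap) is under a threshold."""
    gap_x = max(0, max(a[0], b[0]) - min(a[2], b[2]))
    gap_y = max(0, max(a[1], b[1]) - min(a[3], b[3]))
    return math.hypot(gap_x, gap_y) <= threshold

def left_of(a, b):
    """a lies entirely to the left of b (right/above/below are symmetric)."""
    return a[2] <= b[0]

def horizontal_sequence(boxes, tol=10):
    """Left-to-right run with roughly aligned tops, as for a panel."""
    boxes = sorted(boxes, key=lambda b: b[0])
    return all(left_of(a, b) and abs(a[1] - b[1]) <= tol
               for a, b in zip(boxes, boxes[1:]))
```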

  16. Recognition Steps • Rule-based widget inference: <pattern> → <widget selection, confidence, code> • Pattern • Look for certain configurations of components • Rectangle contains rectangle and outer has very vertical aspect ratio → scroll bar • Body • Compute confidence from details of matched parts • Provide code for operating as a widget
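
A sketch of one such rule, following the slide's scroll-bar example. The Component shape, aspect-ratio cutoff, and confidence formula are illustrative assumptions, and the rule's "code" part is stood in for by the matched parts.

```python
from dataclasses import dataclass

@dataclass
class Component:
    kind: str          # "rectangle", "line", "squiggle", or "ellipse"
    box: tuple         # (x1, y1, x2, y2) bounding box

def contains(a, b):
    return a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2] and a[3] >= b[3]

def scrollbar_rule(components):
    """Pattern: a rectangle with a very vertical aspect ratio that
    contains another rectangle -> scroll bar. Returns the slide's
    (widget, confidence, ...) triple or None."""
    for outer in components:
        if outer.kind != "rectangle":
            continue
        w = max(1, outer.box[2] - outer.box[0])
        h = outer.box[3] - outer.box[1]
        if h < 3 * w:                         # not "very vertical"
            continue
        for inner in components:
            if inner is not outer and inner.kind == "rectangle" \
                    and contains(outer.box, inner.box):
                conf = min(1.0, h / (5 * w))  # taller/thinner -> more confident
                return ("scrollbar", conf, {"track": outer, "thumb": inner})
    return None
```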

  17. Recognition Steps • Look for “panels” • Horizontal or vertical groupings of widgets • Proximity • Alignment • Must all be the same type • Looks to create new panels or add to existing ones after each widget recognition
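
A sketch of the vertical-panel check under those constraints (same type, aligned, close together), reusing the Component shape from the sketch above; the tolerances are again illustrative assumptions.

```python
def vertical_panel(widgets, align_tol=10, gap_max=30):
    """Group same-type widgets stacked top-to-bottom with aligned left
    edges and small gaps; each widget has .kind and .box = (x1, y1, x2, y2).
    Returns the ordered group or None. The horizontal case is symmetric."""
    if len(widgets) < 2 or len({w.kind for w in widgets}) != 1:
        return None                           # panels must be all one type
    ws = sorted(widgets, key=lambda w: w.box[1])
    for a, b in zip(ws, ws[1:]):
        aligned = abs(a.box[0] - b.box[0]) <= align_tol
        close = 0 <= b.box[1] - a.box[3] <= gap_max   # small gap below previous
        if not (aligned and close):
            return None
    return ws
```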

  18. Recognition Steps • Feedback • Recognized widgets turn purple (not clear what feedback for groupings was) • Usability problem uncovered in evaluation: can’t tell which widget was recognized (without looking at the control panel), but if a widget isn’t recognized correctly, the panels built from it don’t work

  19. Recognition Steps • Biggest usability problem will turn out to be recognition errors • User correction • Command gesture for “next best guess” • Users can also give “hints” via selections • Explicit change in control panel • If no widget is found, the control panel has a “New guess” button to tell the system to “look harder”
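
A sketch of how "next best guess" might work if the recognizer keeps its ranked hypotheses for each sketch; the class and method names are illustrative assumptions.

```python
class GuessList:
    """Ranked recognition hypotheses for one sketched widget, so the
    'next best guess' command gesture can cycle through them instead of
    forcing the user to redraw."""
    def __init__(self, ranked):
        self.ranked = ranked             # [(widget_type, confidence), ...]
        self.index = 0                   # best guess first

    def current(self):
        return self.ranked[self.index][0]

    def next_guess(self):
        """Command gesture: advance to the next-most-likely widget."""
        self.index = (self.index + 1) % len(self.ranked)
        return self.current()

# Usage: the recognizer ranked this sketch button > check box > text field
g = GuessList([("button", 0.6), ("check_box", 0.25), ("text_field", 0.15)])
assert g.next_guess() == "check_box"
```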

  20. Command gestures: Delete, Group, Ungroup, Next Guess, Insert Text • Also have recognition of command gestures • Recall: special “command” button on pen • Very common approach • All structured as a single stroke

  21. Specifying behavior via storyboards • Can place individual screens (collections of ink, widgets, etc.) into storyboard view • Copy/paste then edit • All behavior specified as navigation in the storyboard • Specified by drawing a line (in the storyboard editor) from either a widget or the background to a new scene • In run mode, a click on that object means “goto …”
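
A minimal sketch of storyboard navigation as a transition table, which is essentially all the behavior these storyboards express; the names are illustrative assumptions.

```python
class Storyboard:
    """Each transition arrow drawn from a widget (or the background) to
    another screen becomes a (screen, object) -> target entry; run mode
    just follows the table."""
    def __init__(self, start):
        self.current = start
        self.transitions = {}                  # (screen, obj) -> screen

    def add_arrow(self, screen, obj, target):
        self.transitions[(screen, obj)] = target

    def click(self, obj):
        """In run mode, a click on obj means 'goto' the arrow's target
        (clicks with no arrow attached do nothing)."""
        self.current = self.transitions.get((self.current, obj), self.current)
        return self.current

# Usage: an "OK" button sketched on screen1 that navigates to screen2
sb = Storyboard("screen1")
sb.add_arrow("screen1", "ok_button", "screen2")
assert sb.click("ok_button") == "screen2"
```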

  22. Storyboards

  23. Annotation and decoration • Decoration mode • Basically turns off the recognizer • All strokes just go in as unrecognized ink • Annotation mode • Attaches ink and text to whole screen or individual widgets • Annotations can be turned on and off

  24. Run mode • Interface (with same sketched appearance) runs • Scroll bars slide, etc. • Buttons, etc. cause movements in storyboard • May not look that way to the user (e.g., single change between boards) • Also supplies some debug help • Storyboard highlights the current screen and the object which caused the last transition

  25. Transition to “real” interfaces • Can ask for sketched widgets to be turned into “real” widgets • VB or Garnet • Still need to add: • Precise alignments • Color • Text labels • Actual icons, images, etc. • Real action code!

  26. Experience & usability • Overall seems very positive • Biggest issue: recognition errors • Recognition rates: • Edit gestures: 89% • Primitives: 93% • Widgets: 69% → not going to work • End analysis: recognition probably worse than a more explicit strategy

  27. Experience & usability • Second big issue: fixed widget set • Can do some things, but support falls way off for “non-standard” interactions • A lot of effort to work around with inadequate tools • Hard problem because taking on a large part of “the programming problem” may hurt the simpler and more common aspects • Interesting research issues here

  28. DEMAIS system • New (but related) domain: multimedia presentations • Time-based media (video and audio) • Timing and synchronization behavior is now the tricky part • Understanding design choices requires understanding how the detailed timing and sync will play out • Contribution is in specification of behavior • A more complex programming task (but still limited)

  29. DEMAIS interface components • Again, storyboard based • Single screen (layout) editor • Narration editor • Deals with audio narration • Storyboard editor • Also “multi-view” editor • Collections of screens, storyboards, etc. that are useful to pull up and manipulate together • Sort of a “scrapbook” manager

  30. Entities the system deals with in a layout screen • Plain text objects • Interpreted and uninterpreted • (Can do sync specification in a scripting language) • Ink strokes • Uninterpreted • Recognized objects • Behavioral ink strokes • Visual language icons • Annotations

  31. Recognition • Sketched-object recognition is very simple • Only recognizes two things (Rubine algorithm again) • Rectangle • Indicates “content item” (still or video) • Tap to get a file browser to load actual or simulated content • Behavior stroke • Single line from something to something • Otherwise treated as uninterpreted ink • Text recognition • Tries to parse as script text • If that fails, uninterpreted
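
A sketch of that interpretation policy: two recognized gesture classes and an uninterpreted-ink fallback. The hit_test helper and the object representation are assumptions for illustration.

```python
def hit_test(objects, point):
    """First object whose bounding box (x1, y1, x2, y2) contains point."""
    x, y = point
    for obj in objects:
        x1, y1, x2, y2 = obj["box"]
        if x1 <= x <= x2 and y1 <= y <= y2:
            return obj
    return None

def interpret_stroke(gesture_class, stroke, objects):
    """gesture_class comes from a Rubine-style recognizer over the
    stroke; stroke is a list of (x, y) samples."""
    if gesture_class == "rectangle":
        return ("content_item", stroke)        # tap later to attach media
    if gesture_class == "line":
        src = hit_test(objects, stroke[0])     # behavior stroke runs from...
        dst = hit_test(objects, stroke[-1])    # ...one object to another
        if src is not None and dst is not None:
            return ("behavior_stroke", src, dst)
    return ("ink", stroke)                     # otherwise uninterpreted
```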

  32. Narration editor • Can insert “synchronization markers” in text • Points to synchronize other stuff against • System does text-to-speech • Also supports recorded speech, but not clear how sync points are specified • Probably like video (position of slider when behavior attached), but not explicitly stated

  33. Narration editor • To synchronize, e.g., an image’s appearance with a point in the narration, sketch a (behavior) stroke from the sync point to another storyboard element
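
A sketch of how sync markers might be represented and fired during narration playback; the inline marker syntax and the callback scheme are assumptions, since the reading does not specify DEMAIS's internals.

```python
def parse_sync_markers(narration, marker="|"):
    """Split out inline sync markers, returning the clean text for
    text-to-speech plus the word offsets where markers sat."""
    words, offsets = [], []
    for token in narration.split():
        if token == marker:
            offsets.append(len(words))   # marker sits before this word
        else:
            words.append(token)
    return " ".join(words), offsets

def on_word(index, bindings):
    """Fire behaviors bound to a sync marker when TTS reaches its word."""
    for callback in bindings.get(index, []):
        callback()

# Usage: show an image when the narration reaches the marker
text, offsets = parse_sync_markers("The photo | was taken in 1969")
bindings = {offsets[0]: [lambda: print("show moon.jpg")]}
on_word(offsets[0], bindings)            # -> show moon.jpg
```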

  34. Visual language for behaviors • Behavior strokes are between objects • Creates implicit parameter and subject from source and destination of the stroke • Strokes annotated to specify their actual behavior • Triggering event (from source) • Action to perform (at destination)
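
A sketch of those semantics as data: each stroke carries its source and destination plus a triggering event and an action, with defaults standing in for the context-based choices slide 37 mentions; all names and default values here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class BehaviorStroke:
    """A stroke from source to destination, annotated with a triggering
    event (at the source) and an action to perform (at the destination)."""
    source: str
    dest: str
    event: str = "single_tap"   # default trigger, replaceable via icons
    action: str = "display"     # default action, replaceable via icons

def dispatch(strokes, obj, event):
    """When `event` occurs on `obj`, return the (action, target) pairs
    of every behavior stroke it triggers."""
    return [(s.action, s.dest) for s in strokes
            if s.source == obj and s.event == event]

# Usage: tapping a thumbnail displays the full image
strokes = [BehaviorStroke("thumb1", "full_image1")]
assert dispatch(strokes, "thumb1", "single_tap") == [("display", "full_image1")]
```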

  35. Behavior icons: Events • “T”s replaced by numbers • For sync: use current time of object • For elapsed: user is prompted

  36. Action icons

  37. System picks default event & actions based on context

  38. Can replace defaults • Edit by tap-and-drop from palette • Better than drag-and-drop with pen • Can also use text annotations in a scripting language for specifying behaviors • English-like (details not discussed here) • Infers referents (“this”) • Doesn’t seem to fit well
